Greetings!
I am running into a problem with implicit conversion of arguments when
trying to insert the contents of a CString object into a stringstream.
I am using the following typedef:
typedef std::basic_stringstream<TCHAR> ustringstream;
I have a function that somewhat resembles the following:
void DoSomething(CString TitleString)
{
    TryToDoSomething();
    if (ItDidntWork())
    {
        ustringstream theStream;
        theStream << TitleString << _T("\n");
        theStream << WhyItDidntWork();
        TellTheUserItDidntWork(theStream);
    }
}
If _UNICODE is not defined, TCHAR evaluates to char, and everybody's
happy. The stream is built as expected. But if _UNICODE is defined,
TCHAR becomes unsigned short, and instead of the expected title, the
stream contains a memory address in hexadecimal format.
Stepping into the code at the point of the first insertion, I see that
the first thing that happens is that CString's operator LPCTSTR()
conversion operator is called. Then basic_ostream<>'s
operator<<(const void*) overload is called, which just puts the
pointer's value into the stream.
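The same mechanism can be reproduced in plain standard C++ without MFC. The stand-in class below (PtrLike and demo are hypothetical names of mine, not part of the original code) has an implicit conversion to const void*, playing the role CString's operator LPCTSTR plays when no matching character inserter is found, so the stream receives an address rather than the characters:

```cpp
#include <sstream>
#include <string>

// Hypothetical stand-in for a class with an implicit pointer
// conversion (like CString's operator LPCTSTR in the _UNICODE build,
// where the stream has no matching character-pointer inserter).
struct PtrLike {
    const char* text;
    operator const void*() const { return text; }  // user-defined conversion
};

// Inserting a PtrLike selects operator<<(const void*), so the stream
// gets the pointer's value formatted as a number, not the characters.
std::string demo(const char* s) {
    std::ostringstream os;
    PtrLike p{s};
    os << p;
    return os.str();
}
```

This is only a sketch of the overload-resolution behavior, not the actual TCHAR/CString setup.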
If I explicitly cast TitleString to an LPCTSTR, the title is inserted
into the string as expected! But an explicit cast is really ugly, and
there are a lot of places where I'd have to do it. In an attempt to
find a better way, I wrote the following function:
uostream& operator<<(uostream& theStream, const CString& theString)
{
    return operator<<(theStream, (LPCTSTR)theString);
}
But when this function was executed, the const char* that I got from
the cast was converted back into a CString and I ended up in an
endless loop and a stack overflow.
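For what it's worth, one pattern that sidesteps the reconversion problem entirely (sketched here with a stand-in class, FakeCString, since the real MFC CString isn't reproduced in this post) is to have the custom inserter call the stream's write() member instead of delegating to another operator<<, so overload resolution can never loop back into the custom inserter:

```cpp
#include <ostream>
#include <sstream>
#include <string>

// Hypothetical stand-in for CString in a narrow (char) build:
// implicitly convertible to const char*, like operator LPCTSTR.
struct FakeCString {
    std::string s;
    operator const char*() const { return s.c_str(); }
};

// Custom inserter that calls write() directly rather than another
// operator<<, so no implicit conversion back to FakeCString (and no
// recursion) can occur.
std::ostream& operator<<(std::ostream& os, const FakeCString& str) {
    const char* p = static_cast<const char*>(str);
    return os.write(p, static_cast<std::streamsize>(str.s.size()));
}
```

This is an assumption-laden sketch, not the poster's code; the point is only that write() takes a plain character pointer and a length, leaving no room for another user-defined conversion.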
Can anybody explain what is going on here and what the best way out of
this morass is?
Thanks very much!
Rob Richardson