how to convert char to string (using TCHAR * or LPTSTR)

I want to convert a char name[50]; to either TCHAR * atName; or, preferably, LPTSTR atName;. My program keeps crashing when I try things like strcpy or sprintf. Thanks
The destination buffer must be an array (or otherwise point to memory you own), not an uninitialized pointer.
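In other words, the crash is most likely from copying into a pointer that owns no memory. A minimal sketch of the difference (the names here are just examples):

#include <string.h>

char  name[50];    // 50 bytes of real storage - a valid destination
char *atName;      // uninitialized pointer - points at nothing

strcpy( name,   "johnnyBravo" );   // fine: name owns the memory
strcpy( atName, "johnnyBravo" );   // crash: atName has no buffer behind it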

You're Welcome,
Rick Wong
LPTSTR must point to a NULL-terminated string (I think), so when you make your name variable...
char name[50]={0};

then you can just:

LPTSTR atName = reinterpret_cast<LPTSTR>(name);

I think that should work.

EDIT: This means that changing name would change the content of atName.
LPTSTR atName = reinterpret_cast<LPTSTR>(new char[50]);
then you can strcpy if you want a new copy of it.
remember to delete[] atName when it goes out of scope... or use a std::auto_ptr instead.
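
Put together, a rough sketch of the above (this assumes an ANSI build, where TCHAR is plain char, so the cast is effectively a no-op):

#include <windows.h>
#include <string.h>

char name[50] = { 0 };
strcpy( name, "johnnyBravo" );

// Alias the existing buffer - changing name also changes what atName sees:
LPTSTR atName = reinterpret_cast<LPTSTR>( name );

// Or take an independent copy on the heap:
LPTSTR atCopy = reinterpret_cast<LPTSTR>( new char[50] );
strcpy( atCopy, name );            // OK in an ANSI build, where LPTSTR is char*
// ... use atCopy ...
delete[] atCopy;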

[edited by - dmounty on October 12, 2003 10:39:04 AM]
You don't need any casts:

char cstr[] = "pie in the sky";
LPTSTR wstr = cstr;

Note that the conversion of a string literal to a (non-const) pointer to char is deprecated, as stated by the standard. That is, use const char* for literals.
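
That is:

char*       bad  = "pie in the sky";   // deprecated: literal to non-const char*
const char* good = "pie in the sky";   // fine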

[ Google || Start Here || ACCU || STL || Boost || MSDN || GotW || CUJ || MSVC++ Library Fixes || BarrysWorld || E-Mail Me ]

[edited by - Lektrix on October 12, 2003 11:10:36 AM]
In a nutshell -

LPTSTR translates as a char pointer or a wchar_t pointer depending on whether the UNICODE token is defined or not. UNICODE is native on WNT/2K/XP and ANSI is native on W9x. NT systems do a good job of automatically converting ANSI strings to UNICODE where needed, but W9x doesn't perform the reverse quite so well.
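
Roughly speaking, the Windows headers do something along these lines (a simplified sketch, not the exact winnt.h text):

#ifdef UNICODE
    typedef wchar_t  TCHAR;
    typedef wchar_t *LPTSTR;
#else
    typedef char     TCHAR;
    typedef char    *LPTSTR;
#endif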

If you have a string literal value that you want in unicode when the UNICODE token is defined, use either the TEXT or _T macros.

LPTSTR myString = _T("Hello World!");

When UNICODE is not defined, the above is the same as

char *myString = "Hello World!";

And when UNICODE is defined, the above is the same as

wchar_t *myString = L"Hello World!";

The 'L' prepending the string literal informs the compiler to render the string using wide characters, aka unicode.

Converting from ANSI to UNICODE and vice versa at run time is more complicated and requires the use of a couple of special APIs. A simple pointer assignment won't do it.

// ----------------------------------------------------------------------------
//
//
BOOL UnicodeToAnsi(
    LPWSTR pszwUniString,
    LPSTR  pszAnsiBuff,
    DWORD  dwAnsiBuffSize
    )
{
    int iRet = 0;

    iRet = WideCharToMultiByte(
        CP_ACP,
        0,
        pszwUniString,
        -1,
        pszAnsiBuff,
        dwAnsiBuffSize,
        NULL,
        NULL
        );

    return ( 0 != iRet );
}

// ----------------------------------------------------------------------------
//
//
BOOL AnsiToUnicode(
    LPSTR  pszAnsiString,
    LPWSTR pszwUniBuff,
    DWORD  dwUniBuffSize
    )
{
    int iRet = 0;

    iRet = MultiByteToWideChar(
        CP_ACP,
        0,
        pszAnsiString,
        -1,
        pszwUniBuff,
        dwUniBuffSize
        );

    return ( 0 != iRet );
}
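
A quick (hypothetical) usage of the two helpers above might look like this:

char    ansiName[50] = "johnnyBravo";
wchar_t wideName[50] = { 0 };

if ( AnsiToUnicode( ansiName, wideName, 50 ) )   // 50 = size of wideName in characters
{
    // wideName now holds the wide-character copy of ansiName
}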


Google on WideCharToMultiByte and MultiByteToWideChar for more info.

Further, strcpy and sprintf only work on single-byte (char) strings. If you want the wide-character equivalents, use wcscpy and swprintf. And if you want to work with T strings - TCHAR * or LPTSTR - use the generic-text versions from tchar.h, _tcscpy and _stprintf. They evaluate to the ANSI or wide version of the function depending on whether _UNICODE is defined during compilation.
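
A short sketch of the generic-text versions (the buffer contents and names here are just examples):

#include <tchar.h>

TCHAR buffer[64] = { 0 };

_tcscpy( buffer, _T("johnnyBravo") );                 // strcpy or wcscpy
_stprintf( buffer, _T("Hello, %s!"), _T("world") );   // sprintf or swprintf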


[edited by - lessbread on October 12, 2003 6:43:57 PM]
"I thought what I'd do was, I'd pretend I was one of those deaf-mutes." - the Laughing Man
