Unicode - how to convert
I have a program built with the multibyte character set and want to convert it to Unicode so I can use some DXUT APIs.
What is involved, and is it hard to do?
What do you mean? Multibyte as in 16-bit characters? What do you mean by Unicode? Unicode doesn't care about how the characters are stored, and there are lots of different encodings for it.
Quote:Original post by smart_idiot
What do you mean? Multibyte as in 16 bit characters? What do you mean by Unicode? Unicode doesn't care about how the characters are stored, and there are lots of different encodings for it.
Well, the Character Set option in the property pages is set to Multi-Byte, and if I set it to Unicode then I get an error. What do I do to get a program to compile with Unicode, since I can't see any strings I have defined?
If I create a solution and add a basic Win32 program, it won't compile with Unicode.
Quite a simple/basic question, to be frank.
For string literals, put a capital L in front of them: L"unicode string"
If you want to convert existing strings to Unicode at runtime, try MultiByteToWideChar.
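The usual pattern with these conversion APIs is query-then-convert: call once to ask how big the destination buffer must be, then call again to do the actual conversion (with MultiByteToWideChar you pass 0 as the destination size for the first call). A minimal portable sketch of the same pattern using the standard C mbstowcs; the helper name ToWide is made up for illustration:

```cpp
#include <cassert>
#include <cstdlib>
#include <string>

// Convert a narrow (multibyte) string to a wide string.
// First call asks for the required length, second call converts.
std::wstring ToWide(const std::string& narrow)
{
    // Query pass: a null destination makes mbstowcs return the
    // number of wide characters needed (without the terminator).
    std::size_t len = std::mbstowcs(nullptr, narrow.c_str(), 0);
    if (len == static_cast<std::size_t>(-1))
        return std::wstring();   // invalid multibyte sequence

    // Convert pass: write directly into a correctly sized wstring.
    std::wstring wide(len, L'\0');
    std::mbstowcs(&wide[0], narrow.c_str(), len);
    return wide;
}
```

On Win32 you would do the same two calls with MultiByteToWideChar(CP_ACP, ...) so the conversion uses the system code page rather than the C locale.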
If you change all the string literals from L"__" back to "__", then that is good enough for ANSI, I take it.
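If you want one source file that builds under either setting, the Windows generic-text mappings (TCHAR and the _T()/TEXT() macros from <tchar.h>) expand to char and "..." in a multibyte build, and to wchar_t and L"..." in a Unicode build. A standalone sketch of how that mapping works; the TCHAR_/T_ names are hypothetical stand-ins so it compiles without <tchar.h>:

```cpp
#include <cassert>

// Minimal re-creation of the <tchar.h> generic-text idea.
// TCHAR_ and T_() stand in for the real TCHAR and _T().
#ifdef UNICODE
typedef wchar_t TCHAR_;   // Unicode build: wide characters
#define T_(s) L##s        // literals become L"..."
#else
typedef char TCHAR_;      // multibyte build: narrow characters
#define T_(s) s           // literals stay "..."
#endif

// One line of source that compiles under both settings:
const TCHAR_* g_title = T_("My Window");
```

This is why Win32 samples use TCHAR and _T("...") everywhere instead of hard-coding char or wchar_t.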
These might help. YMMV
// ----------------------------------------------------------------------------
// Convert a Unicode (UTF-16) string to ANSI using the system code page.
//
BOOL WINAPI UnicodeToAnsi( LPWSTR pszwUniString, LPSTR pszAnsiBuff, DWORD dwAnsiBuffSize )
{
    int iRet = WideCharToMultiByte( CP_ACP, 0, pszwUniString, -1,
                                    pszAnsiBuff, dwAnsiBuffSize, NULL, NULL );
    return ( 0 != iRet );
}

// ----------------------------------------------------------------------------
// Convert an ANSI string to Unicode (UTF-16) using the system code page.
//
BOOL WINAPI AnsiToUnicode( LPSTR pszAnsiString, LPWSTR pszwUniBuff, DWORD dwUniBuffSize )
{
    int iRet = MultiByteToWideChar( CP_ACP, 0, pszAnsiString, -1,
                                    pszwUniBuff, dwUniBuffSize );
    return ( 0 != iRet );
}