This topic is now archived and is closed to further replies.

How Can I Display 2-Byte Code Characters?


Recommended Posts

Korean, Japanese, and Chinese characters are composed of 2-byte codes, but OpenGL can only display one-byte characters! Is that right? Hm... is it impossible to display 2-byte code characters on Windows? Thanks for reading, and sorry for my unskilled English...

Guest Anonymous Poster
Under NT, you can use UNICODE strings with the wgl font functions, but they can't use DBCS strings directly, and you can't use either under Win98. To get DBCS strings working on NT, use MultiByteToWideChar() to convert the DBCS string to UNICODE, then use the UNICODE string in calls to the wgl font functions and in the associated glCallLists() call.


1) Convert the DBCS character set to UNICODE using MultiByteToWideChar.

2) Build the display lists using wglUseFontBitmaps or wglUseFontOutlines. You need to know the UNICODE index range of the DBCS character set for the "first" and "count" parameters.

3) Set the display list base with glListBase so that each UNICODE value indexes the list built for it in step 2 (the base from step 2 minus the "first" value).

4) Convert the output string from DBCS to UNICODE using MultiByteToWideChar.

5) Call glCallLists with the UNICODE string.
