
Dip2K

How Can I Display 2-Byte Code Characters?

Recommended Posts

Dip2K    122
Korean, Japanese, and Chinese characters are composed of 2-byte codes, but OpenGL seems to be able to display only one-byte characters!! Is that right? Hm... is it impossible to display 2-byte code characters in Windows? Thanks for reading, and sorry for my unskilled English...

Strylinys    122
If you find no other way and are desperate, you could always just create the characters in a paint program and make a texture out of them... That way you can at least have them on screen somehow.
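A minimal sketch of that texture fallback, assuming the painted character sheet has already been uploaded as an OpenGL 2D texture and a 2D projection is active; the function name drawCharQuad and its parameters are illustrative, not from the post:

#include <GL/gl.h>

/* Draw one character cell from a pre-painted texture sheet.
   (x, y, w, h) is the on-screen quad; (u0, v0)-(u1, v1) is the cell's
   region inside the texture, in texture coordinates. */
void drawCharQuad(GLuint charTex, float x, float y, float w, float h,
                  float u0, float v0, float u1, float v1)
{
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, charTex);
    glBegin(GL_QUADS);
        glTexCoord2f(u0, v0); glVertex2f(x,     y);
        glTexCoord2f(u1, v0); glVertex2f(x + w, y);
        glTexCoord2f(u1, v1); glVertex2f(x + w, y + h);
        glTexCoord2f(u0, v1); glVertex2f(x,     y + h);
    glEnd();
    glDisable(GL_TEXTURE_2D);
}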

S.

Guest Anonymous Poster
Under NT, you can use UNICODE strings with the wgl font functions... you can't use DBCS, and you can't use either in Win98.

So, to get DBCS strings displayed under NT, use MultiByteToWideChar() to convert the DBCS string to UNICODE. Then use the UNICODE string in calls to the wgl font functions and the associated glCallLists() call.

So:

1) Convert DBCS character set to UNICODE using MultiByteToWideChar.


2) Build the display lists using wglUseFontWhatever. You need to know the UNICODE index range for the DBCS character set for the "first" and "count" parameters of wglUseFontOutlines.


3) Set the display list base to the same base used in step 2 using glListBase.


4) Convert the output string from DBCS to UNICODE using MultiByteToWideChar.


5) Call glCallLists with the UNICODE string (a rough code sketch of these steps follows).
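A rough sketch of the steps above in C, under the assumptions already stated (NT, a CJK-capable font selected into the HDC, and a current OpenGL context on that DC). The Hangul range, the CP_ACP code page, the 256-character buffer, and the names buildGlyphLists/drawDBCSString are illustrative choices; wglUseFontBitmapsW is used here for flat bitmap text, while wglUseFontOutlinesW takes the same first/count/listBase arguments plus its outline-specific ones.

#include <windows.h>
#include <GL/gl.h>

#define FIRST_GLYPH 0xAC00   /* first Hangul syllable -- pick the range for your character set */
#define GLYPH_COUNT 11172    /* syllables U+AC00 through U+D7A3 */

/* Steps 2 and 3: build one display list per glyph in the UNICODE range.
   Naming each list with its own code point keeps the list base trivial. */
void buildGlyphLists(HDC hdc)
{
    wglUseFontBitmapsW(hdc, FIRST_GLYPH, GLYPH_COUNT, FIRST_GLYPH);
    glListBase(0);   /* list name == code point, so no offset is needed */
}

/* Steps 4 and 5: convert the DBCS string to UNICODE and call the lists.
   WCHAR is 16 bits wide on Windows, which matches GL_UNSIGNED_SHORT. */
void drawDBCSString(const char *dbcs)
{
    WCHAR wide[256];
    int len = MultiByteToWideChar(CP_ACP, 0, dbcs, -1, wide, 256);
    if (len > 1)                                   /* len includes the terminating L'\0' */
        glCallLists(len - 1, GL_UNSIGNED_SHORT, wide);
}

Building display lists for the entire Hangul block is memory-hungry; using a tighter first/count for your character set, or building lists only for the glyphs a string actually needs, is the usual compromise.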
