
# SDL unicode and scandinavian characters


## Recommended Posts

Hello! I'm coding an input box for my application and decided to use Unicode to get all kinds of international characters. However, Scandinavian letters just don't want to work:

```cpp
if (event.key.keysym.unicode > 0) {
    addChar((char)(event.key.keysym.unicode & 0x7F));
}
```

The addChar function's parameter is declared `char`, and the characters are stored in an `std::string`. 'ä', 'ö' and 'å', for example, just come back as the wrong characters ('e' etc.). Is this a problem in the font file, in the code, or something with SDL's Unicode support? Thanks!

##### Share on other sites
Not 100% sure on this but should you not be using std::wstring and wchar_t for unicode?

##### Share on other sites
Quote:
 Original post by pkelly83: Not 100% sure on this but should you not be using std::wstring and wchar_t for unicode?

Would I also have to convert to ASCII to be able to print the correct characters out of the font bitmap?

Edit: if the answer is yes, wouldn't it be equivalent to convert the single character and pass it as a char to the addChar function?

##### Share on other sites
The Unicode code point for 'å' is 0xE5. 0xE5 & 0x7F is 0x65, which corresponds to the code point for 'e'. Similarly, the Unicode code point for 'ä' is 0xE4, which after you AND it with 0x7F will spit out 0x64, which is 'd'.

This probably means that the SDL Unicode translations are working right (returning the correct Unicode code points) but your usage of the Unicode code points is incorrect.
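The effect of the mask is easy to verify. A minimal sketch (the `mask7` helper is hypothetical, standing in for the `& 0x7F` in the original code):

```cpp
#include <cstdint>

// Hypothetical helper mirroring the `& 0x7F` mask in the original code:
// it strips the high bit of the 16-bit value SDL delivers.
inline uint16_t mask7(uint16_t codepoint) {
    return codepoint & 0x7F;
}
// 'å' (U+00E5) comes out as 0x65 ('e'), 'ä' (U+00E4) as 0x64 ('d'),
// while plain ASCII (below 0x80) passes through unchanged.
```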

##### Share on other sites
Quote:
 Original post by SiCrane: The Unicode code point for 'å' is 0xE5. 0xE5 & 0x7F is 0x65, which corresponds to the code point for 'e'. Similarly, the Unicode code point for 'ä' is 0xE4, which after you AND it with 0x7F will spit out 0x64, which is 'd'. This probably means that the SDL Unicode translations are working right (returning the correct Unicode code points) but your usage of the Unicode code points is incorrect.

Is there any replacement for the 0x7F, or should I remove it entirely? Without it I get weird things (looking like ';' and ':') when using Scandinavian letters :O

##### Share on other sites
Beats me. I don't know how your font handling works. If you're trying to use a font that doesn't map directly to unicode code points, then you're going to have to do some sort of translation, but that depends on your font system.

##### Share on other sites
Quote:
 Original post by SiCrane: Beats me. I don't know how your font handling works. If you're trying to use a font that doesn't map directly to unicode code points, then you're going to have to do some sort of translation, but that depends on your font system.

The characters are ordered by their ASCII code in the bitmap, and I then calculate their position from that code... etc. So, is there any simple way to convert? A native C++ function, or maybe some add-on?

##### Share on other sites
Those characters are not in ASCII. If your bitmap only contains ASCII characters there's no conversion that will work, as there's no image that matches.

##### Share on other sites
I got åäö working with SDL in a game engine I made, and I think I just did this:

```cpp
event.key.keysym.unicode & 0xFF
```

I don't have the source code right now, but I think that worked :P. If you look at www.asciitable.com you'll see that you need the full 8-bit byte to get the Scandinavian letters (I'm from Sweden, btw :P).
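Put together with the input loop from the first post, that suggestion looks roughly like this. This is a sketch only, assuming the font bitmap is laid out in Latin-1 order (where Unicode code points below 0x100 map one-to-one) and using a free-standing stand-in for the thread's `addChar`:

```cpp
#include <cstdint>
#include <string>

// Stand-in for the thread's addChar: append the character to the input
// buffer, keeping the full low byte instead of masking with 0x7F.
// Code points above 0xFF have no glyph in a 256-entry bitmap, so they
// are skipped rather than silently mangled.
void addChar(std::string& buffer, uint16_t unicode) {
    if (unicode > 0 && unicode <= 0xFF)
        buffer += static_cast<char>(unicode & 0xFF);  // å/ä/ö survive intact
}
```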

##### Share on other sites
Quote:
 Original post by gardin: I don't have the source code right now, but I think that worked :P. If you look at www.asciitable.com you'll see that you need the full 8-bit byte to get the Scandinavian letters (I'm from Sweden, btw :P).
ASCII is, by definition, only 7-bit. Anything that appears in the 128-255 range depends on the current code page; the characters on asciitable.com appear to correspond to MS-DOS code page 437.
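So if the bitmap really is drawn in code page 437 order rather than Latin-1, the code points SDL returns have to be translated before they can index the glyph sheet. A sketch (the entries for å/ä/ö are the standard CP437 positions; the function name is made up for illustration):

```cpp
#include <cstdint>

// Translate a Latin-1/Unicode code point to its CP437 position.
// Only a few Scandinavian letters are mapped here; a real table
// would cover the rest of the 0x80-0xFF range.
int latin1ToCp437(uint16_t cp) {
    switch (cp) {
        case 0xE4: return 0x84;  // ä
        case 0xE5: return 0x86;  // å
        case 0xF6: return 0x94;  // ö
        default:   return cp < 0x80 ? static_cast<int>(cp) : '?';
    }
}
```

ASCII characters below 0x80 pass through unchanged, since CP437 and ASCII agree on that range.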