reading string from SDL input

Hi, is there a quick way to convert SDL input keys to ASCII codes? Writing code that converts each keyboard key into an ASCII code by hand seems like a long and error-prone way to do this. Moreover, the SDL documentation says it shouldn't be done, since it can cause compatibility problems between keyboard layouts. So what is the "right" way?
You can use SDL to do this. First, call "SDL_EnableUNICODE( SDL_ENABLE );". When you no longer care about the ASCII values you can disable the translation again using "SDL_EnableUNICODE( SDL_DISABLE );". Here is the helper function I use:

#include <cassert> // assert
#include <cctype>  // std::toupper
#include "SDL.h"

// Note: SDL_EnableUNICODE(1) needs to be called before the event is received by SDL.
//       This returns zero to indicate an error/not useful key,
//       otherwise it returns the character that this key represents.
char getUnicodeValue( const SDL_KeyboardEvent &key )
{
    assert( SDL_EnableUNICODE(SDL_QUERY) == SDL_ENABLE );

    // magic numbers courtesy of the SDL docs :)
    const int INTERNATIONAL_MASK = 0xFF80, UNICODE_MASK = 0x7F;

    int uni = key.keysym.unicode;

    if( uni == 0 ) // not a translatable key (like the up or down arrows)
    {
        // probably not useful as string input
        // we could optionally use this to get some value
        // for it: SDL_GetKeyName( key.keysym.sym );
        return 0;
    }
    else if( ( uni & INTERNATIONAL_MASK ) == 0 )
    {
        if( SDL_GetModState() & KMOD_SHIFT )
        {
            return static_cast<char>( std::toupper( uni & UNICODE_MASK ) );
        }
        else
        {
            return static_cast<char>( uni & UNICODE_MASK );
        }
    }
    else // we have a funky international character, one we can't represent in ASCII :(
    {
        // we could do nothing, or we can just show some sign of input, like so:
        // return '?';
        return 0;
    }
}
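For completeness, here is a minimal sketch of how the helper could be used to read a whole line of text. The readLine name, the Enter/Backspace handling and the std::string buffer are just illustration on my part, not part of the helper above; adapt it to your own event loop:

#include <string>
#include "SDL.h"

// Polls SDL events and accumulates printable characters until Enter is pressed.
std::string readLine()
{
    std::string input;
    SDL_EnableUNICODE( SDL_ENABLE ); // turn translation on before polling

    bool done = false;
    while( !done )
    {
        SDL_Event event;
        while( SDL_PollEvent( &event ) )
        {
            if( event.type == SDL_KEYDOWN )
            {
                if( event.key.keysym.sym == SDLK_RETURN )
                {
                    done = true; // finish the line on Enter
                }
                else if( event.key.keysym.sym == SDLK_BACKSPACE && !input.empty() )
                {
                    input.erase( input.size() - 1 ); // crude backspace handling
                }
                else
                {
                    char c = getUnicodeValue( event.key );
                    if( c != 0 )
                    {
                        input += c; // append the translated character
                    }
                }
            }
            else if( event.type == SDL_QUIT )
            {
                done = true;
            }
        }
        SDL_Delay( 10 ); // don't spin the CPU
    }

    SDL_EnableUNICODE( SDL_DISABLE ); // turn translation back off when done
    return input;
}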

