nullsquared

Unicode refuses to work in SDL (not the version that just came out)



So I'm making a little console thingy that toggles with the F1 key. I enabled unicode and decided I might use it to write stuff. But no characters are read! Look:

Init unicode:
void input::enable_unicode() {
    // if (!is_unicode_on()) // don't need right now
    SDL_EnableUNICODE(1);
}

input::input() {
    ::benefit::input::enable_unicode(); // HERE

    _fill_keys(m_ascii_keys);
    m_keys = NULL;

    m_quit_requested = false;
}
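
For what it's worth, the commented-out is_unicode_on() check doesn't need any extra state of my own: passing -1 to SDL_EnableUNICODE queries the current translation state without changing it. A minimal sketch of that helper (is_unicode_on is just my name for it, assuming SDL 1.2):

bool input::is_unicode_on() {
    // SDL_EnableUNICODE(-1) is a pure query: it returns the current
    // unicode translation state (0 or 1) and leaves it unchanged.
    return SDL_EnableUNICODE(-1) == 1;
}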

Convert from unicode to ascii:
char input::to_ascii(Uint16 unicode) {
    char c = '\0';
    if ((unicode & 0xFF80) == 0) // high nine bits clear => code point < 0x80, i.e. plain ASCII
        c = unicode & 0x7F;

    return c;
}
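
A couple of worked cases for that mask (hypothetical values, just to show what it accepts and rejects):

to_ascii(0x0061); // returns 'a'  -- high nine bits clear, plain ASCII passes through
to_ascii(0x20AC); // returns '\0' -- '€' has high bits set, so it's rejected
to_ascii(0x0000); // returns '\0' -- what you get when SDL did no translation at all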

Take the char and add it:
void on_keydown(SDL_keysym key) {
    switch (key.sym) {
        case SDLK_F1:
            m_console.set_focus(!m_console.has_focus());
        break;

        case SDLK_ESCAPE:
            this->benefit::input::request_quit();
        break;

        default:
        break;
    }

    if (m_console.has_focus()) {
        switch (key.sym) {
            case SDLK_BACKSPACE:
                m_console.pop_char();
            break;

            case SDLK_RETURN:
                m_console.push_line();
            break;

            default: {
                char c = benefit::input::to_ascii(key.unicode);
                if (c != '\0')
                    m_console.push_char(c); // THIS NEVER GETS CALLED!
            } break;
        }
    }
}
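
To rule everything else out, here's the whole thing boiled down to a standalone test (a sketch, assuming SDL 1.2; it just prints the raw keysym fields for every keydown):

#include <SDL/SDL.h>
#include <cstdio>

int main(int argc, char** argv) {
    SDL_Init(SDL_INIT_VIDEO);                     // must come first
    SDL_SetVideoMode(320, 240, 0, SDL_SWSURFACE); // need a window to receive key events
    SDL_EnableUNICODE(1);                         // enable translation after init

    SDL_Event e;
    bool running = true;
    while (running && SDL_WaitEvent(&e)) {
        if (e.type == SDL_QUIT)
            running = false;
        else if (e.type == SDL_KEYDOWN)
            std::printf("sym=%d unicode=0x%04X\n",
                        int(e.key.keysym.sym),
                        unsigned(e.key.keysym.unicode));
    }

    SDL_Quit();
    return 0;
}

If unicode prints as 0x0000 for letter keys here too, the field simply isn't being filled in and to_ascii() never had a chance.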

And the funky thing is that if I comment out the 'c' stuff and just push_char() an 'a' or something, it works (I get an 'a' for every key, but I want the actual unicode [ascii] character). Any ideas? And yes, I have enabled my console ;P. I get my message:
app() {
    m_quit = false;
    m_console.font.load_font("Font.bmp", 0, 0, 0);
    m_console.set_wh(VIDEO_W, int(float(VIDEO_H) * 0.5f));

    m_console.push_line("Console initialized!"); // I RECEIVE THIS!!!
    m_console.push_line();
    m_console.set_focus(true);

    m_video.initialize_window(VIDEO_W, VIDEO_H, "App!");
    m_renderer.resize_viewport(0, 0, VIDEO_W, VIDEO_H);
}

And the other funky thing is that enter does what it is supposed to, and backspace too. Just the unicode seems to not work.

I just got back from school, anyone? This is really stopping me [sad]. I *really* don't know what's wrong.

I really only need ascii; is there maybe another way besides the unicode field? (Other than switching between SDLK_a, SDLK_b, SDLK_c, etc.?)

If you just want ascii then shouldn't it be enough to cast the unicode to a char? It works fine for me anyway. I noticed that your to_ascii function is the same as in the official SDL docs, but I have a hard time right now figuring out why that logic is needed for plain ascii characters.
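
Roughly this, I mean (a sketch from memory, assuming unicode translation is actually on and `event` is the SDL_Event from your loop):

Uint16 u = event.key.keysym.unicode;
char c = (u != 0 && u < 0x80) ? char(u) : '\0'; // plain cast, drop anything non-ascii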

This also happens to me on Windows in the same situation, and also when directly casting to char; the unicode field actually comes back as 0 on my machine. However, it works fine on Mac OS X.

I ended up just directly casting the keysym.sym to char; it shows up as ASCII then but input is a bit more complicated as you must manually handle backspace.
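
Something along these lines (a sketch of the idea, not my exact code; the m_console names are borrowed from the earlier posts):

void on_keydown_sym_only(const SDL_keysym& key) {
    if (key.sym == SDLK_BACKSPACE)
        m_console.pop_char();   // no translation, so handle control keys yourself
    else if (key.sym == SDLK_RETURN)
        m_console.push_line();
    else if (key.sym >= SDLK_SPACE && key.sym <= SDLK_z)
        // printable SDLKey values coincide with their unshifted ASCII codes
        m_console.push_char(char(key.sym));
}

The catch is that you only ever get the unshifted character this way.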

Quote:
Original post by Ravuya
This also happens to me on Windows in the same situation and when directly casting to char. It's actually returning 0 on my machine. However, it works fine on Mac OS X.

I ended up just directly casting the keysym.sym to char; it shows up as ASCII then but input is a bit more complicated as you must manually handle backspace.


I already handle backspace manually, so that shouldn't be a problem... Are you sure all the SDLKeys map to chars correctly? I'll try...

So... is this a bug in SDL? Because the funny thing is that in a previous version (from at least a month's worth of work ago) it worked fine. And no, I didn't touch the unicode code at all, I just added images and animations and stuff [lol].

Thanks in advance.

Quote:
Original post by agi_shi
I already handle backspace manually, so that shouldn't be a problem... Are you sure all the SDLKeys map to chars correctly? I'll try...

Well, it works, sorta. Did you just do a map for every key that has a second version used with shift? The numbers on the top row are easy, but what about all the other keys?
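
The best I've managed so far is a hardcoded table (a sketch, US layout only, which is exactly why the unicode field would be nicer if it worked):

#include <SDL/SDL.h>
#include <cctype>

char apply_shift(char c) {
    if (!(SDL_GetModState() & KMOD_SHIFT))
        return c;
    if (c >= 'a' && c <= 'z')
        return char(std::toupper(c)); // letters just get uppercased
    // shifted versions of the US-layout symbol keys, position for position
    static const char* plain   = "1234567890-=[];'\\,./`";
    static const char* shifted = "!@#$%^&*()_+{}:\"|<>?~";
    for (const char* p = plain; *p; ++p)
        if (*p == c)
            return shifted[p - plain];
    return c;
}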
