SDL: coping with trashed GL context on SDL_SetVideoMode

8 comments, last by Revelation60 14 years, 1 month ago
Hi, I am using SDL and OpenGL as the foundation of my engine. I recently found out that calling SDL_SetVideoMode trashes the GL context. Seeing how most of my engine is done, it would be a shame to switch to another library at this stage.

The implication of this bug (I cannot imagine that this is a welcome feature) is that I cannot let my users resize their screen, so they cannot go from windowed to full screen without losing all textures and states. My question is how I can deal with this problem. I can't "just" reload all textures, because that would completely ruin the modularity of my code. My texture manager stores the texture handles, so I can call glGetTexImage to copy the textures to a buffer and then reload them. This, however, is extremely slow and ugly.

How did you guys solve this problem?
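To make the problem concrete, here is a stripped-down sketch of the kind of call that kills everything (the names are just for illustration, not my actual code):

#include <SDL.h>
#include <SDL_opengl.h>

// SDL 1.2: on some platforms (Windows in particular) SDL_SetVideoMode tears
// the window down and recreates it, so the GL context and everything in it
// (textures, display lists, buffer objects) is gone after this call returns.
void ToggleFullscreen(int width, int height, Uint32 flags)
{
    SDL_Surface* surface = SDL_SetVideoMode(width, height, 0, flags ^ SDL_FULLSCREEN);
    if (surface)
    {
        // Every GL texture ID created before this point is now invalid
        // and has to be recreated.
    }
}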
Currently working on WalledIn
Is it imperative that you allow dynamic resizing? I have played several games that simply state "you must restart for these changes to take effect". It's not so terrible.
Amateurs practice until they do it right. Professionals practice until they never do it wrong.
@The OP: Are you talking about resizing the screen, or about switching video modes (e.g. changing resolution or switching between fullscreen and windowed)?

If the latter, then I would agree that it's important that the user be able to do that while the app is running (that is, you shouldn't have to restart the app in order for the changes to take effect).

The context thing is an oft-lamented issue with SDL, but the solution is pretty straightforward, and that is to recreate dynamic resources (such as textures) as needed. You said this would ruin your code's modularity, so I take it you've got things set up in a way that's not particularly conducive to dynamic reloading of resources. Unfortunately though, that may be the best (and perhaps only) solution to the problem.

I use SDL also and haven't ever had a problem with this, but then again I set things up from the outset to make it easy to reload resources when needed.
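Just to give a rough idea (this is only a sketch with made-up names, not my actual code), each texture record keeps enough information to recreate itself after the context is lost:

#include <string>
#include <vector>
#include <SDL_opengl.h>

// Hypothetical sketch: every record remembers how it was created, so it can
// be rebuilt after SDL_SetVideoMode invalidates the context.
struct TextureRecord
{
    std::string sourceFile; // where the pixels came from
    GLuint      glId;       // current OpenGL name; changes after a reload
};

class TextureManager
{
public:
    // Walk the records and recreate each texture from its source file,
    // overwriting the stale GL names. Called right after the mode switch.
    void ReloadAll()
    {
        for (size_t i = 0; i < m_records.size(); ++i)
            m_records[i].glId = CreateFromFile(m_records[i].sourceFile);
    }

private:
    GLuint CreateFromFile(const std::string& file); // load file, glGenTextures, glTexImage2D (omitted)
    std::vector<TextureRecord> m_records;
};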
Quote:or about switching video modes (e.g. changing resolution or switching between fullscreen and windowed)?

I am, but the situation is equivalent for both resizing and video mode switching.

Quote:The context thing is an oft-lamented issue with SDL, but the solution is pretty straightforward, and that is to recreate dynamic resources (such as textures) as needed. You said this would ruin your code's modularity, so I take it you've got things set up in a way that's not particularly conducive to dynamic reloading of resources. Unfortunately though, that may be the best (and perhaps only) solution to the problem.

My current situation is that textures are loaded in the initialization phase of the game, by asking the texture manager to load from file. While it is possible to add textures in a later stage, I haven't done that yet. The texture manager stores the textures in an array, and uses the index as a unique identifier for the texture. A texture is a simple class that contains the OpenGL ID and other information.
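Roughly, it looks like this (a simplified sketch, not the exact code; the method name is just for illustration):

#include <vector>
#include <SDL_opengl.h>

// Simplified sketch of the current setup: the index into m_aList is the
// texture's public identifier; only the GL name and the dimensions are stored.
struct Texture
{
    GLuint m_nInternalID; // OpenGL texture name
    int    m_nWidth;
    int    m_nHeight;
};

class TextureManager
{
public:
    // Returns the index into m_aList, which callers keep as the handle.
    int LoadFromFile(const char* filename); // implementation omitted

    const Texture& Get(int handle) const { return m_aList[handle]; }

private:
    std::vector<Texture> m_aList;
};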

If I were to allow reloading, I would have to clear the array and somehow tell the game to reinitialize, but only the textures. I would also have to tell that to my font manager, which stores its fonts in the texture manager. As you can see, this is not something you want to do. The texture manager should know nothing about games or fonts.

My suggestion of reading the textures back from OpenGL, storing them, changing the video mode and then loading them back sounds like a better solution. But it is still horrible.
Currently working on WalledIn
Quote:My current situation is that textures are loaded in the initialization phase of the game, by asking the texture manager to load from file. While it is possible to add textures in a later stage, I haven't done that yet. The texture manager stores the textures in an array, and uses the index as a unique identifier for the texture. A texture is a simple class that contains the OpenGL ID and other information.

If I were to allow reloading, I would have to clear the array and somehow tell the game to reinitialize, but only the textures. I would also have to tell that to my font manager, which stores its fonts in the texture manager. As you can see, this is not something you want to do. The texture manager should know nothing about games or fonts.
I might not be fully understanding how your system is set up, but from what you've posted, I'm not quite clear as to what the problem is.

Say that the texture 'my_font.png' is stored in slot '5' in the texture manager array. Presumably some font uses this texture and knows its ID (5).

When it comes time to reload the texture, you don't have to clear the array; you just go through and reload each of the existing textures in turn. The OpenGL texture ID stored internally may change, but the texture object's ID (its position in the array) will not. So, the ID of '5' held by the font (for example) will still be valid.

Anyway, this is more or less how my system works (except the 'handles' are shared pointers to the texture objects), and there are no problems; references are never invalidated, and the texture manager doesn't have to know anything about the game or about any of the objects that make use of textures.
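As a hedged usage sketch (assuming a manager along the lines of my earlier sketch, with LoadFromFile() returning the slot index, Get() looking a slot up, and ReloadAll() recreating every texture in place; all names made up):

void Example(TextureManager& textureManager)
{
    int fontTexture = textureManager.LoadFromFile("my_font.png"); // say this returns 5

    // ... SDL_SetVideoMode trashes the context ...

    textureManager.ReloadAll(); // every texture is recreated in its existing slot

    // Handle 5 still refers to the same texture, even though the internal
    // GL name has changed.
    glBindTexture(GL_TEXTURE_2D, textureManager.Get(fontTexture).glId);
}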
Then you probably store more information in your texture class than I do. I just record the width, height and internal ID. The buffer used to load the texture into memory is deleted on texture creation.

It seems you store the filename, and maybe the type (GL_RGBA, GL_LUMINANCE_ALPHA, etc.). This may work if all textures are loaded from file, but my fonts are loaded from a buffer I generate when the font is loaded from a custom format. Of course, you can't tell if a texture is a font texture (and you don't want your texture manager to know the difference), so there is no way to recreate such a texture without explicitly loading the font again, which generates a different ID.

Another solution would be to not delete the buffers. I'd hate to waste memory like that, just for this.
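Hypothetically, something like this (trading memory for reload safety):

#include <vector>
#include <SDL_opengl.h>

// Hypothetical variant: keep the original pixel data alive for the lifetime
// of the texture, so recreating it after a mode switch is trivial.
struct Texture
{
    GLuint            m_nInternalID;
    int               m_nWidth;
    int               m_nHeight;
    std::vector<char> m_pixels; // never freed; re-uploaded after a context loss
};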
Currently working on WalledIn
Ok, I see. In that case, I'm afraid I really don't have any suggestions other than to re-arrange things so that these resources can be re-loaded and/or re-created more easily (I know that's not very helpful).

There's been quite a bit of discussion of this issue over on the SDL forums; if you search the archives over there, you might be able to find some info on the current status of this issue, and maybe also some other ideas about how to go about solving the problem.
I wrote a solution :) It involves copying the texture to a temporary buffer before the context gets destroyed.

void Renderer::ToggleFullScreen()
{
    Singleton<TextureManager>::Instance()->SaveToBuffer(); // save textures

    // set dimensions to desktop dimensions
    m_pSurface = SDL_SetVideoMode(m_nDesktopWidth, m_nDesktopHeight, 0, m_unFlags ^ SDL_FULLSCREEN);

    if (!m_pSurface)
        m_pSurface = SDL_SetVideoMode(0, 0, 0, m_unFlags);
    else
        m_unFlags ^= SDL_FULLSCREEN;

    InitGl(); // reload OpenGL

    Singleton<TextureManager>::Instance()->LoadFromBuffer(); // load textures
}

void TextureManager::SaveToBuffer()
{
    for (int i = 0; i < m_aList.size(); i++)
    {
        glBindTexture(GL_TEXTURE_2D, m_aList[i].m_nInternalID);
        m_aList[i].m_pBuffer = (char*)malloc(m_aList[i].m_nWidth * m_aList[i].m_nHeight * 4); // RGBA -> 4 bytes
        glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, m_aList[i].m_pBuffer);
    }
}

/* Assumes SaveToBuffer was called first */
void TextureManager::LoadFromBuffer()
{
    for (int i = 0; i < m_aList.size(); i++)
    {
        glGenTextures(1, &m_aList[i].m_nInternalID);
        glBindTexture(GL_TEXTURE_2D, m_aList[i].m_nInternalID);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST); // new texture, so set parameters
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, m_aList[i].m_nWidth, m_aList[i].m_nHeight, 0,
            GL_RGBA, GL_UNSIGNED_BYTE, m_aList[i].m_pBuffer);
        delete [] m_aList[i].m_pBuffer;
    }
}


FYI: all my textures are stored as GL_RGBA.

I hope this code helps others as well!
Currently working on WalledIn
You shouldn't mix malloc() and delete [] like that. Either use malloc() and free(), or new [] and delete[].

Also, you could use local state rather than having a member for a temporary buffer in your list (consider: your current code leaves a dangling pointer in each texture object after the reload, which is not a great idea):
// Somewhere:
typedef std::vector<char> Buffer;
typedef std::vector<Buffer> BufferList;

void Renderer::ToggleFullScreen()
{
    BufferList buffers;
    Singleton<TextureManager>::Instance()->SaveToBuffer(buffers); // save textures
    // ...
    Singleton<TextureManager>::Instance()->LoadFromBuffer(buffers); // load textures
}

void TextureManager::SaveToBuffer(BufferList &buffers)
{
    buffers.resize(m_aList.size());
    for (int i = 0; i < m_aList.size(); i++)
    {
        glBindTexture(GL_TEXTURE_2D, m_aList[i].m_nInternalID);

        Buffer &buffer = buffers[i];
        buffer.resize(m_aList[i].m_nWidth * m_aList[i].m_nHeight * 4);
        glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, &buffer.front());
    }
}

// LoadFromBuffer() would be similar.

This also neatly ensures that any memory allocated is cleanly deallocated.
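For completeness, a possible LoadFromBuffer() along the same lines (same assumption as the original: everything is GL_RGBA):

void TextureManager::LoadFromBuffer(const BufferList &buffers)
{
    for (int i = 0; i < m_aList.size(); i++)
    {
        const Buffer &buffer = buffers[i];

        glGenTextures(1, &m_aList[i].m_nInternalID);
        glBindTexture(GL_TEXTURE_2D, m_aList[i].m_nInternalID);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
            m_aList[i].m_nWidth, m_aList[i].m_nHeight, 0,
            GL_RGBA, GL_UNSIGNED_BYTE, &buffer.front());

        // No delete here: the vectors free themselves when 'buffers' goes out
        // of scope back in ToggleFullScreen().
    }
}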
The malloc/delete thing was stupid; I thought I had new'ed it... Storing the buffers in the function itself is probably a good idea, thanks!

For users interested in using this code: please note that the internal format I use is always GL_RGBA. If you want to use different formats, remember to change the buffer size, the format in glGetTexImage and both formats in glTexImage2D.
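As a rough sketch of the sizing part (a hypothetical helper, only a few formats covered; adjust it to whatever you actually use):

// Hypothetical helper: bytes per pixel for a given format, used when sizing
// the readback buffer; the format itself still has to be passed to
// glGetTexImage and glTexImage2D.
size_t BytesPerPixel(GLenum format)
{
    switch (format)
    {
    case GL_RGBA:            return 4;
    case GL_RGB:             return 3;
    case GL_LUMINANCE_ALPHA: return 2;
    case GL_LUMINANCE:       return 1;
    default:                 return 4; // worst case
    }
}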

Currently working on WalledIn

