Knowing whether or not SDL_SetVideoMode destroyed the OpenGL context
Members - Reputation: 982
Posted 16 December 2009 - 04:53 AM
Moderators - Reputation: 8515
Posted 16 December 2009 - 08:50 AM
The latest 1.2 build *should* have this fix; you can probably just drop in the new SDL.dll and see whether it still happens as often.
As jyk alluded to, there is some discussion on the mailing list (Brian = me) about adding an event or other notification mechanism when the context is lost. AFAIK there is no implementation work done on this yet.
Members - Reputation: 982
Posted 16 December 2009 - 12:51 PM
Future features of SDL are great, but this program is supposed to be running right now!
I'm not using SDL 1.3 yet simply because it's not finished (I'll be excited when it is, though it will also mean some porting work).
A related question: what would happen if the GL context is destroyed, a new one is created in its place, and I then call "glDeleteTextures" on textures that were created in the previous context? I mean, to prevent a memory leak in the case where the context was NOT destroyed, I'd have to use glDeleteTextures. But to be honest, I'm not entirely sure how the relationship between these contexts, the SDL surface, the graphics card and OpenGL works, so please excuse me if the question makes no sense...
Might there be some "gl" function calls to get the info I need (whether or not the current context is the same one as before)?
Members - Reputation: 122
Posted 03 January 2010 - 11:00 AM
Original post by Lode
A related question: what would happen if the GL context is destroyed, a new one is created instead, and I'd do "glDeleteTextures" on some textures that were created on that previous context?
Use glIsTexture on a texture ID to find out whether the current state machine holds a texture object bound to that ID. If the context was destroyed, though, the state machine shouldn't be holding any of the old texture IDs. I'm not sure what calling glDeleteTextures on an ID that doesn't exist in the state machine would do, but if you plan on going that route, at least check the ID with glIsTexture first.
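[Editor's note: a minimal sketch of the check described above. The helper name safe_delete_texture is mine, not SDL's or OpenGL's; in real code you would simply #include <SDL/SDL_opengl.h> instead of the inline declarations, which are written out here only so the sketch stands alone. For what it's worth, the OpenGL specification says glDeleteTextures silently ignores names that do not correspond to existing textures, so the glIsTexture check is belt-and-braces rather than strictly required.]

```c
/* Minimal GL declarations so this sketch stands alone;
   in real code, #include <SDL/SDL_opengl.h> instead. */
typedef unsigned int GLuint;
typedef int GLsizei;
typedef unsigned char GLboolean;
#define GL_FALSE 0
extern GLboolean glIsTexture(GLuint texture);
extern void glDeleteTextures(GLsizei n, const GLuint *textures);

/* Delete a texture only if the current context still knows its name.
   If SDL_SetVideoMode destroyed the old context, glIsTexture reports
   GL_FALSE for every old name and the delete is skipped.
   Returns 1 if the texture was deleted, 0 if the name was unknown. */
int safe_delete_texture(GLuint id)
{
    if (glIsTexture(id) == GL_FALSE)
        return 0;               /* name unknown to this context: nothing to free */
    glDeleteTextures(1, &id);
    return 1;
}
```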
(Although, it might be that in the case of Windows losing the context, the texture is "corrupted" but still maintained with an ID. I haven't tested it personally so I can't say.)
Also, in response to the original post... couldn't you just play it safe and add some code to delete and reload all your textures just in case? Or do you really want to take advantage of that extra little bit of efficiency on Linux?
I mean, I know it's a bit of a pain, but unless you're planning on sticking it out until a polished 1.3 release, it's probably best to follow the advice given in this article.
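[Editor's note: if you do go the play-it-safe route, the only tricky part is bookkeeping — remembering, for every texture, enough information to recreate it after SDL_SetVideoMode. A minimal sketch of such a registry; the TexRecord structure and function names are mine, and load_fn stands in for whatever "load file, upload to GL, return new texture name" routine your program already has.]

```c
#include <stdlib.h>
#include <string.h>

/* One record per texture: the current GL name plus whatever is
   needed to recreate it (here, just a file path). */
typedef struct {
    unsigned int gl_id;
    char path[256];
} TexRecord;

static TexRecord *records = NULL;
static size_t record_count = 0;

/* Remember a texture so it can be recreated after a context loss. */
void registry_add(unsigned int gl_id, const char *path)
{
    records = realloc(records, (record_count + 1) * sizeof(TexRecord));
    records[record_count].gl_id = gl_id;
    strncpy(records[record_count].path, path,
            sizeof(records[record_count].path) - 1);
    records[record_count].path[sizeof(records[record_count].path) - 1] = '\0';
    record_count++;
}

/* After SDL_SetVideoMode, walk the registry and recreate every texture.
   load_fn is your real "load file -> new GL texture name" routine. */
void registry_reload_all(unsigned int (*load_fn)(const char *path))
{
    for (size_t i = 0; i < record_count; i++)
        records[i].gl_id = load_fn(records[i].path);
}
```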
[Edited by - cmc5788 on January 3, 2010 5:00:11 PM]