Knowing whether or not SDL_SetVideoMode destroyed the OpenGL context



#1 Lode   Members   -  Reputation: 980

Posted 16 December 2009 - 04:53 AM

For SDL 1.2: on Windows, SDL_SetVideoMode destroys the OpenGL context; on Linux it does not. So on Windows all textures have to be uploaded to the video card again, while on Linux they don't. In fact, re-uploading them on Linux would leak video RAM, because the old textures are never deleted first (when the context is destroyed they get deleted automatically).

Currently I use an "#if defined(_WIN32)" to detect the difference. But is there a more portable way to find out from SDL whether it destroyed the OpenGL context? Is there a way to get a pointer or handle to "the OpenGL context" and check whether it changed? Or an SDL function along the lines of "SDL_DeletesOpenGLContextOnVideoResizeOnCurrentPlatform"? (I didn't find any.) The SDL_Surface* returned by SDL_SetVideoMode turns out to remain the same, so that pointer doesn't represent the OpenGL context.
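A minimal sketch of the platform-specific workaround described above (reloadAllTextures is a hypothetical helper that re-uploads all image data to the current context):

#include <SDL/SDL.h>

// Hypothetical helper: regenerates all textures (glGenTextures +
// glTexImage2D) in the current GL context from the original image data.
void reloadAllTextures();

SDL_Surface* resizeWindow(int w, int h)
{
    SDL_Surface* screen = SDL_SetVideoMode(w, h, 32, SDL_OPENGL | SDL_RESIZABLE);
#if defined(_WIN32)
    // On Windows, SDL 1.2 recreates the window and with it the GL context,
    // so everything stored on the video card must be uploaded again.
    reloadAllTextures();
#endif
    return screen;
}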


#2 scgames   Members   -  Reputation: 1977

Posted 16 December 2009 - 08:27 AM

I don't know the answer, but I know this has been discussed fairly recently on the SDL forums (with respect to 1.3, I believe), so you might try doing a search over there, if you haven't already.

#3 rip-off   Moderators   -  Reputation: 8113

Posted 16 December 2009 - 08:50 AM

What version of SDL 1.2 do you have? I recall this being an issue a few months ago, and there was a patch to prevent context loss where possible (I think it still occurs when switching to and from fullscreen).

The latest 1.2 build *should* have this fix; you can probably just drop in the new SDL.dll and see if it still happens as often.
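For reference, a minimal sketch (using SDL 1.2's version API) of how to check at runtime which SDL build you are actually linked against, which helps when swapping SDL.dll:

#include <SDL/SDL.h>
#include <cstdio>

void printSdlVersions()
{
    SDL_version compiled;
    SDL_VERSION(&compiled);  // version of the headers compiled against

    // Version of the SDL library actually loaded at runtime.
    const SDL_version* linked = SDL_Linked_Version();

    printf("compiled against SDL %d.%d.%d, running with SDL %d.%d.%d\n",
           compiled.major, compiled.minor, compiled.patch,
           linked->major, linked->minor, linked->patch);
}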

As jyk alluded to, there is some discussion on the mailing list (Brian = me) about adding an event or other notification mechanism when the context is lost. AFAIK there is no implementation work done on this yet.

#4 Lode   Members   -  Reputation: 980

Posted 16 December 2009 - 12:51 PM

I'm using the latest version of SDL 1.2 on Linux at least (sdl-1.2.14); on Windows probably not, but I can try that next time I'm booted into it. But that only makes it worse! If different versions of SDL 1.2 may or may not delete the OpenGL context on Windows, how can I ever know what's going to happen and whether the textures need to be recreated? I was counting on being able to rely on the predictable "win32 = they get deleted, linux = they don't".

Future features of SDL are great, but this program is supposed to be running right now!

I'm not using SDL 1.3 yet simply because it isn't finished yet (I'll be excited when it is, though it will also mean some work).

A related question: what would happen if the GL context is destroyed, a new one is created in its place, and I then call "glDeleteTextures" on textures that were created in the previous context? I mean, to prevent a memory leak in case the context was NOT destroyed, I'd have to use glDeleteTextures. But to be honest, I'm not entirely sure how the relationship between these contexts, the SDL surface, the graphics card and OpenGL works, so please excuse me if the question makes no sense...

Might there be some "gl" function calls to get the information I need (whether or not the current context is the same one as before)?

#5 cmc5788   Members   -  Reputation: 122

Posted 03 January 2010 - 11:00 AM

Quote:
Original post by Lode
A related question: what would happen if the GL context is destroyed, a new one is created in its place, and I then call "glDeleteTextures" on textures that were created in the previous context?


Use glIsTexture on a texture ID to find out whether or not the current state machine is holding a texture object bound to the ID. If the context is destroyed, though, the state machine shouldn't be holding any old texture IDs. I'm not sure what calling glDeleteTextures on an ID that doesn't exist in the state machine would do, but if you plan on going that route, at least check it with glIsTexture before trying it.

(Although, it might be that in the case of Windows losing the context, the texture is "corrupted" but still maintained with an ID. I haven't tested it personally so I can't say.)
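For illustration, a minimal sketch of that check, with the above caveat in mind (canaryTexture is a hypothetical texture generated and bound once at startup):

#include <SDL/SDL_opengl.h>

extern GLuint canaryTexture;  // hypothetical: generated and bound at startup

// If the current context no longer recognizes the canary ID, the
// context was most likely destroyed and recreated.
bool contextWasRecreated()
{
    return glIsTexture(canaryTexture) == GL_FALSE;
}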

Also, in response to the original post... couldn't you just play it safe and add some code to delete and reload all your textures just in case? Or do you really want to take advantage of that extra little bit of efficiency on Linux?
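A minimal sketch of that safe approach (textureIds and reloadAllTextures are hypothetical; per the GL spec, glDeleteTextures silently ignores names that don't correspond to existing textures, so stale IDs are harmless here):

#include <SDL/SDL_opengl.h>
#include <vector>

extern std::vector<GLuint> textureIds;  // hypothetical: every ID we created
void reloadAllTextures();               // hypothetical: re-uploads pixel data

void recreateTexturesAfterModeChange()
{
    // If the context survived, this frees the old copies (no VRAM leak);
    // if it was recreated, the stale IDs are simply ignored.
    if (!textureIds.empty())
        glDeleteTextures((GLsizei)textureIds.size(), &textureIds[0]);
    textureIds.clear();
    reloadAllTextures();
}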

I mean, I know it's a bit of a pain, but unless you're planning on sticking it out until a polished 1.3 release, it's probably best to follow the advice given in this article.

[Edited by - cmc5788 on January 3, 2010 5:00:11 PM]



