SDL_image to textured quad

Started by
29 comments, last by icecubeflower 15 years, 9 months ago
What? Sorry, but you need to do a little more research on this topic.

When he deletes the memory for the SDL surface he is releasing the CPU-side memory that was used to load the image from the file. You don't need that pointer anymore because the data is uploaded to the video card when you call glTexImage2D(), and through the texture object you bind with glBindTexture() you can use that image again. So there is no need to keep around the data you used to load the images, unless you need to reload the textures because you lose your rendering context (RC).
Okay, so it's loaded into video memory.

I used to pass a member of my TextureImage struct to glBindTexture like this:

glBindTexture(GL_TEXTURE_2D, whatever.texID);

texID is a GLuint

You're telling me I can free my pointer because the image is in video memory. But then what do I pass to glBindTexture to make it bind that same image again?

Is that why you guys use a class and I only have a function that is not part of a class? I mean do you guys build your Texture class to hold lots of images and keep a GLuint for each new texture to find it in video memory again?

I guess I'll do some reading, I don't really know where to start.
Yes, you keep the GLuint variable around and use it to access the texture with a call to glBindTexture(). Most people use a texture manager, but if your project is small enough you can just use an array with constants defined to keep track.

enum {GRASS, DIRT, WATER, SKY, TOTAL_TEXTURES};
GLuint textures[TOTAL_TEXTURES] = {0};

// load textures and set up texture parameters
glGenTextures(1, &textures[GRASS]);
glBindTexture(GL_TEXTURE_2D, textures[GRASS]);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexImage2D(...); // upload the grass image

// load next
glGenTextures(1, &textures[DIRT]);
glBindTexture(GL_TEXTURE_2D, textures[DIRT]);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexImage2D(...); // upload the dirt image

// now draw textured polygons
glBindTexture(GL_TEXTURE_2D, textures[GRASS]);
// draw some terrain with grass
glBindTexture(GL_TEXTURE_2D, textures[SKY]);
// draw sky


HTH
Okay, thanks. So we free that pointer ourselves. We keep a GLuint variable around and use it to access the image in video memory whenever we want.

So... who frees it from video memory?
The driver will. IIRC it happens when you call glDeleteTextures, but I am not 100% sure on that.
Uh... I've sort of like NEVER called glDeleteTextures so am I leaking video memory all over the place?
Quote:Original post by icecubeflower
Uh... I've sort of like NEVER called glDeleteTextures so am I leaking video memory all over the place?

No, it's called for you when you delete the OpenGL context. In SDL, the context is deleted whenever you call SDL_FreeSurface on the VideoMode surface (AKA, 'screen' or whatever you name it). Because SDL actually calls SDL_FreeSurface(screen) for you, you don't even need to do that.

All you need to do is close SDL when you are done, using SDL_Quit. SDL_Quit will call SDL_FreeSurface(myScreenSurface), along with other things, which will call glDeleteTextures or something equivalent.

It's important to note that since deleting the VideoMode surface deletes all your OpenGL textures, if you resize the SDL screen you'll need to reload all your images. (Or else don't free the loaded SDL_Surface, keep a copy in non-video-card memory, and just call glTexImage2D on them again.) I keep a std::string with the filename of each image I load, and reload them whenever I resize the screen.
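One way to sketch that filename-remembering idea in plain C (all names here are hypothetical, and the actual GL re-upload is stubbed out as a comment):

```c
#include <stdio.h>
#include <string.h>

#define MAX_TEXTURES 64

/* Remember each texture's source file so every image can be
   re-created after the video surface (and with it the GL context
   and its textures) is recreated, e.g. on a window resize. */
static unsigned int texIDs[MAX_TEXTURES];   /* GLuint in real code */
static char texFiles[MAX_TEXTURES][256];
static int texCount = 0;

/* Record a texture's source file; returns its slot, or -1 if full. */
int rememberTexture(const char *filename)
{
    if (texCount >= MAX_TEXTURES)
        return -1;
    strncpy(texFiles[texCount], filename, 255);
    texFiles[texCount][255] = '\0';
    return texCount++;
}

/* After SDL_SetVideoMode is called again, walk the list and
   re-create each texture from its remembered filename. */
void reloadAllTextures(void)
{
    for (int i = 0; i < texCount; ++i) {
        /* texIDs[i] = ...; re-upload texFiles[i] to the card here */
        printf("reloading %s\n", texFiles[i]);
    }
}
```

The point is only that the filenames survive the context loss; how the re-upload itself is done is up to your loader.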

(Note: I am still a beginner at OpenGL, so I might be wrong here)
What if while my program is running the player leaves one map and goes to another and does that a lot of times and lots and lots of images are loaded into video memory? What if all the video memory is taken? From what you told me I will run out of video memory because it won't be freed until SDL_Quit is called and that doesn't happen until the player closes the program.

And what if the program crashes or the user exits the program in some way he's not supposed to? Then is all that video memory lost until the computer is rebooted?
Quote:Original post by icecubeflower
What if while my program is running the player leaves one map and goes to another and does that a lot of times and lots and lots of images are loaded into video memory? What if all the video memory is taken?
This is why you have the option of doing it yourself via glDeleteTextures. If your maps aren't too large, you'd probably want to load only one map's textures, or only the nearby maps' textures, when a map is loaded, and free all unnecessary textures when the map is freed. If all maps share the same textures, and you don't have many textures, just load them all at the start of your game, and free them, or let SDL free them, at the end of your program.

glDeleteTextures frees the texture(s) associated with the array of GLuint you pass it, from the video card. If you want to delete one texture from the video card, just do this: glDeleteTextures(1, &myTextureID). If you want to delete an array, do this: glDeleteTextures(lengthOfTheArray, myTextureIDArray).
Quote:Original post by Servant of the Lord
Quote:Original post by icecubeflower
Uh... I've sort of like NEVER called glDeleteTextures so am I leaking video memory all over the place?

No, it's called for you when you delete the OpenGL context. In SDL, the context is deleted whenever you call SDL_FreeSurface on the VideoMode surface (AKA, 'screen' or whatever you name it). Because SDL actually calls SDL_FreeSurface(screen) for you, you don't even need to do that.

All you need to do is close SDL when you are done, using SDL_Quit. SDL_Quit will call SDL_FreeSurface(myScreenSurface), along with other things, which will call glDeleteTextures or something equivalent.


Just to clear up any confusion, you can *never* call SDL_FreeSurface() on the video surface. The video surface is a special one which can be freed by either calling SDL_QuitSubSystem(SDL_INIT_VIDEO) or SDL_Quit(). While logically you can think of these two as including the equivalent to a call to SDL_FreeSurface() on the video surface, this doesn't mean that this is precisely what happens.

Quote:
It's important to note, that since deleting the VideoMode surface deletes all your OpenGL textures, if you resize the SDL screen, you'll need to reload all your images again.


I believe this was actually a bug, and I have vague recollections of it being fixed. However, your advice still holds if the bug hasn't been fixed, or you are unable to obtain a version of SDL.dll that contains the fix.

