glGenTextures() memory problem.


I have a while loop inside a display list that looks something like this:
while (i < l)
{
  glGenTextures(1, &tex);
  ...
  glDeleteTextures(1, &tex);
}
I thought that when glDeleteTextures() is called it frees all the memory glGenTextures() allocates. Why am I getting memory leaks when I run this while loop? The code in between isn't causing the problem, because when I comment out the gl functions the leak goes away. However, I need to generate a texture, so I can't just leave them commented out.
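Roughly the lifetime I was assuming (just a stand-alone sketch with placeholder sizes, not the real glyph code): the name comes from glGenTextures(), the actual storage from glTexImage2D(), and glDeleteTextures() should release both, which glGetError()/glIsTexture() can confirm:

GLuint tex = 0;
glGenTextures(1, &tex);                 // reserves a texture name only; no image storage yet
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 64, 64, 0,
             GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, NULL);   // image storage is created here
GLenum err = glGetError();              // expect GL_NO_ERROR after the upload
glDeleteTextures(1, &tex);              // should release the name and its storage
GLboolean alive = glIsTexture(tex);     // expect GL_FALSE once the name is freed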

What exactly is the leak from? glGenTextures() just creates a texture ID; to my knowledge no memory is even allocated, it simply returns a unique, unused number. (In my programs I don't even use glGenTextures(), I simply create a unique ID myself.)

You claim it's not your code creating the memory leak, but it'd be a good idea to show us the code anyway. Also, how do you know it's creating a memory leak? Does it pop up an exception, or are you using some memory manager (e.g. mmgr)?
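To illustrate what I mean (rough sketch, not from your code): name generation on its own doesn't involve any image memory, storage only appears once you actually upload with glTexImage2D(), so a loop like this shouldn't grow memory at all:

for (int n = 0; n < 10000; ++n)
{
    GLuint id;
    glGenTextures(1, &id);     // hands back an unused texture name
    glDeleteTextures(1, &id);  // marks that name as unused again
}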

~Chris

I'm looking at the program's memory usage and it goes up by about 500 KB a second, consistently. When I comment out those lines it doesn't go up, so I didn't see much need to post the code, especially since it's sloppily written and awfully inefficient. But here it is:


int ft_render_glyph(char c, GLuint texture, int ox, int oy)
{
    FT_UInt glyph_index = FT_Get_Char_Index(ft_face, c);

    FT_UInt error = FT_Load_Glyph(ft_face, glyph_index, FT_LOAD_DEFAULT);
    if (error)
    {
        return 0;
    }

    error = FT_Render_Glyph(ft_face->glyph, FT_RENDER_MODE_NORMAL);
    if (error)
    {
        return 0;
    }

    // Round the glyph bitmap up to power-of-two dimensions.
    int width  = power2(ft_face->glyph->bitmap.width);
    int height = power2(ft_face->glyph->bitmap.rows);

    // Expand the 8-bit glyph bitmap into a luminance/alpha buffer,
    // padding with zeroes outside the glyph.
    GLubyte* expanded_data = new GLubyte[2 * width * height];

    for (int j = 0; j < height; j++)
    {
        for (int i = 0; i < width; i++)
        {
            expanded_data[2*(i + j*width)] = expanded_data[2*(i + j*width) + 1] =
                (i >= ft_face->glyph->bitmap.width || j >= ft_face->glyph->bitmap.rows) ?
                    0 : ft_face->glyph->bitmap.buffer[i + ft_face->glyph->bitmap.width*j];
        }
    }

    // Upload the expanded bitmap into the texture object.
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, expanded_data);

    delete [] expanded_data;

    glTranslatef(ft_face->glyph->bitmap_left, 0, 0);
    glTranslatef(0, -ft_face->glyph->bitmap_top, 0);

    float x = (float)ft_face->glyph->bitmap.width / (float)width;
    float y = (float)ft_face->glyph->bitmap.rows  / (float)height;

    render_texture(texture, ox + ft_face->glyph->bitmap_left, oy + y + 16, width, height);

    // Advance is in 1/64 pixel units, so shift down to whole pixels.
    return ft_face->glyph->advance.x >> 6;
}

glGenTextures(1, &tex);
o += ft_render_glyph(str[i], tex, o + x, y);
glDeleteTextures(1, &tex);


glGenTextures() causing the memory to go up confuses me as much as it does you. Does anyone have an idea?
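For what it's worth, the alternative would be to do the upload once up front and reuse the textures, roughly like this (just a sketch; build_glyph_texture() is hypothetical shorthand for the FT_Load_Glyph/glTexImage2D half of ft_render_glyph() above):

GLuint glyph_tex[128];
glGenTextures(128, glyph_tex);                  // names for every ASCII glyph, created once
for (int c = 0; c < 128; ++c)
    build_glyph_texture((char)c, glyph_tex[c]); // upload each glyph bitmap once at init

// per frame / per string: just bind and draw the cached textures,
// with no glGenTextures / glTexImage2D calls in the loop

glDeleteTextures(128, glyph_tex);               // released once at shutdown

But even then, I don't understand why the per-call gen/delete version should leak in the first place.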
