glRasterPos3f linked to memory leak... possible?

5 comments, last by Dave Eberly 12 years, 9 months ago
Hi,

I have an OpenGL application, and was wondering why it kept allocating memory during the whole execution time, even when idle. Running in debug mode didn't help because it produced no memory-leak warnings. However, when I look at my application's memory usage in the Windows Task Manager, it climbs and climbs. I could at least narrow down the cause: commenting out all "glRasterPos3f" calls in my application stops the memory usage from climbing (as seen in the Task Manager).

This seemed very odd to me, and googling for memory leaks together with glRasterPos3f didn't turn up any related topic. Is it possible that my version of OpenGL leaks memory? How can I fix the problem or test it further? Has anyone else had similar problems?
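One way to go beyond eyeballing Task Manager is to log the process working set from inside the render loop, so memory growth can be correlated with specific GL calls being enabled or disabled. A minimal Windows-only sketch (the function name `log_memory_usage` is just an illustration; `GetProcessMemoryInfo` is the real PSAPI call):

```c
/* Windows-only sketch: sample the process working set so that leaks can be
 * measured per-frame instead of watched in Task Manager. Link with psapi.lib. */
#include <windows.h>
#include <psapi.h>
#include <stdio.h>

static void log_memory_usage(void)
{
    PROCESS_MEMORY_COUNTERS pmc;
    if (GetProcessMemoryInfo(GetCurrentProcess(), &pmc, sizeof(pmc)))
        printf("working set: %lu KB\n",
               (unsigned long)(pmc.WorkingSetSize / 1024));
}
```

Calling this, say, once per second while toggling the suspect glRasterPos3f calls would show whether the growth rate actually changes.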

Thanks for any insight!
http://www.opengl.org/wiki/FAQ

or more specifically
http://www.opengl.org/wiki/FAQ#Memory_Usage
Sig: http://glhlib.sourceforge.net
an open source GLU replacement library. Much more modern than GLU.
float matrix[16], inverse_matrix[16];
glhLoadIdentityf2(matrix);
glhTranslatef2(matrix, 0.0, 0.0, 5.0);
glhRotateAboutXf2(matrix, angleInRadians);
glhScalef2(matrix, 1.0, 1.0, -1.0);
glhQuickInvertMatrixf2(matrix, inverse_matrix);
glUniformMatrix4fv(uniformLocation1, 1, FALSE, matrix);
glUniformMatrix4fv(uniformLocation2, 1, FALSE, inverse_matrix);
Thanks for the info V-Man.

Actually, I also noticed other "funny" things. The following code (in my OpenGL drawing code) causes memory to be allocated until an out-of-memory condition occurs (after a couple of hours). The allocation is steady and tied to the framerate:

[source]
glBegin(GL_QUADS);
glTexCoord2f(texCorners[0], texCorners[1]);
glVertex3i(c0[0], c0[1], 0);
glTexCoord2f(texCorners[0], texCorners[3]);
glVertex3i(c0[0], c1[1], 0);
glTexCoord2f(texCorners[2], texCorners[3]);
glVertex3i(c1[0], c1[1], 0);
glTexCoord2f(texCorners[2], texCorners[1]);
glVertex3i(c1[0], c0[1], 0);
glEnd();
[/source]

The above code does what it is intended to do (it displays a textured quad). But when I replace it with a display list built from the same instructions, the constant memory allocation disappears. Isn't that a bit strange? So I solved one part of my constant memory allocation by using a display list! I am aware that the other OpenGL instructions around this piece of code also play a role, but it still surprises me a lot.
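For reference, the display-list workaround described above can be sketched like this: compile the quad once at setup, replay it with glCallList each frame, and free it on shutdown. The variable names (`texCorners`, `c0`, `c1`) follow the snippet above; everything else uses the standard display-list API.

```c
/* One-time setup: record the quad into a display list. */
GLuint quadList = glGenLists(1);
glNewList(quadList, GL_COMPILE);
glBegin(GL_QUADS);
glTexCoord2f(texCorners[0], texCorners[1]);
glVertex3i(c0[0], c0[1], 0);
glTexCoord2f(texCorners[0], texCorners[3]);
glVertex3i(c0[0], c1[1], 0);
glTexCoord2f(texCorners[2], texCorners[3]);
glVertex3i(c1[0], c1[1], 0);
glTexCoord2f(texCorners[2], texCorners[1]);
glVertex3i(c1[0], c0[1], 0);
glEnd();
glEndList();

/* Per frame: replay the recorded commands. */
glCallList(quadList);

/* On shutdown: release the list. */
glDeleteLists(quadList, 1);
```

Note this only works if the quad's coordinates don't change every frame, since the list is baked at compile time.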
That sounds more like a driver problem than an OpenGL problem. I'd encourage you to check your driver version, see if an update is available, and then check the readme for the update to see what bugs are fixed.

Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.

Is this another thread about a crappy Intel integrated graphics solution?
Thanks mhagain!

That was the reason! After updating my driver, I have no problems anymore...
I have an ATI Mobility Radeon HD 5470

Cheers



Had the same problem at work, on both desktop and notebook machines, and also when using DX11. The latest drivers worked fine.

This topic is closed to new replies.
