OpenGL leaking memory?

19 comments, last by McZ 18 years, 3 months ago
I just checked in the Task Manager and saw that my graphics engine's memory usage increases by 4 KB every second. But I can't find any code of mine that leaks (I have no new/delete in the render loop). I commented out almost everything in the render loop, so now it looks like this:

void CEngineCore::execute()
{
	MSG msg;
	while(1)
	{
		if(PeekMessage(&msg, NULL, 0, 0, PM_REMOVE))
		{
			// Forward pending window messages
			TranslateMessage(&msg);
			DispatchMessage(&msg);

			if(msg.message == WM_QUIT)
				break;
		}
		else
		{
			// No messages pending: render a single frame
			drawFrame();
		}
	}
}

// This is all I have in drawFrame, the rest is commented out.
void CEngineCore::drawFrame()
{
	// Start new frame
	m_pSurface->beginFrame();


	// ... code here commented out

	// end frame
	m_pSurface->endFrame();
}

void CWindowSurfaceGL::beginFrame()
{
	glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );
	glMatrixMode(GL_MODELVIEW);
	glLoadIdentity();
}

void CWindowSurfaceGL::endFrame()
{
	SwapBuffers(m_hDC);
}

This is all my engine does in the render loop, but it still increases memory usage by 4 KB every second. If I comment out beginFrame() and endFrame(), it doesn't eat my memory.
I once had a very dumb OpenGL memory leak: every time I drew a texture, I uploaded it to video memory again (even though it was already there). After a while memory got full because the same texture was in there many, many times, and the app became extremely slow, then crashed. Make sure you don't make the same mistake :)
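
To illustrate the mistake described above, here is a minimal hypothetical sketch (the names g_texture, pixels, width and height are placeholders, not from any engine in this thread; it assumes <GL/gl.h> is included like the other snippets here). The texture object is created and uploaded once at load time, and the per-frame code only binds it:

GLuint g_texture = 0;

// Correct: create and upload the texture once, e.g. at startup.
void createTextureOnce(const unsigned char* pixels, int width, int height)
{
	glGenTextures(1, &g_texture);
	glBindTexture(GL_TEXTURE_2D, g_texture);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
	glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
	             GL_RGBA, GL_UNSIGNED_BYTE, pixels);
}

// Per frame: only bind the existing texture object. Calling glGenTextures
// or glTexImage2D again here is exactly the leak being described.
void drawTextured()
{
	glBindTexture(GL_TEXTURE_2D, g_texture);
	// ... submit geometry ...
}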
I haven't implemented textures yet; I just restarted work on my new engine. The only thing I draw (and it's currently commented out) is a vertex array with 3 vertices defining a single triangle, no texture and no lights.
Do you create the VA every frame?
No, I don't create the VA every frame, only once at startup (I made that mistake once before).
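
For reference, the "create once, draw every frame" pattern being discussed looks roughly like this with classic client-side vertex arrays (a hypothetical sketch; the array name and vertex values are made up):

// Filled once at startup; no allocation happens per frame.
static GLfloat g_triangle[9] = {
	-1.0f, -1.0f, 0.0f,
	 1.0f, -1.0f, 0.0f,
	 0.0f,  1.0f, 0.0f
};

// Called every frame: just points OpenGL at the existing data and draws.
void drawTriangle()
{
	glEnableClientState(GL_VERTEX_ARRAY);
	glVertexPointer(3, GL_FLOAT, 0, g_triangle);
	glDrawArrays(GL_TRIANGLES, 0, 3);
	glDisableClientState(GL_VERTEX_ARRAY);
}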
I had the same problem at work some time ago. The memory used by our application was slowly growing over time. We found out that simply calling glClear(0) after swapping buffers solved the problem. However, we didn't call glClear anywhere else, so I don't know if it can help you.
Lead programmer of the (slowly evolving) syBR game engine --- http://sybr.sourceforge.net
sebarnolds > Thank you, it works for me too. Adding a glClear(0); after SwapBuffers(m_hDC); in the endFrame() function results in no memory leak.
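
For anyone skimming the thread, the workaround amounts to this change to the endFrame() posted earlier (glClear with a mask of 0 clears no buffers, so it should be a visual no-op; why it stops the growth is not explained here):

void CWindowSurfaceGL::endFrame()
{
	SwapBuffers(m_hDC);

	// Workaround from this thread: empty clear right after the swap.
	glClear(0);
}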
Glad it helped you.

Now if anybody has an explanation for this, I'm willing to hear it, because I don't understand why there is a leak when we don't call glClear().

Besides, we spotted the memory leak when we didn't call glClear at all (because we were drawing over the entire screen). Now it seems you are already calling glClear (at the beginning of the frame), but you still have the leak.

I don't think it should have a visible effect on performance, but try replacing the glClear(0) you've just added with the glClear() call you're making in the beginFrame() function, and remove it from beginFrame(), just to see if the leak is still there.
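
Concretely, that experiment would change the two functions posted earlier to something like this (a sketch of the suggestion, not a confirmed fix):

void CWindowSurfaceGL::beginFrame()
{
	// Clear removed from here for the test
	glMatrixMode(GL_MODELVIEW);
	glLoadIdentity();
}

void CWindowSurfaceGL::endFrame()
{
	SwapBuffers(m_hDC);

	// Do the real clear right after the swap instead
	glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );
}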
Lead programmer of the (slowly evolving) syBR game engine --- http://sybr.sourceforge.net
I took a look on Google, and it appears a few other people have experienced this too. See here.

I tried to recreate the leak on my own machine, but was unable to. Perhaps it's a driver issue?
That's interesting. I just checked, and my app also grows by 6 KB of memory per second. Same for Quake 3. The demos from codesampler.com don't, from what I can see. I'll be looking into this issue soon. glClear didn't change anything for me.

What card do you people have? I have a Radeon with the latest drivers. Let us know if you find any solutions.

