Riktovitch

DX11 Memory leak solved by Flush()


Recommended Posts

First of all, I'm somewhat of a newbie to DX11.

 

I was having trouble finding the source of a memory leak earlier and eventually narrowed it down to this call:

device->GetDeviceContext()->IASetVertexBuffers(0, 1, &vertexbuffer, &stride, &offset);

If I commented out that line, the memory leak would stop; otherwise the program's memory usage would grow by huge amounts, reaching a gigabyte in about 30 seconds. Once the usage reached about 2 gigabytes, all of that memory would be freed (which would freeze the program for about a minute), and the program would repeat that cycle over and over.

 

I found that the only way to fix the memory leak was to add Flush() right after Release():

if (vertexbuffer != 0)
{
    vertexbuffer->Release();
    device->GetDeviceContext()->Flush();
}

I should also note that the memory leak only occurred when the program was started in fullscreen mode, which I find to be very odd. This is the code I am using to create and release the vertex buffers, as well as render them:

bool Mesh::Load()
{
	Unload();

	if (vertices.size() > 0)
	{
		memset(&vertexbufferdesc, 0, sizeof(D3D11_BUFFER_DESC));

		vertexbufferdesc.Usage = D3D11_USAGE_DYNAMIC;
		vertexbufferdesc.ByteWidth = sizeof(Vertex) * vertices.size();
		vertexbufferdesc.BindFlags = D3D11_BIND_VERTEX_BUFFER;
		vertexbufferdesc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE;

		if (!FAILED(device->GetDevice()->CreateBuffer(&vertexbufferdesc, 0, &vertexbuffer)))
		{
			D3D11_MAPPED_SUBRESOURCE resource;

			if (!FAILED(device->GetDeviceContext()->Map(vertexbuffer, 0, D3D11_MAP_WRITE_DISCARD, 0, &resource)))
			{
				for (unsigned int i = 0; i < vertices.size(); i++) static_cast<Vertex *>(resource.pData)[i] = *vertices[i];

				device->GetDeviceContext()->Unmap(vertexbuffer, 0);

				return true;
			}
		}
	}

	return false;
}

void Mesh::Unload()
{
    if (vertexbuffer != 0)
    {
        vertexbuffer->Release();
        device->GetDeviceContext()->Flush();
    }

	vertexbuffer = 0;
}

void Mesh::Render()
{
	if (vertexbuffer != 0)
	{
		if(texture != 0) texture->Apply();

		unsigned int stride = sizeof(Vertex), offset = 0;

		device->GetDeviceContext()->IASetVertexBuffers(0, 1, &vertexbuffer, &stride, &offset);
		device->GetDeviceContext()->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
		device->GetDeviceContext()->Draw(vertexbufferdesc.ByteWidth / sizeof(Vertex), 0);
	}
}

The leak was created by calling these functions every frame:

//begin scene

mesh.Unload();
mesh.Load();
mesh.Render();

//end scene

I am aware that I don't need to recreate the buffers each frame, but I am doing it for a special mesh that I am constantly adding and removing vertices from. Mainly, what I'm wondering is:

  1. Am I doing something wrong?
  2. Is it common to have to call Flush() after Release()?


1. Yeah, you're creating and destroying a vertex buffer every single frame. If the contents need to change, that's fine: just create it as DYNAMIC (which you're already doing) and call Map whenever you need to update the contents. Just make sure that when you create it, it's big enough to hold the maximum number of verts you'll need (see the sketch after this reply).

2. The driver generally doesn't delete resources right away, because it can't free memory that is currently in use, or will be in use, by the GPU or other worker threads. Flush() forces the driver to execute all pending commands on the GPU, which is why it causes the memory to be released immediately. However, Flush() is a heavy, expensive operation, and it's very rare that it actually needs to be used.
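For reference, here's a rough sketch of the "create once, Map every frame" approach from point 1, reusing the member names from the original post where possible; CreateBuffer, UpdateBuffer, maxVertexCount, and vertexCount are made-up names for illustration, not anything from D3D11 itself.

// Create the dynamic vertex buffer once, sized for the maximum number of
// vertices the mesh will ever hold.
bool Mesh::CreateBuffer(unsigned int maxVertexCount)
{
	D3D11_BUFFER_DESC desc = {};

	desc.Usage          = D3D11_USAGE_DYNAMIC;
	desc.ByteWidth      = sizeof(Vertex) * maxVertexCount;
	desc.BindFlags      = D3D11_BIND_VERTEX_BUFFER;
	desc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE;

	return SUCCEEDED(device->GetDevice()->CreateBuffer(&desc, 0, &vertexbuffer));
}

// Whenever the vertices change, re-fill the existing buffer with
// Map/WRITE_DISCARD instead of destroying and recreating it.
bool Mesh::UpdateBuffer()
{
	D3D11_MAPPED_SUBRESOURCE resource;

	if (FAILED(device->GetDeviceContext()->Map(vertexbuffer, 0, D3D11_MAP_WRITE_DISCARD, 0, &resource)))
		return false;

	for (unsigned int i = 0; i < vertices.size(); i++)
		static_cast<Vertex *>(resource.pData)[i] = *vertices[i];

	device->GetDeviceContext()->Unmap(vertexbuffer, 0);

	vertexCount = vertices.size(); // remember how many verts to pass to Draw()

	return true;
}

With this setup, Render() would call Draw(vertexCount, 0) instead of deriving the count from ByteWidth, and Unload() only needs to Release() the buffer once at shutdown, with no Flush() involved.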

In D3D11, IASetVertexBuffers holds a reference to the vertex buffers passed in, and I'm not sure, but I think the only way to release that reference is to bind a NULL buffer in its place, e.g. IASetVertexBuffers(0, 1, NULL, 0, 0). But maybe setting another vertex buffer also releases the previous one... I haven't tried it. A rough sketch of the unbind call is below.
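For what it's worth, this is roughly how unbinding slot 0 usually looks (just a sketch; the runtime treats a NULL entry in the buffer array as "unbind this slot"):

// Unbind whatever vertex buffer is currently in IA slot 0 so the device
// context no longer holds a reference to it.
ID3D11Buffer *nullBuffer = nullptr;
UINT zero = 0;

device->GetDeviceContext()->IASetVertexBuffers(0, 1, &nullBuffer, &zero, &zero);

// The buffer can then be Release()'d as usual; the driver still frees the
// underlying memory on its own schedule.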

