Weird leaks

This topic is 2658 days old which is more than the 365 day threshold we allow for new replies. Please post a new topic.

If you intended to correct an error in the post then please contact us.


Today I encountered a very weird sort of memory leak. I switched D3D9 to debug mode with "Break on Memory Leaks" enabled, and despite this my application didn't break, yet the Visual C++ output showed this:

D3DX: MEMORY LEAKS DETECTED: 2 allocations unfreed (134 bytes)
D3DX: Set HKLM\Software\Microsoft\Direct3D\D3DXBreakOnAllocId=0x5 to debug

So I quickly checked what allocation number 5 was, and it turned out to be... the D3D object. Of course, I release it properly, so what could be wrong? This was all in the Debug configuration, so I switched to Release: no more leaks were reported. Very, very weird... Everything seemed to be freed correctly, and indeed it was. Then I took a look at my shaders and noticed one unused output, so I deleted that line from the shader, and the memory-leak notifications were gone. I don't get why D3D9 considers an unused shader semantic (even though a value was written into it) a memory leak. A bug or something?

I think I have found the problem, but it really has nothing to do with memory releases.
I commented out all the code I had, leaving only vertex shader creation and deletion, and this shader creation alone caused the leaks (I was releasing the shader properly). So I took another look at the shader. Here it is:

uniform float4x4 worldTransform;
uniform float4x4 viewProjTransform;

struct VS_INPUT
{
    float4 position: POSITION;
    float3 normal: NORMAL;
    float2 texCoord: TEXCOORD0;
};

struct VS_OUTPUT
{
    float4 position: POSITION;
    float3 normal: TEXCOORD0;
    float2 texCoord: TEXCOORD1;
    float3 worldPosition: TEXCOORD2;
};

VS_OUTPUT main(VS_INPUT input)
{
    VS_OUTPUT output;

    // world-space
    float4 position = mul(input.position, worldTransform);
    float3 normal = mul(input.normal, (float3x3)worldTransform);

    output.position = mul(position, viewProjTransform);
    output.normal = normal;
    output.texCoord = input.texCoord;
    output.worldPosition = position;

    return output;
}


Note what I'm doing in the last output assignment: I assign a float4 to a float3 without a cast. If I add the cast, or simply change worldPosition to float4, or change "float4 position" to "float3 position", the leaks don't show up.

So can this really cause a leak? If I were to guess, I'd say DirectX leaks memory during shader parsing. I'm almost sure of that, because when I load my shader from cache (with D3DXAssembleShader, using a constant table generated by a previous run of the program) the leaks don't appear. I'd dare to say it's a (very slight) bug in the DirectX (June 2010 SDK) shader parser :)

That's interesting; I do nasty implicit casts all the time in my shaders and have never had such a leak.

Can you post some more details of the code? For example, are you using the ID3DXEffect interface or giving the vertex shader directly to the device? It would be nice to see more of the program so we can have a look and see if something else could be causing it.

It sounds to me like, for some reason, the casting is just exposing a problem elsewhere in the program. I see no reason why the casting alone should cause a leak.

To be honest, I don't see any "elsewhere", since my program does nothing more than window creation plus shader creation and deletion :). As the output says, it must be something in the D3DX library. Moreover, I've just noticed that this info appears only when I link against d3dx9d.lib; linking against the release version, d3dx9.lib, doesn't show any leaks.
My whole shader creation code is here:
http://wklej.eu/index.php?id=bc2cd612f1 - header
http://www.wklej.eu/index.php?id=67ae204b44 - source
You can omit the part commented with:
// use cached shader (if cache for it exists)
since the problem appears only when I generate shader from HLSL.
Note that I'm not operating strictly on IDirect3DVertex{Pixel}Shader9 but on my own aliases of these interfaces, which allowed me to implement only one shader class.

RabidDog is precisely correct. It's as I thought: the casting issue is only exposing a problem elsewhere.

Here's what's happening:
When you create the shader without the weird cast, the error buffer is not created and all is well. However, when you create the shader WITH the cast, I'm betting the compiler is trying to give you a warning. So it creates the error buffer and fills it with the warning text. But you're not checking for warnings; you're only checking for SUCCEEDED. Compilation still succeeds even though the compiler emits a warning, so the buffer gets filled but never released.

Just add

if (errors)
{
    errors->Release();
    errors = NULL;
}

somewhere in the SUCCEEDED conditional.

Note that the release DLL is probably not concerned with generating shader compilation warnings. That would make sense, though I have never tested it.

