
[SOLVED] Post Processing BackBuffer using Effects 11



#1 Crosire   Members   -  Reputation: 174


Posted 14 April 2013 - 03:59 AM

I'm currently trying to add post-processing to an existing C++ DirectX 11 application. The task is to apply a dynamic FX shader, loaded from a file, directly to the backbuffer; all other rendering is already done beforehand. In short: I have to get a copy of the backbuffer, apply some post-processing effects to it and write the result back to the buffer. "IDXGISwapChain->Present" is called last.

 

This was pretty easy to achieve in DirectX 9, but I'm running into multiple issues with DirectX 11 now. The only post-processing example for D3D11 I found on MSDN was written as a Windows 8 app, which I cannot build or run on my Windows 7 desktop, and the code did not make a lot of sense to me.

The first issue, the missing effects framework, could be solved with "Effects 11", which can be downloaded from MSDN.

 

Now I'm a bit confused about how to continue. The basic idea of how it could work is as follows:

  1. Load the effect from a ".fx" HLSL file and compile it using "D3DX11CompileEffectFromFile"
  2. Get a copy of the backbuffer using "IDXGISwapChain->GetBuffer" and save it to an "ID3D11Texture2D" object
  3. Get the description of the texture to obtain the width and height of the screen
  4. Create a new render target and tell the device to use it
  5. Create a fullscreen quad and draw the screen texture onto it (using the loaded effect)
  6. Reset the render target and update the backbuffer with the drawn quad

 

The following code for steps one and two already works:

Spoiler
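(The code above is collapsed; as a rough illustration only, a minimal sketch of what steps 1 and 2 could look like with the standard Effects 11 and DXGI calls is shown below. The file name "postprocess.fx" and the error-blob handling are assumptions; the globals mirror names used later in this thread.)

#include <d3d11.h>
#include <d3dx11effect.h> // Effects 11

ID3DX11Effect   *g_pEffect11 = NULL;
ID3D11Texture2D *g_pScreenTexture11 = NULL;

HRESULT LoadEffectAndGrabBackBuffer(ID3D11Device *device, IDXGISwapChain *swapchain)
{
	ID3DBlob *errors = NULL;

	// Step 1: compile the .fx file and create the effect
	HRESULT hr = D3DX11CompileEffectFromFile(L"postprocess.fx", NULL, NULL, 0, 0, device, &g_pEffect11, &errors);
	if (FAILED(hr))
	{
		return hr;
	}

	// Step 2: fetch the backbuffer texture from the swap chain
	return swapchain->GetBuffer(0, __uuidof(ID3D11Texture2D), (LPVOID*)&g_pScreenTexture11);
}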

 

I'm having problems applying the effect and drawing the fullscreen quad, though...

 

The vertex struct and layout for the quad are declared as follows:

Spoiler

 

The fullscreen quad is set up like this:

Spoiler
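(The quad setup is collapsed above; purely as a sketch, and not the code in the spoiler, one way to build a fullscreen quad vertex buffer as a triangle strip with a float4 position plus float2 texcoord, so that the texcoord offset is 16 bytes, could look like this. "QuadVertex" and "device" are placeholder names.)

struct QuadVertex
{
	float x, y, z, w; // POSITION (already in clip space)
	float u, v;       // TEXCOORD0
};

ID3D11Buffer *CreateFullscreenQuadVB(ID3D11Device *device)
{
	// Two triangles as a strip covering the whole screen
	QuadVertex quad[4] =
	{
		{ -1.0f,  1.0f, 0.0f, 1.0f, 0.0f, 0.0f }, // top left
		{  1.0f,  1.0f, 0.0f, 1.0f, 1.0f, 0.0f }, // top right
		{ -1.0f, -1.0f, 0.0f, 1.0f, 0.0f, 1.0f }, // bottom left
		{  1.0f, -1.0f, 0.0f, 1.0f, 1.0f, 1.0f }, // bottom right
	};

	D3D11_BUFFER_DESC bd = { 0 };
	bd.ByteWidth = sizeof(quad);
	bd.Usage = D3D11_USAGE_IMMUTABLE;
	bd.BindFlags = D3D11_BIND_VERTEX_BUFFER;

	D3D11_SUBRESOURCE_DATA init = { 0 };
	init.pSysMem = quad;

	ID3D11Buffer *vb = NULL;
	device->CreateBuffer(&bd, &init, &vb);
	return vb;
}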

 

Drawing currently looks similar to the following code:

Spoiler
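(Again the actual snippet is collapsed; a minimal sketch of a draw call through Effects 11, assuming a technique named "PostProcess" and the "QuadVertex" struct sketched above, might look roughly like this.)

void DrawFullscreenQuad(ID3D11DeviceContext *context, ID3DX11Effect *effect, ID3D11Buffer *quadVB, ID3D11InputLayout *layout)
{
	UINT stride = sizeof(QuadVertex);
	UINT offset = 0;

	context->IASetInputLayout(layout);
	context->IASetVertexBuffers(0, 1, &quadVB, &stride, &offset);
	context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLESTRIP);

	// Let the effect bind its shaders and resources for pass 0, then draw the 4 vertices
	effect->GetTechniqueByName("PostProcess")->GetPassByIndex(0)->Apply(0, context);
	context->Draw(4, 0);
}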

 

I'm clearing the render target to red before drawing the quad, but the screen then ends up plain red, so sadly it doesn't seem to draw the quad at all...

I don't know if there is any way to check that.

 

Also, a post-processing shader normally requires me to pass the screen texture to it, so it can apply the effect to every pixel. The Effects framework provides "ID3DX11Effect->GetVariableByName" and allows one to set a variable to a specific object or value, but there is no setter for a texture. The nearest thing I found is "Variable->AsShaderResource()->SetResource(&MyResource)", but is creating a shader resource view for the screen texture and passing that to the pixel shader really the most efficient way?

 

I'm sorry for the tons of code in this post, but I found it the easiest way to show what I have so far. In DirectX 9 the steps described earlier worked without any problems, and no vertex buffer was required there anyway, so the whole thing was shorter and easier to achieve.

I hope somebody has done something similar / post-processing in D3D11/D3D10 before and can help me out here; I would really appreciate it.

 

Thank you and cheers,

Crosire


Edited by Crosire, 18 April 2013 - 06:00 AM.



#2 Jason Z   Crossbones+   -  Reputation: 4911


Posted 14 April 2013 - 06:09 AM

It looks like the declared position format in your vertex layout is too small. You are using a 4-component float position, but only declaring a 3-component format. Since the offset to the texture coordinates in your layout is 16 bytes, this is likely the source of your problems. Switching to DXGI_FORMAT_R32G32B32A32_FLOAT for your position should help.
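As a rough illustration of that suggestion (semantic names are assumed), the layout could look like:

D3D11_INPUT_ELEMENT_DESC layoutDesc[] =
{
	// 16-byte float4 position, so the texcoord offset of 16 lines up with the vertex struct
	{ "POSITION", 0, DXGI_FORMAT_R32G32B32A32_FLOAT, 0,  0, D3D11_INPUT_PER_VERTEX_DATA, 0 },
	{ "TEXCOORD", 0, DXGI_FORMAT_R32G32_FLOAT,       0, 16, D3D11_INPUT_PER_VERTEX_DATA, 0 },
};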

 

Did you get any output in the debug window? If your shader was expecting a float4, then your vertex layout would not have matched and it should have complained about that... Is the debug layer enabled in the device you are using?



#3 unbird   Crossbones+   -  Reputation: 4973


Posted 14 April 2013 - 07:35 AM

Yep, create your device with the D3D11_CREATE_DEVICE_DEBUG flag. We should actually make that a big red blinking sticky in this subforum ;)
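For illustration, a minimal sketch of device creation with the debug layer; everything except the flag is a placeholder for your existing setup:

UINT flags = 0;
#ifdef _DEBUG
flags |= D3D11_CREATE_DEVICE_DEBUG; // the runtime then reports binding/state errors in the debug output
#endif

D3D11CreateDeviceAndSwapChain(NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, flags,
	NULL, 0, D3D11_SDK_VERSION, &swapChainDesc,
	&swapchain, &device, NULL, &context);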

The nearest thing I found is "Variable->AsShaderResource()->SetResource(&MyResource)", but is creating a shader resource view for the screen texture and passing that to the pixel shader really the most efficient way?

It's the only way. The non-effect counterpart would be ?SSetShaderResources (in your case PSSetShaderResources), so you need to create a shader resource view of your backbuffer texture.

For this to work you also need to create your swapchain with DXGI_USAGE_SHADER_INPUT, not only with DXGI_USAGE_RENDER_TARGET_OUTPUT. Alternatively, you could render your scene first to an offscreen texture with both D3D11_BIND_RENDER_TARGET and D3D11_BIND_SHADER_RESOURCE.
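A minimal sketch of that offscreen-texture alternative (width, height and the format here are placeholders; match them to your swap chain):

D3D11_TEXTURE2D_DESC td = { 0 };
td.Width = width;
td.Height = height;
td.MipLevels = 1;
td.ArraySize = 1;
td.Format = DXGI_FORMAT_R8G8B8A8_UNORM; // assumed; use your swap chain format
td.SampleDesc.Count = 1;
td.Usage = D3D11_USAGE_DEFAULT;
td.BindFlags = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;

ID3D11Texture2D *sceneTex = NULL;
ID3D11RenderTargetView *sceneRTV = NULL;
ID3D11ShaderResourceView *sceneSRV = NULL;
device->CreateTexture2D(&td, NULL, &sceneTex);               // render the scene into this
device->CreateRenderTargetView(sceneTex, NULL, &sceneRTV);
device->CreateShaderResourceView(sceneTex, NULL, &sceneSRV); // sample it in the postprocess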

Be aware when switching targets and resources that you unbind (set NULL) things first: one cannot have a texture bound as a render target and as an input simultaneously; the (debug) pipeline will complain and undo such an attempt (usually in the non-intended way). This is a bit more of a challenge with the effect framework, since you have to find out which slots are currently used, but you can set the slots explicitly with the HLSL register keyword.
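For example, the unbinding could look like this sketch (slot 0 is assumed):

// Unbind the texture from the pixel shader before rendering into it again (and vice versa),
// otherwise the debug layer complains and unbinds it for you.
ID3D11ShaderResourceView *nullSRV = NULL;
context->PSSetShaderResources(0, 1, &nullSRV);

ID3D11RenderTargetView *nullRTV = NULL;
context->OMSetRenderTargets(1, &nullRTV, NULL);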

Also: Show your shader code, please.

#4 Crosire   Members   -  Reputation: 174


Posted 14 April 2013 - 10:42 AM

Thank you for your answers so far! I now set the position in the layout to a float4 using "DXGI_FORMAT_R32G32B32A32_FLOAT", which makes a bit more sense.

 

I'm used to PS 2.0, so the new HLSL format is a bit strange to me and I'm sure there is a mistake in there.

 

The shader is just a simple monochrome testing effect:

Spoiler

Edited by Crosire, 15 April 2013 - 12:29 PM.


#5 unbird   Crossbones+   -  Reputation: 4973


Posted 14 April 2013 - 12:14 PM

Use technique11, not technique10, in your effect file. This is a nitpick of the effect framework. Also: where's your vertex shader? You need one. While you're at it, make sure the signatures match: the vertex shader must output the SV_Position system-value semantic.

I recommend testing the post-process in a separate application (e.g. by loading a screenshot from your app as the source), so that you can enable the debug layer.

Edited by unbird, 14 April 2013 - 12:58 PM.


#6 Crosire   Members   -  Reputation: 174


Posted 15 April 2013 - 12:28 PM

Thanks so far - you've helped me a lot. The quad is now drawing fine, and the shader / effect gets loaded and executed too.

 

FX:

Spoiler

 

The full screen gets covered in red as expected now. The only thing left is to pass the screen texture to the pixel shader and let it do its job.

 

However, as soon as I uncomment the two lines above and comment out the other color code, I just get a black screen. I tried some different shader code and it's always just black; lowering the alpha shows the original image a bit through the now transparent quad, as long as I disable render target clearing.

 

I'm sending the texture to the shader with this code:

ID3D11ShaderResourceView *m_pResource;
D3D11_SHADER_RESOURCE_VIEW_DESC m_pResourceDesc = { DXGI_FORMAT_R32_FLOAT, D3D10_SRV_DIMENSION_TEXTURE2D };
m_pResourceDesc.Texture2D.MipLevels = 1;
m_pResourceDesc.Texture2D.MostDetailedMip = 0;
device->CreateShaderResourceView(g_pScreenTexture11, &m_pResourceDesc, &m_pResource);

g_pEffect11->GetVariableByName("colorTex")->AsShaderResource()->SetResource(m_pResource);

"g_pScreenTexture11" contains the full screen, checked that via "D3DX11SaveTextureToFile". "g_pEffect11" also links to the effect, the other shader code works perfectly fine. The shader layout is the same as the vertex input layout declared in the C++ code too (as seen earlier), so I'm a bit lost here.

 

Thanks again for all the help so far, it's greatly appreciated. I'm really close to the goal now... just this little thing.

 

Cheers,

Crosire



#7 unbird   Crossbones+   -  Reputation: 4973


Posted 15 April 2013 - 01:21 PM

Good, you've narrowed it down somewhat: the sampling produces black - at least that's what I get.
 
Your easiest approach to narrow down the problem further would be to use a graphics debugger (PIX or the VS 2012 graphics debugger). Check the post-VS values, check if the texture is actually bound (the shader resource view, that is), debug pixels, etc.
 
Alternatively:
  • Check the texcoords by outputting float4(IN.txcoord, 0, 1). This should give a black-to-red gradient from left to right and a black-to-green gradient from top to bottom (and yellow in the bottom right).
  • Your sampler is a bit bare for my taste. Really set all the needed values (not sure what the effect framework uses by default). Or set the sampler explicitly to NULL (you should find the corresponding variable type); this will give you the default sampler. If that gives you something useful, work from those values. For a postprocess effect you e.g. want point sampling (see the sketch below).
Currently I don't see anything obvious - yet ;)
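As a concrete example of the point-sampling suggestion, a minimal C++ sketch of an explicit point sampler; all values are assumptions, and it only applies if you bind the sampler yourself rather than through the effect's own sampler state:

D3D11_SAMPLER_DESC sd = { D3D11_FILTER_MIN_MAG_MIP_POINT }; // point filtering for a 1:1 postprocess
sd.AddressU = D3D11_TEXTURE_ADDRESS_CLAMP;
sd.AddressV = D3D11_TEXTURE_ADDRESS_CLAMP;
sd.AddressW = D3D11_TEXTURE_ADDRESS_CLAMP;
sd.ComparisonFunc = D3D11_COMPARISON_NEVER;
sd.MaxLOD = D3D11_FLOAT32_MAX;

ID3D11SamplerState *pointSampler = NULL;
device->CreateSamplerState(&sd, &pointSampler);
context->PSSetSamplers(0, 1, &pointSampler); // slot 0 assumed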

#8 Crosire   Members   -  Reputation: 174


Posted 16 April 2013 - 10:10 AM

Texcoords are set correctly; I get the expected green/red gradient image with no blue and alpha at 100%.

 

I tried using "D3DX11CreateShaderResourceViewFromFile" to send an existing image to the shader instead, and that image gets drawn on the screen with the effect applied, so the shader itself is fully working. It's just not reading the data from the texture object (which does contain it, I tested that) when I send that one to the shader.

 

In code:

This works:

ID3D11ShaderResourceView *m_pResource;
D3DX11CreateShaderResourceViewFromFile(device, L"tex.bmp", NULL, NULL, &m_pResource, NULL);
g_pEffect11->GetVariableByName("colorTex")->AsShaderResource()->SetResource(m_pResource);

The code from my previous post does not:

ID3D11ShaderResourceView *m_pResource;
D3D11_SHADER_RESOURCE_VIEW_DESC m_pResourceDesc = { DXGI_FORMAT_R32_FLOAT, D3D10_SRV_DIMENSION_TEXTURE2D };
m_pResourceDesc.Texture2D.MipLevels = 1;
m_pResourceDesc.Texture2D.MostDetailedMip = 0;

device->CreateShaderResourceView(g_pScreenTexture11, &m_pResourceDesc, &m_pResource);

g_pEffect11->GetVariableByName("colorTex")->AsShaderResource()->SetResource(m_pResource);

 

After looking at it again, I found some mistakes already:

  • D3D11_SRV_DIMENSION_TEXTURE2D instead of D3D10_SRV_DIMENSION_TEXTURE2D (which evaluates to the same value (4) in the end, but it just makes more sense)
  • DXGI_FORMAT_R32G32B32A32_FLOAT instead of DXGI_FORMAT_R32_FLOAT (the image obviously isn't made of the red channel only; I want the full RGBA color here)

It still doesn't work with the corrected code. Does anybody see another error in here?

 

And I just have to thank unbird again; you pushed me in the right direction :)



#9 unbird   Crossbones+   -  Reputation: 4973


Posted 16 April 2013 - 10:42 AM

This is really strange. Can you force the debug layer through the control panel? Have you already tried a GPU debugger? I'm definitely out of clues...

#10 Crosire   Members   -  Reputation: 174


Posted 17 April 2013 - 10:25 AM

I think it might be because the screen texture gets created by the swap chain in "IDXGISwapChain->GetBuffer", as I pass a null texture interface object to it. It then probably gets initialized without "D3D11_BIND_SHADER_RESOURCE", which makes it unusable for the shader resource view.

 

I tried a different approach now:

Code which gets executed on startup:

// Create screen resource
D3D11_TEXTURE2D_DESC m_pTextureDesc = { m_pDesc.Width, m_pDesc.Height, 1, 1, DXGI_FORMAT_R32G32B32A32_FLOAT, 1, 0, D3D11_USAGE_DEFAULT, D3D11_BIND_SHADER_RESOURCE, 0, 0 };
device->CreateTexture2D(&m_pTextureDesc, NULL, &g_pScreenTexture11);

// Create shader resource view
D3D11_SHADER_RESOURCE_VIEW_DESC m_pResourceDesc = { m_pTextureDesc.Format, D3D11_SRV_DIMENSION_TEXTURE2D };
device->CreateShaderResourceView(g_pScreenTexture11, &m_pResourceDesc, &g_pScreenView11);

Code which gets executed every frame:

ID3D11Resource *m_pBackBuffer;

hr = swapchain->GetBuffer(0, __uuidof(m_pBackBuffer), (LPVOID*)&m_pBackBuffer);

if (SUCCEEDED(hr) && g_pScreenTexture11)
{
	// Update screen texture
	context->CopyResource(g_pScreenTexture11, m_pBackBuffer);

	// Update effect parameters
	g_pEffect11->GetVariableByName("colorTex")->AsShaderResource()->SetResource(g_pScreenView11);

	...
}

 

I thought the shader resource view only stores a pointer to the screen texture, so it would pick up the updated data when I change the texture object it was initialized with.

 

The code now fails at "context->CopyResource(g_pScreenTexture11, m_pBackBuffer);" though. The texture is just empty.

 

I tried this one too:

ID3D11Resource *tex;
g_pScreenSurface11->GetResource(&tex);
D3DX11SaveTextureToFile(context, g_pScreenTexture11, D3DX11_IFF_BMP, L"tex.bmp");

But it crashes the application at the second line already; I have no idea why.

 

I'm going to try to put together a quick testing program, as I cannot attach PIX to any other software with my project applied. PIX tries to hook the exported DirectX functions, which my hook has already done, so it just crashes.


Edited by Crosire, 17 April 2013 - 10:31 AM.


#11 unbird   Crossbones+   -  Reputation: 4973


Posted 17 April 2013 - 10:50 AM

I thought the shader resource view only stores a pointer to the screen texture, so it would pick up the updated data when I change the texture object it was initialized with.

Nope, the view is tightly bound to the resource; you can't change that after creation.
 

The code now fails at "context->CopyResource(g_pScreenTexture11, m_pBackBuffer);" though. The texture is just empty.

If you look at the docs of this function, you'll see you have to check the compatibility of the source and target resources. I already suspected some format problem: is your backbuffer e.g. really 32F?
 

I tried this one too:
*snip*
But it crashes the application at the second line already; I have no idea why.

BMP can't cope with float formats; use DDS instead.

I'm going to try to put together a quick testing program, as I cannot attach PIX to any other software with my project applied. PIX tries to hook the exported DirectX functions, which my hook has already done, so it just crashes.

That leaves bare logging, hopefully. Dump the description of the resources and views in question using GetResource and GetDesc and compare them.
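A minimal sketch of that kind of logging, reusing the globals from earlier in the thread:

D3D11_TEXTURE2D_DESC bbDesc = { 0 }, texDesc = { 0 };

ID3D11Texture2D *backbuffer = NULL;
swapchain->GetBuffer(0, __uuidof(ID3D11Texture2D), (LPVOID*)&backbuffer);
backbuffer->GetDesc(&bbDesc);
g_pScreenTexture11->GetDesc(&texDesc);
backbuffer->Release();

// A format or sample-count mismatch here is exactly what makes CopyResource fail
char buf[256];
sprintf_s(buf, "backbuffer: fmt=%d samples=%d | screen tex: fmt=%d samples=%d bind=%u\n",
	bbDesc.Format, bbDesc.SampleDesc.Count, texDesc.Format, texDesc.SampleDesc.Count, texDesc.BindFlags);
OutputDebugStringA(buf);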

#12 Crosire   Members   -  Reputation: 174


Posted 17 April 2013 - 01:36 PM

If you look at the docs of this function, you'll see you have to check the compatibility of the source and target resources. I already suspected some format problem: is your backbuffer e.g. really 32F?

And that is the problem. DirectX 9 had the "StretchRect" function, which did the job. DirectX 11 has no real replacement. I don't see how I can copy the backbuffer resource to my texture, which has a different format?!

 

"CopyResource" and "ResolveSubresource" both require compatible formats. If I set up my texture with the same format as the backbuffer, those work of course, but the Shader View is not created properly again (see below).

Whatever I'm trying, either one or the other thing work, but not both together ...

 

BMP can't cope with float formats; use DDS instead.

It's not the save-texture line that makes it crash; it's the "GetResource" call, and that's because the shader resource view object is NULL even though it was created earlier (I got my testing application working and can successfully debug it with the VS 11 graphics debugger).

 

Tried to create it every frame too:

SAFE_RELEASE(g_pScreenSurface11);
device->CreateShaderResourceView(g_pScreenTexture11, NULL, &g_pScreenSurface11);
ID3D11Resource *tex;
g_pScreenSurface11->GetResource(&tex);

It crashes; it is still NULL. It works when I create the texture from a file instead of using the backbuffer, so I assume either the format is the issue or the backbuffer is missing the "D3D11_BIND_SHADER_RESOURCE" flag (which is probably the case).

The solution would be to create a texture with that flag set and copy the backbuffer contents into it. But all my attempts at that have failed so far, as shown above.

 

I hope I'm not being too annoying here :)


Edited by Crosire, 17 April 2013 - 01:37 PM.


#13 unbird   Crossbones+   -  Reputation: 4973


Posted 17 April 2013 - 01:48 PM

Yeah, you need the resource copy, and you need D3D11_BIND_SHADER_RESOURCE on your target texture to be able to sample from it afterwards. I don't see where you fail, so I'll just reiterate (a sketch follows the list):

- Your backbuffer is a given, so D3D11_BIND_SHADER_RESOURCE is not available on it
- create a compatible texture: same format (and MS type!), same dimensions, but this time with D3D11_BIND_SHADER_RESOURCE
- create the view thereof
- use it as the source for your postprocess
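A minimal sketch of that recipe, reusing the globals from the earlier posts; the first half runs once at startup, the copy runs each frame before the quad is drawn:

// One-time setup: a texture that matches the backbuffer exactly, except that it is
// bindable as a shader resource.
ID3D11Texture2D *backbuffer = NULL;
swapchain->GetBuffer(0, __uuidof(ID3D11Texture2D), (LPVOID*)&backbuffer);

D3D11_TEXTURE2D_DESC desc;
backbuffer->GetDesc(&desc);                  // same format, size and MS type as the backbuffer
desc.BindFlags = D3D11_BIND_SHADER_RESOURCE; // but sampleable
desc.Usage     = D3D11_USAGE_DEFAULT;

device->CreateTexture2D(&desc, NULL, &g_pScreenTexture11);
device->CreateShaderResourceView(g_pScreenTexture11, NULL, &g_pScreenView11);
backbuffer->Release();

// Every frame, before the postprocess pass: copy the finished frame and bind it.
swapchain->GetBuffer(0, __uuidof(ID3D11Texture2D), (LPVOID*)&backbuffer);
context->CopyResource(g_pScreenTexture11, backbuffer);
g_pEffect11->GetVariableByName("colorTex")->AsShaderResource()->SetResource(g_pScreenView11);
backbuffer->Release();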

#14 Crosire   Members   -  Reputation: 174


Posted 18 April 2013 - 06:00 AM

Forgive me, I was just stupid. Everything is working now!

 

BIG thanks to you :)





