In this demo, we render a desert scene to an off-screen texture. This texture is then fed into our blurring algorithm, which runs in a compute shader. After the texture is blurred, we draw a full-screen quad to the back buffer with the blurred texture applied, so that we can see the result and verify our blur implementation.
We assume the blur is separable, so we break it into two 1D blurs (a rolling box blur): a horizontal pass and a vertical pass. Implementing this requires two textures that we can both read from and write to; therefore, we need an SRV and a UAV for each texture. Call one texture A and the other texture B. The blurring algorithm proceeds as follows:
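The "rolling" part of the rolling box blur is the running-sum trick: instead of re-summing the whole window at every pixel, we slide the window one pixel at a time, subtracting the sample that leaves and adding the one that enters, so each 1D pass costs O(n) per row regardless of the blur radius. A minimal CPU sketch of one such 1D pass (in Python for illustration; the demo itself does this in a compute shader, and the clamp-to-edge border policy here is an assumption, not taken from the demo):

```python
def rolling_box_blur_1d(row, radius):
    """1D box blur via a running window sum: O(n) per row for any radius.

    Border samples are clamped to the edge (a common choice; the demo's
    exact border policy is not specified in the text)."""
    n = len(row)
    clamp = lambda i: min(max(i, 0), n - 1)
    count = 2 * radius + 1
    # Initialize the window sum for pixel 0: samples row[-radius..radius], clamped.
    window = sum(row[clamp(i)] for i in range(-radius, radius + 1))
    out = [0.0] * n
    for i in range(n):
        out[i] = window / count
        # Slide the window one pixel right: drop the leftmost sample, add the next.
        window += row[clamp(i + radius + 1)] - row[clamp(i - radius)]
    return out
```

The same routine serves for the vertical pass by running it over columns instead of rows.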
- Bind the SRV to A as an input to the compute shader (this is the input image that will be horizontally blurred)
- Bind the UAV to B as an output to the compute shader (this is the output image that will store the blurred result)
- Dispatch the thread groups to perform the horizontal blur operation.
- Bind the SRV to B as an input to the compute shader (this is the horizontally blurred image that will next be vertically blurred)
- Bind the UAV to A as an output to the compute shader (this is the output image that will store the final blurred result)
- Dispatch the thread groups to perform the vertical blur operation.
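The ping-pong pattern above can be simulated on the CPU to check the logic. In this sketch (Python, standing in for the two compute-shader dispatches; `blur_1d` is a simple clamped box blur, not the demo's HLSL), pass 1 reads A and writes B horizontally, and pass 2 reads B and writes A vertically:

```python
def blur_1d(row, radius):
    # Simple clamped 1D box blur (stand-in for one compute-shader pass).
    n = len(row)
    clamp = lambda i: min(max(i, 0), n - 1)
    count = 2 * radius + 1
    return [sum(row[clamp(i + k)] for k in range(-radius, radius + 1)) / count
            for i in range(n)]

def separable_blur(image, radius):
    """Two-pass separable box blur, ping-ponging between buffers A and B."""
    a = [row[:] for row in image]                 # texture A holds the input image
    # Pass 1 ("dispatch" 1): horizontal blur, read A (SRV) -> write B (UAV).
    b = [blur_1d(row, radius) for row in a]
    # Pass 2 ("dispatch" 2): vertical blur, read B (SRV) -> write A (UAV).
    h, w = len(b), len(b[0])
    for x in range(w):
        col = blur_1d([b[y][x] for y in range(h)], radius)
        for y in range(h):
            a[y][x] = col[y]
    return a                                      # A now holds the final blurred result
```

Because the box filter is separable, the horizontal-then-vertical passes produce the same result as a full 2D box filter, at a fraction of the cost: O(2r) samples per pixel instead of O(r^2) for radius r, before the rolling-sum optimization even applies.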
WEB:
https://sites.google..._rollingboxblur
SOURCE CODE:
http://code.google.com/p/dx11/