Hello, I have a little question about UAVs in Direct3D. I've read some explanations of UAVs, but the examples always use two kinds of buffers, one for input and one for output. For my implementation I need a buffer that is only ever used inside shaders: I write to it from a pixel or compute shader, and in another shader I read that data back, so there is no need to touch the buffer's contents from my C++ code.
The problem is that the tutorials use two different views: a StructuredBuffer bound as a shader resource (for input) and a RWStructuredBuffer bound as a UAV (for output). My question is: why do they use two? Isn't it possible to create the UAV in my C++ code, bind it to the pixel shader stage, write to it in my first shader, and then read from it in my second pixel shader?
Something like:
//Create UAV
ID3D11UnorderedAccessView* uav = ....
...
// Note: D3D11 has no PSSetUnorderedAccessViews; UAVs for a pixel shader
// are bound together with the render targets, and the UAV slots start
// where the render-target slots end:
m_pImmediateContext->OMSetRenderTargetsAndUnorderedAccessViews(
    0, NULL, NULL, 0, 1, &uav, NULL ); // one UAV in slot u0, no render target
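For reference, the elided creation step could look roughly like this. This is only a sketch: the variable names (device, numElements), the element count, and the exact bind flags are assumptions; error handling is omitted. The key points are the D3D11_RESOURCE_MISC_BUFFER_STRUCTURED flag, the StructureByteStride, and DXGI_FORMAT_UNKNOWN on the view:

```cpp
// Sketch: create a structured buffer and a UAV over it (D3D11).
// Assumes 'device' is a valid ID3D11Device* and the C++ VPL struct
// matches the HLSL layout (three float3 fields, 36 bytes per element).
struct VPL { float pos[3]; float norm[3]; float flux[3]; };

const UINT numElements = 1024; // assumed capacity

D3D11_BUFFER_DESC bd = {};
bd.ByteWidth           = sizeof(VPL) * numElements;
bd.Usage               = D3D11_USAGE_DEFAULT;
bd.BindFlags           = D3D11_BIND_UNORDERED_ACCESS; // add D3D11_BIND_SHADER_RESOURCE if an SRV is also wanted
bd.MiscFlags           = D3D11_RESOURCE_MISC_BUFFER_STRUCTURED;
bd.StructureByteStride = sizeof(VPL);

ID3D11Buffer* buffer = NULL;
device->CreateBuffer(&bd, NULL, &buffer);

D3D11_UNORDERED_ACCESS_VIEW_DESC uavDesc = {};
uavDesc.Format              = DXGI_FORMAT_UNKNOWN; // required for structured buffers
uavDesc.ViewDimension       = D3D11_UAV_DIMENSION_BUFFER;
uavDesc.Buffer.FirstElement = 0;
uavDesc.Buffer.NumElements  = numElements;

ID3D11UnorderedAccessView* uav = NULL;
device->CreateUnorderedAccessView(buffer, &uavDesc, &uav);
```

Since nothing on the CPU ever reads or writes the data, no staging buffer or Map/Unmap is needed; the buffer lives entirely on the GPU.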
Shader one:
struct VPL {
float3 pos;
float3 norm;
float3 flux;
};
RWStructuredBuffer<VPL> buffer : register( u0 );
...
buffer[0].pos = float3(0.1f,0.5f,0.4f);
Shader two:
RWStructuredBuffer<VPL> buffer : register( u0 );
float3 pos = buffer[0].pos;