
HLSL StructuredBuffer usage



I am a newbie to DirectX and HLSL shaders.

In my HLSL shader, I would like to use a StructuredBuffer to store volume data:

    StructuredBuffer<float> _VolBuffer;

    float SampleVol(int index1D)
    {
        return _VolBuffer[index1D];
    }

However, I am not clear on how to initialize this buffer from the CPU side, i.e., how do I create this buffer and fill it with some initial data?






You said you are a newbie in DirectX and HLSL, so I would strongly recommend that you stay away from D3D12 and stick with D3D11. D3D12 is only useful for people who are already Experts (with a big E), and for projects that have to push very far or hit edge cases where a feature is possible in D3D12 but not in 11.


But anyway, on 12, you have to create instances of ID3D12Resource. You will need two: one in an upload heap and one in the default heap. You fill the first one on the CPU using Map. Then you need a command list to copy it to the second one. Of course, you have to manage the resource state of the destination with barriers (copy destination, shader resource, ...). You then execute the command list on the command queue. You also need a fence to wait for GPU completion before you can destroy the resource in the upload heap. After all of this, you have a GPU-usable buffer, and you still need to bind it somehow for the shader to use it; the choices are multiple: a root SRV, or a descriptor table in a descriptor heap.
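The steps above can be sketched roughly like this. This is a minimal, untested sketch using the d3dx12.h helpers; `device`, `cmdList`, `cmdQueue`, `fence`, `fenceEvent`, `fenceValue`, and `volumeData` (a std::vector<float>) are all assumed to exist already, and error checking is omitted:

    // 1) Default-heap resource the shader will read from (starts in COPY_DEST state).
    const UINT64 bufferSize = sizeof(float) * volumeData.size();
    CD3DX12_HEAP_PROPERTIES defaultHeap(D3D12_HEAP_TYPE_DEFAULT);
    CD3DX12_HEAP_PROPERTIES uploadHeap(D3D12_HEAP_TYPE_UPLOAD);
    CD3DX12_RESOURCE_DESC   bufferDesc = CD3DX12_RESOURCE_DESC::Buffer(bufferSize);

    ComPtr<ID3D12Resource> volBuffer;
    device->CreateCommittedResource(&defaultHeap, D3D12_HEAP_FLAG_NONE, &bufferDesc,
        D3D12_RESOURCE_STATE_COPY_DEST, nullptr, IID_PPV_ARGS(&volBuffer));

    // 2) Upload-heap staging resource, filled on the CPU via Map.
    ComPtr<ID3D12Resource> uploadBuffer;
    device->CreateCommittedResource(&uploadHeap, D3D12_HEAP_FLAG_NONE, &bufferDesc,
        D3D12_RESOURCE_STATE_GENERIC_READ, nullptr, IID_PPV_ARGS(&uploadBuffer));

    void* mapped = nullptr;
    uploadBuffer->Map(0, nullptr, &mapped);
    memcpy(mapped, volumeData.data(), bufferSize);
    uploadBuffer->Unmap(0, nullptr);

    // 3) GPU-side copy, then a barrier to make the buffer shader-readable.
    cmdList->CopyBufferRegion(volBuffer.Get(), 0, uploadBuffer.Get(), 0, bufferSize);
    auto barrier = CD3DX12_RESOURCE_BARRIER::Transition(volBuffer.Get(),
        D3D12_RESOURCE_STATE_COPY_DEST,
        D3D12_RESOURCE_STATE_NON_PIXEL_SHADER_RESOURCE);
    cmdList->ResourceBarrier(1, &barrier);

    // 4) Execute, then fence-wait so the upload buffer can be safely released.
    cmdList->Close();
    ID3D12CommandList* lists[] = { cmdList.Get() };
    cmdQueue->ExecuteCommandLists(1, lists);
    cmdQueue->Signal(fence.Get(), ++fenceValue);
    fence->SetEventOnCompletion(fenceValue, fenceEvent);
    WaitForSingleObject(fenceEvent, INFINITE);

After that you still have to create an SRV for `volBuffer` (in a descriptor heap, or bind it as a root SRV) before the shader can index into it.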


On D3D11, you call ID3D11Device::CreateBuffer, providing a description struct with the shader-resource binding flag and a pointer to the CPU data you want to put in it. Done :)
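To make that concrete, here is a minimal sketch of the D3D11 path. Note that a StructuredBuffer additionally needs the D3D11_RESOURCE_MISC_BUFFER_STRUCTURED misc flag, a StructureByteStride, and an SRV created over the buffer; `device`, `context`, and `volumeData` (a std::vector<float>) are assumed to exist, and error checking is omitted:

    // Buffer description for an immutable StructuredBuffer<float>.
    D3D11_BUFFER_DESC desc = {};
    desc.ByteWidth           = UINT(sizeof(float) * volumeData.size());
    desc.Usage               = D3D11_USAGE_IMMUTABLE;        // filled once, at creation
    desc.BindFlags           = D3D11_BIND_SHADER_RESOURCE;
    desc.MiscFlags           = D3D11_RESOURCE_MISC_BUFFER_STRUCTURED;
    desc.StructureByteStride = sizeof(float);                // one element = one float

    // Initial CPU data goes in through the subresource-data pointer.
    D3D11_SUBRESOURCE_DATA init = {};
    init.pSysMem = volumeData.data();

    ComPtr<ID3D11Buffer> buffer;
    device->CreateBuffer(&desc, &init, &buffer);

    // The SRV is what actually gets bound to the shader register.
    D3D11_SHADER_RESOURCE_VIEW_DESC srvDesc = {};
    srvDesc.Format             = DXGI_FORMAT_UNKNOWN;        // required for structured buffers
    srvDesc.ViewDimension      = D3D11_SRV_DIMENSION_BUFFER;
    srvDesc.Buffer.NumElements = UINT(volumeData.size());

    ComPtr<ID3D11ShaderResourceView> srv;
    device->CreateShaderResourceView(buffer.Get(), &srvDesc, &srv);
    context->PSSetShaderResources(0, 1, srv.GetAddressOf()); // binds to register t0

The slot index passed to PSSetShaderResources must match the register the buffer ends up in on the HLSL side.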


I have to agree with galopin: D3D12 is not something you just jump into.


If you are still determined, then I recommend the Microsoft dev channel on YouTube.


It goes through everything; they talk a lot about the general things you need to keep in mind when working in DX12. The presentation on porting from DX11 to DX12 is one I really enjoyed, mainly because it shows that DX11 and 12 are chalk and cheese.

