
DX11 Constant Buffer binding (how long does it remain?)

Recommended Posts


I have read a lot about binding a constant buffer to a shader, but something is still unclear to me.

For example, when performing: vertexshader.setConstantbuffer(buffer, slot)

is the buffer bound

a. to the vertex shader stage, or

b. to the vertex shader that is currently set as the active vertex shader?

Is it possible to bind a constant buffer to a specific vertex shader, e.g. VS_A, and keep this binding even after the active vertex shader has changed? I mean, I want to bind ConstantBuffer_A to VS_A and ConstantBuffer_B to VS_B, and then only call updateSubresource, without calling setConstantbuffer every time.

Look at this example:

SetVertexShader(VS_A)
vertexshader.setConstantbuffer(buffer_A, slot_A)
perform draw call    (buffer_A is used)

SetVertexShader(VS_B)
vertexshader.setConstantbuffer(buffer_B, slot_A)
perform draw call    (buffer_B is used)

SetVertexShader(VS_A)
perform draw call    (now which buffer is used???)


I ask this question because I have written a custom render engine and want to minimize the number of updateSubresource and setConstantbuffer calls.
Share this post

Link to post
Share on other sites

Bindings of constant buffers are separate from, and unrelated to, the currently bound Vertex Shader.

When you bind a constant buffer, you are binding it to the pipeline stage, not to the currently set Vertex Shader. For that reason, you are free to set a constant buffer once and then repeatedly update its contents and change the Vertex Shader without needing to rebind the constant buffer. In your example, the third draw call therefore still uses buffer_B, because it was the last buffer bound to slot_A.


You should think of the pipeline as one large finite state machine:

(vertex shader program, vertex shader c|t|s registers, hull shader program, hull shader c|t|s registers, ...)

Note that this state also includes non-shader components such as the rasterizer state (RS), blend state (OM), and depth-stencil state (OM).

As long as you do not rebind a component of the state, the currently bound component is used. The shader program and the shader registers (c = constant buffer, t = texture/SRV, s = sampler) are separate components of that state. So if you bind a constant buffer (c register), sampler state (s register), etc. to a shader stage and do not rebind those slots, the bindings persist, independent of which shader program is bound.


