How to declare integers in vertex shader input?



#1 jamesxli   Members   -  Reputation: 303


Posted 05 January 2014 - 10:55 PM

I have a vertex shader that takes a 3D position and an integer as input parameters. The vertex shader passes the integer on to the geometry shader, which generates quads depending on that integer value. The relevant HLSL code looks as follows:

struct VS_In {
    float4 p:  SV_POSITION;
    uint gType : TEXCOORD; 
};

void VShader(inout VS_In v) {
   v.p = mul(v.p, mTrans);
}

[maxvertexcount(8)]
void GShader(point VS_In input[1], inout TriangleStream<PS_In> stream )
{
    EmitGlyph(stream, input[0].p, input[0].gType);
    ....
}

The code works as expected when DeviceCreationFlags.Debug is NOT set. When the debug flag is set, the program always fails at the context.DrawIndexed() call with the following error message:

D3D11 ERROR: ID3D11DeviceContext::DrawIndexed: Input Assembler - Vertex Shader linkage error: Signatures between stages are incompatible. Semantic 'TEXCOORD' has mismatched data types between the output stage and input stage. [ EXECUTION ERROR #344: DEVICE_SHADER_LINKAGE_COMPONENTTYPE]

It looks like the integer parameter gType has the wrong data type, as TEXCOORD should be float4 according to the HLSL documentation. I tried different semantics for the gType parameter, but none worked. My question is: Can I pass generic integer values to the vertex shader via the vertex buffer? If yes, how should I declare its semantic so that the debug layer doesn't complain about it?

 

 




#2 Hodgman   Moderators   -  Reputation: 31800


Posted 06 January 2014 - 12:08 AM

What does your input assembler layout look like?



#3 MJP   Moderators   -  Reputation: 11742


Posted 06 January 2014 - 02:43 AM

It's basically telling you that the DXGI format for your TEXCOORD input element is incompatible with the 'uint' HLSL type. You're probably using a FLOAT format, and the uint type is only compatible with INT or UINT formats.
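For example (just a sketch of the idea, assuming a SlimDX-style InputElement declaration on your side):

// HLSL side:  uint gType : TEXCOORD;
// C# side: the element bound to TEXCOORD must use an integer DXGI format.
new InputElement("TEXCOORD", 0, Format.R32_UInt, 0)     // compatible with 'uint'
// new InputElement("TEXCOORD", 0, Format.R32_Float, 0) // only compatible with 'float'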



#4 jamesxli   Members   -  Reputation: 303


Posted 06 January 2014 - 11:00 AM

Thank you for the reply. The following is my code to set up the input assembler:

 

int vSize = 16;
using (DataStream ds = new DataStream(vSize * (bodies.Count + 1), true, true)) {
    FillVertexStream(ds);
    EffectPass ePass = effect.GetTechniqueByName("SpaceMap").GetPassByName("Glyphs");
    ctx.InputAssembler.InputLayout = new InputLayout(device, ePass.Description.Signature, new[] {
        new InputElement("SV_POSITION", 0, Format.R32G32B32_Float, 0),
        new InputElement("TEXCOORD", 0, Format.R32_UInt, 0),
    });
    ctx.InputAssembler.PrimitiveTopology = PrimitiveTopology.PointList;
    vertexBuffer = new Buffer(device, ds, (int)ds.Length,
        ResourceUsage.Default,
        BindFlags.VertexBuffer,
        CpuAccessFlags.None,
        ResourceOptionFlags.None, 0);
}


#5 Jason Z   Crossbones+   -  Reputation: 5303


Posted 06 January 2014 - 11:43 AM


> My question is: Can I pass generic integer values to the vertex shader via the vertex buffer? If yes, how should I declare its semantic so that the debug layer doesn't complain about it?
Yes, you most definitely can.  The error is indicating that there is a mismatch between what your vertex shader is expecting and what your input assembler is producing.  Your input layout seems to be ok, although I'm not familiar with the C# syntax for creating an InputElement.  In particular, what do the two parameters that you have marked as "0" stand for?  I would guess one of them is the semantic index, and the other might be the byte offset.  If so, then the byte offset shouldn't be zero for the second InputElement - instead it should be the size of all elements that came before it (i.e. 12 bytes in your case).
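Something like this is what I mean by explicit offsets (just a sketch, assuming a SlimDX-style InputElement overload that takes the byte offset before the slot):

new[] {
    // semantic name, semantic index, format, byte offset, input slot
    new InputElement("SV_POSITION", 0, Format.R32G32B32_Float, 0, 0),
    new InputElement("TEXCOORD", 0, Format.R32_UInt, 12, 0),  // 12 = size of the float3 position before it
}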

 

The other thing that I noticed is that you gave us your VS_In struct above, but you showed it being consumed by the Geometry Shader.  What does your Vertex Shader signature look like?  Does it use the same struct for input and output of the Vertex Shader?



#6 jamesxli   Members   -  Reputation: 303


Posted 06 January 2014 - 04:02 PM

Hi Jason, thanks for the reply. The first index of InputElement() is indeed the semantic index, but the second index is the input assembler slot, so both of them need to be zero in this case. The struct VS_In is indeed both the input and the output signature of the vertex shader (as implied by the parameter modifier "inout"). I guess the debug layer somehow doesn't interpret it this way and reports it as an error. For now, I have configured the debug layer to ignore this kind of error, and it seems to work so far.





