# Vertex stride in stream 0 is less than in the declaration


## Recommended Posts

Hello all, I'm calling CreateVertexBuffer() for my output vertices, followed by ProcessVertices() on them. CreateVertexBuffer() returns S_OK, but ProcessVertices() fails with an error. Using the debug version of the SDK with DebugView, the message I got was: Vertex stride in stream 0 is less than in the declaration. Can someone please explain this error to me? I would really appreciate it, and please include some explanation since I'm pretty much a newbie. :) I investigated a little myself, and the error is not caused by me leaving out the output VertexDecl in the function call below, since I'm using a shader model below 3.0.
```cpp
HRESULT ProcessVertices(
    UINT SrcStartIndex,
    UINT DestIndex,
    UINT VertexCount,
    IDirect3DVertexBuffer9 *pDestBuffer,
    IDirect3DVertexDeclaration9 *pVertexDecl,
    DWORD Flags
);
```


Thanks. [EDIT] I looked at the code and I'm calling:

```cpp
SetStreamSource(0, pVertexBuffer, 0, 12);
```

Before this code, there is a CreateVertexBuffer() call with dwFVF set to 0x42, which selects XYZ and DIFFUSE. Is the error I'm getting because the sizes of XYZ and DIFFUSE add up to 4 * 4 = 16 bytes, as opposed to the 12 passed as the stride? This code wasn't written by me and I'm just trying to debug the problem. I'd appreciate it if someone could explain what's going on. :) [Edited by - madiyaan on April 7, 2007 5:43:34 PM]

##### Share on other sites
Yes, with XYZ and DIFFUSE you get 3 + 1 = 4 four-byte elements, i.e. a 16-byte stride. The vertex stride you pass to SetStreamSource has to equal the stride computed from the FVF or vertex declaration you set via SetFVF or SetVertexDeclaration.

I'm not sure whether the stream stride is ever allowed to be greater than the declaration's stride, but I don't think that's possible either.

##### Share on other sites
Thanks for the quick reply, Janta.

I have a few more questions/clarifications.

So, this error is popping up because, when I set my stream source via SetStreamSource(), the last parameter to that call (the stride) did not equal the size of the struct implied by the FVF flags passed to the earlier CreateVertexBuffer() call? I just want to make sure, because I'm creating the output buffer via a declaration rather than an FVF, and I want to confirm the output buffer isn't causing the problem. It's the input vertex buffer that's causing the problems, right?

What's happening in the code is this:

1) GetFVF() is called and the dwFVF is recorded. After tracing through the code, I found this to be DIFFUSE | XYZ.
2) This dwFVF is used as the 3rd parameter to CreateVertexBuffer().
3) SetStreamSource() is called with first parameter 0 and stride is 12 (this I found after tracing through the code).
4) Then the output FVF is set and an output Vertex buffer is created via CreateVertexBuffer().
5) Now ProcessVertices() is called and the DX runtime throws the error that I described in the first post.

Can you please verify that the error is happening because of the input vertex buffer?

I did an experiment. I created the input vertex buffer using a dwFVF of 0. I just gave it a stride of 12. It still gave me the same error. Does this mean your theory is correct?

[EDIT 1] What I don't get is why the error talks about a declaration when I'm not even using one. I'm using an FVF, not a DECL. Can someone please clarify?

[Edited by - madiyaan on April 7, 2007 5:23:24 PM]

##### Share on other sites
Going to bump this for the Sunday crowd. Any comments/answers would be appreciated.

Thanks,

##### Share on other sites
Yeah, you seem to have got the right idea - D3D is finding conflicting information about the data and getting all upset over it [smile]

Quote:
 [EDIT 1] What I don't get is why the error talks about a declaration when I'm not even using one. I'm using an FVF, not a DECL. Can someone please clarify?

Internally, D3D is likely to be using declarations - they're a superset of, and successor to, FVFs. It might never 'see' the FVF at all, as some code in the runtime immediately converts it to a declaration internally, or something along those lines...

Just as a matter of coding style - the use of an arbitrary constant 12 in that parameter is just nasty! It would be better abstracted out as a constant matching the vertex declaration, as a sizeof(my_vertex_type), or even as a call to D3DXGetFVFVertexSize().

Bottom line - get everything matching up and you should be sorted.

hth
Jack

##### Share on other sites
Quote:
Original post by jollyjeffers
Yeah, you seem to have got the right idea - D3D is finding conflicting information about the data and getting all upset over it [smile] [...] Bottom line - get everything matching up and you should be sorted.

Thank you for the reply, Jack.

First, let me try to explain what I'm trying to do in the app (background). I'm trying to emulate the functionality of DrawPrimitiveUP without actually calling DrawPrimitiveUP, and I want to capture the output of the vertex shader myself after running it in software.

I'm doing the following:

1. Creating an input vertex buffer. I set the FVF flag to 0, because all I know is the user-defined stride and the user-defined input data (since I'm trying to emulate DrawPrimitiveUP()). I just have a vertex shader, a user-provided buffer, and the stride of that buffer, and I want to store the result in an output buffer.

I don't want to use an FVF or any other declaration. I want to use ONLY the user-defined stride (and that I do in SetStreamSource).

Notice that I can't really extract an FVF from the user-provided data, so I set the FVF to 0, which I assume is an invalid FVF.

2. I then call SetStreamSource with the newly-created vertex-buffer and user-provided stride.

3. I then create an output vertex buffer.

4. I call ProcessVertices(). Unfortunately, it still gives me the same error: Vertex stride in stream 0 is less than in the declaration.

How can this be? I'm not even using an FVF now. The FVF I use to create the input vertex buffer is 0.

Can anyone tell me how to do this properly? I'm given nothing but the vertex shader, the user-provided data, and the user-provided stride, and I want to get the output vertices. What is DirectX comparing the stream-0 stride against when it produces that error? Is there a way to make it skip that comparison? (That's what I tried to do by setting the FVF to 0, so it would have nothing to compare the stride against.)

Maybe I need to generate an FVF code based on the user-provided stride, to make sure its size is no greater than that stride, but the invalid FVF value of 0 is the smallest I can provide... which still errors.

Help would be appreciated.

[Edited by - madiyaan on April 16, 2007 4:27:09 PM]

##### Share on other sites
Bumping this for the last time.

Any and every kind of help would be highly appreciated.

Thank you very much,

##### Share on other sites
Quote:
 I don't want to use FVF or any other declaration. I want to use ONLY the user-defined stride (and that I do in SetStreamSource).
Can't do that. The input assembler on the graphics card needs to know where to pull data from to fill the vertex shader's input registers with. That's what the declaration is used for - mapping decl_usage instructions in the shader to byte offsets within the vertex.

Check the docs for DrawPrimitiveUP again - at the end there's this line:

Quote:
 When converting a legacy application to Direct3D 9, you must add a call to either IDirect3DDevice9::SetFVF to use the fixed function pipeline, or IDirect3DDevice9::SetVertexDeclaration to use a vertex shader before you make any Draw calls.

So, normally, DPUP requires that the caller has already set up an FVF or declaration via SetFVF or SetVertexDeclaration.

##### Share on other sites
Quote:
Original post by superpig
Can't do that. The input assembler on the graphics card needs to know where to pull data from to fill the vertex shader's input registers with. [...] So, normally, DPUP requires that the caller has already set up an FVF or declaration via SetFVF or SetVertexDeclaration.

First of all, thank you very much, superpig. I really appreciate it.

Next question: how do I get the FVF just from the stride information? And yes, you guessed it correctly - I'm trying to use DPUP without actually calling the function.

So, to reiterate: any suggestions on how to extract the correct FVF from the stride and buffer passed to DrawPrimitiveUP()? Remember that all I have is the parameters of DrawPrimitiveUP and the current state of the pipeline from the device interface.

Regards,

##### Share on other sites
IDirect3DDevice9::GetFVF?

##### Share on other sites
Quote:
Original post by sirob
IDirect3DDevice9::GetFVF?

Thank you for the reply, Sirob. As I noted at the end of my first post, when I call GetFVF() it returns 0x42, which implies a 16-byte vertex. But the stride given to me by the user is only 12 bytes. That's how I first noticed the error. If I just forward the call to DrawPrimitiveUP() directly, it works.

But I'm trying to avoid using DrawPrimitiveUP(). I want to do the vertex processing in software and get the output vertices. So, back to the question: is there a way to get the FVF from the user-provided stride, the user-provided buffer, and the current pipeline state?

There must be a way, since DirectX can do DrawPrimitiveUP() itself. :)

Thank you once again for your time, DX Gurus,

##### Share on other sites
Quote:
Original post by madiyaan
Next question, how do I get the FVF just from the stride information?
You can't. Consider the following FVF codes:

```cpp
D3DFVF_XYZ  | D3DFVF_DIFFUSE | D3DFVF_SPECULAR
D3DFVF_XYZ  | D3DFVF_TEX1    | D3DFVF_TEXCOORDSIZE2(0)
D3DFVF_XYZW | D3DFVF_DIFFUSE
```

These are clearly very different vertex configurations, but they all pack down to the same stride (20 bytes). Because FVF-to-stride is a many-to-one mapping, you can't invert it.

Quote:
 the current state of the pipeline from the device interface.
That's where you get the vertex configuration data from, indeed - but the point of my earlier post (which you seem to have missed) is that if you're writing a function to be used in place of DPUP then it's not your responsibility to calculate and set the vertex format - it's the caller's. DPUP doesn't do any work to set the FVF or vertex declaration, it assumes that the caller has already set that up (as part of the current pipeline state) and just pushes its draw calls through.

If you're doing something more interesting than DPUP that does require vertex layout data, though, then use IDirect3DDevice9::GetVertexDeclaration(). Do not use FVF codes if you want your function to work for any vertex format as there are many vertex formats that vertex declarations can express which FVFs can't.