SiegeLord

OpenGL Using custom vertex declarations without a shader


In a library that I am working on, I have created a vertex format that is understood by both OpenGL and Direct3D. On the Direct3D side I do this via D3D vertex declarations (i.e. not the FVF constants). I do not need any vertex/pixel shaders, so I don't use any. This seems to work fine in retail mode, but when I run my library against the debug Direct3D libraries I get a lot of errors like this:

00000007 0.18344063 [724] Direct3D9: Decl Validator: X247: (Element Error) (Decl Element [0]) Declaration can't map to fixed function FVF because position must use D3DDECLTYPE_FLOAT3.
00000008 0.18347779 [724] Direct3D9: Decl Validator: X291: (Element Error) (Decl Element [2]) Declaration can't map to fixed function FVF because gaps or overlap between vertex elements are not allowed. Offset encountered is: 28, but expected offset is 12.
00000009 0.18351243 [724] Direct3D9: Decl Validator: X288: (Global Error) Declaration can't map to fixed function FVF because position field is missing. Usage: D3DDECLUSAGE_POSITION, with usage index 0 is required for fixed function.
00000010 0.18356439 [724] Direct3D9: (ERROR) :DrawPrimitiveUP failed.

In other words, for some reason D3D is trying to shoehorn my vertex declaration into an FVF code, and is obviously failing. I read somewhere that unless you specify a vertex shader, D3D will assume that you are using the fixed-function pipeline. Is that correct? Is there any way I can tell it that I don't want to use it? I'd prefer not to have to create a dummy shader, and I definitely can't alter the vertex structure to conform to an FVF vertex. Feel free to ask for code if that's needed.

If you're not using a shader, how do you expect DirectX to interpret your information? The vertex and pixel shaders are major stages in the pipeline.

I guess I expect it to generate a do-nothing shader, which is apparently what the retail libraries do, since it works without a hitch there. Perhaps I can just ignore these errors?

You must call dev->SetFVF() before you call DrawPrimitiveUP so the pipeline knows how to interpret the buffer. It expects the vertex data to match the vertex format to the extent that it needs position, color, normals, etc., and it expects the vertex data in a certain order.

It appears you're using some vertex parameters for your use outside the pipeline. Is that correct?

If so, the following should work (I say not knowing exactly how you're doing things).

If your custom vertex structure is in the proper order to satisfy FVF demands, you can simply pad the FVF with extra TEXCOORDS so the FVF calculated size is the same as your custom size. The stride in the SetStreamSource must match the size calculated from the FVF. If you "fake" the FVF, you can set the stride in SetStreamSource to match - provided, of course, that your pipeline settings will ignore the extra data.

FVF data expects the actual vertex data order to match any FVF flags that are set for: "position, vertex blending weights, normal, colors, and the number and format of texture coordinates" (from the docs for Fixed Function FVF Codes). It will use, however, only the data from each vertex that the pipeline demands. Beyond that, your vertex can contain whatever data you want.

Quote:
Original post by SiegeLord
I read somewhere that unless you specify a vertex shader, D3D will assume that you are using the fixed function pipeline. Is that correct? Any way I can tell it that I don't want to use it? I'd prefer not to have to create a dummy shader, and I definitely can't alter the vertex structure to conform to a FVF vertex.


If your shaders are set to NULL, you'll get the fixed-function pipeline. The only way around that is to set shaders.


Quote:

You must call dev->SetFVF() before you call DrawPrimitiveUP so the pipeline knows how to render the buffer.

I don't use SetFVF(), I use SetVertexDeclaration(). As for rearranging the vertex layout, that's not really an option for me.

Quote:

If your shaders are set to NULL, you'll get the fixed-function pipeline. The only way around that is to set shaders.

I see, that's unfortunate... Do you know if setting either shader type will do the trick, or must it be a vertex shader?

Wonder if just this might be sufficient, hehe:
"mov oPos, v0".

Quote:
Original post by SiegeLord
Wonder if just this might be sufficient, hehe:
"mov oPos, v0".


You'll need to set both. And that's a sufficient vertex shader, at least if you add the header and dcl stuff. If you want you can even create a super-basic shader, compile it to byte code, and then store the compiled byte code as a static array of unsigned int in some class. That way you don't have to depend on a file.

Okay. I think I'll end up just ignoring the errors... making these shaders work is too much trouble. The library supports arbitrary vertex layouts, so I'd have to generate the shaders at runtime, which I just don't feel like doing. The Direct3D side of the library is already a giant hack to bring it up to a semblance of the flexibility offered by OpenGL; no need to make it worse, hah.

Thanks for the help, everyone.

Sure.


//The vertex structure
struct PRIM_COLOR
{
uint32_t d3d_color;
float r, g, b, a;
};

struct VERTEX
{
float x, y;
PRIM_COLOR color;
float u, v;
};

//The declaration
D3DVERTEXELEMENT9 vertex_decl[] =
{
{0, 0, D3DDECLTYPE_FLOAT2, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION, 0},
{0, 8, D3DDECLTYPE_D3DCOLOR, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_COLOR, 0},
{0, 28, D3DDECLTYPE_FLOAT2, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD, 0},
D3DDECL_END()
};



Note the color structure. That is the reason I can't change the vertex layout: I want to keep the OpenGL color (the 4 floats) together with the D3D color (so the user can treat the structure as a cross-API color), and this is not possible with an FVF, since it requires the vertex to be laid out as position, color, texture coords, then non-FVF data. Since the non-FVF data in this case is the OpenGL color, I am stuck. Also, OpenGL cannot reuse the D3D color, because its color components are in a different order.

Also, I have this function elsewhere in the code:


VERTEX_DECL* create_vertex_decl(const VERTEX_ELEMENT* elements, int stride)
{
VERTEX_DECL* ret = malloc(sizeof(VERTEX_DECL));
ret->d3d_decl = 0; /* default; otherwise left uninitialized on the non-DIRECT3D path below */
ret->elements = malloc(sizeof(VERTEX_ELEMENT) * PRIM_ATTR_NUM);
memset(ret->elements, 0, sizeof(VERTEX_ELEMENT) * PRIM_ATTR_NUM);
while(elements->attribute) {
ret->elements[elements->attribute] = *elements;
elements++;
}

#ifdef CFG_D3D
{
int flags = get_display_flags();
if (flags & DIRECT3D) {
DISPLAY *display;
LPDIRECT3DDEVICE9 device;
D3DVERTEXELEMENT9 d3delements[PRIM_ATTR_NUM + 1];
int idx = 0;
VERTEX_ELEMENT* e;
D3DCAPS9 caps;

display = get_current_display();
device = d3d_get_device(display);

IDirect3DDevice9_GetDeviceCaps(device, &caps);
if(caps.PixelShaderVersion < D3DPS_VERSION(3, 0)) {
ret->d3d_decl = 0;
} else {
e = &ret->elements[PRIM_POSITION];
if(e->attribute) {
int type = 0;
switch(e->storage) {
case PRIM_FLOAT_2:
type = D3DDECLTYPE_FLOAT2;
break;
case PRIM_FLOAT_3:
type = D3DDECLTYPE_FLOAT3;
break;
case PRIM_SHORT_2:
type = D3DDECLTYPE_SHORT2;
break;
}
d3delements[idx].Stream = 0;
d3delements[idx].Offset = e->offset;
d3delements[idx].Type = type;
d3delements[idx].Method = D3DDECLMETHOD_DEFAULT;
d3delements[idx].Usage = D3DDECLUSAGE_POSITION;
d3delements[idx].UsageIndex = 0;
idx++;
}

e = &ret->elements[PRIM_TEX_COORD];
if(!e->attribute)
e = &ret->elements[PRIM_TEX_COORD_PIXEL];
if(e->attribute) {
int type = 0;
switch(e->storage) {
case PRIM_FLOAT_2:
case PRIM_FLOAT_3:
type = D3DDECLTYPE_FLOAT2;
break;
case PRIM_SHORT_2:
type = D3DDECLTYPE_SHORT2;
break;
}
d3delements[idx].Stream = 0;
d3delements[idx].Offset = e->offset;
d3delements[idx].Type = type;
d3delements[idx].Method = D3DDECLMETHOD_DEFAULT;
d3delements[idx].Usage = D3DDECLUSAGE_TEXCOORD;
d3delements[idx].UsageIndex = 0;
idx++;
}

e = &ret->elements[PRIM_COLOR_ATTR];
if(e->attribute) {
d3delements[idx].Stream = 0;
d3delements[idx].Offset = e->offset;
d3delements[idx].Type = D3DDECLTYPE_D3DCOLOR;
d3delements[idx].Method = D3DDECLMETHOD_DEFAULT;
d3delements[idx].Usage = D3DDECLUSAGE_COLOR;
d3delements[idx].UsageIndex = 0;
idx++;
}

d3delements[idx].Stream = 0xFF;
d3delements[idx].Offset = 0;
d3delements[idx].Type = D3DDECLTYPE_UNUSED;
d3delements[idx].Method = 0;
d3delements[idx].Usage = 0;
d3delements[idx].UsageIndex = 0;

IDirect3DDevice9_CreateVertexDeclaration(device, d3delements, (IDirect3DVertexDeclaration9**)&ret->d3d_decl);
}
}
}
#else
ret->d3d_decl = 0;
#endif

ret->stride = stride;
return ret;
}



So, even if I do write a shader for that declaration up-top, it will still not work for the custom declarations I support.
