-Tau-

Unsigned int in vertex shader


Recommended Posts

I'm trying to pass an unsigned int to the vertex shader, but the shader acts as if it receives only the first byte of the number.

 

Vertex structure:

struct HMMULTITEX
{
    D3DXVECTOR3 pos;
    D3DXVECTOR3 norm;
    D3DXVECTOR2 tex;
    UINT texID;
};

Vertex declaration:

D3DVERTEXELEMENT9 aVertDecl[] =
{
    { 0, 0,  D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION, 0 },
    { 0, 12, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_NORMAL,   0 },
    { 0, 24, D3DDECLTYPE_FLOAT2, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD, 0 },
    { 0, 32, D3DDECLTYPE_UBYTE4, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD, 1 },
    D3DDECL_END()
};

 

HLSL code:

unsigned int vTexHeight : TEXCOORD1
out float oTH[8] : TEXCOORD3

// Each output is one 4-bit value (nibble) of vTexHeight, scaled to [0, 1]:
unsigned int retnum = vTexHeight % 0x10;   // nibble 0
oTH[0] = retnum / 15.0f;

retnum = vTexHeight / 0x10;                // nibble 1
retnum = retnum % 0x10;
oTH[1] = retnum / 15.0f;

retnum = vTexHeight / 0x100;               // nibble 2
retnum = retnum % 0x10;
oTH[2] = retnum / 15.0f;

retnum = vTexHeight / 0x1000;              // nibble 3
retnum = retnum % 0x10;
oTH[3] = retnum / 15.0f;

// ...and so on up to oTH[7]

Problem: Only oTH[0] and oTH[1] work; all the others are always 0. It looks like the vertex shader gets only the first byte of vTexHeight.
Everything above 0xff is ignored.
How can I pass an unsigned int to the vertex shader? Why does the current vertex shader ignore everything above the first byte?


D3DDECLTYPE_UBYTE4 is a declaration for a four-component variable of unsigned bytes. When the vertex is passed to the shader, the four bytes comprising texID are interpreted as four separate components, so your scalar input only ever sees the first of them, the lowest byte. Your vertex structure is just storage for values; the vertex declaration determines how those values are to be interpreted when they're passed to the shader.
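
To make that concrete, here is a minimal sketch of how the same data could be received; the input name vTexBytes is illustrative, not from the post. UBYTE4 is unnormalized, so each component arrives as a float in the 0..255 range:

// Sketch (assumed name vTexBytes): with D3DDECLTYPE_UBYTE4 the input
// must be declared with four components, not as a scalar.
float4 vTexBytes : TEXCOORD1;

// .x is the lowest byte of the original UINT, .w the highest:
float b0 = vTexBytes.x;   // bits 0..7
float b1 = vTexBytes.y;   // bits 8..15
float b2 = vTexBytes.z;   // bits 16..23
float b3 = vTexBytes.w;   // bits 24..31

Rebuilding the full 32-bit value inside the shader is best avoided: vs_3_0 registers are floats with 24-bit mantissas, so anything above 2^24 loses precision, whereas unpacking per byte sidesteps that. UBYTE4 is also an optional capability on D3D9 hardware (the D3DDTCAPS_UBYTE4 bit in D3DCAPS9.DeclTypes), which is one reason D3DCOLOR is a common fallback.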


Thanks for the tip. I changed D3DDECLTYPE_UBYTE4 to D3DDECLTYPE_D3DCOLOR and unsigned int vTexHeight to float4 vTexHeight. Now I get the bytes with:

// D3DCOLOR arrives normalized (0..1) and swizzled to RGBA in the shader;
// in memory the bytes are B, G, R, A, so .b is the lowest byte of the UINT.
// The + 0.5f rounds rather than truncates, guarding against the normalized
// value landing fractionally below the integer after the * 255.
int u1 = (int)(vTexHeight.b * 255 + 0.5f);   // bits 0..7
int u2 = (int)(vTexHeight.g * 255 + 0.5f);   // bits 8..15
int u3 = (int)(vTexHeight.r * 255 + 0.5f);   // bits 16..23
int u4 = (int)(vTexHeight.a * 255 + 0.5f);   // bits 24..31
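
For later readers, here is one way those four bytes could then fill the original oTH[8] outputs, two nibbles per byte; this reuses the names u1..u4 and oTH from the thread but is an assumed continuation, not posted code:

// Low nibble first, matching the % 0x10 then / 0x10 order of the original
// shader; dividing by 15.0f scales each 4-bit value into [0, 1].
oTH[0] = (u1 % 16) / 15.0f;   // low nibble of the lowest byte
oTH[1] = (u1 / 16) / 15.0f;   // high nibble of the lowest byte
oTH[2] = (u2 % 16) / 15.0f;
oTH[3] = (u2 / 16) / 15.0f;
oTH[4] = (u3 % 16) / 15.0f;
oTH[5] = (u3 / 16) / 15.0f;
oTH[6] = (u4 % 16) / 15.0f;
oTH[7] = (u4 / 16) / 15.0f;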
