StratBoy61

[solved] HLSL issue: BLENDINDICES & SetMatrixArray


Hello all, I must be doing something wrong, because a Google search for "BLENDINDICES & SetMatrixArray" returns only 5 hits... However, this is where I am now, and it kind of makes sense to me. So here it is: in order to use the GPU to display my MilkShape3D animation, I decided to create a vertex shader that applies the bones' transformations to my vertices. I am passing an array of matrices to the vertex shader; this array holds the current pose of each bone. I am also passing the vertex structure, to which I added a bone index for each vertex. Here is my vertex structure's definition in C++ and HLSL:
#define D3DFVF_BLENDVERTEX (D3DFVF_XYZ|D3DFVF_NORMAL|D3DFVF_TEXCOORDSIZE3(1)|D3DFVF_TEX2)

// The corresponding structure in the vertex shader is as described below
struct VS_INPUT {
    float4 position			: POSITION;
    float4 normal			: NORMAL;
    float3 boneIndices			: BLENDINDICES;
    float2 texCoord			: TEXCOORD0;
};
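For reference, that FVF declares a position, a normal, and two texture-coordinate sets (the second widened to three floats); it contains no BLENDINDICES at all, which is worth keeping in mind when reading the replies below. A sketch of the C++ struct it implies, with assumed field names:

```cpp
// Sketch of the layout declared by D3DFVF_XYZ | D3DFVF_NORMAL |
// D3DFVF_TEXCOORDSIZE3(1) | D3DFVF_TEX2. Field names are assumptions,
// not taken from the original post.
#include <cassert>

struct BlendVertex {
    float x, y, z;       // D3DFVF_XYZ
    float nx, ny, nz;    // D3DFVF_NORMAL
    float u, v;          // texcoord set 0 (default size: 2 floats)
    float boneIdx[3];    // texcoord set 1, widened by D3DFVF_TEXCOORDSIZE3(1)
};

// D3DXGetFVFVertexSize would report this stride for the FVF above.
static_assert(sizeof(BlendVertex) == 44, "stride implied by the FVF");
```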


I have no idea how/why I came up with this definition of D3DFVF_BLENDVERTEX; it was just trial and error. I would be glad to have information/corrections regarding this structure. The code below is my render function. It shows how I pass the parameters to the vertex shader, including the array of matrices:
m_pEffect->SetTechnique("RenderScene"); 
m_pEffect->SetMatrix("g_mWorldViewProjection", &modelViewProjection);

D3DXMATRIX SomeMat[25]; // bone palette; must hold at least m_arrBones.size() entries
// tentative code
for (int i = 0; i < m_arrVertices.size(); i++)
{
	int nBoneIndex = m_arrVertexBoneIds[i];

	D3DXMatrixMultiply(&SomeMat[nBoneIndex], &m_arrBones[nBoneIndex].matWorldInv, &m_arrBones[nBoneIndex].matWorldAnim);
	m_arrBlendVertices[i].p = m_arrVertices[i].p;
	//D3DXVec3TransformCoord(&m_arrBlendVertices[i].p, &m_arrVertices[i].p, &SomeMat[nBoneIndex]);
	m_arrBlendVertices[i].boneIndex[0] = nBoneIndex;
}

HRESULT hr = m_pEffect->SetMatrixArray("g_mhBones", &SomeMat[0], m_arrBones.size());
if (FAILED(hr))
	return E_FAIL;

D3DBlendVertex_t *pVertices = NULL;
hr = m_pBlendVertexBuffer->Lock(0, 0, (void **) &pVertices, D3DLOCK_DISCARD);
if (FAILED(hr))
	return E_FAIL;

//memcpy(pVertices, &m_arrBlendVertices[0], (sizeof(D3DBlendVertex_t) * m_arrVertices.size()));
hr = m_pBlendVertexBuffer->Unlock();
if (FAILED(hr))
	return E_FAIL;

UINT uPasses;
m_pEffect->Begin( &uPasses, 0 );

for( UINT uPass = 0; uPass < uPasses; ++uPass )
{
	m_pEffect->Pass(uPass);

	m_pDevice->SetFVF(D3DFVF_BLENDVERTEX);
	m_pDevice->SetStreamSource(0, m_pBlendVertexBuffer, 0, D3DXGetFVFVertexSize(D3DFVF_BLENDVERTEX));
	for (int i = 0; i < GetNumSubsets(); i++)
	{
		SetSubsetShader(i);
		DrawSubset(i);
	}
}
m_pEffect->End();
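As an aside, the loop at the top of that render function recomputes each bone's combined matrix once per vertex; iterating over the bones directly does the same work once per bone. A minimal sketch of that palette build, with a plain row-major Mat4 and a hypothetical Bone struct standing in for D3DXMATRIX and the poster's bone type:

```cpp
// Sketch: build the bone palette once per bone, not once per vertex.
// Mat4 and Bone are stand-ins for D3DXMATRIX and the original bone type.
#include <array>
#include <cassert>
#include <cstddef>

using Mat4 = std::array<float, 16>; // row-major, like D3DXMATRIX

// result = a * b, matching D3DXMatrixMultiply(&out, &a, &b)
static Mat4 Mul(const Mat4& a, const Mat4& b)
{
    Mat4 r{};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
            for (int k = 0; k < 4; ++k)
                r[i * 4 + j] += a[i * 4 + k] * b[k * 4 + j];
    return r;
}

struct Bone { Mat4 matWorldInv, matWorldAnim; };

// One inverse-bind * animated-world matrix per bone.
static void BuildPalette(const Bone* bones, std::size_t n, Mat4* palette)
{
    for (std::size_t b = 0; b < n; ++b)
        palette[b] = Mul(bones[b].matWorldInv, bones[b].matWorldAnim);
}
```

The palette array is then what SetMatrixArray uploads, and every entry is initialized, even for bones no vertex happens to reference.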


Now, if I uncomment the commented lines and remove the call to m_pEffect->Pass(uPass), I am simply displaying the animation through the CPU. I tried to implement the D3DXVec3TransformCoord() function in my vertex shader, as described below:
float4x4 g_mhBones[25] : Bones;     // Our array of bone transforms
float4x4 g_mWorldViewProjection;    // World * View * Projection matrix

struct VS_INPUT {
    float4 position			: POSITION;
    float4 normal			: NORMAL;
    float3 boneIndices			: BLENDINDICES;
    float2 texCoord			: TEXCOORD0;
};

struct VS_OUTPUT
{
    float4 Position   : POSITION;
    float4 Diffuse    : COLOR0;
    float2 TextureUV  : TEXCOORD0;
};

VS_OUTPUT VS_StandardSkinning( VS_INPUT IN )
{
    VS_OUTPUT Output;

    float4 v = mul(IN.position, g_mhBones[IN.boneIndices[0]]);
    Output.Position = mul(v, g_mWorldViewProjection);

    Output.Diffuse.a = 1.0f;
    Output.Diffuse.rgb = 1.0f;

    Output.TextureUV = IN.texCoord;
    return Output;
}

technique RenderScene
{
    pass P0
    {          
        VertexShader = compile vs_1_1 VS_StandardSkinning();
    }
}
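One way to debug a shader like this (a sketch of my own, not from the thread) is to mirror its two mul() calls on the CPU and compare the results against what the GPU path produces. Mat4, Vec4 and the row-vector multiply below stand in for the D3DX types:

```cpp
// CPU mirror of the shader's position math, for checking the matrix
// palette independently of the GPU. Types are stand-ins for D3DX.
#include <array>
#include <cassert>

using Mat4 = std::array<float, 16>; // row-major, like D3DXMATRIX
struct Vec4 { float x, y, z, w; };

// HLSL-style mul(v, m) with a row vector and a row-major matrix
static Vec4 MulVecMat(const Vec4& v, const Mat4& m)
{
    return {
        v.x * m[0] + v.y * m[4] + v.z * m[8]  + v.w * m[12],
        v.x * m[1] + v.y * m[5] + v.z * m[9]  + v.w * m[13],
        v.x * m[2] + v.y * m[6] + v.z * m[10] + v.w * m[14],
        v.x * m[3] + v.y * m[7] + v.z * m[11] + v.w * m[15],
    };
}

// position -> bone transform -> world-view-projection, as in the shader
static Vec4 SkinPosition(const Vec4& pos, const Mat4& bone, const Mat4& wvp)
{
    return MulVecMat(MulVecMat(pos, bone), wvp);
}
```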


Well, it just does not work. I can see my 3D model, but no animation -- actually, I think it is pure luck that I can see my model at all! It seems to me that either the array of matrices is not properly set up, or the access in the vertex shader does not work... and there is definitely something wrong with D3DFVF_BLENDVERTEX as well. Any ideas? Thank you in advance. Cheers, StratBoy61 [Edited by - StratBoy61 on November 28, 2006 5:55:00 AM]

It's been a very long time since I did something similar, so take this with a grain of salt...

Quote:
Original post by StratBoy61
I have no idea how/why I came up with this definition of D3DFVF_BLENDVERTEX. It was just random, after tries and failures. I would be glad to have information/correction regarding this structure.


The FVF needs to be:
#define D3DFVF_BLENDVERTEX (D3DFVF_XYZB1 | D3DFVF_LASTBETA_UBYTE4 | D3DFVF_NORMAL | D3DFVF_TEX2)


D3DFVF_XYZB1 means: XYZ followed by 1 blend weight. D3DFVF_LASTBETA_UBYTE4 means that the last "weight" is not a weight at all, but blend indices packed as a UBYTE4 (four unsigned bytes in one 32-bit value). More details on this can be found in the topics "Indexed Vertex Blending" and "Using Indexed Vertex Blending" in the SDK docs.
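To make the packed layout concrete, here is a sketch of how the four indices could be assembled on the C++ side; PackBoneIndices is a hypothetical helper, not a D3DX function:

```cpp
// The UBYTE4 "last beta" holds four bone indices in one 32-bit value;
// the index for BLENDINDICES.x goes in the lowest byte.
// PackBoneIndices is a hypothetical helper, not part of D3DX.
#include <cassert>
#include <cstdint>

static std::uint32_t PackBoneIndices(std::uint8_t x, std::uint8_t y,
                                     std::uint8_t z, std::uint8_t w)
{
    return std::uint32_t(x)
         | (std::uint32_t(y) << 8)
         | (std::uint32_t(z) << 16)
         | (std::uint32_t(w) << 24);
}
```

A vertex that uses only one bone would store e.g. PackBoneIndices(nBoneIndex, 0, 0, 0) in that slot.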

The vertex input structure would probably be like this (I've only used this with the FFP, not the PP, so I'm not sure):

struct VS_INPUT {
    float4 position     : POSITION;
    float4 normal       : NORMAL;
    int4 boneIndices    : BLENDINDICES;
    float2 texCoord     : TEXCOORD0;
};


The boneIndices can hold indices for up to 4 bones, in the xyzw components.
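To illustrate what that buys you, here is a CPU sketch (my own, with assumed names) of the indexed blend the shader would perform once matching weights are supplied: each of up to four palette matrices transforms the position, and the results are summed by weight:

```cpp
// Sketch of four-bone indexed vertex blending on the CPU.
// Mat4 stands in for a palette entry; all names are assumptions.
#include <array>
#include <cassert>
#include <cstdint>

using Mat4 = std::array<float, 16>; // row-major

struct Vec3 { float x, y, z; };

// Transform a point (w = 1) by a row-major matrix, row-vector convention,
// like D3DXVec3TransformCoord without the projective divide.
static Vec3 TransformCoord(const Vec3& p, const Mat4& m)
{
    return { p.x * m[0] + p.y * m[4] + p.z * m[8]  + m[12],
             p.x * m[1] + p.y * m[5] + p.z * m[9]  + m[13],
             p.x * m[2] + p.y * m[6] + p.z * m[10] + m[14] };
}

// Weighted sum over up to four bones, as in indexed vertex blending.
static Vec3 BlendPosition(const Vec3& p, const Mat4* palette,
                          const std::uint8_t idx[4], const float w[4])
{
    Vec3 out{0, 0, 0};
    for (int i = 0; i < 4; ++i) {
        Vec3 t = TransformCoord(p, palette[idx[i]]);
        out.x += w[i] * t.x;
        out.y += w[i] * t.y;
        out.z += w[i] * t.z;
    }
    return out;
}
```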

I'd suggest not using BLENDINDICES at all. Just use texture coordinates, and interpret them as indices. That'd get rid of any special semantics you're not aware of (and I'm not aware of), and you've used texture coords before, so that shouldn't pose a problem. If you then have to go back to BLENDINDICES (say because you're using too many texture coords), then you'll at least know that your basic code is fine. I see no practical reason to use BLENDINDICES.

Thanks Muhammad !
I got it working, thanks to your D3DFVF_BLENDVERTEX definition.
Thank you too ET3D, for the tip. I'll experiment with that later...
Cheers
StratBoy61

