XNA: wrong BlendIndices from VertexBuffer

Started by HyDr0x, last reply 12 years, 8 months ago
Hi,
I'm currently working on reading out the vertex data from different models in XNA. Everything seems to work fine, except that the blend indices are completely wrong. They should be between 0 and 59, the maximum number of bones supported in my models, but I actually get values like 232, 190, and so on. The strange thing is that when I read only the blend indices, it works. The vertex declaration I'm using is 100% the same as the one the vertices in the model use: the offsets, the VertexElementFormat, everything is identical.

My VertexDeclaration:

struct VertexNormalSkinned : IVertexType
{
    public VertexNormalSkinned(Vector3 pos, Vector3 normal, Vector2 texture, Byte4 blendIndices, Vector4 blendWeights, Vector3 tangent, Vector3 binormal)
    {
        Position = pos;
        BlendIndices = blendIndices;
        BlendWeights = blendWeights;
        Normal = normal;
        Binormal = binormal;
        Tangent = tangent;
        TexCoords = texture;
    }

    public readonly static VertexDeclaration VertexDeclaration = new VertexDeclaration
    (
        new VertexElement(0, VertexElementFormat.Vector3, VertexElementUsage.Position, 0),
        new VertexElement(12, VertexElementFormat.Vector3, VertexElementUsage.Normal, 0),
        new VertexElement(24, VertexElementFormat.Vector2, VertexElementUsage.TextureCoordinate, 0),
        new VertexElement(32, VertexElementFormat.Byte4, VertexElementUsage.BlendIndices, 0),
        new VertexElement(36, VertexElementFormat.Vector4, VertexElementUsage.BlendWeight, 0),
        new VertexElement(52, VertexElementFormat.Vector3, VertexElementUsage.Tangent, 0),
        new VertexElement(64, VertexElementFormat.Vector3, VertexElementUsage.Binormal, 0)
    );

    public Vector3 Position;
    public Byte4 BlendIndices;
    public Vector4 BlendWeights;
    public Vector3 Normal;
    public Vector3 Binormal;
    public Vector3 Tangent;
    public Vector2 TexCoords;

    VertexDeclaration IVertexType.VertexDeclaration { get { return VertexDeclaration; } }
}



The code where I'm using it:

foreach (ModelMesh mesh in mainMesh.Meshes)
{
    foreach (ModelMeshPart part in mesh.MeshParts)
    {
        Vector4 blendIndices;
        VertexNormalSkinned[] vertexBuffer = new VertexNormalSkinned[part.VertexBuffer.VertexCount];
        part.VertexBuffer.GetData<VertexNormalSkinned>(0, vertexBuffer, 0, part.NumVertices, part.VertexBuffer.VertexDeclaration.VertexStride); // wrong indices

        Byte4[] V = new Byte4[part.VertexBuffer.VertexCount];
        part.VertexBuffer.GetData<Byte4>(32, V, 0, part.NumVertices, part.VertexBuffer.VertexDeclaration.VertexStride); // right indices

        for (int i = 0; i != vertexBuffer.Length; i++)
        {
            // Accumulate the weighted bone matrices, starting from the zero matrix.
            Matrix SkinningMatrix = new Matrix();

            blendIndices = vertexBuffer[i].BlendIndices.ToVector4();
            SkinningMatrix += mSkin[(Byte)blendIndices.X] * vertexBuffer[i].BlendWeights.X;
            SkinningMatrix += mSkin[(Byte)blendIndices.Y] * vertexBuffer[i].BlendWeights.Y;
            SkinningMatrix += mSkin[(Byte)blendIndices.Z] * vertexBuffer[i].BlendWeights.Z;
            SkinningMatrix += mSkin[(Byte)blendIndices.W] * vertexBuffer[i].BlendWeights.W;
        }
    }
}


Edit: Is there any way to create or reuse the model's existing vertex declaration for the new vertex buffer dynamically? It's not very handy to look up the declaration in the debugger and rebuild it by hand in a struct.
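One possible approach, assuming the XNA 4.0 API (where `VertexBuffer.VertexDeclaration` and `VertexDeclaration.GetVertexElements()` exist): read the declaration off the mesh part at run time and pull a channel out of the raw buffer by its recorded offset, instead of hard-coding a mirror struct. The helper name `ReadBlendIndices` is hypothetical; this is a sketch, not tested against a real model.

```csharp
// Hypothetical helper: pull the BlendIndices channel out of a part's vertex
// buffer using the declaration stored in the model itself (XNA 4.0 API).
static Byte4[] ReadBlendIndices(ModelMeshPart part)
{
    VertexDeclaration decl = part.VertexBuffer.VertexDeclaration;

    // Find the element whose usage is BlendIndices instead of assuming offset 32.
    foreach (VertexElement element in decl.GetVertexElements())
    {
        if (element.VertexElementUsage == VertexElementUsage.BlendIndices)
        {
            Byte4[] indices = new Byte4[part.NumVertices];
            part.VertexBuffer.GetData<Byte4>(
                element.Offset,          // byte offset of this channel within each vertex
                indices, 0, part.NumVertices,
                decl.VertexStride);      // stride taken from the model's own declaration
            return indices;
        }
    }
    throw new InvalidOperationException("Mesh part has no BlendIndices channel.");
}
```

The same loop works for any other usage (Normal, Tangent, ...) by switching the `VertexElementUsage` you test for and the element type you pass to `GetData<T>`.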


Edit: Okay, I found the mistake. I don't know why XNA does this, but it was the order in which I declared the fields in the struct. It seems XNA copies the values directly into the fields instead of going through the constructor, so the fields have to be in the same order as the declaration.
For my example:

public Vector3 Position;
public Vector3 Normal;
public Vector2 TexCoords;
public Byte4 BlendIndices;
public Vector4 BlendWeights;
public Vector3 Binormal;
public Vector3 Tangent;
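That behavior is consistent with `GetData<T>` doing a raw byte copy over the struct's in-memory layout, so the managed field offsets must line up with the `VertexElement` offsets. One way to sanity-check this outside XNA, using plain .NET: declare a sequential stand-in struct mirroring the corrected field order above (plain floats in place of the XNA vector types, a `uint` in place of `Byte4`) and compare `Marshal.OffsetOf` against the declaration. The struct and program names here are illustrative.

```csharp
using System;
using System.Runtime.InteropServices;

// Stand-in for the vertex struct, mirroring the corrected field order.
[StructLayout(LayoutKind.Sequential, Pack = 1)]
struct VertexLayoutCheck
{
    public float PosX, PosY, PosZ;      // Position     (offset 0)
    public float NrmX, NrmY, NrmZ;      // Normal       (offset 12)
    public float U, V;                  // TexCoords    (offset 24)
    public uint BlendIndices;           // Byte4        (offset 32)
    public float W0, W1, W2, W3;        // BlendWeights (offset 36)
    public float BinX, BinY, BinZ;      // Binormal     (offset 52)
    public float TanX, TanY, TanZ;      // Tangent      (offset 64)
}

class LayoutTest
{
    static void Main()
    {
        // These must match the offsets in the VertexDeclaration, otherwise
        // GetData<T> scatters the channels into the wrong fields.
        Console.WriteLine(Marshal.OffsetOf(typeof(VertexLayoutCheck), "BlendIndices")); // 32
        Console.WriteLine(Marshal.SizeOf(typeof(VertexLayoutCheck)));                   // 76 = vertex stride
    }
}
```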

This topic is closed to new replies.
