I am using this at the moment:
float4x3 SkinTransform = 0;
[unroll]
for (int i = 0; i < 4; i++)
{
    SkinTransform += BoneTransforms[PointBoneIndices[i]] * PointBoneWeights[i];
}
float3 normal = input.Normal;
normal = mul(normal, (float3x3)SkinTransform);
normal = normalize(normal);
That doesn't seem to work when the model's world transform changes, though: if the model is rotated, the normals don't rotate with it. So I switched to this:
normal = mul(normal, (float3x3)InverseTransposeWorld);
normal = normalize(normal);
But that doesn't take the skinning animation into account.
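My current guess is that the two steps need to be chained: skin the normal first, then apply the world rotation. Something like this (just a sketch, assuming InverseTransposeWorld really is the inverse transpose of the world matrix; I haven't verified it):

```hlsl
// Skin first so the normal follows the bone animation,
// then apply the world's inverse transpose so it also follows
// the model's rotation (and stays correct under non-uniform scale).
float3 normal = input.Normal;
normal = mul(normal, (float3x3)SkinTransform);
normal = mul(normal, (float3x3)InverseTransposeWorld);
normal = normalize(normal);
```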
How should this work? To be honest, I don't even remotely understand the math, so I can't even hazard a guess as to where I'm going wrong.