Mathy

Hardware instancing shader not working as expected!


Hi there. I'm new to XNA, though I have some novice experience with MDX and SlimDX. I've finally managed to understand how shaders basically work and how to do hardware instancing, which I need for one of my upcoming spare-time projects, where I have to render thousands of small triangles in different colors. The game itself is rendered in 3D, but I set a specific camera angle that makes the game (and all the triangles) look 2D. This by itself works pretty well, but as soon as I start changing my triangles' matrices to reposition them, they act all weird.

Here is a screenshot of my triangles when they all have a matrix of Matrix.Identity: http://testpreview.flamefusion.net/identity.png

And here is a screenshot of my triangles when they all have a matrix of Matrix.Identity * Matrix.CreateTranslation(5,0,0): http://testpreview.flamefusion.net/matrixidentitytranslate.png

The first screenshot renders exactly as I want. In the second screenshot, I expected the triangle to simply be shifted a little to the right (since I increased the X coordinate). Instead, it kind of becomes 3-dimensional and follows the perspective. I have reason to believe I am doing something wrong in the shader code, so I have pasted it below:
float4x4 View; 
float4x4 Projection; 
float4x4 World; 
 
struct VertexShaderInput 
{ 
    float4 position : POSITION0; 
    float4 color : COLOR0; 
    //float3 Normal : NORMAL0; 
    float2 textureCoordinate : TEXCOORD0; 
}; 
 
 
struct VertexShaderOutput 
{ 
    float4 position : POSITION0; 
    float4 color : COLOR0; 
    float2 textureCoordinate : TEXCOORD0; 
}; 
 
 
// Vertex shader helper function shared between the different instancing techniques. 
VertexShaderOutput VertexShaderCommon(VertexShaderInput input, float4x4 instanceTransform) 
{ 
    VertexShaderOutput output; 
 
    // Apply the world and camera matrices to compute the output position. 
    float4 worldPosition = mul(input.position, instanceTransform); 
    float4 viewPosition = mul(worldPosition, View); 
     
    output.position = mul(viewPosition, Projection); 
    output.color = input.color; 
    output.textureCoordinate = input.textureCoordinate; 
 
    return output; 
} 
 
// On Windows shader 3.0 cards, we can use hardware instancing, reading 
// the per-instance world transform directly from a secondary vertex stream. 
VertexShaderOutput HardwareInstancingVertexShader(VertexShaderInput input, 
                                                float4x4 instanceTransform : TEXCOORD1) 
{ 
    return VertexShaderCommon(input, instanceTransform); 
} 
 
float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0 
{ 
    return input.color; 
} 
 
technique HardwareInstancing 
{ 
    pass Pass1 
    { 
        VertexShader = compile vs_3_0 HardwareInstancingVertexShader(); 
        PixelShader = compile ps_3_0 PixelShaderFunction(); 
    } 
} 
I hope you can help me fix my shader so that it works properly: a transform with a larger X value should simply shift the triangle to the right, and a transform with a larger Y value should shift it downwards. My view and projection setup (if needed) is shown below. As you have probably already noticed, it is set up for a 2D-looking perspective (at least I think so):
Viewport viewport = GraphicsDevice.Viewport; 

float aspectRatio = (float)viewport.Width / (float)viewport.Height; 
Matrix world = Matrix.CreateWorld(Vector3.Zero, new Vector3(0, 0, -1), new Vector3(0, 0, 1)); 
Matrix projection = Matrix.CreatePerspectiveFieldOfView(MathHelper.PiOver4, aspectRatio, 1f, 100); 
This is most likely incredibly hard to fix; at least I can't seem to get around the problem. I hope somebody has the patience to read this whole thing through and actually help me.

You have to break your instance matrices into multiple float4s. This is not very difficult :)

I would imagine that the shader, as it stands, only sees the first row of each matrix (the first float4), because a variable with a texture coordinate semantic can only carry 1 to 4 floating-point components.
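Roughly along these lines, as a sketch (assuming the application fills the instance stream with the four matrix rows bound to TEXCOORD1 through TEXCOORD4):

VertexShaderOutput HardwareInstancingVertexShader(VertexShaderInput input, 
                                                  float4 row0 : TEXCOORD1, 
                                                  float4 row1 : TEXCOORD2, 
                                                  float4 row2 : TEXCOORD3, 
                                                  float4 row3 : TEXCOORD4) 
{ 
    // Rebuild the per-instance transform from the four float4 rows. 
    float4x4 instanceTransform = float4x4(row0, row1, row2, row3); 
    return VertexShaderCommon(input, instanceTransform); 
} 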

Alright, so I would need something that can store 16 floating-point components in order to build a full Matrix out of it. What you are saying is that only part of the Matrix is being transferred for each component.

Is that correct, or did I misunderstand something?

If so, what type other than TexCoord0 can I use?

I LOVE YOU!!!

Your last suggestion didn't help directly, but it led me onto the right track. I changed this:

return VertexShaderCommon(input, instanceTransform);

in the shader to this:

return VertexShaderCommon(input, transpose(instanceTransform));

Now it appears to work! Rating for you!
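For anyone who runs into the same thing: the transpose is most likely needed because XNA writes the Matrix into the instance vertex buffer row by row, while HLSL by default interprets a float4x4 input as column-major, so the matrix arrives transposed. A sketch of the corrected entry point, assuming that layout:

// Assumes the instance stream supplies the XNA Matrix rows in TEXCOORD1..TEXCOORD4. 
VertexShaderOutput HardwareInstancingVertexShader(VertexShaderInput input, 
                                                  float4x4 instanceTransform : TEXCOORD1) 
{ 
    // HLSL packs a float4x4 input column-major by default, so the row-major 
    // XNA matrix comes through transposed; undo that before transforming. 
    return VertexShaderCommon(input, transpose(instanceTransform)); 
} 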

