OpaqueEncounter

3D Result calculated against WorldInverseTranspose is not normalized


I have a very simple vertex/pixel shader for rendering a bunch of instances with a very simple lighting model.

When testing, I noticed that the instances were becoming dimmer as the world transform scaling was increasing. I determined that this was because the value of float3 normal = mul(input.Normal, WorldInverseTranspose); was shrinking with the increased scaling of the world transform, even though its direction appeared to be correct. To address this, I had to add normal = normalize(normal);

I do not, for the life of me, understand why. The WorldInverseTranspose contains all of the components of the world transform (SetValueTranspose(Matrix.Invert(world * modelTransforms[mesh.ParentBone.Index]))) and the calculation appears to be correct as is.

Why does the value require normalization?

float4 CalculatePositionInWorldViewProjection(float4 position, matrix world, matrix view, matrix projection)
{
    float4 worldPosition = mul(position, world);
    float4 viewPosition = mul(worldPosition, view);
    return mul(viewPosition, projection);
}

VertexShaderOutput VS(VertexShaderInput input)
{
    VertexShaderOutput output;

    matrix instanceWorldTransform = mul(World, transpose(input.InstanceTransform));

    output.Position = CalculatePositionInWorldViewProjection(input.Position, instanceWorldTransform, View, Projection);

    float3 normal = mul(input.Normal, WorldInverseTranspose);
    normal = normalize(normal);

    float lightIntensity = -dot(normal, DiffuseLightDirection);
    output.Color = float4(saturate(DiffuseColor * DiffuseIntensity).xyz * lightIntensity, 1.0f);

    output.TextureCoordinate = SpriteSheetBoundsToTextureCoordinate(input.TextureCoordinate, input.SpriteSheetBounds);

    return output;
}


float4 PS(VertexShaderOutput input) : SV_Target
{
    return Texture.Sample(Sampler, input.TextureCoordinate) * input.Color;
}

 

Edited by OpaqueEncounter


If your world matrix contains scaling, the normals will be scaled, which ends up scaling the lighting. Normalisation is required if you use scaling. 
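
To make this concrete, here is a quick pure-Python sketch (just the arithmetic, not HLSL) of what happens with a uniform world scale of s: the inverse-transpose scales normals by 1/s, which is exactly the dimming the original poster observed.

```python
# Sketch: with a uniform world scale of s, the inverse-transpose matrix
# scales normals by 1/s, so their length shrinks as the scale grows.
import math

def transform(n, m):
    # Multiply a 3-vector by a row-major 3x3 matrix (vector * matrix,
    # matching HLSL's mul(vector, matrix) convention).
    return [sum(n[k] * m[k][j] for k in range(3)) for j in range(3)]

s = 4.0                                                  # uniform world scale
inv_transpose = [[1/s, 0, 0], [0, 1/s, 0], [0, 0, 1/s]]  # (W^-1)^T for W = s*I

n = [0.0, 1.0, 0.0]                                      # unit normal
n_t = transform(n, inv_transpose)
length = math.sqrt(sum(c * c for c in n_t))
print(length)   # 0.25 -- direction is unchanged, but the length is now 1/s
```

Since the diffuse term is dot(normal, lightDir), a normal of length 1/s scales the lighting by 1/s, hence "dimmer as the scale increases".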

The inverse-transpose is required if you use non-uniform scaling, as it makes sure that normals are scaled in such a way that they become the normal of the newly scaled faces. It does not remove the need for normalisation. 
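
A small numeric check of that point (again plain Python, not HLSL): transform a slanted face's normal by a non-uniform scale directly and it is no longer perpendicular to the scaled face, while the inverse-transpose keeps it perpendicular.

```python
# Sketch: under non-uniform scaling, the inverse-transpose keeps the normal
# perpendicular to the scaled surface; the plain world matrix does not.
def transform(n, m):
    # vector * row-major 3x3 matrix
    return [sum(n[k] * m[k][j] for k in range(3)) for j in range(3)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

scale = [[2, 0, 0], [0, 1, 0], [0, 0, 1]]    # non-uniform world scale
inv_t = [[0.5, 0, 0], [0, 1, 0], [0, 0, 1]]  # its inverse-transpose

tangent = [1.0, -1.0, 0.0]   # a direction lying in a slanted face
normal = [1.0, 1.0, 0.0]     # perpendicular to that face (unnormalized)

scaled_tangent = transform(tangent, scale)
print(dot(transform(normal, scale), scaled_tangent))   # 3.0 -> no longer perpendicular
print(dot(transform(normal, inv_t), scaled_tangent))   # 0.0 -> still perpendicular
```

Note that the inverse-transpose result still is not unit length in general, which is why the normalize() is needed on top of it.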

Also, normalisation should always be performed in the pixel shader even without scaling, as the interpolation of three unit-length vertex normals is unlikely to still be unit-length. 
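
The interpolation point is easy to verify with two unit normals (pure-Python sketch): the rasterizer linearly interpolates per-vertex normals across the triangle, and the interpolated vector is shorter than unit length.

```python
# Sketch: linearly interpolating two unit normals (as the rasterizer does
# between vertices) yields a vector shorter than unit length.
import math

n1 = [1.0, 0.0, 0.0]
n2 = [0.0, 1.0, 0.0]
mid = [(a + b) / 2 for a, b in zip(n1, n2)]   # halfway between the two vertices
length = math.sqrt(sum(c * c for c in mid))
print(length)   # ~0.707 -- hence the renormalize in the pixel shader
```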

On ‎11‎/‎21‎/‎2017 at 2:07 AM, Hodgman said:

If your world matrix contains scaling, the normals will be scaled, which ends up scaling the lighting. Normalisation is required if you use scaling. 

The inverse-transpose is required if you use non-uniform scaling, as it makes sure that normals are scaled in such a way that they become the normal of the newly scaled faces. It does not remove the need for normalisation. 

Also, normalisation should always be performed in the pixel shader even without scaling, as the interpolation of three unit-length vertex normals is unlikely to still be unit-length. 

Thanks for clarifying. This was the gap in my understanding.

