george7378

3D Finite difference normal calculation for sphere


Hi everyone,

I'm currently adapting a planar quadtree terrain system to handle spherical terrain (i.e. I want to render planets as opposed to just an endless terrain stretching in the X-Z plane).

At the moment, I use this algorithm to calculate the normal for a given point on the terrain from the height information (the heights are generated using simplex noise):

    public Vector3 GetNormalFromFiniteOffset(float x, float z, float sampleOffset)
    {
        // Sample the height field a small step away from the point along
        // each horizontal axis.
        float hL = GetHeight(x - sampleOffset, z);
        float hR = GetHeight(x + sampleOffset, z);
        float hD = GetHeight(x, z - sampleOffset);
        float hU = GetHeight(x, z + sampleOffset);

        // Central differences give the slope along X and Z; Y is assumed to
        // be the up direction (strictly the Y term is 2 * sampleOffset).
        Vector3 normal = new Vector3(hL - hR, 2, hD - hU);
        normal.Normalize();
        return normal;
    }

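For reference, GetHeight just samples the simplex noise - something like the sketch below, where SimplexNoise.Sample, noiseFrequency and heightScale are stand-ins for whatever noise setup is actually in use:

    // Illustrative only: sample 2D simplex noise and scale it into a height.
    // SimplexNoise.Sample, noiseFrequency and heightScale are placeholders.
    public float GetHeight(float x, float z)
    {
        return heightScale * SimplexNoise.Sample(x * noiseFrequency, z * noiseFrequency);
    }
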
The above works fine for my planar quadtree, but of course it won't work for spherical terrain, because it assumes the Y direction is always up. I guess I need to move the calculation into the tangent plane of the sphere at the point I'm evaluating. Does anyone know of a good or proven way to transform this calculation for any point on a sphere, or is there a better way to calculate the normal for a point on a sphere which has been radially displaced using a noise-based height field?

I'm using XNA by the way. Thanks very much for looking!


A quick addition - I tried creating my own adaptation:

    public Vector3 GetNormalFromFiniteOffset(Vector3 location, float sampleOffset)
    {
        // The radial direction doubles as the unit sphere normal at this point.
        Vector3 normalisedLocation = Vector3.Normalize(location);

        // Pick a reference vector that isn't parallel to the radial direction,
        // then derive two mutually perpendicular tangents from it.
        Vector3 arbitraryUnitVector = Math.Abs(normalisedLocation.Y) > 0.999f ? Vector3.UnitX : Vector3.UnitY;
        Vector3 tangentVector1 = Vector3.Cross(arbitraryUnitVector, normalisedLocation);
        tangentVector1.Normalize();
        Vector3 tangentVector2 = Vector3.Cross(tangentVector1, normalisedLocation);
        tangentVector2.Normalize();

        // Sample the height field a small step away along each tangent.
        float hL = GetHeight(location - tangentVector1 * sampleOffset);
        float hR = GetHeight(location + tangentVector1 * sampleOffset);
        float hD = GetHeight(location - tangentVector2 * sampleOffset);
        float hU = GetHeight(location + tangentVector2 * sampleOffset);

        // Same central-difference construction as the planar version, with
        // the radial direction playing the role of 'up'.
        Vector3 normal = 2 * normalisedLocation + (hL - hR) * tangentVector1 + (hD - hU) * tangentVector2;
        normal.Normalize();
        return normal;
    }

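For reference, I'd call it per vertex something like this (planetRadius and unitSpherePoint are stand-ins for the quadtree's own data):

    // Illustrative usage: sample the height for a point on the sphere,
    // displace the point radially, then estimate the normal there.
    Vector3 surfacePosition = unitSpherePoint * (planetRadius + GetHeight(unitSpherePoint * planetRadius));
    Vector3 vertexNormal = GetNormalFromFiniteOffset(surfacePosition, sampleOffset);
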
I can't test it yet, but does this look like a decent approach, or are there obvious issues with how I'm doing it?

Thanks again!


I assume you get discontinuities because of the arbitrary tangents. It's impossible to move an orientation over the surface of a sphere without singularities.

If you use normals at discrete offsets (e.g. per vertex or per face) the discontinuity might not matter, but if you do it per pixel I expect a visible seam.

In the latter case the solution is to account for the singularities when calculating the normal - similar to a cubemap, where in a corner you need to sample from 3 texels instead of 4, and the projected angles become 120 degrees instead of 90. (The orthogonality you assume in your code breaks down at the corners.)


Thanks for the reply - yes, I thought about what would happen if I used the same arbitrary vector for every position, which is why I choose between the X and Y directions depending on the Y component of the position. I was thinking this would ensure that the tangents always form an orthonormal basis around the point on the sphere.

It will result in different tangents for different positions, but I was thinking this wouldn't matter as long as I was taking samples along two perpendicular directions in the tangent plane.


The tangents are orthogonal, but if you rotated them slightly you would get slightly different results, because you would sample from different neighbouring positions. That's what happens when the branch alternates between X and Y: you get a large discontinuity in tangent direction at the branch point, and this can cause a visible seam.

The problem is unavoidable - you can't cover a sphere with a net of orthogonal isolines without introducing singularities, so you can't calculate an orthonormal basis around the sphere without discontinuities in orientation. You could, however, sidestep the issue by ensuring the jumps in orientation are always exactly 90 degrees: a quarter-turn of the tangent pair only permutes the four height samples, so this particular estimator is unaffected, as the sketch below shows. (Might be more efficient than special cases for edges / corners, but the math won't be super easy.)

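To illustrate (a sketch reusing your construction, untested): rotating the tangent pair a quarter turn only relabels the four samples, so both calls below should return the same normal up to floating point error.

    // Sketch: the finite-difference normal, parameterised on the tangent pair.
    Vector3 EstimateNormal(Vector3 location, Vector3 t1, Vector3 t2, float offset)
    {
        Vector3 radial = Vector3.Normalize(location);
        float hL = GetHeight(location - t1 * offset);
        float hR = GetHeight(location + t1 * offset);
        float hD = GetHeight(location - t2 * offset);
        float hU = GetHeight(location + t2 * offset);
        return Vector3.Normalize(2 * radial + (hL - hR) * t1 + (hD - hU) * t2);
    }

    // (t2, -t1) is (t1, t2) rotated 90 degrees around the radial direction:
    // the four height samples are the same set, just relabelled.
    Vector3 a = EstimateNormal(p, t1, t2, offset);
    Vector3 b = EstimateNormal(p, t2, -t1, offset);
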
You would be surprised how much impact this problem has on topics like surface parametrization, remeshing, etc.

But as said, chances are that in practice you won't notice the seam - I'm just nitpicking. See how your solution works for you...


