george7378

3D Finite difference normal calculation for sphere


Hi everyone,

I'm currently adapting a planar quadtree terrain system to handle spherical terrain (i.e. I want to render planets as opposed to just an endless terrain stretching in the X-Z plane).

At the moment, I use this algorithm to calculate the normal for a given point on the terrain from the height information (the heights are generated using simplex noise):

    public Vector3 GetNormalFromFiniteOffset(float x, float z, float sampleOffset)
    {
        // Central differences of the height field along X and Z.
        float hL = GetHeight(x - sampleOffset, z);
        float hR = GetHeight(x + sampleOffset, z);
        float hD = GetHeight(x, z - sampleOffset);
        float hU = GetHeight(x, z + sampleOffset);

        // For a height field y = h(x, z), the unnormalised normal is (-dh/dx, 1, -dh/dz).
        // With central differences this is proportional to (hL - hR, 2 * sampleOffset, hD - hU);
        // the constant 2 below assumes a sample offset of one world unit (or accepts the approximation).
        Vector3 normal = new Vector3(hL - hR, 2, hD - hU);
        normal.Normalize();
        return normal;
    }

The above works fine for my planar quadtree, but of course it won't work for spherical terrain because it assumes the Y direction is always up. I guess I need to move the calculation into the plane tangent to the sphere at the point I'm evaluating. I was wondering if anyone knows of a good or proven way to transform this calculation for any point on a sphere, or if there's a better way to calculate the normal for a point on a sphere which has been radially displaced using a noise-based height field?
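
For reference, the kind of height lookup I have in mind for the sphere case is sketched below. This is only a rough illustration - SimplexNoise.Sample, noiseFrequency and heightScale are hypothetical stand-ins for whatever noise setup is actually in use, not code from my project:

    // Rough sketch only - SimplexNoise.Sample, noiseFrequency and heightScale are
    // hypothetical placeholders.
    public float GetHeight(Vector3 location)
    {
        // Sample the noise on the unit sphere, so the height depends only on direction.
        Vector3 direction = Vector3.Normalize(location);
        return heightScale * SimplexNoise.Sample(direction * noiseFrequency);
    }

    // A terrain point is then the unit-sphere point pushed out along its radius:
    public Vector3 GetSurfacePoint(Vector3 pointOnUnitSphere, float planetRadius)
    {
        return pointOnUnitSphere * (planetRadius + GetHeight(pointOnUnitSphere));
    }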

I'm using XNA by the way. Thanks very much for looking!


A quick addition - I tried creating my own adaptation:

    public Vector3 GetNormalFromFiniteOffset(Vector3 location, float sampleOffset)
    {
        // Build two tangent directions spanning the plane perpendicular to the
        // radial (sphere "up") direction at this location. The helper axis is
        // switched near the poles to avoid a near-parallel cross product.
        Vector3 normalisedLocation = Vector3.Normalize(location);
        Vector3 arbitraryUnitVector = Math.Abs(normalisedLocation.Y) > 0.999f ? Vector3.UnitX : Vector3.UnitY;

        Vector3 tangentVector1 = Vector3.Cross(arbitraryUnitVector, normalisedLocation);
        tangentVector1.Normalize();
        Vector3 tangentVector2 = Vector3.Cross(tangentVector1, normalisedLocation);
        tangentVector2.Normalize();

        // Central differences of the height field along the two tangent directions.
        float hL = GetHeight(location - tangentVector1 * sampleOffset);
        float hR = GetHeight(location + tangentVector1 * sampleOffset);
        float hD = GetHeight(location - tangentVector2 * sampleOffset);
        float hU = GetHeight(location + tangentVector2 * sampleOffset);

        // Same construction as the planar version, with the radial direction playing
        // the role of the Y axis.
        Vector3 normal = 2 * normalisedLocation + (hL - hR) * tangentVector1 + (hD - hU) * tangentVector2;
        normal.Normalize();
        return normal;
    }

I can't test it yet, but I wonder if anyone thinks this looks like a decent approach, or whether there are obvious issues with how I'm doing it?

Thanks again!


I assume you get discontinuities because of the arbitrary tangents. It's impossible to move an orientation over the surface of a sphere without singularities.

If you use normals at discrete offsets (e.g. per vertex or per face) the discontinuity might not matter, but if you do it per pixel I expect a visible seam.

In the latter case the solution is to account for the singularities when calculating the normal - similar to a cubemap, where in a corner you need to sample from 3 instead of 4 texels and the projected angles become 120 instead of 90 degrees. (The orthogonality you assume in your code breaks down at the corners.)


Thanks for the reply - yes, I thought about what would happen if I used the same arbitrary vector for every position, which is why I choose between the X direction and the Y direction depending on the Y component of the position. I was thinking this would ensure that the tangents always form an orthonormal basis around the point on the sphere.

It will result in different tangents for different positions, but I was thinking this wouldn't matter as long as I was taking samples along two perpendicular directions in the tangent plane.


The tangents are orthogonal, but if you rotated them slightly you would get slightly different results, because you would be sampling from different neighbouring positions. That is exactly what happens when the branch alternates between X and Y: the tangent directions jump abruptly at the branch point, and this can cause a visible seam.

The problem is unavoidable - you can't cover a sphere with a net of orthogonal isolines without introducing singularities, so you can't construct an orthonormal basis everywhere on the sphere without discontinuities in orientation. You could, however, work around the issue by ensuring the jumps in orientation are always exactly 90 degrees. (That might be more efficient than special cases for edges / corners, but the math won't be super easy.)
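
For what it's worth, a commonly used branchless construction for the tangent basis is sketched below (adapted from "Building an Orthonormal Basis, Revisited" by Duff et al., JCGT 2017, written here in XNA-style C# purely as an illustration - it's not the code from this thread). It avoids the near-parallel cross products entirely, but it doesn't remove the discontinuity: the basis still flips where n.Z crosses zero, so the seam moves rather than disappears.

    // Sketch of a branchless tangent basis (after Duff et al., "Building an
    // Orthonormal Basis, Revisited"). Well conditioned for every unit-length n,
    // but the basis orientation still flips across the plane n.Z = 0.
    public static void GetTangentBasis(Vector3 n, out Vector3 t1, out Vector3 t2)
    {
        float sign = n.Z >= 0f ? 1f : -1f; // copysign(1, n.Z)
        float a = -1f / (sign + n.Z);
        float b = n.X * n.Y * a;
        t1 = new Vector3(1f + sign * n.X * n.X * a, sign * b, -sign * n.X);
        t2 = new Vector3(b, sign + n.Y * n.Y * a, -n.Y);
    }

Plugging t1 and t2 in for tangentVector1 and tangentVector2 in your function would give you a well-defined basis without the 0.999 branch, if you decide the remaining seam is acceptable.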

You would be surprised how much impact this problem has on topics like surface parametrization, remeshing etc.

But as I said, chances are that in practice you won't notice the seam - I'm just nitpicking. See how your solution works for you...

