Terrain mapping - more earthquake-like shaking when closer to the ground

Started by Sword7
35 comments, last by Green_Baron 4 years, 7 months ago

I am working on planet terrain rendering with OpenGL 4.x shader programming. As I move closer to the ground, the terrain shakes more and more, like an earthquake.

Does anyone know a solution that eliminates this 'earthquake effect' close to the ground, for landing and so on? I use one unit per kilometer for terrain mapping.


Earthquake effect? Do you mean something you don't want, like a rolling-sea effect because your vertices aren't snapping to a grid? Or do you mean you want to create an earthquake effect when a ship lands?

No. When landing from the air or from space, the ground shakes and distorts more and more. I think the values are too small for rendering: one meter is 0.0001 units. The further the camera is from the terrain textures, the more stable they are.

Can't imagine what you mean. Apparently there are differences when close to the terrain and when far away?

How does it look when you look along the surface towards the horizon?

What changes between far and near? Projection, depth?

Can you post a video or screenshots?

Which technique do you use to render the terrain?

Or do the textures look blurry when you're near?

Any more info you can give us?

OK, when far away, the textures are stable (no shaking or distortion). I suspect floating-point inaccuracies due to the large offset between the viewpoint and the vertices, since the planet's radius is 6400 km and I use one unit per kilometer; one meter is 0.0001 units.

At the horizon, distant textures are more stable than nearby ones.

I tried reducing the planet to a normalized coordinate system (one unit = one planet radius) and moving it to (0, 0, -5.0), but it was still shaky and distorted; no difference.

I tried different near and far planes, but that did not resolve the problem; with a smaller far plane it actually got worse. Originally near was 0.001 and far was 1.0e9 (I need that to see stars in the 3D coordinate system). With near 0.0000001 and far 10.0 the shaking and distortion were worse.

I keep the camera position fixed at the origin (0, 0, 0) and work in a camera-relative world coordinate system by subtracting the view position from the planet position.

I still need a solution that removes the offset between the viewpoint and the vertices, to eliminate the shaking/distortion.
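
One common way to realize what the previous post describes (a minimal sketch, assuming GLM; planetPos, cameraPos and viewRot are illustrative names, not the poster's code) is to keep all absolute positions in double precision on the CPU, subtract the camera in double, and narrow to float only once the numbers are small:


#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

glm::dvec3 planetPos;   // absolute position, double precision (km units)
glm::dvec3 cameraPos;   // absolute position, same frame and units
glm::dmat4 viewRot;     // camera orientation only, no translation

glm::mat4 computeMVP(const glm::dmat4 &proj)
{
    // Subtract in double so the large common offset cancels before
    // any narrowing; the remainder fits comfortably in 32-bit floats.
    glm::dvec3 rel = planetPos - cameraPos;
    glm::dmat4 model = glm::translate(glm::dmat4(1.0), rel);
    return glm::mat4(proj * viewRot * model);  // narrow to float at the very end
}

Note that this only helps if the mesh vertices themselves stay small relative to their local origin: a vertex stored as a 32-bit float at 6400 units from the planet center already has a representable step of about 0.0005 units, i.e. roughly half a meter at one unit per kilometer, which is enough to look like an earthquake up close.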

Clipping won't be the saviour if floating-point accuracy is the problem, or else you'd be seeing "Z-fighting" type issues, not what looks like an earthquake. Even so, you shouldn't make the far clip huge just to suit the stars. You can probably just force a maximum distance for rendering the stars regardless of the far clip in the planet's projection matrix (i.e. manually set the Z value).
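
One way to read that suggestion is to give the stars their own pass, so the planet's projection never needs a 1.0e9 far plane. A hedged sketch (drawStars() and drawPlanet() are placeholder functions):


// Pass 1: star field as pure directions around the eye.
// Depth writes are off, so this pass's far plane is irrelevant.
glDepthMask(GL_FALSE);
glDisable(GL_DEPTH_TEST);
drawStars();   // view matrix with its translation stripped

// Pass 2: the planet, with near/far fitted to the planet alone.
glEnable(GL_DEPTH_TEST);
glDepthMask(GL_TRUE);
glDepthFunc(GL_LEQUAL);
drawPlanet();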

0.0001 is a healthy increment size for a 32-bit float near 1.0, but if you are feeding values like that through functions such as pow it might be a problem. It would be best to show your shader code.

Is your height just a single texture that gives height and do you calculate the vertices of a grid to be on a sphere? Are you just doing default linear sampling of the texture?

My comment about snapping vertices to a grid still stands; how do you calculate the vertices of the sphere?

I have not implemented elevation yet; it is just a flat texture for now. I calculate 32-bit float sphere vertices around the origin (0, 0, 0), in either a normalized or a planet-sized coordinate system, with 16-bit indices and 32-bit texture coordinates. Rendering uses GL_TRIANGLES with CCW winding order. Both coordinate systems gave the same result (locally shaky/distorted).
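
For comparison, here is a hypothetical lat/long sphere generator in that spirit (makeSphere and its vertex layout are illustrative, not the poster's actual code). The trigonometry runs in double precision and only the final attributes are narrowed to float:


#include <cmath>
#include <vector>

struct Vertex { float px, py, pz; float nx, ny, nz; float u, v; };

std::vector<Vertex> makeSphere(int rings, int sectors, double radius)
{
    const double kPi = 3.14159265358979323846;
    std::vector<Vertex> verts;
    for (int r = 0; r <= rings; ++r) {
        double phi = kPi * r / rings;                  // latitude, 0..pi
        for (int s = 0; s <= sectors; ++s) {
            double theta = 2.0 * kPi * s / sectors;    // longitude, 0..2*pi
            // Unit normal in double; position is radius * normal.
            double nx = std::sin(phi) * std::cos(theta);
            double ny = std::cos(phi);
            double nz = std::sin(phi) * std::sin(theta);
            verts.push_back({ float(radius * nx), float(radius * ny), float(radius * nz),
                              float(nx), float(ny), float(nz),
                              float(s) / sectors, float(r) / rings });
        }
    }
    return verts;  // indexed with GL_TRIANGLES, CCW, as described above
}

With 16-bit indices, (rings + 1) * (sectors + 1) has to stay at or below 65536, which caps the sphere at roughly 255 x 255 quads.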

I am using glEnable(GL_DEPTH_TEST) and glDepthFunc(GL_LEQUAL).

I am using GL_LINEAR for texture mapping.


glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
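
As an aside on the filtering question above: plain GL_LINEAR minification with no mipmaps makes distant texels shimmer under motion. That is a separate artifact from the near-ground jitter, but mipmapping is cheap insurance. A sketch, assuming the base level has already been uploaded:


glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glGenerateMipmap(GL_TEXTURE_2D);  // build the mip chain for the bound texture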

Here are my shader programs for rendering the planet textures. First, the vertex shader.


#version 420

// vertex buffer objects
layout (location=0) in vec3 vPosition;
layout (location=1) in vec3 vNormal;
//layout (location=2) in vec3 vColor;
layout (location=2) in vec2 vTexCoord;

uniform mat4 mvp;

out vec4 myColor;
out vec2 texCoord;

void main()
{
    gl_Position = mvp * vec4(vPosition, 1.0);
    myColor = vec4(0.7, 0.7, 0.7, 1.0); // vec4(vColor, 1.0);
    texCoord = vTexCoord;
}

Here is the fragment shader.


#version 420
// #extension GL_ARB_shading_language_420pack: enable  Use for GLSL versions before 420.

layout (binding=0) uniform sampler2D sTexture;

in vec2 texCoord;
in vec4 myColor;

out vec4 fragColor;

void main()
{
    fragColor = texture(sTexture, texCoord);
    // fragColor = myColor;
}

 

6 hours ago, Sword7 said:

OK, when far away, the textures are stable (no shaking or distortion). I suspect floating-point inaccuracies due to the large offset between the viewpoint and the vertices…

With what I've done, closer terrain gets a darker hue, and a good portion of the detail takes its value from the previous group, pretty much just adding more texture lines.

Is there something incorrect in that spot?

Are you missing a diagonal viewpoint in 3D? Overlapping?

Also, when switching from heading towards the planet to looking horizontally along it, the entire depth group would be off (one spot would get really small while another got bolder, when the correct pattern should hold). There is also not much you can fit horizontally compared to looking straight down.

My guess is that you see 'an earthquake' because you are running into floating-point precision problems. That means the distances you use are too large for a 32-bit float.

IIRC 32-bit floats give you a precision of about seven significant digits; that means at one hundred kilometers you have roughly a precision of one centimeter. When the distances get larger, the numbers aren't precise enough anymore and the objects in your scene start to 'jump around', because their positions can no longer be computed exactly. Here's a visualization of that problem.
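
That rule of thumb is easy to check: the gap between adjacent representable 32-bit floats (the ULP) grows with magnitude. A small standalone program, with magnitudes chosen to match this thread's one-unit-per-kilometer scale:


#include <cmath>
#include <cstdio>
#include <initializer_list>

int main()
{
    // Print the smallest representable step at several coordinate magnitudes.
    for (float x : { 1.0f, 100.0f, 6400.0f, 100000.0f }) {
        float step = std::nextafterf(x, 1e30f) - x;
        std::printf("at %9.1f units: smallest step = %g units\n", x, step);
    }
    return 0;
}

At 6400 units (the planet surface, in kilometers) the step is about 0.00049 units, roughly half a meter, which is exactly the scale of jitter being described.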

That is a non-trivial problem, and there's no easy way to fix it. AFAIK most space games continuously shift the origin of the scene to keep nearby objects at high precision, and/or use additional cameras to render large objects that are far away, and/or use 64-bit floats internally and convert them to 32 bits for rendering.

Here's a video by the developers of Kerbal Space Program that details how they fixed their precision issues. He starts talking about it at about 4 minutes in.

Well, I'd like to add to the guessing game, as I still don't know exactly what's happening:

The technique of pass-through shading on a low-poly model is fine for viewing a planet as a small ball from far away. But when closing in, the edges of the model's primitives become visible and the texture gets stretched, leading to a "blurry" view, as one or a few texture pixels correspond to a large screen area and the pipeline interpolates between them. Is that what's happening?

A solution would be to adapt the resolution of the model to the distance from the viewer with a level-of-detail algorithm.

KSP planets are only a few tens to hundreds of km in diameter. For rendering, they are still within the range of single precision when the depth range is adapted appropriately and when doubles are used on the CPU and converted to float on transfer to the shaders. A real planet is not; it needs other techniques to preserve precision across the field of view, for example transferring doubles as two floats for crunching in the shader and rendering relative to the eye.
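
The "doubles as two floats" trick mentioned above can be sketched as follows (an illustrative encoding, not code from this thread): the high part is the double rounded to float, and the low part carries the residual.


// Split a double-precision coordinate into a high/low float pair.
void doubleToTwoFloats(double value, float &high, float &low)
{
    high = static_cast<float>(value);
    low  = static_cast<float>(value - static_cast<double>(high));
}

The vertex shader then evaluates (highPos - highEye) + (lowPos - lowEye): the two large high parts cancel in single precision, and the remaining small terms are well within float range. This is the usual "relative to eye" scheme for planet-scale rendering.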

Edit: the depth range is a problem of its own, separate from single/double precision of the position values. If you see jittering while looking at large objects, experiment with the depth range (the near and far values), adjusting them to what's actually visible on screen: push near out as far as possible and haul far in as close as possible.
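
A sketch of that experiment, assuming GLM; the constants are illustrative and would need tuning per scene. Both planes follow the camera's distance to the planet surface, so depth precision is spent on what is actually visible:


#include <algorithm>
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

glm::dmat4 makeProjection(double distToSurface, double fovY, double aspect)
{
    // Push near out with altitude; haul far in to just past the planet.
    double nearPlane = std::max(distToSurface * 0.1, 1e-4);
    double farPlane  = distToSurface + 2.0 * 6400.0;   // planet diameter, km units
    return glm::perspective(fovY, aspect, nearPlane, farPlane);
}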

It'll all change once the graphics card manufacturers let us use double precision without paying too much for it…

