
Posts I've Made

In Topic: Question about nDotL across LOD levels

11 August 2015 - 09:14 AM

Yes, it was the normal calculation. I inherited the code from the CPU version, so I guess that one was bad too. I eventually settled on a Sobel operator:

float4 PSNHeightToNormal(float4 inPos : SV_POSITION, 
                         float2 inTex : TEXCOORD0) : SV_TARGET {
	// One texel step in UV space.
	float ps = 1.0 / size;

	float3 n;
	float scale = worldScale;

	// Sobel filter over the 3x3 height neighborhood.
	n.x = -(h(inTex,  ps,  ps) - h(inTex, -ps,  ps) + 2 * (h(inTex, ps, 0)  - h(inTex, -ps, 0)) + h(inTex, ps, -ps) - h(inTex, -ps, -ps));
	n.y = -(h(inTex, -ps, -ps) - h(inTex, -ps,  ps) + 2 * (h(inTex, 0, -ps) - h(inTex,  0, ps)) + h(inTex, ps, -ps) - h(inTex,  ps,  ps));
	n.z = 1.0 / scale;

	n = normalize(n);
	// Pack [-1, 1] into [0, 1] for storage.
	n = n * 0.5 + 0.5;

	return float4(n.x, n.z, n.y, 1);
}

Hopefully this one works as expected. I still need to test it a bit, because I am having a bit of a brain fart since the switch from RH to LH. I no longer have an intuitive concept of "forward" and need to continuously convert from one system to the other in my head :).
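As a cross-check, the same Sobel filter is easy to run on the CPU. A minimal Python sketch, assuming a plain height function h(x, y) in place of the texture fetch and the same worldScale factor:

```python
import math

def sobel_normal(h, x, y, ps, world_scale):
    """CPU sanity check of the shader's Sobel normal.
    h(x, y) returns the height at a sample position; ps is the
    texel step and world_scale matches the shader's worldScale."""
    nx = -(h(x + ps, y + ps) - h(x - ps, y + ps)
           + 2 * (h(x + ps, y) - h(x - ps, y))
           + h(x + ps, y - ps) - h(x - ps, y - ps))
    ny = -(h(x - ps, y - ps) - h(x - ps, y + ps)
           + 2 * (h(x, y - ps) - h(x, y + ps))
           + h(x + ps, y - ps) - h(x + ps, y + ps))
    nz = 1.0 / world_scale
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    return (nx / length, ny / length, nz / length)

# A flat heightmap should give a straight-up normal (0, 0, 1),
# and a ramp rising in +x should tilt the normal away from +x.
flat = sobel_normal(lambda x, y: 5.0, 0.0, 0.0, 1.0, 1.0)
ramp = sobel_normal(lambda x, y: x, 0.0, 0.0, 1.0, 1.0)
```

Feeding it a constant heightmap and a simple ramp is a quick way to catch sign or term mistakes like the one above.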


There is still a bunch of things to decide regarding how to interpret the data for LOD transitions, and whether to use mipmaps or just secondary lower-resolution textures.


Is there a way to control how DeviceContext->GenerateMips works, e.g. which filter it uses? I couldn't find anything.
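As far as I can tell, GenerateMips leaves the filter up to the implementation and will not renormalize normals after averaging them. A hedged sketch of doing one downsample step by hand instead, assuming the normal map is a row-major list of unit vectors:

```python
import math

def downsample_normals(mip, w, h):
    """Box-filter one mip level of a normal map (row-major list of
    (x, y, z) unit vectors), renormalizing each averaged normal.
    GenerateMips offers no control over this; doing it by hand does."""
    out = []
    for y in range(0, h, 2):
        for x in range(0, w, 2):
            quad = [mip[(y + dy) * w + (x + dx)]
                    for dy in (0, 1) for dx in (0, 1)]
            ax = sum(n[0] for n in quad) / 4.0
            ay = sum(n[1] for n in quad) / 4.0
            az = sum(n[2] for n in quad) / 4.0
            length = math.sqrt(ax * ax + ay * ay + az * az)
            out.append((ax / length, ay / length, az / length))
    return out
```

Averaging two opposite tilts, e.g. (0.6, 0, 0.8) and (-0.6, 0, 0.8), yields (0, 0, 0.8), which only becomes a valid normal again after the renormalize step.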


Additionally, since for LOD I am generating every chunk separately using noise, I have reintroduced the issue of T-seams at chunk borders...

In Topic: Question about nDotL across LOD levels

10 August 2015 - 09:32 AM

I'm not really sure what curve you are talking about, but have you verified that the normal vector is correct and normalized? (Even if it's normalized in the vertex shader, that does not mean it will be normalized in the pixel shader, due to interpolation.)


There's a good point that was raised but didn't get attention: You need to normalize the normals.


Sampling a normal from a bilinear or trilinear fetch won't result in a normalized normal, even if the pixels themselves are normalized.

If you sample right in the middle between (-0.70711; 0.70711) and (0.70711; 0.70711), the interpolated normal will be (0; 0.70711), which is not normalized. The correct, normalized result is (0; 1).
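That interpolate-then-renormalize step can be sketched in Python:

```python
import math

def lerp_normal(a, b, t):
    """Linearly interpolate two 2-D unit normals (which is what a
    bilinear texture fetch does) and renormalize the result.
    Returns the renormalized normal and the pre-normalization length."""
    x = a[0] + (b[0] - a[0]) * t
    y = a[1] + (b[1] - a[1]) * t
    length = math.sqrt(x * x + y * y)
    return (x / length, y / length), length

a = (-0.70711, 0.70711)
b = (0.70711, 0.70711)
n, raw_len = lerp_normal(a, b, 0.5)
# raw_len comes out ~0.70711, not 1.0, so the fetched normal must be
# renormalized; after normalization n is (0, 1).
```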


As a side note, it looks like you're not doing gamma-correct rendering. Try rendering to sRGB render targets.


The normals are normalized, in the PS too.


And I am using gamma-correct rendering. The debug maps you see on the left are all run through a pixel shader. The height map is made more human-readable by coloring terrain above water gray and below it blue, and the normal map is rendered in "fake non-sRGB mode".


Universal gamma-correct rendering is fairly new; in the past people would just output their linear normals from the shaders straight to the screen, and I have gotten used to the look of normal maps displayed wrong like that. So I wrote a little pixel shader to fake that look on a gamma-corrected renderer.
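One way to fake that look is to write the sRGB-decoded value from the pixel shader, so the render target's sRGB encoding cancels it out and the raw linear normal reaches the screen. A sketch using the standard sRGB transfer functions (the function names here are illustrative):

```python
def srgb_encode(c):
    """Linear -> sRGB transfer function (what an sRGB render target applies)."""
    if c <= 0.0031308:
        return 12.92 * c
    return 1.055 * c ** (1.0 / 2.4) - 0.055

def srgb_decode(c):
    """sRGB -> linear; outputting this from the shader cancels the
    target's encoding, mimicking the old 'wrong' normal-map display."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

# encode(decode(c)) round-trips, so the screen shows the raw value c.
shown = srgb_encode(srgb_decode(0.5))
```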

In Topic: Question about nDotL across LOD levels

10 August 2015 - 08:20 AM

So I tried two blending methods for the highest detail LOD based loosely on more physically sound operations, and got these two attached results.


Man, rendering...


I'll go with 5 from my previous reply for the stylized look and with 7 from this reply for the "realistic" look, at least until I can shed more light on the problem.

In Topic: Question about nDotL across LOD levels

10 August 2015 - 08:00 AM

Thanks for the input Krohm!



(b) I have no idea what curve you're talking about. What I see seems to be a normal map... in world space I assume. It seems convincing. Yes, in theory you should try to stick close to it.


Of course it does. You're sampling different points, you get different results. When it comes to terrains, you don't sample them at some random interval you decide: you sample them at native resolution, stepping across adjacent samples. If you have some interpolation method you might think about super-sampling but that's backwards. The heightmaps from which you pull normals must be the highest resolution you have and then eventually bake them to a normalmap.


I'm talking about the general shape of the curve along which nDotL goes from 1 to 0 as the sampling rate changes.


I too believe that in theory you should use your maximum-resolution/LOD0 height map to get the normal map. But I do not like the visual result I get when I build it at LOD0. Maybe I am building it wrong! The results get worse and worse as I increase the resolution. Here are the results for 4096x4096:


Attached image: nn01.png (537.73 KB)


Maybe it is correct, but I do not think so.


If I go to LOD2 (4 times lower resolution), I get a bit better normals:


Attached image: nn02.png (632.17 KB)


Going to LOD4, the second-lowest in quality, I get this result:


Attached image: nn03.png (668.26 KB)


I decided to try some things out. Here is a normal LOD5 shot, the lowest quality, with regular normals:


Attached image: nn04.png (650.13 KB)


And here is a shot that uses a high-resolution normal map corresponding to LOD0, only using some physically very unsound blending of normals:


Attached image: nn05.png (710.86 KB)


I need to try some more physically sound blending.
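One commonly used, more principled option is "whiteout" blending: sum the x/y slopes of the two normals, multiply the z components, and renormalize. A Python sketch, assuming both normals are unit length and expressed in the same space:

```python
import math

def whiteout_blend(base, detail):
    """Whiteout blending of two unit normals: add the x/y slopes,
    multiply z, and renormalize the result."""
    x = base[0] + detail[0]
    y = base[1] + detail[1]
    z = base[2] * detail[2]
    length = math.sqrt(x * x + y * y + z * z)
    return (x / length, y / length, z / length)

# Blending with a flat detail normal (0, 0, 1) leaves the base unchanged,
# which is the sanity property naive averaging does not have.
same = whiteout_blend((0.6, 0.0, 0.8), (0.0, 0.0, 1.0))
```

The nice property is that a flat detail normal contributes nothing, and two slopes in the same direction reinforce each other instead of averaging out.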


I have no idea yet which direction to follow. More like the first screenshot, or more like the last or next-to-last? Artistically I like the last ones.




You are not going to figure it out with a test set like the one you're using. Pull in a special test case; the correct result should be pretty trivial to identify. Besides, artistic decisions might apply.


What kind of special test case do you have in mind?

In Topic: I guess I managed to mess up GPU perlin?

09 August 2015 - 08:20 PM

Here is one thing I do not understand...


I generate a texture map with values in [0..1], where 0 is the deepest ocean floor and 1 the highest mountain, 0.5 being sea level, using GPU simplex noise.


In my vertex shader, I sample that texture:

input.Position.y = permTexture2d.SampleLevel(permSampler2d, float4(input.texCoord, 0, 0), 0).x * 30;
float4 worldPosition = mul(float4(input.Position, 1), World);
output.Position = mul(worldPosition, ViewProj);

In the pixel shader I used to take the world position and use its y coordinate to do the coloring, like:

float inz = ((input.wPos.y / 30) - 0.5) * 2;
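That remap just rescales the [0, 30] world height back to [-1, 1] around sea level; as a plain function (assuming the same 30-unit height scale):

```python
def height_to_signed(y, height_scale=30.0):
    """Map a world-space height in [0, height_scale] back to [-1, 1],
    with 0 at sea level (0.5 of the original noise range)."""
    return (y / height_scale - 0.5) * 2.0
```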

Since at the moment I am using just low-resolution meshes for the terrain, I did not like the interpolation artifacts. Plus, using the y coordinate causes very flat color gradients. So I decided to sample the same map as in the vertex shader, getting the actual height at that point:

float inz = (permTexture2d.SampleLevel(permSampler2d, float4(input.texCoord, 0, 0), 0).x - 0.5) * 2;

And this is what I get when I render all values inz <= 0 as black and the rest with some colors:


Attached image: 1002_06.png (555.07 KB)


I really can't figure out why I get that pattern. It does not look like interpolation errors. It is not a big issue, and I can work around it, but I am very curious what causes it. The shape depends on the size of the input maps, so it is related to sampling somehow...