Generating a Normal Map


Ok, so I'm hoping this is a fairly straightforward question, but I can't find any useful information on it anywhere. Basically, in a project I'm working on, I create a procedural planet. I procedurally generate a landscape texture that is used as the surface for the planet, so these textures are always random and different. I'm wondering if I could somehow generate a normal map for that texture on the CPU and store it, but I'm unsure how to create a normal map based off a texture.

Cheers,
Scott.

You can take the difference in the x and y directions from your heightfield, from which you can construct two vectors that are (more or less) tangential to the surface at that point. The cross product of the two will give you a vector that is perpendicular to them; normalizing this vector gives you a normal.

If you want better precision and a smoother overall look, you do that for all 8 adjacent height values and average the normals.
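The recipe above can be sketched in a few lines of Python with NumPy (the array layout, scale parameters, and function name are my own illustration, not code from anyone's project):

```python
import numpy as np

def heightfield_normals(height, xy_scale=1.0, z_scale=1.0):
    """Per-pixel normals from a 2D heightfield via tangent cross products."""
    h = height * z_scale
    # Differences along x and y give two vectors tangential to the surface:
    # tx = (xy_scale, 0, dh/dx), ty = (0, xy_scale, dh/dy)
    dx = np.gradient(h, axis=1)
    dy = np.gradient(h, axis=0)
    # cross(tx, ty) = (-xy_scale*dh/dx, -xy_scale*dh/dy, xy_scale^2)
    n = np.dstack((-xy_scale * dx, -xy_scale * dy,
                   np.full_like(h, xy_scale * xy_scale)))
    # Normalize so every output vector is unit length
    n /= np.linalg.norm(n, axis=2, keepdims=True)
    return n
```

`np.gradient` uses central differences, so it already does a mild version of the neighbour averaging mentioned above.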

Hate to be a pain, but could you demonstrate what you mean with a simple pseudocode example? What you said is a little blurry to me.

Cheers.

Given a depth map, decide how far apart the pixels should be spread in x and y, and how much z distance a change in height stands for.

You've then effectively made "3D" from "2D".

You can then calculate a surface vector by sampling a pixel and a neighbouring pixel and making a vector between them. Do this twice: once to the pixel above, and once to the pixel beside it.

Then the cross product of these two surface vectors is the normal.



VEC xyzdist;

xyzdist.x=1.0f; //modify these values to get the output you'd like
xyzdist.y=1.0f;
xyzdist.z=1.0f;

for(i=0;i<mapsizex-1;i++) //stop one short so the i+1 / j+1 samples stay in range
{
    for(j=0;j<mapsizey-1;j++)
    {
        //sample this pixel plus its neighbours above and beside it
        VEC surfacesample0=VEC(i*xyzdist.x,j*xyzdist.y,heightmap[i+j*mapsizex]*xyzdist.z);
        VEC surfacesample1=VEC(i*xyzdist.x,(j+1)*xyzdist.y,heightmap[i+(j+1)*mapsizex]*xyzdist.z);
        VEC surfacesample2=VEC((i+1)*xyzdist.x,j*xyzdist.y,heightmap[(i+1)+j*mapsizex]*xyzdist.z);

        VEC surfacevec0=surfacesample1-surfacesample0;
        VEC surfacevec1=surfacesample2-surfacesample0;

        surfacevec0=Normalize(surfacevec0);
        surfacevec1=Normalize(surfacevec1);

        VEC surfacenormal=Cross(surfacevec0,surfacevec1);

        surfacenormal=Normalize(surfacenormal);

        normaloutput[i+j*mapsizex]=surfacenormal;
    }
}




Your terrain is laid out as a regular grid, which means that you already know two components of the tangent/bitangent vectors: they are 0 and 1, respectively (it may actually be 0 and any number, depending on your scale, but it does not really matter as long as you consistently use the same number, because only the direction is what you're interested in).
So, in pseudocode, for the tangent:
vec3 t(1, 0, height[y][x+1] - height[y][x]);

It's the exact same for the bitangent, only swapping 0 and 1, and using y+1 instead of x+1.

The cross product gives you a vector orthogonal to two (non-zero, non-parallel) vectors. So cross(t, u) will give you an (unnormalized) normal corresponding to the triangle formed by the three points used.
Because your input vectors are not unit length (they are longer, unless the terrain is flat), the length of the resulting normal will not be 1, so the vector needs to be renormalized to really be a normal.
If you skip this, your lighting will be too bright in steep places.

Now, this will give you the normal for one triangle touching a point. If the terrain is flat, all triangles will have the same normal; if it is not, they will differ, so you may see a more or less disturbing discontinuity from one vertex to the next. That is the reason you may want to average several normals. It depends on whether the discontinuity is disturbing or not (or maybe even desired), and on how much computational power you can invest.

You can find a dozen different algorithms for generating terrain normals via Google, including an article on this site. Some of them do this averaging implicitly.
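As a concrete illustration of that tangent/bitangent shortcut, here's a small Python/NumPy sketch (the helper name and the four height-sample arguments are hypothetical, just to show the arithmetic):

```python
import numpy as np

def grid_normal(h_x1, h_x0, h_y1, h_y0):
    """Normal from tangent t = (1, 0, dhx) and bitangent b = (0, 1, dhy)."""
    t = np.array([1.0, 0.0, h_x1 - h_x0])   # tangent along x
    b = np.array([0.0, 1.0, h_y1 - h_y0])   # bitangent along y
    n = np.cross(t, b)                      # works out to (-dhx, -dhy, 1)
    return n / np.linalg.norm(n)            # renormalize, or steep areas light too bright
```

Note that cross((1, 0, dhx), (0, 1, dhy)) collapses to (-dhx, -dhy, 1), which is why many heightmap-to-normal-map converters never form the tangents explicitly.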

Thanks for the replies, guys. I'm doing all my calculations before the game runs, so it's all precalculated; I don't have to worry about how much is going on at run time.

I'm not sure if I explained myself very well. If any of you are familiar with a program called CrazyBump, I want to produce an image result like that. So no matter the image I put in, I was thinking I might be able to have an outcome like:

image

Using your algorithm, rouncED (not sure if I implemented part of it right: I take the float samples from the colour values as R + G + B). And don't worry about the -1 in the for loops for now.

I'm generating a Perlin noise texture that's spherized so it wraps around the sphere mesh perfectly. It's random colours for different planets, but for the sake of what I'm showing you I just made it white and blue, so it's cloud-like. There are times where it could be yellow, brown, violet, etc., though, which is why I'm not too sure how to generate a normal map like the one above. Anyway, I implemented the algorithm like so:


Color[] normalArray = new Color[squared * squared];

Vector3 xyzdist = new Vector3(1.0f, 1.0f, 1.0f);
for (int i = 0; i < squared - 1; i++)
{
    for (int j = 0; j < squared - 1; j++)
    {
        // treat the summed colour channels as the height at each texel
        float sample0 = (colorArray[i + j * squared].R + colorArray[i + j * squared].G + colorArray[i + j * squared].B);
        float sample1 = (colorArray[i + (j + 1) * squared].R + colorArray[i + (j + 1) * squared].G + colorArray[i + (j + 1) * squared].B);
        float sample2 = (colorArray[(i + 1) + j * squared].R + colorArray[(i + 1) + j * squared].G + colorArray[(i + 1) + j * squared].B);

        Vector3 surfacesample0 = new Vector3(i * xyzdist.X, j * xyzdist.Y, sample0 * xyzdist.Z);
        Vector3 surfacesample1 = new Vector3(i * xyzdist.X, (j + 1) * xyzdist.Y, sample1 * xyzdist.Z);
        Vector3 surfacesample2 = new Vector3((i + 1) * xyzdist.X, j * xyzdist.Y, sample2 * xyzdist.Z);

        Vector3 surfacevec0 = surfacesample1 - surfacesample0;
        Vector3 surfacevec1 = surfacesample2 - surfacesample0;

        surfacevec0.Normalize();
        surfacevec1.Normalize();

        Vector3 surfacenormal;
        Vector3.Cross(ref surfacevec0, ref surfacevec1, out surfacenormal);

        surfacenormal.Normalize();

        normalArray[i + j * squared] = new Color(surfacenormal);
    }
}
texture = new Texture2D(Device, squared, squared);
texture.SetData(normalArray);



Heres the image that was used:

image

And here's the map generated off this type of noise image. Obviously it's not the same image, but it's the same Perlin generation. Also, I'm just displaying the image that was generated from the input image; I'm not actually using it as a normal map yet.

image

Thanks for the help so far guys :)

Hey, great! I told you what to write and there it goes!
Hey, that actually almost looks right.
We are almost there...

Looks like you might have to invert the z, because it's not looking blue...

So try that. But you've got to remember I only told you how to get world-space normals. You see, textures are stored in an unsigned format, so you don't get negative values.

To convert to unsigned you have to add 1 to it and divide it by two, like this:

red=(normal.x+1)/2;
green=(normal.y+1)/2;
blue=(normal.z+1)/2;

Then in the normal-mapping shader, you have to do this:

x=red*2-1;
y=green*2-1;
z=blue*2-1;

to restore the normal back to negative values. (But to just use the normal map as a colour map, to see it look right like that normal map you showed us, you needn't convert back to negatives yet.)
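That signed-to-unsigned round trip can be sanity-checked in a couple of lines of Python (just illustrating the mapping, not anyone's engine code):

```python
def encode(component):
    """Signed normal component in [-1, 1] -> unsigned texture value in [0, 1]."""
    return (component + 1.0) / 2.0

def decode(texel):
    """Shader-side inverse: [0, 1] -> [-1, 1]."""
    return texel * 2.0 - 1.0
```

A flat normal (0, 0, 1) encodes to (0.5, 0.5, 1.0), which is exactly why a correct tangent-space normal map looks mostly light blue.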



But either invert the z, or reverse the cross product; in other words, go cross(surfacevec1, surfacevec0) instead of the other way around, like it is now.

Actually, try reversing the cross product; it would be more correct than inverting the z.

Don't forget to show me a picture once you've done this!

Awesome work, man! You really make me feel like I'm making a difference; I never get that...

Awesome, man, thanks for the help :D! So, doing it the first way you said, inverting the Z and turning the values into unsigned numbers:

surfacenormal.Z = -surfacenormal.Z;
surfacenormal.X = (surfacenormal.X + 1.0f / 2.0f);
surfacenormal.Y = (surfacenormal.Y + 1.0f / 2.0f);
surfacenormal.Z = (surfacenormal.Z + 1.0f / 2.0f);

normalArray[i + j * squared] = new Color(surfacenormal);

I get:

image

And by flipping the cross product the other way around instead of inverting the Z I get:

image

These images seem much more correct now, but would I be wrong in thinking there is maybe a little too much red? I'm not too sure, but either way I'll test this out in a shader now and see how it goes. Thanks a heap :)!!!

Ok, I feel a little stupid now. To use this as a normal map in a shader, am I going to need to know the tangents and binormals? If so, how am I meant to generate the tangents from my diffuse texture and new normal texture?

Alright, so I've tried applying the normal map and have these two results; I personally can't tell any difference. So I'm going to presume I'm doing it wrong, or I possibly need to calculate and use the tangents and whatnot somehow, or maybe even add specular lighting or something. I wouldn't say this is my strong point in programming, so I'm stumped at the moment.

Here's the pixel shader calculation WITHOUT the normal map, and the result:
TexPixelToFrame TexturedPS(TexVertexToPixel PSIn) 
{
TexPixelToFrame Output = (TexPixelToFrame)0;
float4 returnColor = 0;

float4 planetColor = tex2D(TextureSampler1, PSIn.TextureCoords);

returnColor = planetColor;
Output.Color.rgba = returnColor;
Output.Color.rgb *= saturate(PSIn.LightingFactor);

return Output;
}

image

Here's the pixel shader calculation WITH the normal map, and the result:
TexPixelToFrame TexNormPS(TexVertexToPixel PSIn) 
{
TexPixelToFrame Output = (TexPixelToFrame)0;

float3 N = tex2D( NormalSampler1, PSIn.TextureCoords ).xyz * 2.0 -1.0;
float NdotL = saturate( dot(N, xLightDirection * 2.0 - 1.0) );

float3 C = saturate( lightColor * NdotL + ambientColor) * (float3)tex2D(TextureSampler1, PSIn.TextureCoords).xyz;

Output.Color = float4(C, 1);
Output.Color.rgb *= saturate(PSIn.LightingFactor);

return Output;
}


image

For all I know I could be going about this totally wrong, so any help would be greatly appreciated. Thanks, guys, for all the help so far :)

Ok... first things first.

You forgot to put brackets around the addition: it should be surfacenormal.x = (surfacenormal.x + 1) / 2; (don't forget the brackets around the PLUS! The divide parses BEFORE it!)

And secondly, don't convert the light direction!

Only convert the normal!


And thirdly!

Tangent space takes ages to explain properly, but see if you can understand this in two seconds: you have to refurbish your vertex structure to include an up and a right as well as the normal.

TANGENT SPACE IS DERIVATIVE SPACE; THE 3D MODEL COUNTS AS A FUNCTION OF A SPHERE, SO YOU CAN SPHERICALLY CALCULATE TANGENT COORDINATES.

A tangent basis is RIGHT, UP, AND NORMAL.

To make tangent space, do this.

I know you prefer looking at pseudo-C, so rinse and repeat this...




//in the application (this calculates tangent space; do it when you make the normal map)

int i;
for(i=0;i<VERTSINSPHERE;i++) //note: this could be a 2-dimensional array if you prefer
{
    v[i].nor=Normalize(v[i].pos);
    v[i].up=VEC(0,1,0);
    v[i].right=Normalize(Cross(v[i].nor,v[i].up));
    v[i].up=Cross(v[i].nor,v[i].right);
}
//lock it into a hardware resource...


// then in the pixel shader

float4 ps(VS_INPUT psin) : COLOR0
{
    float4x4 TangentMatrix;
    TangentMatrix._11=psin.right.x; TangentMatrix._21=psin.right.y; TangentMatrix._31=psin.right.z; TangentMatrix._41=0;
    TangentMatrix._12=psin.up.x;    TangentMatrix._22=psin.up.y;    TangentMatrix._32=psin.up.z;    TangentMatrix._42=0;
    TangentMatrix._13=psin.nor.x;   TangentMatrix._23=psin.nor.y;   TangentMatrix._33=psin.nor.z;   TangentMatrix._43=0;
    TangentMatrix._14=0;            TangentMatrix._24=0;            TangentMatrix._34=0;            TangentMatrix._44=1;

    float3 lightdir_in_tangent_space=mul(light_dir, TangentMatrix);
    float3 nor=tex2D(normap, float2(psin.uv)).rgb*2-1;

    float pixel_light=dot(lightdir_in_tangent_space, nor);

    return float4(pixel_light, pixel_light, pixel_light, 1);
}

[edit] It's more proper to normalize the interpolated up, right, and normal before you make the tangent matrix, because they only interpolate linearly instead of rotating properly along the triangle. [/edit]

I bet if you listen to me this time you'll get a nice ol' bump-mapped planet. ;)

[Edited by - rouncED on October 21, 2010 1:14:57 AM]

Ok, I have everything mapped out how you said. One question, though: how should I be inserting the actual diffuse/skin texture of the planet into your pixel shader function?

To make things a little clearer, just in case I've done some of the tangent calculation incorrectly, this is how I tried to implement your post.

Vertex:

public struct VertexPositionNormalUpRightTexture
{
public Vector3 Position;
public Vector3 Normal;
public Vector3 Up;
public Vector3 Right;
public Vector2 Texture;

public static int SizeInBytes = (3 + 3 + 3 + 3 + 2) * sizeof(float);
public static VertexElement[] VertexElements = new VertexElement[]
{
new VertexElement(0, 0, VertexElementFormat.Vector3, VertexElementMethod.Default, VertexElementUsage.Position, 0),
new VertexElement(0, sizeof(float) * 3, VertexElementFormat.Vector3, VertexElementMethod.Default, VertexElementUsage.Normal, 0 ),
new VertexElement(0, sizeof(float) * 6, VertexElementFormat.Vector3, VertexElementMethod.Default, VertexElementUsage.Position, 1),
new VertexElement(0, sizeof(float) * 9, VertexElementFormat.Vector3, VertexElementMethod.Default, VertexElementUsage.Position, 2),
new VertexElement(0, sizeof(float) * 12, VertexElementFormat.Vector2, VertexElementMethod.Default, VertexElementUsage.TextureCoordinate, 0),
};
}


Sphere Generation:

public void CreateSphere(ref VertexPositionNormalUpRightTexture[] vpnt, ref int[] indices, int m_NumSlices, int m_NumStacks, float m_Radius)
{
int tDiv = m_NumSlices;
int pDiv = m_NumStacks;
float dt = ((float)Math.PI * 2) / tDiv;
float dp = (float)Math.PI / pDiv;

int vertex = 0;
for (int pi = 0; pi <= pDiv; pi++)
{
float phi = pi * dp;

for (int ti = 0; ti <= tDiv; ti++)
{
// we want to start the mesh on the x axis
float theta = ti * dt;

vpnt[vertex].Position = GetPosition(theta, phi, m_Radius);
vpnt[vertex].Normal = GetNormal(theta, phi);
vpnt[vertex].Texture = GetTextureCoordinate(theta, phi);
vertex++;
}
}

int index = 0;
for (int pi = 0; pi < pDiv; pi++)
{
for (int ti = 0; ti < tDiv; ti++)
{
int x0 = ti;
int x1 = (ti + 1);
int y0 = pi * (tDiv + 1);
int y1 = (pi + 1) * (tDiv + 1);

indices[index] = x0 + y0;
indices[index + 1] = x0 + y1;
indices[index + 2] = x1 + y0;

indices[index + 3] = x1 + y0;
indices[index + 4] = x0 + y1;
indices[index + 5] = x1 + y1;
index += 6;
}
}
for (int i = 0; i < vertex; i++)
{
Vector3.Normalize(ref vpnt[i].Position, out vpnt[i].Normal );
vpnt[i].Up = new Vector3(0, 1, 0);
Vector3 cross = Vector3.Cross(vpnt[i].Normal, vpnt[i].Up);
Vector3.Normalize(ref cross, out vpnt[i].Right);
vpnt[i].Up = Vector3.Cross(vpnt[i].Normal, vpnt[i].Right);
}
}



Vertex and Pixel Shader:

TexVertexToPixel TexturedVS( float4 inPos : POSITION0, float3 inNormal: NORMAL0, float4 inUp : POSITION1, float4 inRight : POSITION2 ,float2 inTexCoords: TEXCOORD0)
{
TexVertexToPixel Output = (TexVertexToPixel)0;
float4x4 preViewProjection = mul (xView, xProjection);
float4x4 preWorldViewProjection = mul (xWorld, preViewProjection);

Output.Position = mul(inPos, preWorldViewProjection);
Output.TextureCoords = inTexCoords;

//enable for better light...
float3 Normal = normalize(mul(normalize(inNormal), xWorld));
Output.LightingFactor = 1;
if (xEnableLighting)
Output.LightingFactor = saturate(dot(Normal, -xLightDirection));

Output.Color = float4(1.0f, 0.0f, 0.0f, 1.0f);

Output.vView = -xView[3].xyz - Output.Position;

Output.UP = mul(inUp, preWorldViewProjection);
Output.RIGHT = mul(inRight, preWorldViewProjection);
Output.NORMAL = mul(inNormal, preWorldViewProjection);
//Output.UP = inUp;
//Output.RIGHT = inRight;
//Output.NORMAL = inNormal;

return Output;
}

TexPixelToFrame GDPS(TexVertexToPixel PSIn)
{
TexPixelToFrame Output = (TexPixelToFrame)0;

//float4x4 TangentMatrix;
TangentMatrix[0].x=PSIn.RIGHT.x; TangentMatrix[0].y=PSIn.RIGHT.y; TangentMatrix[0].z=PSIn.RIGHT.z; TangentMatrix[0].w=0;
TangentMatrix[1].x=PSIn.UP.x; TangentMatrix[1].y=PSIn.UP.y; TangentMatrix[1].z=PSIn.UP.z; TangentMatrix[1].w=0;
TangentMatrix[2].x=PSIn.NORMAL.x; TangentMatrix[2].y=PSIn.NORMAL.y; TangentMatrix[2].z=PSIn.NORMAL.z; TangentMatrix[2].w=0;
TangentMatrix[3].x=0; TangentMatrix[3].y=0; TangentMatrix[3].z=0; TangentMatrix[3].w=1;

float3 lightdir_in_tangent_space = mul(xLightDirection, TangentMatrix);
float3 nor=tex2D(NormalSampler1, float2(PSIn.TextureCoords)).rgb*2-1;

float3 pixel_light=dot(lightdir_in_tangent_space, nor);

Output.Color = float4(pixel_light.r,pixel_light.g,pixel_light.b, 1);

return Output;
}



I could be wrong in my vertex shader where I set my normal, up and right.

Anyway, the result this produces is as such:

image

I took away my lighting calculation, which makes half the planet dark; when I had it turned on you could see the lighting shifting a little, which was weird. Apart from that, when you fly around the planet you can see all the black and white moving as if the normal map is working how it should. Just thought I'd put this here, though, in case I'm doing something wrong.

Thanks :)

You shouldn't be doing this:



Output.UP = mul(inUp, preWorldViewProjection);
Output.RIGHT = mul(inRight, preWorldViewProjection);
Output.NORMAL = mul(inNormal, preWorldViewProjection);


//do this instead.

Output.UP = inUp;
Output.RIGHT = inRight;
Output.NORMAL = inNormal;







And yes :) you can include a diffuse texture if you so desire, and specular helped make all the early bump-mapping demos.

I see you had that commented out, so you already tried it. If it's working but the light's going the wrong direction, it's because the tangent space could be crossed around the wrong way.

So that could be it too. Get back to me, I wanna see you solve this!

Ah ok, cool. I changed it to what you said, but as I'm a shader nub I'm confused about how I'm meant to apply the normal calculations to the diffuse texture. How would you go about doing that? I'll post a result of what I get with the way I'm doing it.

EDIT
If I make the changes you said, and then also change the pixel shader to:

Output.Color.rgb = tex2D(TextureSampler1, float2(PSIn.TextureCoords)).rgb;
Output.Color += float4(pixel_light.r,pixel_light.g,pixel_light.b, 1);

image

I figure this is wrong, as the planets don't look so pretty like this.



float4 outcol=tex2D(Diffusesampler,input.uv);

outcol.r*=light.r; //here it modulates the texture with the light
outcol.g*=light.g;
outcol.b*=light.b;

return outcol;

Is it still too bright? And I think you should go up a step of resolution; it could look more pro...

Try taking the dot product with the negative of the normal in the pixel shader.

Changing to 0.6 fixes it a little bit, but it still looks pretty funky. And how would I insert what you just said into the pixel shader? How am I meant to actually incorporate it?

Thanks for your patience and your help man :) I really appreciate it :)

I want to see you nail this!

And on second thought, lower xyzdist.z even more, like to 0.1f or maybe even 0.025f, somewhere between those two.


TexPixelToFrame GDPS(TexVertexToPixel PSIn)
{
    TexPixelToFrame Output = (TexPixelToFrame)0;

    //float4x4 TangentMatrix;
    TangentMatrix[0].x=PSIn.RIGHT.x; TangentMatrix[0].y=PSIn.RIGHT.y; TangentMatrix[0].z=PSIn.RIGHT.z; TangentMatrix[0].w=0;
    TangentMatrix[1].x=PSIn.UP.x; TangentMatrix[1].y=PSIn.UP.y; TangentMatrix[1].z=PSIn.UP.z; TangentMatrix[1].w=0;
    TangentMatrix[2].x=PSIn.NORMAL.x; TangentMatrix[2].y=PSIn.NORMAL.y; TangentMatrix[2].z=PSIn.NORMAL.z; TangentMatrix[2].w=0;
    TangentMatrix[3].x=0; TangentMatrix[3].y=0; TangentMatrix[3].z=0; TangentMatrix[3].w=1;

    float3 lightdir_in_tangent_space = mul(xLightDirection, TangentMatrix);
    float3 nor=tex2D(NormalSampler1, float2(PSIn.TextureCoords)).rgb*2-1;

    float pixel_light=dot(lightdir_in_tangent_space, -nor); //DON'T FORGET THE MODIFICATION ON THIS LINE

    Output.Color = tex2D(DiffuseSampler1, float2(PSIn.TextureCoords)); //HERE, SIMPLY SAMPLE THE DIFFUSE

    Output.Color*=pixel_light; //MODULATE IT WITH THE LIGHT (sorry, I made a mistake before; you only need "white" light)
    Output.Color.a=1.0f;

    return Output;
}

Share this post


Link to post
Share on other sites
Try different kinds of cross products when you make the tangent space; maybe a fork is going in the wrong direction.

Remember, keep xyzdist.x and xyzdist.y at 1; just shrink xyzdist.z till it looks more appropriately scaled (like there isn't black all over it).


//try this

for (int i = 0; i < vertex; i++)
{
    Vector3.Normalize(ref vpnt[i].Position, out vpnt[i].Normal);
    vpnt[i].Up = new Vector3(0, 1, 0);
    Vector3 cross = Vector3.Cross(vpnt[i].Up, vpnt[i].Normal);
    Vector3.Normalize(ref cross, out vpnt[i].Right);
    vpnt[i].Up = Vector3.Cross(vpnt[i].Normal, vpnt[i].Right);
}

//also try this

for (int i = 0; i < vertex; i++)
{
    Vector3.Normalize(ref vpnt[i].Position, out vpnt[i].Normal);
    vpnt[i].Up = new Vector3(0, 1, 0);
    Vector3 cross = Vector3.Cross(vpnt[i].Up, vpnt[i].Normal);
    Vector3.Normalize(ref cross, out vpnt[i].Right);
    vpnt[i].Up = Vector3.Cross(vpnt[i].Right, vpnt[i].Normal);
}

//also try this

for (int i = 0; i < vertex; i++)
{
    Vector3.Normalize(ref vpnt[i].Position, out vpnt[i].Normal);
    vpnt[i].Up = new Vector3(0, 1, 0);
    Vector3 cross = Vector3.Cross(vpnt[i].Normal, vpnt[i].Up);
    Vector3.Normalize(ref cross, out vpnt[i].Right);
    vpnt[i].Up = Vector3.Cross(vpnt[i].Right, vpnt[i].Normal);
}

//also try this

for (int i = 0; i < vertex; i++)
{
    Vector3.Normalize(ref vpnt[i].Position, out vpnt[i].Normal);
    vpnt[i].Up = new Vector3(0, 1, 0);
    Vector3 cross = Vector3.Cross(vpnt[i].Normal, vpnt[i].Up);
    Vector3.Normalize(ref cross, out vpnt[i].Right);
    vpnt[i].Up = Vector3.Cross(vpnt[i].Normal, vpnt[i].Right);
}
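Rather than trying orderings blindly, you can also check the handedness of each candidate basis directly: for a right-handed basis, cross(right, up) points along the normal. A small Python/NumPy sketch of that check (the function name is my own; feed it whatever vectors your sphere code produced):

```python
import numpy as np

def is_right_handed(right, up, normal, eps=1e-6):
    """True when cross(right, up) points the same way as the normal."""
    return float(np.dot(np.cross(right, up), normal)) > eps
```

If two of the four variants above pass this test and two fail, the passing ones agree on handedness, and picking between them is then just a matter of which way you want the lighting to lean.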






