terrain following

Recommended Posts

Hi, I've got my terrain mesh up and I've stored all of its normals in an array of vectors so I don't have to lock the vertex buffer every frame to read them. Now I want to make a character object rotate to match the angle of the ground below it.

1. Do I have to create an average normal for each face from the 3 vertex normals?
2. How do I get the angle to rotate the character? Is it the cross product of the character's position and the normal of the face it intersects with?

diagram

Thanks.

[Edited by - treeway on August 18, 2006 8:36:45 AM]

I'll move this over to M&P - I think it'll be more suitable over there.

If you've got an algebra book to hand you may want to read up on the various tricks involving planes. If you interpolate the height and have a surface normal then you've got a plane definition.

I'll stop now before I go too far down the wrong path - my algebra isn't quite as good as it used to be [lol]

hth
Jack

Is the terrain a regular grid of vertices, like a height map? If so you can determine which tri the character falls in directly from its position, without doing any intersection testing. Otherwise you'll have to cast a ray downwards from the character and intersect it with the terrain to determine the correct tri - do a Google search for ray-triangle intersection testing. If you're having to do this and your terrain is very large, you may also need to look into spatially partitioning the terrain to cut down on the number of intersection tests you have to perform.
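For the regular-grid case, a minimal sketch of picking the tri straight from the character's x/z - this assumes square cells of size cellSize, each split into two tris along the same diagonal, and the names are just for illustration:

// Hypothetical sketch: locate the tri under (x, z) on a regular-grid terrain
// without any ray-tri tests. Assumes square cells of size 'cellSize', each
// split into two tris along the same diagonal. All names are illustrative.
void FindTriangle(float x, float z, float cellSize,
                  int& cellX, int& cellZ, bool& upperTriangle)
{
    cellX = (int)(x / cellSize);
    cellZ = (int)(z / cellSize);

    // fractional position inside the cell, in [0, 1)
    float fx = (x / cellSize) - cellX;
    float fz = (z / cellSize) - cellZ;

    // points past the diagonal fall in the second tri of the cell
    upperTriangle = (fx + fz) > 1.0f;

    // cellX, cellZ and upperTriangle together index straight into your
    // vertex/normal arrays with no intersection testing
}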

Once you have the correct tri, the best solution is to interpolate the normals from the 3 verts which make up the tri in order to get the smoothest motion. You can do this by projecting the character position onto two of the tri edges and taking the distance that projection lies along the edge as a blending factor between the normals. This would look something like:


// don't trust this, I just made it up on the spot
vec3 BlendNormals(vec3* pVerts, vec3* pVertNormals, vec3& point)
{
    vec3 edges[2];
    edges[0] = pVerts[1] - pVerts[0];
    edges[1] = pVerts[2] - pVerts[0];

    float blends[2];
    blends[0] = (point - pVerts[0]).dot(edges[0]) / edges[0].length();
    blends[1] = (point - pVerts[0]).dot(edges[1]) / edges[1].length();

    vec3 norm = pVertNormals[0] + blends[0]*(pVertNormals[1] - pVertNormals[0]);
    norm = norm + blends[1]*(pVertNormals[2] - norm);

    return norm;
}


Unfortunately this would involve accessing the vert array, which you don't want to do. If your terrain isn't a regular grid of verts you're going to need access to the vert array anyway for the intersection testing, so you might want to consider storing two copies: one in the graphics buffer and the other in main memory for this stuff.

It's an irregular terrain - I've already stored each normal as a vector which can be indexed from the face intersected by the player's down vector using D3DXIntersect (see the diagram I added to my first post). I don't think I'm blending the normals properly - I'm just adding the three normals of the face together and then normalising the result. I will try your suggestion, thanks.
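For what it's worth, D3DXIntersect also returns the barycentric hit coordinates pU and pV for the face it hits, so those can be used directly as blend factors for the vertex normals. A rough sketch, where mesh, rayOrigin and the n0/n1/n2 normals are placeholders for your own data:

// Rough sketch: use the barycentric coords returned by D3DXIntersect to
// blend the three vertex normals of the hit face. 'mesh', 'rayOrigin',
// 'n0', 'n1' and 'n2' are placeholders for your own data.
BOOL  hit;
DWORD faceIndex;
FLOAT u, v, dist;
D3DXVECTOR3 rayDir(0.0f, -1.0f, 0.0f);   // straight down from the player

D3DXIntersect(mesh, &rayOrigin, &rayDir, &hit, &faceIndex, &u, &v, &dist, NULL, NULL);

if (hit)
{
    // n0, n1, n2 are the vertex normals of face 'faceIndex'
    D3DXVECTOR3 blended = n0 + u * (n1 - n0) + v * (n2 - n0);
    D3DXVec3Normalize(&blended, &blended);
}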

Somewhat off-topic, and you might already know this, but some of your normals are messed up to an extent. If you look at the quad on the top of the ramp, you'll notice that it has a slight gradient shading instead of the smooth shading I assume you want. It's interpolating between the straight-up normals of the vertices that aren't next to the ramp and the diagonal normals of the vertices that are next to the ramp. If you use these normals to change the slope of the player, this could be fairly confusing to the player.

Hmm, I've been giving it some more thought and I've realised that the maths I gave you will only give a correct blending for right-angled triangles. It's not far off, but not perfect. A better solution might be to calculate the distance from the point to each vert of the tri and use the ratios of these distances as your blending factors between the 3 vertex normals. Something like:


vec3 BlendNormals(vec3* pVerts, vec3* pVertNormals, vec3& point)
{
    float distances[3];
    distances[0] = (pVerts[0] - point).length();
    distances[1] = (pVerts[1] - point).length();
    distances[2] = (pVerts[2] - point).length();

    float totalDistance = distances[0] + distances[1] + distances[2];
    distances[0] /= totalDistance;
    distances[1] /= totalDistance;
    distances[2] /= totalDistance;

    vec3 norm;
    norm = distances[0]*pVertNormals[0];
    norm += distances[1]*pVertNormals[1];
    norm += distances[2]*pVertNormals[2];

    norm.Normalise();
    return norm;
}


I'm not completely convinced off the top of my head though; give it a try and see.

I see what you mean about the normals on the ramp, but I can't think of any other way to make the character follow an arbitrary terrain correctly. Are there any other ways of doing it? The player is colliding with the terrain correctly; it's just not rotating upwards when walking uphill or downwards when going downhill. I thought this would be a simple x rotation to

acosf(dot(playerForwardAxis, FaceNormal))

with the face normal as the normalised sum of the three vertex normals of the face, but it seems more difficult than that.

The trick to combining smooth surfaces and sharp edges in terrain data is to create duplicate (degenerate) verts along the sharp edges so that each joining face references a different set of normals. That way it's possible to have sharp changes in normal direction, and both your shading and your character orientation respond correctly. You can still use the normal blending in these scenarios too, since on a flat surface the referenced normals all point in the same direction and thus blend to give the same normal direction all over the tri.
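To illustrate that idea, a minimal sketch - the positions and normal values here are made up:

// Sketch only: the crease vertex is duplicated so each surface keeps its own
// normal. Positions and 'rampNormal' are made-up example values.
struct TerrainVertex
{
    D3DXVECTOR3 pos;
    D3DXVECTOR3 normal;
};

D3DXVECTOR3 rampNormal(0.0f, 0.707f, -0.707f);   // normal of the sloped faces

TerrainVertex creaseOnRamp = { D3DXVECTOR3(10.0f, 5.0f, 0.0f), rampNormal };
TerrainVertex creaseOnFlat = { D3DXVECTOR3(10.0f, 5.0f, 0.0f), D3DXVECTOR3(0.0f, 1.0f, 0.0f) };

// The ramp tris index creaseOnRamp and the flat tris index creaseOnFlat,
// so the blended normal changes cleanly at the edge instead of smearing across it.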

Thanks Motorherp. I'm working with DirectX, so I've converted your function to use D3DXVECTORs:

D3DXVECTOR3 BlendNormals(D3DXVECTOR3* pVerts, D3DXVECTOR3* pVertNormals, D3DXVECTOR3& point)
{
    // distance from the point to each vert (use a temporary so pVerts isn't overwritten)
    D3DXVECTOR3 diff;
    D3DXVECTOR3 distances;
    distances.x = D3DXVec3Length(D3DXVec3Subtract(&diff, &pVerts[0], &point));
    distances.y = D3DXVec3Length(D3DXVec3Subtract(&diff, &pVerts[1], &point));
    distances.z = D3DXVec3Length(D3DXVec3Subtract(&diff, &pVerts[2], &point));

    float totalDistance = distances.x + distances.y + distances.z;
    distances.x /= totalDistance;
    distances.y /= totalDistance;
    distances.z /= totalDistance;

    // accumulate the weighted normals in 'norm', scaling into a separate temporary
    D3DXVECTOR3 norm, scaled;
    D3DXVec3Scale(&norm, &pVertNormals[0], distances.x);
    norm += *D3DXVec3Scale(&scaled, &pVertNormals[1], distances.y);
    norm += *D3DXVec3Scale(&scaled, &pVertNormals[2], distances.z);

    D3DXVec3Normalize(&norm, &norm);

    return norm;
}

Is that right? Also, what is the point you specify as the third parameter?

Aha ... sorry to go back on myself, but I think I've got it sussed this time. The reason my first implementation only works for right-angled triangles is that otherwise the two blendings are no longer independent. The trick to solving this is to take one of the tri edges to calculate your first blend, and then use the vector which is tangent to this edge and the tri normal to calculate the second blend. You'd end up with this:


vec3 BlendNormals(vec3* pVerts, vec3* pVertNormals, vec3& point)
{
    vec3 edges[2];
    edges[0] = pVerts[1] - pVerts[0];
    edges[1] = pVerts[2] - pVerts[0];

    float blends[2];
    blends[0] = (point - pVerts[0]).dot(edges[0].normalise()) / edges[0].length();
    // not too sure about the normalise above. I think it's needed but there might be a better way

    vec3 triNorm = edges[0].cross(edges[1]);   // no need to normalise
    vec3 tangent = (triNorm.cross(edges[0])).normalise();

    float vert0Proj = tangent.dot(pVerts[0]);
    float vert2Proj = tangent.dot(pVerts[2]);

    blends[1] = (point.dot(tangent) - vert0Proj) / (vert2Proj - vert0Proj);

    vec3 norm = pVertNormals[0] + blends[0]*(pVertNormals[1] - pVertNormals[0]);
    norm = norm + blends[1]*(pVertNormals[2] - norm);

    return norm;
}


Anyways, 'point' just means the character position.

Quote:
Original post by treeway
Alright, it's returning the normal (I think) - now how do I rotate the character?


Hi again. There's not really any need to calculate angles and rotation axes to rotate the character; you can just construct the new orientation matrix manually, like so:


vec3 norm = BlendNormals(tri.verts, tri.vertNormals, character.pos);
character.yAxis = norm;
character.xAxis = (norm.cross(character.zAxis)).normalise();
character.zAxis = (norm.cross(character.xAxis)).normalise();
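In D3DX terms that might translate to something like the sketch below; the variable names are stand-ins for however you store the character's axes:

// Sketch of the same idea with D3DX; variable names are illustrative.
// 'up' is the blended ground normal and 'characterZAxis' the character's
// current forward axis.
D3DXVECTOR3 up = blendedGroundNormal;            // result of BlendNormals()
D3DXVECTOR3 xAxis, zAxis;

D3DXVec3Cross(&xAxis, &up, &characterZAxis);     // new side axis from new up and old forward
D3DXVec3Normalize(&xAxis, &xAxis);
D3DXVec3Cross(&zAxis, &up, &xAxis);              // new forward axis from new up and new side
D3DXVec3Normalize(&zAxis, &zAxis);

// pack the axes into a rotation matrix: in Direct3D's row-vector convention
// the first three rows of the world matrix are the object's local axes
D3DXMATRIX orientation;
D3DXMatrixIdentity(&orientation);
orientation._11 = xAxis.x; orientation._12 = xAxis.y; orientation._13 = xAxis.z;
orientation._21 = up.x;    orientation._22 = up.y;    orientation._23 = up.z;
orientation._31 = zAxis.x; orientation._32 = zAxis.y; orientation._33 = zAxis.z;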

Okay, I've calculated the rotation matrix, but I'd like to be able to get the actual angle of the ground below the player as a single 3D vector - how do I find this?

Thanks

[Edited by - treeway on August 20, 2006 9:29:18 AM]

This is how I'm trying to do it:

D3DXVECTOR3 faceRotation;
faceRotation = scene->terrain->BlendNormals(faceId,playerPosition);
D3DXVECTOR3 xAxis;
D3DXVECTOR3 yAxis;
D3DXVECTOR3 zAxis;
D3DXVec3Normalize(&yAxis,&faceRotation);
D3DXVec3Cross(&xAxis,&faceRotation,playerLeftAxis);
D3DXVec3Normalize(&xAxis,&xAxis);
D3DXVec3Cross(&zAxis,&faceRotation,playerforwardHandAxis);
D3DXVec3Normalize(&zAxis,&zAxis);

Xrotation = atanf(D3DXVec3Dot(playerLeftAxis,&xAxis));
Zrotation = atanf(D3DXVec3Dot(playerforwardHandAxis,&zAxis));
playerYRotation = atanf(D3DXVec3Dot(playerUpAxis,&yAxis));

playerObject->setRotation(Xrotation,playerYRotation,Zrotation);

and the rotation seems right on a flat surface (0, 90, 0), but on hills it fluctuates between 0 and -0.000041, both of which are wrong.

For a start, which implementation of BlendNormals() are you using? I've realised the second one I posted is completely bogus now that I've studied it. Use the last one instead, which should be correct barring any typos or minor mistakes, since I made it up on the spot. The idea is right though, so if you step through it with a debugger you should be able to fix any minor problems.

Secondly, which way round do you take the z and x axes? I usually use z for heading and x for left (or right, depending on handedness). If you're using the same then you're constructing the character orientation matrix incorrectly, since you should be crossing the up vector with the left vector to get the z axis.

Lastly, the way to calculate the angle between two vectors in 3D is like this:

angle = atan2(Length(v1.Cross(v2)), v1.Dot(v2));

Note that the angle will come out in radians, not degrees.
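In D3DX that translates to something like this sketch (a temporary is needed because D3DXVec3Cross writes through its first parameter):

// D3DX translation of the atan2 formula above; returns the angle in radians.
float AngleBetween(const D3DXVECTOR3& v1, const D3DXVECTOR3& v2)
{
    D3DXVECTOR3 crossResult;
    D3DXVec3Cross(&crossResult, &v1, &v2);
    return atan2f(D3DXVec3Length(&crossResult), D3DXVec3Dot(&v1, &v2));
}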

Hi

I used the last method you posted. I was converting the angles to degrees in my output because I find them easier to visualise, but I'm setting the rotation using radians. My Z axis is into the screen and left is left on the screen - I see the problem from before, but when I fixed it, it's still not right :( .

D3DXVECTOR3 faceRotation;
D3DXVECTOR3 xAxis;
D3DXVECTOR3 yAxis;
D3DXVECTOR3 zAxis;

faceRotation = scene->terrain->BlendNormals(faceId,playerPosition);
yAxis = faceRotation;
D3DXVec3Cross(&xAxis,&faceRotation,playerforwardHandAxis);
D3DXVec3Normalize(&xAxis,&xAxis);
D3DXVec3Cross(&zAxis,&faceRotation,playerLeftAxis);
D3DXVec3Normalize(&zAxis,&zAxis);

Xrotation = atan2f(D3DXVec3Length(D3DXVec3Cross(&xAxis,&xAxis,&zAxis)),D3DXVec3Dot(&xAxis,&zAxis));


playerObject->setXRotation(Xrotation);

//////////////// BLEND NORMALS /////////////////

D3DXVECTOR3 PhysicsTerrain::BlendNormals(DWORD faceIndex, D3DXVECTOR3* point)
{
    D3DXVECTOR3* verts;
    verts = new D3DXVECTOR3[3];
    verts[0] = verticies[faceIndex];
    verts[1] = verticies[faceIndex + 1];
    verts[2] = verticies[faceIndex + 2];

    D3DXVECTOR3* norms;
    norms = new D3DXVECTOR3[3];
    norms[0] = normals[faceIndex];
    norms[1] = normals[faceIndex + 1];
    norms[2] = normals[faceIndex + 2];

    return BlendNormals(verts, norms, point);
}

D3DXVECTOR3 PhysicsTerrain::BlendNormals(D3DXVECTOR3* pVerts, D3DXVECTOR3* pVertNormals, D3DXVECTOR3* point)
{
    D3DXVECTOR3 edges[2];
    D3DXVec3Subtract(&edges[0], &pVerts[1], &pVerts[0]);
    D3DXVec3Subtract(&edges[1], &pVerts[2], &pVerts[0]);

    float blends[2];

    blends[0] = D3DXVec3Dot(&(*point - pVerts[0]),&edges[0]) / D3DXVec3Length(&edges[0]);

    D3DXVECTOR3 triNorm;
    D3DXVec3Cross(&triNorm, &edges[0], &edges[1]); // no need to normalise
    D3DXVECTOR3 tangent;
    D3DXVec3Cross(&tangent, &triNorm, &edges[0]);
    D3DXVec3Normalize(&tangent, &tangent);

    float vert0Proj = D3DXVec3Dot(&tangent, &pVerts[0]);
    float vert2Proj = D3DXVec3Dot(&tangent, &pVerts[2]);

    blends[1] = (D3DXVec3Dot(point,&tangent) - vert0Proj) / (&vert2Proj - &vert0Proj);

    D3DXVECTOR3 norm = pVertNormals[0] + blends[0]*(pVertNormals[1] - pVertNormals[0]);
    norm = norm + blends[1]*(pVertNormals[2] - norm);

    return norm;
}


Couple of problems I can spot from the off there, which I'll quote as we go:

blends[0] = D3DXVec3Dot(&(*point - pVerts[0]),&edges[0]) / D3DXVec3Length(&edges[0]);

I'm not actually sure if this is valid or not - it might compile to something which doesn't do what you expect. Not totally sure on this one, but I'd create the vector separately before you take its address, just to be sure. Also, I think for the maths to come out right the first reference to edges has to be normalised. Try something like this:


D3DXVECTOR3 temp = *point - pVerts[0];
D3DXVECTOR3 normEdge;
D3DXVec3Normalize(&normEdge,&edges[0]);
blends[0] = D3DXVec3Dot(&temp,&normEdge) / D3DXVec3Length(&edges[0]);


What's really messing you up is this line here, though:

blends[1] = (D3DXVec3Dot(point,&tangent) - vert0Proj) / (&vert2Proj - &vert0Proj);

What you're doing here is taking the difference in memory addresses rather than the difference in values. Change it to:


blends[1] = (D3DXVec3Dot(point,&tangent) - vert0Proj) / (vert2Proj - vert0Proj);


You probably want to normalise the return of BlendNormals inside the actual function rather than afterwards, so it makes more sense too. It's good practice to step through any maths-intensive or complex code line by line at least once, checking that you're getting the expected values at each stage. If you make a habit of this you'll spot these kinds of problems easily.


PS: Just spotted another mistake here:

Xrotation = atan2f(D3DXVec3Length(D3DXVec3Cross(&xAxis,&xAxis,&zAxis)),D3DXVec3Dot(&xAxis,&zAxis));

You're storing the result of the cross product in xAxis, which also gets used in the dot product. You'll need to store the cross product result in a separate vector to make sure xAxis is preserved for the dot product. Also, there's no point in calculating the angle between the x and z axes, since you just orthonormalised the matrix in the previous step - this angle will always come out as 90 degrees.

[Edited by - Motorherp on August 20, 2006 4:07:50 PM]
