Rendering Normals

Started by Bitem2k
8 comments, last by Bitem2k 18 years, 9 months ago
Does anyone know how to get a list of normals and positions of faces from a loaded mesh? I'm looking to render the normals on a model, like the DirectX SDK MeshViewer does. At the moment I'm looping through the vertex buffer and I think I'm getting the normals, but I can't retrieve the correct position to start drawing them on the model. Thanks
The correct position for the normal would be the position of the vertex the normal belongs to. The vertex buffer seems the most appropriate place to look.

If the VB is created with a flexible vertex format (FVF), the order of elements in each vertex is fixed (see the SDK docs). From this you can derive the offset at which to look for the position and normal of a vertex. Then for each vertex you skip ahead by the vertex size. With FVF, a vertex size of N bytes, and M vertices, the layout looks like this:

Offset 0*N bytes:     V0: position, normal, ...
Offset 1*N bytes:     V1: position, normal, ...
Offset 2*N bytes:     V2: position, normal, ...
...
Offset (M-1)*N bytes: V(M-1): position, normal, ...


If the VB is created with a vertex declaration, that vertex declaration will give the size and offset for each member per vertex. The layout is subject to this declaration.
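To make the offset arithmetic concrete, here is a small Python sketch of reading a position and normal out of a raw interleaved buffer. The 32-byte stride and float layout mirror a PositionNormalTextured-style vertex (3 position floats, 3 normal floats, 2 texture floats); those numbers are assumptions for illustration, not something taken from the thread.

```python
import struct

# Assumed layout per vertex: position (3 floats), normal (3 floats),
# texcoords (2 floats) -> 8 floats * 4 bytes = 32-byte stride.
STRIDE = 32

def read_position_and_normal(buffer, vertex_index):
    """Unpack the position and normal of one vertex from raw interleaved bytes."""
    base = vertex_index * STRIDE          # skip whole vertices to reach this one
    px, py, pz, nx, ny, nz = struct.unpack_from("<6f", buffer, base)
    return (px, py, pz), (nx, ny, nz)

# Two vertices packed back to back (position, normal, texcoords each):
vb = struct.pack("<8f", 1, 2, 3, 0, 1, 0, 0, 0) + \
     struct.pack("<8f", 4, 5, 6, 1, 0, 0, 0, 0)
print(read_position_and_normal(vb, 1))  # ((4.0, 5.0, 6.0), (1.0, 0.0, 0.0))
```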

For a simple normal you can draw a line from the vertex position to the vertex position plus the (normalized) normal. Fancier representations add a cone at the tip and a cylinder for the shaft, making it a three-dimensional arrow.
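That line construction can be sketched like this in Python (plain tuples stand in for Vector3; the scale parameter is a hypothetical line length, not something from the thread):

```python
import math

def normal_line(position, normal, scale=1.0):
    """Return (start, end) points of a line visualizing a normal at a vertex."""
    length = math.sqrt(sum(c * c for c in normal))
    unit = tuple(c / length for c in normal)  # normalize so scale = line length
    start = position
    end = tuple(p + scale * n for p, n in zip(position, unit))
    return start, end

start, end = normal_line((1.0, 2.0, 3.0), (0.0, 0.0, 2.0))
print(end)  # (1.0, 2.0, 4.0): one unit along the normalized normal
```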

Greetz,

Illco
Hi, thanks for your reply. I'm using Managed DirectX 9 in C#. I turned off the rendering of the model and now I can see that only some of my normals are rendering correctly!

Does anyone know what is wrong with the following code:


public Vector3[] normals = null;
public Vector3[] positions = null;

public void CreateNormals()
{
    Mesh mesh = xdata.Mesh;
    normals = new Vector3[mesh.NumberFaces];
    positions = new Vector3[mesh.NumberFaces];

    using (IndexBuffer ib = mesh.IndexBuffer)
    {
        short[] faces = ib.Lock(0, typeof(short), LockFlags.None, mesh.NumberFaces * 3) as short[];

        using (VertexBuffer vb = mesh.VertexBuffer)
        {
            CustomVertex.PositionNormalTextured[] verts =
                vb.Lock(0, typeof(CustomVertex.PositionNormalTextured), LockFlags.None, mesh.NumberVertices)
                as CustomVertex.PositionNormalTextured[];

            // Now find each face vertex position
            for (int i = 0; i < verts.Length; i++)
            {
                if (i % 3 == 0)
                {
                    CustomVertex.PositionNormalTextured d = verts[faces[i]];

                    normals[i / 3] = d.Normal;
                    positions[i / 3] = d.Position;
                    System.Diagnostics.Debug.WriteLine(d.ToString());
                }
            }

            vb.Unlock();
        }

        ib.Unlock();
    }
}

Here is a picture of what I mean. Some of the normals are deposited in the center of the level, which is all wrong; this is not the case when viewing in MeshViewer.

The ones shown in the background are in the correct place though. Why would this be? Any ideas?

http://www.eekelephant.plus.com/Normals.jpg
Are you transforming your normals with the same matrix as when you render the real geometry?

That is, the position you're reading from the vertex buffer should be transformed by whatever world matrix you're using before you construct your normal representation.

hth
Jack

Jack Hoxley [ Forum FAQ | Revised FAQ | MVP Profile | Developer Journal ]

I'm not using any transforms at all, just Matrix.Identity.

I don't think the problem lies there, because some of the normal lines (the ones on the back wall) are positioned correctly. I just can't find why some work and some don't!

1) The normals used for lighting in D3D, and shown in MeshView, are per-vertex rather than per-face, so you don't need any of the "if (i % 3 == 0)" logic; in fact, you don't need to reference the index buffer at all to show vertex normals.


2) If you actually require face normals rather than vertex normals, the best thing to do is compute the normal for each face directly (the cross product of two edges of the face) rather than trying to use vertex normals (which are difficult to turn back into face normals unless the mesh is deliberately meant to look faceted when lit).
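The cross-product construction can be sketched in Python like this (plain tuples stand in for vectors; winding order determines which way the normal points):

```python
def face_normal(a, b, c):
    """Unnormalized face normal: cross product of two edges of triangle (a, b, c)."""
    e1 = tuple(b[i] - a[i] for i in range(3))  # edge a -> b
    e2 = tuple(c[i] - a[i] for i in range(3))  # edge a -> c
    return (e1[1] * e2[2] - e1[2] * e2[1],
            e1[2] * e2[0] - e1[0] * e2[2],
            e1[0] * e2[1] - e1[1] * e2[0])

# A counter-clockwise triangle in the XY plane faces +Z:
print(face_normal((0, 0, 0), (1, 0, 0), (0, 1, 0)))  # (0, 0, 1)
```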


3) A normal only has a direction; no position information is present in a normal, so to display a line for a normal you need to display it at the position of the vertex. The start position of the line is vertex_position; the end position is vertex_position + (normal * length_of_line).


4) Lines rendered with 3D untransformed vertices will be affected by the currently set world transformation matrix (assuming you're using the fixed-function pipeline). So the best place to render your normals is immediately after rendering the object they come from.



A combination of points 3 & 4 will be why most of your normals are clustered around one spot, and a few appear correct.

Simon O'Connor | Technical Director (Newcastle) Lockwood Publishing | LinkedIn | Personal site

OK, I've taken your advice and removed the face i % 3 thing.
I don't think it's due to matrices, as I have cleaned my code of Transform.Worlds; plus, some of the normals are in the right place.

One thing I have noticed is that when I change the vertex format of the vertex buffer I get slightly different incorrect results, i.e.

device.VertexFormat = CustomVertex.PositionNormal.Format ;

gives different result than

device.VertexFormat = CustomVertex.PositionNormalTextured.Format ;

although for both of them the normals that are correctly positioned stay the same (in the correct place); it's only the normals in the middle that move.
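That symptom is consistent with reading the buffer at the wrong stride: the first vertex still decodes correctly, but later reads land on the wrong bytes. A Python sketch under assumed byte layouts (32 bytes per PositionNormalTextured vertex, 24 per PositionNormal; these sizes are assumptions for illustration):

```python
import struct

# Two vertices packed as PositionNormalTextured: pos(3f), normal(3f), tex(2f).
vb = struct.pack("<8f", 1, 2, 3, 0, 1, 0, 0, 0) + \
     struct.pack("<8f", 4, 5, 6, 0, 1, 0, 0, 0)

def positions(buffer, stride, count):
    """Read the position (first 3 floats) of each vertex at the given stride."""
    return [struct.unpack_from("<3f", buffer, i * stride) for i in range(count)]

print(positions(vb, 32, 2))  # correct stride: [(1.0, 2.0, 3.0), (4.0, 5.0, 6.0)]
print(positions(vb, 24, 2))  # wrong stride: second "position" is (0.0, 0.0, 4.0), garbage
```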


Here is the code used to draw the normal:

//Work out verts
int startCol = Color.Yellow.ToArgb(), endCol = Color.Red.ToArgb();

float scale = 5;
Vector3 start = position;
Vector3 dir = direction;

dir *= scale;

Vector3 end = start + dist;

int i = 0;
verts[i++] = new Microsoft.DirectX.Direct3D.CustomVertex.PositionColored(start.X, start.Y, start.Z, startCol);
verts[i++] = new Microsoft.DirectX.Direct3D.CustomVertex.PositionColored(end.X, end.Y, end.Z, startCol);



Here are the screenshots of what I mean when I say some are positioned correctly (see the ones around the boxes and the tops of the walls).




Here is my c# project.

Thanks for your help.
It's quite possible that this is just something regarding how you posted the code - I don't have time to actually download/inspect your C# project [smile]

Quote:Original post by Bitem2k
//Work out verts
int startCol = Color.Yellow.ToArgb(), endCol = Color.Red.ToArgb();

float scale = 5;
Vector3 start = position;
Vector3 dir = direction;

dir *= scale;

Vector3 end = start + dist;

int i = 0;
verts[i++] = new Microsoft.DirectX.Direct3D.CustomVertex.PositionColored(start.X, start.Y, start.Z, startCol);
verts[i++] = new Microsoft.DirectX.Direct3D.CustomVertex.PositionColored(end.X, end.Y, end.Z, startCol);

It's this line that caught my attention...
• Vector3 end = start + dist
What is the variable dist? From the fragment you've posted, shouldn't it be dir instead of dist?

And I presume you've noticed that you're defining an end colour and not actually using it [grin]

hth
Jack


Yeah, that was just the code I posted! The project does contain the modification of dist to dir (sorry). The end colour has been changed to the start colour intentionally (to match the single yellow colour used in MeshViewer).

Thanks
At last i have cracked it!


It seems that my problem was due to the 3ds Max exporter that comes with the DirectX SDK. When I use that to export, the vertex positions don't load correctly in my program. When I use the Panda exporter, all the normal positions are correct.

This is very strange, as the MeshViewer app renders both exported versions (with no problems) from each exporter.

Armed with the knowledge that the vertex positions must be correct (because the models render on screen correctly), I figured it must be something to do with the fact that the MS DX SDK exporter uses a different vertex format.

Mesh mesh = loadedMesh.Clone(MeshFlags.Dynamic, CustomVertex.PositionNormalTextured.Format, xdata.Mesh.Device);

Adding this line fixed all my problems!


Now when I read from the vertex buffer, it uses the same vertex format, so it works!

Thanks to all that replied.

