varying color per vertex based on vertex normal in Cg/HLSL

Started by therealremi
10 comments, last by iNsAn1tY 17 years, 1 month ago
I was just trying to achieve color varying per vertex based on vertex normals, for testing purposes. However, I only get a solid color over the whole mesh (and I know the normals are OK, because I drew them using OpenGL lines). Maybe someone can take a look?

struct vertex_input
{
    float4 position : POSITION;
    float3 normal   : NORMAL;
};

struct vertex_output
{
    float4 position : POSITION;
    float4 color    : COLOR;
};

vertex_output vertex_program(vertex_input IN,
                             uniform float4x4 model_view_proj,
{
    vertex_output OUT;
    OUT.position = mul(model_view_proj, IN.position);
    float3 normal = normalize(IN.normal);
    normal = 0.5f + 0.5f * normal;    // remap [-1, 1] to [0, 1] for output as a color
    OUT.color = float4(normal, 1.0f);
    return OUT;
}
It looks like there's an error in your program: it isn't being compiled properly, and OpenGL is using the default pipeline instead of your program. There's a comma ',' after model_view_proj where there should be a closing parenthesis ')'. Are you checking for compilation errors? I assume you're using Cg, so you need to check the value returned from cgGetError after you create and compile your vertex program. If it returns anything other than CG_NO_ERROR, you can use cgGetErrorString and cgGetLastListing to get detailed information about what failed, and where.
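A minimal sketch of that check, assuming you load the program from a file with the standard Cg runtime calls (the file name and entry point below are placeholders):

#include <stdio.h>
#include <Cg/cg.h>

/* call once after creating the CGcontext, e.g. from your init code */
static CGprogram load_vertex_program(CGcontext context)
{
    CGprogram program = cgCreateProgramFromFile(context, CG_SOURCE,
        "vertex_program.cg",   /* placeholder file name */
        CG_PROFILE_ARBVP1,     /* or whichever profile you use */
        "vertex_program",      /* entry point */
        NULL);
    CGerror error = cgGetError();
    if (error != CG_NO_ERROR)
    {
        printf("Cg error: %s\n", cgGetErrorString(error));
        if (error == CG_COMPILER_ERROR)
            printf("%s\n", cgGetLastListing(context)); /* full compiler listing */
    }
    return program;
}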

A very simple way of testing for your current problem would be to call glColor with a known colour like red before you render. If the model is all red instead of all white (which is the default OpenGL colour), you know that OpenGL is using the default pipeline instead of your program.
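Something like this, where draw_mesh is a hypothetical stand-in for however you submit the model:

glColor3f(1.0f, 0.0f, 0.0f);  /* a known colour: red */
draw_mesh();                  /* hypothetical draw call */
/* if the model renders solid red, the fixed pipeline is active and the  */
/* vertex program was never bound; if it shows the normal-based colours, */
/* the program is running and the problem is elsewhere                   */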
No, no, it does compile properly; I check for that in Cg. The comma was added by mistake when I edited the post. The model takes on a bluish color, but it's uniform over the whole model, and it should vary per vertex since the normals are quite varied.


[Edited by - therealremi on March 4, 2007 7:07:23 AM]
This is just a guess, but if the model has a uniform blueish colour, then it looks to me like your normals are defined (or being transformed) in local space, or local to each face. Are you sure you didn't transform them into world space before you drew them?

To clarify a bit: you could think your normals are correct, when they are not, because of a flawed transformation somewhere.
Hmm, I'm not a math guy, but they're computed directly from the face vertices (taking the cross product of two edge vectors of the face), so they should be in the same coordinate space as the vertices.
So the shader looks OK? Is it just the normals that could be wrong? (Although I was able to draw them as lines in OpenGL, and they looked right.)
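For reference, a minimal sketch of that cross-product computation (the names are placeholders; assumes counter-clockwise winding):

#include <math.h>

/* face normal from three vertices a, b, c */
void face_normal(const float a[3], const float b[3], const float c[3], float n[3])
{
    float u[3] = { b[0] - a[0], b[1] - a[1], b[2] - a[2] };  /* edge a->b */
    float v[3] = { c[0] - a[0], c[1] - a[1], c[2] - a[2] };  /* edge a->c */
    n[0] = u[1] * v[2] - u[2] * v[1];                        /* u x v */
    n[1] = u[2] * v[0] - u[0] * v[2];
    n[2] = u[0] * v[1] - u[1] * v[0];
    float len = sqrtf(n[0] * n[0] + n[1] * n[1] + n[2] * n[2]);
    if (len > 0.0f) { n[0] /= len; n[1] /= len; n[2] /= len; }
}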
Quote:Original post by therealremi
No, no, it does compile properly; I check for that in Cg. The comma was added by mistake when I edited the post. The model takes on a bluish color, but it's uniform over the whole model, and it should vary per vertex since the normals are quite varied.

Ah, if it's a typo, then the program's not the problem.

Quote:Original post by therealremi
Hmm, I'm not a math guy, but they're computed directly from the face vertices (taking the cross product of two edge vectors of the face), so they should be in the same coordinate space as the vertices.
So the shader looks OK? Is it just the normals that could be wrong? (Although I was able to draw them as lines in OpenGL, and they looked right.)

Your program works fine in RenderMonkey (tested on a sphere object; the sphere is coloured as you'd expect)... so I guess it has to be your normals.

Are you submitting them to OpenGL properly?
Thanks for checking, iNsAn1tY.

Here's what I do in OpenGL:
glVertexPointer(3, GL_FLOAT, 0, geometry_coordinates);
glNormalPointer(GL_FLOAT, 0, normal_coordinates);
glIndexPointer(GL_INT, 0, indices);
glDrawElements(GL_TRIANGLES, indices_count, GL_UNSIGNED_INT, indices);

And as I said, I did this to draw normals:
glLineWidth(3.0f);
glBegin(GL_LINES);
for (unsigned int j = 0; j < geometry_coordinates_count / 3; j++) // for every vertex
{
    glVertex3f(geometry_coordinates[j*3 + 0], geometry_coordinates[j*3 + 1], geometry_coordinates[j*3 + 2]); // line start
    float x = geometry_coordinates[j*3 + 0] + 0.7f * normal_coordinates[j*3 + 0];
    float y = geometry_coordinates[j*3 + 1] + 0.7f * normal_coordinates[j*3 + 1];
    float z = geometry_coordinates[j*3 + 2] + 0.7f * normal_coordinates[j*3 + 2];
    glVertex3f(x, y, z); // line end
}
glEnd();
I don't really see an error (logical or otherwise) there either, and since iNsAn1tY tested it in RenderMonkey, I guess the problem lies somewhere else. Could you please try it in ATI's RenderMonkey or something similar yourself? I can't really think of anything else, except for the fact that you have to enable the client-state array for each vertex attribute you use, or in this case:

glEnableClientState( GL_NORMAL_ARRAY );


But I suspect you already know this...
I see a suspect function call in your code: the one to glIndexPointer. Try removing that, and any call you have to glEnableClientState( GL_INDEX_ARRAY ). "Index" here doesn't have anything to do with an index array; it refers to color indexes (not sure what they are, never used 'em; could be a relic of an earlier OpenGL version). This could be messing up your rendering.

Quote:Original post by Todo
..except for the fact that you have to enable the client-state array for each vertex attribute you use, or in this case:
glEnableClientState( GL_NORMAL_ARRAY );
But I suspect you already know this...

Heh, I had a really nice one like this a few months ago. I forgot one little call to enable texturing, and it caused huge problems [grin]
iNsAn1tY, Todo, your responses have been great. I'm ashamed to admit it ;) but I did forget to include
glEnableClientState( GL_NORMAL_ARRAY );
and now it seems to work.

iNsAn1tY, are you sure about that?
Quote:
I see a suspect function call in your code: the one to glIndexPointer. Try removing that and any call you have to glEnableClientState( GL_INDEX_ARRAY ). "Index" here doesn't have anything to do with an index array, it refers to color indexes
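For what it's worth, glIndexPointer does set the colour-index array of the fixed pipeline (used only in colour-index mode); element indices are passed straight to glDrawElements instead. A sketch of the corrected draw path, using the same array names as in the earlier snippet:

glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_NORMAL_ARRAY);  /* the call that was missing */
glVertexPointer(3, GL_FLOAT, 0, geometry_coordinates);
glNormalPointer(GL_FLOAT, 0, normal_coordinates);
/* no glIndexPointer: the element indices go directly to glDrawElements */
glDrawElements(GL_TRIANGLES, indices_count, GL_UNSIGNED_INT, indices);
glDisableClientState(GL_NORMAL_ARRAY);
glDisableClientState(GL_VERTEX_ARRAY);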

