Problems with vertex buffer lighting

Started by
0 comments, last by Kalidor 19 years ago
Let me preface this by saying that I'm not using vertex or fragment shaders of any kind, just standard OpenGL lighting. This is a problem that's been confusing me for months.

I have my lighting set up correctly in my application: I set the light position after the camera does all its rotations and translations, and everything appears to work fairly well. When I specify geometry directly with glVertex3f, glNormal3f, and so on, everything works great and the lighting is perfect. However, when I use vertex and normal buffers/pointers and draw with glDrawElements, something very odd happens: no matter how far from the light the buffered object gets, it is lit as if it were directly next to the light source. Stranger still, the object is still affected by the position of the light; if it is below the light, it is correctly shaded as though it were below the light. It's as if the lighting were being calculated only from the angle between the vertex normals and the light, ignoring distance entirely.

I have my attenuation values set (a constant attenuation of 1.0, default values for the rest) and my materials working correctly; I just can't figure it out. Objects are rendered correctly when I call glVertex3f directly, just not when I use the vertex arrays. I'm using GL_NORMALIZE, and I'm sure it's not a problem with the normals, as the normals seem to be the only thing working correctly. Even if a given buffered object has polygons near the light and polygons far from it, the far polygons are equally illuminated.

Any ideas would be appreciated. I can provide some screenshots if anyone wants to see them. Thanks.
Well, from the Red Book (link in forum faq)
attenuation factor = 1 / (kC + kL*d + kQ*d^2)
So with constant attenuation = 1.0 and 0.0 for linear and quadratic (the defaults are 1.0, 0.0, and 0.0 for the constant, linear, and quadratic attenuation factors, respectively), you get an attenuation factor of 1.0, which is then multiplied into the lighting term, meaning no distance falloff at all. So what you are seeing with the vertex arrays is actually the correct result; if you are getting falloff in immediate mode, then the problem is somewhere in there.

EDIT: Man, too many errors to list. [grin]

