Elig

OpenGL Problems with vertex buffer lighting


Recommended Posts

Let me preface this by saying that I'm not using vertex or fragment shaders of any kind, just standard OpenGL lighting. This is a problem that's been confusing me for months. I have my lighting set up correctly in my application: I set the light position after the camera does all its rotations and translations, and everything appears to work fairly well. When I call glVertex3f, glNormal3f, and so on directly, everything works great and the lighting is perfect.

However, when I use vertex and normal buffers/pointers and render with glDrawElements, something very odd happens. No matter how far the buffered object gets from the light, it is lit as if it were directly next to the light source. Stranger still, the object is still affected by the position of the light: for example, if it is below the light, it is correctly shaded as though it were below the light. It's only that no matter how far the object gets from the light, the light stays as bright as ever, as if the lighting were being calculated purely from the angle between the vertex normals and the light, ignoring distance entirely. Even when a single buffered object has some polygons near the light and some far away, the distant polygons are illuminated just as brightly.

I have my attenuation values set correctly (a constant attenuation of 1.0, defaults for the rest), my materials work correctly, and I'm using GL_NORMALIZE. I'm sure it's not a problem with the normals, since the normals seem to be the only thing working correctly. Objects render correctly when I call glVertex3f and so on directly, just not when I go through the vertex buffer. I just can't figure it out. Any ideas would be appreciated; I can provide screenshots if anyone wants to see them. Thanks.
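
For reference, a minimal sketch of the kind of vertex-array path described above, assuming tightly packed float arrays; the names (verts, norms, indices) and the layout are illustrative guesses, not the poster's actual code:

/* Hypothetical vertex-array rendering path (names and layout assumed). */
#include <GL/gl.h>

void draw_buffered(const GLfloat *verts, const GLfloat *norms,
                   const GLuint *indices, GLsizei index_count,
                   const GLfloat light_pos[4])
{
    /* Light position set after the camera transform, as the post
       describes, so it is transformed by the current modelview matrix. */
    glLightfv(GL_LIGHT0, GL_POSITION, light_pos);

    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_NORMAL_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, verts);  /* 3 floats per position, tightly packed */
    glNormalPointer(GL_FLOAT, 0, norms);     /* normals are always 3 components */
    glDrawElements(GL_TRIANGLES, index_count, GL_UNSIGNED_INT, indices);

    glDisableClientState(GL_NORMAL_ARRAY);
    glDisableClientState(GL_VERTEX_ARRAY);
}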

Well, from the Red Book (link in the forum FAQ):

attenuation factor = 1 / (kC + kL*d + kQ*d^2)

So with a constant attenuation of 1.0 and 0.0 for the linear and quadratic terms (the defaults are 1.0, 0.0, and 0.0 for the constant, linear, and quadratic attenuations respectively), you get an attenuation factor of 1.0 at every distance, which is then multiplied into the lighting term. So what you are seeing with the vertex arrays is actually the correct result; if you are not getting the same thing with immediate mode, the problem is somewhere in that path.

EDIT: Man, too many errors to list. [grin]
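
If visible distance falloff is the goal, nonzero linear and/or quadratic terms have to be set explicitly. A minimal sketch; the coefficient values here are arbitrary examples, not recommendations:

/* With the defaults (kC = 1, kL = 0, kQ = 0) the attenuation factor is a
   constant 1.0, so brightness never drops with distance. Example
   coefficients only; tune to the scale of the scene. */
glLightf(GL_LIGHT0, GL_CONSTANT_ATTENUATION,  1.0f);
glLightf(GL_LIGHT0, GL_LINEAR_ATTENUATION,    0.05f);
glLightf(GL_LIGHT0, GL_QUADRATIC_ATTENUATION, 0.001f);
/* Note: attenuation applies only to positional lights (w = 1 in
   GL_POSITION); directional lights (w = 0) are never attenuated. */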
