
OpenGL Fixed-Function Light Flickering when dot(N,L)=0




I have a bizarre problem. A simple OpenGL fixed-function light is placed at (0.0f, 10.0f, -1.0f), and several boxes happen to have faces lying in the plane z=-1.0f. Consequently, for every vertex on those faces, the vector to the light and the face normal are exactly orthogonal.
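A quick numeric check of that setup (a sketch: the face normal (0, 0, 1) and the sample vertex are my assumptions, only the light position is from the post):

```python
# Sketch of the geometry described above. The face normal and the sample
# vertex are assumptions; the light position is the one from the post.
light_pos = (0.0, 10.0, -1.0)
vertex    = (3.0, 2.0, -1.0)    # any vertex on a face lying in z = -1.0
normal    = (0.0, 0.0, 1.0)     # assumed face normal

# Per-vertex light vector: from the vertex toward the light.
L = tuple(p - v for p, v in zip(light_pos, vertex))

n_dot_l = sum(n * l for n, l in zip(normal, L))
print(n_dot_l)  # 0.0 -- the light vector has no z component, so N.L is exactly 0
```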


When this happens, instead of being black, the face flickers light and dark, even though the draw settings are identical each frame. It's not a z-fighting effect; there's only one layer of polygons, and the flickering continues even when the camera is stationary. Each vertex of the face appears either fully lit or unlit, though it's unpredictable which a given vertex will be from frame to frame.


The light has no ambient term, and by tweaking the light's diffuse and specular factors, I find that the problem occurs only when the light's specular component is nonzero. Perturbing the light slightly along z fixes it; the problem appears only when the light lies exactly in the plane of the surface. I can avoid the problem entirely by using a shader (which I plan to do anyway), but I'm wondering if anyone has encountered this before?
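To illustrate why the perturbation works, here is the same dot product with the light nudged off the plane (the normal, vertex, and offsets are assumed illustration values, not from the post):

```python
# Sketch: nudging the light off the plane z = -1.0 makes N.L strictly
# nonzero, so every vertex gets the same lit/unlit answer. The normal,
# vertex, and offsets are assumptions chosen for illustration.
vertex = (3.0, 2.0, -1.0)
normal = (0.0, 0.0, 1.0)

def n_dot_l(light_z):
    L = (0.0 - vertex[0], 10.0 - vertex[1], light_z - vertex[2])
    return sum(n * l for n, l in zip(normal, L))

print(n_dot_l(-1.0))   # 0.0: exactly in-plane, the ambiguous case
print(n_dot_l(-0.9))   # ~ +0.1: consistently on the lit side
print(n_dot_l(-1.1))   # ~ -0.1: consistently on the unlit side
```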


I suspect it may be some kind of driver bug.

Hardware: NVIDIA GeForce GTX 580M

Driver: 305.60




Edited by Geometrian


I'm not sure what effect you are trying to achieve here. Floating-point numbers do not have infinite precision, so the light is sometimes computed as being in front of the face and sometimes behind it. If I remember correctly, DirectX does not have two-sided lighting in the fixed pipeline, so the specular color (the diffuse color is essentially zero in both cases) is computed in only one of the two cases, and you thus get flickering. Shaders may help fix this, but I do not understand why you want a light located exactly in the plane of a surface like this.


This just ended up happening accidentally; it seemed like undefined/wrong behavior. I was just trying to achieve basic fixed-function lighting.


When N dot L is 0.0, the reflected light direction R = 2(N·L)N − L reduces to −L, so it is the same no matter which way the normal faces. R is then dotted with the view vector and raised to the shininess exponent to produce the specular term.
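A small check of that claim (a sketch; the light vector is an arbitrary unit vector lying in the surface plane):

```python
# R = 2(N.L)N - L, the reflection formula behind the fixed-function specular
# term. When N.L = 0 the N term vanishes, so R = -L for BOTH normal facings.
# The light vector below is an arbitrary unit vector in the surface plane.
def reflect(N, L):
    d = sum(n * l for n, l in zip(N, L))               # N.L
    return tuple(2.0 * d * n - l for n, l in zip(N, L))

L = (-0.6, 0.8, 0.0)          # lies in the plane z = const, so N.L = 0
N_front = (0.0, 0.0, 1.0)
N_back  = (0.0, 0.0, -1.0)

print(reflect(N_front, L))    # (0.6, -0.8, 0.0), i.e. -L
print(reflect(N_back,  L))    # numerically equal: flipping N changes nothing
```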


The only thing I can think of is that the driver is optimizing the case where N dot L < 0 and skipping the specular computation entirely, with floating-point precision causing that test to go either way at different vertices. This is somewhat plausible, since even at extremely glancing angles the specular term can still be strong. But it still doesn't explain why the variation changes each frame.
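A sketch of that hypothesis (illustration only, not actual driver code; the R·V value and shininess are arbitrary):

```python
# Hypothesized driver behavior (illustration only, not real driver code):
# the entire specular term is gated on N.L > 0. At N.L == 0 the test's
# outcome depends on round-off, but the specular term itself is not small
# there, so the gate flipping per vertex looks like on/off flicker.
def specular(n_dot_l, r_dot_v, shininess):
    if n_dot_l <= 0.0:                  # the hypothesized early-out
        return 0.0
    return max(r_dot_v, 0.0) ** shininess

r_dot_v, shininess = 0.9, 16.0          # arbitrary glancing-view values

lit   = specular(+1e-17, r_dot_v, shininess)  # round-off went one way
unlit = specular(-1e-17, r_dot_v, shininess)  # ...and the other way

print(lit, unlit)   # ~0.185 vs 0.0: a large, visible per-vertex jump
```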


The last time I used the fixed pipeline was years ago and I don't remember the details. What happens when you move the light behind the surface? If the polygon then receives no lighting, lights behind a surface are ignored, as I said in my last post. Since the light is located on the surface, any small change may cause the light to move between the two "states" (in front of or behind the surface). I think you should simply avoid such cases (or use shaders to make it work as you wish).


> What happens when you move the light behind the surface?

As noted above, perturbing the light slightly along z fixes the problem.

> Since the light is located on the surface, any small change may cause the light to move between the two "states" (in front of or behind the surface).

Yes, but the diffuse should be ~0.0 in either case, and while the specular can jump, the only time it should be visible at all is at an extreme glancing angle (0.0, to be specific, in which case the reflection is edge-on and not visible anyway).

The problem recurred in a shader, but I managed to solve it: it was related to the object's tangents for normal mapping (they weren't being passed in). That doesn't explain why the fixed-function pipeline was drawing incorrectly, however.


Maybe some variable that is being applied to the lighting is not initialized. That would cause flickering. It might be present at other angles too, but washed out, so you can only see it where the model is dark.

