OpenGL Point light attenuation


Recommended Posts

Hi all

I'm learning OpenGL and I'm trying to write a shader for light attenuation. It's working OK except for a small problem: as you can see in the image, I'm getting circular strips.

[image: concentric circular bands around the point light]
Here is my fragment shader:

#version 400

smooth in vec4 diffuseColor;
smooth in vec3 normal;
smooth in vec4 cameraSpacePosition;

out vec4 outputColor;

uniform vec3 pointLightCamPos;
uniform vec4 lightIntensity;
uniform vec4 ambientIntensity;

uniform float lightAttenuation;
uniform bool bUseRSquare;

vec4 ApplyLightIntensity(in vec3 cameraSpacePos, out vec3 lightDirection)
{
	vec3 lightDifference = pointLightCamPos - cameraSpacePos;
	float lightDistanceSqr = dot(lightDifference, lightDifference);
	lightDirection = lightDifference * inversesqrt(lightDistanceSqr);
	float distFactor = bUseRSquare ? lightDistanceSqr : sqrt(lightDistanceSqr);

	return lightIntensity * (1.0 / (1.0 + lightAttenuation * distFactor));
}

void main()
{
	vec3 lightDir = vec3(0.0);
	vec4 attenIntensity = ApplyLightIntensity(vec3(cameraSpacePosition), lightDir);

	float cosAngIncidence = max(dot(normalize(normal), lightDir), 0.0);
	outputColor = (diffuseColor * ambientIntensity) +
				  (diffuseColor * attenIntensity * cosAngIncidence);
}

Is this normal behavior, or how can I fix it?
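For reference, the falloff the shader computes can be checked outside GLSL. This is my own minimal Python sketch mirroring the math above; the positions and attenuation constant are just example values:

```python
import math

def atten_intensity(light_pos, frag_pos, attenuation, use_r_square):
    # Vector from the fragment to the light, in camera space
    diff = [l - f for l, f in zip(light_pos, frag_pos)]
    dist_sqr = sum(d * d for d in diff)
    # Distance-based or distance-squared-based falloff, as in the shader
    dist_factor = dist_sqr if use_r_square else math.sqrt(dist_sqr)
    return 1.0 / (1.0 + attenuation * dist_factor)

# Intensity falls off smoothly and continuously with distance,
# so the hard band edges are not coming from this formula itself.
near = atten_intensity((0, 0, 0), (0, 0, 1), 0.5, True)  # d^2 = 1  -> 1/1.5
far  = atten_intensity((0, 0, 0), (0, 0, 4), 0.5, True)  # d^2 = 16 -> 1/9
```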


Those are called colour banding artifacts. If you open that image in a paint program and examine the colours, each band is "1" unit brighter than the previous one.

e.g. 37/255, 38/255, 39/255

The problem is that we've only got 256 levels of brightness (0–255), but the human eye is much more sensitive than that.

This causes a further problem: an optical illusion (the Mach band effect) makes these bands even more visible than they really should be!
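You can see the quantization step that produces the bands directly. A small sketch of my own (the gradient values are arbitrary, chosen to fall inside one band):

```python
def to_8bit(value):
    """Quantize a brightness in [0, 1] to one of 256 levels."""
    return round(value * 255)

# Twenty slightly different brightnesses all collapse to the same
# 8-bit value, forming one flat band; the next band starts one level up.
gradient = [0.1450 + i * 0.0001 for i in range(20)]
quantized = [to_8bit(v) for v in gradient]
```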


So, no, this is not your fault :D

Once you apply textures to the ground, they will make this effect almost imperceptible. Otherwise, if you aren't planning on using textures, the fallback is to implement dithering.
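The idea behind dithering is to add a tiny amount of noise before quantizing, so the error is spread across pixels instead of forming hard band edges. A minimal sketch of my own; in a real shader you'd use a screen-space noise texture or a per-pixel hash rather than a CPU random generator:

```python
import random

def to_8bit_dithered(value, rng):
    """Add noise of up to +/- half a quantization step, then round."""
    noise = (rng.random() - 0.5) / 255.0
    clamped = min(max(value + noise, 0.0), 1.0)
    return round(clamped * 255)

rng = random.Random(42)
# The same near-flat gradient now alternates between neighbouring
# levels, which the eye averages back into a smooth ramp.
gradient = [0.1450 + i * 0.0001 for i in range(50)]
dithered = [to_8bit_dithered(v, rng) for v in gradient]
```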

Edited by Hodgman


As Hodgman said, this is banding. The problem is that 8 bits are enough for images only if they are generated from a higher bit depth with dithering. If you render polygons directly into an 8-bit buffer, or convert from a higher bit depth down to 8-bit without dithering, you will get banding artifacts in your gradients.


This will be less noticeable once you factor in albedo and normal maps, but there will still be times when smooth gradients pop up, and they won't look smooth at all. As far as I know, most games don't bother to do anything about it because it's a relatively small problem (as far as most people are concerned; I personally think it's a huge issue :p), and the fix is potentially a performance issue, since generating random numbers on the GPU is fairly expensive and hard to do right. It would be nice if GPUs had built-in functionality for this (sort of like RDRAND on modern CPUs), and maybe even automatic dithering built into the API.
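In practice, shaders usually sidestep a real RNG with a cheap per-pixel hash. Here is the well-known GLSL folklore trick fract(sin(dot(uv, vec2(12.9898, 78.233))) * 43758.5453) translated to Python as a sketch; the magic constants are the traditional folklore values, not from this thread:

```python
import math

def frag_hash(x, y):
    """Cheap, deterministic per-pixel pseudo-random value in [0, 1),
    mimicking the classic GLSL one-liner:
    fract(sin(dot(uv, vec2(12.9898, 78.233))) * 43758.5453)."""
    s = math.sin(x * 12.9898 + y * 78.233) * 43758.5453
    return s - math.floor(s)  # fract()

# Each "pixel" coordinate yields a different but repeatable value,
# which is all a dither pattern needs.
samples = [frag_hash(x * 0.01, 0.5) for x in range(100)]
```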

Edited by Chris_F
