Light Attenuation Model for Deferred Shading


Recommended Posts

EDIT: Disregard my solution! I guess I misread the graph units along the y-axis. It decays far too quickly! But if anyone has any suggestions for a good model based on a simple radius input, please let me know!

A while back I started working on optimizations to my deferred renderer. One big part of this is a new light attenuation model. I came up with a quick hack which didn't really save many CPU cycles, because the radius it generated for each light was so big. I also never implemented the scissor op; instead I did a distance check from the pixel I was shading to the light... all of this is bad, so I decided it's time to go back and rework it.

After doing lots of reading about finite attenuation models, I concluded there's no great solution or "standard" out there, so off to the drawing board.

First I read this tutorial, however it uses a simple linear attenuation model which I don't feel is very realistic, and I get the idea that they only used it because the tutorial had its focus elsewhere. I read a couple of forum posts here on GameDev regarding the subject, and most people just end up arguing over it.

So I decided to try modifying my current attenuation model... the commonly used one with constant, linear and quadratic coefficients. I simply subtracted 0.01 from the end of it, and although that makes it solvable for att = 0, it gave ludicrous radii.

Then I realized that having sliders in a GUI, or having the user of my renderer guess which attenuation coefficients to use, is silly. So I started from the tutorial's linear attenuation model, which is:
a = 1 - (d / r), where a is attenuation, d is distance and r is the desired light radius...
and scaled it by the classic inverse-square law, a = 1 / (d * d)...

ending up with:
a = (1 - (d / r)) * (1 / (d * d))

Although this is asymptotic as d approaches 0 from the right, the hardware itself should clamp the actual color that gets output, and the model limits the radius to whatever you set r to.
Here is a wolfram alpha link to the equation: click me

I am just looking for feedback on the model: do people think the dropoff is too drastic to look realistic, or would it look good?

Thanks in advance for everyone's time!

You are on the right track. Use any fitting attenuation model and an additional "damping" factor which ensures that the lighting intensity is 0 when reaching the radius. Here's a simple formula:

alpha = distance / radius
damping_factor = 1.0 - pow(alpha,beta)
final_intensity = attenuation(distance) * damping_factor

For beta = 1 it is linear, but the damping effect will be too strong. A better value is 2 or 3. I got quite nice results with this approach.
