DX11 Additive lighting problems

Hey everyone!

I implemented additive lighting, so I basically have a full-screen texture that is the per-pixel sum of all light colors. Now I have the problem that if I place too many camp fires next to each other, their light adds up to more than 1.0. If I just clip it at 1.0, it looks weird: the camp fires cast orange light, which shifts green-ish when only the red channel gets clipped to 1.0.

I also want the camp fire to be much less bright than the sun, but if I make it only about 0.1 bright, it is too dark at night. What can I do to fix this? I looked into high dynamic range and related topics, but all I found was the HDR display functionality of DirectX 12. In fact, I didn't find any good tutorial or article at all, which is why I'm asking here.


What you're asking about is just one piece of a process collectively known as "tone mapping".  Even when you are working with an HDR pipeline, at some point it has to get mapped back down to LDR in order to actually be presented.  A simple implementation of this process: compute all of your lighting in linear space, define a "maximum intensity" over the scene, and then scale the lighting in the scene back down based on that maximum.  While the lighting computations must be linear for this to work correctly, the subsequent scaling does not necessarily need to be linear.  The "maximum intensity" you use can be hard-coded, passed down as a uniform, or even generated during pixel/compute execution and stored so the scaling can happen in a post-processing pass.
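To make the scale-by-maximum idea concrete, here is a minimal CPU-side sketch (in a real engine this math would live in a pixel or compute shader, with the maximum coming from a reduction pass or a uniform). The `Rgb` struct and function name are illustrative, not from any existing API. Because every channel is divided by the same factor, the red:green:blue ratio of the camp-fire light is preserved, which is exactly what per-channel clipping destroys.

```cpp
#include <algorithm>
#include <vector>

// Illustrative linear-space color; a shader would use float3/float4 instead.
struct Rgb { float r, g, b; };

// Scale all lighting uniformly so the brightest channel in the scene maps to 1.0.
// Dividing every channel by the same factor preserves hue, unlike clipping.
std::vector<Rgb> ScaleByMaxIntensity(const std::vector<Rgb>& linearLight)
{
    float maxIntensity = 1.0f; // never brighten a scene that is already in range
    for (const Rgb& p : linearLight)
        maxIntensity = std::max({maxIntensity, p.r, p.g, p.b});

    std::vector<Rgb> out;
    out.reserve(linearLight.size());
    for (const Rgb& p : linearLight)
        out.push_back({p.r / maxIntensity, p.g / maxIntensity, p.b / maxIntensity});
    return out;
}
```

For example, an over-bright orange pixel (2.0, 1.0, 0.2) scales to (1.0, 0.5, 0.1): still orange, just dimmer, instead of the green-ish result you get from clipping red to 1.0.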

There are other (and more complicated) solutions to this problem.  Try searching for some tone mapping algorithms to get ideas.

16 hours ago, Magogan said:

but I didn't find anything other than the HDR functionality of DirectX 12

So, annoyingly, "HDR" refers to two completely different things now:

  • HDR rendering / HDR imaging - the process of working with images whose values can go outside the fixed "0 to 1" range, typically stored as floating point. 
  • HDR displays (a.k.a. UHD TV, HDR10, Dolby Vision, etc.) - monitors that can accept 10- or 12-bit-per-channel signals and display a wider gamut and brightness range than traditional HDTVs or digital displays.

It sounds like your quote, above, is talking about the second category, which is not what you need.

To implement the first category, you just need to:

  1. render your scene to an RGBA16_FLOAT texture instead of the typical RGBA8_UNORM texture.
  2. use a tone-mapping algorithm to copy your "HDR" / FP16 data over to a traditional 8-bit texture. The simplest algorithm is "return x / (x + 1);", but there are many better ones.
  3. display that 8-bit texture.
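The steps above boil down to very little math. Here is a minimal sketch of step 2, the `x / (x + 1)` operator (a Reinhard-style curve), written as plain C++ for clarity; in practice it would be a few lines of HLSL in your post-processing pixel shader. Applying it per channel is the simplest choice; better variants operate on luminance and rescale the color afterwards.

```cpp
#include <cmath>

// Reinhard-style operator: maps any value in [0, inf) into [0, 1),
// so arbitrarily many additive lights can never clip.
float Reinhard(float x)
{
    return x / (x + 1.0f);
}

// Quantize the tone-mapped value to the 8-bit range for display.
unsigned char ToByte(float x)
{
    return static_cast<unsigned char>(std::round(Reinhard(x) * 255.0f));
}
```

Note the useful properties: an input of 1.0 maps to 0.5, small inputs pass through almost unchanged (so your 0.1-bright camp fire stays visible at night), and huge inputs approach but never reach 1.0.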

