HDR Black

Started by ava
7 comments, last by ava 17 years, 6 months ago
Hi,

For tonemapping in our game we compute the average luminance of the backbuffer. The tonemapping works very well but has one major problem: when the backbuffer contains even a single NaN value, the average becomes NaN too and the final tonemapped result turns black. With adaptive tonemapping, which reuses the result of the previous frame, it gets even worse: once the screen goes black it stays black until the tonemapping is reset.

Apart from the simple solution, i.e. don't produce NaN values in the first place, is there a way to detect in my tonemapping shader or in code that the backbuffer contains a NaN value?
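To make the setup concrete, here is a simplified sketch of the kind of luminance-downsample pass I mean (made-up names, not our actual code). Each pass averages four texels of the previous level, repeated until we reach a 1x1 target, so a single NaN anywhere in the chain poisons the final average:

sampler2D PrevLevel : register(s0);
float2    TexelSize;   // 1 / dimensions of the previous downsample level

float4 DownsampleLuminance(float2 uv : TEXCOORD0) : COLOR0
{
    // Average a 2x2 block of the previous level; if any of the four
    // samples is NaN, the result of this pass (and every pass after it,
    // including the final 1x1 average) is NaN as well.
    float sum = 0.0f;
    sum += tex2D(PrevLevel, uv + TexelSize * float2(-0.5f, -0.5f)).r;
    sum += tex2D(PrevLevel, uv + TexelSize * float2( 0.5f, -0.5f)).r;
    sum += tex2D(PrevLevel, uv + TexelSize * float2(-0.5f,  0.5f)).r;
    sum += tex2D(PrevLevel, uv + TexelSize * float2( 0.5f,  0.5f)).r;
    return float4(sum * 0.25f, 0.0f, 0.0f, 1.0f);
}

Alex.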
Alex Vanden Abeele | Graphics Programmer | Larian Studios | http://www.larian.com
I had similar problems when messing around with HDRI last year - got some very unusual results [lol]

In the end I found that adding a tiny offset, say 1e-9f, to each pixel stopped the problem...

If you're working with Direct3D you can see my approach in the HDRPipeline/HDRDemo sample code.

hth
Jack

Jack Hoxley [ Forum FAQ | Revised FAQ | MVP Profile | Developer Journal ]

Adding a value to a NaN still gives me a NaN, and the same goes for taking the min or max. But for some reason I only now discover that there is an HLSL intrinsic function called isnan... No idea why I didn't see that before.
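So the guard I have in mind would look roughly like this (hypothetical names, and assuming isnan() actually survives compilation on our target profile):

sampler2D BackBuffer : register(s0);

// Sample the HDR backbuffer, but drop the whole texel if any channel
// is NaN, so one bad pixel can no longer poison the luminance average.
float3 SafeSample(float2 uv)
{
    float3 c = tex2D(BackBuffer, uv).rgb;
    return any(isnan(c)) ? float3(0.0f, 0.0f, 0.0f) : c;
}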

Alex Vanden Abeele | Graphics Programmer | Larian Studios | http://www.larian.com
Quote: Original post by ava
Adding a value to a NaN still gives me a NaN, and the same goes for taking the min or max.
Yup, that's to be expected. My idea was to add a small offset in the calculation where the NaN/INF would originally be generated - for example, when using log() for the luminance measurement, or in any division.
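Something along these lines - illustrative only, the HDRPipeline sample differs in the details:

static const float DELTA = 1e-9f;

// Log-average luminance measurement: the small bias keeps log() away
// from zero, so a pure black pixel gives log(DELTA) instead of -INF.
float LogLuminance(float3 colour)
{
    float lum = dot(colour, float3(0.2126f, 0.7152f, 0.0722f));
    return log(lum + DELTA);
}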

Purely a guess, but I'd imagine a preventative measure such as mine will be better than writing additional isnan() code to handle the possibility. But if it works for you then great [smile]

hth
Jack

Jack Hoxley [ Forum FAQ | Revised FAQ | MVP Profile | Developer Journal ]

Quote: Original post by jollyjeffers
Purely a guess, but I'd imagine a preventative measure such as mine will be better than writing additional isnan() code to handle the possibility.

Usually it is. NaN, and to a lesser extent +/-inf, can be really annoying. If at all possible, you should try to avoid generating them in the first place.
Ah, now I understand why you suggested adding a value. The problem, though, is that the NaN values originate from the backbuffer I am sampling, not from the HDR shader itself.
If the NaNs were created by my own divisions, then of course an offset would help and would obviously be the way to go. In fact, I believe I already did that somewhere.

The problem is that other programmers/artists can write shaders that put NaN values in the backbuffer, so my samples contain NaNs to begin with, before I do even the simplest calculation. It's those values that I want to guard my shader against.

That's why I say the simple solution is to make sure there are no NaN values in the backbuffer to begin with. The problem, however, is that people write shaders that create NaNs, and when their screen goes black they come to me and say it's my fault.
With the NaN check on the samples I avoid that waste of time, because now people only see their own object going black.
Alex Vanden Abeele | Graphics Programmer | Larian Studios | http://www.larian.com
Quote: Original post by ava
That's why I say the simple solution is to make sure there are no NaN values in the backbuffer to begin with. The problem, however, is that people write shaders that create NaNs, and when their screen goes black they come to me and say it's my fault.

Writing shaders that produce FP specials (nan, inf, etc) is very bad practice, and usually considered a serious bug (just try to run a convolution on such an output, and you're in hell...). If your users write bad shaders, then they have to fix them. Silently allowing buggy shader output through a NaN test is only going to encourage them to write sloppy shaders in the future.

Quote: Original post by ava
With the NaN check on the samples I avoid that waste of time, because now people only see their own object going black.

It wastes shader cycles though, and encourages bad practices. I wouldn't do it if I were you.
I agree that allowing NaN shaders is a bad practice in the first place.

Maybe you can try to use min/max instructions in the shader to remove the NaNs/Infs. That might not give better results though, since the tone mapping will still be technically incorrect.
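Something like this, assuming min()/max() on the target hardware return the non-NaN operand (which isn't guaranteed everywhere):

// Clamp each sample into a sane HDR range. Infs are always caught by
// the clamp; NaNs collapse to the bounds only on hardware where
// min()/max() prefer the non-NaN operand. 65504 is the largest fp16 value.
float3 CleanSample(float3 c)
{
    c = max(c, 0.0f);
    c = min(c, 65504.0f);
    return c;
}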

Y.
Quote:
I agree that allowing NaN shaders is a bad practice in the first place.


I agree too, and on all points, really; you don't need to convince me, I'm with you all the way.

Quote:
It wastes shader cycles though, and encourages bad practices. I wouldn't do it if I were you.


Indeed it does, and if it were my call I wouldn't either. It's just a temporary solution, so people working on their shaders can still see that something is going wrong, and what exactly.
Alex Vanden Abeele | Graphics Programmer | Larian Studios | http://www.larian.com

This topic is closed to new replies.
