HDR Help


16 replies to this topic

#1 jeffkingdev   Members   -  Reputation: 787


Posted 11 September 2012 - 10:01 PM

So,

I'm starting to implement HDR in my scene. I have a few questions after reading online and going through the HDR_Pipeline demo in the SDK.

I set up my render target texture as D3DFMT_A16B16G16R16F. How do I actually get those extra bits of color? If I render my scene as-is and then render that texture out to the screen, it's exactly the same.

Question 1:
The HDR_Pipeline demo actually has a pixel shader that multiplies each RGB by a scalar factor. Is this how HDR is done? So every object (or really every shader) in my scene now has to multiply its pixel RGB by some magical scalar factor so I can later use that for my luminance calculation? Is this how everyone does it: take the resulting RGB from a model and multiply it by some scalar to get results over 1.0?

Question 2:
Is there any faster way to get the average or log luminance value than downsampling? If not, do people generally start at 256x256 for a total of 9 downsamples? The HDR_Pipeline used 3x3 downsamples instead of 2x2; is that better?

Thanks!
Jeff.


#2 allingm   Members   -  Reputation: 521


Posted 11 September 2012 - 10:43 PM

I set up my render target texture as D3DFMT_A16B16G16R16F. How do I actually get those extra bits of color? If I render my scene as-is and then render that texture out to the screen, it's exactly the same.
Question 1:
The HDR_Pipeline demo actually has a pixel shader that multiplies each RGB by a scalar factor. Is this how HDR is done? So every object (or really every shader) in my scene now has to multiply its pixel RGB by some magical scalar factor so I can later use that for my luminance calculation? Is this how everyone does it: take the resulting RGB from a model and multiply it by some scalar to get results over 1.0?


So, LDR (low dynamic range) means the light values are in the range [0, 1], while HDR (high dynamic range) means they are in the range [0, infinity). A D3DFMT_A16B16G16R16F texture can hold anything in (-infinity, +infinity), so the texture format itself doesn't need any extra work to hold the data. Two things do need to change, though. First, the lights in your scene must actually add up to something past 1 (or start at a value greater than 1). Second, you must realize that no matter what your render target supports, the computer screen only supports [0, 1], so you need to map [0, infinity) back down to [0, 1]. This conversion is called "tone mapping".
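
For illustration, here's a minimal sketch of what a lighting pixel shader writing into that float target might look like (all the names -- SampAlbedo, gSunColor, gSunDir, gAmbient -- are made up for the example, not taken from the SDK demo):

sampler SampAlbedo : register(s0);
float3 gSunColor;   // e.g. (5, 5, 5) -- light intensities above 1 are fine now
float3 gSunDir;
float3 gAmbient;    // e.g. (0.2, 0.2, 0.2)

float4 PS_Light(float2 uv : TEXCOORD0, float3 normal : TEXCOORD1) : COLOR
{
    float3 albedo = tex2D(SampAlbedo, uv).rgb;                 // stays in [0, 1]
    float ndotl   = saturate(dot(normalize(normal), gSunDir)); // clamping the cosine is fine
    float3 light  = gSunColor * ndotl + gAmbient;              // may sum well past 1.0
    return float4(albedo * light, 1.0);                        // the float target keeps values > 1
}

Just don't saturate the final color; the tone mapping pass at the end is what brings it back to [0, 1].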

MJP has a really great demo here: http://mynameismjp.w...ss.com/2010/04/

There is also a great book on GameDev that has tons of information, but I can't seem to find it at the moment.

Question 2:
Is there any faster way to get the average or log luminance value than downsampling? If not, do people generally start at 256x256...


Not really. You can start sampling into any size texture, but anything too small won't be 100% accurate. MJP also had a demo that did this in a compute shader, though I'm not sure that's exactly what you're asking for. ( http://mynameismjp.w...ss.com/2011/08/ )

#3 Ashaman73   Crossbones+   -  Reputation: 11066


Posted 11 September 2012 - 11:37 PM

First, the lights in your scene must actually add up to something past 1 (or start at a value greater than 1).

Why? If you have a very dark scene, the whole scene gets brightened up instead, even though nothing exceeds 1. You can explain it with the human ability to see better in the dark after some time.

Is there anyway faster to get the average or log luminance value other than downsampling?

There's always some overhead when switching the render target, setting up the shader etc., so a 3x3 downsample could be faster because it gets through the reduction in fewer passes.

If not, do people generally start at 256x256 for a total of 9 downsamples? The HDR_Pipeline used 3x3 downsamples instead of 2x2, is that better?

You can use some tricks like hardware filtering, that is, you can sample 4 pixels with just one fetch (when linear filtering is on). A 2x2 tap pattern would in fact average a 4x4 area, which gives:
256->64->16->4, and the final 4x4 texture can be sampled at its center.

Or when using a 3x3 tap pattern (covering a 6x6 area):
864->144->24->4
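
A minimal sketch of one pass of that 2x2-tap (4x4-area) reduction, assuming linear filtering is enabled on the source (SampSrc and gSrcTexelSize are made-up names):

sampler SampSrc : register(s0);   // source level, e.g. 256x256, with linear filtering on
float2 gSrcTexelSize;             // (1 / source width, 1 / source height)

// Reduces the source 16:1 per pass (e.g. 256 -> 64). Each bilinear tap sits
// exactly between four texel centers, so one fetch averages a 2x2 block;
// the 2x2 tap pattern therefore averages a full 4x4 area.
float4 PS_Downsample4x4(float2 uv : TEXCOORD0) : COLOR
{
    float4 sum = tex2D(SampSrc, uv + gSrcTexelSize * float2(-1, -1));
    sum += tex2D(SampSrc, uv + gSrcTexelSize * float2( 1, -1));
    sum += tex2D(SampSrc, uv + gSrcTexelSize * float2(-1,  1));
    sum += tex2D(SampSrc, uv + gSrcTexelSize * float2( 1,  1));
    return sum * 0.25;   // average of 16 source texels
}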

Edited by Ashaman73, 11 September 2012 - 11:39 PM.



#4 CryZe   Members   -  Reputation: 768


Posted 12 September 2012 - 01:32 AM

Is there any faster way to get the average or log luminance value than downsampling? If not, do people generally start at 256x256 for a total of 9 downsamples? The HDR_Pipeline used 3x3 downsamples instead of 2x2; is that better?

In my engine, I used a Fast Fourier Transform to get a fast (O(log n) per pixel) bloom that can actually influence the whole screen and has a non-separable filter while still having the performance of a separable one. I didn't need to calculate the average luminance, because it's exactly the value at frequency 0 of the FFT-transformed image. So it's basically free in my case, and even more accurate than if it were calculated by downsampling.

If I had to implement it, though, I'd probably write 2 compute shaders. The first dispatches one thread group per row, where every group has one thread per pixel in that row. Every thread begins by reading its associated pixel and storing it in groupshared memory. Then every active thread adds 2 values together and stores the result in groupshared memory again. You repeat that until only 1 value remains (only half the threads are actually adding values each step; be sure to release the unnecessary warps). You then divide the resulting value by the number of elements and store it in a Texture1D (they should definitely add some fast intermediate memory to DX12, so that you don't have to store buffers like this in global memory). The second compute shader does the same thing, but is dispatched only once (and has as many threads as the image has rows).

This should perform way faster than downsampling an image multiple times, because of the multiple passes needed for downsampling and the resulting slow writes to global memory (as I said, we need intermediate memory for DX12).
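
A sketch of that first compute shader in HLSL (SM5), under the assumptions above -- InputTex, RowAverages, and the fixed 1024-pixel row width are made up for the example, and the width is assumed to be a power of two:

Texture2D<float4>  InputTex    : register(t0);   // the HDR scene
RWTexture1D<float> RowAverages : register(u0);   // one average per row

#define ROW_WIDTH 1024
groupshared float gLum[ROW_WIDTH];

[numthreads(ROW_WIDTH, 1, 1)]   // one thread group per row, one thread per pixel
void CSAverageRow(uint3 gtid : SV_GroupThreadID, uint3 gid : SV_GroupID)
{
    // Each thread reads its pixel and stores its luminance in groupshared memory.
    float3 c = InputTex[uint2(gtid.x, gid.y)].rgb;
    gLum[gtid.x] = dot(c, float3(0.2126, 0.7152, 0.0722));   // Rec. 709 luma weights
    GroupMemoryBarrierWithGroupSync();

    // Parallel reduction: half the threads add pairs each step until 1 value remains.
    for (uint s = ROW_WIDTH / 2; s > 0; s >>= 1)
    {
        if (gtid.x < s)
            gLum[gtid.x] += gLum[gtid.x + s];
        GroupMemoryBarrierWithGroupSync();
    }

    if (gtid.x == 0)
        RowAverages[gid.y] = gLum[0] / ROW_WIDTH;   // row average
}

This would be dispatched as Dispatch(1, imageHeight, 1); the second shader then reduces RowAverages down to a single value the same way.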

Update: Oh, I actually did the same thing MJP and NVIDIA did, without reading their work xD (MJP used a 2-dimensional approach, though that shouldn't make any difference at all. But MJP's improved version might cause bank conflicts on NVIDIA hardware because of the 128-bit strided access).

Edited by CryZe, 12 September 2012 - 03:38 AM.


#5 jeffkingdev   Members   -  Reputation: 787


Posted 12 September 2012 - 09:38 AM

allingm,

Thanks for the response.

So, my lighting calculation would have to change in the shader. Currently it's...

saturate(IN.SunLight + gAmbient) * pixCol;

So, would I have to get rid of my saturate and just combine ambient + light for proper HDR?

Thanks
Jeff.

#6 CryZe   Members   -  Reputation: 768


Posted 12 September 2012 - 09:53 AM

Yes, saturate makes everything low dynamic range. Never clamp your lighting data if you want it to be high dynamic range.

#7 allingm   Members   -  Reputation: 521


Posted 12 September 2012 - 03:34 PM

saturate(IN.SunLight + gAmbient) * pixCol;

The saturate is a problem. To properly map the color into displayable range, you should start with the Reinhard tone mapper; you can experiment with other ones once you get it working.

Reinhard tone mapper:
y = x / (1 + x)

You can plot this in Wolfram Alpha to see that it does indeed map the range [0, infinity) to [0, 1):
http://www.wolframal... x), x = 0, 100
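
Or just plug in a few values (a quick sanity check, not from the demo):

x = 0.5  ->  0.33
x = 1    ->  0.5
x = 10   ->  0.91
x = 100  ->  0.99

No input ever reaches 1, but bright values get squeezed together near the top of the range.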

Keep in mind this is the "classic" tone mapper, but it doesn't necessarily create the most pleasing results. However it should be plenty for learning.
http://filmicgames.com/archives/183

#8 jeffkingdev   Members   -  Reputation: 787


Posted 13 September 2012 - 09:09 AM

I took out the saturate, but things like my terrain are blowing out to almost white... does that sound right?

Jeff.

#9 Hodgman   Moderators   -  Reputation: 38652


Posted 13 September 2012 - 09:13 AM

Yes, the data inside your HDR texture can't sensibly be displayed directly on the screen -- if you do, and there are bright lights, then things will just look white.

Now that you've got your HDR data, you've got to tone-map it back down to "LDR" in order to display it. Allingm posted a very simple tone-mapping function above -- you can use that in a post-processing pass that reads your HDR texture as input, performs that function, and outputs a regular 888 RGB value.
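
That pass can be as small as this (a minimal sketch; SampHdr is assumed to be bound to the float scene texture, and the render target here is a normal 8-bit one):

sampler SampHdr : register(s0);   // the A16B16G16R16F scene texture

float4 PS_ToneMap(float2 uv : TEXCOORD0) : COLOR
{
    float3 hdr = tex2D(SampHdr, uv).rgb;   // unbounded scene values
    float3 ldr = hdr / (1.0 + hdr);        // Reinhard: [0, inf) -> [0, 1)
    return float4(ldr, 1.0);               // lands in the 8-bit back buffer
}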

#10 jeffkingdev   Members   -  Reputation: 787


Posted 13 September 2012 - 09:40 AM

I'm confused.

I have my HDR texture. I calculate my average luminance value (by downsampling and averaging).

What does my pixel shader use the luminance for?

y = x / (1 + x)

Does my pixel shader look like this (HLSL)?

hdrCol = tex2D(SampHdr, texCoords);
hdrCol.r = hdrCol.r / (1.0 + hdrCol.r); ??
hdrCol.g = hdrCol.g / (1.0 + hdrCol.g); ??
hdrCol.b = hdrCol.b / (1.0 + hdrCol.b); ??

Where does the luminance go?

Currently I'm using the HDR_Pipeline's method (from the SDK)

final = hdrPixel;        // the HDR scene color for this pixel
fExposure = 1.0;
fGaussianScalar = 1.0;
// l = sample from the downsampled luminance texture (l.r holds my average luminance)
float Lp = (fExposure / l.r) * max( final.r, max( final.g, final.b ) );
float LmSqr = (l.g + fGaussianScalar * l.g) * (l.g + fGaussianScalar * l.g);
float toneScalar = ( Lp * ( 1.0f + ( Lp / ( LmSqr ) ) ) ) / ( 1.0f + Lp );
c = final * toneScalar;

Thanks
Jeff.

#11 allingm   Members   -  Reputation: 521


Posted 13 September 2012 - 02:56 PM

You're going to use the average luminance to scale the scene exposure. This lets the camera adapt to the scene brightness, simulating what a camera or your eye does when you walk into a dark room and everything slowly brightens up.
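
As a rough sketch of where the average luminance plugs in (gKeyValue, SampAvgLum, and the 0.18 default are illustrative assumptions, not the SDK demo's exact math):

sampler SampHdr    : register(s0);
sampler SampAvgLum : register(s1);   // the 1x1 end of the downsample chain
float gKeyValue;                     // e.g. 0.18: the target average brightness

float4 PS_ToneMapAdapted(float2 uv : TEXCOORD0) : COLOR
{
    float avgLum   = tex2D(SampAvgLum, float2(0.5, 0.5)).r;
    float3 hdr     = tex2D(SampHdr, uv).rgb;
    float3 exposed = hdr * (gKeyValue / max(avgLum, 0.0001)); // exposure from avg luminance
    float3 ldr     = exposed / (1.0 + exposed);               // then tone map as before
    return float4(ldr, 1.0);
}

A dark scene (small avgLum) gets scaled up and a bright one gets scaled down before the tone mapper runs -- that's the adaptation.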

I feel like what you really need is a good tutorial or article. Unfortunately the one that was on GameDev.net seems to be gone. The only one I could find quickly is: http://www.gamedev.n...-on-mains-r2485 If somebody can find a better one, that would be great.

#12 jeffkingdev   Members   -  Reputation: 787


Posted 16 September 2012 - 08:30 PM

Guys:

I'm so close to having this working. What happens now:

If I turn my light really low (like 0.2, 0.2, 0.2), then when I look at my sky the terrain turns black; when I look at my terrain (no sky), the terrain shows dark, but I can see it. Nothing is using HDR values.

Now, I set my light to (5.0, 5.0, 5.0) and my terrain is just about white; it's blown out. If I mess with the exposure, I can get my terrain visible with an exposure of about 0.01, but then my sky is really dark.

I'm doing something wrong here.

I've got my tonemapping down to this:
final = hdrCol;
float Lp = (fExposure / l.r) * max( final.r, max( final.g, final.b ) );
float toneScalar = Lp / ( 1.0f + Lp );
retCol = final * toneScalar;

I'm not sure my equation above is right.

For example, let's say my average luminance (l.r) = 1.0, my exposure is 1.0, and my max color is 5.0. Then Lp = 5, so 5 / (1+5) = 0.83-ish... well, 0.83 * 5 is still greater than 1, so it will be blown out... so how is this supposed to work???

Thanks for any more help!
Jeff.

#13 larspensjo   Members   -  Reputation: 1561


Posted 17 September 2012 - 01:03 AM

Correct me if I am wrong, but this is the way I understand it:
  • Transform the colors from the range [0,1) to the range [0,inf).
  • Do various manipulations, like multiplying with lights from lamps.
  • Transform the colors back again from [0,inf) to [0,1)
So if you have a dark scene, with no lights or transformations, it will come out exactly the same again.

Using the tone mapping function mentioned above, x/(1+x), in step three means you need to do the reverse mapping in step one: x/(1-x). Notice that this reverse transformation can't take a value of 1; you have to clamp somewhere below 1.

#14 jeffkingdev   Members   -  Reputation: 787


Posted 17 September 2012 - 01:35 PM

larspensjo,

I'm not sure that's right. So, I need to do something like:

final = hdrCol / (1 - hdrCol); ?

That doesn't seem right, since hdrCol can be any value. For my example, let's say (5,5,5): then the denominator is negative and the final color comes out as (-1.25, -1.25, -1.25).

Any other thoughts?
Jeff.

#15 larspensjo   Members   -  Reputation: 1561


Posted 17 September 2012 - 02:25 PM

I'm not sure that's right. So, I need to do something like:

final = hdrCol / (1 - hdrCol); ?

That doesn't seem right, since hdrCol can be any value. For my example, let's say (5,5,5): then the denominator is negative and the final color comes out as (-1.25, -1.25, -1.25).

That function shall be used in the first phase, not the last. The values you read from a normal texture sampler (RGB8) in the first phase are scaled from [0,255] to [0,1]. So you use the function x/(1-x) to get a hdrCol. That is what you render into the D3DFMT_A16B16G16R16F target.

In the last phase, when you render D3DFMT_A16B16G16R16F to the screen, you have to do the tone mapping x/(1+x). The value going to the final display will then be in the range [0,1].

In between these two phases, you can manipulate the D3DFMT_A16B16G16R16F as you want, as going above 1 is no longer a problem. For example, if you have many lamps that all add light, the value can get rather high, but will still transform into something in the range [0,1] in the last phase.

#16 MJP   Moderators   -  Reputation: 13626


Posted 17 September 2012 - 04:20 PM

larspensjo, you seem to be a bit confused on this subject. A typical albedo texture does not contain "HDR" values, it contains "LDR" albedo values that are in the [0, 1] range. You don't want to "convert" these to HDR; you still want them in [0, 1] even in an HDR lighting scenario. For instance, if you have a grey surface with albedo = 0.5 and a light source with intensity = 100, your diffuse reflectance would be 0.5 * 100 (divided by Pi if you're energy conserving), which would be 50. If you were to perform some transformation on the albedo color that gave you a value above 1, it would mean that the surface could reflect more light than the amount of light it's receiving. That's obviously not what you want.
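
In shader terms, that example is just (a sketch; the names are hypothetical):

float3 gLightColor;   // e.g. (100, 100, 100) -- light intensity is unbounded
float3 gLightDir;

static const float PI = 3.14159265;

float3 DiffuseLambert(float3 albedo, float3 normal)   // albedo stays in [0, 1]
{
    float ndotl = saturate(dot(normal, gLightDir));   // geometry term
    return albedo * gLightColor * ndotl / PI;         // 0.5 * 100 = 50, ~15.9 after the 1/pi
}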

#17 larspensjo   Members   -  Reputation: 1561


Posted 18 September 2012 - 05:07 AM

larspensjo, you seem to be a bit confused on this subject.

Thanks for the info, and please excuse me for adding confusion!

I guess what is clear is that there are at least two ways to transform back from HDR (tone mapping). One simple solution is to use the Reinhard tone mapper, x/(1+x), for every pixel. But this has disadvantages: it won't bring out extra detail in the lower-brightness parts of the picture. The OP is using a more advanced solution, which makes use of adaptive luminance. That is, pixels are sampled to get an average luminance, which is then used in the tone mapping function.

The advantage of the plain Reinhard tone mapper is that you don't need to know the average luminance, so you don't need the downsampling process. I guess the choice then depends on the requirements. If the requirement is to enhance an HDR photo so as to make both high- and low-luminance areas visible, then the plain Reinhard filter isn't good enough.

I set up my render target texture as D3DFMT_A16B16G16R16F. How do I actually get those extra bits of color? If I render my scene as-is and then render that texture out to the screen, it's exactly the same.

One way is to load one or more external bitmaps in a floating-point format, already saved with the extra information. If you just take a normal 24-bit BMP file, for example, the HDR information has already been lost.

Another way is to have a picture manipulation algorithm that adds higher-resolution details.

The HDR_Pipeline demo actually has a pixel shader that multiplies each RGB by a scalar factor. Is this how HDR is done? So every object (or really every shader) in my scene now has to multiply its pixel RGB by some magical scalar factor so I can later use that for my luminance calculation? Is this how everyone does it: take the resulting RGB from a model and multiply it by some scalar to get results over 1.0?


That would transform the color into HDR space, this time using a linear transform. But then we are in the same "problem domain" MJP is talking about, and his arguments have to be considered.

I will explain the way I use HDR in my game. It may or may not be relevant to the requirements of the OP. And there may be basic problems in the design, as pointed out by MJP. However, it seems to work quite well.

I have a diffuse color map to which I want to add lighting effects. I add up the contributions of all light sources (maintained in a separate R16F buffer) for every pixel. The diffuse color map is coded in the range [0,1), and if I simply multiply the color by the "light intensity" from the R16F buffer, I can get a value bigger than one (depending on the number of light sources). I think it can be seen as adding energy. Instead of clamping, I want to use tone mapping, and the Reinhard filter was my first choice.

When applying the Reinhard filter with a light contribution near a factor of 1, the high-luminance values of the original diffuse colors get depressed. That is the reason I apply an inverse Reinhard filter to the diffuse color before adding the light manipulations. So to speak, I am creating the HDR image from the diffuse colors. I suppose this may be physically incorrect, but it looks fine. The Reinhard filter works fine in this application, as the goal is not to enhance low-luminance details.
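
In shader terms, what I do is roughly this (a sketch with made-up names; MJP's caveat above still applies):

sampler SampDiffuse : register(s0);   // diffuse map, values in [0, 1)
sampler SampLight   : register(s1);   // the separate R16F light buffer

float4 PS_EphenationStyle(float2 uv : TEXCOORD0) : COLOR
{
    float3 diffuse = min(tex2D(SampDiffuse, uv).rgb, 0.99);   // clamp below 1 first
    float3 hdr     = diffuse / (1.0 - diffuse);               // inverse Reinhard: [0,1) -> [0,inf)
    float  light   = tex2D(SampLight, uv).r;                  // summed light, may exceed 1
    hdr *= light;                                             // apply the accumulated light
    return float4(hdr / (1.0 + hdr), 1.0);                    // Reinhard back down for display
}
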
[Attached screenshots: Saturation1.jpg, Saturation2.jpg]



