
Image based lighting - how to do it?


Hi, are there any tutorials that explain in a simple way how to do HDR image-based lighting, with source code? I know there's a link in the forum FAQ to Debevec's web page and ATI's implementation of IBL, but there's no source code available (at least I couldn't find any). I've also found an NVIDIA presentation from GDC04, but I don't really get it.

You may find some helpful information on ATI's CubeMapGen webpage:

http://www.ati.com/developer/cubemapgen/index.html

There's a standalone tool as well as a library for integrating it into your own application. There's also a link to a presentation that explains it all. You can use this tool to generate cube maps for image-based lighting.

--Chris

Thanks guys, but I still can't figure out how it should be done.
I've seen Debevec's paper before, but it only describes the general idea and an implementation using the Radiance renderer, and I'd like to implement it in DX and HLSL.

I'm rather a beginner when it comes to computer graphics, so I can't work out how it should be done from the general idea alone. I have a small app that renders an HDR cube map to a skybox and an object with basic environment mapping.

So I guess I should just modify the environment-mapping shader (and maybe the HDR cube map) and it should work. If anyone could present an implementation with some basic explanation, I would be really grateful.

[Edited by - g0nzo on October 15, 2006 5:22:41 AM]

I've found an example of IBL in the NVIDIA SDK sample "HDR with 2x FP16 MRT" and made it work in my app, but I still don't really understand how it works [smile].

Does anyone know how the diffuse map is created or what exactly it stores? I know how to create it using HDRShop or CubeMapGen, but I don't know what exactly they do to generate it.

[Edited by - g0nzo on October 22, 2006 7:09:25 AM]

The diffuse map essentially stores the hemispherical irradiance from the environment for a given direction. The location of each pixel represents a different surface normal direction, and its value is the total irradiance for that direction.

It's constructed by iterating over each pixel (pixel A), constructing a normal direction from its location, then iterating over every other pixel (pixel B) and adding pixel B's value multiplied by the dot product of normals A and B (clamped to zero, so back-facing texels contribute nothing) to the running total of pixel A.

Hope you got that. The algorithm's quite simple, but explaining it clearly sometimes isn't!
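
To make that concrete, here is a rough CPU-side C++ sketch of that double loop, assuming the environment is stored as a latitude-longitude HDR image with three floats per texel; the layout, the helper names, and the sin(theta) area weight (which compensates for that layout's uneven texel density) are assumptions of this sketch, not something from the posts above.

#include <cmath>
#include <vector>

const float kPi = 3.14159265f;

struct Vec3 { float x, y, z; };

float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Direction of the texel center at (x, y) in a latitude-longitude map.
Vec3 texelDirection(int x, int y, int width, int height)
{
    float phi   = (x + 0.5f) / width  * 2.0f * kPi;  // longitude
    float theta = (y + 0.5f) / height * kPi;         // latitude, 0 = "up"
    return { std::sin(theta) * std::cos(phi),
             std::cos(theta),
             std::sin(theta) * std::sin(phi) };
}

// envMap holds width*height texels, 3 floats (RGB) each.
// Returns the diffuse (irradiance) map in the same layout.
std::vector<float> buildDiffuseMap(const std::vector<float>& envMap,
                                   int width, int height)
{
    std::vector<float> diffuse(envMap.size(), 0.0f);
    for (int ay = 0; ay < height; ++ay)
    for (int ax = 0; ax < width;  ++ax) {             // pixel A: output normal
        Vec3 n = texelDirection(ax, ay, width, height);
        float r = 0, g = 0, b = 0, total = 0;
        for (int by = 0; by < height; ++by)
        for (int bx = 0; bx < width;  ++bx) {         // pixel B: incoming light
            Vec3 l = texelDirection(bx, by, width, height);
            // Cosine weight; sin(theta) compensates for the lat-long map's
            // denser sampling near the poles.
            float w = dot(n, l) * std::sin((by + 0.5f) / height * kPi);
            if (w <= 0.0f) continue;                  // behind the surface
            const float* src = &envMap[(by * width + bx) * 3];
            r += src[0] * w;  g += src[1] * w;  b += src[2] * w;
            total += w;
        }
        float* dst = &diffuse[(ay * width + ax) * 3];
        dst[0] = r / total;  dst[1] = g / total;  dst[2] = b / total;
    }
    return diffuse;
}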

Thank you, but I still don't really get it. I can't see the difference between a diffuse map and an environment map. Couldn't I just use a blurred (or downsampled) environment map instead of the diffuse map?

BTW, does IBL have any application in computer games? It's probably useful for special effects in movies, where one can capture the radiance of a real scene and use it to illuminate a computer-generated object composited into that scene, or for architectural rendering and such. But in games, where everything is created by the computer, could it be useful somehow?

[Edited by - g0nzo on October 25, 2006 8:01:05 AM]

Quote:
I still can't see the difference between a diffuse map and an environment map

They're very similar, but the diffuse map is an environment map filtered with a cosine filter.
Furthermore, an environment map is almost always sampled using the reflected eye vector (the eye vector reflected about the fragment normal), hence the term environment reflections.

When doing IBL you're generally talking about diffuse lighting, so you use the fragment normal itself to sample the texture.
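
As a small, hedged illustration of that difference, here is plain C++ mirroring the shader math; reflectVec uses the same formula as HLSL's reflect(), and only the lookup direction changes between the two cases.

#include <cstdio>

struct Vec3 { float x, y, z; };

float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Mirror v about the normal n; the same formula as HLSL's reflect().
Vec3 reflectVec(Vec3 v, Vec3 n)
{
    float d = 2.0f * dot(v, n);
    return { v.x - d*n.x, v.y - d*n.y, v.z - d*n.z };
}

int main()
{
    Vec3 normal  = { 0.0f, 1.0f, 0.0f };          // fragment normal
    Vec3 viewDir = { 0.7071f, -0.7071f, 0.0f };   // eye-to-fragment direction

    Vec3 reflDir = reflectVec(viewDir, normal);   // environment reflections
    Vec3 diffDir = normal;                        // diffuse IBL

    // The reflection lookup moves with the view; the diffuse lookup doesn't.
    std::printf("sample environment map with: %g %g %g\n", reflDir.x, reflDir.y, reflDir.z);
    std::printf("sample diffuse map with:     %g %g %g\n", diffDir.x, diffDir.y, diffDir.z);
    return 0;
}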

To understand what the filtering does, think of each texel in the environment map as a distant light source: it has a color/intensity (the RGB value) and a normalized direction (a 3D vector; in the case of a cube map, the same as the texture coordinate for that texel, normalized).

To shade a fragment using an environment map we need to light it with every texel in the map (though half of them will be on the back-facing side).
In other words we have something like this:

function getLightForNormal(Normal)
    Light = 0
    TotalIntensity = 0
    For every texel in the environment map
        Intensity = Normal dot DirectionToTexel   // the influence, same as in the standard diffuse lighting equation
        If Intensity > 0 Then
            Light = Light + TexelColor * Intensity
            TotalIntensity += Intensity
    Light = Light / TotalIntensity
    return Light

// For every pixel fragment, do something like:

FragmentDiffuseColor = FragmentTextureColor * getLightForNormal(FragmentNormal)


This is of course VERY expensive, since there are many texels in an environment map.
But it turns out that you can precalculate all the weighting and get the same result as the above with a single map lookup (in reality it's of course not identical, due to rounding etc.).

The only input to the above is a normal, hence we can precompute the light for a bunch of normals, i.e. build a look-up table, sort of like: Light = LightLookUp[Normal].
And why not use a map of the same type as the environment map as that look-up table?

Building the filtered map is easy (but slow):

For every texel in the diffuse map
    SetPixel(getLightForNormal(DirectionToTexel))

// Now we have our look-up table stored in the filtered map.
// With a cube map, we can use the fragment normal directly as the
// texture coordinate to get the correct lighting:

FragmentDiffuseColor = FragmentTextureColor * filteredMap[FragmentNormal]



Since normals are "spherical", cube maps are a natural fit, but the technique can also be used with 2D sphere maps, dual-paraboloid maps and other parameterizations that try to capture an even distribution over the sphere's surface.
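
For the cube-map case, turning a texel position into that normalized direction looks roughly like the sketch below. The face order and signs follow common D3D conventions but vary between APIs, and cubeTexelDirection is a made-up helper name.

#include <cmath>

struct Vec3 { float x, y, z; };

// Direction of the texel center at (x, y) on one face of a size*size cube map.
// Faces are ordered +X, -X, +Y, -Y, +Z, -Z.
Vec3 cubeTexelDirection(int face, int x, int y, int size)
{
    float u = 2.0f * (x + 0.5f) / size - 1.0f;   // [-1, 1] across the face
    float v = 2.0f * (y + 0.5f) / size - 1.0f;
    Vec3 d;
    switch (face) {
        case 0:  d = {  1.0f,    -v,    -u }; break; // +X
        case 1:  d = { -1.0f,    -v,     u }; break; // -X
        case 2:  d = {     u,  1.0f,     v }; break; // +Y
        case 3:  d = {     u, -1.0f,    -v }; break; // -Y
        case 4:  d = {     u,    -v,  1.0f }; break; // +Z
        default: d = {    -u,    -v, -1.0f }; break; // -Z
    }
    // Normalize: the raw vector points at the texel but isn't unit length.
    float len = std::sqrt(d.x*d.x + d.y*d.y + d.z*d.z);
    return { d.x / len, d.y / len, d.z / len };
}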

Quote:
BTW, does IBL have any application in computer games?

I think Half-Life 2's Source engine uses something similar, or maybe they only use it for environment reflections (I've read somewhere that the artists can place environment probes in the world, or something like that).

[Edited by - eq on October 25, 2006 10:10:57 AM]

Thank you very, very much!
Unfortunately I can't rate you higher anymore [smile]

Quote:
Thank you very, very much!

Np!
Quote:
Unfortunately I can't rate you higher anymore [smile]

Create another user? *lol*
