jjtulip

Integrating Image Based Lighting


In my forward renderer I am using a PBR shader for my materials, and the final lighting equation looks something like this (in pseudocode, ignoring the fact that I have multiple analytical lights):

diffuseComponent = diffuseLighting * materialDiffuseColor;
specularComponent = specBRDF * materialSpecColor;
outputCol = (1.0 - specularIntensity) * diffuseComponent + specularIntensity * specularComponent;

I would now like to incorporate basic IBL into this, for both diffuse lighting and reflections.

I was reading a bit online, but I only found material where the surface is either entirely reflective (the final colour is just the reflection queried from the cubemap) or purely diffuse.

For the diffuse part I thought of just multiplying my diffuseComponent by the value retrieved from the irradiance map.
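
In code I mean something like this (irradianceMap being an assumed cosine-convolved cubemap; names are illustrative, not from my actual shader):

// Sample the irradiance map with the surface normal and scale the diffuse term by it
vec3 irradiance = texture(irradianceMap, worldNormal).rgb;
diffuseComponent *= irradiance;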

I am now unsure where to put the reflection. I know that this should be related to the roughness of the surface, and while I have a couple of ideas, neither satisfies me.

Where should I put the reflections so as to maintain PBR shading? Adding parameters wouldn't be an issue.

Thank you!


In an offline-quality renderer, you'd sample every pixel in the environment map, treating each one as a little directional light (using your full BRDF function), which is basically integrating the irradiance.
This is extremely slow, but correct... Even in an offline renderer, you'd optimise this by using importance sampling to skip most of the pixels in the environment map.
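
In sketch form, the brute-force loop looks something like this (the texel* helpers and specularBRDF are hypothetical stand-ins, just to show the structure, not code from an actual renderer):

vec3 integrateEnvironment(vec3 N, vec3 V, float roughness, vec3 specColor)
{
    vec3 sum = vec3(0.0);
    for (int i = 0; i < numEnvTexels; ++i)
    {
        vec3 L = texelDirection(i);        // direction towards texel i
        vec3 radiance = texelRadiance(i);  // light arriving from that texel
        float dOmega = texelSolidAngle(i); // solid angle the texel subtends
        float NoL = max(dot(N, L), 0.0);
        // Full BRDF evaluated per texel, as if it were a directional light
        sum += specularBRDF(N, V, L, roughness, specColor) * radiance * NoL * dOmega;
    }
    return sum;
}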

For a realtime renderer, you can 'prefilter' your environment maps, performing the above calculations ahead of time. Unfortunately, the inputs to the above are, at a minimum, the surface normal, the view direction, the surface roughness and the spec-mask/colour... That's four input variables (some of which are multidimensional), which makes for an impractically huge lookup table.
So when prefiltering, you typically make the approximation that the view direction is the same as the surface normal and the spec-colour is white, leaving you with just the surface normal and roughness.
In your new cube-map, the pixel location corresponds to the surface normal and the mip-level corresponds to the roughness. For every pixel in every mip of this new cube-map, sample all of (or lots of) the pixels in the original cube-map, weighted by your BRDF (using the normal/roughness corresponding to that output pixel's position).
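
A sketch of that prefiltering pass, assuming GGX importance sampling (hammersley and importanceSampleGGX are assumed helpers, and environmentMap is the original cube-map):

vec3 prefilterEnvMap(float roughness, vec3 R)
{
    vec3 N = R; // the view-direction-equals-normal approximation from above,
    vec3 V = R; // so the reflection vector R stands in for both
    vec3 sum = vec3(0.0);
    float totalWeight = 0.0;
    const int SAMPLE_COUNT = 1024;
    for (int i = 0; i < SAMPLE_COUNT; ++i)
    {
        vec2 Xi = hammersley(i, SAMPLE_COUNT);          // low-discrepancy sample point
        vec3 H = importanceSampleGGX(Xi, roughness, N); // half-vector from the GGX lobe
        vec3 L = normalize(2.0 * dot(V, H) * H - V);    // reflect V about H
        float NoL = max(dot(N, L), 0.0);
        if (NoL > 0.0)
        {
            sum += texture(environmentMap, L).rgb * NoL;
            totalWeight += NoL;
        }
    }
    return sum / max(totalWeight, 0.001);
}

Each output pixel of each mip runs this with its own direction and roughness.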

To add to this, there was a quite nice explanation given at SIGGRAPH 2013 of how they filter their cubemaps in Unreal Engine 4. They split the problem into two parts and solve each part separately in different lookup textures. The first part is an actual prefiltered environment map using importance sampling, as Hodgman describes; the second part is a lookup table containing your environment BRDF, expressed in terms of your surface roughness and viewing angle.
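
At runtime the two parts combine roughly like this (a sketch; prefilteredEnvMap, brdfLUT and maxMipLevel are assumed uniforms, not names from the course):

vec3 approximateSpecularIBL(vec3 specColor, float roughness, vec3 N, vec3 V)
{
    float NoV = max(dot(N, V), 0.0);
    vec3 R = reflect(-V, N);
    // Part 1: prefiltered environment map, with roughness selecting the mip level
    vec3 prefiltered = textureLod(prefilteredEnvMap, R, roughness * maxMipLevel).rgb;
    // Part 2: 2D LUT storing the environment BRDF as a (scale, bias) pair,
    // indexed by viewing angle and roughness
    vec2 envBRDF = texture(brdfLUT, vec2(NoV, roughness)).rg;
    return prefiltered * (specColor * envBRDF.x + envBRDF.y);
}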

The only real effort involved in implementing this is building the tools to generate the environment map and the LUT. Once that is done, evaluating them in your shader is a breeze.
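
For the LUT half, the generation tool boils down to something like this sketch (hammersley, importanceSampleGGX and geometrySmith are assumed helpers, in the spirit of the course notes rather than their exact code):

vec2 integrateBRDF(float roughness, float NoV)
{
    // Build a view vector in tangent space from the given viewing angle
    vec3 V = vec3(sqrt(1.0 - NoV * NoV), 0.0, NoV);
    vec3 N = vec3(0.0, 0.0, 1.0);
    float scale = 0.0;
    float bias = 0.0;
    const int SAMPLE_COUNT = 1024;
    for (int i = 0; i < SAMPLE_COUNT; ++i)
    {
        vec2 Xi = hammersley(i, SAMPLE_COUNT);
        vec3 H = importanceSampleGGX(Xi, roughness, N);
        vec3 L = normalize(2.0 * dot(V, H) * H - V);
        float NoL = max(L.z, 0.0);
        float NoH = max(H.z, 0.0);
        float VoH = max(dot(V, H), 0.0);
        if (NoL > 0.0)
        {
            float G = geometrySmith(N, V, L, roughness);
            float GVis = G * VoH / (NoH * NoV);
            float Fc = pow(1.0 - VoH, 5.0); // Schlick Fresnel weight
            scale += (1.0 - Fc) * GVis;
            bias += Fc * GVis;
        }
    }
    return vec2(scale, bias) / float(SAMPLE_COUNT);
}

Each texel of the LUT stores the resulting (scale, bias) pair at its (viewing angle, roughness) coordinate.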

Slides and course notes are in the SIGGRAPH 2013 "Physically Based Shading in Theory and Practice" course materials (Brian Karis, "Real Shading in Unreal Engine 4").
