DirectX SDK PRT tools, light probes



#1 trs79   Members   -  Reputation: 126


Posted 02 April 2012 - 09:19 PM

Hey all,

I'm trying to see if it's possible to use the PRT DirectX tools to setup light probes similar to this from Unity:

http://blogs.unity3d.com/2011/03/09/light-probes/

I notice there is a spherical light, as well as functionality to convert cubemaps to spherical harmonics coefficients. Any tips on how to be able to place light probes and create spherical harmonics coefficients from them? Thanks


#2 MJP   Moderators   -  Reputation: 11741


Posted 02 April 2012 - 11:25 PM

They're not using PRT in Unity, nor is PRT particularly popular for light probes in general.

If you're just starting out with light probes, then you can implement them like this:

1. Pick your probe locations throughout the scene. Easiest way is a 3D grid.
2. For each probe location, render a cubemap by rendering in all 6 directions.
3. Convert the cubemap to SH (you can use the D3DX utility functions for this if you'd like, but it's not too hard to do on your own)

Then at runtime you just look up and interpolate the probes, and look up the irradiance in the direction of the normal by performing an SH dot product (just make sure that you include the cosine kernel). This will give you indirect lighting, and you can add in direct lighting on top of this.
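As a hedged sketch of that runtime step, here is one way the SH dot product with the cosine kernel can look for a single color channel. The function name is illustrative; the constants are the standard 3rd-order real SH basis values and per-band cosine-convolution weights:

```cpp
// Evaluate 3rd-order (9-term) SH irradiance for a unit normal (x, y, z),
// given one channel of a probe's environment coefficients L[9].
float shIrradiance(const float L[9], float x, float y, float z)
{
    // Real SH basis functions evaluated at the normal direction
    const float Y[9] = {
        0.282095f,                          // Y(0, 0)
        0.488603f * y,                      // Y(1,-1)
        0.488603f * z,                      // Y(1, 0)
        0.488603f * x,                      // Y(1, 1)
        1.092548f * x * y,                  // Y(2,-2)
        1.092548f * y * z,                  // Y(2,-1)
        0.315392f * (3.0f * z * z - 1.0f),  // Y(2, 0)
        1.092548f * x * z,                  // Y(2, 1)
        0.546274f * (x * x - y * y)         // Y(2, 2)
    };
    // Cosine-kernel convolution weights per band: A0 = pi, A1 = 2pi/3, A2 = pi/4
    const float A[3] = { 3.141593f, 2.094395f, 0.785398f };
    float e = 0.0f;
    for (int i = 0; i < 9; ++i)
    {
        const int band = (i == 0) ? 0 : (i < 4 ? 1 : 2);
        e += A[band] * L[i] * Y[i];
    }
    return e; // irradiance; divide by pi for a Lambertian diffuse term
}
```

With three such coefficient sets (one per RGB channel) this is the whole per-pixel cost of a probe lookup.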

#3 trs79   Members   -  Reputation: 126


Posted 03 April 2012 - 12:39 AM

> They're not using PRT in Unity, nor is PRT particularly popular for light probes in general.
>
> If you're just starting out with light probes, then you can implement them like this:
>
> 1. Pick your probe locations throughout the scene. Easiest way is a 3D grid.
> 2. For each probe location, render a cubemap by rendering in all 6 directions.
> 3. Convert the cubemap to SH (you can use the D3DX utility functions for this if you'd like, but it's not too hard to do on your own)
>
> Then at runtime you just look up and interpolate the probes, and look up the irradiance in the direction of the normal by performing an SH dot product (just make sure that you include the cosine kernel). This will give you indirect lighting, and you can add in direct lighting on top of this.


Thanks so much for the reply. As I suspected, I must be confusing PRT with SH in general. So are light probes with SH, like in Unity, known as an "irradiance map"? (Sorry, I'm trying to wrap my head around the terminology.) If this is the case, then as far as I can tell the main difference between PRT and an irradiance map is that PRT computes self-shadowing whereas an irradiance map does not? Thanks for any clarifications.

Basically, I'm just trying to set my game up such that I can have nice indirect lighting mixed with lightmaps and direct lighting. It seems like light probes are how the commercial engines do it (UDK, Source, Unity, etc.).

#4 jameszhao00   Members   -  Reputation: 271


Posted 03 April 2012 - 02:08 AM

SH = (roughly/sometimes) Light probes = environment maps = cube maps
- Storage
- Maps a direction to a color (or any value)
- Light probes can mean SH, which is an approximation and can store blurry stuff. It cannot store sharp (high frequency) details.
- Light probes can also mean environment maps / cube maps. They can handle sharp details.

PRT = Precomputed Radiance Transfer (i.e. precomputed global illumination)
- Method
- Need to read paper...
- Can store its results in some format... (paper mentions SH)

Irradiance map = filtered map of incoming radiance.
- http://codeflow.org/...map/#irradiance
- Time saver.
- For diffuse stuff in image based lighting, you usually sum incoming light for a given direction, with each light 'beam' weighted by its angle. That's your irradiance.
- The summation is expensive, so we precompute a direction -> irradiance map.
- Usually looks like a blurred light probe.
- Stored in environment map, SH, etc...
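The cosine-weighted sum described in the irradiance bullets above can be sketched as a brute-force hemisphere integral. Here `radiance` is an illustrative stand-in (a uniform sky) for sampling a real environment map, and the quadrature loop is just one simple way to do the summation:

```cpp
#include <cmath>

// Illustrative stand-in for sampling an environment map: a uniform white sky.
float radiance(float, float, float) { return 1.0f; }

// Brute-force irradiance for unit normal (nx, ny, nz): sum incoming light
// over the hemisphere, weighting each 'beam' by its angle to the normal.
float irradiance(float nx, float ny, float nz)
{
    const float PI = 3.14159265f;
    const int N = 64; // quadrature resolution
    float sum = 0.0f;
    for (int i = 0; i < N; ++i)       // polar angle
        for (int j = 0; j < N; ++j)   // azimuth
        {
            float theta = PI * (i + 0.5f) / N;
            float phi = 2.0f * PI * (j + 0.5f) / N;
            float dx = std::sin(theta) * std::cos(phi);
            float dy = std::sin(theta) * std::sin(phi);
            float dz = std::cos(theta);
            float cosTerm = dx * nx + dy * ny + dz * nz; // beam weight
            if (cosTerm > 0.0f)
                sum += radiance(dx, dy, dz) * cosTerm
                     * std::sin(theta) * (PI / N) * (2.0f * PI / N); // solid angle
        }
    return sum;
}
```

Doing this per direction is the expensive summation the poster mentions; an irradiance map just precomputes it for every direction.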

Also, I feel like unless you're reading a paper, the terms 'SH', 'light probe', 'environment map', and 'irradiance map' are usually used interchangeably.

#5 InvalidPointer   Members   -  Reputation: 1443


Posted 03 April 2012 - 02:15 PM

Spherical harmonics refers solely to the math/concepts behind storage, PRT is just the idea of calculating how light bounces around ahead of time. Technically speaking, boring old Quake 2 lightmaps are just as much PRT as are the fancypants spherical harmonics stuff that's in vogue today.


tl;dr you can have spherical harmonics without PRT, and have PRT without spherical harmonics. SH just happens to work really, really well for PRT.



> Also, I feel like unless you're reading a paper, the terms 'SH', 'light probe', 'environment map', and 'irradiance map' are usually used interchangeably.

While I hate to come off like a smug tool, I don't think any of those are really 'interchangeable' even outside of academic fantasyland. If you're just starting out I can understand the confusion, but if they're all one big haze then it sounds like your initial source(s) need some work. Reading the last few pages of the seminal paper on SH for environment capture was an epiphany for me-- I was making most of this out to be vastly more complex than it really was on account of technique explanations written by seasoned mathematicians for seasoned mathematicians.

There are a few other annoyances I have, like how irradiance convolutions are described as 'not a blur' even though it mechanically works out to the same thing; the underlying reasoning made more sense once I became familiar with all the terms slung about. There is a consistent internal scheme to it all, though, I promise :)

#6 trs79   Members   -  Reputation: 126


Posted 03 April 2012 - 04:15 PM

> SH = (roughly/sometimes) Light probes = environment maps = cube maps
> - Storage
> - Maps a direction to a color (or any value)
> - Light probes can mean SH, which is an approximation and can store blurry stuff. It cannot store sharp (high frequency) details.
> - Light probes can also mean environment maps / cube maps. They can handle sharp details.
>
> PRT = Precomputed Radiance Transfer (i.e. precomputed global illumination)
> - Method
> - Need to read paper...
> - Can store its results in some format... (paper mentions SH)
>
> Irradiance map = filtered map of incoming radiance.
> - http://codeflow.org/...map/#irradiance
> - Time saver.
> - For diffuse stuff in image based lighting, you usually sum incoming light for a given direction, with each light 'beam' weighted by its angle. That's your irradiance.
> - The summation is expensive, so we precompute a direction -> irradiance map.
> - Usually looks like a blurred light probe.
> - Stored in environment map, SH, etc...
>
> Also, I feel like unless you're reading a paper, the terms 'SH', 'light probe', 'environment map', and 'irradiance map' are usually used interchangeably.


Thanks for the information, I'm beginning to get a better grasp of the terminology. It sounds like I probably just need SH for now, and I'll use standard shadow mapping for my characters on top of that.

> Spherical harmonics refers solely to the math/concepts behind storage, PRT is just the idea of calculating how light bounces around ahead of time. Technically speaking, boring old Quake 2 lightmaps are just as much PRT as are the fancypants spherical harmonics stuff that's in vogue today.


Thank you, that clears up a lot for me. So, to summarize, it sounds like to get basic ambient indirect lighting I just need to do the steps MJP outlined. Just to make sure I'm understanding things now, it seems like both Unity and UDK are doing this; see this link http://udn.epicgames....html#Character lighting

There it looks like they are just doing an irradiance map as well? The DirectX SDK has an irradiance volume sample from ATI that looks similar.

#7 MJP   Moderators   -  Reputation: 11741


Posted 03 April 2012 - 07:26 PM

From the UDK wiki it sounds like they're storing SH irradiance maps (containing indirect lighting only) at probe locations in a 3D grid, which is conceptually pretty similar to the process that I outlined earlier.

#8 InvalidPointer   Members   -  Reputation: 1443


Posted 03 April 2012 - 11:58 PM

Unreal may not be the best example for a newcomer as there's a *lot* of cheating/cleverness going on. Based on my understanding, things work like so:

Static lighting for static objects, both direct and indirect, is all baked into an SH lightmap. I think older versions of the engine actually used the HL2 basis and didn't capture specular, though with Lightmass that's obsolete.

Static object shadows from static light sources on dynamic objects are handled using their proprietary distance field shadows technique. I *think* this involves a sort of 'shadow edge detect' filter and then using some blending to create nice smooth shadows, but I don't know how it works for certain. Sorry! :(

Static lighting for dynamic objects is done mostly through LightEnvironments, which in non-Tim Sweeney-speak translates out to diffuse environment probes. Lightmass will generate these and the actual runtime will probably look very much like the IrradianceVolumes sample in the DXSDK. There's some interesting cleverness here with the modulated shadow system-- the game engine will extract the dominant light from the interpolated SH coefficients and then use this as a modulative shadow projection direction. While not perfect, this is actually a really clever way to handle that problem.

Dynamic lighting on static objects and dynamic lighting on dynamic objects is just your average forward renderer, though I believe that LightEnvironments will futz around with the contributions from unshadowed lights and try to bake them into the SH coefficients for shading.
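The "extract the dominant light" trick mentioned above can be sketched roughly like this. This is an assumption about the general technique (the band-1 linear SH coefficients, taken as a vector, point toward the brightest region of the lighting environment), not Epic's actual code:

```cpp
#include <cmath>

// Derive a dominant light direction from one (e.g. luminance) channel of
// 3rd-order SH coefficients. With the common layout, sh[1], sh[2], sh[3]
// are the linear terms for the y, z, x axes respectively.
void dominantDirection(const float sh[9], float out[3])
{
    float x = sh[3], y = sh[1], z = sh[2];
    float len = std::sqrt(x * x + y * y + z * z);
    if (len > 1e-6f) { x /= len; y /= len; z /= len; }
    out[0] = x; out[1] = y; out[2] = z;
}
```

The resulting direction can then serve as the modulated-shadow projection axis, as described above.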

#9 Quat   Members   -  Reputation: 414


Posted 05 April 2012 - 10:48 AM

> If you're just starting out with light probes, then you can implement them like this:
>
> 1. Pick your probe locations throughout the scene. Easiest way is a 3D grid.
> 2. For each probe location, render a cubemap by rendering in all 6 directions.
> 3. Convert the cubemap to SH (you can use the D3DX utility functions for this if you'd like, but it's not too hard to do on your own)
>
> Then at runtime you just look up and interpolate the probes, and look up the irradiance in the direction of the normal by performing an SH dot product (just make sure that you include the cosine kernel). This will give you indirect lighting, and you can add in direct lighting on top of this.


This idea sounds interesting to me. Is there a good paper/tutorial on this idea?

What world space resolution would the light probe grid be? Every meter, every 10 meters? What resolution cube maps?

Do you pass your grid of SH to your pixel shaders for adding the indirect lighting?

#10 MJP   Moderators   -  Reputation: 11741


Posted 05 April 2012 - 12:40 PM

> This idea sounds interesting to me. Is there a good paper/tutorial on this idea?


If you search for "irradiance volumes" you should be able to find some papers. This presentation gives a pretty good overview. As for converting to SH, the paper that InvalidPointer linked to has all of the details.

> What world space resolution would the light probe grid be? Every meter, every 10 meters? What resolution cube maps?


It depends on how much memory you have, and how quickly the lighting changes throughout the environment. For slow changes in lighting you really don't need too many sample points, but if you want to capture higher-frequency shadowing then you need a denser grid. If you're only storing indirect lighting then you probably won't need your grid to be very dense, since indirect lighting tends to be low-frequency.

You can get away with a fairly low-res cubemap for irradiance, since it's so low-frequency. 32x32 or 64x64 should be plenty.

> Do you pass your grid of SH to your pixel shaders for adding the indirect lighting?


You can do that if you want, in fact you can even use a volume texture so that the GPU will do the interpolation for you. Or you can do the interpolation on the CPU once per dynamic object, and just pass that result to the pixel shader.
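A minimal CPU-side sketch of that per-object interpolation might look like the following. The `Probe` struct, unit grid spacing, and single-channel coefficients are illustrative assumptions; the blend mirrors what the GPU would do when sampling a volume texture:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct Probe { float sh[9]; }; // one 3rd-order SH coefficient set per probe

// Trilinearly blend the 8 probes surrounding position (px, py, pz).
// Probes are assumed 1 unit apart with the first probe at the origin,
// laid out in a nx * ny * nz grid indexed as z*ny*nx + y*nx + x.
void interpolateProbes(const std::vector<Probe>& grid, int nx, int ny, int nz,
                       float px, float py, float pz, float out[9])
{
    int x0 = std::clamp((int)std::floor(px), 0, nx - 2);
    int y0 = std::clamp((int)std::floor(py), 0, ny - 2);
    int z0 = std::clamp((int)std::floor(pz), 0, nz - 2);
    float fx = px - x0, fy = py - y0, fz = pz - z0;

    for (int i = 0; i < 9; ++i) out[i] = 0.0f;
    for (int c = 0; c < 8; ++c) // the 8 corners of the containing cell
    {
        int dx = c & 1, dy = (c >> 1) & 1, dz = (c >> 2) & 1;
        float w = (dx ? fx : 1 - fx) * (dy ? fy : 1 - fy) * (dz ? fz : 1 - fz);
        const Probe& p = grid[(z0 + dz) * ny * nx + (y0 + dy) * nx + (x0 + dx)];
        for (int i = 0; i < 9; ++i) out[i] += w * p.sh[i];
    }
}
```

The interpolated coefficients can then be uploaded as shader constants for the object being shaded.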

#11 Frenetic Pony   Members   -  Reputation: 1396


Posted 05 April 2012 - 01:25 PM

Ubisoft has a very good overview of what they're doing for Far Cry 3 here: http://www.gdcvault....-Volumes-Global

They're also using a neat trick with precomputed radiance transfer, but the basis of it is a low-cost spherical harmonic probe grid.

#12 belfegor   Crossbones+   -  Reputation: 2716


Posted 30 September 2013 - 01:40 PM

I am sorry to bring this topic back, but I need some help.

> If you're just starting out with light probes, then you can implement them like this:
>
> 1. Pick your probe locations throughout the scene. Easiest way is a 3D grid.
> 2. For each probe location, render a cubemap by rendering in all 6 directions.
> 3. Convert the cubemap to SH (you can use the D3DX utility functions for this if you'd like, but it's not too hard to do on your own)

2. Render what exactly?

3. Do you mean the D3DXSHProjectCubeMap function?

> Then at runtime you just lookup and interpolate the probes...

Is this for a "forward" renderer? Do you pick some arbitrary number of the closest probes and interpolate those?

If I use deferred rendering, should I render probes as "volumes" (like my other lights) in screen space and add them additively?

> ...and look up the irradiance in the direction of the normal by performing an SH dot product (just make sure that you include the cosine kernel). This will give you indirect lighting, and you can add in direct lighting on top of this.

How do I use those SH coefficients exactly?

How do I look up the probe cubemap, and what do I use for texcoords?

How is this combined together?

Thank you for your time and patience.



#13 MJP   Moderators   -  Reputation: 11741


Posted 30 September 2013 - 11:49 PM

For the cubemap, you want to render exactly what you normally render to the screen: your scene being lit by direct light sources.

D3DXSHProjectCubeMap is indeed the function that I was referring to.

It really doesn't matter whether you use forward rendering or deferred rendering, you can use light probes in either setup. For forward rendering, you need to sample the probes in the vertex or pixel shader and add the probe contribution to the lighting that you compute for direct light sources. For deferred rendering you can do it exactly the same way if you wish, by doing the same thing during the G-Buffer pass. Or alternatively you can add in the probes in a deferred pass, as described in this thread. There are advantages and disadvantages to both approaches.

Interpolating your probes depends on how you organize them. If the probes are in a regular grid, then you can do linear interpolation by grabbing the 8 neighboring samples and blending between them just like a GPU would when sampling a volume texture (in fact you can even store probes in a volume texture and let the GPU do the interpolation for you). If your probes aren't organized into any structure (just placed at arbitrary points), then it gets a bit more complicated. The blog post that was linked in the OP actually has a good overview of different approaches.

 

Your SH coefficients will be a set of RGB values that you store in an array. Typically you'll work with 3rd-order SH, which gives you 9 coefficients for a total of 27 floating-point values. These coefficients give you a very low-frequency version of the lighting environment at the probe location, which means they can tell you a very blurry version of the lighting in a particular direction. To compute the irradiance for a surface with a given normal direction, you construct a cosine lobe centered at the normal direction and integrate it against the lighting environment. This is actually really efficient to do in SH: it amounts to computing 9 coefficients with a bit of math and then performing a dot product with the lighting environment coefficients. Ravi's paper from 2001 covers all of the details, and I would suggest at least attempting to read through it a few times to become familiar with the process.
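The projection step itself (conceptually what D3DXSHProjectCubeMap does over a cubemap's texels) can be sketched numerically. Here `radiance` is an illustrative stand-in for sampling your rendered cubemap, and the spherical quadrature loop is an assumption for demonstration, not the D3DX implementation:

```cpp
#include <cmath>

// Real 3rd-order SH basis evaluated at unit direction (x, y, z).
static void shBasis(float x, float y, float z, float Y[9])
{
    Y[0] = 0.282095f;
    Y[1] = 0.488603f * y;  Y[2] = 0.488603f * z;  Y[3] = 0.488603f * x;
    Y[4] = 1.092548f * x * y;  Y[5] = 1.092548f * y * z;
    Y[6] = 0.315392f * (3.0f * z * z - 1.0f);  Y[7] = 1.092548f * x * z;
    Y[8] = 0.546274f * (x * x - y * y);
}

// Stand-in for sampling the baked cubemap: a uniform environment.
float radiance(float, float, float) { return 1.0f; }

// Project the environment onto 9 SH coefficients (one channel):
// L[k] = integral over the sphere of radiance(w) * Y_k(w) dw.
void projectToSH(float L[9])
{
    const float PI = 3.14159265f;
    const int N = 64;
    for (int k = 0; k < 9; ++k) L[k] = 0.0f;
    for (int i = 0; i < N; ++i)
        for (int j = 0; j < N; ++j)
        {
            float theta = PI * (i + 0.5f) / N, phi = 2.0f * PI * (j + 0.5f) / N;
            float x = std::sin(theta) * std::cos(phi);
            float y = std::sin(theta) * std::sin(phi);
            float z = std::cos(theta);
            float dw = std::sin(theta) * (PI / N) * (2.0f * PI / N); // solid angle
            float Y[9]; shBasis(x, y, z, Y);
            for (int k = 0; k < 9; ++k) L[k] += radiance(x, y, z) * Y[k] * dw;
        }
}
```

A real baker would loop over cubemap texels (weighted by their solid angle) instead of this spherical grid, but the accumulation is the same idea.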





