Recently, I started looking into adding realism to my scenes (mostly indoor). I have a deferred renderer and I'm happy with it, but every time I look at globally illuminated scenes I realize how much proper lighting contributes to realism. So I started looking into adding some form of real-time global illumination to my scenes and chose to play around with some form of precomputed radiance transfer and light probes.
I put a camera in a corner and rendered the scene six times, once per face, to fill a cubemap and capture the surrounding illumination.
Now I want to sample that cubemap from my deferred lighting pass, using my pixel normals to find out what color is near that point and eventually bleed it in. That part works as expected, but during my research I read about compressing the low-frequency illumination on a sphere using spherical harmonics. Luckily, D3DX has nice utility functions to compute spherical harmonics from a cubemap. So far I have two questions:
Am I on the right track with this? From what I can see, other demos compute the spherical harmonics lighting per vertex. I'd rather do it per pixel in my deferred lighting pass; is that viable?
How do I sample a spherical harmonics "matrix" (a set of float coefficients) using my normal direction to get the illumination value at that point? Right now I sample my cubemaps with my direction normals, but I want to use spherical harmonics to store the data more efficiently.
Thanks in advance.