The last couple of days I've been reading about image-based lighting, spherical harmonics, and techniques like precomputed radiance transfer, trying to get a grip on how they are used and implemented.
Now that I seem to be getting closer to understanding how all of this works, a few questions have popped up, and I hoped you guys could help me with them.
So here I go:
1. If I understood correctly, both of these approaches try to approximate indirect lighting from the environment.
But when I read about spherical harmonic lighting, people still talk about using an environment map to precompute parts of the lighting equation and storing the result as SH coefficients, which is then what gets called spherical harmonic lighting. But if you use an "image" to compute the incoming radiance, what is the difference from "image-based lighting"? Is it the same thing, or a combination of both? And then there's also talk about "prefiltering" an environment map(?)
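To make question 1 concrete, here is how I currently picture the SH projection step: sample the incoming radiance over the sphere and accumulate it against the SH basis functions. A minimal Monte Carlo sketch in Python (only the first two SH bands; the sample count and the radiance function are placeholders I made up):

```python
import math
import random

def uniform_sphere_samples(n, seed=0):
    """Generate n uniformly distributed unit direction vectors."""
    rng = random.Random(seed)
    dirs = []
    for _ in range(n):
        z = 1.0 - 2.0 * rng.random()
        phi = 2.0 * math.pi * rng.random()
        r = math.sqrt(max(0.0, 1.0 - z * z))
        dirs.append((r * math.cos(phi), r * math.sin(phi), z))
    return dirs

def sh_basis_l1(d):
    """Real SH basis, bands l=0 and l=1 (4 coefficients)."""
    x, y, z = d
    return [0.282095, 0.488603 * y, 0.488603 * z, 0.488603 * x]

def project_to_sh(radiance, n=10000):
    """Monte Carlo projection of a spherical radiance function onto SH."""
    coeffs = [0.0, 0.0, 0.0, 0.0]
    for d in uniform_sphere_samples(n):
        L = radiance(d)
        for i, y in enumerate(sh_basis_l1(d)):
            coeffs[i] += L * y
    scale = 4.0 * math.pi / n  # MC weight for uniform sphere sampling
    return [c * scale for c in coeffs]
```

If this picture is right, "SH lighting" would then just mean evaluating these few coefficients at shading time instead of sampling the environment map directly, with the "image" only used during projection.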
2. When people talk about IBL, they mostly mean indirect lighting using an environment probe. But this is where some confusion comes in, since the meaning of IBL seems to differ quite a lot: there's distant IBL and local IBL. I understand that distant IBL would only make sense for rendering the hemisphere (the sky) to simulate sky lighting on the scenery. Another meaning of IBL seems to be "glossy reflections", but as far as I can tell, isn't that just good old environment mapping? Or is it special because of the "local" technique being used? If not, how does it differ, and how does it work?
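Part of why I suspect "glossy IBL" is related to environment mapping: the mirror lookup direction is just the view vector reflected about the surface normal, same as classic environment mapping. Sketch in Python, following the GLSL reflect() convention (the glossy part would then, I assume, come from sampling a prefiltered, blurrier mip of the map in that direction):

```python
def reflect(incident, normal):
    # GLSL-style reflect: incident points toward the surface,
    # normal is assumed to be unit length.
    d = sum(i * n for i, n in zip(incident, normal))
    return tuple(i - 2.0 * d * n for i, n in zip(incident, normal))
```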
3. Now for something more practical: if I were to implement image-based lighting in my application, the first thing to do would be placing environment probes in the scene. Each probe renders the final scene image of its surroundings into six render targets (the faces of a cube map) as a precompute pass, correct? That basically means the reflections I get for everything will be completely static, or not?
I've read somewhere that you could render a single probe at your player's position (in real time?) as a solution to this problem.
But where exactly would I place it? On top of the player's head?
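For reference, this is how I understood the probe render: one camera position, six 90° views, one per cube-map face. The forward/up pairs below follow what I believe is the usual convention, but axis conventions differ between APIs, so treat them as an assumption:

```python
# (face name, forward, up) for a 90-degree-FOV render per cube map face.
# Axis conventions vary between APIs, so these pairs are an assumption.
CUBE_FACES = [
    ("+X", ( 1.0, 0.0, 0.0), (0.0, -1.0,  0.0)),
    ("-X", (-1.0, 0.0, 0.0), (0.0, -1.0,  0.0)),
    ("+Y", ( 0.0, 1.0, 0.0), (0.0,  0.0,  1.0)),
    ("-Y", ( 0.0,-1.0, 0.0), (0.0,  0.0, -1.0)),
    ("+Z", ( 0.0, 0.0, 1.0), (0.0, -1.0,  0.0)),
    ("-Z", ( 0.0, 0.0,-1.0), (0.0, -1.0,  0.0)),
]

def dot(a, b):
    """Sanity helper: forward and up of each face should be orthogonal."""
    return sum(x * y for x, y in zip(a, b))
```

So for a player-following probe, my guess is you'd center these six views on the player (roughly camera/head height) and re-render them every frame, or amortize one face per frame.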
4. This is a design question. In my deferred renderer I have a handful of material types predefined, e.g. default, skin, sky, ...
Would I just assume that everything is reflective to some degree? Or would I need to define a material property for this? If so, I'd also need another variable telling the lighting shader how reflective the material is. Is there any good way around this? I've never seen a g-buffer layout with this kind of parameter packed in somewhere.
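In case it helps frame the question: what I was considering is reserving one spare 8-bit g-buffer channel (say, the alpha of one of my existing targets; that choice is purely my assumption) for a [0,1] reflectivity value, quantized like this:

```python
def pack_unorm8(value):
    """Quantize a [0,1] float into a single 8-bit g-buffer channel."""
    clamped = max(0.0, min(1.0, value))
    return int(round(clamped * 255.0))

def unpack_unorm8(byte):
    """Recover the [0,1] value in the lighting shader."""
    return byte / 255.0
```

The precision loss (about 1/255) seems acceptable for a reflectivity scale, but maybe there's a more standard way to lay this out?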
5. Finally, which of these advanced or indirect lighting approximation techniques would you recommend reading up on and trying to implement? Is PRT worth getting into? I'm planning to implement dynamic time-of-day changes later on, so something dynamic, at least for the sky, would be quite nice.
Edited by lipsryme, 01 January 2013 - 03:30 PM.