Shirakana2

Global illumination vs Real-time


Hi all! My current project involves studying methods and algorithms for real-time global illumination. The main goal is to adapt theory from current research and integrate it into a next-gen video game engine. I have started reading a lot of research papers, theses, and publications, but none of them can really be applied in real time or meet the requirements of a video game. For now, the best candidates I have found are the PRT/SH techniques and irradiance volumes. Before adapting and developing these, I would like to know if any of you have already tried to implement this kind of method. I also know that next-gen games already use advanced lighting techniques, like Halo, Crysis, or Unreal 3, or even middleware like Lightsprint, Fantasy Lab, or Geometrics... but there is no way to know how they did it... Thanks in advance to anyone who has already studied or implemented this kind of thing for their tips and tricks.

It depends on your requirement list. If you have a time-of-day feature, everything needs to be dynamic; if you don't, you can store lots of data in textures.
Assuming next-gen games all have a time-of-day feature, you can say that everyone is using a hack here and there, but no one has shipped a game with a generic GI solution so far.

Quote:
Original post by Shirakana2
I also know that next-gen games already use advanced lighting techniques, like Halo, Crysis, or Unreal 3, or even middleware like Lightsprint, Fantasy Lab, or Geometrics... but there is no way to know how they did it...


There's some information about how Lightsprint works here if that's any help.

Quote:
Original post by Shirakana2
I also know that next-gen games already use advanced lighting techniques, like Halo, Crysis, or Unreal 3, or even middleware like Lightsprint, Fantasy Lab, or Geometrics... but there is no way to know how they did it...


Michael Bunnell of Fantasy Lab wrote two chapters in GPU Gems 2 that describe his algorithms for displacement mapping and real-time lighting. The lighting one is available from the NVIDIA site.

Yup, I have already read that. It covers AO (which to me is not global illumination) combined with two-bounce indirect lighting, and it produces good-looking results. But on the Fantasy Lab page he claims that he uses neither AO, nor PRT/SH, nor a radiosity method o_O.

The method described in the chapter is not strictly AO; since you can add as many bounces as you want, it is more like a GI solution. I would be shocked if Fantasy Lab were not using either exactly this technique or a modification of it.

If I were working on a real-time GI system, this is the technique I would use (at least as a starting point).
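For reference, the heart of that chapter is an approximate form factor between small oriented discs, evaluated per surface element on the GPU. A minimal CPU-side sketch of that idea (my own paraphrase with hypothetical names, not Bunnell's actual code):

```cpp
// Sketch of a disc-to-disc form-factor approximation in the spirit of
// Bunnell's GPU Gems 2 chapter; all names are hypothetical, and the real
// implementation runs on the GPU over a hierarchy of surface elements.
#include <cmath>
#include <algorithm>

struct Vec3 { float x, y, z; };

static float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Approximate form factor from a receiver point (pR, normal nR) to an
// emitter disc (center pE, normal nE, area A), using the classic
// point-to-disc approximation F ~ A*cosE*cosR / (pi*d^2 + A).
float discFormFactor(const Vec3& pR, const Vec3& nR,
                     const Vec3& pE, const Vec3& nE, float A)
{
    Vec3 v = { pE.x - pR.x, pE.y - pR.y, pE.z - pR.z };
    float d2 = dot(v, v) + 1e-6f;               // squared distance, guarded
    float invD = 1.0f / std::sqrt(d2);
    Vec3 dir = { v.x * invD, v.y * invD, v.z * invD };
    float cosR = std::max(0.0f, dot(nR, dir));  // receiver faces the disc
    float cosE = std::max(0.0f, -dot(nE, dir)); // disc faces the receiver
    return A * cosE * cosR / (3.14159265f * d2 + A);
}
```

Summing this factor over all emitter discs (with a hierarchy to keep it sub-quadratic) gives occlusion, and feeding emitter radiance through the same factor while iterating the pass adds one bounce of indirect light each time, which is why it behaves more like GI than plain AO.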

There's quite a lot of information on Crysis' lighting model in the paper they presented at Siggraph 2007; if you have access to the Siggraph proceedings or an ACM Digital Library account you should be able to find it. There's also a fair bit of detail on how Valve does things in the Source engine at http://www.valvesoftware.com/publications.html. Neither of them is doing full real-time global illumination: Half-Life 2 uses a lot of pre-baked radiosity data, and Crysis uses a combination of tricks to approximate some GI effects.

Have you read "Simulating Photon Mapping for Real-time Applications"? There are also some papers about doing photon mapping entirely on the GPU.
Then there is "Caustics Mapping: An Image-space Technique for Real-time Caustics", which uses a similar approach to the paper above.
For subsurface scattering there is "Real-time Subsurface Scattering in Image Space" (same author(s) as the previous paper).
If you only need diffuse lighting, then "Instant Radiosity" might work well (add deferred shading for more speed; see the sketch below).
You have already discovered the PRT/SH-based lighting methods, which can cover SSS, indirect lighting, and so on.

And you could use plain old lightmaps generated by photon mapping (e.g. with the q3map2 map compiler). That is valid real-time global illumination :-D
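Since "Instant Radiosity" came up above, here is the core idea in a nutshell: trace a few paths from the light, drop a virtual point light (VPL) at each surface hit, and then shade the scene with all the VPLs as ordinary point lights. A minimal single-bounce sketch; the trace callback, the struct fields, and all names are hypothetical:

```cpp
// Minimal instant-radiosity sketch: scatter virtual point lights (VPLs)
// from the primary light, then shade with them as ordinary point lights.
// TraceFn and every struct field here are assumptions for illustration.
#include <cmath>
#include <random>
#include <vector>

struct Vec3 { float x, y, z; };
struct Hit  { bool valid; Vec3 position, normal, diffuse; };
struct VPL  { Vec3 position, normal, flux; };

static float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

using TraceFn = Hit (*)(Vec3 origin, Vec3 dir);  // assumed ray cast into your scene

std::vector<VPL> generateVPLs(TraceFn trace, Vec3 lightPos, Vec3 lightFlux, int count)
{
    std::mt19937 rng(1234);
    std::uniform_real_distribution<float> u(-1.0f, 1.0f);
    std::vector<VPL> vpls;
    for (int i = 0; i < count; ++i) {
        // Rejection-sample a direction on the unit sphere, then normalize.
        Vec3 d;
        do { d = { u(rng), u(rng), u(rng) }; } while (dot(d, d) > 1.0f || dot(d, d) < 1e-4f);
        float inv = 1.0f / std::sqrt(dot(d, d));
        d = { d.x * inv, d.y * inv, d.z * inv };

        Hit hit = trace(lightPos, d);
        if (!hit.valid) continue;
        // One-bounce VPL: the surface re-emits what it received, tinted by
        // its diffuse albedo, with the total flux split across all paths.
        vpls.push_back({ hit.position, hit.normal,
                         { lightFlux.x * hit.diffuse.x / count,
                           lightFlux.y * hit.diffuse.y / count,
                           lightFlux.z * hit.diffuse.z / count } });
    }
    return vpls;
}
```

Each VPL is then accumulated like a normal point light, ideally in a deferred pass, since you can easily end up with a few hundred of them.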

You did not list any requirements ("I want to do real-time GI" isn't a requirement at all), and global illumination is a very broad field... so work out your requirements and maybe we can help you in a better way.

I posted a lot of questions about this over the last few months; it might be worth searching for them.

Anyway, I'm looking for a (fast) real-time ambient/indirect lighting method as well. So far I have built a system that collects light from 6 directions in nodes (placed manually, typically near vertices). You can think of such a node as a cubemap with 1x1-sized faces: each face is one color, coming from one direction. To avoid a lot of texture switching, all the cubemaps are packed into a single large 3D texture.

Each vertex is connected to a node (so you can blend between 3 nodes on a polygon). The vertex shader reads the 6 colors belonging to that node from the 3D texture and passes them to the fragment shader. The fragment shader computes a world-space normal (possibly using a normal map) and uses that normal to blend between the colors.
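For what it's worth, one common way to do that final blend is Valve's "ambient cube" weighting: each of the six face colors is weighted by the squared normal component along its axis. A small CPU-side sketch of just the blend (hypothetical names and face layout; in practice this lives in the fragment shader):

```cpp
// Blend six directional node colors by a world-space normal, ambient-cube
// style; the face layout below is an assumption, not the poster's code.
struct Vec3 { float x, y, z; };

// faces[0..5] = colors for +X, -X, +Y, -Y, +Z, -Z
Vec3 blendNode(const Vec3 faces[6], const Vec3& n)
{
    // Squared components of a unit normal sum to 1, so they can be used
    // directly as blend weights with no renormalization.
    float wx = n.x * n.x, wy = n.y * n.y, wz = n.z * n.z;
    const Vec3& fx = (n.x >= 0.0f) ? faces[0] : faces[1];
    const Vec3& fy = (n.y >= 0.0f) ? faces[2] : faces[3];
    const Vec3& fz = (n.z >= 0.0f) ? faces[4] : faces[5];
    return { wx * fx.x + wy * fy.x + wz * fz.x,
             wx * fx.y + wy * fy.y + wz * fz.y,
             wx * fx.z + wy * fy.z + wz * fz.z };
}
```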


It works on my GeForce 8800. Rendering the world with these nodes is fast, no problem. Only updating those damn nodes is trickier: each node has to render the surrounding world 6 times (like a cubemap), and the 64x64 render has to be scaled down to 1x1 (actually 2x2 in my case). That takes time. I can update ~20 nodes per frame before it really gets slow, and it will probably be fewer nodes once the world around a node gets more complex.

So maybe I can update 4 nodes per frame in the end (and still do all the other stuff without problems). However, most nodes don't need to be updated all the time, and if I get 40 frames per second, I can still update 4 × 40 = 160 nodes per second. Not that bad, is it?
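That amortization is easy to structure as a per-frame budget with a round-robin cursor: a sketch, assuming a dirty flag per node and a hypothetical renderNodeCubemap() that does the 6 small renders plus the downsample:

```cpp
// Per-frame node-update budget: refresh at most `budget` nodes per frame,
// dirty ones first, then round-robin through the rest so every node is
// eventually revisited. Node and renderNodeCubemap() are assumptions.
#include <cstddef>
#include <vector>

struct Node { bool dirty = false; /* position, 6 face colors, ... */ };

void renderNodeCubemap(Node& node);  // assumed: 6 small renders + downsample

void updateNodes(std::vector<Node>& nodes, std::size_t& cursor, int budget)
{
    if (nodes.empty()) return;
    int updated = 0;
    // Spend the budget on explicitly dirtied nodes first.
    for (Node& n : nodes) {
        if (updated >= budget) return;
        if (n.dirty) { renderNodeCubemap(n); n.dirty = false; ++updated; }
    }
    // Spend the remainder walking the list round-robin.
    while (updated < budget) {
        renderNodeCubemap(nodes[cursor]);
        nodes[cursor].dirty = false;
        cursor = (cursor + 1) % nodes.size();
        ++updated;
    }
}
```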


However, there's another problem. The normal I calculate in the fragment shader is used to blend between the colors; basically it's the same as picking a pixel from a cubemap with a reflection vector. That means I only get the indirect lighting arriving straight along the pixel's normal. In reality, a piece of surface also gathers light arriving from a wider range of angles (Lambert's cosine law). Maybe I can "simulate" this by taking multiple samples around the normal, but I don't know how to generate x new vectors bent 45 degrees away from the original normal.
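Generating those bent vectors is standard cone sampling: build a tangent basis around the normal, tilt each sample by the cone angle, and space the samples evenly in azimuth. A self-contained sketch (my own construction, hypothetical names):

```cpp
// Generate `count` unit vectors tilted `angleRad` away from a unit normal,
// evenly spaced in azimuth around it (cone sampling).
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3 cross(const Vec3& a, const Vec3& b) {
    return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
}
static Vec3 normalize(const Vec3& v) {
    float len = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    return { v.x/len, v.y/len, v.z/len };
}

std::vector<Vec3> coneSamples(const Vec3& n, int count, float angleRad)
{
    // Pick any axis not parallel to n to build an orthonormal basis (t, b, n).
    Vec3 up = (std::fabs(n.z) < 0.999f) ? Vec3{0, 0, 1} : Vec3{1, 0, 0};
    Vec3 t = normalize(cross(up, n));  // tangent
    Vec3 b = cross(n, t);              // bitangent
    float s = std::sin(angleRad), c = std::cos(angleRad);
    std::vector<Vec3> out;
    for (int i = 0; i < count; ++i) {
        float phi = 6.2831853f * i / count;    // even spacing around the normal
        float cp = std::cos(phi), sp = std::sin(phi);
        // dir = cos(angle)*n + sin(angle)*(cos(phi)*t + sin(phi)*b)
        out.push_back({ c*n.x + s*(cp*t.x + sp*b.x),
                        c*n.y + s*(cp*t.y + sp*b.y),
                        c*n.z + s*(cp*t.z + sp*b.z) });
    }
    return out;
}
```

For example, coneSamples(n, 4, 3.14159f / 4.0f) gives four unit directions tilted 45° away from n; averaging the lookups along them (optionally cosine-weighted) approximates the wider gathering cone.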



In the end, I don't know if it's worth it. Of course, it's dynamic, and if you add something like SSAO you get some of the smaller details as well. But it costs a lot, and a pre-calculated lightmap simply looks better for now. There is one small advantage though: a traditional lightmap can't be used with a normal map, unless you do something like Half-Life 2 did. The method I described above can do normal mapping without a problem. In fact, it's recommended to do that, since it makes the result look more varied (otherwise you get a "per-vertex lighting" look).

Don't know if this is a good method, but I hope the info was useful.
Good luck!
Rick
