Realtime Ambient Occlusion, part 2

Hi,

A couple of days ago I asked about ambient lighting. With the help of you guys here, I managed to create ambient occlusion lighting via a static baked map. After seeing a demo of the Crysis Sandbox editor, realtime AO seems very attractive to me. Implementing SSAO itself might not be that difficult, but I wonder how they link the ambient colors from the surrounding world. The occlusion factor is one thing, but I also need an ambient color and light direction(s) for normalMapping.

In the previous post Harry Hunt came up with the idea of placing nodes in the world that measure the incoming light at that point. These nodes can be converted to a cubeMap so that I can perform normalMapping with the ambient light, or I can pass the colors from a node to the nearby vertices. This could be a realtime solution: if I change the node colors, the ambient light of the nearby surfaces changes as well. But... how do I measure the incoming colors of such a node fast enough to make it realtime?

Currently I was planning to render the surrounding world in multiple passes for a radiosity effect. The "ambient nodes" would measure the surrounding colors for 6 sides and store that. But obviously, that is too slow for realtime applications. I wonder how games like Crysis do it. Does anyone have a good idea? I could simply use the light colors from nearby lights around a node, but I don't know if that gives the desired (indirect light) effect. It's also hard to tell if there is something between the node and the lightSource.

Greetings,
Rick
You could render at a very low resolution, and only render large, preferably close, objects.
I don't think that is an option (yet). Even if I rendered the surrounding scene in a simple way at a low resolution, it would still give problems. If I render the surrounding area like you would for a cubeMap, I need to render the scene 6 times per node. Maybe that's not a problem for 1 node, but there will definitely be more (in fact, the more, the better). At least each corner of a room should have a node. Swapping FBO buffers already gives me slowdowns, and with so many passes even the vertex processing could become a problem...

Not all nodes need to be updated all the time, of course. In many cases the lighting remains static. But figuring out which nodes should be updated when a light or a part of the scenery changes is hard to manage.
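The kind of bookkeeping I'm thinking of would roughly look like this (just a sketch; RenderProbeCubemap stands for the expensive 6-face render, and the dirty flag would be set whenever a light or a piece of scenery near the node changes):

```cpp
#include <vector>

// One ambient node / probe. Only the update bookkeeping is shown here.
struct Probe
{
    bool dirty;
    // position, cubemap handle, etc. would go here
};

void RenderProbeCubemap(int probeIndex); // the expensive 6-face render (placeholder)

// Refresh at most 'budgetPerFrame' dirty probes per frame, so the cost
// stays bounded no matter how many probes the level contains.
void UpdateProbes(std::vector<Probe>& probes, int budgetPerFrame)
{
    int updated = 0;
    for (int i = 0; i < (int)probes.size() && updated < budgetPerFrame; ++i)
    {
        if (!probes[i].dirty)
            continue;
        RenderProbeCubemap(i);
        probes[i].dirty = false;
        ++updated;
    }
}
```

A smarter version could sort the dirty probes by distance to the camera first, but even a fixed budget like this would keep the per-frame cost under control.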

I downloaded the Crysis demo because I wanted to see how the (realtime) ambient looks there. But for some reason the demo won't do any shadowing. It looks worse than good old Farcry, and it still runs terribly slow. I guess my computer is not up to the task :|
Greetings,
Rick
Quote:Original post by spek
In the previous post Harry Hunt came up with the idea of placing nodes in the world that measure the incoming light at that point. These nodes can be converted to a cubeMap so that I can perform normalMapping with the ambient light, or I can pass the colors from a node to the nearby vertices. This could be a realtime solution: if I change the node colors, the ambient light of the nearby surfaces changes as well. But... how do I measure the incoming colors of such a node fast enough to make it realtime?

This sounds like what Half-Life 2 is doing. They place ambient cubemaps throughout a level, and these are then used for rendering. However, I think they use prerendered cubemaps for this...

--
Correct, they were pre-calculated. I'm trying to do the same thing, but if possible they should update themselves. HL2 measures the (indirect) light from the surrounding environment by rendering it with the radiosity lightMap. In my case there is no lightMap; everything is lit with realtime lighting (shadowMaps). Besides the cost of updating these nodes, it's hard to find out which ambient/indirect light a node catches. Crysis managed to do it somehow though...

Greetings,
Rick
Quote:Original post by spek
Correct, they were pre-calculated. I'm trying to do the same thing, but if possible they should update themselves. HL2 measures the (indirect) light from the surrounding environment by rendering it with the radiosity lightMap. In my case there is no lightMap; everything is lit with realtime lighting (shadowMaps). Besides the cost of updating these nodes, it's hard to find out which ambient/indirect light a node catches.

I am not sure what you are looking for. Are you looking for a way to calculate ambient lighting in realtime, or indirect diffuse lighting? Both may be closely related, but they might need different techniques.

When you know what you want, you need to ask yourself questions like "Is precomputation possible? If not, why not?". When you have answers to these questions, we can help you in a better way ;)
--
Pre-calculated is certainly possible. I did it before, and since much of the lighting is pretty static, a pre-calculated solution won't hurt. Most probably it even gives better, and certainly faster, results.

However, I was interested in realtime solutions like the one Crysis somehow managed to pull off. I guess pre-calculated maps won't disappear soon, but realtime lighting is the future, I think. In my case it's just for learning new/different techniques. And it's nice for my level editor. Earlier we had to wait hours before a lightMap was finished, check if it was any good, and then do it all over again. With a realtime solution an artist can directly see the result in the map editor. Nice in our case, since we don't have that much time for our hobby :)


I don't know the exact difference between indirect diffuse lighting and ambient lighting. As far as I know, ambient light doesn't really exist in the real world. But that is not a problem; we are not trying to achieve 100% physically correct lighting... it just has to look nice.

In our case there are many indoor locations with just a few lights. We are using deferred lighting and shadow maps. It looks nice, but everything outside the light spots is pitch dark. Since there are not many lights (think of dark basements, corridors, industrial settings), ambient light really becomes necessary. And we also want to apply normalMapping to the ambient portion. This is possible if those nodes measure the light like a cubeMap (just like HL2 performed normalMapping on dynamic objects via "ambient cubes").
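To give an idea of what I mean with those nodes, here is a rough sketch of how a surface could look up its ambient color from one of them, HL2 "ambient cube" style (the names are just placeholders, not code from my engine):

```cpp
struct Color { float r, g, b; };

// One "ambient node": six colors, one per axis-aligned direction
// (+X, -X, +Y, -Y, +Z, -Z), measured at the node's position.
struct AmbientNode
{
    float position[3];
    Color faces[6];
};

// Evaluate the node for a world-space (normalMapped) normal: each face
// contributes proportionally to the squared normal component pointing
// towards it, so the three weights always sum to 1.
Color EvaluateAmbientCube(const AmbientNode& node, const float normal[3])
{
    Color result = { 0.0f, 0.0f, 0.0f };
    for (int axis = 0; axis < 3; ++axis)
    {
        float n = normal[axis];
        const Color& face = node.faces[axis * 2 + (n >= 0.0f ? 0 : 1)];
        float weight = n * n;
        result.r += face.r * weight;
        result.g += face.g * weight;
        result.b += face.b * weight;
    }
    return result;
}
```

In the real thing this would of course happen in the fragment shader, with the six colors passed as uniforms or stored in a tiny cubeMap.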

I think a relatively simple solution would be to check which lightSources are near a node, and add their color to 1 or more sides of the cube (depending on the angle between the node and the light). It doesn't matter if that light is blocked by some small obstacles, since that somewhat simulates the indirect lighting effect. However, if a big wall is between the node and the light, the light shouldn't be used. That part might be tricky to calculate.
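Something along these lines is what I have in mind (again just a sketch; BigWallBetween is a stand-in for whatever coarse visibility test I end up with, for example a ray cast against the static world):

```cpp
#include <algorithm>
#include <cmath>

struct Color { float r, g, b; };
struct Vec3  { float x, y, z; };

// Stand-in for the coarse visibility test (ray cast against the static
// world, a low-res shadow map lookup, ...). Small obstacles can be ignored.
bool BigWallBetween(const Vec3& nodePos, const Vec3& lightPos)
{
    return false; // to be implemented
}

// Splat one light into the node's six face colors (+X,-X,+Y,-Y,+Z,-Z),
// weighted by how well the light direction lines up with each axis.
void AccumulateLight(Color faces[6], const Vec3& nodePos,
                     const Vec3& lightPos, const Color& lightColor,
                     float lightRadius)
{
    if (BigWallBetween(nodePos, lightPos))
        return;

    Vec3 d = { lightPos.x - nodePos.x, lightPos.y - nodePos.y, lightPos.z - nodePos.z };
    float dist = std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z);
    if (dist < 0.0001f || dist > lightRadius)
        return;
    d.x /= dist; d.y /= dist; d.z /= dist;

    float atten = 1.0f - dist / lightRadius;   // simple linear falloff
    const float comps[3] = { d.x, d.y, d.z };
    for (int axis = 0; axis < 3; ++axis)
    {
        // A light towards +X mostly feeds the +X face, and vice versa for -X.
        float wPos = std::max(comps[axis], 0.0f) * atten;
        float wNeg = std::max(-comps[axis], 0.0f) * atten;
        faces[axis * 2].r     += lightColor.r * wPos;
        faces[axis * 2].g     += lightColor.g * wPos;
        faces[axis * 2].b     += lightColor.b * wPos;
        faces[axis * 2 + 1].r += lightColor.r * wNeg;
        faces[axis * 2 + 1].g += lightColor.g * wNeg;
        faces[axis * 2 + 1].b += lightColor.b * wNeg;
    }
}
```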

This is not a very realistic approach of course, but maybe it's good enough... I found a few shots of indoor locations in Crysis, and I doubt they are using a much more realistic approach there. There is probably a reason that 99 out of 100 screenshots are outdoor locations... the indoor ones are less spectacular.

http://crysis-online.com/Media/Screenshots/Screenshots/Carrier-01.jpg
http://media.pc.ign.com/media/694/694190/img_4948263.html
http://media.pc.ign.com/media/694/694190/img_4948266.html
http://media.pc.ign.com/media/694/694190/img_3533351.html

The realtime ambient occlusion does 50% of the job. But they also need the colors, and possibly light directions, of the (indirect) light for the normalMapping.

Greetings,
Rick
Ok, then I do not see why you do not want to create the cubemap probes dynamically at runtime. The cubemaps can be very small (maybe 32x32) and do not need any fancy filtering or anti-aliasing. And you do not need to update them every frame ;)

Or maybe you should try Spherical Harmonic lighting, which is good for this kind of ambient style lighting...
--
If I understand you right, you would:
- Place the camera in a node, point it towards 1 of the 6 sides
- Render the area in front of the camera with "simple" shading, on a low resolution
- Each surface gets its node(cubeMap) for the ambient part

You think this can be done fast enough? I have my doubts... I can indeed measure incoming light by rendering the scene without fancy effects: just render the lights (shadowMaps in my case), multiply with the surface "reflection" color (no texturing/normalMapping/etc.), and possibly add emissive.
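Per node it would roughly come down to this (assuming the framebuffer object extension, or its EXT-suffixed variants on older drivers; SetCameraForCubeFace and RenderSceneSimple are placeholders for my own code):

```cpp
#include <GL/gl.h>   // plus glext.h / an extension loader for the FBO entry points

void SetCameraForCubeFace(const float pos[3], int face); // 90 degree FOV, face orientation
void RenderSceneSimple();                                // lights * surface color, nothing fancy

// 'fbo' already has a 32x32 depth attachment, 'cubeTex' is a 32x32 HDR cubemap.
void RenderProbe(GLuint fbo, GLuint cubeTex, const float probePos[3])
{
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glViewport(0, 0, 32, 32);

    for (int face = 0; face < 6; ++face)
    {
        // Attach the current cube face as the color target and render into it.
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                               GL_TEXTURE_CUBE_MAP_POSITIVE_X + face,
                               cubeTex, 0);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

        SetCameraForCubeFace(probePos, face);
        RenderSceneSimple();
    }

    glBindFramebuffer(GL_FRAMEBUFFER, 0);
}
```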

However, everything is done on HDR buffers, which makes it a little bit slower. And a simple square room should have at least 4 nodes, I think (unless the lighting really isn't that different for each corner). That would be 6 x 4 = 24 passes. Maybe that is still possible, and maybe I can do with fewer nodes. But I also expect more nodes for larger, complex scenes (outdoors, or a big warehouse or something).

Another little downside about using cubeMaps is that I can use only 1 per polygon. If I pass colors per vertex, I can blend between 3 nodes, which is especially nice for larger surfaces. I could get these colors by downscaling the 32x32 renderings to 1x1 and picking that pixel, but that would be much slower (reading back colors is slow, and so is switching between FBO's).
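One thing I could try instead of a manual downscale pass: let the driver build the mipmaps and read back the 1x1 level (a sketch; glGenerateMipmap may be the EXT-suffixed variant depending on the driver). It's still a readback, so still not free:

```cpp
#include <GL/gl.h>   // plus glext.h / an extension loader for the newer entry points

// Average each 32x32 face down to a single color by reading back the
// smallest mip level. For a 32x32 texture, level 5 is the 1x1 mip.
void ReadProbeAverages(GLuint cubeTex, float outColors[6][3])
{
    glBindTexture(GL_TEXTURE_CUBE_MAP, cubeTex);
    glGenerateMipmap(GL_TEXTURE_CUBE_MAP);   // 32x32 -> 16 -> 8 -> 4 -> 2 -> 1

    for (int face = 0; face < 6; ++face)
    {
        glGetTexImage(GL_TEXTURE_CUBE_MAP_POSITIVE_X + face,
                      5, GL_RGB, GL_FLOAT, outColors[face]);
    }
}
```

That would give me one color per face to pass on to the nearby vertices.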



Probably this will be possible sooner or later with faster cards. And maybe it's already possible, but I have to add that a lot of power already goes to deferred lighting, tonemapping, blurring shadowMaps, and so on...

I don't know how Spherical Harmonic lighting works, I shall look into that.
In the end it doesn't matter that much if we have to use pre-calculated nodes, especially not if the quality is much better. But I'd like to explore the currently available options. I hadn't been doing graphics programming for 1.5 years; I picked the hobby up again 2 months ago, so I want to be up-to-date again :)

Thanks for the tips!
Rick
Quote:Original post by spek
If I understand you right, you would:
- Place the camera in a node, point it towards 1 of the 6 sides
- Render the area in front of the camera with "simple" shading, on a low resolution
- Each surface gets its node(cubeMap) for the ambient part

You forgot to set the viewport to 32x32 for each cubemap face ;)

Quote:You think this can be done fast enough? I have my doubts...

What is your target hardware then, if you think it is too slow? Have you tried it? ;-) There is an article in GPU Gems 1 or 2 about the realtime calculation of (ir)radiance maps. I need to look up the exact title at home, but it might be exactly what you want ;-)
As I said, you do not need to update the cubemaps every frame...

Quote:Another little downside about using cubeMaps is that I can use only 1 per polygon.

No, you can use as many as you have texture units. 8 texture units = 8 cubemaps per polygon.

Quote:Probably this will be possible sooner or later with faster cards. And maybe it's already possible, but I have to add that a lot of power already goes to deferred lighting, tonemapping, blurring shadowMaps, and so on...

All this stuff is useless without proper indirect lighting (in my opinion).

Quote:I don't know how Spherical Harmonic lighting works,

Try my article/paper and if you have problems, just ask. Oh, and the article does not talk about runtime SH lighting, where you don't need any preprocessing. I still need to add that...


Best regards,
Enrico
--
