Ambient Occlusion for Deforming Geometry

Started by
6 comments, last by Ashaman73 9 years, 3 months ago

Could ambient occlusion be used correctly on skinned models that are animating? It would seem that it only works on static objects that can be repositioned, rotated, and scaled as a whole, but not on models whose geometry is deforming, because the radiance map would have to be re-computed. Is this correct?


Could ambient occlusion be used correctly on skinned models that are animating? It would seem that it only works on static objects that can be repositioned, rotated, and scaled as a whole, but not on models whose geometry is deforming, because the radiance map would have to be re-computed. Is this correct?

If you mean pre-baked ambient occlusion, then it's certainly used less on skinned models, but it is still used. Some will prebake it only on areas that move little relative to one another. World of Warcraft, for example, has always used it on character models. Areas like armpits can still make use of it, and it's probably a good idea for any game that still needs prebaked ambient occlusion.

If you are prebaking it on a character, you often pose them with arms out to the side and legs spread apart, so that the results have the least false occlusion when they're animated later.

Alternatively, instead of pre-baking AO from animated objects you can compute it at runtime, so that the animation is taken into account. Some games do this by attaching a small number of ellipsoids to the character's bones to act as a very low-detail proxy of the character's volume. You can then very cheaply ray-trace against this array of ellipsoids to calculate occlusion.
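
For reference, a rough C++ sketch of that idea might look something like the following. The Vec3 helpers, the SphereOccluder struct, and the exact falloff are illustrative assumptions only; for brevity each ellipsoid is approximated as a sphere, and a full version would first transform the shading point into each ellipsoid's local space so it can be treated as a unit sphere.

#include <algorithm>
#include <cmath>

// Minimal math helpers for the sketch.
struct Vec3 { float x, y, z; };
static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static float length(Vec3 a)      { return std::sqrt(dot(a, a)); }

// One per-bone proxy occluder (a sphere for simplicity).
struct SphereOccluder { Vec3 center; float radius; };

// Approximate occlusion of the hemisphere above point p with normal n caused
// by one sphere, using a common solid-angle style approximation: it grows with
// the sphere's apparent size (r^2 / d^2) and with how directly the sphere sits
// above the surface normal.
static float sphereOcclusion(Vec3 p, Vec3 n, const SphereOccluder& s)
{
    Vec3  d    = { s.center.x - p.x, s.center.y - p.y, s.center.z - p.z };
    float dist = length(d);
    if (dist < 1e-4f)
        return 1.0f;                                  // point sits inside the occluder
    float cosA    = dot(n, d) / dist;                 // alignment with the normal
    float falloff = (s.radius * s.radius) / (dist * dist);
    return std::clamp(std::max(cosA, 0.0f) * falloff, 0.0f, 1.0f);
}

// Combine the small array of per-bone occluders multiplicatively so that
// overlapping occluders can't push the result past full occlusion.
static float characterAO(Vec3 p, Vec3 n, const SphereOccluder* occluders, int count)
{
    float visibility = 1.0f;
    for (int i = 0; i < count; ++i)
        visibility *= 1.0f - sphereOcclusion(p, n, occluders[i]);
    return visibility;                                // 1 = fully open, 0 = fully occluded
}

Since only a handful of occluders are attached to the skeleton (roughly one per major bone), this stays cheap enough to evaluate per pixel every frame.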

Could ambient occlusion be used correctly on skinned models that are animating? It would seem that it only works on static objects that can be repositioned, rotated, and scaled as a whole, but not on models whose geometry is deforming, because the radiance map would have to be re-computed. Is this correct?

If you mean pre-baked ambient occlusion, then it's certainly used less on skinned models, but it is still used. Some will prebake it only on areas that move little relative to one another. World of Warcraft, for example, has always used it on character models. Areas like armpits can still make use of it, and it's probably a good idea for any game that still needs prebaked ambient occlusion.

I imagine that WoW and many other games are not prebaking AO but simply painting it into the diffuse/albedo maps by hand. That's something artists have done for decades. Centuries? Millennia?



I imagine that WoW and many other games are not prebaking AO but simply painting it into the diffuse/albedo maps by hand

But it is changing; I see more and more hand-painted texture work that utilizes baked maps from previously sculpted surfaces. Here one of Blizzard's artists shows off his work on Siege of Orgrimmar (WoW). As shown in the thread, he used a lot of sculpted 3D models as the base for his hand-painted texture work.

The same is valid for character work: many artists use a sculpted model, bake a lot of maps, and use these maps as part of the eventual hand-painted character texture (Dota 2, TF). In my game I use a dedicated AO texture channel in the otherwise hand-painted character textures to enhance the appearance too.

I believe that a cleverly baked AO map enhances the appearance of animated characters more than the potential artifacts detract from it.

Thanks guys! I'm new to AO, and haven't even implemented it yet, so this helps a lot.


Alternatively, instead of pre-baking AO from animated objects you can compute it at runtime, so that the animation is taken into account. Some games do this by attaching a small number of ellipsoids to the character's bones to act as a very low-detail proxy of the character's volume. You can then very cheaply ray-trace against this array of ellipsoids to calculate occlusion.

Does this occur once during load time? I could see this being beneficial as it saves the artists time, and I can just have those resources available when I want the effect turned on. I'd only do this once though, right? If my model is animating, I'd never want to re-update my AO map, right?


I imagine that WoW and many other games are not prebaking AO but simply painting it into the diffuse/albedo maps by hand. That's something artists have done for decades. Centuries? Millennia?

When you say pre-baking, do you mean calculating the AO offline, then multiplying the result into the diffuse/albedo's color, therefore bypassing any dynamic AO calculations in the shader? That'd be like baking light maps into textured geometry before multi-texturing was possible.


I believe that a cleverly baked AO map enhances the appearance of animated characters more than the potential artifacts detract from it.

Once I have AO working, I'm really looking forward to testing animated characters with AO enabled and disabled to see the difference in appearance. I'll probably just be generating the maps in Blender using some sort of plugin at the beginning.


Does this occur once during load time? I could see this being beneficial as it saves the artists time, and I can just have those resources available when I want the effect turned on. I'd only do this once though, right? If my model is animating, I'd never want to re-update my AO map, right?

No, you would do it at runtime (how could you do it at load time if you're changing the AO based on the animation of the character?). It's generally something you would do as a deferred pass, where you evaluate the AO contribution from the occluders per-pixel and write the results into a render target texture (just like SSAO).
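
To sketch what that looks like (reusing the characterAO helper from the sphere-occluder example earlier in the thread), the per-pixel work could be written in plain C++ as below; in a real renderer this loop body would live in a pixel or compute shader reading the G-buffer, and GBufferSample, aoTarget, and the occluder array are hypothetical names.

// One G-buffer sample as reconstructed in the deferred pass.
struct GBufferSample { Vec3 worldPos; Vec3 normal; };

// Re-evaluated every frame, so the occlusion follows the animated bones; the
// results go into an AO render target that is combined with lighting later,
// just like an SSAO buffer would be.
void dynamicCharacterAOPass(const GBufferSample* gbuffer, float* aoTarget,
                            int pixelCount,
                            const SphereOccluder* occluders, int occluderCount)
{
    for (int i = 0; i < pixelCount; ++i)
        aoTarget[i] = characterAO(gbuffer[i].worldPos, gbuffer[i].normal,
                                  occluders, occluderCount);
}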


When you say pre-baking, do you mean calculating the AO offline, then multiplying the result into the diffuse/albedo's color, therefore bypassing any dynamic AO calculations in the shader? That'd be like baking light maps into textured geometry before multi-texturing was possible.



He just means generating a separate AO map offline, and then using that map at runtime.
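
That is, the AO value lives in its own texture and only scales the ambient/indirect term at shade time, instead of being burned into the color map itself. A minimal scalar sketch of the difference (the function and parameter names are placeholders; per-channel color would use the same formula):

// Baked AO attenuates only the ambient/indirect lighting; direct light is left
// alone, which is the main advantage over multiplying AO into the albedo offline.
float shadeWithBakedAO(float albedo, float ambientLight, float directLight, float bakedAO)
{
    return albedo * (ambientLight * bakedAO + directLight);
}

Multiplying the AO into the albedo texture instead would darken the direct lighting as well, which is one reason to keep it as a separate map or channel when the extra texture fetch is affordable.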


I'll probably just be generating the maps in Blender using some sort of plugin at the beginning.

I use Blender too; Blender can render AO maps out of the box.

Are you aware of this dynamic solution? http://http.developer.nvidia.com/GPUGems2/gpugems2_chapter14.html

Not as high resolution as a pre-baked AO map, but then again AO doesn't need to be terribly high-res anyway.

This topic is closed to new replies.
