The Order 1886: Spherical Gaussian Lightmaps


I'm going through the slides by our MVP MJP on spherical Gaussian lightmaps in The Order 1886, but I'm having trouble following the talk in a few places without additional commentary, so I'll quickly try to recap what I think is going on:

Link: https://readyatdawn.sharefile.com/share#/download/s9979ff4b57c4543b

1) A set of fixed, uniformly spaced directions is picked for the SGs (e.g. 9). These are hard-coded in the shader, plus one width (sharpness) value that is shared among all the SGs (27 floats + 1 float constant). The lightmap then only stores the color (float3) for each of the 9 directions (27 scalars per lightmap texel). I've sketched what I mean in code right after my questions.

Question #1: what's the reasoning behind the golden ratio spiral argument? (Slide 43)

2) When converting the GGX NDF to 3 SGs: what does the approximation (equation) look like?

Assuming it looks like this (leaving out details): A * e^p + B * e^q + C * e^r, where each term is an SG lobe of the form a * e^(λ(μ·v − 1)) with axis μ, sharpness λ, and amplitude a.

Question #2: how do you rotate this function to align with the light direction? Does p = q = r?
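To make sure I'm reading the setup right, here's my own minimal sketch (C++, not code from the talk) of how such a texel would be evaluated; the lobe axes and the single shared sharpness are assumed to be constants, and only the amplitudes come from the lightmap:

```cpp
#include <cmath>

struct float3 { float x, y, z; };

static float dot(const float3& a, const float3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Evaluate a set of SG lobes that share a single sharpness (lambda):
//   radiance(dir) = sum_i amplitude_i * exp(lambda * (dot(axis_i, dir) - 1))
// 'axes' holds the 9 hard-coded directions; 'amplitudes' holds the
// 9 float3 values (27 scalars) fetched from one lightmap texel.
float3 EvaluateSGLightmap(const float3* axes, const float3* amplitudes,
                          int numSGs, float sharpness, const float3& dir) {
    float3 result = { 0.0f, 0.0f, 0.0f };
    for (int i = 0; i < numSGs; ++i) {
        float w = std::exp(sharpness * (dot(axes[i], dir) - 1.0f));
        result.x += amplitudes[i].x * w;
        result.y += amplitudes[i].y * w;
        result.z += amplitudes[i].z * w;
    }
    return result;
}
```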

"Some people use Singletons, some people are Simpletons." - Bill Gates
"Yum yum, I luv Cinnabon." - Mahatma Gandhi
Advertisement

1) To uniformly distribute the basis vectors. You can choose any distribution you want; that's just the one we went with.

2) Yes, the free parameters you want are the sharpness and amplitude.

There really is no rotation; you just use the light's direction vector.
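(A minimal sketch of this answer, with placeholder fit constants rather than the actual values from the talk: all three lobes share one axis, which is simply assigned the light's direction, while each lobe keeps its own fitted sharpness and amplitude.)

```cpp
struct float3 { float x, y, z; };

// One spherical Gaussian lobe: amplitude * exp(sharpness * (dot(axis, v) - 1)).
struct SGLobe {
    float3 axis;      // lobe direction (mu)
    float  sharpness; // lambda -- one free parameter of the fit
    float  amplitude; // A -- the other free parameter of the fit
};

// Hypothetical 3-lobe SG approximation of a GGX NDF. The sharpness and
// amplitude values below are placeholders; in practice they would be fit
// offline per roughness. Since all three lobes share one axis, there is
// no rotation step -- the axis is just set to the light direction.
void BuildNDFLobes(const float3& lightDir, SGLobe lobes[3]) {
    const float sharpness[3] = { 2.0f, 8.0f, 32.0f }; // placeholder fit
    const float amplitude[3] = { 1.0f, 0.5f, 0.2f };  // placeholder fit
    for (int i = 0; i < 3; ++i) {
        lobes[i].axis      = lightDir;
        lobes[i].sharpness = sharpness[i];
        lobes[i].amplitude = amplitude[i];
    }
}
```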

-= Dave

Graphics Programmer - Ready At Dawn Studios

Thanks for clearing that up, David! I didn't even know you were on GD.net; I was giving too much credit to MJP here. :)

I might have some follow-up questions about the ASG later; I don't have the slides open atm.

Not really related to the talk: did you bake the lightmaps inside Maya (with something like a custom mental ray version), or did you use a raytracer outside of Maya (Embree, OptiX)?

Congratulations to you and your team on the (imo) best-looking game out there.

"Some people use Singletons, some people are Simpletons." - Bill Gates
"Yum yum, I luv Cinnabon." - Mahatma Gandhi
We had a custom GI baking system written on top of OptiX. Our tools were integrated into Maya (including our renderer), so the lighting artists would open the scene in Maya and initiate bakes. From there, we would package up the scene data and distribute it to multiple nodes on our bake farm, which were essentially Linux PCs mostly running GTX 780s.

We're still working on finishing up our course notes, but once they're available there will be a lot more details about representing the NDF with an SG and warping it to the correct space. We're also working on a code sample that bakes SG lightmaps and renders the scene.

Also, regarding the golden spiral: if you do a Google search for "golden spiral on sphere", you can find some articles (like this one) that show you how to do it.
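For reference, the construction those articles describe boils down to something like this (a generic sketch of the technique, not Ready At Dawn's code):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct float3 { float x, y, z; };

// Distribute 'count' roughly uniform directions over the unit sphere
// using a golden-spiral (Fibonacci) lattice.
std::vector<float3> GoldenSpiralDirections(int count) {
    std::vector<float3> dirs(count);
    const float goldenAngle = 3.14159265f * (3.0f - std::sqrt(5.0f)); // ~2.39996 rad
    for (int i = 0; i < count; ++i) {
        float z   = 1.0f - (2.0f * i + 1.0f) / count; // even slices in z
        float r   = std::sqrt(std::max(0.0f, 1.0f - z * z));
        float phi = goldenAngle * i;                  // rotate by the golden angle
        dirs[i] = { r * std::cos(phi), r * std::sin(phi), z };
    }
    return dirs;
}
```

Because the golden angle is irrational, successive points never line up into rings, so even a small count (like 9) covers the sphere fairly evenly.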

Great, looking forward to digging into that!

(Off-topic)

About the Maya integration: what are your thoughts on that in hindsight? I've only dabbled in MEL/Python for small tasks. I'm assuming most of the heavy lifting was written with the Maya C++ SDK?

Also: you two should feel free to join the gd.net chat from time to time if you find yourself yearning for an extremely long and one-sided conversation about this. Milk and cookies are ready... at dawn.

"Some people use Singletons, some people are Simpletons." - Bill Gates
"Yum yum, I luv Cinnabon." - Mahatma Gandhi
I have one more question for MJP, related to lightmaps...

There's no information about which segmentation, parameterization, and box-packing algorithms were used in The Order 1886. For me, that's the most complicated part of a pre-baked GI implementation.

(Off-topic)
About the Maya integration: what are your thoughts on that in hindsight? I've only dabbled in MEL/Python for small tasks. I'm assuming most of the heavy lifting was written with the Maya C++ SDK?


Sorry, I forgot to reply to this!

There were a lot of ups and downs with having such tight Maya integration. In general the artists were big fans, since they were already familiar with Maya and did a lot of work in there anyway. Being able to render with our engine inside of the Maya viewport was a big win for them, since they could see exactly what the game would look like as they were modeling. We also made it so that our material editor could run inside of Maya, which was another big win: most of the time they actually authored materials right inside of Maya, which let them do the authoring while viewing the material in the environment of their choosing.

For gameplay/level authoring it's a little less clear-cut. In some ways using Maya is natural, since it already supports a lot of the things you need for a level editor (3D viewer, orthographic views, translation/scaling/rotation widgets, user-defined attributes and UI, etc.), and that kept us programmers from having to re-implement all of those things. But in some ways it's also rather clunky and heavyweight, especially if you just want to move a few locators around.

We actually have a good mix of C++ plugins as well as Python tools. We also still have some MEL tools, but we've been deprecating those in favor of Python. The C++ plugins do most of the things that need tight integration with the engine, most notably our Viewport 2.0 override plugin that renders the Maya scene with our renderer. There are also plugins for registering a bunch of custom node types and their custom attributes, loading asset data, and kicking off GI bakes to our bake farm. We then mostly use Python to create our own UI, and also for helper scripts that automate repetitive art and gameplay tasks.

Most of our programmers hate working on the C++ plugins, since you have to start up Maya to run them and the API isn't always easy to work with. The Viewport plugin in particular was a *lot* of work, and it has a pretty high maintenance cost. We basically have to treat it like an additional platform for our automated graphics tests, since a lot of things go through custom code paths in order to extract the right data from Maya on the fly, instead of being processed in our content build pipeline. The artists love it, though, so we can't get rid of it now.
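For anyone curious what the C++ side of such a plugin looks like, the standard Maya devkit skeleton is roughly this (a generic sketch; the "kickBake" command is a made-up stand-in, not their actual tooling):

```cpp
#include <maya/MArgList.h>
#include <maya/MFnPlugin.h>
#include <maya/MGlobal.h>
#include <maya/MPxCommand.h>

// A trivial command standing in for real tooling (e.g. submitting a bake).
class KickBakeCmd : public MPxCommand {
public:
    MStatus doIt(const MArgList&) override {
        MGlobal::displayInfo("Bake request submitted (placeholder).");
        return MS::kSuccess;
    }
    static void* creator() { return new KickBakeCmd; }
};

// Maya calls these entry points when the plugin is loaded and unloaded.
MStatus initializePlugin(MObject obj) {
    MFnPlugin plugin(obj, "ExampleVendor", "1.0", "Any");
    return plugin.registerCommand("kickBake", KickBakeCmd::creator);
}

MStatus uninitializePlugin(MObject obj) {
    MFnPlugin plugin(obj);
    return plugin.deregisterCommand("kickBake");
}
```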

Also: you two should feel free to join the gd.net chat from time to time if you find yourself yearning for an extremely long and one-sided conversation about this. Milk and cookies are ready... at dawn.


Haha, I'll definitely come by!

There's no information about which segmentation, parameterization, and box-packing algorithms were used in The Order 1886. For me, that's the most complicated part of a pre-baked GI implementation.


Indeed, and that's a pretty complex topic on its own. In most cases we just re-use the base parameterization that the artists created for the mesh, instead of automatically computing a new one. We essentially extract each separate UV chart from Maya and then run a packing algorithm to pack all of the charts from all of the meshes into a set of atlased textures. For The Order, this algorithm was complicated and slow, since it tried many different positions and orientations to find the tightest fit. However, we've since started moving to a simpler algorithm that respects 4x4 BC6H tile boundaries; it leaves more unused space but is much, much faster, and it also avoids compression artifacts. I unfortunately didn't do any of the implementation for this part of the pipeline, so I'm not sure I could give a thorough overview. But perhaps I can convince my coworker to write up a blog post or something similar.
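To illustrate the trade-off (my own sketch, not their packer): snapping every chart's bounds up to the 4-texel BC6H block grid before packing guarantees that no compressed block straddles two charts, at the cost of some wasted texels. A simple shelf packer is enough to show the idea:

```cpp
#include <algorithm>
#include <vector>

// One UV chart's footprint in lightmap texels.
struct Chart {
    int width, height; // tight bounds, in texels
    int x, y;          // output: placement in the atlas
};

// Round a size up to the next multiple of the 4-texel BC6H block dimension,
// so no compressed block ever spans two charts (avoids bleed artifacts).
static int AlignToBlock(int size) { return (size + 3) & ~3; }

// Very simple block-aligned shelf packer: place charts left to right in
// rows ("shelves"), starting a new shelf when the current row fills up.
// Wastes more space than searching positions/orientations, but it is fast
// and keeps every chart on the 4x4 grid. Assumes no chart is wider than
// the atlas.
void PackCharts(std::vector<Chart>& charts, int atlasWidth) {
    // Tallest charts first gives tighter shelves.
    std::sort(charts.begin(), charts.end(),
              [](const Chart& a, const Chart& b) { return a.height > b.height; });
    int penX = 0, penY = 0, shelfHeight = 0;
    for (Chart& c : charts) {
        const int w = AlignToBlock(c.width);
        const int h = AlignToBlock(c.height);
        if (penX + w > atlasWidth) { // start a new shelf
            penX = 0;
            penY += shelfHeight;
            shelfHeight = 0;
        }
        c.x = penX;
        c.y = penY;
        penX += w;
        shelfHeight = std::max(shelfHeight, h);
    }
}
```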

We're also working on a code sample that bakes SG lightmaps and renders the scene.

How's this coming along? Is it still in the works?
