This stuff was popular in realtime skin shading after the Matrix 2 team published a technique based on it in 2003. Most people stopped using it pretty quickly, though.
Since then, I've always wanted to try a general-purpose implementation of it like this, although I was going to do a hybrid, where only some calculations were done in object space and others would still be done during rasterization.
Great to see someone's actually done it!
Back on the Wii, we couldn't afford per-pixel shading either, especially not with a large number of dynamic lights, so we actually did a hybrid of this - a kind of light-pre-pass object-space shading, in view space. For each object, we'd render its lighting into a small (32px, IIRC) texture, so that during rasterization it could combine the lighting and material data very quickly, and independently of the number of lights. That hid some of the artefacts, because even though the lighting was low resolution, the materials were not.
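To make the combine step concrete, here's a toy sketch of that idea: lighting computed into a tiny per-object texture, then bilinearly upsampled and multiplied with full-resolution material data. All names are illustrative, not from any shipped engine.

```python
def bilerp(tex, u, v):
    """Bilinearly sample a 2D list `tex` at normalized coords (u, v)."""
    h, w = len(tex), len(tex[0])
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = tex[y0][x0] * (1 - fx) + tex[y0][x1] * fx
    bot = tex[y1][x0] * (1 - fx) + tex[y1][x1] * fx
    return top * (1 - fy) + bot * fy

def shade_object(lighting_lowres, albedo_hires):
    """Combine low-res lighting with high-res material per output pixel."""
    h, w = len(albedo_hires), len(albedo_hires[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            u = x / (w - 1) if w > 1 else 0.0
            v = y / (h - 1) if h > 1 else 0.0
            # Lighting varies slowly, so the low-res lookup hides its
            # resolution; the material term keeps full detail.
            row.append(bilerp(lighting_lowres, u, v) * albedo_hires[y][x])
        out.append(row)
    return out
```

The key property is in the inner loop: cost per pixel is one lighting fetch plus one multiply, regardless of how many lights went into the lighting texture.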
I don't get the comparisons to REYES
They spend a good amount of time talking about fixing it to work for terrain, after all. What on earth would happen to it if you gave it general purpose world geometry, like you'd see in a Battlefield game?
REYES tessellates, shades, tessellates some more, fine-culls and then samples/filters -- i.e. it does shading before it does triangle rasterization. I think that's the only inspiration they've taken from it.
For arbitrary geo, they'd have to use the indirection technique on everything. FWIW, some film renderers already solve this with tech like ptex, and it in particular has been implemented in realtime too. It would actually be nice to see something like this...
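For the unfamiliar, the core of the ptex idea is addressing texels by (face ID, local UV) instead of a global UV atlas, so no explicit unwrap is needed. A minimal illustrative sketch (real ptex also stores per-face adjacency so filters can cross face edges):

```python
class FaceTextures:
    """Toy ptex-style storage: each face owns its own small texture."""

    def __init__(self):
        self.faces = {}  # face_id -> 2D list of texels

    def add_face(self, face_id, res, fill=0.0):
        self.faces[face_id] = [[fill] * res for _ in range(res)]

    def _texel(self, face_id, u, v):
        res = len(self.faces[face_id])
        # Clamp face-local UVs into the face's private texel grid.
        return min(int(v * res), res - 1), min(int(u * res), res - 1)

    def write(self, face_id, u, v, value):
        row, col = self._texel(face_id, u, v)
        self.faces[face_id][row][col] = value

    def sample(self, face_id, u, v):
        # Nearest-neighbour lookup; filtering across edges would need
        # the adjacency data this sketch omits.
        row, col = self._texel(face_id, u, v)
        return self.faces[face_id][row][col]
```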
e.g. Megatexturing first showed up as a specialized terrain texturing technique, but then quickly moved to a texture-all-the-things general purpose technique.
Since no objects can share a shaded texture, I guess memory consumption must be high.
Another point is the authoring tools: every object must have well-flattened UVs. Flattening a texture layout is generally difficult and often requires manual tweaking.
Yeah - they mention using "several" 4k lighting sheets at 16 bits per channel, which comes to 128 MiB each (plus mips!).
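Back-of-envelope for one sheet, assuming four 16-bit channels per texel (e.g. RGBA16F - the reading that makes the 128 MiB figure work out):

```python
width = height = 4096
channels = 4           # e.g. RGBA
bytes_per_channel = 2  # 16 bits per channel
top_mip_bytes = width * height * channels * bytes_per_channel
top_mip_mib = top_mip_bytes / 2**20   # 128.0 MiB for the base level
# A full mip chain adds roughly a third on top of the base level:
with_mips_mib = top_mip_bytes * 4 / 3 / 2**20
```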
They also mention that "our implementation requires artists to chart their models" and that they were already doing this anyway, but yeah, they probably had to do extra tweaking of those UV coordinates to work out some kinks per model... Combining this with a great automatic charting tool would be nice (e.g. ptex above doesn't require explicit UV coords), but would probably require a better texture filter to deal with the increase in seams.
And yeah - great for VR (shade every 3rd/4th frame) and great for 4k monitors (weaker GPUs can shade at lower resolution but keep high-res rasterization).
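The VR point amounts to decoupling the shading rate from the display rate: the shading cache is refreshed every Nth frame while rasterization runs every frame. A toy frame loop (names illustrative, not from the paper):

```python
SHADE_INTERVAL = 3  # refresh object-space shading every 3rd frame

def run(frames):
    log = []
    shade_cache = None
    for frame in range(frames):
        if frame % SHADE_INTERVAL == 0:
            shade_cache = f"shading@{frame}"   # expensive pass, amortized
        log.append((frame, shade_cache))       # cheap rasterize every frame
    return log
```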