(Physically based) Hair shading

3 comments, last by MJP 7 years, 11 months ago

Hey guys.

I'm playing around with hair rendering these days and have implemented the shading model used in AMD's TressFX sample, which I believe is the Kajiya-Kay model.

However, having done that, I'm not really satisfied with the results, especially after switching my regular shading to physically based GGX.

UE4 recently presented some hair shading work that can be seen in this video:

which looks really nice.

I also have no idea how to use my image-based lighting with this kind of BRDF, since sampling my GGX-convolved probes the normal way looks very wrong.

Since the Kajiya-Kay diffuse term uses something like

float cosTL = saturate(dot(hairTangent, LightDirection)); // dot(T, L) instead of dot(N, L)

I was thinking of maybe using the tangent as the lookup vector for the irradiance map, and something similar for the glossy environment map, but that didn't work out of the box either.
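For reference, the classic Kajiya-Kay model uses sin(T, L) for the diffuse term (not a saturated dot as in the snippet above) and a shifted tangent-based highlight for specular. A minimal CPU-side sketch in Python, assuming the textbook formulation; function and parameter names are mine, and the specular exponent is an arbitrary choice:

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def kajiya_kay(tangent, light_dir, view_dir, spec_power=80.0):
    """Classic Kajiya-Kay terms: the hair tangent T replaces the normal N.

    diffuse  ~ sin(T, L) = sqrt(1 - dot(T, L)^2)
    specular ~ (cos(T, L) * cos(T, V) + sin(T, L) * sin(T, V)) ^ power
    """
    t = normalize(tangent)
    l = normalize(light_dir)
    v = normalize(view_dir)
    cos_tl = dot(t, l)
    cos_tv = dot(t, v)
    sin_tl = math.sqrt(max(0.0, 1.0 - cos_tl * cos_tl))
    sin_tv = math.sqrt(max(0.0, 1.0 - cos_tv * cos_tv))
    diffuse = sin_tl
    specular = max(0.0, cos_tl * cos_tv + sin_tl * sin_tv) ** spec_power
    return diffuse, specular

# A light perpendicular to the strand gives maximum diffuse (sin(T, L) = 1).
d, s = kajiya_kay((1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0))
```

The sin(T, L) term is what gives hair its bright response when lit from the side, which a plain saturated dot(T, L) misses.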

Brian Karis told me on Twitter that he was using an approximation that Square Enix uses; however, I could not find anything about it online.

Does anyone know of a physically based hair model (Marschner?) that can be used for real-time shading?

And does anybody have an idea how to solve the IBL problem?

Update: After watching the video again I found the paper his shading model is based on (not sure I'll be able to translate it into code, but I'll try):

http://www.eugenedeon.com/wp-content/uploads/2014/04/egsrhair.pdf


I implemented physically based hair shading last year. I also started work on the TressFX 2.0 sample :)

I chose a LUT-based approach for real-time rendering, like GPU Gems 2 and cortex did.

The azimuthal scattering in Marschner's shading model is very complicated, and it is not cheap to evaluate at run time.
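To illustrate the LUT idea in general (not the actual tables from my slides; the names and the stand-in function below are mine): tabulate the expensive term offline so the run-time cost drops to a filtered texture fetch. A Python sketch:

```python
import math

def build_lut(fn, size):
    """Tabulate fn over [0, 1] offline; run-time evaluation becomes a fetch."""
    return [fn(i / (size - 1)) for i in range(size)]

def sample_lut(lut, x):
    """Linearly interpolated lookup, mimicking a filtered texture fetch."""
    x = min(max(x, 0.0), 1.0) * (len(lut) - 1)
    i = min(int(x), len(lut) - 2)
    f = x - i
    return lut[i] * (1.0 - f) + lut[i + 1] * f

# Stand-in for an expensive azimuthal scattering term: a narrow Gaussian.
lut = build_lut(lambda x: math.exp(-8.0 * (x - 0.5) ** 2), 256)
value = sample_lut(lut, 0.5)
```

In practice the Marschner tables are 2D (parameterized over angles), but the trade-off is the same: memory and a texture fetch in exchange for skipping the analytic evaluation per pixel.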

Karis' approximation in UE4 is quite impressive. Unfortunately, his implementation was released after my work was done :(

Anyway, you can find my work here. See pages 47-63.

In fact, I have not solved the IBL problem yet. To my knowledge, The Order uses tangent irradiance maps.

But it's not easy to apply that to our game because we need time-of-day lighting changes. I'm still looking for a better solution.

Not hair exactly, but for anisotropic materials in general:

IIRC, the Disney BRDF paper that all the game PBR papers seem to cite includes two forms of GGX: the common (isotropic) one, and also an anisotropic version.
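For reference, the anisotropic GGX (GTR2) normal distribution from the Disney BRDF notes can be sketched like this in Python (function and parameter names are mine); with alpha_x == alpha_y it reduces to the usual isotropic GGX:

```python
import math

def ggx_aniso_ndf(h_dot_x, h_dot_y, h_dot_n, alpha_x, alpha_y):
    """Anisotropic GGX/GTR2 normal distribution.

    h_dot_x / h_dot_y: half-vector projected onto the tangent / bitangent,
    h_dot_n: half-vector dot normal, alpha_x / alpha_y: roughness per axis.
    """
    denom = (h_dot_x / alpha_x) ** 2 + (h_dot_y / alpha_y) ** 2 + h_dot_n ** 2
    return 1.0 / (math.pi * alpha_x * alpha_y * denom * denom)

# At the peak (H == N) with equal alphas this matches isotropic GGX,
# which evaluates to alpha^2 / (pi * alpha^4) = 1 / (pi * alpha^2) there.
peak = ggx_aniso_ndf(0.0, 0.0, 1.0, 0.5, 0.5)
```

Stretching alpha along the tangent direction elongates the highlight, which is exactly what you want for brushed metal or, with a shifted tangent, hair.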

I'm using that at the moment, but yeah, now IBL is a problem. I allow importance sampling at runtime, which solves it, but it's not really feasible except as an "Uber" detail option.

IIRC (again), the Frostbite PBR presentation introduced a nice hack that just bends the IBL lookup vector based on the anisotropy data. It's a completely empirical model rather than physically based, but it creates the right impression for the viewer and is better than doing nothing. You can also fiddle with using an anisotropic texture filter and passing your own ddx/ddy values into TextureCube::SampleGrad to try to blur your IBL probe along the anisotropy direction (just more hacks, though).
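From memory, that bent-lookup hack goes roughly like this (a Python sketch with my own names; treat it as an assumption rather than Frostbite's exact code): build a "grain" axis from the tangent or bitangent, derive a normal that lies along it, and blend the shading normal toward it by the anisotropy amount before reflecting:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    length = math.sqrt(dot(v, v))
    return tuple(c / length for c in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def reflect(i, n):
    # Mirror the incident direction i about the unit normal n.
    d = dot(i, n)
    return tuple(i[k] - 2.0 * d * n[k] for k in range(3))

def bent_ibl_reflection(n, t, b, view, anisotropy):
    """Bend the cube-map lookup vector along the grain direction.

    n/t/b: shading normal, tangent, bitangent; view: direction toward the eye;
    anisotropy in [-1, 1] picks the grain axis and the bend strength.
    """
    grain = b if anisotropy >= 0.0 else t
    aniso_tangent = cross(grain, view)
    aniso_normal = cross(aniso_tangent, grain)
    s = abs(anisotropy)
    bent = normalize(tuple(n[k] * (1.0 - s) + aniso_normal[k] * s
                           for k in range(3)))
    neg_view = tuple(-c for c in view)
    return reflect(neg_view, bent)
```

With anisotropy = 0 this degenerates to the ordinary reflection vector, so the hack only kicks in where the material actually is anisotropic.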

Not sure if the cubemaps you're using have a high enough resolution, or what the end result for hair would look like, but with screen-space techniques you'd at least have the resolution:

http://www.frostbite.com/2015/08/stochastic-screen-space-reflections/

In fact, I have not solved the IBL problem yet. To my knowledge, The Order uses tangent irradiance maps.

We didn't end up shipping with that, since we removed all usage of SH from the game towards the end of the project. Instead, we applied specular from the nine spherical Gaussian lobes stored in our probe grid.
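For anyone unfamiliar with them, a single spherical Gaussian lobe is very cheap to evaluate, which is part of why a probe holding a handful of them works at runtime. A minimal sketch of the standard lobe definition (parameter names are mine):

```python
import math

def sg_eval(v, axis, sharpness, amplitude):
    """Evaluate one spherical Gaussian lobe in unit direction v:

    G(v) = amplitude * exp(sharpness * (dot(v, axis) - 1))

    The lobe peaks at v == axis and falls off smoothly away from it;
    larger sharpness gives a tighter lobe.
    """
    d = sum(a * b for a, b in zip(v, axis))
    return amplitude * math.exp(sharpness * (d - 1.0))
```

Fitting a specular response then amounts to evaluating (or convolving) each stored lobe against the BRDF, rather than reconstructing radiance from SH coefficients.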

