ChenMo

What data to store in light maps when using path tracing



Hi, guys.

I am developing a path-tracing baking renderer based on OpenGL and OpenRL. It can already bake scenes like the one below, and I am glad it captures diffuse color bleeding. : )

[screenshot: baked scene]

I store the irradiance directly, like this. Albedo (the diffuse color) is already folded into the irradiance during baking, for both direct and indirect lighting.

[screenshot: baked lightmap]

After baking, I use the light map directly in the OpenGL fragment shader.

[screenshot: fragment shader result]

I think I have got something wrong, because most game engines don't do it this way.

What kind of data should I store in the light maps? I need diffuse only.

Thanks in advance!

Edited by ChenMo


The main question is how to store directional information so you can support at least normal mapping (or whether to store no directional info at all).

There are multiple options:

Primary light direction

Spherical harmonics (or a variant optimized for the hemisphere; I believe it's called the H-basis)

Spherical Gaussian

Small environment map per texel (e.g. a 2x2 or 4x4 pixel spherical map)

...

With more storage, you can even get reflections for glossy materials.

https://mynameismjp.wordpress.com/2016/10/09/new-blog-series-lightmap-baking-and-spherical-gaussians/
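To illustrate the spherical-harmonics option: with linear (L1) SH you store four RGB coefficients per texel and can reconstruct irradiance for any normal, which is what makes normal mapping possible. A GLSL-style sketch, with hypothetical texture names and packing:

```glsl
// Sketch: evaluating an L1 spherical-harmonics lightmap with a
// (possibly normal-mapped) surface normal n. Texture names and
// coefficient packing are hypothetical; engines pack these differently.
uniform sampler2D shBand0Map;  // L0 coefficient, RGB
uniform sampler2D shBand1Map;  // L1 "y" coefficient, RGB
uniform sampler2D shBand2Map;  // L1 "z" coefficient, RGB
uniform sampler2D shBand3Map;  // L1 "x" coefficient, RGB

vec3 EvalSHIrradiance(vec3 n, vec2 uv)
{
    vec3 sh0 = texture(shBand0Map, uv).rgb;
    vec3 sh1 = texture(shBand1Map, uv).rgb;
    vec3 sh2 = texture(shBand2Map, uv).rgb;
    vec3 sh3 = texture(shBand3Map, uv).rgb;

    // Irradiance reconstruction constants from Ramamoorthi and
    // Hanrahan's irradiance environment map work.
    vec3 irradiance = 0.886227 * sh0
                    + 1.023328 * (n.y * sh1 + n.z * sh2 + n.x * sh3);
    return max(irradiance, vec3(0.0));
}
```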


Hi JoeJ,

I will go read the blog you posted and learn about this.

Thank you!


Of course you can also store the directional data at a lower resolution than the diffuse irradiance.

E.g. tiny 8x8 environment maps at 1/8th of the resolution to support reflections for most natural materials. You could even correct the error by calculating the difference between the low- and high-resolution data.

The options are endless; what to do really depends on the game.


Yes, having reflections there would be really cool; I think it's worth doing.


Generally you won't store irradiance * albedo in your lightmaps, because that would mean that your albedo resolution is limited by your lightmap resolution. Typically you'll store some form of irradiance in your lightmap (usually irradiance / Pi), and then compute the diffuse reflectance in your fragment shader by sampling both the lightmap and albedo map and multiplying the results together.
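In shader terms, that split might look like the following sketch (texture and varying names are hypothetical):

```glsl
// Lightmap stores irradiance / pi (albedo-free), baked offline.
// Albedo comes from the full-resolution material texture, so its
// detail is not limited by lightmap resolution.
uniform sampler2D lightMap;   // irradiance / pi
uniform sampler2D albedoMap;  // material diffuse color

in vec2 lightmapUV;
in vec2 materialUV;
out vec4 fragColor;

void main()
{
    vec3 irradianceOverPi = texture(lightMap, lightmapUV).rgb;
    vec3 albedo = texture(albedoMap, materialUV).rgb;

    // Lambertian diffuse: outgoing radiance = (albedo / pi) * irradiance.
    // The 1/pi is already folded into the lightmap, so a plain
    // component-wise multiply is all that is left to do.
    fragColor = vec4(albedo * irradianceOverPi, 1.0);
}
```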


Yes, as you mentioned, albedo resolution is limited by lightmap resolution if I take albedo into the irradiance calculation.

But I am confused about whether albedo should go into the rendering equation. For example, when a ray hits a red area, the light it reflects will be red too; if I don't take albedo into the calculation, the resulting irradiance won't be red, and I lose the red bleeding when sampling the lightmap.

And I am reading the post JoeJ pointed me to. It's excellent!

Edited by ChenMo


@MJP Is there any information on why the final light color is written after dividing by Pi? I'm not sure how to search for this, but I have seen it mentioned a lot.

Does this remain true for things like deferred renderers?

Edited by orange451

2 hours ago, ChenMo said:

But I am confused about whether albedo should go into the rendering equation. For example, when a ray hits a red area, the light it reflects will be red too; if I don't take albedo into the calculation, the resulting irradiance won't be red, and I lose the red bleeding when sampling the lightmap.

I did not notice yesterday that this was your problem.

 

In the shader you do something like this for diffuse:

vec3 reflected = componentWiseMul(unlit albedo from material texture, irradiance from light map texture)

vec3 emitted = light emitted from the material, if you have this

vec3 fragmentColor = reflected + emitted

 

componentWiseMul(a, b) returns vec3(a.x*b.x, a.y*b.y, a.z*b.z).

Shading languages do this anyway when you multiply two vectors, but writing it out clarifies that a diffuse red surface can only reflect red light, and only if the received light contains red light too.

I wonder why you ask, because you obviously do this correctly in the path tracer already. The idea is always to store only information that is independent of the surface data, so you stay flexible with shading, resolution, reusing the same texture on multiple surfaces, etc. So here you store only the incoming light at the surface (irradiance) in the lightmap and calculate the outgoing light on the fly.

 

2 hours ago, orange451 said:

Is there any information on why the final light color is written after dividing by Pi?

I struggled with this a decade ago and have forgotten the exact source of the confusion, but usually you integrate incoming light over the unit hemisphere, projected onto the unit disk in the normal plane. The unit disk has an area of Pi, so a division by Pi is necessary at some point, because that area is irrelevant when shading a pixel.
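As a worked equation: the cosine-weighted integral over the hemisphere evaluates to Pi, which is where the division comes from:

```latex
\int_{\Omega} \cos\theta \, d\omega
  = \int_{0}^{2\pi} \int_{0}^{\pi/2} \cos\theta \, \sin\theta \, d\theta \, d\phi
  = 2\pi \cdot \tfrac{1}{2} = \pi
```

So a Lambertian BRDF must be albedo / Pi: under uniform incoming radiance the surface then reflects exactly albedo times that radiance, conserving energy.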

An example would be the small radiosity solver in this post: https://www.gamedev.net/projects/380-real-time-hybrid-rasterization-raytracing-engine/ Unfortunately the math there is optimized and not very clear. Also, ray tracing replaces the concept of sample area with ray density or weighting, so the math looks different, but it may still help with how to separate surface data from lighting data.

 

Edited by JoeJ

1 hour ago, JoeJ said:

I did not notice yesterday that this was your problem.

Yes, I did not describe it clearly yesterday. : )

1 hour ago, JoeJ said:

I wonder why you ask, because you obviously do this correctly in the path tracer already.

I had taken albedo into account when baking with the path tracer, so I got the bleeding color.

1 hour ago, JoeJ said:

The idea is always to store only information that is independent of the surface data, so you stay flexible with shading, resolution, reusing the same texture on multiple surfaces, etc.

I just tried it as you suggested, and then I lost the color bleeding; no bleeding happens at all. I think there is nowhere for the bleeding to come from, because no albedo is taken into account during baking; all the irradiance is calculated from the light color alone, which is vec3(1.0, 1.0, 1.0) right now.

I hope I have described it clearly. : )

Edited by ChenMo
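For reference, one way to keep the color bleeding while still baking albedo-free irradiance is to apply albedo in the path throughput at every bounce surface, and only skip the albedo of the texel being baked. A GLSL-style sketch of such a gather loop; all helper names (CosineSampleHemisphere, TraceRay, Hit) are hypothetical:

```glsl
// Bake one lightmap texel. The albedo of the receiving surface is
// deliberately NOT applied here; it is multiplied in later, in the
// fragment shader. Albedo at every bounce surface IS applied, which
// is what tints the stored irradiance and produces color bleeding.
vec3 BakeTexelIrradianceOverPi(vec3 pos, vec3 normal, int sampleCount)
{
    vec3 sum = vec3(0.0);
    for (int i = 0; i < sampleCount; ++i)
    {
        vec3 throughput = vec3(1.0);  // no albedo at the first hit
        Hit hit = TraceRay(pos, CosineSampleHemisphere(normal, i));
        for (int bounce = 0; bounce < 4 && hit.valid; ++bounce)
        {
            sum += throughput * hit.emitted;
            throughput *= hit.albedo;  // bounce surfaces tint the light
            hit = TraceRay(hit.pos, CosineSampleHemisphere(hit.normal, i));
        }
    }
    // With cosine-weighted sampling the cos/pdf terms cancel, so the
    // plain average estimates irradiance / pi, matching what the
    // fragment shader expects.
    return sum / float(sampleCount);
}
```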

