jerrycao_1985

Where is the cosine factor in extended LTE?


By expanding the rendering equation, you get a nice symmetric equation describing light transport; please refer to the pbrt book, page 760.

[equation image: the measurement equation from the pbrt book, integrating We and Li with a cosine factor over directions and the film area]

 

What I'm wondering is: where is the cosine factor on the camera side of this equation? I don't recall any renderer taking it into consideration, at least not real-time rendering engines.

 

Take a real-world example: you are watching a movie that is displaying a uniform white image, and say the radiance of each ray is exactly 1. Obviously the rays that hit the center of the screen will reflect more light toward the viewer, while the ones that hit near the edge of the screen will be a little darker, depending on the FOV of the projector. No matter how small the effect is, it should be there.

 

So what is the real-world solution to this issue? Scale the radiance by a factor of 1/cos(theta)? Or ignore it entirely, since theta should be very small?
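A minimal numeric sketch of the effect being asked about, assuming an ideal pinhole camera (the function name is made up for illustration): for uniform incoming radiance, the irradiance received by a film point at angle theta off-axis falls off as cos⁴(theta), so without any correction the image would darken toward the edges.

```python
import math

def film_falloff(tan_half_fov: float, n: int = 5):
    """Relative sensor irradiance from film center to film edge for a
    uniform-radiance scene, using the cos^4 falloff of an ideal pinhole."""
    values = []
    for i in range(n):
        # x ranges from the film center (0) to the film edge at z = 1
        x = tan_half_fov * i / (n - 1)
        cos_theta = 1.0 / math.sqrt(1.0 + x * x)
        values.append(cos_theta ** 4)
    return values

# 90-degree FOV: tan(fov/2) = 1, so the corner ray is 45 degrees off-axis
falloff = film_falloff(1.0)
# center is brightest (1.0); the edge receives cos^4(45 deg) = 1/4
```

With a narrow FOV the falloff is tiny, which is one reason it is easy to ignore in practice.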


Take a real-world example: you are watching a movie that is displaying a uniform white image, and say the radiance of each ray is exactly 1. Obviously the rays that hit the center of the screen will reflect more light toward the viewer, while the ones that hit near the edge of the screen will be a little darker, depending on the FOV of the projector. No matter how small the effect is, it should be there.


The edge receives less light because both the distance and the angle to the projector are larger than at the center.
The classic rendering equation defines that correctly, but I doubt that's comparable to the way a camera captures the image on film.
It might have to do with the optics of the lens.
Game devs typically use a vignette effect if they care at all, but most probably they don't care about physical correctness here.

pbrt book, page 760

What book?
What the hell is LTE?
And what is ray pdf? (This belongs in the other thread you started, but I could not resist.)

What I mean is: you need to provide more information to get some answers ;)


 

What book?
What the hell is LTE?
And what is ray pdf?

 

 

Physically Based Rendering.

LTE stands for light transport equation, also known as the rendering equation.

By ray PDF, I mean the probability density function value of a specific ray.

 

That's more of an offline rendering question than a game development one. :)

As JoeJ said, there's a lot of missing context.

Posting an equation without mentioning what phi, theta, We, Li, dA, and Pfilm stand for, or what happens inside those functions, leaves us pretty much clueless.

I might remember something from my PBR readings, but that requires a lot of effort, which we're expecting from your side, not ours. We need some refreshers.

What I'm wondering is: where is the cosine factor on the camera side of this equation? I don't recall any renderer taking it into consideration, at least not real-time rendering engines.

Honestly, I don't understand whether you're asking why the LTE considers the camera, or why it doesn't.

Anyway, rendering does take the camera (the "eye") into account, simply because of the specular component of the light and the Fresnel term.
We basically need to check whether the eye is being directly hit by polarized light.

Since a movie projection is mostly diffuse light, though, the eye position is pretty much irrelevant in your example.


The answer is really subtle: it's actually implicit, handled by the projection into screen space!

It's important to first remember that the actual equations in use here approximate overall energy distributions over the sphere/hemisphere, though a lot of learning materials just present things as fairly arbitrary quantity modifications. Calculus says you can chop that space up into tiny bits, and very loosely that's what's going on when you shoot rays in a path tracer (*mumble mumble* in a rasterizer). Ever seen all the cool environment-map integration techniques for spherical harmonic projection, etc.? Think about applying that literal process, just from the perspective of the camera.

 

As a simple illustrative thought experiment, consider a white triangle on a black background. If you were to draw this triangle at increasing distances and oblique angles, then sum up all the white pixels, you would notice that the white pixel count decreases according to the inverse-square law and the cosine of the triangle normal with the camera's forward vector.
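The thought experiment above can be sketched numerically: the solid angle subtended by a small planar patch (and hence, roughly, its pixel count) is approximately area · cos(theta) / distance², so the count falls with the inverse square of the distance and with the cosine of the tilt. The helper name below is made up for illustration.

```python
import math

def projected_solid_angle(area: float, distance: float, theta_deg: float) -> float:
    """Approximate solid angle of a small planar patch of the given area,
    at the given distance, tilted theta degrees away from facing the viewer."""
    return area * math.cos(math.radians(theta_deg)) / distance ** 2

# doubling the distance quarters the apparent size (inverse-square law)
near = projected_solid_angle(1.0, 10.0, 0.0)
far = projected_solid_angle(1.0, 20.0, 0.0)
# tilting the patch 60 degrees halves it (cosine term)
tilted = projected_solid_angle(1.0, 10.0, 60.0)
```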


Honestly, I don't understand whether you're asking why the LTE considers the camera, or why it doesn't.


Same for me, but I think a better example of the question is:
Taking a picture of a white wall, equally lit over its entire area, why are the corners of the picture darker than the center?

I've found this Wikipedia page about it: https://de.wikipedia.org/wiki/Cos4-Gesetz
But I don't know how to get the English version. (There is one about vignetting, but vignetting is the wrong term and has different causes.)
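For reference, a sketch of where the four cosines in that law come from, for a film point at angle theta off-axis behind a lens of focal length f: one cosine from the foreshortening of the aperture as seen from the film point, one from the tilt of the film patch, and two from the increased distance to the off-axis point (r = f / cos theta in the inverse-square term).

```latex
E(\theta) \;\propto\;
\underbrace{\frac{1}{r^2}}_{r = f/\cos\theta}
\cdot
\underbrace{\cos\theta}_{\text{foreshortened aperture}}
\cdot
\underbrace{\cos\theta}_{\text{tilted film patch}}
\;=\;
\frac{\cos^2\theta}{f^2}\cdot\cos\theta\cdot\cos\theta
\;=\;
\frac{\cos^4\theta}{f^2}
```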


 

Taking a picture of a white wall, equally lit over its entire area, why are the corners of the picture darker than the center?
https://de.wikipedia.org/wiki/Cos4-Gesetz

 

I think this cos4 law is the key to my question.

 

My guess is that to avoid the vignetting effect, the We factor (importance function) is proportional to the inverse of cos4. And since the pdf of the primary ray contains cos3, three of the four cosine factors get cancelled out. The remaining one in the denominator gets cancelled by the cosine factor in the LTE, hidden in the G(v_0 - v_1) term.

 

I've checked the pbrt-v3 implementation; it works this way.

https://github.com/JerryCao1985/pbrt-v3/blob/master/src/cameras/perspective.cpp
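A small sketch of the cancellation described above. The 1/(A·cos⁴θ) importance and 1/(A·cos³θ) direction pdf follow the shapes used by pbrt-v3's perspective camera; the function names here are made up, and A stands for the film area projected onto the z = 1 plane.

```python
import math

def importance(cos_theta: float, film_area: float) -> float:
    # We ~ 1 / (A * cos^4(theta)) for a pinhole perspective camera
    return 1.0 / (film_area * cos_theta ** 4)

def pdf_direction(cos_theta: float, film_area: float) -> float:
    # pdf of sampling a primary ray direction ~ 1 / (A * cos^3(theta))
    return 1.0 / (film_area * cos_theta ** 3)

def pixel_estimate(radiance: float, cos_theta: float, film_area: float = 4.0) -> float:
    # Monte Carlo estimate: We * L * cos(theta) / pdf -- the cosines cancel,
    # so a uniform-radiance scene yields a uniform image (no cos^4 darkening)
    return (importance(cos_theta, film_area) * radiance * cos_theta
            / pdf_direction(cos_theta, film_area))
```

For uniform radiance 1, the estimate comes out the same at the film center (cos theta = 1) as at any off-axis angle, which is exactly the cancellation being described.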
