# absorption along ray

## Recommended Posts

Quat    568
I have a 3D graphics book that gives the formula for the absorption of radiance along a ray, and I am trying to derive it in detail to see if my derivation is correct. Let o(p) be the probability density that light is absorbed per unit length at point p. The book gives the formula as

exp( -int_0^d o(p+tv) dt )

where p is the point where the ray enters the medium and p+dv is the point where it exits.

To set this up, I look at a small section of the ray of length h and ask how much light is absorbed across it. I also recast things in terms of t, since position along the ray is a function of t, writing o(t) for o(p+tv). Let L(t) denote the radiance at parameter t. Then

L(t+h) = L(t) - o(t)*h*L(t)

That is, the radiance after passing through a segment of length h equals the incoming radiance minus the amount absorbed. Rearranging,

[L(t+h) - L(t)] / h + o(t)*L(t) = 0

and taking the limit h --> 0,

dL(t)/dt + o(t)*L(t) = 0

I multiply by the integrating factor exp( int_0^t o(s) ds ) (using a dummy variable s so it doesn't clash with the limit t) to get

( exp( int_0^t o(s) ds ) * L(t) )' = 0

Integrating from 0 to d:

exp( int_0^d o(s) ds )*L(d) - exp( int_0^0 o(s) ds )*L(0) = C

We know L(0) = 0 since no absorption has happened yet, so

L(d) = C*exp( -int_0^d o(s) ds )

I'm not sure how to get rid of the C; I think it should not have appeared, since I used a definite integral. Also, is my integrating factor exp( int_0^t o(s) ds ) correct?
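As a sanity check on the derivation, the finite-difference step L(t+h) = L(t) - o(t)*h*L(t) can be iterated numerically and compared against the closed form exp( -int_0^d o(t) dt ). This is my own sketch; the absorption density `o` below is an arbitrary smooth function chosen just for the demo, not anything from the book.

```python
import math

def transmittance(o, d, n=100000):
    """Iterate the finite-difference step dL = -o(t)*h*L from t=0 to t=d
    (forward Euler with L(0) = 1) and also accumulate a Riemann sum of
    int_0^d o(t) dt, returning both the stepped L(d) and the closed form."""
    h = d / n
    L = 1.0          # incoming radiance L(0), normalized to 1
    integral = 0.0   # running left Riemann sum of int_0^t o(s) ds
    for i in range(n):
        t = i * h
        L -= o(t) * h * L      # the small-segment absorption step from the post
        integral += o(t) * h
    return L, math.exp(-integral)

# arbitrary smooth absorption density (an assumption for the demo)
o = lambda t: 0.5 + 0.3 * math.sin(t)

stepped, closed_form = transmittance(o, d=2.0)
print(stepped, closed_form)  # the two values agree closely for large n
```

With many small steps the product of the per-segment factors (1 - o*h) converges to the exponential, which is exactly the limit argument in the derivation above.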

Pragma    395
I think you solved the equation correctly, but you applied the boundary condition incorrectly. If L(0) is the incoming radiance, you don't want L(0) = 0 (that would mean zero incoming light). Just take your final equation and set d = 0, using the fact that int_0^0 of anything is zero. You should find that C = L(0), which gives the correct solution.