Okay, I got time to review the radiometry terms again. In the figure below I have drawn a spherical light source. Are my explanations correct? Figure (a) was hard for me to reason about: I found that I wanted to think of intensity as radiance, but I had to consider all points on the sphere light that can emit photons into the set of directions defined by w.
Now, assuming the above is correct, back to real-time graphics mode. When we define a point light source that emits photons equally in every direction, we specify its intensity magnitude, say I_0 (intensity, not radiance, since a point light has no area). Even though we think of radiance as a ray of light, it is really a thin cone. So by the time the ray hits a surface, the photons in it have "spread out" according to the inverse square of the distance, so to compute the irradiance at the surface we do E = I_0 / d^2 in our shader to get the irradiance from the point light source.
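For concreteness, the falloff I mean could be sketched like this (a minimal Python sketch, not real shader code; the function name is my own, and I'm leaving out the usual N·L cosine term to show only the distance attenuation):

```python
import math

def irradiance_from_point_light(I0, light_pos, surface_pos):
    """Irradiance at surface_pos from a point light of intensity I0,
    using only the inverse-square distance falloff: E = I0 / d^2.
    (A real shader would also multiply by max(dot(N, L), 0).)"""
    d_squared = sum((l - s) ** 2 for l, s in zip(light_pos, surface_pos))
    return I0 / d_squared

# A light of intensity 100 at distance 2 from the surface point:
E = irradiance_from_point_light(100.0, (0.0, 0.0, 2.0), (0.0, 0.0, 0.0))
print(E)  # 25.0
```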
Now we apply some BRDF to find the outgoing radiance that will reach the eye; call it O for outgoing. So O is a ray leaving the surface and reaching the eye. But shouldn't O be attenuated based on the distance between the lit surface point and the eye? As that distance increases, the photons will again "spread out" and cause less stimulus to a sensor in the eye.