Decreasing brightness of light sources over really large distances

Just watched this video about how the sun appears dimmer when viewed from a large distance:

">

It was stated that this effect is due to the intensity decreasing with distance. Here is my guess as to why that happens:

"Intensity (power per solid angle) is decreasing following the inverse square law, that's right. But what ideal receptors measure is not intensity, but radiance, which is power per solid angle per area. Due to this, ideally a light source should appear with a certain brightness? independend of the observation distance. I would guess that this rule breaks down for really large distances because there are too few protons left to ensure that the receptor continuously gets hit by those."

What do you think?

It's photons, not protons. And yes, at any sufficiently large distance the photon density will simply be too low for us to see anything, but you're talking hundreds of light years for this to be relevant. This is why Hubble cannot see individual stars, but only immense galaxy clusters, which emit unimaginably more photons than any one star.

For reference, our Sun emits about 10^45 photons per second, overall. Some back-of-the-envelope calculations indicate that at a distance of 200 light years you would be hard-pressed to even detect it with the naked eye, as on average only about 3000 photons would hit your eyes every second.
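If you want to play with the numbers, here is a rough Python sketch of that back-of-the-envelope estimate. The photon output, the distance and the pupil size are all assumptions plugged in for illustration, so only the order of magnitude means anything.

```python
import math

# Rough sketch of the back-of-the-envelope estimate above. Every figure
# here is an assumption for illustration: the photon output, the
# distance, and an 8 mm dark-adapted pupil.
PHOTON_OUTPUT = 1e45        # photons emitted by the Sun per second (assumed)
LIGHT_YEAR_M = 9.461e15     # metres per light year
PUPIL_DIAMETER_M = 8e-3     # dark-adapted pupil diameter (assumed)

def photons_per_second_into_pupil(distance_ly):
    """Photons per second entering one pupil, assuming the output spreads
    uniformly over a sphere of radius distance_ly (inverse square law)
    and nothing is absorbed along the way."""
    r = distance_ly * LIGHT_YEAR_M
    photon_flux = PHOTON_OUTPUT / (4.0 * math.pi * r * r)   # photons / m^2 / s
    pupil_area = math.pi * (PUPIL_DIAMETER_M / 2.0) ** 2    # m^2
    return photon_flux * pupil_area

# At 200 light years this gives on the order of a thousand photons per
# second per pupil -- the same ballpark as the figure quoted above.
print(photons_per_second_into_pupil(200.0))
```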

Now consider that most stars are millions of light years away, and you can see we have no hope of ever clearly seeing them from Earth, but only as part of a supermassive galaxy cluster composed of billions upon billions (upon billions) of stars.

On the other hand, the solid angle law is itself not related to the receptor "getting hit continuously". It simply comes from the intuitive fact that objects at a distance appear smaller than closer objects (both in width and height, thus the inverse square relationship) and so you need more photons coming from this smaller area to make it appear as bright as the larger area. Of course, as you correctly deduced, this assumes that light permeates space, and breaks down when the photon density becomes too low.

Note that receptors don't actually measure radiance, since they don't care about the characteristics (distance, shape) of the object whose emissions they are picking up. All they care about is the incident power (energy per second) over the receptor's area, which is not radiance but irradiance (in watts per square metre). The two are related and you have the correct idea, so this is just a terminology nitpick.
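To make the distinction concrete, here is a small sketch of my own (not from the post above) using the usual small-source approximation: the irradiance a distant source produces at a receptor is roughly its radiance times the solid angle it subtends, so the irradiance falls off as the inverse square of the distance while the radiance itself doesn't depend on distance at all.

```python
# Small-source approximation relating the quantities above: a source of
# area A at distance d subtends a solid angle of roughly A / d^2, and a
# receptor facing it head-on sees an irradiance of about L * Omega,
# where L is the source's radiance. All numbers are made up for illustration.

def subtended_solid_angle(source_area, distance):
    """Approximate solid angle (steradians) subtended by a small source."""
    return source_area / (distance * distance)

def irradiance_at_receptor(radiance, source_area, distance):
    """Irradiance (W/m^2) at the receptor from a small source with the
    given radiance (W / m^2 / sr), facing the receptor head-on."""
    return radiance * subtended_solid_angle(source_area, distance)

# Doubling the distance quarters the irradiance (inverse square law),
# while the radiance passed in stays exactly the same:
print(irradiance_at_receptor(1000.0, 1.0, 10.0))   # 10.0 W/m^2
print(irradiance_at_receptor(1000.0, 1.0, 20.0))   # 2.5  W/m^2
```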

This reply brought to you by a computer graphics/radiometry background, so I don't know about any extra effects happening in space. Perhaps there are more things to consider, e.g. interstellar dust, gravity wells... but I would guess those are negligible :)

Argh, of course photons. :\

Anyway, thanks for your answer. What I actually had in mind when I wrote "receptors" was the combination of receptors with a lens, which would limit the incident photon directions for every receptor. For such a case (like the human eye) it would totally make sense to use radiance, wouldn't it?

No, it wouldn't. The human eye, for instance, only measures the total number of photons (taking frequency into account, for color, etc.) incident on the retina at various locations of the retina (well, it's more complicated than that, but yeah). That's all it does, at a low level. The direction those photons arrive at the retina from (in this case they all arrive at the retina at normal incidence, due to the lens inside our eyes, which is good for avoiding artifacts) is irrelevant here, for they could be coming from a flashlight ten metres away from you or a star billions of km away. That isn't a characteristic of the light emission which the receptor measures, but of the object (and the eye), and so it doesn't really make sense in the context of a receptor.

Now if you were talking about how much energy "this object right there" emits and how much is received by "this receptor here a metre away", then yes, radiance makes sense in that case, and you would say that "this object subtends a solid angle of X with respect to the receptor, and so the radiance is Y" or something like that. But in general, a "receptor" in the broader sense only measures whatever light hits it, and doesn't know nor care what is emitting it, so "radiance" is meaningless in this case.
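As a quick illustration of that last point (again my own sketch, using the small-source approximation from before): to turn the irradiance a receptor actually measures into a radiance for a particular object, you have to divide by the solid angle that object subtends, which is information about the object and its distance, not about the receptor.

```python
def radiance_of_source(measured_irradiance, subtended_solid_angle):
    """Radiance (W / m^2 / sr) of a source, inferred from the irradiance
    (W/m^2) measured at the receptor and the solid angle (sr) the source
    subtends as seen from the receptor."""
    return measured_irradiance / subtended_solid_angle

# The same measured irradiance implies wildly different radiances
# depending on how large the source appears from the receptor
# (all numbers made up for illustration):
print(radiance_of_source(10.0, 1e-2))   # 1000.0 W/m^2/sr -- large, nearby source
print(radiance_of_source(10.0, 1e-5))   # 1e6    W/m^2/sr -- tiny, distant source
```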

In fact, in the case of the human eye, I think it's the brain that does the job of discerning which objects are responsible for emitting or reflecting such and such light at various locations in the image you see. The eye merely conveys a bunch of colors and shapes to the brain. I'm not sure about that, though.
