

Decreasing brightness of light sources over really large distances



#1 Tasty Texel   Members   -  Reputation: 1357


Posted 04 March 2013 - 02:00 PM

Just watched this video about how the sun appears dimmer when viewed from a large distance: 

 

It was stated that this effect was due to decreasing intensity over distance. I took the following guess about why this happens:

 

"Intensity (power per solid angle) is decreasing following the inverse square law, that's right. But what ideal receptors measure is not intensity but radiance, which is power per area per solid angle. Because of this, a light source should ideally appear with a certain brightness independent of the observation distance. I would guess that this rule breaks down for really large distances because there are too few protons left to ensure that the receptor continuously gets hit by those."

 

What do you think?




#2 Bacterius   Crossbones+   -  Reputation: 9262


Posted 04 March 2013 - 04:29 PM

It's photons, not protons. And yes, at any sufficiently large distance the photon density will simply be too low for us to see anything, but you're talking hundreds of light years before this becomes relevant. This is why Hubble cannot see individual stars at such distances, only immense galaxy clusters, which emit unimaginably more photons than any one star.

 

For reference, our Sun emits on the order of 10^45 photons per second overall. Some back-of-the-envelope calculations indicate that at a distance of 200 light years you would be hard-pressed to even detect it with the naked eye, as on average only a few thousand photons would hit your eyes every second.
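The back-of-the-envelope estimate above can be sketched in a few lines. This is my own illustration, not from the post: the photon output and pupil area are rough assumed values (the Sun's ~3.8e26 W luminosity divided by a typical visible-photon energy of ~2e-19 J gives roughly 10^45 photons per second).

```python
import math

LIGHT_YEAR_M = 9.461e15        # metres per light year
SUN_PHOTONS_PER_S = 1e45       # rough photon output of the Sun (assumption)
PUPIL_AREA_M2 = 1e-4           # two dark-adapted pupils, ~8 mm diameter each (assumption)

distance_m = 200 * LIGHT_YEAR_M

# The photons spread uniformly over a sphere of radius `distance_m`,
# so the flux at the observer falls off as the inverse square of distance:
flux = SUN_PHOTONS_PER_S / (4 * math.pi * distance_m ** 2)  # photons / m^2 / s

photons_per_second = flux * PUPIL_AREA_M2
print(round(photons_per_second))  # on the order of a few thousand per second
```

Tweaking the assumed pupil area or photon output shifts the result, but the conclusion is robust: at 200 light years only a few thousand photons per second reach the eye.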

 

Now consider that stars in other galaxies are millions of light years away, and you can see we have no hope of ever seeing them clearly from Earth, only as part of a supermassive galaxy cluster composed of billions upon billions (upon billions) of stars.

 

On the other hand, the solid angle law is itself not related to the receptor "getting hit continuously". It simply comes from the intuitive fact that objects at a distance appear smaller than closer objects (both in width and height, thus the inverse square relationship) and so you need more photons coming from this smaller area to make it appear as bright as the larger area. Of course, as you correctly deduced, this assumes that light permeates space, and breaks down when the photon density becomes too low.

 

Note that receptors don't actually measure radiance, since they don't care about the characteristics (distance, shape) of the object they are picking up emissions from. All they care about is the incident power (energy per second) over the receptor's area, which is not radiance but irradiance (in watts per square metre). The two are related and you have the correct idea, so this is just a terminology nitpick.
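As a small sketch of how the two quantities relate (my own illustration, not from the thread; all numbers are example values): for a compact uniform source, the irradiance E at the receptor is approximately the radiance L times the solid angle Ω the source subtends, and Ω falls off as the inverse square of distance while L stays constant.

```python
import math

def solid_angle(radius, distance):
    """Solid angle subtended by a disc source, small-angle approximation."""
    return math.pi * radius ** 2 / distance ** 2  # steradians

L = 2.0e7   # radiance in W / (m^2 * sr) -- arbitrary example value
R = 6.96e8  # source radius in metres (roughly the Sun's)

# Irradiance E ~ L * Omega at two distances:
E1 = L * solid_angle(R, 1.5e11)  # at ~1 AU
E2 = L * solid_angle(R, 3.0e11)  # at ~2 AU

print(E1 / E2)  # ~4.0: irradiance follows the inverse square law
```

Doubling the distance quarters the subtended solid angle and hence the irradiance, even though the radiance of the source is unchanged, which is exactly why a resolved source looks equally bright at any distance while an unresolved one dims.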

 

This reply brought to you by a computer graphics/radiometry background, so I don't know about any extra effects happening in space. Perhaps there are more things to consider, e.g. interstellar dust, gravity wells... but I would guess those are negligible :)


The slowsort algorithm is a perfect illustration of the multiply and surrender paradigm, which is perhaps the single most important paradigm in the development of reluctant algorithms. The basic multiply and surrender strategy consists in replacing the problem at hand by two or more subproblems, each slightly simpler than the original, and continue multiplying subproblems and subsubproblems recursively in this fashion as long as possible. At some point the subproblems will all become so simple that their solution can no longer be postponed, and we will have to surrender. Experience shows that, in most cases, by the time this point is reached the total work will be substantially higher than what could have been wasted by a more direct approach.

 

- Pessimal Algorithms and Simplexity Analysis


#3 Tasty Texel   Members   -  Reputation: 1357


Posted 04 March 2013 - 05:11 PM

Argh, of course photons. :\ 

Anyway, thanks for your answer. What I actually had in mind when I wrote "receptors" was a receptor combined with a lens, which would limit the incident photon directions for each receptor. In such a case (like the human eye), it would totally make sense to use radiance, wouldn't it?



#4 Bacterius   Crossbones+   -  Reputation: 9262


Posted 04 March 2013 - 05:19 PM

Anyway, thanks for your answer. What I actually had in mind when I wrote "receptors" was a receptor combined with a lens, which would limit the incident photon directions for each receptor. In such a case (like the human eye), it would totally make sense to use radiance, wouldn't it?

 

No, it wouldn't. The human eye, for instance, only measures the total number of photons (taking frequency into account, for color, etc.) incident on the retina at various locations (well, it's more complicated than that, but yeah). That's all it does, at a low level. The direction at which those photons arrive at the retina (in this case roughly normal incidence, due to the lens inside our eyes, which is good for avoiding artifacts) is irrelevant: they could be coming from a flashlight ten metres away from you or a star billions of km away. This isn't a characteristic of the light emission which the receptor measures, but of the object (and the eye), and so it doesn't really make sense in the context of a receptor.

 

Now if you were talking about how much energy "this object right there" emits and how much is received by "this receptor here a metre away", then yes, radiance makes sense, and you would say "this object subtends a solid angle of X with respect to the receptor, so the radiance is Y" or something like that. But in general, a "receptor" in the broader sense only measures whatever light hits it, and neither knows nor cares what is emitting it, so "radiance" is meaningless in this case.

 

In fact, in the case of the human eye, I think it's the brain that does the job of discerning which objects are responsible for emitting or reflecting such and such light at various locations in the image you see. The eye merely conveys a bunch of colors and shapes to the brain. I'm not sure about that, though.


Edited by Bacterius, 04 March 2013 - 05:20 PM.





