Just watched this video about how the sun appears dimmer when viewed from a large distance:
The video states that this effect is due to decreasing intensity over distance. Here is my guess about why this happens:
"Intensity (power per solid angle) is decreasing following the inverse square law, that's right. But what ideal receptors measure is not intensity, but radiance, which is power per solid angle per area. Due to this, ideally a light source should appear with a certain brightness independend of the observation distance. I would guess that this rule breaks down for really large distances because there are too few protons left to ensure that the receptor continuously gets hit by those."
What do you think?
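To put a rough number on the "too few photons" idea, here is a back-of-the-envelope sketch. All the figures are order-of-magnitude assumptions on my part (solar luminosity, a ~550 nm mean photon energy, a dark-adapted pupil of ~6 mm, and an arbitrary threshold of ~100 photons/s for a steady-looking signal), not anything from the video:

```python
import math

# Rough assumed inputs (order-of-magnitude only)
L_sun = 3.8e26          # solar luminosity, W
E_photon = 3.6e-19      # energy of a ~550 nm photon, J (E = hc / lambda)
pupil_radius = 3e-3     # dark-adapted pupil radius, m (~6 mm diameter)
min_rate = 100.0        # assumed photons/s needed for a steady percept

# Total photon emission rate of the star
N_dot = L_sun / E_photon                    # photons per second, ~1e45

# Photon flux at distance r spreads over a sphere of area 4*pi*r^2.
# Solve N_dot * A_pupil / (4*pi*r^2) = min_rate for r:
A_pupil = math.pi * pupil_radius**2
r = math.sqrt(N_dot * A_pupil / (4 * math.pi * min_rate))   # metres

light_year = 9.46e15    # m
print(f"Photon-starvation distance: ~{r / light_year:.0f} light years")
```

With these numbers the continuous-illumination picture starts to fail at a distance of very roughly a few hundred light years, which is consistent with the fact that faint stars are near the limit where photon counting (shot noise) matters for the eye.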