It's photons, not protons. And yes, at any sufficiently large distance the photon density will simply be too low for us to see anything, but you're talking hundreds of light years before this becomes relevant. This is why Hubble cannot see individual stars at truly extreme distances, but only whole galaxies and galaxy clusters, which emit unimaginably more photons than any one star.
For reference, our Sun emits about 10^45 photons per second, overall. Some back-of-the-envelope calculations indicate that at a distance of 200 light years you would be hard-pressed to even detect it with the naked eye, as on average only about 3000 photons would hit your eyes every second.
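If you want to check that arithmetic, here's the back-of-the-envelope version in Python (a rough sketch; the luminosity, mean photon energy, and 7 mm dark-adapted pupil are round numbers I'm assuming, not measured values):

```python
import math

L_SUN = 3.8e26          # solar luminosity in watts (round number)
E_PHOTON = 2.15e-19     # mean photon energy for a ~5800 K blackbody, J (~1.3 eV)
LIGHT_YEAR = 9.46e15    # metres

photon_rate = L_SUN / E_PHOTON                # ~1.8e45 photons emitted per second
r = 200 * LIGHT_YEAR                          # observer distance in metres
flux = photon_rate / (4 * math.pi * r ** 2)   # photons per m^2 per second at r

pupil_area = math.pi * 0.0035 ** 2            # one ~7 mm dark-adapted pupil
print(flux * 2 * pupil_area)                  # ~3000 photons/s into both eyes
```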
Now consider that most stars in the universe are millions of light years away, and you can see we have no hope of ever making them out individually from Earth, only as part of an entire galaxy composed of billions upon billions (upon billions) of stars.
On the other hand, the solid-angle (inverse-square) law is itself not related to the receptor "getting hit continuously". It simply comes from the intuitive fact that distant objects appear smaller than closer ones (in both width and height, hence the inverse-square relationship), so more photons must come from that smaller apparent area for it to look as bright as the larger one. Of course, as you correctly deduced, this treats light as a continuous flow filling space, and it breaks down when the photon density becomes too low.
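To make the solid-angle intuition concrete, here's a tiny sketch using the small-angle approximation (which I'm assuming holds at stellar distances):

```python
import math

def solid_angle(obj_radius, distance):
    """Solid angle (steradians) subtended by a disc of radius obj_radius,
    in the small-angle approximation: it shrinks in width AND height."""
    return math.pi * obj_radius ** 2 / distance ** 2

# Doubling the distance halves the apparent width and the apparent height,
# so the solid angle (and hence the received flux) drops by 4x:
print(solid_angle(1.0, 10.0) / solid_angle(1.0, 20.0))  # 4.0
```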
Note that receptors don't actually measure radiance, since they don't care about the characteristics (distance, shape) of the object whose emissions they are picking up. All they care about is the incident power (energy per second) over the receptor's area, which is not radiance but irradiance (in watts per square metre). The two are related, and you have the right idea, so this is just a terminology nitpick.
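As a units sanity check: irradiance is just the total flux spread over a sphere. Plugging in round numbers for the Sun as seen from Earth (assuming an isotropic source) recovers the familiar solar constant:

```python
import math

L_SUN = 3.828e26      # total radiant flux of the Sun, W
AU = 1.496e11         # Earth-Sun distance, m

# Irradiance = power per unit receiver area (W/m^2); it depends only on
# the total flux and the distance, not on the emitter's shape.
E = L_SUN / (4 * math.pi * AU ** 2)
print(E)              # ~1361 W/m^2, the familiar "solar constant"
```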
This reply brought to you by a computer graphics/radiometry background, so I don't know about any extra effects happening in space. Perhaps there are more things to consider, e.g. interstellar dust, gravity wells... but I would guess those are negligible.
The slowsort algorithm is a perfect illustration of the multiply and surrender paradigm, which is perhaps the single most important paradigm in the development of reluctant algorithms. The basic multiply and surrender strategy consists in replacing the problem at hand by two or more subproblems, each slightly simpler than the original, and continue multiplying subproblems and subsubproblems recursively in this fashion as long as possible. At some point the subproblems will all become so simple that their solution can no longer be postponed, and we will have to surrender. Experience shows that, in most cases, by the time this point is reached the total work will be substantially higher than what could have been wasted by a more direct approach.
- Broder and Stolfi, Pessimal Algorithms and Simplexity Analysis
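For the curious, the algorithm itself is short enough to sketch. Here's my Python transcription of slowsort as the paper describes it (the paper presents it in pseudocode; the variable names here are mine):

```python
def slowsort(A, i, j):
    """Sort A[i..j] in place by multiply and surrender."""
    if i >= j:
        return
    m = (i + j) // 2
    slowsort(A, i, m)            # multiply: "sort" the first half...
    slowsort(A, m + 1, j)        # ...and the second half
    if A[m] > A[j]:              # the halves' maxima now sit at A[m] and A[j];
        A[m], A[j] = A[j], A[m]  # surrender: put the overall maximum at A[j]
    slowsort(A, i, j - 1)        # multiply again on everything but that maximum

xs = [3, 1, 4, 1, 5, 9, 2, 6]
slowsort(xs, 0, len(xs) - 1)
print(xs)  # [1, 1, 2, 3, 4, 5, 6, 9]
```

Despite its best efforts, it does sort correctly; it just keeps multiplying subproblems until it is forced to surrender one maximum at a time.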