Bloom is sometimes (ab)used to simulate phenomena that it's not really suited for. e.g. on my last project, we used a really wide bloom filter to make up for a lack of atmospheric scattering -- that effect should technically happen in the world, but it was cheap/easy to do it in screen-space.
When talking about glare, there's more than one part to it:
Part of it happens near the light source itself -- the air will scatter some of the light, causing a 'fog' around the source.
Part of it happens near the sensor, which is actually the same effect as "lens flare"! Imperfections in the lenses cause diffraction/scattering/aberrations/etc, which causes some of the light not to land where the lens system intended it to land.
Both of these effects are most noticeable when the background is very dark, so there's a large amount of contrast between the light and background. If, for example, the lens causes 0.1% of the incoming light to become in-lens glare (or if the air causes 0.1% of the light to be scattered), then this won't be noticed with a dim light and bright background. But if the background is 0.01% as bright as the light source, then the glare will overpower the background and make it hard to see.
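To make that contrast argument concrete, here's a back-of-envelope version of the numbers above (the actual values are hypothetical, just matching the percentages in the example):

```python
# Back-of-envelope version of the contrast argument (illustrative numbers).
light = 1000.0            # source intensity, arbitrary linear units
glare = 0.001 * light     # 0.1% of the light scattered into glare -> 1.0

bright_bg = 0.5 * light   # bright background: glare is only 0.2% of it
dim_bg = 0.0001 * light   # dim background at 0.01% of the source -> 0.1

# Against the bright background the glare is negligible; against the
# dim one it's 10x brighter than the background itself.
ratio = glare / dim_bg
print(ratio)  # 10.0
```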
Physically speaking, it's correct to simulate the second part using a screen-space bloom filter. But games often use bloom to simulate the first part as well.
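For reference, the usual screen-space bloom filter being discussed here is roughly: bright-pass the HDR image, blur the result, add it back on top. A minimal NumPy sketch of that idea (all thresholds/parameters are made-up illustrative values, not from any particular engine):

```python
# Minimal screen-space bloom sketch: bright-pass -> separable blur -> additive
# composite. Parameters are illustrative, not from any particular engine.
import numpy as np

def gaussian_kernel_1d(radius, sigma):
    x = np.arange(-radius, radius + 1, dtype=np.float64)
    k = np.exp(-(x * x) / (2.0 * sigma * sigma))
    return k / k.sum()

def bloom(image, threshold=1.0, radius=4, sigma=2.0, intensity=0.5):
    """image: 2D array of linear HDR luminance values."""
    brights = np.maximum(image - threshold, 0.0)  # bright-pass
    k = gaussian_kernel_1d(radius, sigma)
    # separable Gaussian blur: horizontal pass, then vertical pass
    blurred = np.apply_along_axis(lambda r: np.convolve(r, k, mode='same'), 1, brights)
    blurred = np.apply_along_axis(lambda c: np.convolve(c, k, mode='same'), 0, blurred)
    return image + intensity * blurred

# tiny "scene": one very bright pixel on a dim background
img = np.full((16, 16), 0.01)
img[8, 8] = 100.0
out = bloom(img)  # neighbours of (8, 8) now glow; distant pixels are untouched
```

In a real renderer you'd do the blur at reduced resolution (or several cascaded resolutions) to get the wide, cheap kernels the post mentions, but the structure is the same.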
n.b. even with a pinhole camera, which in theory has perfect focus (just like our rasterization with standard projection matrices gives us), not all of the light will land on the intended sensor pixel. When the light passes through the pinhole, a very small percentage will be bent outwards and land in rings around the intended sensor pixel. If the light is bright enough, these halos will become noticeable as glare/bloom.
Edited by Hodgman, 24 May 2013 - 12:56 AM.