It depends on the GL specification and the texture format.
For the glTexImage2D part: the spec says that if you use a luminance format, GL expands each texel to RGB (R = G = B = L) and sets alpha to 1 when the texture is sampled.
So that's what you see in GLSL as well. In other words, that's how the GPU is wired.
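A quick sketch of what that looks like from a fragment shader (the sampler name is made up; this assumes the bound texture was uploaded as GL_LUMINANCE):

```glsl
uniform sampler2D tex; // assumed bound to a GL_LUMINANCE texture

void main()
{
    // For a luminance texel with value L, the fetched vec4 is (L, L, L, 1.0)
    vec4 c = texture2D(tex, gl_TexCoord[0].st);
    // so c.r == c.g == c.b, and c.a is always 1.0
    gl_FragColor = c;
}
```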
You should not be able to use the in qualifier on a uniform. The in qualifier is only for attributes (in vec4 in_Position) and varyings (in vec4 mid_Colour), so it's possible the driver is buggy. Is it AMD or nVidia?
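To illustrate, here's a minimal vertex shader sketch (names are just examples) showing where each qualifier is legal:

```glsl
// GLSL 1.30+ style declarations
in vec4 in_Position;       // OK: per-vertex attribute
out vec4 mid_Colour;       // OK: varying passed to the fragment shader

uniform mat4 u_ModelView;  // correct: uniforms take no in/out qualifier
// in mat4 u_ModelView;    // wrong: a strict compiler should reject this

void main()
{
    gl_Position = u_ModelView * in_Position;
    mid_Colour = vec4(1.0);
}
```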
And yes, you can ask users to buy themselves a GL 2.0 or above card. There are a lot of Intel chips out there that could handle GL 2.0 just fine, but Intel doesn't write proper drivers for them. They are limited to 1.4 or 1.5, I think, and are extremely buggy. You'd better go with Direct3D if you want to support Intel.
Anything I can try to get it running on an ATI card? I'm going to try it on an nVidia card tonight, but sadly I have to work on this ATI one!
ATI is known for strict syntax requirements, while nVidia has a less strict interpretation of GLSL. That often results in shaders running flawlessly on nVidia cards but not running at all on ATI. Always check for errors when compiling your shaders and log them; the error messages contain detailed information about the type and location of syntax errors.
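The error-checking step looks roughly like this in C (a sketch only; it assumes a current GL 2.0 context and a null-terminated `source` string):

```c
#include <stdio.h>
#include <stdlib.h>
#include <GL/gl.h>

/* Compile a fragment shader and print the info log on failure. */
GLuint compile_fragment_shader(const char *source)
{
    GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(shader, 1, &source, NULL);
    glCompileShader(shader);

    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (ok != GL_TRUE) {
        GLint len = 0;
        glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &len);
        char *log = malloc(len > 0 ? (size_t)len : 1);
        glGetShaderInfoLog(shader, len, NULL, log);
        fprintf(stderr, "shader compile failed:\n%s\n", log);
        free(log);
        glDeleteShader(shader);
        return 0;
    }
    return shader;
}
```

On ATI drivers the info log usually pinpoints the offending line, which is exactly what you need to track down the stricter syntax complaints.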
He isn't using GLSL. He is using the 8-year-old ASM extension.