Well, now I feel like an idiot. I had a typo in my C++ code when setting the uniform for my position texture (it was spelled "texPositon"). My position was always (0, 0, 0), which gave me a bad falloff value. Thanks for the help!
It's subtle, but this has two point lights and a directional light blending nicely. And I can see it from far away.
So here's the new issue. The lights are blending into each other better (see fig. 1), but I have to be inside (or almost inside) the sphere to see it. Everything goes black if I back out (see figs. 2 and 3). The falloff variable in my fragment shader controls how much to mix the color; it should be determined by how far the current light vertex is from the position sampled from the G-buffer. Both are calculated using gl_ModelViewProjectionMatrix * gl_Vertex, so the values should remain relative to each other. I'm not sure if the camera's position is somehow reducing this distance, or if there is something inherent in drawing these spheres that is blocking the view of what's inside.
Fig. 1 Lights are now blending into each other
Fig. 2 Image gets darker from outside
Fig. 3 Light position when the image gets too dark to see
The lights are blending, but now it seems I can only see the effect from "inside" the lights, or close to it. If I zoom out, I would expect to see the illuminated scene, just smaller; instead it goes black, as if the only visible area were inside the sphere.