E.g. given a forward-rendering pixel shader snippet like:

float3 result = (float3)0;
result += diffuse * nDotL0 * lightColor0;
result += diffuse * nDotL1 * lightColor1;
result += diffuse * nDotL2 * lightColor2;
return result;

If diffuse was white, lightColor0 was pink (255,192,192), lightColor1 was grey (128,128,128) and lightColor2 was green (0,255,0), then (assuming each nDotL term is 1) the result can be as high as (383,575,320). However, when you output that to an 8-bit render target it gets clamped to white (255,255,255).
If your forward renderer used HDR, then after tone-mapping you would instead end up with a greenish-whiteish colour.
This sounds a bit like light pre-pass (which some call "deferred lighting", as opposed to "deferred shading"). If you implement this method with low-range (non-HDR) buffers, then when you place too many lights in one area the result will still look OK, but your lights will lose colour as an artefact: at some point every channel saturates, and the accumulated light clamps to white.
Another idea I had was to render the light attenuation to a separate texture, then use that texture as a mask to blend all lights in the scene at once, avoiding over-saturation. But that would cost another pass, and it wouldn't work well for multi-coloured lights.