In pixel shaders (SM3 through SM5) I often do some mixing / interpolation between two or more vectors (e.g. colors) or scalars (e.g. luminances).
A simple example of common code, as used for instance in a Gaussian-blur-like implementation, looks like this:
float4 mixColor = (tex2D(colorSampler, uv-blur)+tex2D(colorSampler, uv+blur)) / 2.f;
I would like to improve that, both performance-wise and with respect to instruction limits, by making use of hardware-supported HLSL intrinsic functions, but I'm not sure what would work best here. lerp(), for example? I think I'd basically need the opposite of the mad() intrinsic. Are there better ways?
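For reference, here is a sketch of the lerp() variant I have in mind, reusing the colorSampler / uv / blur names from the snippet above (whether this actually saves instructions is an assumption and would need to be checked against the compiled assembly):

```hlsl
float4 a = tex2D(colorSampler, uv - blur);
float4 b = tex2D(colorSampler, uv + blur);

// lerp(a, b, 0.5f) averages the two samples:
// a + (b - a) * 0.5f, which maps to a single mad per component.
float4 mixColor = lerp(a, b, 0.5f);

// Alternatively, replacing the divide with a multiply avoids
// a potentially more expensive division:
float4 mixColor2 = (a + b) * 0.5f;
```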