How do I determine camera view angle to an object with the object's rotation factored in, to display a 2d "imposter" based on the actual view angle of the object?
I can get the camera angle to the object with:
half3 center = mul(input.inPos, xWorld).xyz; // object's position in world space
half3 EyeVector = normalize(center - CameraPosition); // direction from camera to object
float lookYaw = atan2(EyeVector.x, EyeVector.z); // yaw of that direction in world space
But I'm not having any luck factoring the object's rotation into that view angle. I've tried adding and subtracting the ObjectRotation vector to/from EyeVector, and I've tried atan2'ing the ObjectRotation vector and then adding or subtracting that result to/from lookYaw (roughly what's sketched below). All of the vectors involved are normalized.
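For concreteness, here's a minimal sketch of the atan2 version of what I've been attempting. I'm assuming ObjectRotation holds the object's world-space forward direction and that the object only rotates about the Y axis (both of those are assumptions on my part, so this may be where I'm going wrong):

float objectYaw = atan2(ObjectRotation.x, ObjectRotation.z); // object's heading in world space
float relativeYaw = lookYaw - objectYaw; // view angle expressed in the object's local frame
relativeYaw = atan2(sin(relativeYaw), cos(relativeYaw)); // wrap back into [-PI, PI]

My thinking on the last line is that without the wrap the raw difference can land outside [-PI, PI] and select the wrong imposter frame, but I may be missing something more fundamental than that.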
Hope this makes sense. Thanks!