Hello,
I'm building a ray tracer and something really weird is going on :P
Global variables:
D3DXVECTOR3 camPos(0.0f, 0.0f, -10.0f);
float fovW = (45.0f)*3.14f/180;
float fovH = (float)HEIGHT / (float)WIDTH * fovW;
I create 5 spheres at:
(0, 1, 50.0f)
(0, -3, 100.0f)
(2, 0, 90.0f)
(-1, 0, 75.0f)
(-4, 4, 75.0f)
and one light at (0,0,1000)
To create the ray for each pixel, I use the camera position as its origin and the following as its direction:
float xz = ( (2.0f * (float)x) - (float)WIDTH) / (float)WIDTH * tanFovW;
float yz = ( (2.0f * (float)y) - (float)HEIGHT) / (float)HEIGHT * tanFovH;
D3DXVECTOR3 m_direction = D3DXVECTOR3(xz - camPos.x, yz - camPos.y, -camPos.z);
NORMALIZE(m_direction);
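For reference, here's a minimal self-contained sketch of conventional per-pixel ray generation with a plain vec3 instead of D3DXVECTOR3. The post never shows how tanFovW/tanFovH are defined, so this sketch assumes they are tan(fov/2); note it also keeps the direction purely in camera space (image plane at unit distance, looking down +z) rather than mixing in camPos as the snippet above does:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

// Stand-ins for the globals above.
const int WIDTH = 640, HEIGHT = 480;
const float fovW = 45.0f * 3.14159265f / 180.0f;
const float fovH = (float)HEIGHT / (float)WIDTH * fovW;

// Assumption: tanFovW/tanFovH are the tangents of the *half* fov angles.
Vec3 pixelRayDir(int x, int y) {
    float tanFovW = std::tan(fovW * 0.5f);
    float tanFovH = std::tan(fovH * 0.5f);
    // Map pixel coords to [-1, 1), then scale by the half-fov tangents.
    float xz = ((2.0f * (float)x) - (float)WIDTH)  / (float)WIDTH  * tanFovW;
    float yz = ((2.0f * (float)y) - (float)HEIGHT) / (float)HEIGHT * tanFovH;
    // Image plane at unit distance in front of the camera, camera looking down +z.
    return normalize({ xz, yz, 1.0f });
}
```

With camPos = (0, 0, -10), the snippet above ends up with a z component of -camPos.z = 10 before normalization, which makes every ray nearly parallel to the view axis; that's one difference worth double-checking.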
Notice that the light is further ahead than every ball and the camera, yet I get this result:
Also, regardless of whether the objects are in shadow, I get the same result :S
Any ideas why I am getting this result instead of properly shaded spheres, even though I calculate the Lambert term?
// 90+ degrees
if (lambert > 0)
{
    const float DIFFUSE_COEF = 0.8f;
    tempSphere->color = lambert * DIFFUSE_COEF;
}