Hi,
I am quite sure I am doing something wrong here, but I'm not able to figure it out. Basically,
D3DXMatrixLookAtLH(&m_matView, &m_vecPosition, &m_vecLookAtPoint, &m_WorldUp);
results in: (first screenshot)
Whereas my custom view matrix calculation:
m_vecLookAtPoint = m_vecLookAtPoint - m_vecPosition;
D3DXVec3Normalize(&m_vecLookAtPoint, &m_vecLookAtPoint);
D3DXVec3Cross(&m_vecRight, &m_vecLookAtPoint, &m_WorldUp);
D3DXVec3Normalize(&m_vecRight, &m_vecRight);
D3DXVec3Cross(&m_WorldUp, &m_vecLookAtPoint, &m_vecRight);
D3DXVec3Normalize(&m_WorldUp, &m_WorldUp);
m_matView._11 = m_vecRight.x; m_matView._12 = m_WorldUp.x; m_matView._13 = m_vecLookAtPoint.x; m_matView._14 = 0;
m_matView._21 = m_vecRight.y; m_matView._22 = m_WorldUp.y; m_matView._23 = m_vecLookAtPoint.y; m_matView._24 = 0;
m_matView._31 = m_vecRight.z; m_matView._32 = m_WorldUp.z; m_matView._33 = m_vecLookAtPoint.z; m_matView._34 = 0;
m_matView._41 = -D3DXVec3Dot(&m_vecPosition, &m_vecRight);
m_matView._42 = -D3DXVec3Dot(&m_vecPosition, &m_WorldUp);
m_matView._43 = -D3DXVec3Dot(&m_vecPosition, &m_vecLookAtPoint);
m_matView._44 = 1.0f;
results in: (second screenshot)
If I manually change the WorldUp vector to (0, -1, 0) instead of (0, 1, 0), I get the default output as in the first image.
These are the vectors used for the matrix calculations:
m_vecPosition = D3DXVECTOR3(3.0f, 3.0f, -10.0f);
m_WorldUp = D3DXVECTOR3(0.0f, 1.0f, 0.0f);
m_vecRight = D3DXVECTOR3(1.0f, 0.0f, 0.0f);
m_vecLookAtPoint = D3DXVECTOR3(0.0f, 0.0f, 0.0f);
The question is: why do I have to invert my world-up vector during the custom view matrix calculation? What is it that I am doing wrong?
Cheers.