Right and Left hand coordinates


Hey, I'm wondering if anyone has had any experience using both coordinate systems in one program. What I'm trying to do is merge two programs: one that uses a left-handed coordinate system and one that uses a right-handed coordinate system. The SDK outlines a method to convert between the two, but I'd like to leave them as they are. However, I'm running into some problems rendering them together. Is there anything more to it than setting each projection matrix before rendering each part? Thanks, Paul

I have done it. What I did was keep two copies of my code: one for Direct3D and one for OpenGL, each optimised for its own coordinate system. It's quite easy to code.

Having a single piece of code for both is not that simple, since many functions have to check and adjust their results for the left- or right-handed coordinate system.
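For illustration only (a hypothetical helper, not code from either project): here is a minimal sketch of the kind of handedness check such a function ends up doing, loosely following the documented D3DXMatrixPerspectiveFovLH/RH layout.

#include <math.h>
#include <string.h>

// Build a perspective projection directly into a float[16] laid out like
// the D3DX matrices. The handedness only changes the sign of the two
// terms that involve view-space z.
void BuildPerspectiveFov(float m[16], float fovY, float aspect,
                         float zn, float zf, bool leftHanded)
{
    const float yScale = 1.0f / tanf(fovY * 0.5f);
    const float xScale = yScale / aspect;
    const float sign   = leftHanded ? 1.0f : -1.0f;   // the only LH/RH difference

    memset(m, 0, 16 * sizeof(float));
    m[0]  = xScale;
    m[5]  = yScale;
    m[10] = sign * zf / (zf - zn);     // z scale
    m[11] = sign;                      // w receives +z (LH) or -z (RH)
    m[14] = -zn * zf / (zf - zn);      // z translation (identical in both)
}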



I'm not sure what you mean by merging two programs. Do you mean two mesh models instead?

If it's a new project, setting the projection matrix for one coordinate system is enough, because both OpenGL and DirectX use the same memory layout. Look at the layout expected by each API's matrix-setting call: the translation portion always sits in the 13th, 14th and 15th elements. Then use your own matrix class and avoid the matrix helper calls provided by the API.
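To illustrate that layout (a minimal sketch with hypothetical names, not the poster's actual class):

// 16 consecutive floats, laid out so the same array can be handed to
// either API. The translation sits in the 13th, 14th and 15th elements
// (indices 12-14).
struct Matrix4
{
    float m[16];

    void SetIdentity()
    {
        for (int i = 0; i < 16; ++i)
            m[i] = 0.0f;
        m[0] = m[5] = m[10] = m[15] = 1.0f;
    }

    void SetTranslation(float x, float y, float z)
    {
        m[12] = x;   // 13th element
        m[13] = y;   // 14th element
        m[14] = z;   // 15th element
    }
};

// Usage, assuming a valid D3D device (m_pd3dDevice) or GL context:
//   Matrix4 world;
//   world.SetIdentity();
//   world.SetTranslation(1.0f, 2.0f, 3.0f);
//   m_pd3dDevice->SetTransform(D3DTS_WORLD, (const D3DMATRIX*)world.m);  // Direct3D
//   glLoadMatrixf(world.m);                                              // OpenGL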

If you are trying to convert meshes (i.e. read meshes exported for a RHS into an LHS rendering system), you have more problems.

Things you need to do (a rough sketch follows the list):
1. Change the winding order of the vertices.
2. Change the transformation matrices of the meshes. Hint: quaternions help here.
3. Rearrange the vertices (this depends on your requirements, but usually I flip them so they face the same way).
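Here is that sketch, using hypothetical Vertex/Mesh types (not from any particular loader); it covers steps 1 and 3, while step 2 depends on how your scene stores node transforms.

#include <vector>
#include <utility>

struct Vertex { float x, y, z, nx, ny, nz; };

struct Mesh
{
    std::vector<Vertex>         vertices;
    std::vector<unsigned short> indices;   // triangle list
};

void ConvertRightToLeftHanded(Mesh& mesh)
{
    // 1. Reverse the winding order of every triangle.
    for (size_t i = 0; i + 2 < mesh.indices.size(); i += 3)
        std::swap(mesh.indices[i + 1], mesh.indices[i + 2]);

    // 3. Mirror positions and normals along z so the geometry keeps
    //    facing the same way in the left-handed world.
    for (size_t i = 0; i < mesh.vertices.size(); ++i)
    {
        mesh.vertices[i].z  = -mesh.vertices[i].z;
        mesh.vertices[i].nz = -mesh.vertices[i].nz;
    }
}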

jwalker: It is possible to do it with one set of code. In fact, I find it much easier to maintain.

Thanks for the replies... let me get across what I'm doing with an example:


// Left-handed projection for the first object
D3DXMatrixPerspectiveFovLH( &matProj, D3DX_PI/4, fAspect, 0.1f, 100.0f );
m_pd3dDevice->SetTransform( D3DTS_PROJECTION, &matProj );
RenderObject();

// Right-handed projection for the second object
D3DXMatrixPerspectiveFovRH( &m_matProjection, D3DXToRadian(60.0f), fAspectRatio, 0.1f, 2000.0f );
m_pd3dDevice->SetTransform( D3DTS_PROJECTION, &m_matProjection );
RenderObject();

I'm mixing the two systems like that. I'm just having some problems placing the two in the same world appropriately, but I think it will just take some tweaking, hopefully.
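A possible alternative (a sketch only, assuming the same device and variable names as above and a D3D9-style fixed-function pipeline): keep a single left-handed projection and fold a z-flip into the world transform of the right-handed content, so that both object sets share one world space.

D3DXMATRIX matProj, matFlipZ;

// One left-handed projection for everything
D3DXMatrixPerspectiveFovLH( &matProj, D3DX_PI/4, fAspect, 0.1f, 100.0f );
m_pd3dDevice->SetTransform( D3DTS_PROJECTION, &matProj );

// Left-handed object: render as usual
RenderObject();

// Right-handed object: mirror it along z; the mirror flips the triangle
// winding, so reverse the cull mode while it is drawn
D3DXMatrixScaling( &matFlipZ, 1.0f, 1.0f, -1.0f );
m_pd3dDevice->SetTransform( D3DTS_WORLD, &matFlipZ );   // combine with the object's own world matrix as needed
m_pd3dDevice->SetRenderState( D3DRS_CULLMODE, D3DCULL_CW );
RenderObject();
m_pd3dDevice->SetRenderState( D3DRS_CULLMODE, D3DCULL_CCW );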

Paul

quote:
Original post by Paul Dolhai
I'm mixing the two systems like that. I'm just having some problems placing the two in the same world appropriately, but I think it will just take some tweaking, hopefully.



Care to explain why you want to do that?

If you are trying to use objects from different coordinate systems (e.g. .X is LHS and .3ds is RHS), then stick to one system and modify the models as I described above when loading.

Note: .X models exported from 3ds Max are not oriented correctly because they use 3ds Max's unusual axis convention (Z is up). If you mix them with .X files from other tools such as SoftImage, you will have trouble.
