
DirectX vs OpenGL vertex normals

I currently have a generic vertex struct which holds position, colour and normal properties. (I'm not up to UV coords yet).

I can get my cube to display correctly in DirectX, but in OpenGL I get nothing. I may be missing something, but I think I've been pretty thorough in checking light properties, drawing method calls, etc. When I switch to wireframe mode, the cube comes up correctly.

I did, however, need to apply a scale of (1, 1, -1) to the view matrix to get the cube to appear in the same position on screen as in DirectX. I believe this is because one API has +Z and the other has -Z coming out of the screen.

I am wondering: since I share vertex data between both render engines, do the surface normals also require a scale of some kind to get them into the same space, or should I be OK sharing normal data between the APIs?


I'll offer my usual advice here, which is:

Since both OpenGL and Direct3D can be set up to be either left-handed or right-handed, I suggest using the same coordinate system setup for both APIs so that you don't have to worry about mirroring/reflecting, changing triangle windings or cull modes, or anything like that. Just use the same setup for both APIs, and all those problems will go away.

(Note that OpenGL and Direct3D use different z clipping ranges for the canonical view volume: OpenGL's NDC z spans [-1, 1] while Direct3D's spans [0, 1]. You'll need to set up your projection matrices differently for each API in that respect.)

