FantasyVII

Texture mapping coordinates for OpenGL and DirectX


I'm trying to map a texture onto a quad in DirectX 11 and OpenGL 4.5. From my understanding, this is how the texture mapping coordinates for OpenGL and DirectX should look:

 

[Image: 20100531_DX_OpenGL.png - texture coordinate origins in OpenGL (bottom-left) and DirectX (top-left)]

 

However, in my case both OpenGL and DirectX appear to be using OpenGL's way of mapping, and I don't understand why that is happening.

 

These are my UVs for both DirectX and OpenGL. The image looks perfect in both APIs. This way of mapping should only work in OpenGL's coordinate system; in DirectX the image should appear upside down. However, that is not the case.

//create vertices clockwise
//Top Left
tri1Vertices[0].position = Vector3(-1.0f, 1.0f, 0.0f);
tri1Vertices[0].UV = Vector2(0.0f, 1.0f);

//Top Right
tri1Vertices[1].position = Vector3(1.0f, 1.0f, 0.0f);
tri1Vertices[1].UV = Vector2(1.0f, 1.0f);

//Bottom Right
tri1Vertices[2].position = Vector3(1.0f, -1.0f, 0.0f);
tri1Vertices[2].UV = Vector2(1.0f, 0.0f);

//Bottom Left
tri1Vertices[3].position = Vector3(-1.0f, -1.0f, 0.0f);
tri1Vertices[3].UV = Vector2(0.0f, 0.0f);

unsigned int indices[6] { 0, 1, 2, 2, 3, 0 };
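
For context, the snippet above assumes vertex and vector types roughly like the following (hypothetical definitions; the actual ones aren't shown in the post):

// Hypothetical minimal types matching the vertex code above; not the poster's actual definitions.
struct Vector2 { float x, y;    Vector2(float x = 0, float y = 0) : x(x), y(y) {} };
struct Vector3 { float x, y, z; Vector3(float x = 0, float y = 0, float z = 0) : x(x), y(y), z(z) {} };

struct Vertex
{
    Vector3 position; // quad corner position
    Vector2 UV;       // texture coordinate
};

Vertex tri1Vertices[4]; // the four corners filled in above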
 
If I switch the V values to match DirectX's way of mapping, the image appears upside down in both OpenGL and DirectX.
 
 
//Top Left
tri1Vertices[0].UV = Vector2(0.0f, 0.0f);

//Top Right
tri1Vertices[1].UV = Vector2(1.0f, 0.0f);

//Bottom Right
tri1Vertices[2].UV = Vector2(1.0f, 1.0f);

//Bottom Left
tri1Vertices[3].UV = Vector2(0.0f, 1.0f);
 
I don't understand why this is happening.
Edited by FantasyVII


Are you loading the images the right way up in both APIs? I actually flip mine in GL to match up the APIs.

Edited by Promit


I'm using FreeImage to load the image for both APIs. I believe FreeImage stores the image from the bottom-left to the top-right. Do you think that is why the UV coordinates are the same for both APIs? Does the way you load the image matter?
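
(For reference, the kind of load-and-upload path being described might look roughly like this. It is only a sketch, not the actual code from this thread; it assumes FreeImage's 32-bit BGRA output and an existing ID3D11Device* named device.)

// Sketch: load once with FreeImage, then hand the same buffer to both APIs.
// FreeImage documents its scanlines as stored bottom-up, matching the observation above.
FIBITMAP* dib   = FreeImage_Load(FIF_PNG, "image.png", PNG_DEFAULT);
FIBITMAP* dib32 = FreeImage_ConvertTo32Bits(dib); // 8 bits per channel, BGRA
unsigned  width  = FreeImage_GetWidth(dib32);
unsigned  height = FreeImage_GetHeight(dib32);
BYTE*     pixels = FreeImage_GetBits(dib32);

// OpenGL: the first row of 'pixels' corresponds to the lower-left of the texture (V = 0).
GLuint glTex;
glGenTextures(1, &glTex);
glBindTexture(GL_TEXTURE_2D, glTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
             GL_BGRA, GL_UNSIGNED_BYTE, pixels);

// Direct3D 11: the first row of 'pixels' is also placed at V = 0,
// but D3D treats V = 0 as the top of the texture.
D3D11_TEXTURE2D_DESC desc = {};
desc.Width            = width;
desc.Height           = height;
desc.MipLevels        = 1;
desc.ArraySize        = 1;
desc.Format           = DXGI_FORMAT_B8G8R8A8_UNORM;
desc.SampleDesc.Count = 1;
desc.Usage            = D3D11_USAGE_DEFAULT;
desc.BindFlags        = D3D11_BIND_SHADER_RESOURCE;

D3D11_SUBRESOURCE_DATA initData = {};
initData.pSysMem     = pixels;
initData.SysMemPitch = FreeImage_GetPitch(dib32);

ID3D11Texture2D* d3dTex = nullptr;
device->CreateTexture2D(&desc, &initData, &d3dTex);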


AFAIK both APIs do use the same UV coordinate systems and it's a common misconception that they're inverted from each other...

 

GL and D3D do, however, define a different interpretation for the image data that you pass them to create a texture. IIRC GL assumes the first row of pixels you pass it is the bottom of the image, and D3D assumes the first row of pixels is the top of the image. So if your image loader behaves the same way for both APIs and you simply pass that buffer of pixels to both of them unmodified, the image will end up flipped in one of the APIs.
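
(Along the lines of what Promit describes above, one way to handle that is to reverse the row order before uploading to whichever API expects the opposite layout. A rough sketch, not from this thread, assuming tightly packed rows:)

// Sketch: flip an image vertically in place before uploading.
// Assumes pitch == width * bytesPerPixel (no row padding); bytesPerPixel would be 4
// for the 32-bit case discussed in this thread.
#include <algorithm>
#include <vector>

void FlipImageVertically(unsigned char* pixels, int width, int height, int bytesPerPixel)
{
    const int rowSize = width * bytesPerPixel;
    std::vector<unsigned char> temp(rowSize);

    for (int y = 0; y < height / 2; ++y)
    {
        unsigned char* top    = pixels + y * rowSize;
        unsigned char* bottom = pixels + (height - 1 - y) * rowSize;

        // Swap the top and bottom rows through a temporary buffer.
        std::copy(top, top + rowSize, temp.data());
        std::copy(bottom, bottom + rowSize, top);
        std::copy(temp.data(), temp.data() + rowSize, bottom);
    }
}

// FreeImage users could instead call FreeImage_FlipVertical(dib) before FreeImage_GetBits.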

 

[edit] agh, apparently the UV systems are Y-inverted from each other :(

Edited by Hodgman


> AFAIK both APIs do use the same UV coordinate systems and it's a common misconception that they're inverted from each other...

 

So both APIs use 0,0 UV for the bottom-left corner?

 

This feels like the row-major/column-major matrix misconception all over again :P

 

 

> So if your image loader behaves the same way for both APIs and you simply pass that buffer of pixels to both of them unmodified, the image will end up flipped in one of the APIs.

 

But the image is not flipped in either API. It is displayed correctly as long as I use UV coordinates with 0,0 at the bottom-left corner.

 

Are you saying that the image should be flipped in one of the APIs if I load the image the same way for both? Because that is not the case here.


> So both APIs use 0,0 UV for the bottom-left corner?


No.
 

OpenGL uses {0,0} for bottom-left, D3D uses {0,0} for top-left.

 

However, when you're loading data into a texture, OpenGL begins loading at the bottom-left and D3D begins loading at the top-left.

 

So what happens in practice is that the differences cancel each other out and you can use the same texcoords for both.


 

> So both APIs use 0,0 UV for the bottom-left corner?


> No.
 

> OpenGL uses {0,0} for bottom-left, D3D uses {0,0} for top-left.

 

> However, when you're loading data into a texture, OpenGL begins loading at the bottom-left and D3D begins loading at the top-left.

 

> So what happens in practice is that the differences cancel each other out and you can use the same texcoords for both.

 

 

Ooh, alright, that makes sense. Thanks!


Note that it's still worth 'unifying' your code. Once you start using render targets as textures, the cancellation will not happen (since there is no loading of an image), and you'll need to have the texture coordinates correct for both APIs.
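
(For example, one simple approach, shown here only as a sketch and not taken from this thread, is to pick a single texture-coordinate convention and flip V only when sampling a render target under the API whose origin differs. The Vector2 type here is the same one used in the vertex code above.)

// Sketch: keep one convention for texcoords (say, V = 0 at the top, D3D-style) and
// compensate only for OpenGL render targets, where the rendered image has row 0 at the bottom.
Vector2 RenderTargetUV(Vector2 uv, bool isOpenGL)
{
    if (isOpenGL)
        uv.y = 1.0f - uv.y; // flip V to account for GL's bottom-left origin
    return uv;
}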


> Note that it's still worth 'unifying' your code. Once you start using render targets as textures, the cancellation will not happen (since there is no loading of an image), and you'll need to have the texture coordinates correct for both APIs.

 

good idea. Thx  ^_^
