Yours3!f

OpenGL DX11 changing window coordinate origin (upper-left -> lower-left)


hi there,

 

I want to change the window coordinate origin to use OpenGL conventions (i.e. lower-left is (0,0)).
In GLSL the default origin is lower-left, but you can switch to upper-left with layout(origin_upper_left). Is there an equivalent in HLSL that switches to lower-left?

 

best regards,

Yours3!f

Which part of the vertex-transform / rasterization pipeline do you want to change?

Both OpenGL and D3D use x,y=-1,-1 for the lower left, and x,y=1,1 for the top right.

You can construct your projection matrices differently (concatenate with a "scale y by -1" matrix) if you want to flip that upside down.

After projection into NDC (the -1 to 1 range), the viewport is used to convert into pixel coordinates. D3D's viewport coords treat the upper left as the (0,0) origin. You can flip these with something like:
vp.y = renderTarget.height - vp.y - vp.height;

With texture coords, GL uses lower left origin and D3D uses upper left. Easy to flip with uv.y=1-uv.y;

D3D typically doesn't have options to adopt GL's conventions at the API level, while GL does have a few nice 'compatibility' extensions.



 

well, I'm rendering a fullscreen quad, and the vertices are defined in NDC space:

      vec3 ll( -1, -1, 0 );
      vec3 lr(  1, -1, 0 );
      vec3 ul( -1,  1, 0 );
      vec3 ur(  1,  1, 0 );

This way I don't need to multiply them w/ a matrix in the vertex shader. I wanted to display a texture, but it appeared upside down.
So I figured I have two options: flip the window coords, or flip the texture coords, as you mentioned. I figured the window coords would be less painful :)
where do I set this? vp.y = renderTarget.height - vp.y - vp.height;
Edited by Yours3!f


where do I set this? vp.y = renderTarget.height - vp.y - vp.height;

In your D3D11_VIEWPORT, but... that's just to convert the placement of your viewport rect from GL coords to D3D (or vice versa), it won't actually flip it vertically.

well, I'm rendering a fullscreen quad, and the vertices are defined in ndc space, but it appeared upside down.

That's because NDC is the same in GL/D3D, but tex-coords are flipped. So the right thing(tm) to do is to flip your texcoords.

Or as a quick fix, you can just flip your VS's position.y output variable, but that won't fix the same bug in other cases (e.g. when artists put a texture on a model, it will be vertically flipped between the two APIs...)
 
As an alternative real fix, you can flip all of your textures on disk or when loading them (D3D expects the top row of pixels to come first in the buffer, GL expects the bottom row to come first), and also flip all your projection matrices upside down in one of the APIs, so that render-targets get flipped as well as texture files (D3D's NDC-to-pixel coordinate mapping is flipped vertically from GL's). In shaders like yours that don't use a projection matrix, you'd just multiply the VS's position.y output by -1.
This will have the same effect -- the texture data itself (and render-target data) will now be upside down, so there's no need to flip the texcoords any more.

 

Personally, I choose to use D3D's coordinate systems as the standard, and do all this flipping nonsense in GL only -- but vice versa works too.

 

[Edit]

BTW, layout(origin_upper_left) only modifies the pixel coordinate value that is seen by the fragment shader, it doesn't actually change the window coordinate system or the rasterization rules.

The ARB_clip_control extension allows you to actually use D3D window coordinates in GL (including finally fixing GL's busted NDC depth range)... however, it only exists in GL4.

Edited by Hodgman


 

awesome, thank you for the detailed explanation! :)
I think I'll go w/ flipping the tex coords on the CPU, it seems to me that this is the least painful :)
