What makes OpenGL right-handed?


I often hear OpenGL referred to as using a right handed coordinate system, but I'm a little unclear as to the precise reason why this is so. Sure, the gluXXX helper functions operate in a right handed system, but what specifically about OpenGL makes it right handed?

Negative Z points away from the camera origin (into the screen), Positive Y points up, and Positive X points to the right.

It's because of the projection matrix. I believe it's the third row, third column, but I'm not sure since I haven't dealt with that matrix in years.

Quote:
Original post by gtdelarosa2
Negative Z points away from the camera origin (into the screen), Positive Y points up, and Positive X points to the right.


Yes, that's a right handed system, but what about OpenGL requires this?

Quote:
Original post by GaryNas
Quote:
Original post by gtdelarosa2
Negative Z points away from the camera origin (into the screen), Positive Y points up, and Positive X points to the right.


Yes, that's a right handed system, but what about OpenGL requires this?


Why do some countries drive on the right side of the road and some on the left? It's a convention. They had to pick something.

The pipeline requires the data to be in that form for processing, i.e. internally it assumes everything is in a right-handed coordinate system. Not directly related, but APIs have to make certain assumptions; for example, that is why, by default, front faces in GL are the ones whose vertices are specified counter-clockwise. That's just the way it is.
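
For what it's worth, that winding default is just a piece of GL state - a rough sketch, assuming an existing context, in case your data happens to be wound the other way:

glFrontFace(GL_CW);      /* treat clockwise-wound triangles as front faces; the default is GL_CCW */
glEnable(GL_CULL_FACE);  /* with culling enabled, back (non-front) faces are discarded as usual */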

Quote:
Original post by Yann L
Why do some countries drive on the right side of the road and some on the left? It's a convention. They had to pick something.


Do you mean they had to pick something for the glu library, or for OpenGL?

Let me give an example. Assume my models and world are left-handed. I use this left-handed world in OpenGL and do not convert to a right-handed system. Which part of OpenGL will have trouble with this left-handed system? I believe the problem would be in clip space, but I'm not sure.

Quote:
Original post by cgrant
The pipeline requires the data to be in that form for processing, i.e. internally it assumes everything is in a right-handed coordinate system. Not directly related, but APIs have to make certain assumptions; for example, that is why, by default, front faces in GL are the ones whose vertices are specified counter-clockwise. That's just the way it is.


Ah, this is exactly what I'm getting at. Do you know which part of the pipeline requires a right handed system? I've searched all over for this info, but all I can find is "OpenGL is right handed", "DirectX is left handed".

Quote:
Original post by GaryNas
Do you mean they had to pick something for the glu library, or for OpenGL?

glu is the OpenGL utility library, so naturally both use the same conventions.

Quote:
Original post by GaryNas
Let me give an example. Assume my models and world are left-handed. I use this left-handed world in OpenGL and do not convert to a right-handed system. Which part of OpenGL will have trouble with this left-handed system?

None at all. It just won't be displayed the way you intended it to be. Up, down, left, right, in, out - these are all human concepts. A mathematical processing framework such as OpenGL can't use them; you have to map these concepts of human perception onto absolute mathematical terms. It's the same way OpenGL doesn't know what the colour red is - it only knows (1,0,0).

By supplying your own projection matrix to OpenGL, you can make it left or right handed. Or, assuming some vertex shader magic, you could even specify your points in a totally different coordinate system. Spherical polar, for example. It's all a question of conventions.
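
To make that concrete, here's a minimal fixed-function sketch (a GL context and GLU assumed; the numbers are arbitrary) that turns the usual right-handed setup into a left-handed one by negating Z in the projection:

glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluPerspective(60.0, 4.0 / 3.0, 0.1, 100.0);  /* the usual right-handed frustum: camera looks down -Z */
glScalef(1.0f, 1.0f, -1.0f);                  /* appended to the projection, this negates eye-space Z,
                                                 so +Z now points into the screen (left-handed) */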

So to answer your question: there is no part of OpenGL that requires a certain handedness, assuming we're talking about modern, non-FFP OpenGL.

Quote:
Original post by GaryNas
I've searched all over for this info, but all I can find is "OpenGL is right handed", "DirectX is left handed".
That's all you can find, because that's the entirety of it. There's nothing more to say about it.

The difference I see is actually in the projection computation (as was already mentioned above). If you use the view-local x and y directions to span the view plane, and let stuff at lesser depth cover stuff at greater depth, then the choice of whether depth increases with increasing local z, or with decreasing local z, makes the difference. The former would be a left-handed system, the latter a right-handed one.
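
To illustrate where that choice lives in the matrix itself (a sketch, using the glFrustum-style matrix from the GL spec with column vectors), the handedness shows up as the sign of the z column, and the telltale entry is the one that produces w:

right-handed: bottom row [ 0  0  -1  0 ]  ->  w_clip = -z_eye  (visible points have negative z; the camera looks down -Z)
left-handed:  bottom row [ 0  0  +1  0 ]  ->  w_clip = +z_eye  (visible points have positive z; the camera looks down +Z)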

Yup, it is the projection matrix. I have done some D3D and GL code where both had to be right-handed. I just manipulated the projection for D3D (actually, they have D3DXMatrixPerspectiveRH), and also for modelview rotation there are RH versions. Translation is universal. Scale is universal.

The other thing left to do is to set up culling for D3D.
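
For reference, the D3D side of that might look roughly like this (D3D9/D3DX; the device pointer and the numbers are placeholders, and the cull-mode change assumes the meshes are wound counter-clockwise, i.e. for GL's default convention):

D3DXMATRIX proj;
D3DXMatrixPerspectiveFovRH(&proj, D3DXToRadian(60.0f), 4.0f / 3.0f, 0.1f, 100.0f);
device->SetTransform(D3DTS_PROJECTION, &proj);   // device is an existing IDirect3DDevice9*

// Counter-clockwise front faces are the opposite of D3D's default convention,
// so flip the cull mode (or re-wind the triangle data).
device->SetRenderState(D3DRS_CULLMODE, D3DCULL_CW);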

Quote:
Original post by Daaark
Quote:
Original post by GaryNas
I've searched all over for this info, but all I can find is "OpenGL is right handed", "DirectX is left handed".
That's all you can find, because that's the entirety of it. There's nothing more to say about it.
I'm not sure it's quite that simple.

I said this same thing in another recent thread, but as far as I can tell, Direct3D/DirectX is no more left-handed than it is right-handed. It works fine with both left- and right-handed systems (as far as I can tell, at least), and the DX math library includes transform functions for each handedness.

Why Direct3D is thought of as being left-handed, I'm not sure, but I suspect it may be historical. Maybe someone else can shed some light on this.

My guess is that OpenGL can be used with a left-handed system just as easily as Direct3D can be used with a right-handed system (as was suggested previously). I haven't actually tried this myself though, so I can't say for sure.

If that's true though, then I would think that the only thing that makes OpenGL 'right-handed' is the few convenience functions that build transforms for which handedness matters (gluLookAt, gluPerspective, etc.). If you take these out of the picture (e.g. by using glLoad/MultMatrix, or by using the programmable pipeline), then I'm not sure that OpenGL can be said to have an inherent handedness.

So to get back to the original question, my guess is that the only thing that makes OpenGL 'right-handed' is a few convenience functions that you can easily do without, and that in fact are no longer even included as part of the API. (I could be overlooking something though.)
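
For what it's worth, a quick sketch of that last point (the matrix values here are arbitrary): hand the fixed-function pipeline a left-handed projection directly and never touch glu at all:

/* A hand-built left-handed orthographic projection: x and y visible in [-1,1],
   z visible in [0,10] with +Z going into the screen. glLoadMatrixf expects the
   16 floats in column-major order. */
static const GLfloat lhOrtho[16] = {
    1.0f, 0.0f, 0.0f, 0.0f,   /* column 0 */
    0.0f, 1.0f, 0.0f, 0.0f,   /* column 1 */
    0.0f, 0.0f, 0.2f, 0.0f,   /* column 2: z scale */
    0.0f, 0.0f,-1.0f, 1.0f    /* column 3: z_ndc = 0.2 * z_eye - 1 */
};
glMatrixMode(GL_PROJECTION);
glLoadMatrixf(lhOrtho);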

Maybe I'm about to say something totally dumb, but I'm a bit tired.
I think what makes OpenGL right-handed is that if you don't apply any transformations (identity modelview and projection), a model made and viewed in a left-handed editor would appear mirrored in OpenGL under those conditions.

Someone please clarify this.

Quote:
Original post by szecs
Maybe I'm about to say something totally dumb, but I'm a bit tired.
I think what makes OpenGL right-handed is that if you don't apply any transformations (identity modelview and projection), a model made and viewed in a left-handed editor would appear mirrored in OpenGL under those conditions.

Someone please clarify this.
I don't think that's right; if the projection matrix is identity, I don't think the visual output will be anything meaningful (in the general case, at least).

Quote:
Original post by jyk
if the projection matrix is identity, I don't think the visual output will be anything meaningful (in the general case, at least).
Sure it will. An identity projection matrix is just an unscaled orthographic projection.

Quote:
Original post by jyk
I said this same thing in another recent thread, but as far as I can tell, Direct3D/DirectX is no more left-handed than it is right-handed.


You may be thinking about row-major vs. column-major notation, but anyway, in order to use the opposite handedness, multiply your matrices by this (or use it as your identity matrix):

 
[ 1 0 0 0 ]
[ 0 1 0 0 ]
[ 0 0 -1 0 ]
[ 0 0 0 1 ]
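
In fixed-function terms, "use this as your identity matrix" might look roughly like this (just a sketch; the matrix is its own transpose, so row- vs. column-major doesn't matter here):

static const GLfloat zFlip[16] = {
    1.0f, 0.0f, 0.0f, 0.0f,
    0.0f, 1.0f, 0.0f, 0.0f,
    0.0f, 0.0f,-1.0f, 0.0f,
    0.0f, 0.0f, 0.0f, 1.0f
};
glMatrixMode(GL_MODELVIEW);
glLoadMatrixf(zFlip);   /* instead of glLoadIdentity(): the left-handed viewing and
                           model transforms then go on top of this */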

Quote:
Original post by jyk
Quote:
Original post by szecs
Maybe I'm about to say something totally dumb, but I'm a bit tired.
I think what makes OpenGL right-handed is that if you don't apply any transformations (identity modelview and projection), a model made and viewed in a left-handed editor would appear mirrored in OpenGL under those conditions.

Someone please clarify this.
I don't think that's right; if the projection matrix is identity, I don't think the visual output will be anything meaningful (in the general case, at least).
Most newbies don't even know about the projection matrix.
So the question still stands: Am I right in my previous post?

Quote:
Original post by szecs
So the question still stands: Am I right in my previous post?


You are. You won't see much, because a pixel would take up the whole screen (see swiftcoder's post), but that doesn't mean the model isn't there, in a right-handed coordinate system [smile].

Does that mean I have solved the problem?
Do I get promoted or something?

(BTW, if the model is smaller than 1.0 unit, then it will be shown nicely; the viewport (and thus pixels) has nothing to do with the matrices.)

Well, actually I think you can also switch handedness by providing a larger value for the left parameter than for the right (or a larger bottom than top) in glFrustum or glOrtho - still, it would default to right-handed.
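
Something like this, for example (just a sketch; passing left > right mirrors the X axis, which is enough to flip the handedness):

glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(1.0, -1.0, -1.0, 1.0, -1.0, 1.0);   /* note: left = 1.0, right = -1.0 */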

Quote:
Original post by szecs
(BTW, if the model is smaller than 1.0 unit, then it will be shown nicely; the viewport (and thus pixels) has nothing to do with the matrices.)


Yeah, you're right.

Quote:
Original post by Kwizatz
Quote:
Original post by jyk
I said this same thing in another recent thread, but as far as I can tell, Direct3D/DirectX is no more left-handed than it is right-handed.


You may be thinking about row-major vs. column-major notation
Nope, I'm thinking about handedness.

Also, there's no such thing as 'row-major notation', at least as far as I'm aware. Are you talking about row- vs. column-vector notation?
