Getting ModelView Matrix and Projection Matrix

Started by
8 comments, last by Side Winder 14 years, 1 month ago
I've looked at some of the documentation and I should just have to call
glGetFloatv(GL_MODELVIEW_MATRIX, matrix);
to get the modelview matrix. But I'm fairly sure this isn't working correctly. Firstly, the matrix that is returned has very small or very large values, which leads me to believe the floats are being initialised incorrectly. Secondly, when I call glGetError I get an error... So what could be going wrong? I've tried glMatrixMode and that doesn't seem to do anything, and I get exactly the same errors when I try to get the projection matrix. edit: when I said I thought the floats were being initialised incorrectly, I was wrong. They're not being initialised at all, which is why they have such bizarre values. So then, why isn't glGetFloatv writing the matrix into the array I've created...?
Call wglGetCurrentContext to make sure things are sane.

http://msdn.microsoft.com/en-us/library/dd374383%28VS.85%29.aspx
Sig: http://glhlib.sourceforge.net
an open source GLU replacement library. Much more modern than GLU.
float matrix[16], inverse_matrix[16];
glhLoadIdentityf2(matrix);
glhTranslatef2(matrix, 0.0f, 0.0f, 5.0f);
glhRotateAboutXf2(matrix, angleInRadians);
glhScalef2(matrix, 1.0f, 1.0f, -1.0f);
glhQuickInvertMatrixf2(matrix, inverse_matrix);
glUniformMatrix4fv(uniformLocation1, 1, GL_FALSE, matrix);
glUniformMatrix4fv(uniformLocation2, 1, GL_FALSE, inverse_matrix);
Makes no difference... I even called wglMakeCurrent() and that doesn't change anything either.
Did you try glGetDoublev(GL_MODELVIEW_MATRIX, matrix)? Does it make any difference?
Quote:Original post by Side Winder
...
Secondly, when I call GetError I get an error... So what could be going wrong?
Which error is that?
The error is GL_INVALID_ENUM on glGetFloatv(). And then if I try glMatrixMode() I get GL_INVALID_OPERATION... Thing is, looking at the SDK doc at opengl.org, invalid operation for glMatrixMode() is when it's called between glBegin and glEnd... Except I'm using OpenGL 3.2... So I have no begin or end.

And using glGetDoublev() didn't work either.
Quote:Original post by Side Winder
Except I'm using OpenGL 3.2... So I have no begin or end.
If you are using GL 3.2, then you don't have a modelview matrix either. Matrices in 3.2 must be passed to the shader as a matrix uniform.

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

Right... OK... How do I do that? :/ As in, how do I calculate the ModelViewMatrix to pass through? Cause that's actually what I'm trying to do in the first place (pass the MVM to the shader), but I was under the impression that I needed to calculate the MVM to pass it through.
Quote:Original post by Side Winder
Right... OK... How do I do that? :/ As in, how do I calculate the ModelViewMatrix to pass through? Cause that's actually what I'm trying to do in the first place (pass the MVM to the shader), but I was under the impression that I needed to calculate the MVM to pass it through.
Presumably you have some sort of camera, and some sort of object(s) that you want to render.

The view matrix is the inverse of the camera transform, and the model matrix is the transform of the object you are currently rendering. The modelview matrix is the concatenation (i.e. multiplication) of the two.

Generally, I recommend grabbing an existing 3D math library (such as CML) to handle the legwork for you.

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

I see, thanks.

This topic is closed to new replies.
