Side Winder

Getting ModelView Matrix and Projection Matrix



I've looked at some of the documentation, and it seems I should just have to call
glGetFloatv(GL_MODELVIEW_MATRIX, matrix);
to get the model-view matrix. But I'm fairly sure this isn't working correctly. Firstly, the matrix that comes back has very small or very large values. Secondly, when I call glGetError I get an error. So what could be going wrong? I've tried glMatrixMode and that doesn't seem to do anything either. I get exactly the same errors when I try to get the projection matrix.

edit: When I said I thought the floats were being initialised incorrectly, I was wrong. They're not being initialised at all, which is why they have such bizarre values. So why isn't glGetFloatv writing the matrix into the array I've created?

Call wglGetCurrentContext to make sure things are sane; if it returns NULL, there is no current rendering context on the calling thread, and every GL call will fail.

http://msdn.microsoft.com/en-us/library/dd374383%28VS.85%29.aspx

Did you try glGetDoublev(GL_MODELVIEW_MATRIX, matrix)? Does it make any difference?

Quote:
Original post by Side Winder
...
Secondly, when I call GetError I get an error... So what could be going wrong?
Which error is that?

The error is GL_INVALID_ENUM on glGetFloatv(). And then if I try glMatrixMode() I get GL_INVALID_OPERATION... Thing is, looking at the SDK docs at opengl.org, the invalid-operation error for glMatrixMode() is for when it's called between glBegin and glEnd. Except I'm using OpenGL 3.2, so I have no begin or end.

And using glGetDoublev() didn't work either.

Quote:
Original post by Side Winder
Except I'm using OpenGL 3.2... So I have no begin or end.
If you are using GL 3.2 core profile, then you don't have a modelview matrix either. Matrices in 3.2 must be computed by your application and passed to the shader as a matrix uniform.
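For illustration, a minimal GL 3.2 core vertex shader that receives the two matrices as uniforms might look like this (the uniform and attribute names here are placeholders, not taken from the original post):

```glsl
#version 150
// The fixed-function matrix stacks are gone in core profile;
// the application computes these matrices and uploads them itself.
uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
in vec3 position;

void main() {
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
```

On the application side, the upload is done with glGetUniformLocation to look up each uniform, then glUniformMatrix4fv while the program is bound.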

Right... OK... How do I do that? :/ As in, how do I calculate the ModelViewMatrix to pass through? Cause that's actually what I'm trying to do in the first place (pass the MVM to the shader), but I was under the impression that I needed to calculate the MVM to pass it through.

Quote:
Original post by Side Winder
Right... OK... How do I do that? :/ As in, how do I calculate the ModelViewMatrix to pass through? Cause that's actually what I'm trying to do in the first place (pass the MVM to the shader), but I was under the impression that I needed to calculate the MVM to pass it through.
Presumably you have some sort of camera, and some sort of object(s) that you want to render.

The view matrix is the inverse of the camera transform, and the model matrix is the transform of the object you are currently rendering. The modelview matrix is the concatenation (i.e. multiplication) of the two.

Generally, I recommend grabbing an existing 3D math library (such as CML) to handle the legwork for you.
