problems using identity matrix

Hello! I have only the following in my main:

glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
int matrix[16];
glGetIntegerv(GL_MODELVIEW_MATRIX, matrix);
cout << "Matrix " << matrix[0] << "/" << matrix[1] << "/" << matrix[2] << "/" << matrix[3] << endl;

I expected something like 1/0/0/0, but I got 8932 / 666 / -1073745864 / 134516137! Does anybody know what's wrong?

You must have a valid OpenGL rendering context before you can make OpenGL calls.

Enigma
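
For what it's worth, here is a minimal sketch of a GLUT setup that does give you a context (the window title and the display callback are just placeholders). Under GLUT it is glutCreateWindow that creates the window and its OpenGL context, so the matrix query only returns sensible values after that call:

#include <GL/glut.h>
#include <iostream>

// placeholder display callback; GLUT requires one before glutMainLoop
void display()
{
    glClear(GL_COLOR_BUFFER_BIT);
    glutSwapBuffers();
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutCreateWindow("context test");   // this call creates the GL context

    // only now is it safe to make OpenGL calls
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    GLint matrix[16];
    glGetIntegerv(GL_MODELVIEW_MATRIX, matrix);
    std::cout << "Matrix " << matrix[0] << "/" << matrix[1] << "/"
              << matrix[2] << "/" << matrix[3] << std::endl;   // prints 1/0/0/0

    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}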

Thank you for your reply! I added
glutInit(&argc, argv);
glutInitDisplayMode (GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);

but it doesn't change anything. What else do I need to get a valid OpenGL rendering context?

I'm not sure how GLUT handles this, but I do know that if you can draw vertices, you have a context. Can you draw any triangles? If you can, then you have a rendering context.
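
As a rough sketch of that test (the callback name is just an example, registered with glutDisplayFunc), a display function like this should show a triangle if the context is valid:

void display()
{
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_TRIANGLES);           // immediate mode is fine for a quick context test
        glVertex2f(-0.5f, -0.5f);
        glVertex2f( 0.5f, -0.5f);
        glVertex2f( 0.0f,  0.5f);
    glEnd();
    glutSwapBuffers();               // assumes a GLUT_DOUBLE display mode
}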
