Deprecated matrices

Started by
6 comments, last by V-man 11 years, 12 months ago
Hello,

I was reading some posts across the internet, and it seems that the modelview and projection matrices, along with all their modification functions such as glTranslate, glRotate, etc., became deprecated and were removed in the most recent OpenGL versions.

I'm still using OpenGL 3.0, so I don't get compilation errors when using them, but seeing that they are no longer part of current OpenGL versions, I would like to follow that philosophy.

Since this is computer graphics, the matrices will still be needed, even if they are not on the OpenGL matrix stack. From some more reading, it seems the general idea is to implement the functions myself (or use someone else's library) and pass the matrices to my shaders.
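For the "implement the functions myself" route, here is a minimal sketch (the names are my own, not from any library) of the kind of column-major 4x4 helpers that replace glLoadIdentity, glTranslatef, and glMultMatrixf. Column-major order is what glUniformMatrix4fv expects when its transpose argument is GL_FALSE:

```cpp
#include <cassert>

// Column-major 4x4 matrix: element (row, col) lives at m[col * 4 + row],
// the same memory layout the old GL matrix stack used.
struct Mat4 { float m[16]; };

// Like glLoadIdentity.
Mat4 identity() {
    Mat4 r = {};
    r.m[0] = r.m[5] = r.m[10] = r.m[15] = 1.0f;
    return r;
}

// Like glTranslatef: the translation lands in the fourth column.
Mat4 translate(float x, float y, float z) {
    Mat4 r = identity();
    r.m[12] = x; r.m[13] = y; r.m[14] = z;
    return r;
}

// Like glMultMatrixf: returns a * b.
Mat4 mul(const Mat4& a, const Mat4& b) {
    Mat4 r = {};
    for (int col = 0; col < 4; ++col)
        for (int row = 0; row < 4; ++row)
            for (int k = 0; k < 4; ++k)
                r.m[col * 4 + row] += a.m[k * 4 + row] * b.m[col * 4 + k];
    return r;
}
```

The raw float[16] inside Mat4 can then be handed straight to glUniformMatrix4fv, just like the pointers the old stack gave you via glGetFloatv.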

The thing is, I don't plan to use shaders every time I want to render something, so my question here is:
-Should I start using shaders every time I want to render something, even when they would only perform the same simple calculations as the fixed-function pipeline, or in those particular cases should I pass my own calculated matrices to OpenGL?

I'm asking this from a performance point of view.
Not using shaders is itself deprecated in OpenGL >= 3.0. So if you want to get up to date, you have to move to using them.
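To make "move to using them" concrete, the smallest replacement vertex shader looks roughly like the GLSL below, embedded here as a C++ string constant; the names u_mvp and a_position are illustrative, not from the thread:

```cpp
#include <cstring>

// Any GLSL 3.30 vertex shader replacing the fixed-function transform
// boils down to: multiply the incoming position by a CPU-computed matrix.
const char* kVertexShaderSrc =
    "#version 330 core\n"
    "uniform mat4 u_mvp;\n"                       // your CPU-computed matrix
    "layout(location = 0) in vec3 a_position;\n"
    "void main() {\n"
    "    gl_Position = u_mvp * vec4(a_position, 1.0);\n"
    "}\n";
```

After compiling and linking this into a program, you would upload the matrix each frame with glUniformMatrix4fv(glGetUniformLocation(program, "u_mvp"), 1, GL_FALSE, matrix).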

If you want a replacement for the old GL matrix functions, you might want to look at something like GLM (http://glm.g-truc.net). It's a math library focused on interacting nicely with OpenGL, so it might be close to the old interface. I haven't used it myself, though.
I don't believe there is any reason to update to OpenGL 4 if you are not going to use the new functionality it gives you, i.e. tessellation; if you don't plan on using it, you don't need GL 4. Look at what GL 4 gives you and then decide if you need to upgrade. I'm still using 2.0 until I really need to upgrade, plus there are a lot of non-4.0 cards still out there.

NBA2K, Madden, Maneater, Killing Floor, Sims http://www.pawlowskipinball.com/pinballeternal

Then I think I will continue using the matrix stack OpenGL provides, since I don't need the OpenGL 4.0 functionality.

Thanks for the opinions, helped a lot!
Whether you go to 4.0 or 3.0 or 2.0, you can always perform the matrix math on the CPU; it's just that 4.0 forces you to do so.


It's also the case that managing matrices yourself might be something you'll need to do if you start getting into, e.g., skeletal animation. The GL matrix stacks are just too limited for that, so it's not a bad idea to start getting comfortable with not using them.

Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.

Skeletal animation is in my plans, indeed. I already wrote some modules for it, and I concluded that I need to calculate a lot of matrices and pass them to my shader to position my vertices. Seen from that point of view, I think leaving the OpenGL matrix stack will come naturally, but maybe it's better to solve that problem early rather than change a lot of code later :P
People who have a GL 2 card are probably not interested in gaming, so they don't upgrade.
The year is 2012, and if you don't have at least a GL 3.3 card, then you are living in the past.
If you have a GL 3.0 driver, then it is an old driver. All GL 3 cards are really GL 3.3 capable.
The only exception is the Mac, where Apple decided to provide a 3.2 driver.
Sig: http://glhlib.sourceforge.net
an open-source GLU replacement library. Much more modern than GLU.
float matrix[16], inverse_matrix[16];
glhLoadIdentityf2(matrix);
glhTranslatef2(matrix, 0.0, 0.0, 5.0);
glhRotateAboutXf2(matrix, angleInRadians);
glhScalef2(matrix, 1.0, 1.0, -1.0);
glhQuickInvertMatrixf2(matrix, inverse_matrix);
glUniformMatrix4fv(uniformLocation1, 1, GL_FALSE, matrix);
glUniformMatrix4fv(uniformLocation2, 1, GL_FALSE, inverse_matrix);

This topic is closed to new replies.
