

Does it matter what OpenGL version I learn to use?


4 replies to this topic

#1 SalsaRoja   Members   -  Reputation: 100


Posted 02 May 2012 - 03:47 PM

Hi,

Question is in the title.

I've been programming in C/C++/Java for my classes in computer science for a few years now but haven't done anything substantial.

I've been skimming over OpenGL material and found it hard to get started; I realize there are MANY outdated tutorials/books out there that may not apply to the latest version of OpenGL.

My question is -- should I seek out guides/tutorials/documents that talk only about the latest OpenGL or should I simply learn any version that I feel comfortable learning with?


#2 japro   Members   -  Reputation: 887


Posted 02 May 2012 - 03:57 PM

I would steer clear of any fixed-function and immediate-mode stuff. I think 3.3 is a reasonable baseline, and if you want to support older systems you can target 2.1 but write it in a 3.3+ style (custom attributes, shaders, no matrix stack...).
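
As a rough illustration of that "3.3+ style" (my sketch, not japro's code): a GLSL 3.30 shader pair with a custom vertex attribute and the matrix passed as an ordinary uniform instead of gl_Vertex and the matrix stack. The names (a_position, u_mvp, make_program) and the use of GLEW as the loader are assumptions.

#include <GL/glew.h>   /* assuming a 3.3 core context and GLEW as the loader */

static const char *vs_src =
    "#version 330 core\n"
    "layout(location = 0) in vec3 a_position;\n"   /* custom attribute, no gl_Vertex */
    "uniform mat4 u_mvp;\n"                        /* matrix is a plain uniform, no matrix stack */
    "void main() { gl_Position = u_mvp * vec4(a_position, 1.0); }\n";

static const char *fs_src =
    "#version 330 core\n"
    "out vec4 o_color;\n"
    "void main() { o_color = vec4(1.0, 0.5, 0.2, 1.0); }\n";

static GLuint compile(GLenum type, const char *src)
{
    GLuint s = glCreateShader(type);
    glShaderSource(s, 1, &src, NULL);
    glCompileShader(s);              /* real code should check GL_COMPILE_STATUS */
    return s;
}

static GLuint make_program(void)
{
    GLuint prog = glCreateProgram();
    glAttachShader(prog, compile(GL_VERTEX_SHADER, vs_src));
    glAttachShader(prog, compile(GL_FRAGMENT_SHADER, fs_src));
    glLinkProgram(prog);             /* real code should check GL_LINK_STATUS */
    return prog;
}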

#3 dpadam450   Members   -  Reputation: 918


Posted 02 May 2012 - 04:13 PM

Doesn't matter, it depends on what guides you find. I think jumping right into shaders might be hard, and the new OpenGL requires them. Everything translates, and of course mobile phones still use older OpenGL at the moment, so the answer is: it doesn't matter much.

#4 V-man   Members   -  Reputation: 805


Posted 02 May 2012 - 08:31 PM

Quoting dpadam450:
"Doesn't matter, it depends on what guides you find. I think jumping right into shaders might be hard, and the new OpenGL requires them. Everything translates, and of course mobile phones still use older OpenGL at the moment, so the answer is: it doesn't matter much."


On mobiles, you use OpenGL ES, not OpenGL. They are not the same, although they are similar.
On the desktop, you might have OpenGL ES available as well, but I think the OP is talking about OpenGL, in which case version 3.3 is a good aim.

If the goal is to just learn the API, it doesn't matter much.
Sig: http://glhlib.sourceforge.net
an open source GLU replacement library. Much more modern than GLU.
float matrix[16], inverse_matrix[16];
glhLoadIdentityf2(matrix);                      /* start from the identity matrix */
glhTranslatef2(matrix, 0.0, 0.0, 5.0);          /* translate 5 units along Z */
glhRotateAboutXf2(matrix, angleInRadians);      /* rotate about the X axis */
glhScalef2(matrix, 1.0, 1.0, -1.0);             /* flip the Z axis */
glhQuickInvertMatrixf2(matrix, inverse_matrix); /* compute the inverse */
glUniformMatrix4fv(uniformLocation1, 1, GL_FALSE, matrix);
glUniformMatrix4fv(uniformLocation2, 1, GL_FALSE, inverse_matrix);
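
(A hypothetical complement to the snippet above: the two uniform locations would come from a linked shader program that declares matching mat4 uniforms, e.g. "uniform mat4 u_matrix;" and "uniform mat4 u_inverse_matrix;" in GLSL. The names here are placeholders for illustration, not part of glhlib.)

/* "program" is a linked shader program; the uniform names are assumed */
GLint uniformLocation1 = glGetUniformLocation(program, "u_matrix");
GLint uniformLocation2 = glGetUniformLocation(program, "u_inverse_matrix");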

#5 samoth   Crossbones+   -  Reputation: 4717


Posted 03 May 2012 - 07:57 AM

If you just want a spinning triangle, or to draw some textured sprites as in a tile-based 2D game, it doesn't matter. Otherwise, it makes a huge difference. I would recommend learning OpenGL 3.3 (3.2 if you target Mac) and optionally having a look at 4.x (reasons follow).

Versions 3.x and 4.x follow a much more modern model (server-side data buffers) and are usually significantly more efficient. If you start with OpenGL 1.x/2.x, you will most probably have to "unlearn" everything that is now obsolete sooner or later and learn the proper techniques.
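
To illustrate the "server-side data buffers" model, here is a minimal sketch of my own (not from samoth's post): vertex data is uploaded once into a buffer object and described through a vertex array object, instead of being resubmitted every frame with glBegin/glEnd. It assumes a 3.3 core context; the names are placeholders.

#include <GL/glew.h>   /* assuming GLEW or a similar loader */

/* upload the vertex data once into GPU-side ("server-side") memory */
static GLuint upload_triangle(void)
{
    static const float triangle[] = {
        -0.5f, -0.5f, 0.0f,
         0.5f, -0.5f, 0.0f,
         0.0f,  0.5f, 0.0f,
    };
    GLuint vao, vbo;

    glGenVertexArrays(1, &vao);
    glBindVertexArray(vao);

    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(triangle), triangle, GL_STATIC_DRAW);

    glEnableVertexAttribArray(0);                          /* generic attribute 0: position */
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, (void *)0);
    return vao;
}

/* per frame: just bind and draw, no per-vertex calls */
static void draw_triangle(GLuint program, GLuint vao)
{
    glUseProgram(program);        /* program: a linked shader program */
    glBindVertexArray(vao);
    glDrawArrays(GL_TRIANGLES, 0, 3);
}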

Also, assuming OpenGL 3.x as a minimum requirement has advantages. OpenGL 2.x turns out to be a nightmare when you want to do something "real". The minimum requirements defined by the spec are a joke. Though most hardware supports much more than the minimum, you have no guarantee, and you must query every single bit, which isn't all that nice and easy. Some hardware just doesn't support more than the bare minimum -- what now? A lot of the "useful modern" functionality is only available as extensions, which further complicates your program, since you have to query for extension presence and write alternative codepaths.
And then, some vendors will just lie to you, directly or indirectly (that is, by following, word for word, a specification that has been explicitly worded in a deceptive manner). Using extensions properly (and you will need to use many...) can be a real challenge. Writing code (other than some colored triangles) that runs on everything from, say, 1.5 through 2.1 inclusive is... daunting.
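
To make that concrete, a rough sketch (mine, not samoth's) of the kind of per-limit and per-extension checking a GL 2.x codepath ends up doing; the particular limits and the chosen extension are just examples.

#include <string.h>
#include <GL/glew.h>   /* assuming GLEW or a similar loader */

/* nothing beyond the spec minimum is guaranteed on GL 2.x, so every limit
 * and every extension has to be checked at runtime before relying on it */
static int check_gl2_capabilities(void)
{
    GLint max_tex_size = 0, max_draw_buffers = 0, max_vertex_tex_units = 0;
    glGetIntegerv(GL_MAX_TEXTURE_SIZE, &max_tex_size);
    glGetIntegerv(GL_MAX_DRAW_BUFFERS, &max_draw_buffers);
    glGetIntegerv(GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS, &max_vertex_tex_units); /* may legally be 0 */

    /* GL 2.x-style extension check: one big space-separated string */
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    int has_fbo = ext && strstr(ext, "GL_EXT_framebuffer_object") != NULL;

    /* decide whether the "real" codepath is usable or a fallback is needed */
    return has_fbo && max_tex_size >= 2048 && max_draw_buffers >= 4 && max_vertex_tex_units >= 1;
}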

OpenGL 3.0, on the other hand, comes with meaningful minimum requirements. There is no such thing as an OpenGL 3.0 graphics card that supports vertex texture fetch with "maximum 0 fetches", multiple render targets with "max number of targets = 1", or textures no larger than 256x256.

Almost all the things that are "just normal", such as reasonably sized textures, framebuffer objects, and at least 4 render targets, are mandatory. In other words, as soon as you have version 3.0, you know that you can do most things without having to worry. It will work.
You need very few, if any, extensions on top of that. Plus, the ones you will want to use are mostly of the "nice to have, but OK if missing" kind, whereas in 2.x it was more of an "oh shit, what now?" situation.
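
As an example of what "it will work" means in practice, here is a minimal sketch (my own) of a framebuffer object with a color attachment using only core 3.0 functions, no extension check required; the helper name is a placeholder.

/* create an RGBA8 color render target; all of this is core in GL 3.0 */
static GLuint make_color_fbo(int w, int h)
{
    GLuint tex, fbo;

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, tex, 0);

    /* on a 3.0+ context a plain RGBA8 attachment is expected to be complete */
    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
        return 0;
    return fbo;
}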

OpenGL 3.1 to 3.3 inclusive add features which may or may not be interesting to you (to me they are), but they run on the same class of hardware. There is, however, a considerable difference in how shader parameters are declared between 3.1 and 3.2, which again makes writing a shader that "just runs" on any version an impossible endeavour. Thus, I've decided for myself to just stick with 3.3, which works and has none of these problems. Writing code that works on 3.3 and behaves the same on 4.x is a breeze.

Now, of course, version 4.0 adds more features, and 4 is a bigger number than 3, and bigger is always better.
However, do note that Mac currently (to my knowledge) supports nothing higher than version 3.2, and in general you will need a different class of hardware to run OpenGL 4.x. Not all your customers will have the most recent class of graphics hardware in their system (though you can pretty much assume that OpenGL 3.x is omnipresent with anyone whom you would want as a customer nowadays -- someone who wouldn't pay $25 for a graphics card 3 years ago won't pay for your game either!).
Everything in 4.x that doesn't strictly require hardware support is also present as a 1:1 identical "backtension" in 3.x on the major IHVs ("backtensions", as I call them, are really ARB extensions that are 100% identical to core functionality; if only the ARB guys had been smart enough to name them differently as well -- BRA would have been nice! -- so you would have a chance of knowing that their functions and constants don't follow the ARB's own naming scheme...).
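
To make the "backtension" idea concrete, a hypothetical sketch of mine: GL_ARB_texture_storage, for example, exposes the GL 4.2 glTexStorage2D entry point on 3.x drivers with no ARB suffix on the function name, which is exactly the naming oddity described above. The helper names and the fallback choice are placeholders.

#include <string.h>
#include <GL/glew.h>   /* assuming GLEW or a similar loader */

/* core-profile way to enumerate extensions (GL 3.0+) */
static int has_extension(const char *name)
{
    GLint i, n = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &n);
    for (i = 0; i < n; ++i)
        if (strcmp((const char *)glGetStringi(GL_EXTENSIONS, (GLuint)i), name) == 0)
            return 1;
    return 0;
}

static void create_texture_storage(GLuint tex)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    if (has_extension("GL_ARB_texture_storage"))
        /* same entry point name as the 4.2 core function, no ARB suffix */
        glTexStorage2D(GL_TEXTURE_2D, 1, GL_RGBA8, 256, 256);
    else
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 256, 256, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, NULL);
}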

Edited by samoth, 03 May 2012 - 07:59 AM.




