Universal OpenGL Version


IgnatusZul    1160

I'm developing a game with OpenGL. Which OpenGL version would have no problem running on most computers, both in terms of hardware and platform (PC/Mac/Linux)?

Some say 2.x, because it's older and everyone is capable of running older versions with whatever hardware they might have. But wouldn't 3.x be much better in terms of performance, better tools, and cool effects with the programmable pipeline?
 
I'm a bit lost on this subject. I've heard that Mac can't even go beyond 3.2, and what about Linux?
Any feedback would be helpful, thanks :)

Chris_F    3030

There is a tradeoff between features and audience size. Raising the minimum system requirements gives you greater abilities but may decrease your audience. What is more important to you, graphics fidelity or the broadest possible audience? If it's the former, go with OpenGL 4.3; if it's the latter, go with OpenGL 1.1; if it's somewhere in between... nobody can tell you what's best for your game. Are you making a FarmVille or are you making a Crysis? What features do you feel you need to reach your artistic goals? Picking the minimum spec that gives you what you need is probably the best option.

Edited by Chris_F

TheChubu    9454

I've seen quite a few projects start like this: "Which is the best OpenGL version for compatibility?" And they land on OpenGL 2.1.

It's true, pretty much anything you grab will support OpenGL 2.1 (grab a can off the street, it supports 2.1). The thing is, I've seen projects like this going on for years. OpenGL 2.1 was a good idea, compatibility-wise, 3 or 4 years ago, but today? By the time their game hits the street? Not so much.

So I'd pick something from OpenGL 3.0 upwards.

amorita    138

I've worked on a commercial OpenGL game for several years, and most of my work was in the graphics part of the code. Speaking from experience, most of the problems we ran into were due to people not having up-to-date OpenGL drivers installed.

 

Most people (not most hard-core gamers, but most casual and non-gamers) have integrated graphics solutions (integrated Intel or mobile AMD/NVidia in a laptop) and rarely or never update their drivers after they first get their machine. The machine works well enough for them to surf the web, e-mail, and do their work (editing Word/Excel/PowerPoint docs), so they never have an urgent need to update their video drivers. Also, many of them feel that updating drivers is a difficult, too-technical thing to do, and are afraid they will mess up their system.

 

In addition, the OpenGL support of integrated video chipsets is not necessarily the best to begin with, and Intel/AMD/NVidia do not provide updates for their older integrated chipsets, which are still in use by many people. So some of these people were stuck with drivers with known OpenGL bugs.

 

In reality, there are a lot more games that use DirectX than use OpenGL (easily a 10:1 ratio), so Intel/AMD/NVidia have not had much incentive to keep the quality of their OpenGL drivers on par with their DirectX drivers. That said, the quality of OpenGL drivers has greatly improved in the past few years.

 

So, the good news is that the quality of OpenGL drivers is improving.  The bad news is that a lot of people are still using (or stuck with) older, buggy OpenGL drivers.


Also, many of them feel that updating drivers is a difficult thing to do (too technical for them) and are afraid that they will mess up their system.

Honestly, this mainly has to do with one of the most common recommendations for installing drivers: boot into safe mode, uninstall the old driver, then install the new one. You don't have to do that, but it isn't hard to see where the fear comes from. Besides, people think that if it already works as-is it's probably fine, not realizing that old drivers may be leaving features unused (e.g. the drivers bundled with the GeForce 7 expose OpenGL 2.0, but the newest drivers provide OpenGL 2.1).

 

 

In reality, there are a lot more games that use DirectX than use OpenGL (easily 10 to 1 ratio).  So, Intel/AMD/NVidia have not had too much incentive to keep the quality of their OpenGL drivers on par with the quality of their DirectX driver.  But, the quality of the OpenGL drivers in the past few years has greatly improved.

It's a chicken-and-egg situation: if nobody uses OpenGL, there's no incentive to improve its support, which in turn means nobody wants to use it, and... well, it's a feedback loop. I think id is pretty much the only reason it didn't die completely. At least OpenGL 3 seems to have gotten all the vendors back into OpenGL, just because apparently it had enough of a reputation to make lack of support look stupid (maybe the backlash when Vista was implied to lack OpenGL support was a hint, even if that turned out to be false later).

 

 

The bad news is that a lot of people are still using (or stuck with) older, buggy OpenGL drivers.

I wouldn't expect those to care about gaming anyway ^^; (or to have something that supports anything newer than 1.1 for that very reason...)

 

EDIT: Basically, if you care about people with old systems (especially people in e.g. developing countries, where hardware can be quite expensive), OpenGL 2 may be a good compromise. If you expect somewhat decent hardware, OpenGL 3 would be better. I'd say OpenGL 4 is better considered optional for now, unless you really need the most powerful hardware (i.e. support it if you want, but don't assume it'll be very common).

 

If somebody is stuck with OpenGL 1, that's most likely the kind of person you wouldn't want to bother targeting anyway... Either their hardware is weak enough to slow down without much effort, or they're the kind of people who'd rather stick to browser games (if they play games at all).

Edited by Sik_the_hedgehog

mhagain    13430

The bad news is that a lot of people are still using (or stuck with) older, buggy OpenGL drivers.

I wouldn't expect those to care about gaming anyway ^^; (or to have something that supports anything newer than 1.1 for that very reason...)

 

You would be surprised. Check out the Steam or Bethesda forums for Rage - there were an awful lot of so-called "hardcore gamers" who had issues because they never bothered updating their drivers, not to mention an awful lot more who had issues because they were randomly copying individual DLL files all over their systems without any clear idea of what they were doing. (It's also a good example of how poor driver support can mess up a game.)

IgnatusZul    1160

Many thanks for the feedback everyone.

It turns out this is the ugly part of game dev; hopefully pumping up the system requirements and adding some proper error handling will make people aware of what they need.

I'm targeting people with decent computers, something that can render 3D graphics with post-processing at a playable fps. I really, REALLY want to avoid the old pipeline; it just seems dirty. Do newer AAA games even use the old pipeline these days?

 

For example, I'm interested to know which OpenGL versions Valve uses for its games on Mac.

 

And I'll probably just end up going with 3.2; it seems to be the better choice.

wintertime    4108

Theoretically you could program in a relatively modern style, with VBOs and shaders, even on 2.1, if you accept a few quirks and don't need all the new features.

If you can accept that people with weak onboard chips won't get to play, then 3.x should be fine.

mhagain    13430

I'm targeting people with decent computers, something that can render 3D graphics with post processing at a playable fps....

 

In that case go for 3.x - it's all achievable with earlier versions for sure, but you'll have a much nicer time using 3.x.

 

One project I was involved in up to maybe this time last year (where I'd initially thought I was being brought in just to optimize the renderer), one of the leads was absolutely insistent on the "what about older hardware?" line, yet was also pushing very heavily for lots of post-processing, lots of complex geometry, lots of real-time dynamic lighting, etc.  I ended up with an insane mixture of core GL1.4 with a software wrapper around VBOs, ARB assembly programs, glCopyTexSubImage2D, multiple codepaths for everything, and an edifice so fragile that I was terrified of even bugfixing it (the fact that it was built on an originally GL1.1 codebase that had been fairly crankily and inflexibly maintained to that point didn't help).  It was a nightmare - I walked out one day without saying a word and just didn't come back.

 

It's just not worth going down that route - you'll only burn yourself out.  So either dial back the ambitions and use an earlier version, or keep the ambitions and use the most recent sane version.  But don't try to mix the two.

Edited by mhagain

3Ddreamer    3826

Hi,

 

I have what I believe is a relevant question, which I'm actually facing in a job project: making a 2D game with jMonkey that runs through OpenGL on WinXP or higher.

 

OpenGL 2.1, which my jMonkey installation uses, is my heavy favorite for WinXP-or-higher compatibility.  I don't need any advanced OpenGL features.  Am I on the right track?

 

Where can I get information on what version of OpenGL ships with WinXP, Vista, Win7, and Win8?  (Really I am only interested in WinXP, to pin down the minimum OpenGL requirements.)

 

:)

Edited by 3Ddreamer

mhagain    13430

All versions of Windows ship with OpenGL 1.1 (plus a small handful of extensions), but this is a software-emulated OpenGL.  The key thing here is that OpenGL isn't part of the OS, so it doesn't really make sense to talk about "what version of OpenGL ships with Windows".  OpenGL is implemented in your 3D card's driver, so it's shipped by the 3D hardware vendor.


I recommend OpenGL 3.3 (or 3.2 if you want to target Mac as well) for two reasons. The first reason is a technical one, the second is an economic one.

 

OpenGL 2.x quickly becomes a real nightmare unless you only ever do the most puny stuff. You have barely any guarantees about what is supported, and many things must be implemented using extensions. Sometimes there are different ARB and EXT extensions (and vendor extensions), all of which you must consider, because none of them is supported everywhere. Usually they have some kind of "common functionality" that you can figure out, but sometimes they behave considerably differently, so you must write entirely different code paths for each. Some functionality in the spec (and in extensions) is deliberately worded in a misleading way, too. For example, you can have a card that supports multiple render targets, with a maximum of one. You can have a card that supports vertex texture fetch with zero fetches.

Most things are obscure or loosely defined; for example, you have no guarantee that textures larger than 256x256 are supported at all (you must query to be sure, but what do you do in the worst case?).

 

Contrast that with OpenGL 3.3, where almost everything you will likely need (except tessellation and atomic counters, really) is guaranteed as core functionality. You have guaranteed minimum specs that must be supported, and for almost everyone those minimums are good enough that you never have to think about them. A shader language that just works. No guessing.

 

The economic reason why I would not go anywhere below GL3 is that it rules out people you likely do not want as customers anyway. GL3-compatible cards have been around $20 for about 5 years, and integrated chips support GL3 in the meantime as well (of course Intel was never a role model in OpenGL support, but most stuff kind of works most of the time now). If someone cannot or does not want to spend $20 on a graphics card, it's unlikely they will pay you either; they'll probably just pirate your stuff. Why burden yourself running after someone who you know isn't going to pay you?

 

About outdated drivers, my stance is that typing "nvidia driver" into Google and clicking "yes, install please" is not too much of a technical challenge - my mother can do it. If someone is unable (or unwilling) to do this, they are likely also someone you do not want as a customer. Dealing with people who cannot type two words and do two mouse clicks is a customer-service nightmare; they cannot possibly pay you enough money to make up for it.

EddieV223    1839

I think the sweet spot is 3.3, but since Mac only supports 3.2, that leaves Macs out.  3.3 is akin to 4.0 but for DX10-class cards, so it's a modern version that still runs on legacy hardware.

3Ddreamer    3826

Well, Macs are about 1/4 of the target market in my case, according to research.  So it seems that implementing up to 3.2, plus a notification message telling the user to update OpenGL if needed, will be in order.

 

I have no idea yet how to jump from the default 2.1 to 3.2 with jMonkey, but I'm sure the community there has the method, likely done at the tool level (having the development software updated for OpenGL 3.2).

 

Thanks! :)

Hodgman    51324

For example, you can have a card that supports multiple render targets, with a maximum of one. You can have a card that supports vertex texture fetch with zero fetches.

ATI pulled that one on D3D9.0c as well -- to be compliant you have to support MRT and VTF, so ATI returned true for those caps, but then also returned 1 for the max MRT count, and returned false for every texture format when querying whether it was VTF-capable... :(

mhagain    13430

For example, you can have a card that supports multiple render targets, with a maximum of one. You can have a card that supports vertex texture fetch with zero fetches.

ATI pulled that one on D3D9.0c as well -- to be compliant you have to support MRT and VTF, so ATI returned true for those caps, but then also returned 1 for the max MRT count, and returned false for every texture format when querying whether it was VTF-capable... :(

 

GL_ARB_occlusion_query allows the query counter bits to be 0 - and what's worse, this was a deliberate decision by the ARB, made to allow vendors that don't support occlusion queries to claim GL1.5 support; see http://www.opengl.org/archives/about/arb/meeting_notes/notes/meeting_note_2003-06-10.html for more on that one.

GeneralQuery    1263

I remember reading an nVidia employee's response on OpenGL.org to a poster's annoyance that the noise function was always returning 0. The response was (and I'm paraphrasing) "the specs state to return a number in the range [0,1], therefore returning 0 conforms to the spec".

blueshogun96    2265

Sorry if this is somewhat off-topic, but the three posts above mine (especially GeneralQuery's mentioning NVIDIA) remind me of the time I was porting some code from Direct3D 8 to Direct3D 9 from an older version of the NVSDK (5.21).  I was particularly interested in the bump-refraction demo submitted from Japan, which used a proprietary texture format, "NVHS".  I never found anywhere in NVIDIA's documentation that this texture format was only supported on the GeForce 3 and 4 Ti series GPUs, so I was getting upset that I couldn't get the feature to work on my 8400M GS.  I assumed it was just a problem with my drivers, but to be sure I asked some other people to verify whether it worked on their machines.  It turns out that when checking the device caps, the driver claims the texture format is supported on all NVIDIA cards, but creating a texture with that format will always fail unless your GPU is from the NV2x series.

 

I tried to warn NVIDIA of this driver bug, but to no avail.  It's not too relevant now, since no one used that format (Q8W8V8U8 was more compatible anyway) and DirectX 9 is slowly but surely dying either way.

mhagain    13430

It may be veering off-topic, but all of this does serve to highlight one key point that is relevant to the original post: vendor shenanigans are quite widespread, and no matter which API (or which version of an API) you choose, you still have to tread a little carefully.

Edited by mhagain


