Universal OpenGL Version

IgnatusZul
I'm developing a game with OpenGL. Which OpenGL version would have no problem running on most computers, in terms of both hardware and platform (PC/Mac/Linux)?

Some say 2.x, because it's older and everyone can run older versions on whatever hardware they might have. But wouldn't 3.x be much better in terms of performance, tooling, and cool effects with the programmable pipeline?

I'm a bit lost on this subject. I've heard that Mac can't even go beyond 3.2, and what about Linux?
Any feedback would be helpful, thanks. :)

There is a tradeoff between features and audience size. Increasing the minimum system requirements gives you greater capabilities but may decrease your audience. What is more important to you, graphics fidelity or the broadest possible audience? If it's the former, go with OpenGL 4.3; if it's the latter, go with OpenGL 1.1; if it's somewhere in between... nobody can tell you what's best for your game. Are you making a FarmVille or are you making a Crysis? What features do you feel you need to reach your artistic goals? Picking the minimum spec that gives you what you need is probably the best option.

Edited by Chris_F

I've seen quite a few projects start like this: "Which is the best OpenGL version for compatibility?" And they land on OpenGL 2.1.

It's true that pretty much anything you grab will support OpenGL 2.1 (grab a can off the street, it supports 2.1). The thing is, I've seen projects like this drag on for years. OpenGL 2.1 was a good idea, compatibility-wise, 3 or 4 years ago; will it still be when their game finally hits the street? Not so much.

So I'd pick something from OpenGL 3.0 upwards.

I've worked on a commercial OpenGL game for several years, and most of my work was in the graphics part of the code. Speaking from experience, most of the problems we ran into were due to people not having up-to-date OpenGL drivers installed.

 

Most people (not most hard-core gamers, but most casual and non-gamers) have integrated graphics solutions (integrated Intel or mobile AMD/NVIDIA in a laptop) and rarely or never update their drivers after they first get their machine. The drivers work well enough for them to surf the web, e-mail, and do their work (editing Word/Excel/PowerPoint docs), so they never have an urgent need to update them. Also, many of them feel that updating drivers is too technical for them and are afraid they will mess up their system.

 

In addition, the OpenGL support of integrated video chipsets is not necessarily the best to begin with, and Intel/AMD/NVIDIA do not provide updates for their older integrated chipsets, which are still in use by many people. So some of these people were stuck with older drivers with known OpenGL bugs.

 

In reality, far more games use DirectX than OpenGL (easily a 10-to-1 ratio), so Intel/AMD/NVIDIA have not had much incentive to keep the quality of their OpenGL drivers on par with their DirectX drivers. That said, the quality of OpenGL drivers has greatly improved in the past few years.

 

So, the good news is that the quality of OpenGL drivers is improving.  The bad news is that a lot of people are still using (or stuck with) older, buggy OpenGL drivers.

> Also, many of them feel that updating drivers is too technical for them and are afraid they will mess up their system.

Honestly, this has a lot to do with one of the most common recommendations for installing drivers: boot into safe mode, uninstall the old driver, and install the new one. You don't actually have to do that, but it isn't hard to see where the fear comes from. Besides, people think that if it already works as-is it's probably fine, not realizing that old drivers may be leaving features unused (e.g. the drivers bundled with the GeForce 7 expose OpenGL 2.0, but the newest drivers provide OpenGL 2.1).

 

 

> In reality, far more games use DirectX than OpenGL (easily a 10-to-1 ratio), so Intel/AMD/NVIDIA have not had much incentive to keep the quality of their OpenGL drivers on par with their DirectX drivers. That said, the quality of OpenGL drivers has greatly improved in the past few years.

It's a chicken-and-egg situation: if nobody uses OpenGL, there's no incentive to improve its support, which in turn means nobody wants to use it, and... well, it's a feedback loop. I think id Software is pretty much the only reason it didn't die completely. At least OpenGL 3 seems to have gotten all vendors back into OpenGL, apparently because it had enough of a reputation that lack of support would look stupid (maybe the backlash when Vista was rumored to lack OpenGL support was a hint, even though the rumor turned out to be false).

 

 

> The bad news is that a lot of people are still using (or stuck with) older, buggy OpenGL drivers.

I wouldn't expect those to care about gaming anyway ^^; (or to have something that supports anything newer than 1.1 for that very reason...)

 

EDIT: Basically, if you care about people with old systems (especially people in developing countries, where hardware can be quite expensive), OpenGL 2 may be a good compromise. If you expect some decent hardware, OpenGL 3 would be better. OpenGL 4 is better treated as optional for now, unless you really need the most powerful hardware (i.e. support it if you want, but don't assume it'll be very common).

 

If somebody is stuck with OpenGL 1, that's most likely the kind of person you wouldn't want to bother targeting anyway... Either their hardware is so weak it will slow down without much effort, or they're the kind of people who'd rather stick to browser games (if they play games at all).

Edited by Sik_the_hedgehog

> > The bad news is that a lot of people are still using (or stuck with) older, buggy OpenGL drivers.
>
> I wouldn't expect those to care about gaming anyway ^^; (or to have something that supports anything newer than 1.1 for that very reason...)

 

You would be surprised. Check out the Steam or Bethesda forums for Rage: there were an awful lot of so-called "hardcore gamers" who had issues because they never bothered updating their drivers, not to mention plenty more who had issues because they were randomly copying individual DLL files all over their systems without any clear idea of what they were doing. (It's also a good example of how poor driver support can mess up a game.)

Many thanks for the feedback everyone.

Turns out this is the ugly part of game dev. Hopefully pumping up the system requirements, plus some proper error handling, will make people aware of what they need.

I'm targeting people with decent computers, something that can render 3D graphics with post-processing at a playable FPS. I really REALLY want to avoid the old pipeline; it just seems dirty. Do any newer AAA games even use the fixed-function pipeline these days?

 

For example, I'm interested to know which versions of OpenGL Valve uses for its games on Mac.

 

I'll probably just end up going with 3.2; it seems to be the better choice.

Theoretically you could program in a relatively modern style with VBOs and shaders even on 2.1, if you accept a few quirks and don't need all the new features (see the sketch below).
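
For the curious, a minimal sketch of what that looks like, assuming GLEW and GLFW for context creation and extension loading (any equivalent libraries would do). Everything used here is core in GL 2.0/2.1:

```cpp
// Minimal sketch of "modern-style" rendering on a GL 2.1-class context:
// one VBO, one GLSL program, no fixed-function state, no VAOs (those are
// GL 3.0+). Assumes GLEW and GLFW purely for context/extension setup.
#include <GL/glew.h>
#include <GLFW/glfw3.h>

static const char* vsSrc =
    "#version 110\n"
    "attribute vec2 position;\n"
    "void main() { gl_Position = vec4(position, 0.0, 1.0); }\n";

static const char* fsSrc =
    "#version 110\n"
    "void main() { gl_FragColor = vec4(1.0, 0.5, 0.2, 1.0); }\n";

static GLuint compile(GLenum type, const char* src)
{
    GLuint s = glCreateShader(type);   // core since GL 2.0
    glShaderSource(s, 1, &src, 0);
    glCompileShader(s);                // real code should check the info log here
    return s;
}

int main()
{
    glfwInit();
    GLFWwindow* win = glfwCreateWindow(640, 480, "GL 2.1 triangle", 0, 0);
    glfwMakeContextCurrent(win);
    glewInit();

    // One triangle in a vertex buffer object -- core since GL 1.5.
    const float verts[] = { -0.5f, -0.5f,  0.5f, -0.5f,  0.0f, 0.5f };
    GLuint vbo;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);

    GLuint prog = glCreateProgram();
    glAttachShader(prog, compile(GL_VERTEX_SHADER, vsSrc));
    glAttachShader(prog, compile(GL_FRAGMENT_SHADER, fsSrc));
    glLinkProgram(prog);
    GLint pos = glGetAttribLocation(prog, "position");

    while (!glfwWindowShouldClose(win)) {
        glClear(GL_COLOR_BUFFER_BIT);
        glUseProgram(prog);
        glEnableVertexAttribArray(pos);
        glVertexAttribPointer(pos, 2, GL_FLOAT, GL_FALSE, 0, 0);
        glDrawArrays(GL_TRIANGLES, 0, 3);
        glfwSwapBuffers(win);
        glfwPollEvents();
    }
    glfwTerminate();
    return 0;
}
```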

If you can accept that people with weak onboard chips won't get to play, then 3.x should be fine.

> I'm targeting people with decent computers, something that can render 3D graphics with post-processing at a playable FPS...

 

In that case go for 3.x - it's all achievable with earlier versions for sure, but you'll have a much nicer time using 3.x.

 

One project I was involved in until maybe this time last year (where I had initially thought I was being brought in just to optimize the renderer) had a lead who was absolutely insistent on the "what about older hardware?" line, yet was also pushing very hard for lots of post-processing, lots of complex geometry, lots of real-time dynamic lighting, etc. I ended up with an insane mixture of core GL 1.4 with a software wrapper around VBOs, ARB assembly programs, glCopyTexSubImage2D, multiple codepaths for everything, and an edifice so fragile that I was terrified of even bugfixing it (the fact that it was built on an originally GL 1.1 codebase that had been crankily and inflexibly maintained to that point didn't help). It was a nightmare. I walked out one day without saying a word and just didn't come back.
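
For anyone who hasn't met it, glCopyTexSubImage2D is the old pre-FBO way of getting a rendered frame into a texture for post-processing. A rough sketch, with drawScene/sceneTex/width/height as hypothetical placeholders for real application code:

```cpp
// The pre-FBO render-to-texture path: draw into the back buffer, then
// copy the pixels into a texture with glCopyTexSubImage2D. Works on
// GL 1.4-class hardware, but costs a full framebuffer copy per frame.
// drawScene(), sceneTex, width and height are hypothetical placeholders.
#include <GL/gl.h>

void drawScene();  // stand-in for the application's normal render pass

void renderToTextureOldStyle(GLuint sceneTex, int width, int height)
{
    drawScene();                              // render normally to the back buffer
    glBindTexture(GL_TEXTURE_2D, sceneTex);   // texture created earlier with glTexImage2D
    glCopyTexSubImage2D(GL_TEXTURE_2D, 0,     // copy framebuffer -> texture, mip level 0
                        0, 0,                 // destination offset inside the texture
                        0, 0, width, height); // source rectangle in the framebuffer
    // Now sample sceneTex in a fullscreen-quad pass for the post-processing.
}
```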

 

It's just not worth going down that route; you'll only burn yourself out. So either dial back the ambitions and use an earlier version, or keep the ambitions and use the most reasonable recent version. But don't try to mix the two.

Edited by mhagain

Hi,

 

I have what I believe is a relevant question here, which I'm actually handling in a job project: making a 2D game with jMonkey that can run through OpenGL on WinXP or higher.

 

OpenGL 2.1, which my jMonkey installation has, is my heavy favorite for WinXP-or-higher compatibility. I don't need any advanced OpenGL features. Am I on the right track?

 

Where can I get information on what version of OpenGL ships with WinXP, Vista, Win7, and Win8? (Really I'm only interested in WinXP, to establish the minimum OpenGL requirements.)

 

:)

Edited by 3Ddreamer

All versions of Windows ship with OpenGL 1.1 (plus a small handful of extensions), but this is a software-emulated OpenGL. The key thing here is that OpenGL is not software, so it doesn't really make sense to talk about "what version of OpenGL ships with Windows". OpenGL is implemented in your 3D card's driver, so it's shipped by the 3D hardware vendor.
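
A simple way for a game to detect that software fallback at runtime is to check the renderer string once a context is current; a minimal sketch (Microsoft's software implementation reports itself as "GDI Generic"):

```cpp
// Detecting Windows' software fallback at runtime: once a context is
// current, check the renderer string. Microsoft's software OpenGL 1.1
// implementation identifies itself as "GDI Generic".
#include <cstdio>
#include <cstring>
#include <GL/gl.h>

void warnIfSoftwareRenderer()
{
    const char* version  = (const char*)glGetString(GL_VERSION);
    const char* renderer = (const char*)glGetString(GL_RENDERER);
    std::printf("OpenGL %s on %s\n", version, renderer);
    if (renderer && std::strstr(renderer, "GDI Generic"))
        std::printf("No hardware OpenGL driver found - ask the user to "
                    "install their GPU vendor's driver.\n");
}
```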

I recommend OpenGL 3.3 (or 3.2 if you want to target Mac as well) for two reasons. The first reason is a technical one, the second is an economic one.

 

OpenGL 2.x quickly becomes a real nightmare unless you only ever do the most puny stuff. You have barely any guarantees of what is supported, and many things must be implemented using extensions. Sometimes there are competing ARB and EXT extensions (and vendor extensions), all of which you must consider, because none of them is supported everywhere. Usually they have some kind of "common functionality" that you can figure out, but sometimes they behave so differently that you must write entirely separate code paths for each. Some functionality in the spec (and in extensions) is worded in a misleading way, too. For example, you can have a card that supports multiple render targets, with a maximum of one. You can have a card that supports vertex texture fetch with zero fetches.

Most things are somewhat obscure or loosely defined; for example, you have no guarantee that textures larger than 256x256 are supported at all (you must query to be sure, but what do you do in the worst case?).
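
To make that concrete, here is a sketch of the defensive querying GL 2.x pushes onto you (assuming GLEW for the GL 2.0 headers; the fallbacks are up to you):

```cpp
// The kind of defensive querying GL 2.x forces on you: every limit can
// differ per machine, the spec minimums are tiny, and you need a
// fallback plan for each answer.
#include <GL/glew.h>

void checkGl2Limits()
{
    GLint maxTexSize = 0, maxDrawBuffers = 0, maxVertexTextures = 0;
    glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxTexSize);      // query, never assume
    glGetIntegerv(GL_MAX_DRAW_BUFFERS, &maxDrawBuffers);  // "supports MRT" can still mean 1
    glGetIntegerv(GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS,
                  &maxVertexTextures);                    // "supports VTF" can still mean 0

    if (maxDrawBuffers < 2) {
        // Fall back to one pass per render target.
    }
    if (maxVertexTextures == 0) {
        // No vertex texture fetch: do the work on the CPU or drop the effect.
    }
}
```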

 

Contrast that with OpenGL 3.3, where almost everything you will likely need (except tessellation and atomic counters, really) is guaranteed core functionality. You have guaranteed minimum limits that must be supported, and for almost everyone those minimums are good enough that you never have to think about them. A shader language that just works. No guessing.

 

The economic reason why I would not go anywhere below GL3 is that it rules out people you likely do not want as customers anyway. GL3-compatible cards have cost around $20 for about 5 years, and integrated chips support GL3 in the meantime as well (of course Intel was never a role model in OpenGL support, but most stuff kind of works most of the time now). If someone cannot or will not spend $20 on a graphics card, it's unlikely they will pay you either; they'll probably just pirate your stuff. Why burden yourself running after someone who you know isn't going to pay you?

 

As for outdated drivers, my stance is that typing "nvidia driver" into Google and clicking "yes, install please" is not much of a technical challenge. My mother can do it. If someone is unable (or unwilling) to do this, they are likely also someone you do not want as a customer. Dealing with people who cannot type two words and make two mouse clicks is a customer-service nightmare; they cannot possibly pay you enough money to make up for it.

Well, Macs are about 1/4 of the target market according to the research in my case. So it seems that implementing up to 3.2, plus a notification message telling the user to update their OpenGL drivers if needed, will be in order.

 

I have no idea yet how to jump from the default 2.1 to 3.2 with jMonkey, but I'm sure the community there knows the method, likely handled at the tool level (development software updated for OpenGL 3.2).

 

Thanks! :)

> For example, you can have a card that supports multiple render targets, with a maximum of one. You can have a card that supports vertex texture fetch with zero fetches.

ATI pulled that one on D3D9c as well -- to be D3D9c compliant, you have to support MRT and VTF, so ATI returned true for those caps, but then also returned 1 for the MRT max count, and returned false for every texture format when queried whether it was a VTF-capable format... :(
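
For illustration, a sketch of the extra probing a D3D9 app needed to find out whether MRT and VTF were genuinely usable (the d3d pointer and format choices are assumptions, not taken from the posts above):

```cpp
// Probing whether D3D9 MRT and VTF are actually usable rather than
// merely advertised. Assumes an initialized IDirect3D9* and uses example
// formats; a real app would test the formats it intends to use.
#include <d3d9.h>

bool mrtAndVtfReallyUsable(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // "Supports multiple render targets" with a count of 1 is useless.
    bool mrt = caps.NumSimultaneousRTs >= 2;

    // VTF support is per-format, so query a format you actually plan to
    // sample in the vertex shader; some drivers said no to all of them.
    bool vtf = SUCCEEDED(d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_QUERY_VERTEXTEXTURE, D3DRTYPE_TEXTURE,
        D3DFMT_A32B32G32R32F));

    return mrt && vtf;
}
```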

> > For example, you can have a card that supports multiple render targets, with a maximum of one. You can have a card that supports vertex texture fetch with zero fetches.
>
> ATI pulled that one on D3D9c as well -- to be D3D9c compliant, you have to support MRT and VTF, so ATI returned true for those caps, but then also returned 1 for the MRT max count, and returned false for every texture format when queried whether it was a VTF-capable format... :(

 

GL_ARB_occlusion_query allows the query counter bits to be 0. What's worse is that this was a deliberate decision by the ARB, made to allow vendors that don't support occlusion queries to claim GL 1.5 support; see http://www.opengl.org/archives/about/arb/meeting_notes/notes/meeting_note_2003-06-10.html for more on that one.
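
The loophole is at least easy to test for at startup; a minimal sketch, assuming GLEW for the GL 1.5 entry points:

```cpp
// Testing for the loophole at startup: a driver may advertise GL 1.5
// occlusion queries whose counters are zero bits wide, i.e. queries that
// always report zero samples passed.
#include <GL/glew.h>

bool occlusionQueriesUsable()
{
    GLint counterBits = 0;
    glGetQueryiv(GL_SAMPLES_PASSED, GL_QUERY_COUNTER_BITS, &counterBits);
    return counterBits > 0;   // 0 bits: "supported" but useless
}
```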

I remember reading an NVIDIA employee's response on OpenGL.org to a poster's annoyance that the GLSL noise function was always returning 0. The response was (and I'm paraphrasing): "the spec states it returns a number in the range [0,1], and 0 is in that range, therefore returning 0 conforms to the spec".

Sorry if this is somewhat off-topic, but the three posts above mine (especially GeneralQuery's mention of NVIDIA) remind me of the time I was porting some code from Direct3D 8 to Direct3D 9 from an older version of the NVSDK (5.21). I was particularly interested in the bump-refraction demo submitted from Japan, which used a proprietary texture format, "NVHS". I never found anywhere in NVIDIA's documentation that this texture format was only supported on the GeForce 3 and 4 Ti series GPUs, so I was getting upset that I couldn't get the feature to work on my 8400 GS M. I assumed it was just a problem with my drivers, but to be sure I asked some other people to verify whether it worked on their machines. It turns out that when checking the device caps, the driver claims the texture format is supported on all NVIDIA cards, but creating a texture with that format always fails unless your GPU is from the NV2x series.

 

I tried to warn NVIDIA of this driver bug, but to no avail. It's not too relevant now, since no one used that format (Q8W8V8U8 was more compatible anyway) and DirectX 9 is slowly but surely dying either way.

It may be veering off-topic, but all of this serves to highlight one key point relevant to the original post: vendor shenanigans are widespread, and no matter which API (or which version of an API) you choose, you still have to tread a little carefully.

Edited by mhagain
