Universal OpenGL Version


21 replies to this topic

#1 IggyZuk   Members   -  Reputation: 1071

2 Likes

Posted 26 February 2013 - 06:42 PM

I'm developing a game with OpenGL. Which OpenGL version would have no problem running on most computers, across both hardware and platforms (PC/Mac/Linux)?

Some say 2.x, because it's older and everyone can run older versions with whatever hardware they might have.
But wouldn't 3.x be much better in terms of performance, better tools, and cool effects with the programmable pipeline?

I'm a bit lost on this subject. I've heard that Mac can't even go beyond 3.2, and what about Linux?
Any feedback would be helpful, thanks :)

Start by doing what is necessary; then do what is possible; and suddenly you are doing the impossible.



#2 Chris_F   Members   -  Reputation: 2467

4 Likes

Posted 26 February 2013 - 07:06 PM

There is a tradeoff between features and audience size. Increasing the minimum system requirements gives you greater abilities but may decrease your audience size. What is more important to you, graphics fidelity or the broadest possible audience? If it's the former, go with OpenGL 4.3; if it's the latter, go with OpenGL 1.1; if it's somewhere in between... Nobody can tell you what's best for your game. Are you making a FarmVille or are you making a Crysis? What features do you feel you need to reach your artistic goals? Picking the minimum spec that gives you what you need is probably the best option.


Edited by Chris_F, 26 February 2013 - 07:08 PM.


#3 mhagain   Crossbones+   -  Reputation: 8285

6 Likes

Posted 26 February 2013 - 07:36 PM

One possible advantage of using an older version is that older codepaths are more likely to be very well supported.  One possible disadvantage is that these older codepaths may not have received more recent bugfixes or performance improvements.  As always it's a balancing act.

 

A possibly useful tip is to look at what reasonably recent OpenGL games do, and copy that.  You can use a tool like GLIntercept to peek under the covers: look at what calls they make and what versions they use, and get a reasonably decent idea of how things are structured.  The thinking behind this is that hardware vendors are more likely to optimize around (and provide more robust code paths for) popular OpenGL games, so by using the same calls you're going to be hitting driver codepaths that can be assumed to be reasonably decent.

 

Of course, the counterargument to this is that a game may be doing something dreadful that a driver needs to implement a workaround for, but at least it's a starting point, so maybe grab the latest id Software games and one or two other titles and have a look.

 

That aside, I'd personally be inclined to shoot for GL 3.x and pull in as much later functionality as possible via extensions and alternate codepaths (where those extensions are available).  Since you're starting out here, you're looking at maybe a year or so before you have something solid done, by which time things will have moved on from where they currently are.  But this all depends on your target audience, and only you can define that.

 

So, the target audience?  Key questions: what type of player are you aiming for?  Casual?  Hardcore?  Somewhere in between?  Do you want to hit laptops, business-class PCs and el-cheapo "multimedia" PCs?  How much support are you prepared to offer?  Because face it - you're never going to hit "most PCs", for the simple reason that "most PCs" includes a hell of a lot of - frankly - f--king awful things infested with cruddy startup apps that haven't seen a driver update since the day they came out of the factory.  So trim that one down - if you can get 10% to 25% coverage, but that coverage includes 95% of your target audience, you're doing OK.

 

A final tip - the fact that you're asking this suggests that this stuff is relatively new to you.  If that's the case, then maybe back off on the multiplatform ambitions for now - you've enough to be getting on with learning without having to deal with the vagaries of 3 different platforms too.  You can always pick it up a little later once you get more confident.


It appears that the gentleman thought C++ was extremely difficult and he was overjoyed that the machine was absorbing it; he understood that good C++ is difficult but the best C++ is well-nigh unintelligible.


#4 TheChubu   Crossbones+   -  Reputation: 4831

2 Likes

Posted 26 February 2013 - 08:19 PM

I've seen quite a few projects start like this. "Which is the best OGL for compatibility?" And they land on OpenGL 2.1.

 

It's true, pretty much anything you grab will support OGL 2.1 (grab a can off the street, it supports 2.1). The thing is, I've seen projects like this go on for years. OGL 2.1 was a good idea, compatibility-wise, 3 or 4 years ago - but today? When their game finally hits the street? Not so much.

 

So I'd pick up something from OpenGL 3.0 and upwards.


"I AM ZE EMPRAH OPENGL 3.3 THE CORE, I DEMAND FROM THEE ZE SHADERZ AND MATRIXEZ"

 

My journals: dustArtemis ECS framework and Making a Terrain Generator


#5 amorita   Members   -  Reputation: 138

3 Likes

Posted 26 February 2013 - 09:17 PM

I've worked on a commercial OpenGL game for several years, and most of my work was in the graphics part of the code.  Speaking from experience, most of the problems we ran into were due to people not having up-to-date OpenGL drivers installed.

 

Most people (not most hard-core gamers, but most casual and non-gamers) have integrated graphics solutions (integrated Intel or mobile AMD/NVidia in a laptop) and rarely or never update their drivers from when they first got their machine.  It works well enough for them to surf the web, e-mail, and do their work (editing Word/Excel/PowerPoint docs), so they never have an urgent need to update their video drivers.  Also, many of them feel that updating drivers is a difficult thing to do (too technical for them) and are afraid that they will mess up their system.

 

In addition, the OpenGL support of integrated video chipsets is not necessarily the best to begin with.  And Intel/AMD/NVidia do not provide updates for their older integrated video chipsets, which are still in use by many people.  So, some of these people are stuck with older drivers that have known OpenGL bugs.

 

In reality, there are a lot more games that use DirectX than use OpenGL (easily a 10 to 1 ratio).  So Intel/AMD/NVidia have not had much incentive to keep the quality of their OpenGL drivers on par with the quality of their DirectX drivers.  But the quality of OpenGL drivers has greatly improved in the past few years.

 

So, the good news is that the quality of OpenGL drivers is improving.  The bad news is that a lot of people are still using (or stuck with) older, buggy OpenGL drivers.



#6 Sik_the_hedgehog   Crossbones+   -  Reputation: 1836

2 Likes

Posted 27 February 2013 - 01:09 AM

Also, many of them feel that updating drivers is a difficult thing to do (too technical for them) and are afraid that they will mess up their system.

Honestly, this has mainly to do with one of the most common recommendations for installing drivers, which is to go into safe mode, uninstall the old driver, and install the new one. You don't have to, but it isn't hard to see where the issue lies. Besides, people think that if it already works as-is it's probably fine, not realizing that old drivers may be leaving features unused (e.g. the drivers bundled with the GeForce 7 expose OpenGL 2.0, but the newest drivers provide OpenGL 2.1).

 

 

In reality, there are a lot more games that use DirectX than use OpenGL (easily 10 to 1 ratio).  So, Intel/AMD/NVidia have not had too much incentive to keep the quality of their OpenGL drivers on par with the quality of their DirectX driver.  But, the quality of the OpenGL drivers in the past few years has greatly improved.

It's a chicken-and-egg situation: if nobody uses OpenGL, there's no incentive to improve its support, which in turn means nobody wants to use it, and... well, it's a self-reinforcing loop. I think id is pretty much the only reason it didn't die completely. At least OpenGL 3 seems to have gotten all vendors back into OpenGL, just because apparently it had enough of a reputation that lack of support would look stupid (maybe the backlash when Vista was implied to lack OpenGL support was a hint, even if that turned out to be false later).

 

 

The bad news is that a lot of people are still using (or stuck with) older, buggy OpenGL drivers.

I wouldn't expect those to care about gaming anyway ^^; (or to have something that supports anything newer than 1.1 for that very reason...)

 

EDIT: basically, if you care about people with old systems (especially people in e.g. developing countries, where hardware can be quite expensive), OpenGL 2 may be a good compromise. If you expect some decent hardware, OpenGL 3 would be better. I'd say OpenGL 4 is better treated as optional for now, unless you really need the most powerful hardware (i.e. support it if you want, but don't assume it'll be very common).

 

If somebody is stuck with OpenGL 1, that's most likely the kind of person you wouldn't want to bother targeting anyway... Either their hardware is pretty weak and will slow down without much effort, or they're the kind of people who'd rather stick to browser games (if they play games at all).


Edited by Sik_the_hedgehog, 27 February 2013 - 01:13 AM.

Don't pay much attention to "the hedgehog" in my nick, it's just because "Sik" was already taken =/ By the way, Sik is pronounced like seek, not like sick.

#7 blueshogun96   Crossbones+   -  Reputation: 1107

5 Likes

Posted 27 February 2013 - 04:05 AM

I know this would be more work, but in my experience I've found it easier to allow the user to choose which OpenGL version they want/need.  For example, on Windows I like to create a basic API for gfx and write separate .dll files for the various differing APIs, then load those via ::LoadLibrary(), similar to what Unreal Tournament does.  It helps when compatibility issues occur, but at the same time you can't limit yourself for the sake of a very small percentage of users.  If id Software did that, then Doom 3 wouldn't have done so well (IMO).
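A minimal sketch of that DLL-per-backend approach on Windows, assuming a hypothetical IRenderer interface and a CreateRenderer factory exported from each backend DLL (all names here are illustrative, not taken from Unreal Tournament or any real engine):

```cpp
// Hypothetical renderer-selection sketch for Windows. Each backend DLL
// (e.g. render_gl2.dll, render_gl3.dll - invented names) exports a
// CreateRenderer factory returning an implementation of IRenderer.
#include <windows.h>

struct IRenderer {
    virtual bool Init(HWND window) = 0;
    virtual void DrawFrame() = 0;
    virtual void Shutdown() = 0;
    virtual ~IRenderer() {}
};

typedef IRenderer* (*CreateRendererFn)();

IRenderer* LoadRendererBackend(const char* dllName) {
    HMODULE dll = ::LoadLibraryA(dllName);              // pick the backend at runtime
    if (!dll) return 0;

    CreateRendererFn create =
        (CreateRendererFn)::GetProcAddress(dll, "CreateRenderer");
    if (!create) { ::FreeLibrary(dll); return 0; }

    return create();                                    // backend-specific renderer
}

// Usage: try the newest backend first, fall back to older ones.
// IRenderer* renderer = LoadRendererBackend("render_gl3.dll");
// if (!renderer) renderer = LoadRendererBackend("render_gl2.dll");
```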

 

It's been asked before, but what exactly is your target audience?  What type of game(s) do you have in mind?  If you're doing a 2D game or a really basic 3D game without much gfx complexity, then it won't really matter so much.  If you're doing a 2D game with some advanced effects, then I'd recommend doing no less than 2.x.  If you're doing a hardcore 3D game, then OpenGL 3 or higher is pretty much an absolute must.

 

I'm working on a 2D OpenGL game for Mac OS X.  Since most Intel-based Macs and MacBook Pros support OpenGL 3.2, using it isn't going to affect compatibility very much, is it?  Windows and Linux users, on the other hand, have a greater variety of hardware and drivers, so on the latter two OSes compatibility is more of an issue.  You also have to keep in mind that as hardware evolves, the API also evolves to fit the needs of the hardware, not vice versa.  Some vendors have to jump through hoops to maintain compatibility with OpenGL 1.1!  2.x is not so bad, but OpenGL was in dire need of a rewrite (and still is) due to its limitations on current hardware.

 

Overall, you need to do what's best for your game!  Not what's best for the stubborn user who refuses to get with the times.  Sorry to sound like a douchebag, but there comes a time when the line must be drawn and the cut-off point has to be enforced so we can move on without looking back.  Example: unless you're catering to a very specific group with that interest, would you make your game compatible with those ol' DOS users?  Or would you use DirectX 3.0 for those retro PC gamers?  As much as I enjoyed the days of DOS, and even more so, as much as I would absolutely LOVE to go back to 1997 and write PC games with DirectX 3.0, I can't let such things hinder my game's progression and potential.  If you use the latest version of OpenGL, I doubt you'd get many people complaining about compatibility, unless it's a casual game that lots of everyday people will be using.  Then you'd get some 30+ year old soccer mom wondering why it's not working on her netbook or low-end laptop.

 

Sorry for ranting, that's just my view. ^^

 

Shogun.


Follow Shogun3D on the official website: http://shogun3d.net

 


 

"Yo mama so fat, she can't be frustum culled." - yoshi_lol


#8 mhagain   Crossbones+   -  Reputation: 8285

1 Like

Posted 27 February 2013 - 04:10 AM

The bad news is that a lot of people are still using (or stuck with) older, buggy OpenGL drivers.

I wouldn't expect those to care about gaming anyway ^^; (or to have something that supports anything newer than 1.1 for that very reason...)

 

You would be surprised.  Check out the Steam or Bethesda forums for Rage - there were an awful lot of so-called "hardcore gamers" who had issues because they never bothered updating their drivers, not to mention an awful lot more who had issues because they were randomly copying individual DLL files all over their systems without any clear knowledge of what they were doing.  (It's also a good example of how poor driver support can mess up a game.)


It appears that the gentleman thought C++ was extremely difficult and he was overjoyed that the machine was absorbing it; he understood that good C++ is difficult but the best C++ is well-nigh unintelligible.


#9 IggyZuk   Members   -  Reputation: 1071

1 Like

Posted 27 February 2013 - 09:43 AM

Many thanks for the feedback everyone.

Turns out this is the ugly part of game dev. Hopefully pumping up the system requirements, plus some proper error handling, will make people aware of what they need.

I'm targeting people with decent computers - something that can render 3D graphics with post-processing at a playable fps. I really REALLY want to avoid the old pipeline; it just seems dirty. Do any newer AAA games even use the old pipeline these days?

 

For example, I'm interested to know which versions of OGL Valve uses for their games on Mac.

 

And I'll probably just end up going with 3.2 - it seems to be the better choice.


Start by doing what is necessary; then do what is possible; and suddenly you are doing the impossible.


#10 wintertime   Members   -  Reputation: 1887

1 Like

Posted 27 February 2013 - 03:00 PM

Theoretically you could also program in a relatively modern style with VBOs and shaders even on 2.1, if you accept a few quirks and don't need all the new features.
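A minimal sketch of that "modern style on 2.1" idea - vertex data in a VBO, drawn with a GLSL program and no fixed-function state - assuming an extension loader such as GLEW and an already-created 2.1 context; the attribute name and vertex layout are made up for illustration:

```cpp
// "Modern style" on GL 2.1: vertex data in a VBO, drawn with a shader program
// instead of fixed-function state. Assumes GLEW (or similar) is initialized
// on a valid 2.1 context and that 'program' is a linked GLSL 1.20 program.
#include <GL/glew.h>

GLuint CreateTriangleVBO() {
    const GLfloat verts[] = { -0.5f, -0.5f,   0.5f, -0.5f,   0.0f, 0.5f };
    GLuint vbo = 0;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);
    return vbo;
}

void DrawTriangle(GLuint program, GLuint vbo) {
    glUseProgram(program);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);

    // "a_position" is a hypothetical attribute name declared in the vertex shader.
    GLint pos = glGetAttribLocation(program, "a_position");
    glEnableVertexAttribArray(pos);
    glVertexAttribPointer(pos, 2, GL_FLOAT, GL_FALSE, 0, 0);

    glDrawArrays(GL_TRIANGLES, 0, 3);

    glDisableVertexAttribArray(pos);
}
```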

If you can accept that people with weak onboard chips won't get to play, then 3.x should be fine.



#11 mhagain   Crossbones+   -  Reputation: 8285

2 Likes

Posted 27 February 2013 - 03:32 PM

I'm targeting people with decent computers, something that can render 3D graphics with post processing at a playable fps....

 

In that case go for 3.x - it's all achievable with earlier versions for sure, but you'll have a much nicer time using 3.x.

 

One project I was involved in up to maybe this time last year (where initially I had thought I was being brought in just to optimize the renderer): one of the leads was absolutely insistent on the "what about older hardware?" line, yet was also pushing very heavily for lots of post-processing, lots of complex geometry, lots of real-time dynamic lighting, etc.  I ended up with an insane mixture of core GL 1.4 with a software wrapper around VBOs, ARB assembly programs, glCopyTexSubImage2D, multiple codepaths for everything, and an edifice so fragile that I was terrified of even bugfixing it (the fact that it was built on an originally GL 1.1 codebase that had been fairly crankily and inflexibly maintained to that point didn't help).  It was a nightmare - I walked out one day without saying a word and just didn't come back.

 

It's just not worth going down that route - you'll only burn yourself out.  So either dial back the ambitions and use an earlier version, or else keep the ambitions and use the most reasonable sane recent version.  But don't try to mix the two.


Edited by mhagain, 27 February 2013 - 03:35 PM.

It appears that the gentleman thought C++ was extremely difficult and he was overjoyed that the machine was absorbing it; he understood that good C++ is difficult but the best C++ is well-nigh unintelligible.


#12 3Ddreamer   Crossbones+   -  Reputation: 3167

0 Likes

Posted 18 March 2013 - 09:22 PM

Hi,

 

I have what I believe is a relevant question here, which I'm actually handling in a job project: making a 2D game with jMonkey that can run through OpenGL on WinXP or higher.

 

OpenGL 2.1, which my jMonkey installation has, is my heavy favorite for WinXP-or-higher compatibility.  I don't need any advanced OpenGL features.  Am I on the right track?

 

Where can I get information on what version of OpenGL ships with WinXP, Vista, Win7, and Win8?  (Really I am only interested in WinXP to meet the minimum OpenGL requirements.)

 

:)


Edited by 3Ddreamer, 18 March 2013 - 09:24 PM.

Personal life and your private thoughts always affect your career. Research is the intellectual backbone of game development and the first order. Version Control is crucial for full management of applications and software.  The better the workflow pipeline, then the greater the potential output for a quality game.  Completing projects is the last but finest order.

 

by Clinton, 3Ddreamer


#13 mhagain   Crossbones+   -  Reputation: 8285

1 Like

Posted 19 March 2013 - 04:20 AM

All versions of Windows ship with OpenGL 1.1 (with a small handful of extensions), but this is a software-emulated OpenGL.  The key thing here is that OpenGL isn't part of the OS, so it doesn't really make sense to talk about "what version of OpenGL ships with Windows".  OpenGL is implemented in your 3D card's driver, so it's shipped by the 3D hardware vendor.
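One practical consequence: it's worth checking at runtime what you actually got after creating a context. A small sketch, assuming a current context - Microsoft's software implementation typically identifies itself as "GDI Generic" in the renderer string, so seeing that (or a 1.x version string) usually means no hardware driver is installed:

```cpp
// After context creation: log which implementation we actually got and warn
// about the Windows software fallback. Assumes a current GL context.
#include <GL/gl.h>
#include <cstdio>
#include <cstring>

void CheckGLImplementation() {
    const char* version  = (const char*)glGetString(GL_VERSION);
    const char* renderer = (const char*)glGetString(GL_RENDERER);
    const char* vendor   = (const char*)glGetString(GL_VENDOR);

    std::printf("GL_VERSION:  %s\n", version  ? version  : "(null)");
    std::printf("GL_RENDERER: %s\n", renderer ? renderer : "(null)");
    std::printf("GL_VENDOR:   %s\n", vendor   ? vendor   : "(null)");

    // Microsoft's built-in software OpenGL 1.1 reports "GDI Generic" here.
    if (renderer && std::strstr(renderer, "GDI Generic")) {
        std::printf("Warning: software OpenGL 1.1 - is a hardware driver installed?\n");
    }
}
```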


It appears that the gentleman thought C++ was extremely difficult and he was overjoyed that the machine was absorbing it; he understood that good C++ is difficult but the best C++ is well-nigh unintelligible.


#14 samoth   Crossbones+   -  Reputation: 5038

3 Likes

Posted 19 March 2013 - 08:13 AM

I recommend OpenGL 3.3 (or 3.2 if you want to target Mac as well) for two reasons. The first reason is a technical one, the second is an economic one.

 

OpenGL 2.x quickly becomes a real nightmare, unless you only ever do the most puny stuff. You have barely any guarantees of what is supported, and many things must be implemented using extensions. Sometimes there are different ARB and EXT extensions (and vendor extensions), all of which you must consider, because none of them is supported everywhere. Usually they have some kind of "common functionality" that you can figure out, but sometimes they behave considerably differently, so you must write entirely different code paths for each. Some functionality in the spec (and in extensions) is deliberately worded in a misleading way, too. For example, you can have a card that supports multiple render targets, with a maximum of one. You can have a card that supports vertex texture fetch, with zero fetches.

Most things are kind of obscure or loosely defined; for example, you have no guarantee that textures larger than 256x256 are supported at all (you must query to be sure, but what do you do in the worst case?).
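For a feel of what that querying looks like in practice on a 2.x context, here's a small sketch; the ARB/EXT framebuffer-object pair is just one example of the duplicated extensions mentioned above:

```cpp
// GL 2.x era capability probing: query implementation limits, then check
// which of the duplicate ARB/EXT extensions (if any) the driver exposes.
#include <GL/gl.h>
#include <cstdio>
#include <cstring>

void ProbeCaps() {
    GLint maxTexSize = 0;
    glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxTexSize);  // the guaranteed minimum is tiny, so always ask
    std::printf("Max texture size: %d\n", maxTexSize);

    const char* ext = (const char*)glGetString(GL_EXTENSIONS);
    bool hasArbFbo = ext && std::strstr(ext, "GL_ARB_framebuffer_object") != 0;
    bool hasExtFbo = ext && std::strstr(ext, "GL_EXT_framebuffer_object") != 0;

    if (hasArbFbo)      std::printf("Using ARB framebuffer objects\n");
    else if (hasExtFbo) std::printf("Falling back to EXT framebuffer objects\n");
    else                std::printf("No render-to-texture path available\n");
}
```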

 

Put that in contrast to OpenGL 3.3, where almost everything you will likely need (except tessellation and atomic counters, really) is guaranteed as core functionality. You have guaranteed minimum specs that must be supported. For almost everyone, these guaranteed minimums are good enough, so you never have to think about them. A shader language that just works. No guessing.

 

The economic reason why I would not go anywhere below GL3 is that it rules out people you likely do not want as customers (I wouldn't want them anyway!). GL3-compatible cards have been around $20 for about 5 years. Integrated chips support GL3 in the meantime as well (of course Intel was never a role model in OpenGL support, but most stuff kind of works most of the time now). If someone cannot or does not want to spend $20 on a graphics card, it's unlikely they will pay you either. They'll probably just pirate your stuff. Why should you burden yourself running after someone who you know isn't going to pay you?

 

About outdated drivers: my stance is that typing "nvidia driver" into Google and clicking "yes, install please" is not too much of a technical challenge. My mother can do that. If someone is unable (or unwilling) to do this, they are likely also someone you do not want as a customer. Dealing with people who cannot type two words and do two mouse clicks is a customer-service nightmare. They cannot possibly pay you enough money to make up for that.



#15 EddieV223   Members   -  Reputation: 1407

1 Like

Posted 19 March 2013 - 01:17 PM

I think the sweet spot is 3.3, but since Mac only supports 3.2 it leaves them out.  3.3 is akin to 4.0 but for DX10 cards, so it's the modern version of the API for legacy cards too.
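If you go that route, the version has to be requested explicitly at context creation. Here's a minimal sketch using GLFW 3 as the windowing layer (an assumption - any library with context-version hints works the same way); on OS X the core-profile and forward-compatible hints are what get you a 3.2 context instead of the legacy 2.1 one:

```cpp
// Requesting an OpenGL 3.2 core-profile context with GLFW 3.
// On OS X this is what gets you 3.2 instead of the legacy 2.1 context.
#include <GLFW/glfw3.h>

GLFWwindow* CreateWindow32Core() {
    if (!glfwInit()) return 0;

    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);   // required on OS X

    GLFWwindow* window = glfwCreateWindow(1280, 720, "GL 3.2 core", 0, 0);
    if (window) glfwMakeContextCurrent(window);
    return window;   // a null return means the driver couldn't provide 3.2 core
}
```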


If this post or signature was helpful and/or constructive please give rep.

 

// C++ Video tutorials

http://www.youtube.com/watch?v=Wo60USYV9Ik

 

// Easy to learn 2D Game Library c++

SFML2.1 Download http://www.sfml-dev.org/download.php

SFML2.1 Tutorials http://www.sfml-dev.org/tutorials/2.1/

 

// SFML 2 book

http://www.amazon.com/gp/product/1849696845/ref=as_li_ss_tl?ie=UTF8&camp=1789&creative=390957&creativeASIN=1849696845&linkCode=as2&tag=gamer2creator-20

 


#16 3Ddreamer   Crossbones+   -  Reputation: 3167

0 Likes

Posted 19 March 2013 - 05:38 PM

Well, Macs are about 1/4 of the target market according to research, in my case.  So it seems that implementing up to 3.2, plus a notification message telling the user to update their OpenGL drivers if needed, will be in order.

 

I have no idea yet how to jump from the default 2.1 to 3.2 with jMonkey, but I'm sure the community there has a method, likely done at the tool level (having the development software updated for OpenGL 3.2).

 

Thanks! :)


Personal life and your private thoughts always affect your career. Research is the intellectual backbone of game development and the first order. Version Control is crucial for full management of applications and software.  The better the workflow pipeline, then the greater the potential output for a quality game.  Completing projects is the last but finest order.

 

by Clinton, 3Ddreamer


#17 px   Members   -  Reputation: 523

0 Likes

Posted 19 March 2013 - 08:58 PM

You could always take a look at the Steam Hardware Survey. They don't directly check OpenGL support (not sure why not), but you can at least get an idea of which cards gamers are using by checking the video card results. That might give you a bit of insight anyway.

Humble people don't refer to themselves as humble (W.R.T. "IMHO".)

#18 Hodgman   Moderators   -  Reputation: 32008

0 Likes

Posted 19 March 2013 - 09:19 PM

For example, you can have a card that supports multiple render targets, with a maximum of one. You can have a card that supports vertex texture fetch with zero fetches.

ATI pulled that one on D3D9c as well -- to be D3D9c compliant you have to support MRT and VTF, so ATI returned true for those caps, but then also returned 1 for the max MRT count, and returned false for every texture format when queried whether it was VTF-capable... :(
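For reference, a rough sketch of what that caps-checking looks like on the D3D9 side (illustrative only - the exact flow applications used varies); both checks can "pass" on paper while the feature remains useless:

```cpp
// D3D9 side of the same problem: a device can report SM3-level caps yet
// advertise only one simultaneous render target and reject every texture
// format when queried for vertex-texture-fetch support.
#include <d3d9.h>

void CheckMrtAndVtf(IDirect3D9* d3d) {
    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    DWORD mrtCount = caps.NumSimultaneousRTs;   // "supports MRT"... possibly with a max of 1

    // Vertex texture fetch: each format has to be queried individually.
    HRESULT hr = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_QUERY_VERTEXTEXTURE, D3DRTYPE_TEXTURE, D3DFMT_R32F);

    bool vtfR32F = SUCCEEDED(hr);   // false on the hardware described above

    (void)mrtCount;
    (void)vtfR32F;
}
```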



#19 mhagain   Crossbones+   -  Reputation: 8285

2 Likes

Posted 20 March 2013 - 04:08 AM

For example, you can have a card that supports multiple render targets, with a maximum of one. You can have a card that supports vertex texture fetch with zero fetches.

ATI pulled that one on D3D9c as well -- to be D3D9c compliant you have to support MRT and VTF, so ATI returned true for those caps, but then also returned 1 for the max MRT count, and returned false for every texture format when queried whether it was VTF-capable... :(

 

GL_ARB_occlusion_query allows the query counter bits to be 0 - what's worse is that this was a deliberate decision by the ARB, made so as to allow vendors that don't support occlusion queries to claim GL 1.5 support; see http://www.opengl.org/archives/about/arb/meeting_notes/notes/meeting_note_2003-06-10.html for more info on that one.
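The defensive check on the GL side is cheap, so it's worth doing before relying on occlusion queries; a small sketch, assuming a 1.5+ context (or the ARB extension) and a loader such as GLEW:

```cpp
// GL_ARB_occlusion_query / GL 1.5: the counter may legally have zero bits,
// in which case query results are meaningless and the feature is unusable.
#include <GL/glew.h>

bool OcclusionQueriesUsable() {
    GLint counterBits = 0;
    glGetQueryiv(GL_SAMPLES_PASSED, GL_QUERY_COUNTER_BITS, &counterBits);
    return counterBits > 0;   // zero bits == "supported" in name only
}
```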


It appears that the gentleman thought C++ was extremely difficult and he was overjoyed that the machine was absorbing it; he understood that good C++ is difficult but the best C++ is well-nigh unintelligible.


#20 GeneralQuery   Crossbones+   -  Reputation: 1263

0 Likes

Posted 20 March 2013 - 04:49 AM

I remember reading an nVidia employee's response on OpenGL.org to a poster's annoyance that the noise function was always returning 0. The response was (and I'm paraphrasing) "the specs state to return a number in the range [0,1], therefore returning 0 conforms to the spec".





