Universal OpenGL Version

Started by
19 comments, last by 21st Century Moose 11 years, 1 month ago

Developing a game with OpenGL, which OpenGL version would have no problem running on most computers, in terms of both hardware and platform (PC/Mac/Linux)?

Some say 2.x because it's older and everyone is capable of running older versions with whatever hardware they might have.
But wouldn't 3.x be much better in terms of performance, better tools and cool effects with the programmable pipeline?
I'm a bit lost on this subject; I've heard that Mac can't even go beyond 3.2, and what about Linux?
Any feedback would be helpful, thanks :)

Start by doing what is necessary; then do what is possible; and suddenly you are doing the impossible.


There is a tradeoff between features and audience size. Increasing the minimum system requirements gives you greater abilities but may decrease your audience size. What is more important to you, graphics fidelity or the broadest possible audience? If it's the former, go with OpenGL 4.3; if it's the latter, go with OpenGL 1.1; if it's somewhere in between... nobody can tell you what's best for your game. Are you making a FarmVille or are you making a Crysis? What features do you feel you need to reach your artistic goals? Picking the minimum spec that gives you what you need is probably the best option.

One possible advantage of using an older version is that older codepaths are more likely to be very well supported. One possible disadvantage is that these older codepaths may not have received more recent bugfixes or performance improvements. As always, it's a balancing act.

A useful tip may be to look at what reasonably recent OpenGL games do, and copy that. You can use a tool like GLIntercept to peek under the covers: look at what calls they make and what versions they use, and get a reasonably decent idea of how things are structured. The thinking behind this is that hardware vendors are more likely to optimize around (and provide more robust code paths for) popular OpenGL games, so by using the same calls you're going to be hitting driver codepaths that can be assumed to be reasonably decent.

Of course, the counterargument to this is that a game may be doing something dreadful that a driver needs to implement a workaround for, but at least it's a starting point, so maybe grab the latest id Software games and one or two other titles and have a look.

That aside, I'd personally be inclined to shoot for GL3.x and pull in as much later functionality as possible via extensions and alternate codepaths (where those extensions are available). Since you're starting out here, you're looking at maybe a year or so before you have something solid done, by which time things will have moved on from where they currently are. But this all depends on your target audience, and only you can define that.
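To illustrate the "extensions as alternate codepaths" idea, here's a minimal sketch of checking for an extension by name. The helper name `has_extension` is made up for illustration; the space-separated list is the format returned by glGetString(GL_EXTENSIONS) on a GL 2.x context. Note that a plain strstr() is not enough, because one extension name can be a prefix of another.

```c
#include <string.h>

/* Return 1 if `name` appears as a whole token in the space-separated
 * extension string `ext_list` (as returned by glGetString(GL_EXTENSIONS)
 * on a GL 2.x context). Whole-token matching matters because one
 * extension name can be a prefix of another. */
static int has_extension(const char *ext_list, const char *name)
{
    size_t len = strlen(name);
    const char *p = ext_list;

    while ((p = strstr(p, name)) != NULL) {
        int starts_token = (p == ext_list) || (p[-1] == ' ');
        int ends_token   = (p[len] == '\0') || (p[len] == ' ');
        if (starts_token && ends_token)
            return 1;
        p += len;
    }
    return 0;
}
```

On a 3.x core context the extension string is gone; you'd enumerate extensions one at a time with glGetStringi(GL_EXTENSIONS, i) instead.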

So, the target audience? Key questions would be: what type of player are you aiming for? Casual? Hardcore? Somewhere in between? Do you want to hit laptops, business-class PCs and el-cheapo "multimedia" PCs? How much support are you prepared to offer? Because face it: you're never going to hit "most PCs", for the simple reason that "most PCs" includes a hell of a lot of, frankly, f--king awful machines infested with cruddy startup apps that haven't seen a driver update since the day they came out of the factory. So trim that one down: if you can get 10% to 25% coverage, but that coverage includes 95% of your target audience, you're doing OK.

A final tip - the fact that you're asking this suggests that this stuff is relatively new to you. If that's the case, then maybe back off on the multiplatform ambitions for now - you've enough to be getting on with learning without having to deal with the vagaries of 3 different platforms too. You can always pick it up a little later once you get more confident.

Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.

I've seen quite a few projects start like this. "Which is the best OGL for compatibility?" And they land on OpenGL 2.1.

It is true, pretty much anything you grab will support OGL 2.1 (grab a can off the street, it supports 2.1). The thing is, I've seen projects like this go on for years. OGL 2.1 was a good idea, compatibility wise, 3 or 4 years ago. Today? When their game finally hits the street? Not so much.

So I'd pick up something from OpenGL 3.0 and upwards.

"I AM ZE EMPRAH OPENGL 3.3 THE CORE, I DEMAND FROM THEE ZE SHADERZ AND MATRIXEZ"

My journals: dustArtemis ECS framework and Making a Terrain Generator

I've worked on a commercial OpenGL game for several years and most of my work was in the graphics part of the code. Speaking from experience, most of the problems we ran into were due to people not having up-to-date OpenGL drivers installed.

Most people (not most hard-core gamers, but most casual and non-gamers) have integrated graphics solutions (integrated Intel or mobile AMD/NVidia in a laptop) and rarely or never update their drivers from when they first get their machine. It works well enough for them to surf the web, e-mail, and do their work (editing Word/Excel/PowerPoint docs) that they never have an urgent need to update their video drivers. Also, many of them feel that updating drivers is a difficult thing to do (too technical for them) and are afraid that they will mess up their system.

In addition, the OpenGL support of integrated video chipsets is not necessarily the best to begin with. And Intel/AMD/NVidia do not provide updates for their older integrated video chipsets, which are still in use by many people. So some of these people were stuck with old drivers with known OpenGL bugs.

In reality, there are a lot more games that use DirectX than use OpenGL (easily 10 to 1 ratio). So, Intel/AMD/NVidia have not had too much incentive to keep the quality of their OpenGL drivers on par with the quality of their DirectX driver. But, the quality of the OpenGL drivers in the past few years has greatly improved.

So, the good news is that the quality of OpenGL drivers is improving. The bad news is that a lot of people are still using (or stuck with) older, buggy OpenGL drivers.
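Given how common outdated drivers are, one practical defense is to check the reported GL version at startup and fail with a clear "please update your drivers" message rather than crashing later. A minimal sketch, assuming you'd feed it the string from glGetString(GL_VERSION) on a valid context (the function name `parse_gl_version` is made up for illustration):

```c
#include <stdio.h>

/* Parse the leading major/minor numbers out of a GL_VERSION string such
 * as "2.1.2 NVIDIA 173.14.12" or "3.2 Mesa 9.0". Returns 1 on success,
 * 0 if the string is missing or malformed. */
static int parse_gl_version(const char *version, int *major, int *minor)
{
    if (version == NULL)
        return 0;
    return sscanf(version, "%d.%d", major, minor) == 2;
}
```

If the parse fails or the version is below your minimum, tell the user exactly which driver and version you found; that turns a mystery crash into a support e-mail you can actually answer.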

Also, many of them feel that updating drivers is a difficult thing to do (too technical for them) and are afraid that they will mess up their system.

Honestly, this is mainly due to one of the most common recommendations for installing drivers, which is to boot to safe mode, uninstall the old driver, and then install the new one. You don't actually have to, but it isn't hard to see where the fear comes from. Besides, people think that if it already works as-is it's probably fine, not realizing that old drivers may leave features unused (e.g. the drivers bundled with the GeForce 7 expose OpenGL 2.0, while the newest drivers provide OpenGL 2.1).

In reality, there are a lot more games that use DirectX than use OpenGL (easily 10 to 1 ratio). So, Intel/AMD/NVidia have not had too much incentive to keep the quality of their OpenGL drivers on par with the quality of their DirectX driver. But, the quality of the OpenGL drivers in the past few years has greatly improved.

It's a chicken-and-egg situation: if nobody uses OpenGL, there's no incentive to improve its support, which in turn means nobody wants to use it, and... well, it's a self-reinforcing loop. I think id is pretty much the only reason it didn't die completely. At least OpenGL 3 seemed to have gotten all vendors back into OpenGL, just because apparently it had enough of a reputation to make lack of support look stupid (maybe the backlash when Vista was implied to lack OpenGL support was a hint, even if it turned out to be false later).

The bad news is that a lot of people are still using (or stuck with) older, buggy OpenGL drivers.

I wouldn't expect those to care about gaming anyway ^^; (or to have something that supports anything newer than 1.1 for that very reason...)

EDIT: basically, if you care about people with old systems (especially people in e.g. developing countries, where hardware can be considered quite expensive), OpenGL 2 may be a good compromise. If you expect some decent hardware, OpenGL 3 would be better. I'd say OpenGL 4 is better treated as optional for now, unless you really need the most powerful hardware (i.e. support it if you want, but don't assume it'll be very common).

If somebody is stuck with OpenGL 1, that's most likely the kind of person you wouldn't want to bother targeting anyway... Either their hardware is pretty weak and will slow down without much effort, or they're the kind of people who'd rather stick to browser games (if they play games at all).

Don't pay much attention to "the hedgehog" in my nick, it's just because "Sik" was already taken =/ By the way, Sik is pronounced like seek, not like sick.

I know this would be more work, but in my experience I found it easier to allow the user to choose which OpenGL version they want/need. For example, on Windows I like to create a basic API for gfx and write separate .dll files for the various differing APIs, then load those via ::LoadLibrary(), similar to what Unreal Tournament does. It helps when compatibility issues occur, but at the same time you can't limit yourself for the sake of a very small percentage of users. If id Software had done that, Doom 3 wouldn't have done so well (IMO).
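The swappable-backend idea above can be sketched with a small function-pointer table. In a real build each backend would live in its own DLL/.so loaded with LoadLibrary()/dlopen(); here two stub backends stand in for the GL 2.x and GL 3.x paths, and all names are made up for illustration:

```c
#include <stdio.h>
#include <string.h>

/* A minimal renderer interface: the game only ever talks to this
 * struct, never to a specific GL version directly. */
typedef struct Renderer {
    const char *name;
    int  (*init)(void);        /* returns 1 if this backend can run */
    void (*draw_frame)(void);
} Renderer;

static int  gl2_init(void) { return 1; }
static void gl2_draw(void) { puts("drawing with GL 2.x path"); }

static int  gl3_init(void) { return 1; }
static void gl3_draw(void) { puts("drawing with GL 3.x path"); }

/* Preferred backend first, fallback after it. */
static const Renderer renderers[] = {
    { "gl3", gl3_init, gl3_draw },
    { "gl2", gl2_init, gl2_draw },
};

/* Pick the first backend whose init() succeeds. */
static const Renderer *pick_renderer(void)
{
    size_t i;
    for (i = 0; i < sizeof(renderers) / sizeof(renderers[0]); i++)
        if (renderers[i].init())
            return &renderers[i];
    return NULL;
}
```

The nice property is that the fallback logic lives in one place: if the GL 3.x backend's init() fails (no context, missing extensions), the game silently drops to the GL 2.x path without the rest of the code caring.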

It's been asked before, but what exactly is your target audience? What type of game(s) do you have in mind? If you're doing a 2D game or a really basic 3D game without much gfx complexity, then it won't really matter so much. If you're doing a 2D game with some advanced effects, then I'd recommend doing no less than 2.x. If you're doing a hardcore 3D game, then OpenGL 3 or higher is like an absolute must.

I'm working on a 2D OpenGL game for MacOSX. Since most Intel-based Macs and MacBook Pros support OpenGL 3.2, using it isn't going to affect compatibility very much, is it? Windows and Linux users, on the other hand, have a greater variety of hardware and drivers, so on the latter two OSes compatibility is more of an issue. You also have to keep in mind that as hardware evolves, the API also evolves to fit the needs of the hardware, not vice versa. Some vendors have to jump through hoops to maintain compatibility with OpenGL 1.1! 2.x is not so bad, but OpenGL was in dire need of a rewrite (and still is) due to its limitations on current hardware.

Overall, you need to do what's best for your game! Not what's best for the stubborn user who refuses to get with the times. Sorry to sound like a douchebag, but there comes a time when the line must be drawn and the cut-off point has to be enforced so we can move on without looking back. Example: unless you're catering to a very specific group with this interest, would you make your game compatible with ol' DOS users? Or would you use DirectX 3.0 for those retro PC gamers? As much as I enjoyed the days of DOS, and even more so, as much as I would absolutely LOVE to go back to 1997 and write PC games with DirectX 3.0, I can't let such things hinder my game's progression and potential. If you use the latest version of OpenGL, I doubt you'd get many people complaining about compatibility, unless it's a casual game that lots of everyday people will be using. Then you'd get some 30+ year old soccer mom wondering why it's not working on her netbook or low-end laptop.

Sorry for ranting, that's just my view. ^^

Shogun.

The bad news is that a lot of people are still using (or stuck with) older, buggy OpenGL drivers.

I wouldn't expect those to care about gaming anyway ^^; (or to have something that supports anything newer than 1.1 for that very reason...)

You would be surprised. Check out the Steam or Bethesda forums for Rage - there was an awful lot of so-called "hardcore gamers" who had issues because they never bothered updating their drivers, not to mention an awful lot more who had issues because they were randomly copying individual DLL files all over their systems without any clear knowledge of what they were doing. (It's also a good example of how poor driver support can mess up a game.)


Many thanks for the feedback everyone.

Turns out this is the ugly part of game dev. Hopefully pumping up the system requirements, plus some proper error handling, will make people aware of what they need.
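One small gotcha worth noting for that error handling: a GL version is a (major, minor) pair, not a decimal number, so a naive float comparison would rank 3.9 above 3.10. A tiny sketch of a correct minimum-version check (the function name is made up for illustration):

```c
/* Return 1 if the reported GL version (major, minor) meets the
 * required minimum, comparing the components as a pair rather than
 * as a decimal number (so 3.10 correctly beats 3.9). */
static int meets_min_version(int major, int minor, int req_major, int req_minor)
{
    if (major != req_major)
        return major > req_major;
    return minor >= req_minor;
}
```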

I'm targeting people with decent computers, something that can render 3D graphics with post-processing at a playable fps. I really REALLY want to avoid the old pipeline, it just seems dirty. Do any newer AAA games even use the old pipeline these days?

For example, I'm interested to know which versions of OpenGL Valve uses for its games on Mac.

And I'll probably just end up going with 3.2; it seems to be the better choice.


Theoretically you could also program in a relatively modern style with VBOs and shaders even on 2.1, if you accept a few quirks and don't need all the new features.

If you can accept people with weak onboard chips not getting to play then 3.x should be fine.

