
realistic minimum GL version support in a year?



#1 irreversible   Crossbones+   -  Reputation: 1094

Posted 28 January 2012 - 05:59 AM

I'd really like to go with 3.0 (2008) as it simplifies things considerably for me, but it wouldn't kill me to go as low as 1.5 (2003) for core functionality. Doom3, which I use as a low-end benchmark, seems to run on 1.2/1.2.1 (1998). Strangely enough, there's a thin line of DX8 (2000) users and a slightly thicker line of DX9 (2002) users [totalling around 6%] in the Steam Hardware Survey. I'm guessing these are people with essentially unupgraded office-type computers. Are those worth targeting? I mean, if something is designed with GPU transforms in mind, then performance on those computers is going to be abysmal anyway, since it's unreasonable to expect the CPU to outperform the GPU by enough of a margin to make up the difference.

At the same time, 2008 seems a bit too recent - or can I expect some of the core 3.0 features to be managed by the driver if required? The question stems from the fact that I'm using a 2008 laptop with a GF280M to test things. It's a good test machine because, IMO, it's a good approximation of a lower-to-medium-end system. Strangely enough, though, it seems to support 3.3 features here and there, which weren't available until 2010 (which I assume the driver adjusts for).

What are your thoughts?


#2 SimonForsman   Crossbones+   -  Reputation: 5460

Posted 28 January 2012 - 08:19 AM

I'd really like to go with 3.0 (2008) as it simplifies things considerably for me, but it wouldn't kill me to go as low as 1.5 (2003) for core functionality. Doom3, which I use as a low-end benchmark, seems to run on 1.2/1.2.1 (1998). Strangely enough, there's a thin line of DX8 (2000) users and a slightly thicker line of DX9 (2002) users [totalling around 6%] in the Steam Hardware Survey. I'm guessing these are people with essentially unupgraded office-type computers. Are those worth targeting? I mean, if something is designed with GPU transforms in mind, then performance on those computers is going to be abysmal anyway, since it's unreasonable to expect the CPU to outperform the GPU by enough of a margin to make up the difference.

At the same time, 2008 seems a bit too recent - or can I expect some of the core 3.0 features to be managed by the driver if required? The question stems from the fact that I'm using a 2008 laptop with a GF280M to test things. It's a good test machine because, IMO, it's a good approximation of a lower-to-medium-end system. Strangely enough, though, it seems to support 3.3 features here and there, which weren't available until 2010 (which I assume the driver adjusts for).

What are your thoughts?


You should get GPU transforms even on 1.1. The matrix multiplications you trigger when you call glTranslate/glRotate/etc. are indeed done on the CPU, but the resulting matrix is then uploaded to the GPU, which does the per-vertex transformations. (This really is no slower than the shader-based approach, where you also do the matrix multiplication on the CPU and then send the matrices to the GPU as shader uniforms - it's only less flexible.)
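Roughly, the two paths look like this (just a sketch - the helper and uniform location names are made up):

/* Fixed function (GL 1.1): the driver builds the matrix on the CPU,
   but the GPU still does the per-vertex transform. */
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glTranslatef(0.0f, 0.0f, -5.0f);
glRotatef(angleInDegrees, 1.0f, 0.0f, 0.0f);
/* ...draw... */

/* Shader path (GL 2.0+): you build the same matrix yourself on the CPU
   and hand it to the GPU as a uniform; the per-vertex work is the same. */
float modelview[16];
buildModelViewMatrix(modelview);   /* hypothetical helper */
glUseProgram(program);
glUniformMatrix4fv(mvLocation, 1, GL_FALSE, modelview);
/* ...draw... */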

OpenGL prior to 4.x (and to some extent 3.x) was an extension mess: you had pretty much all of the OpenGL 3.1 features back in OpenGL 2.1 through extensions on nvidia hardware (geometry shaders, for example, were available in OpenGL on nvidia GPUs before D3D10 was released). The big problem back then, however, was that if you were using extensions you couldn't just say "Requires OpenGL 1.5 and 64MB VRAM".
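You had to sniff the extension string yourself before relying on anything, something along these lines (sketch - the extension name is real, the rest is made up):

/* needs <string.h> for strstr */
const char *ext = (const char *)glGetString(GL_EXTENSIONS);
int hasGeometryShaders = ext && strstr(ext, "GL_EXT_geometry_shader4") != NULL;
if (!hasGeometryShaders)
{
    /* fall back to a CPU path, or refuse to enable the effect */
}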

Doom3 will not run on just any 1.2 card; it uses extensions like mad and requires several features that didn't make it into core OpenGL until 1.5 or 2.0 (most of the cards that Doom3 ran on got driver updates raising their supported OpenGL version).

As OpenGL has had a tendency to move slowly, the hardware has pretty much always been ahead of the specification, which is why we have extensions. Newer OpenGL versions tend to move the commonly used extensions into the core profile but rarely add anything new; usually it's only a change of extension names from ext_blabla to arb_blabla, which is why nvidia can add support for new OpenGL versions with a simple driver update (all the functionality was already there and usable, just under the "wrong" name in the application->driver interface).
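That's also why loaders typically try the core/ARB entry point first and fall back to the EXT name - same functionality, different string. A Windows-style sketch:

typedef void (APIENTRY *PFNGENFRAMEBUFFERS)(GLsizei, GLuint *);
PFNGENFRAMEBUFFERS genFramebuffers =
    (PFNGENFRAMEBUFFERS)wglGetProcAddress("glGenFramebuffers");        /* core / ARB name */
if (!genFramebuffers)
    genFramebuffers = (PFNGENFRAMEBUFFERS)wglGetProcAddress("glGenFramebuffersEXT");  /* older EXT name */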
I don't suffer from insanity, I'm enjoying every minute of it.
The voices in my head may not be real, but they have some good ideas!

#3 V-man   Members   -  Reputation: 797

Posted 28 January 2012 - 08:40 AM

Any card that does GL 3.0 has a driver that does GL 3.3 (on Windows anyway) so why not use that?
Doom3 can apparently run on GL 1.1: http://liamm.com/tech/voodoo-2-sli-doom-3-kit
All id engines are GL 1.1 plus tons of extensions.
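If you go that route you just ask for 3.3 explicitly when creating the context, something like this (sketch - assumes WGL_ARB_create_context is available, wglCreateContextAttribsARB has already been loaded, and hdc already exists):

int attribs[] = {
    WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
    WGL_CONTEXT_MINOR_VERSION_ARB, 3,
    WGL_CONTEXT_PROFILE_MASK_ARB,  WGL_CONTEXT_CORE_PROFILE_BIT_ARB,
    0
};
HGLRC rc = wglCreateContextAttribsARB(hdc, NULL, attribs);
if (!rc)
{
    /* the driver can't give you 3.3 - fall back or bail out with a clear message */
}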
Sig: http://glhlib.sourceforge.net
an open source GLU replacement library. Much more modern than GLU.
float matrix[16], inverse_matrix[16];
glhLoadIdentityf2(matrix);
glhTranslatef2(matrix, 0.0, 0.0, 5.0);
glhRotateAboutXf2(matrix, angleInRadians);
glhScalef2(matrix, 1.0, 1.0, -1.0);
glhQuickInvertMatrixf2(matrix, inverse_matrix);
glUniformMatrix4fv(uniformLocation1, 1, GL_FALSE, matrix);
glUniformMatrix4fv(uniformLocation2, 1, GL_FALSE, inverse_matrix);

#4 irreversible   Crossbones+   -  Reputation: 1094

Posted 28 January 2012 - 11:37 AM

By transforms I mean server-side processing that entails transform feedback, direct VBO memory mapping and general GPU utilization to speed up processes that can be offloaded from the CPU. I don't mean simple matrix operations like scaling, translation and rotation, which are implicit.
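For instance, the kind of direct memory mapping I have in mind is the GL 3.0 core path (rough sketch - vbo, vertices and sizeInBytes are placeholders):

/* Stream new vertex data straight into a VBO without an intermediate copy. */
glBindBuffer(GL_ARRAY_BUFFER, vbo);
void *dst = glMapBufferRange(GL_ARRAY_BUFFER, 0, sizeInBytes,
                             GL_MAP_WRITE_BIT | GL_MAP_INVALIDATE_BUFFER_BIT);
if (dst)
{
    memcpy(dst, vertices, sizeInBytes);   /* or generate the data in place */
    glUnmapBuffer(GL_ARRAY_BUFFER);
}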

You can do most things at a basic level on the CPU, but at some point you'll have to lay down a guideline that says "you need this good of a computer to run the game". For instance, displaying animated models with 8k polys on hardware that's 7-8 years old is either impossible or so slow it just doesn't make sense. And while the CPU can't be used to lay down overall requirement guidelines (a CPU says nothing about what graphics hardware is present), the opposite is, in my opinion, a valid assumption to make: if the user has a GT400 series or newer card, then he very likely has at least 2-8 cores on his CPU to keep that card well fed.

While the extension mess is indeed a big one, I think the versions at least set down some broader limits like "you can't run this on hardware that's older than 8 years". The aim of this thread is to gauge what developers here think this limit should or might be in about a year, for a small indie game that wants to do away with as many implementation headaches as possible while not giving up potential players due to hardware requirements.

PS - in all honesty, I shouldn't have brought Doom 3 into the mix - it's just an example and unnecessary for this discussion.

#5 mhagain   Crossbones+   -  Reputation: 6323

Posted 28 January 2012 - 01:22 PM

There are two approaches that seem valid to me.

One is to aim for 2.1 and pull in as much from 3.x+ as possible on hardware that supports it.
Two is to aim for 3.3 and pull in as much from 4.x+ as possible on hardware that supports it.

In both cases "provided it doesn't make a mess of the code" should go without saying.

Approach one is a "low-end hardware" option and is for the case where you want to target that remaining DX9 class hardware, Intel graphics, etc. It will also get you on a baseline that's been well shaken-out in the wild and driver quality should be quite robust.

Approach two is a "mid-end but not premium" option: anyone who's upgraded their hardware in the last few years will be covered by this. Driver quality is currently slightly flaky (see: Rage) but things should hopefully settle down well over the next 12 months.
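In practice approach one boils down to runtime capability checks - keep the 2.1 baseline and opportunistically use newer features where they exist. A sketch using GLEW (the useVAO/useInstancing flags are just made up for illustration):

/* using GLEW (GL/glew.h), after the context is created */
if (glewInit() != GLEW_OK) { /* handle failure */ }

/* Baseline is 2.1; anything newer is a bonus, not a requirement. */
int useVAO        = GLEW_VERSION_3_0 || GLEW_ARB_vertex_array_object;
int useInstancing = GLEW_VERSION_3_1 || GLEW_ARB_draw_instanced;

if (useVAO)
{
    GLuint vao;
    glGenVertexArrays(1, &vao);
    glBindVertexArray(vao);
}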

I wouldn't bother with retaining support for the fixed pipeline; it's a hell of a lot of work, it can lead to incredibly messy and tangled code, and it will suck in time better spent elsewhere. In the end the one person who still has a GeForce 4 MX may be kept happy, but everybody else will be suffering.

It appears that the gentleman thought C++ was extremely difficult and he was overjoyed that the machine was absorbing it; he understood that good C++ is difficult but the best C++ is well-nigh unintelligible.


#6 Martins Mozeiko   Crossbones+   -  Reputation: 1413

Posted 28 January 2012 - 02:02 PM

Any card that does GL 3.0 has a driver that does GL 3.3 (on Windows anyway) so why not use that?

I don't believe this is true. I have an Intel HD 3000 and the driver for it provides only GL 3.1 (on Windows 7 x64). Yeah, I know it's Intel... but anyway.
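It's easy enough to check what you actually got at runtime (sketch; on pre-3.0 contexts you'd parse glGetString(GL_VERSION) instead):

GLint major = 0, minor = 0;
glGetIntegerv(GL_MAJOR_VERSION, &major);   /* available from GL 3.0 onwards */
glGetIntegerv(GL_MINOR_VERSION, &minor);
printf("Got GL %d.%d (%s)\n", major, minor, (const char *)glGetString(GL_VERSION));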

#7 V-man   Members   -  Reputation: 797

Posted 28 January 2012 - 03:43 PM


Any card that does GL 3.0 has a driver that does GL 3.3 (on Windows anyway) so why not use that?

I don't believe this is true. I have an Intel HD 3000 and the driver for it provides only GL 3.1 (on Windows 7 x64). Yeah, I know it's Intel... but anyway.


I guess I should have said nvidia and AMD.
Intel's GL version doesn't matter since their drivers aren't good, judging from the many Intel bug threads I have seen over the years. We usually suggest GL 1.1 as the bare minimum, or just going with Direct3D, in these forums.
Judging from the Direct3D forum, there are apparently bugs there as well, but far fewer.

As an indie developer you have to draw the line of support somewhere. If you are making a game for casual gamers, you can't ignore the Intel market.
Also, in the nvidia market, do you want to support GeforceFX and Geforce 6 and 7?
In the ATI/AMD market, do you want to support the Radeon 9700 and Radeon X1300?
Sig: http://glhlib.sourceforge.net
an open source GLU replacement library. Much more modern than GLU.
float matrix[16], inverse_matrix[16];
glhLoadIdentityf2(matrix);
glhTranslatef2(matrix, 0.0, 0.0, 5.0);
glhRotateAboutXf2(matrix, angleInRadians);
glhScalef2(matrix, 1.0, 1.0, -1.0);
glhQuickInvertMatrixf2(matrix, inverse_matrix);
glUniformMatrix4fv(uniformLocation1, 1, GL_FALSE, matrix);
glUniformMatrix4fv(uniformLocation2, 1, GL_FALSE, inverse_matrix);

#8 zedz   Members   -  Reputation: 291

Posted 29 January 2012 - 03:56 PM

Intel's GL version doesn't matter since their drivers aren't good, judging from the many Intel bug threads I have seen over the years.

Perhaps true, but I have a machine with onboard Intel HD Graphics 3000 and oddly it's faster than my previous nvidia 9500, and I haven't noticed any issues. Mind you, I'm not really focused on desktop development nowadays.

#9 ill   Members   -  Reputation: 320

Posted 31 January 2012 - 01:04 AM

I'm basically targeting OpenGL 3.3 and later. OpenGL 3.0 cards support 3.3 with a driver update.

So this basically means you need cards from 2007 or later such as the Geforce 8 series.

I'm doing deferred shading, so anything older than that probably won't run the graphics very well anyway due to the high memory bandwidth requirements.
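The G-buffer setup itself is plain GL 3.x FBO + MRT stuff, roughly like this (sketch - the attachment formats and names are just what I happen to use; add more color attachments the same way):

GLuint fbo, albedoTex, depthTex;
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);

/* color attachment 0: albedo */
glGenTextures(1, &albedoTex);
glBindTexture(GL_TEXTURE_2D, albedoTex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, albedoTex, 0);

/* depth attachment */
glGenTextures(1, &depthTex);
glBindTexture(GL_TEXTURE_2D, depthTex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24, width, height, 0,
             GL_DEPTH_COMPONENT, GL_FLOAT, NULL);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_TEXTURE_2D, depthTex, 0);

GLenum bufs[] = { GL_COLOR_ATTACHMENT0 };   /* list more attachments here for MRT */
glDrawBuffers(1, bufs);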

Also, I think the audience for my game would either be gamers with fairly up-to-date, decent PCs with ATI or Nvidia cards, or console gamers. My friend's laptop with a low-end ATI card from the same era as the Geforce 8 cards runs my engine at about 27 FPS at the moment. His laptop is pretty damn old. Anyone with a laptop older than that is likely not the kind of gamer who would play my game anyway, or can't afford a newer computer, so they probably won't be looking for new games to buy anyway.

You just need to make sure you're not wasting your time supporting older hardware when the benefit isn't all that high. It feels great just using modern high end features. I plan on possibly supporting GL 4.2 as well and possibly DirectX even...

#10 irreversible   Crossbones+   -  Reputation: 1094

Posted 31 January 2012 - 03:13 AM

I'm basically targeting OpenGL 3.3 and later. OpenGL 3.0 cards support 3.3 with a driver update.

So this basically means you need cards from 2007 or later such as the Geforce 8 series.

I'm doing deferred shading, so anything older than that probably won't run the graphics very well anyway due to the high memory bandwidth requirements.

Also, I think the audience for my game would either be gamers with fairly up-to-date, decent PCs with ATI or Nvidia cards, or console gamers. My friend's laptop with a low-end ATI card from the same era as the Geforce 8 cards runs my engine at about 27 FPS at the moment. His laptop is pretty damn old. Anyone with a laptop older than that is likely not the kind of gamer who would play my game anyway, or can't afford a newer computer, so they probably won't be looking for new games to buy anyway.

You just need to make sure you're not wasting your time supporting older hardware when the benefit isn't all that high. It feels great just using modern high end features. I plan on possibly supporting GL 4.2 as well and possibly DirectX even...


I like this reasoning. Merging it with what mhagain wrote, I think I'm going for 2.1 core and 3.3 extended, since the main thing I'm not ready to give up is transform feedback. Although I'd LOVE for Khronos to have added tessellation to v3 - that's the one thing I'll probably go out of my way to add forward compatibility for, just because I want to do it.
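For reference, the transform feedback path I mean is the plain 3.0 core one, roughly (sketch - the varying name, buffer and counts are placeholders from my own code):

/* Before linking: tell GL which vertex shader outputs to capture. */
const char *varyings[] = { "outPosition" };
glTransformFeedbackVaryings(program, 1, varyings, GL_INTERLEAVED_ATTRIBS);
glLinkProgram(program);

/* Capture pass: skip rasterization, just record the transformed vertices. */
glEnable(GL_RASTERIZER_DISCARD);
glBindBufferBase(GL_TRANSFORM_FEEDBACK_BUFFER, 0, feedbackVbo);
glBeginTransformFeedback(GL_POINTS);
glDrawArrays(GL_POINTS, 0, vertexCount);
glEndTransformFeedback();
glDisable(GL_RASTERIZER_DISCARD);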

PS - I'm using a deferred shading approach as well and will add a forward pass for transparency at some point, although I'll be severely limiting the use of transparent geometry (mostly to liquids).

Thanks for your thoughts everyone - this has helped me to define my approach a great deal!

#11 Tachikoma   Members   -  Reputation: 548

Posted 31 January 2012 - 06:07 AM

If you started development recently, by the time you're done the landscape will have shifted somewhat anyway - i.e. more devices will support newer GL versions.
Latest project: Sideways Racing on the iPad

#12 Nairou   Members   -  Reputation: 404

Posted 02 February 2012 - 10:36 AM

What platforms are you targeting? If you plan to support Mac, they only support 3.2 (not 3.3), and only in the latest version (10.7 Lion).



