irreversible

realistic minimum GL version support in a year?


I'd really like to go with 3.0 (2008) as it simplifies things considerably for me, but it wouldn't kill me to go as low as 1.5 (2003) for core functionality. Doom3, which I use as a low-end benchmark, seems to run on 1.2/1.2.1 (1998). Strangely enough, there's a thin line of DX8 (2000) and a slightly thicker line of DX9 (2002) users [totalling around 6%] in the [url="http://store.steampowered.com/hwsurvey"]Steam Hardware Survey[/url]. I'm guessing these are people with essentially unupgraded office-type computers. Are those worth targeting? If something is designed with GPU transforms in mind, performance on those machines is going to be abysmal anyway, since it's unreasonable to expect the CPU to outperform the GPU by a margin large enough to make up the difference.

At the same time, 2008 seems a bit too recent. Or can I expect some of the core 3.0 features to be handled by the driver if required? The question stems from the fact that I'm using a 2008 laptop with a GF280M to test things, which IMO is a good approximation of a low-to-medium-end system. Strangely enough, though, it seems to support 3.3 features here and there, even though 3.3 wasn't released until 2010 (which I assume the driver accounts for).

What are your thoughts?

You should get GPU transforms even on 1.1. The matrix multiplications when you call glTranslate/glRotate/etc. are indeed done on the CPU, but the resulting matrix is then uploaded to the GPU, which does the per-vertex transformations. This really is no slower than the shader-based approach, where you also do the matrix multiplication on the CPU and then send the result to the GPU as a shader uniform; it's only less flexible.
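For reference, a minimal sketch contrasting the two paths; prog, angle and the float[16] matrix mvp are placeholders, not from any particular codebase:

[code]
/* Assumes <GL/gl.h> plus a loader (e.g. GLEW) for the GL 2.0 entry points. */

/* GL 1.1 fixed-function path: the driver tracks the matrix stack,
 * does the multiplications on the CPU and hands the result to the GPU. */
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glTranslatef(0.0f, 0.0f, -5.0f);
glRotatef(angle, 0.0f, 1.0f, 0.0f);

/* GL 2.0+ shader path: you do the same multiplications yourself and
 * upload the result as a uniform - same CPU work, same per-vertex GPU work. */
GLint loc = glGetUniformLocation(prog, "u_mvp");
glUniformMatrix4fv(loc, 1, GL_FALSE, mvp); /* mvp is a column-major float[16] */
[/code]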

OpenGL prior to 4.x (and to some extent 3.x) was an extension mess. You had pretty much all the OpenGL 3.1 features back in OpenGL 2.1 through extensions on nvidia hardware (geometry shaders, for example, were available in OpenGL on nvidia GPUs before D3D10 was released). The big problem back then, however, was that if you were using extensions you couldn't just say "Requires OpenGL 1.5 and 64MB VRAM".
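If you go down that road, each extension has to be probed at runtime. A sketch of the usual whole-token check against the GL 2.1-era extension string (under a 3.x core context you'd iterate glGetStringi(GL_EXTENSIONS, i) instead):

[code]
#include <string.h>

/* Returns 1 if `name` appears as a whole token in the extension string.
 * A bare strstr() can false-positive on names that are prefixes of others. */
static int has_extension(const char *name)
{
    const char *start = (const char *)glGetString(GL_EXTENSIONS);
    const char *p = start;
    size_t len = strlen(name);

    while (p && (p = strstr(p, name)) != NULL) {
        int at_start = (p == start) || (p[-1] == ' ');
        int at_end   = (p[len] == ' ') || (p[len] == '\0');
        if (at_start && at_end)
            return 1;
        p += len;
    }
    return 0;
}

/* Usage: if (has_extension("GL_EXT_framebuffer_object")) { ... } */
[/code]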

Doom3 will not run on just any 1.2 card; it uses extensions like mad and requires several features that didn't make it into core OpenGL until 1.5 or 2.0. (Most of the cards that Doom3 ran on got driver updates raising their supported OpenGL version.)

As OpenGL has had a tendency to move slowly, the hardware has pretty much always been ahead of the specification, which is why we have extensions. Newer OpenGL versions tend to move the commonly used extensions into the core profile but rarely add anything new; usually it's only a change of extension names from EXT_blabla to ARB_blabla, which is why nvidia can add support for new OpenGL versions with a simple driver update (all the functionality was already there and usable, just under the "wrong" name in the application->driver interface).
Any card that does GL 3.0 has a driver that does GL 3.3 (on Windows, anyway), so why not use that?
Doom3 can apparently run on GL 1.1: http://liamm.com/tech/voodoo-2-sli-doom-3-kit
All id engines are GL 1.1 plus a ton of extensions.
By transforms I mean server-side processing that entails transform feedback, direct VBO memory mapping and general GPU utilization to speed up work that can be offloaded from the CPU. I don't mean simple matrix operations like scaling, translation and rotation, which are implicit.
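For example, with GL 3.0 buffer mapping you can write vertex data straight into driver memory without an extra staging copy. A rough sketch, where vbo, vertices and size are placeholders:

[code]
#include <string.h>

/* Orphan the old storage, then map and fill the new one.
 * Requires GL 3.0 or ARB_map_buffer_range (via your extension loader). */
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, size, NULL, GL_STREAM_DRAW);

void *dst = glMapBufferRange(GL_ARRAY_BUFFER, 0, size,
                             GL_MAP_WRITE_BIT | GL_MAP_INVALIDATE_BUFFER_BIT);
if (dst) {
    memcpy(dst, vertices, size);
    glUnmapBuffer(GL_ARRAY_BUFFER);
}
[/code]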

You can do most things at a basic level on the CPU, but at some point you have to lay down a guideline that says "you need this good of a computer to run the game". For instance, displaying animated models with 8k polys on hardware that's 7-8 years old is either impossible or so slow it just doesn't make sense. The CPU can't be used to lay down overall requirement guidelines, since a CPU says nothing about what graphics hardware is present; the opposite is, in my opinion, a valid assumption: if the user has a GT400-series or newer card, then he very likely has at least 2-8 CPU cores to keep that GPU well fed.

While the mess around extensions is indeed a big one, I think the versions at least set down some broader limitations, like "you can't run this on hardware that's older than 8 years". The aim of this thread is to gauge what developers here think that limit should or might be in about a year, for a small indie game that wants to do away with as many implementation headaches as possible without giving up potential players to hardware requirements.

PS - in all honesty I shouldn't have brought Doom 3 into the mix - it's just an example that is unnecessary for this discussion.
There are two approaches that seem valid to me.

One is to aim for 2.1 and pull in as much from 3.x+ as possible on hardware that supports it.
Two is to aim for 3.3 and pull in as much from 4.x+ as possible on hardware that supports it.

In both cases "provided it doesn't make a mess of the code" should go without saying.

Approach one is a "low-end hardware" option and is for the case where you want to target that remaining DX9-class hardware, Intel graphics, etc. It will also get you onto a baseline that's been well shaken out in the wild, and driver quality should be quite robust.

Approach two is a "mid-range but not premium" option: anyone who's upgraded their hardware in the last few years will be covered by it. Driver quality is currently slightly flaky (see: Rage), but things should hopefully settle down over the next 12 months.
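Either way the pattern is the same: create the baseline context, then detect what's actually available and switch the extras on. A sketch of the version check (the caps struct and the 3.3 instancing cutoff are just illustrative choices):

[code]
#include <stdio.h>

struct gl_caps { int major, minor; int has_instanced_divisors; };

static void detect_caps(struct gl_caps *caps)
{
    /* GL_VERSION always starts with "major.minor"; the numeric
     * GL_MAJOR_VERSION/GL_MINOR_VERSION queries only exist from 3.0 up,
     * so string parsing is the safe route on a 2.1 baseline. */
    const char *ver = (const char *)glGetString(GL_VERSION);
    caps->major = caps->minor = 0;
    if (ver)
        sscanf(ver, "%d.%d", &caps->major, &caps->minor);

    /* glVertexAttribDivisor went core in 3.3 */
    caps->has_instanced_divisors =
        caps->major > 3 || (caps->major == 3 && caps->minor >= 3);
}
[/code]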

I wouldn't bother retaining support for the fixed pipeline: it's a hell of a lot of work, it can lead to incredibly messy and tangled code, it will suck in time better spent elsewhere, and in the end the one person who still has a GeForce 4 MX may be kept happy while everybody else suffers.
[quote name='V-man' timestamp='1327761617' post='4907019']
Any card that does GL 3.0 has a driver that does GL 3.3 (on Windows anyway) so why not use that?[/quote]
I don't believe this is true. I have an Intel HD 3000 and the driver for it provides only GL 3.1 (on Windows 7 x64). Yeah, I know it's Intel... but anyway.
[quote name='Martins Mozeiko' timestamp='1327780945' post='4907119']
I don't believe this is true. I have an Intel HD 3000 and the driver for it provides only GL 3.1 (on Windows 7 x64).
[/quote]

I guess I should have said nvidia and AMD.
Intel's GL version doesn't matter, since their drivers aren't good, judging from the many Intel bug threads I have seen over the years. On these forums we usually suggest GL 1.1 as the bare minimum, or just going with Direct3D.
Judging from the Direct3D forum there are apparently bugs there as well, but far fewer.

As an indie developer you have to draw the line of support somewhere. If you are making a game for casual gamers, you can't ignore the Intel market.
Also, in the nvidia market, do you want to support the GeforceFX and the Geforce 6 and 7?
In the ATI/AMD market, do you want to support the Radeon 9700 and the Radeon X1300?
[quote]Intel's GL version doesn't matter since their drivers aren't good judging from the many intel bug threads I have seen over the years.[/quote]
Perhaps true, but I have a machine with onboard Intel HD Graphics 3000 and, oddly, it's faster than my previous nvidia 9500, and I haven't noticed any issues. Mind you, I'm not really focused on desktop development nowadays.
I'm basically targeting OpenGL 3.3 and later. OpenGL 3.0 cards support 3.3 with a driver update.

So this basically means you need cards from 2007 or later such as the Geforce 8 series.

I'm doing deferred shading, so anything older than that probably won't run the graphics very well anyway due to the high memory bandwidth requirements.
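For a sense of where that bandwidth goes: every pixel is written to several render targets before any lighting happens. A minimal GL 3.0-style G-buffer setup, with illustrative formats and a 1280x720 size:

[code]
GLuint fbo, tex[3], depth;
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);

/* Three color attachments: albedo, normals, position. */
static const GLenum fmts[3] = { GL_RGBA8, GL_RGBA16F, GL_RGBA16F };
glGenTextures(3, tex);
for (int i = 0; i < 3; ++i) {
    glBindTexture(GL_TEXTURE_2D, tex[i]);
    glTexImage2D(GL_TEXTURE_2D, 0, fmts[i], 1280, 720, 0,
                 GL_RGBA, GL_FLOAT, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0 + i,
                           GL_TEXTURE_2D, tex[i], 0);
}

GLuint bufsDepth; /* depth goes in a renderbuffer */
glGenRenderbuffers(1, &depth);
glBindRenderbuffer(GL_RENDERBUFFER, depth);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, 1280, 720);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                          GL_RENDERBUFFER, depth);

static const GLenum bufs[3] = {
    GL_COLOR_ATTACHMENT0, GL_COLOR_ATTACHMENT1, GL_COLOR_ATTACHMENT2
};
glDrawBuffers(3, bufs);
[/code]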

Also, I think the audience for my game will be either gamers with fairly up-to-date, decent PCs with ATI or Nvidia cards, or console gamers. My friend's laptop, with a low-end ATI card from the same era as the Geforce 8 cards, currently runs my engine at about 27 FPS, and his laptop is pretty damn old. Anyone with a laptop older than that is likely not the kind of gamer who would play my game, or can't afford a newer computer and probably isn't looking for new games to buy anyway.

You just need to make sure you're not wasting your time supporting older hardware when the benefit isn't all that high. It feels great just using modern high-end features. I plan on possibly supporting GL 4.2 as well, and possibly even DirectX...

I like this reasoning. Merging it with what mhagain wrote, I think I'm going for 2.1 core and 3.3 extended, as the main thing I'm not ready to give up is transform feedback. Although I'd LOVE for Khronos to have added tessellation to v3 - that's one thing I'm probably going out of my way to add forward compatibility for, just because I want to do it.
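For anyone curious, the transform feedback setup I have in mind is roughly this (GL 3.0; prog, capture_vbo, the varying name "out_pos" and vertex_count are placeholders):

[code]
/* Capture vertex shader outputs into a buffer instead of rasterizing.
 * The varyings must be declared before the program is linked. */
const GLchar *varyings[] = { "out_pos" };
glTransformFeedbackVaryings(prog, 1, varyings, GL_INTERLEAVED_ATTRIBS);
glLinkProgram(prog);

glUseProgram(prog);
glBindBufferBase(GL_TRANSFORM_FEEDBACK_BUFFER, 0, capture_vbo);

glEnable(GL_RASTERIZER_DISCARD);      /* skip fragment processing entirely */
glBeginTransformFeedback(GL_POINTS);
glDrawArrays(GL_POINTS, 0, vertex_count);
glEndTransformFeedback();
glDisable(GL_RASTERIZER_DISCARD);
[/code]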

PS - I'm using a deferred shading approach as well and will add a forward pass for transparency at some point, although I'll be severely limiting the use of transparent geometry (mostly to liquids).

Thanks for your thoughts, everyone - this has helped me define my approach a great deal!
If you started development recently, by the time you're done the landscape will have shifted somewhat anyway - i.e. more devices will support newer GL versions.
What platforms are you targeting? If you plan to support Mac, OS X only goes up to 3.2 (not 3.3), and only in the latest version (10.7 Lion).
