
How do game developers know the system requirements for their games?


3 replies to this topic

#1 warnexus   Prime Members   -  Reputation: 1450


Posted 15 February 2014 - 05:48 PM

What kind of tool do they use to make this decision? Are the system requirements just an estimate? Suppose the end user's machine is slightly below the system requirements; would that still be enough to run the game?

 

Do they use a benchmark tool (not sure if these actually exist online or need to be custom made)?

Example system requirements for a game made using XNA:

 

  • OS: Windows (XP/Vista/7), MacOS X, or Linux
  • Processor: 2 GHz+
  • Memory: 512 MB
  • Graphics: Pixel Shader Model 2.0+ 

I have only made games in Java, so I know they will run on any platform. But I would not know how fast the processor needs to be or how much RAM the end user would need to run a game made in Java. I would not even know how capable the graphics card needs to be.




#2 Hodgman   Moderators   -  Reputation: 30926


Posted 15 February 2014 - 06:53 PM

The graphics model requirements (e.g. SM2.0+) depend on the API/features that you use.

e.g. if you're using the D3D9 API but not using any SM3 features, or you're using the D3D11 API with "feature level 9", then your game will run on SM2+ video cards.

Similarly when you use OpenGL: different versions of OpenGL expose different features. If you only use functions from OpenGL 2.1, then you'd say your game requires an OpenGL 2.1+ compliant GPU.
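To enforce that requirement, a game typically compares the driver-reported version against its minimum at startup. A minimal sketch of the version-gating idea in Java; the version strings here are illustrative, and a real game would obtain them from its GL binding (e.g. glGetString(GL_VERSION) via LWJGL or JOGL):

```java
public class GlVersionCheck {
    // Parse the "major.minor" prefix of a GL_VERSION string,
    // e.g. "3.3.0 NVIDIA 331.38". Returns major * 100 + minor
    // so versions compare as plain integers.
    static int parseVersion(String glVersion) {
        String[] parts = glVersion.split("[ .]");
        return Integer.parseInt(parts[0]) * 100 + Integer.parseInt(parts[1]);
    }

    // True if the reported version satisfies the game's minimum requirement.
    static boolean meetsRequirement(String reported, String required) {
        return parseVersion(reported) >= parseVersion(required);
    }

    public static void main(String[] args) {
        System.out.println(meetsRequirement("3.3.0 NVIDIA 331.38", "2.1")); // true
        System.out.println(meetsRequirement("1.4 Mesa", "2.1"));            // false
    }
}
```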

 

For RAM requirements, you'd take measurements. Go through the worst-case part of your game and see how much RAM it consumes. Most game engines have built-in tools to help with these measurements, because it's extremely important for console games. e.g. PS3 only has 256MiB of main RAM -- so actually running out of RAM (and crashing) is a real possibility.

Usually during development you come up with some "budgets" for RAM usage -- e.g. the AI system gets xxMB, the animation systems get yyMB, the graphics systems get zzMB (and xx + yy + zz < 256), etc...

 

IIRC, when you launch a Java program there's a command-line parameter (-Xmx) that sets the maximum heap size the program can use. You could experimentally lower this limit until you find the point where your game crashes.
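A small sketch of measuring heap usage from inside a Java game using only the standard library; where you call it (ideally right after the worst-case scene) is up to you:

```java
public class HeapReport {
    // Currently used JVM heap, in bytes. Call this after the worst-case
    // part of the game to observe your real RAM high-water mark.
    static long usedHeapBytes() {
        Runtime rt = Runtime.getRuntime();
        return rt.totalMemory() - rt.freeMemory();
    }

    public static void main(String[] args) {
        // Launched as e.g. `java -Xmx512m HeapReport`, maxMemory() reports
        // roughly the 512 MB ceiling; exceeding it throws OutOfMemoryError.
        System.out.printf("used: %d MiB, max: %d MiB%n",
                usedHeapBytes() / (1024 * 1024),
                Runtime.getRuntime().maxMemory() / (1024 * 1024));
    }
}
```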

 

For CPU requirements, you've just got to test. Usually you'll develop on your 'recommended' hardware, and keep an eye on frame-rates (or: milliseconds per frame) at all times. Again, you'd usually create "budgets" for CPU usage -- e.g. AI gets xx ms per frame, animation gets yy ms per frame, graphics gets zz ms per frame, etc, etc... (where xx + yy + zz < 16.6ms for a 60Hz game, or < 33.3ms for a 30Hz game).

Again, most game engines will have built-in tools to help you measure the number of milliseconds per system per frame.
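Even without an engine, a per-system frame timer is easy to sketch in Java. The budget numbers and subsystem names below are illustrative, not prescriptive:

```java
public class FrameBudget {
    // Illustrative per-system budgets for a 60 Hz game, in milliseconds.
    static final double AI_MS = 4.0, ANIM_MS = 4.0, GFX_MS = 8.0;
    static final double FRAME_MS = 1000.0 / 60.0; // ~16.6 ms total per frame

    // Time one system's update and return elapsed wall-clock milliseconds.
    static double timeMs(Runnable system) {
        long start = System.nanoTime();
        system.run();
        return (System.nanoTime() - start) / 1.0e6;
    }

    public static void main(String[] args) {
        // Stand-ins for real subsystem updates (ai.update(), etc.).
        double ai = timeMs(() -> {});
        double anim = timeMs(() -> {});
        double gfx = timeMs(() -> {});
        System.out.printf("frame: %.3f ms of %.1f ms budget%n",
                ai + anim + gfx, FRAME_MS);
    }
}
```

Comparing each measured time against its budget every frame makes overruns visible the moment a change introduces them.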

To discover the minimum requirements, the only real option is empirical testing. Get a bunch of test machines and make observations. If you observe it running at your minimum acceptable framerate in the worst-case part of the game, then that hardware is ok. Repeat until you find hardware that isn't ok...


Edited by Hodgman, 16 February 2014 - 03:28 AM.


#3 frob   Moderators   -  Reputation: 22218


Posted 16 February 2014 - 08:40 AM

Quoting Hodgman:
> (where xx + yy + zz < 16.6ms for a 60Hz game, or < 33.3ms for a 30Hz game).
> Again, most game engines will have built-in tools to help you measure the number of milliseconds per system per frame.
> To discover the minimum requirements, the only real option is empirical testing. Get a bunch of test machines and make observations. If you observe it running at your minimum acceptable framerate in the worst-case part of the game, then that hardware is ok. Repeat until you find hardware that isn't ok...

Everything above is quite true; just note that the common 60Hz and 30Hz figures come from game console backgrounds and TV screen refresh rates.

Although single-link DVI tops out at 60Hz at its highest resolutions, 1280x1024 @ 85Hz is a very common spec in competitive gaming.

There are quite a few twitch gameplay styles, especially FPS horror games and competitive arenas, where players expect that a high-end computer can hit 120Hz or even faster. There are many tournaments where people spend fortunes on the newest graphics cards with dual-link DVI monitors just so they can edge out their opponent on twitch response.

In this type of situation a budget of just 8ms or 5ms can seem generously high.



#4 warnexus   Prime Members   -  Reputation: 1450


Posted 17 February 2014 - 09:17 PM

Thanks for the responses! This is really good info!





