What gets a game to pass certification by a publisher?

It seems weird, or more accurately baffling, that games with bugs, low frame rates, and poor gameplay pass certification and are then distributed to consumers. Am I missing something about certification by the publisher?

Certification is a process controlled by the platform holders (e.g. Nintendo, Microsoft, Sony), not the publishers. It mostly ensures that your game isn't going to crash, and that the user experience is consistent with all other games on that platform (time spent in loading screens, names/icons of buttons, sign-in screens, animated "game is saving" icons, online profanity filtering, etc.). Actual bad-gameplay problems are only a minor concern here.
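For a flavor of what those consistency rules translate to in code, here is a minimal sketch of the classic "animate something while loading" requirement. The class, the threshold, and the names are all invented for illustration; the real rules and numbers are platform-specific and covered by NDA.

```cpp
#include <chrono>

// Hypothetical sketch of the kind of rule the checklists encode:
// roughly "if the player waits longer than N seconds, something on
// screen must visibly animate so the game never looks hung."
class LoadingScreen {
public:
    void Update() {
        using namespace std::chrono;
        const auto waited = steady_clock::now() - loadStart_;
        // Past the (made-up) two-second threshold, a static screen
        // would be a cert bug: start driving the spinner every frame.
        if (waited > seconds(2))
            spinnerVisible_ = true;
        if (spinnerVisible_)
            spinnerAngle_ += 0.1f;  // advanced once per frame while loading
    }
private:
    std::chrono::steady_clock::time_point loadStart_ =
        std::chrono::steady_clock::now();
    bool  spinnerVisible_ = false;
    float spinnerAngle_   = 0.0f;
};
```

The point is not the spinner itself but that the checklist turns a vague "don't look frozen" goal into a testable rule a lab can time with a stopwatch.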

Also to be quite frank, the console makers are often willing to look the other way when a big company asks for some ... flexibility in hitting the requirements.


Reiterating: They don't care about gameplay.

Certification is not about how fun the game is. It can be crazy fun, or it can be the stupidest game ever.

Certification is about how you handle a specific list of bad situations. A Google search for the words "TRC TCR Lotcheck" gives some descriptive results. Does the game crash? (Hint: it had better not crash.) Does the game follow all the system's requirements? (Hint: a PlayStation controller button showing up in an Xbox certification submission is a really bad thing.) Is it using debug libraries, or old libraries that have been replaced due to security concerns? Can the testers continuously play the game for 100 hours, swapping out testers on the machines over the course of a few days? Can they leave the game sitting at a start screen or in some other steady state for a few days without it crashing? Does the game handle behavior like disconnected controllers, ejected discs, unplugged network cables, and bad internet connections? Do all the features basically work in a manner that the testers can figure out?

If all of those pass, it ships.
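As a hedged illustration of the disconnected-controller item above: the expectation is roughly that pulling the pad pauses the game and shows a prompt, rather than leaving the player character running into a wall. IsControllerConnected() below is a stand-in for whatever query the real platform SDK provides; the stub and all the names are invented for this sketch.

```cpp
#include <cstdio>

// Stand-in for the platform SDK's pad query; stubbed so the sketch
// compiles. A real title would call the console's input API here.
bool IsControllerConnected(int /*padIndex*/) { return true; }

enum class GameState { Playing, WaitingForController };

// Called once per frame: pause with a prompt when the active pad
// vanishes, resume automatically when it comes back.
void TickControllerCheck(GameState& state, int activePad) {
    if (state == GameState::Playing && !IsControllerConnected(activePad)) {
        state = GameState::WaitingForController;
        std::puts("Please reconnect the controller.");
    } else if (state == GameState::WaitingForController &&
               IsControllerConnected(activePad)) {
        state = GameState::Playing;
    }
}
```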

Often the certification teams will find something to complain about. Sometimes they are big complaints. E.g. "We found a crash..." Other times they are minor complaints that you can easily fight back against. Them: "The saving screen was up for about 5 seconds". Us: "On our retail kits it takes about 1.2 seconds, what did you use to time it?" Them: "Oh, looks like you are right. Sorry about that."

I've worked on a few games that passed cleanly on the first pass through all of the big three, so I know it happens occasionally. When we had our first 1-submit Nintendo title, the studio execs took the whole team out to lunch and gave everyone the rest of the week off. Having good internal QA makes life so much better for everyone. :-)

Also to be quite frank, the console makers are often willing to look the other way when a big company asks for some ... flexibility in hitting the requirements.

Flexibility as in being lenient with the rules? Is that usually so the game can hit its deadline? I'm guessing that since the publisher is publishing the game, they obviously want it shipped so they get a return on their investment?

Could you give me one example of a worst case that happened to a game that still got shipped? If specific information cannot be disclosed, I understand.

Flexibility as in being lenient with the rules? Is that usually so the game can hit its deadline? I'm guessing that since the publisher is publishing the game, they obviously want it shipped so they get a return on their investment?


I believe that flexibility is earned in a similar way to how the Mafia earns flexibility from the police.

Flexibility as in being lenient with the rules? Is that usually so the game can hit its deadline? I'm guessing that since the publisher is publishing the game, they obviously want it shipped so they get a return on their investment?

Could you give me one example of a worst case that happened to a game that still got shipped? If specific information cannot be disclosed, I understand.

These are generally called "waivers" of the requirements. The company checklists (the TRC, TCR, and Lotcheck requirements) specify that you must do certain things in response to certain behavior, or must not do certain things in response to certain events.

For a specific example, let's say the requirement is that the game must not have degraded network play under specific artificial lab conditions. In two cases I have seen, the lab came back saying that when they artificially simulated long-term 40% packet loss (the first case) or frequent multi-second latency (the second case), the game did not perform adequately or had some problems.

If you are a little studio with no clout, you will have a difficult time fighting back. You adjust your network code to handle the obscene case of bored testers exercising weird conditions.
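For what "adjust your network code" can look like in practice, here is a rough sketch of reproducing the lab's first scenario locally so a fix can be soak-tested before resubmission. The Packet alias, LossySend(), and the idea of wrapping the engine's real transport in a shim are assumptions for illustration, not any platform's actual tooling.

```cpp
#include <cstdint>
#include <functional>
#include <random>
#include <vector>

using Packet = std::vector<std::uint8_t>;
using SendFn = std::function<void(const Packet&)>;

// Wraps the engine's real send path (passed in as `send`) in a lossy
// shim: each packet is independently dropped with probability
// `dropRate`, mimicking the lab's sustained-packet-loss rig.
void LossySend(const Packet& packet, const SendFn& send,
               double dropRate, std::mt19937& rng) {
    std::uniform_real_distribution<double> roll(0.0, 1.0);
    if (roll(rng) < dropRate)
        return;        // packet silently dropped, as the lab's rig would
    send(packet);      // survivors are forwarded unchanged
}
```

Running the game's netcode through this with dropRate = 0.40 approximates the first lab case; delaying `send` by a random interval instead of dropping would approximate the multi-second-latency case.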

If you are with a major publisher, they can fight back with this kind of implied message. It doesn't go exactly like this, but it might be interpreted this way: "We know it has that minor issue in that very rare case, and we know our audience does not have connections that bad. From one multi-billion-dollar corporation to another, we're asking for a waiver. If customers complain, you can pull out this email saying we knew it was a bug and you told us to fix it; we'll take the blame."

Most of the time the requirements are reasonable. As professional game developers we want to create amazing games that work well for everybody. When certification comes back with concerns, teams don't like the defects but are generally willing to fix them. It is only the really hard ones, the ones requiring massive change after the game is essentially complete, that studios want to fight back on.

Some things are easier to get waivers on than others. An unanimated loading screen that approaches the limits on one specific level would be pretty easy to push back against. Submitting with an old library one or two days past the replacement cutoff date might be a little harder. It is common to challenge a cert requirement by reviewing the change, working out its risks and costs, explaining to them that fixing the issue would create a different issue, and asking which of the two they would prefer... but be prepared for them to require the change and accept the different issue. Very rarely it reaches the point of studio leadership making passionate pleas that the fix would require massive rewrites, causing the dates to slip or possibly requiring cancellation of the project. The more costly and risky the fix is for the developer, the more likely they are to grant a waiver. But the more visible and potentially damaging the error, the less likely a waiver will be allowed. People talk about it, negotiate, establish a paper trail, and make decisions.

Even a minor change risks destabilizing the game, so changes after final submission are heavily reviewed and require significant QA work, both halo testing around the change and yet another pass through the entire storyline. One typical cert requirement is that someone has played the game from beginning to end without cheats. When you make that last-minute change, you ask the magical testers who can race through everything in 11 hours to do their magic, and you pay them overtime, meals, and a gift. The QA effort itself can even be useful when pushing back and asking for waivers.

In the network case you mentioned, Call of Duty should fail. Yet it is published.

Can you imagine Microsoft refusing to certify COD? :) Don't get me wrong; as far as I am concerned, they should.

As far as I am concerned, certification is a good thing. It can be annoying, but no more annoying than bug reports you get from internal QA.

I had one many years ago...

BUG : Game crashes

Actions : Press these 5 keys with your left hand, these 5 keys with your right hand, and press the spacebar with your nose

Repeatability : 100%

The bug fix was "Don't fecking do it"

There are times when certification has to be "massaged". In my experience it is always when the failing test case was badly designed.

For example, we had massive problems getting a JVM certified by Sun. The problem was a test case that exercised the garbage collector and had to run for 10 hours. The Java garbage collector is crap; it has a known bug that means it will eventually fail.

This was fine for a normal JVM, but ours ran the test 147 times faster than the original Sun JVM, so those 10 hours exercised the collector as much as 1470 hours, about two months, on the reference implementation would have. After between 9 hours 47 minutes and 9 hours 49 minutes, our JVM crashed.

We eventually managed to get Sun to accept that it was the test case that was at fault and we got our certification.


The bug fix was "Don't fecking do it"

Oh boy...

Still, it is important to fix the bugs; it matters a lot to the customers. I can understand that the game needs to meet its deadline and that fixing bugs can take time away from other things that still need work, but they should still be fixed.

Still, it is important to fix the bugs; it matters a lot to the customers. I can understand that the game needs to meet its deadline and that fixing bugs can take time away from other things that still need work, but they should still be fixed.


I know the feeling. I've worked on some input bugs that were most easily reproduced by slamming a bunch of buttons at a certain time. It's easy to say users deserve to break the game if that's what they're trying to do, but those have been tip-of-the-iceberg bugs that revealed issues such as race conditions in the underlying input system. It's easy to complain about QA doing stupid things, but they're only bugs because programmers did stupid things.
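To make that concrete, here is a sketch of the kind of fix such a bug leads to, under the assumption that the platform layer pushes button events from one thread while the game loop pops them on another. With an unguarded queue, mashing everything at once is exactly what exposes the race; the event type and names below are illustrative, not from any particular engine.

```cpp
#include <mutex>
#include <queue>

struct ButtonEvent { int button; bool pressed; };

// Both ends of the queue take the same lock, so a burst of events from
// the input thread can no longer corrupt the queue mid-pop.
class InputQueue {
public:
    void Push(const ButtonEvent& e) {
        std::lock_guard<std::mutex> lock(mutex_);
        events_.push(e);
    }
    bool Pop(ButtonEvent& out) {
        std::lock_guard<std::mutex> lock(mutex_);
        if (events_.empty()) return false;
        out = events_.front();
        events_.pop();
        return true;
    }
private:
    std::mutex mutex_;
    std::queue<ButtonEvent> events_;
};
```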

I know too many engineers who get something to about 95% working when they're supposed to be finishing tasks, and then get tons of praise from management, first for "completing" the work and then for fixing piles of bugs later. Of course, not all of the bugs get fixed, and we ship games with these sorts of crashes while they laugh at stupid QA.
