How much time companies spend debugging
Hi,
I just bought John Robbins's book, "Debugging Applications for .NET and Windows". While reading the introduction, I noticed he said "Without realizing it, most teams spend an average of 50 percent of their development cycle debugging."
I was wondering if this is true in the game industry... In other kinds of software, I can imagine that people would be anxious to get rid of bugs, for example in code that controls weapons or medical technology. For games, though, I would imagine it's at most 20 percent, though I wouldn't know since I've only done a total of 3 months in the industry, and as an intern at that.
Anyone have any ideas on this?
Thanks,
roos
I'm going to go out on a limb here, and suggest that "most software"--and in particular, most .NET and Windows software--is not for controlling weapons or medical technology. So Robbins probably is talking about run-of-the-mill software requiring 50 percent debugging time.
Why would you think that games have a lower amount of debug time than other applications? There's surprisingly little difference, when you come down to it, between game programming and other areas.
Yeah, that's a good point, thanks! Actually, I'm glad that you think the situation IS the same in the game industry, because then this book will really be useful :)
Why did I think it might be less strict in the game industry? Well, I should clarify that I'm thinking of the PC game market in particular, not consoles or PDAs, etc. (QA is much stricter there).
First, just by observation, it seems like a lot of companies put out games which have bugs and then patch them later (perhaps because people are anxious to get games out by Xmas). Though I guess I can't blame them because it's impossible to test all possible hardware configurations until you actually release a game.
The other reason is something I read in an article about debugging, where the author was explaining his theory about why games are, or may be, more buggy. Unfortunately I can't recall the specifics, but basically his point was that game programmers have different priorities than other programmers. You never see someone writing a word processor and worrying about whether it's going to be "fun". They'd be more concerned about usability and features.
Finally, 50% is just so much that it's hard to believe. I don't recall spending that much time debugging, or my coworkers spending that much time... Maybe I'm just perceiving it that way.
roos
The numbers seem small because, in general, development teams don't set aside "debugging time" until the very end of the project, after the game has reached code-complete status. That's the phase when no new features are added and the only code changes are bug fixes. In truth, however, a lot of debugging goes on during development; I would argue that this kind of debugging accounts for 60 percent or more of all debugging that occurs.
If any project spends only 20% of their time debugging, the only safe assumption is that major bugs still exist. :)
roos, the amount of time a given project spends debugging depends not only on the obvious factors, like how many different people work on the code and how big the codebase is, but also on business factors like how the company markets its product and what its model is for handling customers.
Take consulting firms, for instance. They have a software product they sell, so there is a development group. However, the customers need the software maintained, and possibly extended in special ways that vary by customer. The consulting firm will spend more money on the maintenance/debugging team for the clients that matter most to the company, and this also forms a feedback loop between the product development folks and the customer. It works out a bit better for everybody. Also, this way the consultants can charge the customers directly for the time the debuggers spend fixing the code!
If you don't believe that, consider that I work on a piece of financial software that is roughly 7 million lines of code. They charge the client some number of millions to put it in place, then bill my time to the client to spend 8 hours a day finding bugs and fixing the base product. In this situation, the group of people who deal exclusively with debugging and extending the software is much, much larger than the product development team. By that measure, roughly 50% of all development time is debugging time.
How does this apply to the game industry? To me it's mostly a matter of scale. Debugging isn't just about finding bugs; it's also about design decisions that the people who originally wrote the code didn't know needed to be in the framework. Debugging and maintenance also bring the software cycle full circle so that the product can continue to evolve. The same is true of forming marketing and other business relationships with the people who will be selling your software, as well as with the end users.
FWIW, I would also say that the vast majority of medical software does in fact run on Microsoft-based systems. I learned this when I worked for a medical printer company called Codonics (I believe they are among the top five makers of medical printers). Their product was based on Red Hat Linux, but my manager there explained that a lot of software can't be adopted by many hospitals, because hospitals require a certain level of support and "dependability", and somehow it has come to pass that Microsoft products are considered more dependable for medical software and imaging. Another important factor to consider: when something goes wrong and people die, they want someone to blame, not a group of open-source programmers who have expressly said they are not accountable.
Quote:Original post by roos
First, just by observation, it seems like a lot of companies put out games which have bugs and then patch them later (perhaps because people are anxious to get games out by Xmas). Though I guess I can't blame them because it's impossible to test all possible hardware configurations until you actually release a game.
You're right to identify hardware as a major factor in shipped bugs in games. Another problem is that games tend to have much more stuff going on "under the hood" (memory management, streaming content, multithreading, etc.) than, say, your average spreadsheet program. So there are a lot of corner cases that are difficult to identify.
Quote:The other reason is something I read in an article about debugging, actually the author was explaining his theory about why games are or may be more buggy. Unfortunately I can't recall the specifics, but basically his point was that game programmers have different priorities than other programmers. You never see someone writing a word processor and worrying if it's going to be "fun". They'd be more concerned about usability and features.
I'd disagree with that. A game programmer will spend a little bit of time making sure that a game is fun, but that's mostly the designer's task. In any significant game project, there will be several programmers, and for the sake of coordination it's important for the individual programmer to concentrate on implementing features as designed. I can certainly tell you that any game programmer worth his salt, given a choice between shipping two buggy features and one bugless feature, will choose the latter. "fun" == "no bugs".
Quote:Finally, 50% is just so much that it's hard to believe. I don't recall spending that much time debugging, or my coworkers spending that much time... Maybe I'm just perceiving it that way.
It all depends on how you delineate the two tasks; and any delineation will be little more than semantics, IMHO. Programming and debugging are two sides of the same coin, and they're often happening at the same time.
Debugging is a pretty broad term to use since I think there are many activities which fall under the debugging umbrella:
1. Syntax errors (easier to fix)
2. Application Design errors
3. Code Design errors
4. System Errors (hardware, OSes & technologies)
5. Unknown quirky errors
As a general rule of thumb, you can reduce the amount of time you spend debugging by squashing bugs as soon as they appear and NOT proceeding with further project development until the bug has been resolved. The justification is that a bug becomes exponentially more expensive to resolve the longer it resides in the system (expensive in time, money, energy, and motivation).
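To make that "exponentially more expensive" claim concrete, here is a toy Python sketch. The doubling factor and the phase names are illustrative assumptions, not empirical data; the point is only the shape of the curve.

```python
# Toy cost-of-delay model: a bug fixed the moment it appears costs 1
# unit of effort; each development phase it survives, the fix cost is
# assumed to double. The growth factor is a hypothetical illustration.
def fix_cost(phases_survived, base_cost=1.0, growth=2.0):
    """Effort to fix a bug after it has survived the given number of phases."""
    return base_cost * growth ** phases_survived

# Print the cost of fixing the same bug at successively later phases.
for phase, name in enumerate(["coding", "unit test", "QA", "post-release"]):
    print(f"fixed during {name}: cost {fix_cost(phase):.0f}")
```

A bug caught while coding costs 1 unit here, but the same bug caught post-release costs 8; with a larger growth factor the gap widens much faster.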
I would say that there is a world of difference between the way a game is developed and tested compared to how an application is developed and tested.
Notice that I said 'developed and tested', and not 'debugged'.
Here, we have a team of QA/Testers that will begin to define test cases and test scripts as soon as the requirements are written... even before any code is written by the developers. Their job is to think of ways to test both the nominal and corner case scenarios as per the defined requirements.
We write ICDs (Interface Control Documents) that specify the API/protocols between each module and system. These are formally published, as are the requirements. Subsequent changes to ICDs or requirements need to be sent through a CCB (Change Control Board) that assesses the impact to schedule, manpower, and cost.
The QA/Tester teams usually require tools and frameworks to be developed in order to test the product.
The developers are forced to implement unit tests to thoroughly test their modules before delivery to QA.
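As a minimal sketch of that unit-testing practice, here is what a developer-side test might look like in Python with the standard `unittest` module. The `clamp` function is a made-up stand-in for whatever module gets delivered to QA.

```python
import unittest

# Hypothetical function under test -- a stand-in for a real module
# that a developer would deliver to QA.
def clamp(value, low, high):
    """Constrain value to the inclusive range [low, high]."""
    return max(low, min(high, value))

class ClampTests(unittest.TestCase):
    # Nominal case: a value already inside the range is unchanged.
    def test_within_range(self):
        self.assertEqual(clamp(5, 0, 10), 5)

    # Corner case: below the range snaps to the lower bound.
    def test_below_range(self):
        self.assertEqual(clamp(-3, 0, 10), 0)

    # Corner case: above the range snaps to the upper bound.
    def test_above_range(self):
        self.assertEqual(clamp(99, 0, 10), 10)

if __name__ == "__main__":
    unittest.main(exit=False)
```

The idea is that the nominal and corner cases mirror the QA team's test scripts, so the module arrives at QA with the obvious failures already shaken out.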
We have a Problem Reporting/Bug Tracking system that all the developers and testers use to report bugs and provide status on said bugs. We have weekly meetings to discuss outstanding bugs and issues.
We have a master schedule that is updated constantly. We know immediately what impact any component will have to the schedule if it is delivered late.
The product is tested at various phases during development in order to shake out bugs.
That's how we develop software here.
Quote:Original post by Anonymous Poster
I would say that there is a world of difference between the way a game is developed and tested compared to how an application is developed and tested.
Stupid login system. I'm the AP for the above post.