
Unit Testing ftw?


36 replies to this topic

#21 SimonForsman   Crossbones+   -  Reputation: 6305


Posted 11 October 2013 - 06:41 AM

 

code written by an expert today may be modified to incorporate new features 10 or even 20+ years from now by an entirely different group of developers

 

Please name me a single traditional game engine that has 20+ years of life in it.

 

World of Warcraft is one of the longest-running major games out there today, and it hasn't had its 9th birthday yet.

 

Other long-running engines include the annual sports games like Madden, which is 25 years old, and FIFA, which is 20 years old, but those have been rewritten many times over their lifetimes.

 

It is true that certain long-running libraries and components deserve unit tests. I haven't seen anybody write that they shouldn't have unit tests. But other parts of the game are short-lived, and the code is effectively dead once the disc image is released to manufacturing; it would be foolish to write unit tests for these.

 

 

I never mentioned games; the post I replied to was talking about business software.


I don't suffer from insanity, I'm enjoying every minute of it.
The voices in my head may not be real, but they have some good ideas!


#22 Hodgman   Moderators   -  Reputation: 31822


Posted 11 October 2013 - 07:10 AM

My philosophy with code is that if I haven't seen every single feature actually work, then the code isn't finished yet.

Under this definition, if writing the code+tests takes twice as long as just the code, that doesn't matter. When you've just written the code, the task isn't finished. After you've passed the tests, the task is finished (hopefully). So you're comparing the time taken to half-finish a task with the time taken to finish a task.

 

If I'm not writing tests for something, I'll usually at least put a breakpoint in it once and step through it line by line, which is basically a semi-automated "desk check", which used to be a standard requirement for programming...

 

This is especially true when writing games in C++ -- all too often it's possible to write code that looks correct, and seems to behave correctly from the outside, but a simple desk-check reveals an obvious off-by-one, overflow, or edge-case bug.
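
For example (a contrived C++ sketch, not code from any real project), the function below looks right at a glance, but stepping through the final loop iteration immediately exposes the off-by-one:

    // Copies the last 'count' samples into 'out' -- looks correct at a glance.
    void CopyTail(const float* samples, int numSamples, int count, float* out)
    {
        for (int i = 0; i <= count; ++i)              // BUG: '<=' runs one iteration too many
            out[i] = samples[numSamples - count + i]; // the extra pass reads samples[numSamples]
    }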

 

If the code is too complex to easily step through all the possible use-cases, then I'll write a unit test instead.

 

The linked video about writing testable code contains good advice, btw. Not just for testing's sake, but also for writing flexible, reusable code in general.

 

 

 


Please name me a single traditional game engine that has 20+ years of life in it.

Yes, this is not common; engines from the '80s/'90s/'00s are each very different from the engines of today... but 10 years of code maintenance might happen, a little bit.

I bet there's still some small smidgen of code from Quake 1 hanging around in the Source engine.

When I was at Krome, the coding convention required every function to be commented with the original author and date. I saw a few nuggets deep within the 3rd rewrite of their engine that were dated year 2000, and this was in 2010.


Edited by Hodgman, 11 October 2013 - 07:11 AM.


#23 LorenzoGatti   Crossbones+   -  Reputation: 2763


Posted 11 October 2013 - 07:52 AM

 

However, for most game code that is written once and then discarded, unit tests generally do not make sense. The cost to hire a room full of QA testers to verify everything at the end of the project is often less than the cost to create the automated tests.

 

But the way I understood it was that unit testing was preferably something that the programmer himself does concurrently with the coding. Sort of a way to reduce the need for systems testing later on. Or maybe I misunderstood the context here.

Usually, the point of unit testing is that if the units are well tested and correct, all errors that come up in the integration between those units are due to the integration logic itself (e.g. module X crashes with the actual game engine because, unlike in X's unit tests, it isn't initialized properly) or to misunderstandings between modules (e.g. both the texture loader and the texture resource manager flip textures vertically, resulting in upside-down graphics). That's much better than testing everything at once with a chance of bugs of any kind.
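
A toy illustration of that (hypothetical TextureLoader and test, not from any real codebase): if the loader's unit test pins down that it does not flip rows, then upside-down graphics in the running game point at the resource manager or the glue code, not at the loader.

    #include <cassert>
    #include <vector>

    // Hypothetical loader: decodes a 2x2 greyscale image, rows stored top-to-bottom.
    struct TextureLoader
    {
        std::vector<unsigned char> Load2x2() const { return {0, 1, 2, 3}; } // top row first
    };

    int main()
    {
        // Unit test: the loader keeps the row order. If textures are upside down
        // in the shipped game, the bug is in another unit or in the integration.
        TextureLoader loader;
        std::vector<unsigned char> pixels = loader.Load2x2();
        assert(pixels[0] == 0 && pixels[3] == 3);
    }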


Produci, consuma, crepa

#24 rpiller   Members   -  Reputation: 706


Posted 11 October 2013 - 08:58 AM

Would you want a programmer spending time writing unit tests (especially if it's 1:1, or even if it's 1/2 the amount of time) at $150k/yr+, or a QA guy testing the code for $30k/yr? I think it's a valid question of where you want your time/money spent. There is no doubt that TDD results in better code, but that's not always the requirement. Sometimes 'good enough' and getting it done fast and cheap is the requirement.

 

 


If I'm not writing tests for something, I'll usually at least put a breakpoint in it once and step through it line by line, which is basically a semi-automated "desk check", which used to be a standard requirement for programming...

 

This is especially true when writing games in C++ -- all too often it's possible to write code that looks correct, and seems to behave correctly from the outside, but a simple desk-check reveals an obvious off-by-one, overflow, or edge-case bug.

 

In my view stepping through code is a must. In every game/app I've ever written (none of them with a massive amount of code), I've stepped through every line at one point or another. Every time I implement a class or a couple of methods I step through them to make sure they do what I expect.

 

Your comment makes it sound like that's not a common thing to do? I have to assume it's very common, and probably done more than unit testing across all the different levels of programmers out there.



#25 rozz666   Members   -  Reputation: 636


Posted 11 October 2013 - 09:30 AM


In my view stepping through code is a must. In every game/app I've ever written (none of them with a massive amount of code), I've stepped through every line at one point or another. Every time I implement a class or a couple of methods I step through them to make sure they do what I expect.

In my experience, since I started using unit TDD and acceptance TDD, I've never had to step through every line. That's what my tests do.

 

I'm also surprised that no one mentioned that writing unit tests improves the design of the application. It enforces separation of concerns, single responsibility principle, dependency inversion, etc., because it becomes difficult to unit test classes if they violate these principles. This, in turn, improves maintainability.
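
A minimal sketch of what that looks like in practice (made-up names): because the score system receives its dependency through an interface, the test can drive it with a fake instead of a real audio device -- exactly the dependency inversion and separation of concerns that testability forces on the design.

    #include <cassert>

    // The dependency is expressed as an interface so it can be inverted/faked.
    struct IAudio
    {
        virtual void PlayFanfare() = 0;
        virtual ~IAudio() = default;
    };

    class ScoreSystem
    {
    public:
        explicit ScoreSystem(IAudio& audio) : m_audio(audio) {}
        void AddPoints(int points)
        {
            m_score += points;
            if (m_score >= 100) m_audio.PlayFanfare(); // side effect the test verifies
        }
        int Score() const { return m_score; }
    private:
        IAudio& m_audio;
        int m_score = 0;
    };

    // Fake used only by the test -- no real audio device required.
    struct FakeAudio : IAudio
    {
        int fanfares = 0;
        void PlayFanfare() override { ++fanfares; }
    };

    int main()
    {
        FakeAudio audio;
        ScoreSystem scores(audio);
        scores.AddPoints(60);
        scores.AddPoints(50);
        assert(scores.Score() == 110);
        assert(audio.fanfares == 1); // fanfare fired exactly once when crossing 100
    }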



#26 Buster2000   Members   -  Reputation: 1775


Posted 11 October 2013 - 09:42 AM

I work at a company where QA is considered an expensive resource (not because of their salary but because they are a bottleneck). Products do not ship unless they are verified by QA in several environments.

If the software has bugs or fails a regression cycle it requires extra QA time. If the developer follows acceptance criteria and writes unit tests for those criteria, then the QA cycle only takes a couple of days. It has also got to the point that, as well as writing unit tests, it is better for us to write automated UI tests and server integration tests too.

 

 

However, this is non-games software. The stuff I worked on in the games industry was, as frob said, throwaway code. I knew for a fact that once I'd written a piece of code I would never see it again.

The current mobile app I am working on has bits of code I wrote 3 years ago and I know for a fact that this code will be around until somebody invents a new technology to replace smartphones.



#27 wintertime   Members   -  Reputation: 1877


Posted 11 October 2013 - 05:39 PM


Would you want a programmer spending time writing unit tests (especially if it's 1:1, or even if it's 1/2 the amount of time) at $150k/yr+, or a QA guy testing the code for $30k/yr? I think it's a valid question of where you want your time/money spent. There is no doubt that TDD results in better code, but that's not always the requirement. Sometimes 'good enough' and getting it done fast and cheap is the requirement.

That sounds like a false equivalence to me. Wouldn't the QA guy mostly just playtest the whole game or application as a black box, with only a tiny chance of noticing some hidden glitch months later? Then he'd try to reproduce it and add a bug report to the tracker; the programmer would have to read the report, reproduce the glitch himself, and spend a big amount of time debugging to find the place inside the whole codebase where the bug actually is, at a time when he has already forgotten the details of how that function works; and then, to save time, half the bugs would just be marked "won't fix"?

If the programmer writes unit tests, the name implies they are for small pieces of code, which is a different thing from trying out the whole thing at once. When writing an actual piece of code, the programmer would probably automatically know what needs to be tested, and writing those tests at that moment would be easier/faster to type in than the other code. Then he may already save time when he just runs the tests in a second, compared to single-stepping everything a few times in the debugger and hoping not to overlook something, and could maybe be a little more confident about committing the code.


Edited by wintertime, 11 October 2013 - 05:48 PM.


#28 rpiller   Members   -  Reputation: 706


Posted 12 October 2013 - 07:55 AM

I think it depends on how you do your QA, but yeah, the situation you described could play out.

 

The thing with unit testing is that if you use it you almost have to do TDD, otherwise it's sort of hypocritical. And to do unit testing/TDD correctly you'll have at a minimum about four tests per unit. If you have hundreds or thousands of units you'll have at least four times as many tests. As per TDD you write the tests first and make them fail, then go back to the code and do the minimum work to make them pass. This means you are bouncing back and forth between your test project and your production project about every 30 seconds to a minute. For some tests you might have to stop and think about what you want to write. If you skip ahead then you aren't taking the benefits of TDD, so you are stepping your way to your final unit code even though in a large percentage of cases you could have skipped ahead and been fine.
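
To make the back-and-forth concrete, one turn of that cycle looks roughly like this (a toy example, names made up): write the failing assertions first, then the minimum implementation that makes them pass, and repeat.

    #include <cassert>

    // Step 1 (red): the test is written first; it can't pass until Clamp exists.
    int Clamp(int value, int lo, int hi); // declared only because the test demands it

    void TestClamp()
    {
        assert(Clamp(5, 0, 10) == 5);   // inside the range
        assert(Clamp(-3, 0, 10) == 0);  // below the range
        assert(Clamp(42, 0, 10) == 10); // above the range
    }

    // Step 2 (green): the minimum work that satisfies the test.
    int Clamp(int value, int lo, int hi)
    {
        if (value < lo) return lo;
        if (value > hi) return hi;
        return value;
    }

    int main() { TestClamp(); }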

 

I relate this to having to take something valuable from point A to point B. Let's say you have a crystal vase that you have to bring to your friend's house down the street. TDD (which, we agree, perhaps you should be doing if you are doing unit testing, to get the maximum benefit of it) is like looking around you before you take a step to see all the dangers. Then taking one step. Then looking around you again. Then taking one step. Rinse and repeat. Will you get to your goal? Most likely. Will your vase be in one piece? Most likely. What about the person who runs to their neighbor's with the vase? Will they get to their goal? Most likely. Will it be faster to get there? Yes. Will the vase be in one piece? Most likely, but you MIGHT drop the vase. Then when you get to your friend's house they'll make you spend time gluing it back together, which could take a good amount of time.

 

So what does this mean? Depending on your project and the cost-benefit of it, or the importance of it having zero bugs (maybe something in the space program with lives in the balance), doing TDD/unit testing isn't always needed. Sometimes you can look while you are running your vase to its destination and everything works out just fine.



#29 Pink Horror   Members   -  Reputation: 1229


Posted 12 October 2013 - 05:06 PM


Please name me a single traditional game engine that has 20+ years of life in it.
 
World of Warcraft is one of the longest-running major games out there today, and it hasn't had its 9th birthday yet.
 
Other long-running engines include the annual sports games like Madden, which is 25 years old, and FIFA, which is 20 years old, but those have been rewritten many times over their lifetimes.

 

I think there's some code buried in Madden that's 20+ years old. With one team operating on a yearly release cycle, I would guess that a 100% rewrite of any major part of the game is pretty rare.



#30 Hodgman   Moderators   -  Reputation: 31822


Posted 12 October 2013 - 06:54 PM

The thing with unit testing is that if you use it you almost have to do TDD, otherwise it's sort of hypocritical. And to do unit testing/TDD correctly you'll have at a minimum about four tests per unit.

I don't get that -- unit testing was around for decades before the TDD methodology appeared. What's hypocritical about writing tests after you write code?

As for the number of tests, four seems pretty arbitrary...? In my engine, the internal assertion macro is hooked up to the testing framework, so I just write one test per 'unit' that calls all methods / demonstrates each usage, and asserts the results are as expected. If these test assertions or the unit's internal assertions fail, the test fails.
Like in TDD, I hook my tests up to the compile button, so a test failure is reported the same as a compilation failure (with file/line numbers), but I usually write them after the code, or concurrently, instead of before like in TDD.
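
In rough C++ terms, that wiring might look something like this (completely made-up macro and framework names, not the actual engine code): the unit's own assertions report into the test run, and failures are printed in the same file(line) shape the compiler uses, so the IDE treats them like build errors.

    #include <cstdio>

    namespace test
    {
        int g_failures = 0;
        void ReportFailure(const char* file, int line, const char* expr)
        {
            // Same "file(line): message" shape as a compiler error, so a failing
            // test shows up in the IDE just like a failed compile.
            std::printf("%s(%d): test assertion failed: %s\n", file, line, expr);
            ++g_failures;
        }
    }

    // Internal assertion macro, also used by ordinary engine code.
    #define MY_ASSERT(expr) \
        do { if (!(expr)) test::ReportFailure(__FILE__, __LINE__, #expr); } while (0)

    // The unit under test...
    int Add(int a, int b) { return a + b; }

    // ...and its single test, which exercises each usage and asserts the results.
    void Test_Add()
    {
        MY_ASSERT(Add(2, 2) == 4);
        MY_ASSERT(Add(-1, 1) == 0);
    }

    int main()
    {
        Test_Add();
        return test::g_failures == 0 ? 0 : 1; // non-zero exit fails the post-build step
    }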

This means you are bouncing back and forth between your test project and your production project about every 30 seconds to a minute. For some tests you might have to stop and think about what you want to write.

There's no reason you can't write your tests in the same source file as the unit if you feel like it ;P
As above, IMHO a large part of your testing should be taken care of by assertions that are internal to the unit anyway, and not part of the test.
The point of TDD is that it makes you stop and think about the use case (what to write for the unit) before writing anything, but often this is true anyway. E.g. You're writing some code that requires a hash-map lookup, but you don't have a hash-map class (for some odd reason) - at that point, you've already got a production use-case for what the hash-map class is required to do, much like what your TDD code would give you.
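
A sketch of that hash-map situation (hypothetical interface, deliberately oversimplified): the call site, written first, already dictates what the class must do -- it is effectively the test a test-first workflow would have produced.

    #include <cassert>
    #include <string>

    // Stand-in for the not-yet-written hash map; a one-slot stub just so the
    // sketch compiles. The interface is dictated by the call site below.
    template <typename K, typename V>
    class HashMap
    {
    public:
        void Insert(const K& key, const V& value) { m_key = key; m_value = value; m_used = true; }
        const V* Find(const K& key) const { return (m_used && key == m_key) ? &m_value : nullptr; }
    private:
        K m_key{};
        V m_value{};
        bool m_used = false;
    };

    int main()
    {
        // The production use-case, written before the container exists:
        // this is already the specification a TDD test would encode.
        HashMap<std::string, int> ammo;
        ammo.Insert("rocket", 5);
        assert(ammo.Find("rocket") != nullptr && *ammo.Find("rocket") == 5);
        assert(ammo.Find("grenade") == nullptr);
    }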

#31 Washu   Senior Moderators   -  Reputation: 5423


Posted 13 October 2013 - 12:20 AM

The thing with unit testing is that if you use it you almost have to do TDD, otherwise it's sort of hypocritical. And to do unit testing/TDD correctly you'll have at a minimum about four tests per unit.

I don't get that -- unit testing was around for decades before the TDD methodology appeared. What's hypocritical about writing tests after you write code?

No, unit testing doesn't imply TDD, although TDD does imply unit testing.
 

As for the number of tests, four seems pretty arbitrary...? In my engine, the internal assertion macro is hooked up to the testing framework, so I just write one test per 'unit' that calls all methods / demonstrates each usage, and asserts the results are as expected. If these test assertions or the unit's internal assertions fail, the test fails.

The number of tests, generally, should be more dependent on the number of branch conditions you wish to test than some random number picked out of a hat. Code coverage is the main thing you want to ensure with your unit tests.
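
In other words (a toy example), the branches decide how many cases you need, not a fixed quota: three branches here, so three tests.

    #include <cassert>

    int ArmourAfterHit(int armour, int damage)
    {
        if (damage <= 0)      return armour;  // branch 1: harmless hit
        if (damage >= armour) return 0;       // branch 2: armour destroyed
        return armour - damage;               // branch 3: partial damage
    }

    int main()
    {
        assert(ArmourAfterHit(50, 0)  == 50); // covers branch 1
        assert(ArmourAfterHit(50, 80) == 0);  // covers branch 2
        assert(ArmourAfterHit(50, 20) == 30); // covers branch 3
    }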

Like in TDD, I hook my tests up to the compile button, so a test failure is reported the same as a compilation failure (with file/line numbers), but I usually write them after the code, or concurrently, instead of before like in TDD.

That's fine by me. I've been doing unit testing for quite a long time now (the last decade or more?). I've tried TDD as well, and for some projects it has worked out great. It DOESN'T tend to work out so great, though, when you're dealing with a pre-existing code base.

This means you are bouncing back and forth between your test project and your production project about every 30 seconds to a minute. For some tests you might have to stop and think about what you want to write.

There's no reason you can't write your tests in the same source file as the unit if you feel like it ;P
As above, IMHO a large part of your testing should be taken care of by assertions that are internal to the unit anyway, and not part of the test.

Error checking and similar behaviors SHOULD be part of your code, not part of the unit tests. But unit tests should be able to be extracted from the code, mainly for use with integration testing suites and with the rest of your documentation.

The point of TDD is that it makes you stop and think about the use case (what to write for the unit) before writing anything, but often this is true anyway. E.g. You're writing some code that requires a hash-map lookup, but you don't have a hash-map class (for some odd reason) - at that point, you've already got a production use-case for what the hash-map class is required to do, much like what your TDD code would give you.

Exactly.

Edited by Washu, 13 October 2013 - 12:22 AM.

In time the project grows, the ignorance of its devs it shows, with many a convoluted function, it plunges into deep compunction, the price of failure is high, Washu's mirth is nigh.
ScapeCode - Blog | SlimDX


#32 wodinoneeye   Members   -  Reputation: 877


Posted 13 October 2013 - 01:14 AM

"QA hours are cheap."

 

Tell that to the people I used to contract for.

 

Depends on what is being tested, etc...

 

I usually had more programming expertise than most of the people doing the programming (I also used to fix code produced by others, and dealt with things they forgot/ignored/never knew to do).

 

Sloppy companies don't have the programmers test their own code first and foremost (a practical problem, too, because many programmers have never learned how to do it properly). Even things like code review get skipped.

 

I've also worked for companies that WANTED automated testing, but for much of the product's development the code kept changing faster than the automation team could keep up (and the product programmers and their managers were often not cooperative about putting features into the code itself to facilitate testing).

 

Often it was so bad that I could find vast quantities of bugs faster than whole teams of programmers could fix them (assuming they weren't already busy writing MORE buggy code/features).


Edited by wodinoneeye, 13 October 2013 - 01:18 AM.

--------------------------------------------Ratings are Opinion, not Fact

#33 Nypyren   Crossbones+   -  Reputation: 4797


Posted 13 October 2013 - 03:57 AM

Unit tests would prevent so much grief where I work. I occasionally get asked to investigate why some of my code isn't working right. After the initial panic of "oh shit, what edge case did I overlook", I investigate and find out that someone else has tried to tweak something and ended up breaking the ENTIRE system. I partly want to blame myself for creating such complexity in the first place, but that's not something you can avoid in game development. Code is going to get horrifyingly complex whether you want it to or not, and nobody else is going to be able to understand what the code does in its entirety.

 

The first reason we don't have unit tests is that they aren't prioritized.  We're barely given enough time to implement what the product managers want in the first place.  The higher-ups never consider the fact that we're going to get overwhelmed by interruptions to "go figure out why X doesn't work anymore" as time passes and people accidentally break existing code.

 

The second reason we don't have unit tests is because, since we've never written them, we're inexperienced when it comes to architecting our code in a test-friendly manner.  So, instead of seeing code and saying to ourselves "I could write tests for that", we look at the code and say "There is no freaking way to write a meaningful unit test for this."  And somehow, this is the accepted reality of development where I work.  It makes me so incredibly frustrated knowing that we could be saving ourselves future trouble if we weren't so inexperienced with unit testing.



#34 Malabyte   Members   -  Reputation: 589


Posted 13 October 2013 - 09:05 AM

I guess, at the end of the day, it's all about how sloppy the individual programmer (or the company that he/she is working for) is. Different tests for different situations, but the important thing is that you don't let code slip by untested? Sounds about right.


- Awl you're base are belong me! -

- I don't know, I'm just a noob -


#35 wodinoneeye   Members   -  Reputation: 877


Posted 15 October 2013 - 12:28 AM

I guess, at the end of the day, it's all about how sloppy the individual programmer (or the company that he/she is working for) is. Different tests for different situations, but the important thing is that you don't let code slip by untested? Sounds about right.

 

Lack of testing is often just one problem. Getting the managers to prioritize a bug (or even accept that it is a bug) so that it actually gets fixed is another fight. More than a few times a user interface bug I found and reported (and demonstrated) was subsequently ignored (sometimes I was told "the customer won't do that"), and quite soon after the customers started using it they were reporting the problem.

 

Long ago a QA manager told our QA group that there is a psychological impairment to QA's effectiveness (first and foremost is funding/prioritization, and then cooperation) -- a big negative aspect, because the result of our work is to tell the programmers (and often the designers) that their code/design is crap. So many times I've had to show them their code was buggy - actually demonstrate it before their eyes - before they HAD to believe that THEIR code had a problem.


--------------------------------------------Ratings are Opinion, not Fact

#36 frob   Moderators   -  Reputation: 22731


Posted 15 October 2013 - 01:41 AM

Keep in mind that QA and automated unit tests are very different things.

Automated tests can tell you when a function that used to work is suddenly giving different results. They can tell you when a component is giving unexpected values that will propagate elsewhere in the system. Automated acceptance tests can do some fairly elaborate stress tests if they are well written, and they can help you tune and adjust things during development.
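
For instance (a made-up example), a regression test just freezes today's known-good answers, so any later change that alters them fails the build and points straight at the unit:

    #include <cassert>

    // Hypothetical formula that other systems depend on.
    int FallDamage(int fallHeight)
    {
        if (fallHeight < 3) return 0;        // short falls are free
        return (fallHeight - 2) * 10;
    }

    int main()
    {
        // Today's known-good results, frozen as a regression test.
        assert(FallDamage(2)  == 0);
        assert(FallDamage(3)  == 10);
        assert(FallDamage(10) == 80);
    }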

QA can tell you that a key feature is missing, but unit testing will not. QA can tell you that a feature is confusing, where unit testing just verifies the status quo. QA can tell you that features are unnecessary or redundant. QA can run creative negative tests that you will never normally consider writing unit tests for. QA can give feedback about how a human uses and abuses the system in ways automated tests never will.

We've all heard stories about the crazy things users do. There are anecdotes about people who are less technically savvy who do things like take screen captures, paste them into Word to crop and scale them, and then email the whole Word document, because that's the only process they understand for sending a screen capture. It is things like that where programmers rarely think to write automated tests, because that isn't what the user is supposed to do. A good QA team will understand that users follow extremely strange paths through software and find amazingly creative ways to break the system.

Sure, they'll find very weird bugs that you won't fix. I've seen some "press down on the game console and listen as the disc grinds to a halt, repeat until read errors occur", and once even saw "pause the game and alt+tab out, run the uninstall program skipping the warnings, then alt+tab back to the game; the game will crash." But better for QA to test the crazy and blatantly invalid scenarios than for your customers to discover obscure flaws when the system is in use. There are so many unexpected real-life situations that no amount of automated unit tests will ever discover.

Automated tests are great where they make sense. They can augment testing and make development safer (with a tradeoff of additional initial development costs). But automated tests will never be able to replace a high-quality QA team.

Check out my book, Game Development with Unity, aimed at beginners who want to build fun games fast.

Also check out my personal website at bryanwagstaff.com, where I write about assorted stuff.


#37 Malabyte   Members   -  Reputation: 589


Posted 18 October 2013 - 08:32 PM

In addition, some of the most iconic features found in games are born from unintended bugs. It would be a shame if an automated system had just fixed the bug that later became the Creeper in Minecraft, without considering its potential use in that game.


Edited by Malabyte, 18 October 2013 - 08:39 PM.

- Awl you're base are belong me! -

- I don't know, I'm just a noob -







