Is using a debugger lazy?

83 comments, last by SuperVGA 11 years, 6 months ago
There is another thing about the debugger that is important (and not lazy at all).

I use the debugger to verify that my algorithms flow the way I expected -- like desk checks.
Simply trusting your code because a few test cases output the right answer is the laziest thing one can do.

So debuggers are meant to be used to support your coding, just like armor is important to a warrior. No matter how good you are with the sword/code, one day your foe/bug will hit you, and every second will be most valuable.
;)
Programming is an art. Game programming is a masterpiece!
kuramayoko10 has a good point. At the company where I used to work, at times we were strongly encouraged to step through all new code we checked in with the debugger. It seems excessive, but I think it was a good idea as an extra quality gate for times when you really can't afford to add any bugs to the product. (And you'd be surprised how often your code will generate the correct result in a particular scenario even though it's getting at it the wrong way :-))
Assuming the professor has some knowledge, and assuming he has met many unskilled people -- those who try to code "randomly" -- well, he might have a point.

I mean, if you write something and then debug by default because there's surely something wrong you missed in your own code, then it's better if you don't have a debugger at all and just learn how a while or for loop works.

But in the end, from a broader point of view, a printf is also debugging functionality. Debugging is also mentally parsing your code to figure out what's wrong with your algorithm.

But the development of any non-trivial software in 2012 needs good debugging tools.

Those who fail to realize this either live 40 years in the past or have never written (and shipped) any non-trivial product.

:)
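To make the "a printf is also debugging functionality" point concrete, here's a minimal sketch of printf-style tracing, assuming a C++ project; the DBG macro name and message format are my own illustration, not from the thread:

```cpp
#include <cstdio>

// Tracing macro that compiles away in release builds (when NDEBUG is defined).
// Note: ##__VA_ARGS__ is a compiler extension, but GCC, Clang, and MSVC all accept it.
#ifndef NDEBUG
#define DBG(fmt, ...) std::fprintf(stderr, "[%s:%d] " fmt "\n", __FILE__, __LINE__, ##__VA_ARGS__)
#else
#define DBG(fmt, ...) ((void)0)
#endif

int main() {
    int total = 0;
    for (int i = 0; i < 5; ++i) {
        total += i * i;
        DBG("i=%d total=%d", i, total);  // trace loop state without a debugger
    }
    return 0;
}
```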

But in the end, from a broader point of view, a printf is also debugging functionality.

Last time I checked, printf doesn't let me change the line of execution, or modify the value of variables, or dump register values, or step through a program whose source code I don't have access to, or slowly step through an algorithm iteration by iteration so that instead of a massive amount of printf output I can focus on only the iterations I care about, and so on. If all you ever use your debugger for is looking at the value of a variable, it's kind of like only using the stereo in a car and never actually driving it anywhere.
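That last trick deserves a concrete illustration. Here's a minimal sketch of my own, assuming C++ and a debug build; the function name and the iteration of interest are hypothetical. Where printf would bury you in thousands of lines of output, a conditional breakpoint (or a plain breakpoint hung on a guard like this) stops exactly where you care:

```cpp
#include <cstddef>
#include <vector>

int sumOfSquares(const std::vector<int>& values) {
    int total = 0;
    for (std::size_t i = 0; i < values.size(); ++i) {
        if (i == 5000) {
            (void)i;  // set a breakpoint here to inspect iteration 5000 only
        }
        total += values[i] * values[i];
    }
    return total;
}

int main() {
    std::vector<int> values(10000, 1);  // enough iterations to hit the guard
    return sumOfSquares(values) > 0 ? 0 : 1;
}
```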

But yeah, I'll agree with people that a debugger is no excuse to be lazy and slide into the don't-plan-just-code-and-debug-later mode of action.
[ I was ninja'd 71 times before I stopped counting a long time ago ] [ f.k.a. MikeTacular ] [ My Blog ] [ SWFer: Gaplessly looped MP3s in your Flash games ]
Does the end user give a shit about whether or not you used a debugger as part of your development process? Hell no. They only care about whether or not the application works, does what it's supposed to do, and does it correctly and as expected. A debugger may actually help a lot with that.

Your professor has been sitting in the ivory tower so long that he's lost sight of what matters and replaced it with stuff that doesn't matter.

They only care about whether or not the application works, does what it's supposed to do, and does it correctly and as expected. A debugger may actually help a lot with that.

Correct and bug-free software is not created by endlessly running through the debugger. It is created by painstakingly designing and re-designing every aspect of the software, and exhaustively testing each component in isolation and as part of the whole.
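To make "testing each component in isolation" concrete, here's a minimal sketch, assuming C++; the clamp() utility is hypothetical, and a real project would more likely use a framework such as Google Test than bare asserts:

```cpp
#include <cassert>

// Component under test: clamps value into the inclusive range [lo, hi].
int clamp(int value, int lo, int hi) {
    if (value < lo) return lo;
    if (value > hi) return hi;
    return value;
}

int main() {
    assert(clamp(5, 0, 10) == 5);    // in range: unchanged
    assert(clamp(-3, 0, 10) == 0);   // below range: clamped to lo
    assert(clamp(42, 0, 10) == 10);  // above range: clamped to hi
    return 0;
}
```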

Debuggers are a useful tool to track down the point of failure. But if you are using a debugger to replace any of the following, then you are doing it wrong:

  • Up-front design
  • Design reviews
  • Code reviews
  • Unit testing
  • Functional testing
  • Stress testing
  • User testing

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

He probably just meant that debuggers should be a tool of last resort. If you write some code that exhibits a bug, the first step to getting rid of the bug should be examining the code you just wrote. Firing up the debugger immediately is indeed lazy... but we all do it.

Today I was talking with a few professors at my university about doing some work on a couple new servers they've got set up. They asked me about my background in coding, and I mentioned that I'd written a couple games over the summer with C++/Allegro, and that I used Visual Studio 2010 as my IDE. One of the professors told me that he preferred Dev-C++, which I found quite odd considering Dev-C++ at least seems very bare-bones (I wouldn't know, I've only written a couple very simple projects in it).

Anyway, I told him I thought he'd find VS2010 to be nice because the debugger is great for catching unhandled exceptions/null pointers, and he told me that debuggers are for lazy people. Now something tells me he's out of his mind, but I'm no expert, so I can't be sure. Do you think using a debugger is lazy? I don't mean overusing it, or relying on it too much. Just the simple act of using it at all. I think a debugger is a great tool and it's a good idea to use one. What do you think?


Tell him to use VI then...
I'm not sure of the flow of the conversation, but I'm sure he was getting at the fact that relying on the debugger to catch every mishap in your app is lazy. You should always handle everything you can think of as possibly going wrong by outputting a message or throwing an exception. The debugger is for those things that go wrong but you swear shouldn't be possible. Checking whether a file loaded or not is a great example. I don't even want to mention the number of times a huge bug turned out to just be a mistyped filename. Now that I check every file, I get a quick message in the console telling me it wasn't loaded and what name I was trying to load.
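A minimal sketch of that file check, assuming C++; openAsset, the asset path, and the message wording are hypothetical names of mine, not from the post:

```cpp
#include <cstdio>

// Opens an asset file, logging the exact name on failure so that a
// mistyped filename shows up in the console instead of as a mystery bug.
std::FILE* openAsset(const char* filename) {
    std::FILE* file = std::fopen(filename, "rb");
    if (!file) {
        std::fprintf(stderr, "Failed to load file: %s\n", filename);
    }
    return file;
}

int main() {
    std::FILE* f = openAsset("data/player.bmp");  // hypothetical asset path
    if (f) std::fclose(f);
    return 0;
}
```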
[ dev journal ]
[ current projects' videos ]
[ Zolo Project ]
I'm not mean, I just like to get to the point.
Debuggers are invaluable, but I'd say that relying on a debugger is indeed bad if you never take stock of why you end up there in the first place.

If I have a bug that's non-obvious enough to require the use of a debugger, I always try to think about what I could have done to avoid having to use it at all. Is there an assertion missing? Is there a test missing? Could I have used a different coding technique, perhaps one that's safe by construction, rather than relying on convention? Does this interface, as written, promote misuse? Is the code I've written less obvious than it might be?
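As one example of "safe by construction", here's a minimal sketch of my own, assuming C++; the Radians and rotate names are hypothetical. Encoding a unit convention in the type turns a misuse that a debugger session might later uncover into a compile error:

```cpp
// A dedicated type for angles in radians: the unit travels with the value.
struct Radians {
    explicit Radians(double v) : value(v) {}
    double value;
};

// The unnamed parameter keeps this sketch warning-free.
void rotate(Radians /*angle*/) { /* rotate by the given angle */ }

int main() {
    // rotate(3.14);       // does not compile: is that degrees or radians?
    rotate(Radians(3.14)); // intent is explicit at the call site
    return 0;
}
```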

In doing this, I now only need a debugger very rarely.

In my humble opinion and experience, it's quite often the case (though not always) that code written with debugger-driven development is harder to follow, because it hasn't been thought through as thoroughly.

Debuggers have their use as exploratory tools too, of course.

Disclosure: I currently work on the debuggers for one of the top games console manufacturers.

This topic is closed to new replies.
