Is using a debugger lazy?

Started by
83 comments, last by SuperVGA 11 years, 6 months ago

I'd even argue that debuggers make for better, more readable, more straightforward code in the end. I'll give an example:

I do a lot of work on a system for which using a debugger is impossible. If you look at my code there as compared to my code on my normal projects, the number of lines per functionality is roughly 40%'ish less for my normal projects [where I have a debugger]. This extra 40% of 'waste' comes from the fact that tracking bugs down on a project with roughly 250K lines of code is very difficult, so I spend a fair bit of extra time up-front putting assertions for everything and anything that I can think of that *might* happen, on every layer of the program, for every function.

<snip>

All of this should be automated. In fact, it IS automated... by this thing called a debugger.


It should be automated, yes, but with unit tests rather than a debugger.

[quote name='Oolala' timestamp='1348427436' post='4982999']
I'd even argue that debuggers make for better, more readable, more straightforward code in the end. I'll give an example:

I do a lot of work on a system for which using a debugger is impossible. If you look at my code there as compared to my code on my normal projects, the number of lines per functionality is roughly 40%'ish less for my normal projects [where I have a debugger]. This extra 40% of 'waste' comes from the fact that tracking bugs down on a project with roughly 250K lines of code is very difficult, so I spend a fair bit of extra time up-front putting assertions for everything and anything that I can think of that *might* happen, on every layer of the program, for every function.

<snip>

All of this should be automated. In fact, it IS automated... by this thing called a debugger.


It should be automated, yes, but with unit tests rather than a debugger.[/quote]You've missed the point of what Oolala was getting at a bit.
On a sensible platform, yes, when your unit test fails you get a nice dump telling you exactly which line and file went awry. If you run the test from your IDE, its debugger will stop on that line and show you what's wrong.

However, on some embedded devices without debugging support, all you can determine after a crash is that the machine has stopped, which is incredibly frustrating!
So on such a platform, you write up a unit test to ensure that you get the expected outputs... and say your code is bad and crashes: the device simply stops with no further information, which tells you that your test failed but gives no hint as to why.
This forces you to go and add a lot of diagnostic code to your program, so that before it crashes you can write out information about the erroneous conditions.

Ergo, developing on platforms that have debug support tools is much nicer than developing on platforms with no such support.

e.g. on my Win32 engine, I don't have to assert that pointers aren't NULL before dereferencing them -- if such an event happens, it raises an SEH exception, which triggers a mini-dump that I can use to diagnose the problem immediately.
On this other hypothetical platform though, such an event simply cuts the power off, meaning I've got to add many more assertions than usual so that I get a chance to respond to invalid operations before the machine dies.
Lazy is the wrong word. Of course, if you do not know where the failure is, start the debugger. If you are trying to find out how a complex program works, use the debugger. However, iteratively writing code by leaning on a debugger is the wrong way: do not trial-and-error with the debugger until the code happens to work. Think the code through first, and use the debugger at the end to find the mistakes. Too often, though, people write a line, start the debugger, look for the next null pointer exception, write another line, and so on.

So either he really is not up to date, or you misunderstood him, or he could not express what he meant.
The problem of Object Oriented Programming: everybody tells you how to use his classes, but nobody how not to do it!

A debugger is the code equivalent of a multimeter/oscilloscope/magnifying glass etc for electronic circuits.
What your professor is suggesting is finding the cause of failure in a complex circuit just by looking at the wires, or maybe by putting a lightbulb here and there.

Clearly, he is out of his mind.

Either that or there was just a misunderstanding.


QFT

One of the professors told me that he preferred Dev-C++

Is he a professor of paleontology or maybe ancient history?
-That would explain his interest in Dev-C++.

Usually, that wouldn't necessarily be a bad thing, but there's barely any support, no patches, and the debugger integration is really bad, IMO.

