
Is using a debugger lazy?


Old topic!
Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

84 replies to this topic

#61 swiftcoder   Senior Moderators   -  Reputation: 13602


Posted 22 September 2012 - 08:31 AM

What? I really have no idea what you are talking about now... how could I apply that to either?

"If <something> wasn't useful, it wouldn't exist" <- is a truth-free assertion. Lots of things exist that aren't useful. But that's beside the point.

I have no problem with people stepping through their code after every checkin - if it makes you feel better, by all means, go ahead. But I continue to maintain that it's really not adding much to anything but your false sense of security.

Each time you step through your code, you are validating one particular path through your code. If you are very diligent (and have good test coverage), you might step through each test path. But these are still all only testing your "happy paths" (if you have written a test, then that path has an expected outcome => happy path). And your happy paths are already tested.

Tristam MacDonald - Software Engineer @Amazon - [swiftcoding]



#62 Olof Hedman   Crossbones+   -  Reputation: 3814


Posted 22 September 2012 - 12:23 PM


What? I really have no idea what you are talking about now... how could I apply that to either?

"If <something> wasn't useful, it wouldn't exist" <- is a truth-free assertion. Lots of things exist that aren't useful. But that's beside the point.


In general, yes. But in this context of tradesmen's tools, it's rarely true.
Tools that aren't useful get forgotten or replaced.
Lots of statements make no sense when taken out of their context.

I don't use my debugger as quality control in the way you describe; I use it to quickly track down problems in my own and other people's code.
I find it extremely useful to check my assumptions about the implementation against real-life data, without having to pollute the code with asserts and printouts.
Things like watchpoints and memory dump views make it easy to quickly see that each step of the process is behaving as expected.
I can quickly set up tests and tweaks on my code and see if the behavior changes as the current theory predicts.
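To make that "pollution" concrete: a hypothetical sketch of what checking an assumption looks like without a debugger. The function and values below are invented for illustration; the throwaway assert/printf lines are exactly what a conditional breakpoint or watchpoint makes unnecessary.

```cpp
#include <cassert>
#include <cstdio>
#include <vector>

// Checking "scores are never negative" without a debugger means adding
// throwaway instrumentation like the assert and printf below, then
// deleting it again later. A conditional breakpoint on `s < 0` (or a
// watchpoint on `total`) verifies the same assumption against live
// data with no source changes at all.
int sum_scores(const std::vector<int>& scores) {
    int total = 0;
    for (int s : scores) {
        assert(s >= 0 && "assumption: scores are never negative");
        std::printf("score=%d total=%d\n", s, total);  // temporary printout
        total += s;
    }
    return total;
}
```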

I don't understand why people make such a big deal about it, really. It's just a quick way to find problems, and to quickly check whether your assumptions about what the problem really is are true...

It's not replacing anything; it's just another tool you can use, making you an even more productive coder if used right (like any other tool in software engineering, really).
Sure, in a perfect world of perfect software engineering, where all code is written by (bug-free) AIs, debuggers would be useless, but they are still very useful now.

Talking about quality control, you shouldn't underestimate code reviews (of which this stepping through each line is a variant). At one place I worked, we were forced to go through each line of code with a colleague, explaining what it did. Everyone groaned, but you'd be surprised how many small potential issues were found that way. (I don't recommend it in general, but on some critical software it may be warranted.)

Edited by Olof Hedman, 22 September 2012 - 12:48 PM.


#63 Pointer2APointer   Members   -  Reputation: 283


Posted 22 September 2012 - 02:18 PM

I don't know if the term "lazy" fits here well.

It is a tool; a program to help you. A compiler is a program that helps you.

If we started down that road, I could tell you that using a compiler or interpreter is lazy, because you're having a program take care of many things for you to make the goal much easier.

I mean you don't have to use a debugger, but many people do.

I could probably write a flat binary program, but I would benefit so much more from writing code in a programming language.

I could also write code without a debugger program, but it would probably help me with error-catching (though some just use exceptions for this).

If it works for you, do it.

I guess it's just not everyone's thing.
Yes, this is red text.

#64 e‍dd   Members   -  Reputation: 2109


Posted 22 September 2012 - 02:52 PM

I don't know if the term "lazy" fits here well.

Agreed, but for slightly different reasons.

I'm not convinced that using a debugger rather than your brain is ultimately the best form of laziness to aim for. Instead, not having to debug code at all is a better kind of laziness (which I guess is what I was trying to express in my previous post).

I could also write code without a debugger program, but it would probably help me with error-catching (though some just use exceptions for this).

If you mean hardware exceptions, those generated by e.g. C++'s assert(), or the like, agreed to an extent. But that's not what your common-or-garden programming language's exceptions are for; how would you know you aren't missing any error cases without enshrining all code paths in tests? At that point a debugger isn't needed (perhaps until/unless the tests start failing).

Edited by e‍dd, 22 September 2012 - 02:54 PM.


#65 Pointer2APointer   Members   -  Reputation: 283


Posted 22 September 2012 - 03:27 PM

I meant exceptions in C++, with try, catch, and throw.
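For reference, a minimal sketch of that mechanism; the parsing function below is a made-up example, not something from the thread.

```cpp
#include <cstddef>
#include <stdexcept>
#include <string>

// throw raises an exception; try/catch intercepts it further up the
// call stack. parse_age is a hypothetical example function.
int parse_age(const std::string& s) {
    std::size_t pos = 0;
    int age = std::stoi(s, &pos);  // itself throws std::invalid_argument on junk
    if (pos != s.size() || age < 0)
        throw std::invalid_argument("not a valid age: " + s);
    return age;
}

// A caller that converts the exception into a simple success flag.
bool try_parse_age(const std::string& s, int& out) {
    try {
        out = parse_age(s);
        return true;
    } catch (const std::exception&) {
        return false;
    }
}
```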

Although the latter may not be the absolute best way to "debug" program code, or to help reduce compiler errors and such, a nicer approach can be to comment the code so you remind yourself of what it does and why it goes there.

Single-stepping through code with a debugger can help give programmers a better picture of their analyzed code, subroutines, code structuring, etc.

Another thing you can do to avoid compilation errors and/or runtime errors is to use separate outlined classes/methods for each type of data, and pursue a way to highlight and fix each specific flagged line of code one at a time.

Most genuinely advanced errors in a language like C++ come down to something very simple: pointers.

As simple as the subject may be to some experts, the creepy things pop out when you least expect them.
Yes, this is red text.

#66 Shaquil   Members   -  Reputation: 815


Posted 22 September 2012 - 04:27 PM

Each time you step through your code, you are validating one particular path through your code. If you are very diligent (and have good test coverage), you might step through each test path. But these are still all only testing your "happy paths" (if you have written a test, then that path has an expected outcome => happy path). And your happy paths are already tested.


I think that even falling for a trap like that is its own kind of dependence on the debugger.

Also, I'm surprised people are still posting here. I completely forgot. I guess when you question something that so many people like, you're bound to get everyone talking. At least it's been flame-free thus far.

#67 iMalc   Crossbones+   -  Reputation: 2403


Posted 23 September 2012 - 01:28 AM

Merely using a debugger isn't lazy at all, but writing code carelessly and expecting the debugger to find all the problems for you is.

He probably got the impression you were doing the latter, and that would be where he was coming from.
"In order to understand recursion, you must first understand recursion."
My website dedicated to sorting algorithms

#68 NineYearCycle   Members   -  Reputation: 1163


Posted 23 September 2012 - 03:14 AM

A professor who uses Dev-C++ and thinks using a debugger to debug code is "lazy" is, in my honest opinion, an arrogant idiot.

"Ars longa, vita brevis, occasio praeceps, experimentum periculosum, iudicium difficile"

"Life is short, [the] craft long, opportunity fleeting, experiment treacherous, judgement difficult."


#69 RobTheBloke   Crossbones+   -  Reputation: 2349


Posted 23 September 2012 - 05:07 AM

To be completely fair, I told him that using a debugger helped me with a lot of null references when I was trying to handle data from files that somehow never loaded up during runtime (something I've always thought was a fairly common, small mistake people make), and his response was that those situations rarely arise for him.


Ok, now we get to the crux of it. Using a debugger is NOT lazy; however, using a debugger as a replacement for proper runtime checks & logging is lazy (and bad practice). If you delete all of your assets and then run your app, does it A) crash, or B) gracefully handle the runtime error and print a log entry + error dialog? If the answer is A, then you have more work to do. I think I actually agree with your professor here......
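Option B might look roughly like the sketch below; the Asset type, function name, and log format are all invented for illustration.

```cpp
#include <cstdio>
#include <fstream>
#include <iterator>
#include <string>
#include <vector>

struct Asset { std::vector<char> bytes; };

// Gracefully handle a missing file: report failure and write a log
// entry instead of crashing later on an empty/invalid result.
bool load_asset(const std::string& path, Asset& out, std::string& err) {
    std::ifstream in(path, std::ios::binary);
    if (!in) {
        err = "asset not found: " + path;
        std::fprintf(stderr, "[error] %s\n", err.c_str());  // log entry
        return false;  // caller can show an error dialog and keep running
    }
    out.bytes.assign(std::istreambuf_iterator<char>(in),
                     std::istreambuf_iterator<char>());
    return true;
}
```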

To me that seems like a questionable claim, but again, I'm just a beginner.

It is entirely reasonable, and entirely likely. I've worked in middleware before, and I've had to unit test asset loading code to the point where it would never fail. It's not actually difficult to achieve; you've just got to apply due diligence.

Furthermore, he said that he just uses error messages to let him know whenever something goes wrong. But isn't that in itself a flawed idea to begin with? Since you only put error messages where you expect errors to occur, when a problem pops up in an unexpected place your error messages won't mean anything.

An expected error when attempting to load an asset is that the asset cannot be found on disk. I'm absolutely certain this is what the professor was talking about. Using a debugger to catch these errors is a terrible approach, and a habit you should unlearn as soon as possible.

A debugger is a good tool for tracking down difficult-to-diagnose problems in your code. It is not a tool that allows you to avoid writing error-handling mechanisms. (As an aside, I probably use a debugger once a month, maybe. Unit tests have largely eradicated my need to use one on a daily basis....)

#70 Shaquil   Members   -  Reputation: 815


Posted 23 September 2012 - 05:28 AM

[deleted because I'd like to just withdraw from this conversation while the withdrawin's good]

Edited by Shaquil, 23 September 2012 - 06:57 AM.


#71 swiftcoder   Senior Moderators   -  Reputation: 13602


Posted 23 September 2012 - 06:43 AM

He is really quite old. I was told he's been programming since the 50's. That's plenty of time to get set in your ways.

It's also 60 years to learn all the pitfalls kids get into, by depending on their fancy tools, instead of real software engineering.

Don't discount someone merely due to age, especially in a field as young as computer science. Some of the best programmers I know started with punch cards.

Tristam MacDonald - Software Engineer @Amazon - [swiftcoding]


#72 Shaquil   Members   -  Reputation: 815


Posted 23 September 2012 - 06:57 AM


He is really quite old. I was told he's been programming since the 50's. That's plenty of time to get set in your ways.

It's also 60 years to learn all the pitfalls kids get into, by depending on their fancy tools, instead of real software engineering.

Don't discount someone merely due to age, especially in a field as young as computer science. Some of the best programmers I know started with punch cards.


You're right.

#73 Bregma   Crossbones+   -  Reputation: 6063


Posted 23 September 2012 - 07:06 AM


He is really quite old. I was told he's been programming since the 50's. That's plenty of time to get set in your ways.

It's also 60 years to learn all the pitfalls kids get into, by depending on their fancy tools, instead of real software engineering.

Don't discount someone merely due to age, especially in a field as young as computer science. Some of the best programmers I know started with punch cards.

Some of the programmers in this very thread started with punch cards.

Believe me, I have been seeing the same arguments from the inexperienced for decades. All I have to do is wait for time to prove me right and for the next wave of the eternal September to start the cycle all over again.

Debuggers do not solve design problems.
Stephen M. Webb
Professional Free Software Developer

#74 RobTheBloke   Crossbones+   -  Reputation: 2349


Posted 23 September 2012 - 07:13 AM

Furthermore, he said that he just uses error messages to let him know whenever something goes wrong. But isn't that in itself a flawed idea to begin with? Since you only put error messages where you expect errors to occur, when a problem pops up in an unexpected place your error messages won't mean anything.

An expected error of attempting to load an asset, is that the asset cannot be found on disk. I'm absolutely certain this is what the professor was talking about. Using a debugger to catch these errors is a terrible approach, and a habit you should unlearn as soon as possible.


You're likely giving him far too much credit. Before we got to this disagreement, when I was just describing the act of making a game, I was talking about using free art because I'm not an artist, and learning to create a spritesheet, load it up, and work with it. As I talked about this, he did not react much, and had a look on his face as if he had no idea what I was getting at. At the time I could not for the life of me figure it out. Why did he act like I was talking in Japanese? Now, I think he honestly has just never done anything like that, so it was news to him. He is really quite old. I was told he's been programming since the 50's. That's plenty of time to get set in your ways.


I had a lecturer at uni who I considered at the time to be set in his ways. He was all "C++ operators, they suck!", "C++ templates are more hassle than they're worth", "Multiple inheritance in C++ is a terrible idea". At the time, I wrote him off as a miserable old fart who didn't have a clue what he was talking about. Fast forward 15 years, and these days I agree with (almost) everything he said. Now I could wax lyrical for weeks on end about the experiences I've gained over the years that have made me change my mind, but that would take weeks to explain! You simply can't condense many years' worth of experience into an hour-long lecture, but you can at least explain the conclusions you've reached along the way. So rather than assuming the guy knows nothing, you should probably try to think of the reasons why he holds the opinions he does.

Don't underestimate how valuable real-world programming experience actually is - even if that experience is from back in the 70's. Personally, I'd value the advice of someone who's been programming since the 50's over someone with no commercial programming experience at all. He may not know the intimate details of the latest language or technology, but experience of developing and maintaining commercial software will more than make up for that.

I'd be surprised if he does any programming outside of his courses, teaching kids. Which is sad, I think.

Why? Teaching programming is probably one of the hardest things I've ever done in my professional career! If you have a lecturer with a bucket tonne of real-world experience, who's prepared to devote all his time to passing that information on to *you*, I'd be extremely grateful to have a teacher like that. Some students are lumbered with lecturers who know only as much as they've just read from the course textbook. Count yourself very lucky!

#75 mhagain   Crossbones+   -  Reputation: 9604


Posted 23 September 2012 - 07:59 AM


I need to take issue with the "debugger as tool of last resort" thinking some have expounded upthread. I also use a debugger as a "prevention is better than cure" tool, so all new code is always developed and initially tested using debug builds.

Well, fine, but what is good for 'prevention is better than cure' is formal unit tests. I think all new code should be tested but don't see how that equates to all new code should be stepped through with a debugger.


This pretty much nails a problem with this thread. There are a lot of really good points being made, but there is also a certain amount of binary thinking going on; it seems to be assumed that one either uses a debugger as a replacement for writing sane and sensible code that at least reads as though it's setting out to achieve its objectives, or that one doesn't use a debugger at all (except, as I indicated, as the tool of last resort). Reality isn't like that at all (and there's more to using a debugger than just single-stepping code and inspecting call stacks on a crash).

All non-trivial software has bugs. It's nice to be able to say that one should aim to write good correct code from the outset, but that doesn't mean that the code will be bug free. Using a debugger is not a replacement for writing good correct code; using a debugger is not a replacement for writing unit tests with complete coverage; using a debugger is a supplement to these. You write good correct code, you write good unit tests, and you use a debugger, because no matter how good everything else is, you still have bugs and you'd like to be able to find and fix the real doozies before you ship.

This isn't an either/or scenario.

<snip>

You're nuts. Remind me to never buy any software you're involved in writing.

It appears that the gentleman thought C++ was extremely difficult and he was overjoyed that the machine was absorbing it; he understood that good C++ is difficult but the best C++ is well-nigh unintelligible.


#76 Hodgman   Moderators   -  Reputation: 40050


Posted 23 September 2012 - 08:06 AM

Debuggers do not solve design problems

Who's claiming this?

#77 phil_t   Crossbones+   -  Reputation: 5698


Posted 23 September 2012 - 10:28 AM

There are a lot of strange comments on this thread, and people making arguments against points that no one has made. Do people really aim to write sloppy code and rely on debuggers to fix it? Maybe they do, but it seems hard for me to imagine!


Effective use of a debugger is one of the most important skills for a developer to have. I would look for this if I were hiring someone, as I know it would help them make efficient use of their development time.

#78 Shaquil   Members   -  Reputation: 815


Posted 23 September 2012 - 10:47 AM

There are a lot of strange comments on this thread, and people making arguments against points that no one has made. Do people really aim to write sloppy code and rely on debuggers to fix it? Maybe they do, but it seems hard for me to imagine!


I've been thinking that this whole time, but I didn't want to spend the time to figure out how to put it as simply as you did. Thanks for posting that.

Edited by Shaquil, 23 September 2012 - 10:47 AM.


#79 mhagain   Crossbones+   -  Reputation: 9604


Posted 23 September 2012 - 10:57 AM

Do people really aim to write sloppy code and rely on debuggers to fix it? Maybe they do, but it seems hard for me to imagine!


I don't think that's what anyone who's speaking in favour of using a debugger is claiming, but it does come across as though some of those who are speaking against it are claiming that use of a debugger can lead you to that. Which is a decidedly odd viewpoint.

It appears that the gentleman thought C++ was extremely difficult and he was overjoyed that the machine was absorbing it; he understood that good C++ is difficult but the best C++ is well-nigh unintelligible.


#80 Oolala   Members   -  Reputation: 996


Posted 23 September 2012 - 01:10 PM

I'd even argue that debuggers make for better, more readable, more straightforward code in the end. I'll give an example:

I do a lot of work on a system for which using a debugger is impossible. If you compare my code there to my code on my normal projects, the number of lines per unit of functionality is roughly 40% lower on my normal projects [where I have a debugger]. That extra 40% of 'waste' comes from the fact that tracking down bugs in a project with roughly 250K lines of code is very difficult, so I spend a fair bit of extra time up-front putting in assertions for everything and anything I can think of that *might* happen, on every layer of the program, for every function.

I assert every pointer before dereferencing it. I assert every map & set before I reference, remove, or add an object, to make sure that it does/doesn't exist as appropriate. I assert every array and vector bound. I assign default values to all stack variables, class members, etc. I've timed that as much as 60% of my cycles end up going to assertions. I check things at every layer of abstraction, and I re-check things that were passed by reference to functions. I wrote custom memory allocators to catch dereferences of dangling pointers. The moment something goes wrong, I know it.
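That style reads roughly like the following fragment. The entity registry and function are invented for illustration; this is not the actual code being described, just a sketch of the assert-everything approach.

```cpp
#include <cassert>
#include <map>
#include <string>

struct Entity { int hp = 0; };

// Every pointer, bound, and container membership is asserted before
// use, so a violated assumption fires at the point of the mistake
// rather than as a crash much later.
int damage_entity(std::map<std::string, Entity*>& registry,
                  const std::string& name, int amount) {
    assert(amount >= 0 && "assumption: damage is non-negative");
    auto it = registry.find(name);
    assert(it != registry.end() && "entity must already be registered");
    Entity* e = it->second;
    assert(e != nullptr && "registry must not hold null entries");
    assert(e->hp >= 0 && "invariant: hp never goes negative");
    e->hp = e->hp > amount ? e->hp - amount : 0;
    return e->hp;
}
```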

Reading code like this, though? It makes such a mess of everything. All of it is unnecessary if my IDE breaks on accessing an uninitialized variable, dereferencing an already-deallocated pointer, indexing an array by -1, or whatever. All of it is waste. It easily doubles or even triples the time it takes me to write code, and it is radically more difficult to work out what is even going on by reading it. So much harder, in fact, that I've written a program that goes through my source files and strips out all the error checking for when I want to show the code to someone else or read through what I've done.

All of this should be automated. In fact, it IS automated... by this thing called a debugger.






