Is using a debugger lazy?

83 comments, last by SuperVGA 11 years, 7 months ago

[quote name='Shaquil' timestamp='1348399720' post='4982881']He is really quite old. I was told he's been programming since the 50's. That's plenty of time to get set in your ways.[/quote]

It's also 60 years to learn all the pitfalls kids get into, by depending on their fancy tools, instead of real software engineering.

Don't discount someone merely due to age, especially in a field as young as computer science. Some of the best programmers I know started with punch cards.

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]


[quote name='Shaquil' timestamp='1348399720' post='4982881']
He is really quite old. I was told he's been programming since the 50's. That's plenty of time to get set in your ways.

It's also 60 years to learn all the pitfalls kids get into, by depending on their fancy tools, instead of real software engineering.

Don't discount someone merely due to age, especially in a field as young as computer science. Some of the best programmers I know started with punch cards.
[/quote]

You're right.

[quote name='Shaquil' timestamp='1348399720' post='4982881']
He is really quite old. I was told he's been programming since the 50's. That's plenty of time to get set in your ways.

It's also 60 years to learn all the pitfalls kids get into, by depending on their fancy tools, instead of real software engineering.

Don't discount someone merely due to age, especially in a field as young as computer science. Some of the best programmers I know started with punch cards.
[/quote]
Some of the programmers in this very thread started with punch cards.

Believe me, I have been seeing the same arguments from the inexperienced for decades. All I have to do is wait for time to prove me right and the next wave in the eternal September to start the cycle all over again.

Debuggers do not solve design problems.

Stephen M. Webb
Professional Free Software Developer


[quote name='Shaquil' timestamp='1348399720' post='4982881']
[quote name='RobTheBloke' timestamp='1348398422' post='4982876']
[quote name='Shaquil' timestamp='1347992316' post='4981329']Furthermore, he said that he just uses error messages to let him know whenever something goes wrong. But isn't that in itself a flawed idea to begin with? Since you only put error messages where you expect errors to occur, when a problem pops up in an unexpected place your error messages won't mean anything.[/quote]

An expected error when attempting to load an asset is that the asset cannot be found on disk. I'm absolutely certain this is what the professor was talking about. Using a debugger to catch these errors is a terrible approach, and a habit you should unlearn as soon as possible.
[/quote]

You're likely giving him far too much credit. Before we got to this disagreement, and I was just describing the act of making a game, I was talking about using free art because I'm not an artist, and learning to create a spritesheet and load it up and work with it. As I talked about this, he did not react much, and had a look on his face as if he had no idea what I was getting at. At the time I could not for the life of me figure it out. Why did he act like I was talking in Japanese? Now, I think he honestly has just never done anything like that, so it was news to him. He is really quite old. I was told he's been programming since the 50's. That's plenty of time to get set in your ways.
[/quote]
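To put the "expected error" point in concrete terms, something like the sketch below is all that's meant (loadTexture and the file path are made up for illustration): the missing asset is reported with a clear error message and handled where it happens, no debugger required.

[code]
#include <cstdio>
#include <fstream>
#include <string>

// Hypothetical loader: an "expected" failure (asset missing from disk) is
// reported with a clear message at the point of failure and handled
// gracefully, rather than left for a debugger session to uncover.
bool loadTexture(const std::string& path)
{
    std::ifstream file(path, std::ios::binary);
    if (!file)
    {
        std::fprintf(stderr, "loadTexture: could not open '%s'\n", path.c_str());
        return false;   // caller decides how to recover (fallback asset, etc.)
    }

    // ... parse the image data here ...
    return true;
}

int main()
{
    if (!loadTexture("assets/player.png"))
        return 1;   // the log already says exactly what went wrong
    return 0;
}
[/code]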

I had a lecturer at uni who I considered at the time to be set in his ways. He was all "C++ operators, they suck!", "C++ templates are more hassle than they're worth", "Multiple inheritance in C++ is a terrible idea". At the time, I wrote him off as a miserable old fart who didn't have a clue what he was talking about. Fast forward 15 years, and these days I agree with (almost) everything he said. Now, I could wax lyrical for weeks on end about the experiences I've gained over the years that have made me change my mind, but that would take far too long to explain! You simply can't condense many years' worth of experience into an hour-long lecture, but you can at least explain the conclusions that you've reached along the way. So rather than assuming the guy knows nothing, you should probably try to think of the reasons why he holds the opinions he does.

Don't underestimate how valuable real-world programming experience actually is - even if that experience was from back in the 70's. Personally, I'd value the advice of someone who's been programming since the 50's over someone with no commercial programming experience at all. He may not know the intimate details of the latest language or technology, but experience of developing and maintaining commercial software will more than make up for that.

[quote]I'd be surprised if he does any programming outside of his courses, teaching kids. Which is sad, I think.[/quote]
Why? Teaching programming is probably one of the hardest things I've ever done in my professional career! If you have a lecturer with a tonne of real-world experience, who's prepared to devote all his time to passing that knowledge on to *you*, you should be extremely grateful to have a teacher like that. Some students are lumbered with lecturers who know only as much as they've just read from the course textbook. Count yourself very lucky!

[quote]
[quote name='mhagain' timestamp='1348230912' post='4982347']I need to take issue with the "debugger as tool of last resort" thinking some have expounded upthread. I also use a debugger as a "prevention is better than cure" tool, so all new code is always developed and initially tested using debug builds.[/quote]

Well, fine, but what is good for 'prevention is better than cure' is formal unit tests. I think all new code should be tested, but I don't see how that equates to stepping through all new code with a debugger.
[/quote]

This pretty much nails a problem with this thread. There are a lot of really good points being made, but there is also a certain amount of binary thinking going on; it seems to be the case that one either uses a debugger as a replacement for writing sane and sensible code that at least reads as though it's setting out to achieve its objectives, or that one doesn't use a debugger at all (except, as I indicated, as the tool of last resort). Reality isn't like that at all (and there's more to using a debugger than just single-stepping code and inspecting call stacks on a crash).

All non-trivial software has bugs. It's nice to be able to say that one should aim to write good correct code from the outset, but that doesn't mean that the code will be bug free. Using a debugger is not a replacement for writing good correct code; using a debugger is not a replacement for writing unit tests with complete coverage; using a debugger is a supplement to these. You write good correct code, you write good unit tests, and you use a debugger, because no matter how good everything else is, you still have bugs and you'd like to be able to find and fix the real doozies before you ship.

This isn't an either/or scenario.
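For concreteness, the "formal unit tests" mentioned in the quote above can be as small as the sketch below (clampToByte is an invented function, and plain assert stands in for a real test framework): it runs on every build, so a regression is caught up front rather than hunted down later in a debugger.

[code]
#include <cassert>

// Hypothetical function under test.
int clampToByte(int value)
{
    if (value < 0)   return 0;
    if (value > 255) return 255;
    return value;
}

// A tiny stand-in for a formal unit test: run it on every build so a
// regression is caught immediately, before a debugger is ever needed.
int main()
{
    assert(clampToByte(-5)  == 0);
    assert(clampToByte(300) == 255);
    assert(clampToByte(42)  == 42);
    return 0;
}
[/code]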


<snip>

You're nuts. Remind me to never buy any software you're involved in writing.

Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.


[quote]Debuggers do not solve design problems[/quote]
Who's claiming this?
There are a lot of strange comments on this thread, and people making arguments against points that no one has made. Do people really aim to write sloppy code and rely on debuggers to fix it? Maybe they do, but it seems hard for me to imagine!


Effective use of a debugger is one of the most important skills for a developer to have. I would look for this if I were hiring someone, as I know it would help them make efficient use of their development time.

[quote]There are a lot of strange comments on this thread, and people making arguments against points that no one has made. Do people really aim to write sloppy code and rely on debuggers to fix it? Maybe they do, but it seems hard for me to imagine![/quote]


I've been thinking that this whole time, but I didn't want to spend the time to figure out how to put it as simply as you did. Thanks for posting that.

[quote]Do people really aim to write sloppy code and rely on debuggers to fix it? Maybe they do, but it seems hard for me to imagine![/quote]


I don't think that's what anyone who's speaking in favour of using a debugger is claiming, but it does come across as though some of those who are speaking against it are claiming that use of a debugger can lead you to that. Which is a decidedly odd viewpoint.

Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.

I'd even argue that debuggers make for better, more readable, more straightforward code in the end. I'll give an example:

I do a lot of work on a system for which using a debugger is impossible. If you look at my code there as compared to my code on my normal projects, the number of lines per unit of functionality is roughly 40% lower on my normal projects [where I have a debugger]. This extra 40% of 'waste' comes from the fact that tracking bugs down on a project with roughly 250K lines of code is very difficult, so I spend a fair bit of extra time up-front putting assertions on everything and anything that I can think of that *might* go wrong, on every layer of the program, in every function.

I assert every pointer before dereferencing it. I assert every map and set before I reference, remove, or add an object, to make sure that it does (or doesn't) already exist. I assert every array and vector bound. I assign default values to all stack variables, class members, etc. I've timed it: as much as 60% of my cycles end up going to assertions. I check things at every layer of abstraction, and I re-check things that were passed by reference to functions. I wrote custom memory allocators to catch dereferences of dangling pointers. The moment something goes wrong, I know it.
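To make that concrete, here is a small sketch of what that assert-everything style looks like (all names are invented for illustration):

[code]
#include <cassert>
#include <map>
#include <string>

struct Entity { int health = 0; };  // default-initialise every member

// Illustrative only: the assertion-heavy style described above, where every
// precondition is checked by hand because no debugger is available to break
// at the faulting line.
void damageEntity(std::map<std::string, Entity*>& entities,
                  const std::string& name, int amount)
{
    assert(!name.empty());                 // argument sanity
    assert(amount >= 0);                   // range check

    auto it = entities.find(name);
    assert(it != entities.end());          // the entity must already exist
    assert(it->second != nullptr);         // pointer checked before dereference

    it->second->health -= amount;
    assert(it->second->health > -1000000); // paranoia: catch wild values early
}

int main()
{
    Entity player;
    std::map<std::string, Entity*> entities{{"player", &player}};
    damageEntity(entities, "player", 10);
    return 0;
}
[/code]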

Reading code like this though? It makes such a mess of everything. All of this is unnecessary if my IDE breaks on accessing an uninitialized variable, dereferencing an already de-allocated pointer, indexing an array by -1, or whatever. All of this is waste. It easily doubles or even triples the time it takes me to write code, and it is radically more difficult to find out what is even going on by reading it. So much harder in fact that I've written a program that goes through my source files and picks out all my error checking for when I want to show stuff to someone else or read through what I've done.

All of this should be automated. In fact, it IS automated... by this thing called a debugger.
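For contrast, a sketch of how the same operation might be written on a project where a debugger and debug-runtime checks are available (again, invented names): the hand-written checks disappear, because a missing key or a bad pointer faults at the offending line and the debugger breaks right there.

[code]
#include <map>
#include <string>

struct Entity { int health = 0; };

// With a debugger available, the code states only its intent. std::map::at
// throws on a missing key, and a null or dangling pointer faults at this
// line, where the debugger will break; no hand-written assertions needed.
void damageEntity(std::map<std::string, Entity*>& entities,
                  const std::string& name, int amount)
{
    entities.at(name)->health -= amount;
}
[/code]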

This topic is closed to new replies.
