Your Worst "Gotchas" ever in programming


Debuggers not being trustworthy is a big problem. This is especially true for people working on new game consoles. Whenever a fancy new devkit comes along there are always times when the debugger will outright lie to you about the state of the hardware. On new systems you will also run across occasional compiler and optimizer bugs. These can require extensive debugging at the assembly level, taking great care to check whether values have been optimized out, moved somewhere else, or are otherwise behaving in a rational but unexpected way.

It can take a lot of work, and when you finally do get the 'a-ha' moment of discovering that it is a debugger or compiler error, you then face the follow-up problem of how to work around it.

Hardware bugs. They can come from bad firmware, or from inconsistent documentation where it really isn't a bug, just an undocumented feature. They happen more often when processors get very hot, which is exactly the situation with compute-intensive games and gamers running overclocked chips. Several games, like GuildWars, take steps to detect this and notify the user when it occurs.

These also happen on early firmware revisions of game consoles. Since they are covered under NDA you don't often hear the horror stories. When the documentation says a function does something, and it doesn't do it, you can spend hours trying to figure out whether it is your code doing the wrong thing or the black box doing the wrong thing. Then you wait for the next firmware update and hope your problem is fixed.

As for the general problems, I've seen all kinds of things:

Metaprogramming is just plain painful; don't submit it to the codebase. Not only are there 'gotchas' all over the place, it fails spectacularly for data-driven development. If you need Fibonacci numbers whose inputs are only known at runtime, go ahead and compute them at runtime. If they are known ahead of time, either build a data table of pre-computed values or still just compute them at runtime; template metaprogramming buys you nothing here.
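To make the contrast concrete, here is a minimal sketch (the names and table are made up for illustration): the template version computes values at compile time but is miserable to debug, while the table and the loop are trivial to understand and work fine with data-driven input.

// The metaprogramming version: clever, but painful to debug and useless
// the moment the input has to come from data loaded at runtime.
template <int N> struct Fib    { enum { value = Fib<N - 1>::value + Fib<N - 2>::value }; };
template <>      struct Fib<0> { enum { value = 0 }; };
template <>      struct Fib<1> { enum { value = 1 }; };

// The boring alternatives: a table of pre-computed values for the small
// known cases, or a plain loop for anything decided at runtime.
static const int kFibTable[] = { 0, 1, 1, 2, 3, 5, 8, 13, 21, 34 };

int fib(int n)   // runtime version, happily accepts data-driven input
{
    int a = 0, b = 1;
    for (int i = 0; i < n; ++i) { int t = a + b; a = b; b = t; }
    return a;
}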

Badly formed macros (#define) that hide side effects and sequence-point issues, don't properly parenthesize their arguments, or, even worse, contain complete statements, can be a real source of frustration. They are not expanded in the debugger, are not type safe, and just cause extra work. I've spent many hours chasing macro bugs.
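The classic offender looks something like this (a made-up example):

#define BAD_SQUARE(x) x * x            // arguments not parenthesized
#define OK_SQUARE(x)  ((x) * (x))      // parenthesized, but still not side-effect safe

int main()
{
    int a = 2, b = 3;
    int r1 = BAD_SQUARE(a + b);  // expands to a + b * a + b == 2 + 6 + 3 == 11, not 25
    int r2 = OK_SQUARE(a + b);   // 25, as intended
    int r3 = OK_SQUARE(a++);     // expands to ((a++) * (a++)): 'a' is modified twice
                                 // with no sequence point between, undefined behaviour
    return r1 + r2 + r3;         // keep the compiler from discarding the results
}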

Nested macros, especially nested macros that build complex structures, should be avoided. While it may be nice to have a few macros that convert a long list of labels into an enum, a corresponding string table of the enum names, a corresponding automatic registration at game startup, and so on, those macros really are a bad idea. Once fully debugged and in use for several years they can be moderately useful, but the cost of making them work correctly is high.
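For reference, the pattern being described is usually some variant of the 'X macro' trick. A stripped-down sketch with hypothetical names, just to show the shape of the thing:

// The list of labels lives in one place...
#define ITEM_LIST(X) \
    X(Sword)         \
    X(Shield)        \
    X(Potion)

// ...and each expansion re-stamps it into a different form.
#define AS_ENUM(name)   kItem_##name,
#define AS_STRING(name) #name,

enum ItemType { ITEM_LIST(AS_ENUM) kItem_Count };
static const char* kItemNames[] = { ITEM_LIST(AS_STRING) };  // stays in sync with the enum

It does keep the enum and the string table in sync, but as the post says, stepping through a bug inside those expansions is nobody's idea of fun.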


Earlier today, while copying code from our old texture processing tool to the shiny new C# & C++/CLI hybrid I'd been making, I was surprised to find that previously working code to parse a DDS file was no longer working correctly.

The code was a direct copy & paste job and the DDS file was a known working one.

The debugger gave garbage for the DDS entries, yet looking at the source file in a hex editor everything checked out.

The old tool was 32-bit.

The new tool was 64-bit.

The DDS structure the code used had a void* hidden in it.

Data alignment hilarity ensued.

*sigh* That's a few hours of my life I won't get back...
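For the curious, the trap looks roughly like this (a hypothetical struct, not the real DDS header):

struct FileHeader                // read straight off disk into memory
{
    unsigned int magic;          // 4 bytes on both platforms
    void*        userData;       // 4 bytes on x86, 8 bytes on x64
    unsigned int width;          // everything after the pointer shifts on x64
    unsigned int height;
};
// sizeof(FileHeader) is typically 16 in a 32-bit build but 24 in a 64-bit
// build (pointer growth plus padding), so a file written against the 32-bit
// layout no longer lines up when the 64-bit tool reads it into this struct.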

(Aside note: if you copy data into a System.IO.MemoryStream object, make sure you seek back to position 0 again, or you can't copy that data back out because the stream is sitting at EOF. That one only took me a minute to figure out...)

Debuggers not being trustworthy is a big problem. This is especially true for people working on new game consoles.

Annoyingly it's not just new consoles; back in 2008 I was working on a PS2 game (using good old CodeWarrior!) and my team leader called me over because he was having problems with a function which seemed to be working, yet in a debug session a variable was clearly showing nowhere near the correct value.

It was near the end of the day, so I only quickly looked it over before he went home, but I could see nothing wrong in the code.

I got in the next morning around 10am to find him, our lead, and another senior programmer all staring at his monitor trying to figure out what was going on. I joined them, and once another programmer wandered over there were five of us looking at it, all puzzled.

After a while they started looking at it at the assembly level, at which point I took myself off to read the PS2 assembly docs (I didn't know the instruction set at that point) before returning to the huddle of programmers and looking over the code.

After about ten minutes and some heavy thinking I declared I had figured it out: the compiler, despite being in debug mode, had decided to optimise away the assignment to the variable, keeping the value in a register to pass directly to the next function call. The debugger, on the other hand, was blissfully unaware of this and carried on regardless.
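In source terms the situation was something like this (reconstructed as a sketch, not the actual game code):

#include <cstdio>

int  ComputeDamage()         { return 42; }
void ApplyDamage(int amount) { std::printf("applying %d\n", amount); }

int main()
{
    int damage = ComputeDamage();  // the compiler may keep this purely in a register...
    ApplyDamage(damage);           // ...and feed that register straight to the call
    return 0;
}
// If the stack slot for 'damage' is never written, a debugger that watches
// the stack shows stale garbage even though the program behaves correctly.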

Net result: the panic disappeared and my standing went up significantly in the eyes of our lead (at this point I'd only been at the company about a month), so much so that he a) trusted me with some large refinements to our scripting system and b) often requested me on projects after that, as he knew I knew my stuff :D

In fact, that's not a bad tip for new people: do something to get yourself noticed when you join somewhere. Raising your stock early on is a very good move :)

Worst for me would mean the most obvious one:

Missing: }

Just a quick addition, since I have little time.

Symptom code:

int i = 0;

printf( "i = %d", i );

Output:

"i = 1"

It took four senior coders to work out what was going wrong here. The issue was a buffer overflow corrupting memory, so the variable no longer held the value it had just been assigned. That was a fun Monday morning.
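A minimal sketch of the failure mode (where the stray bytes land depends entirely on how the compiler laid out the stack frame, so treat this as illustrative):

#include <cstdio>
#include <cstring>

int main()
{
    char buffer[8];
    int  i = 0;

    // Writes well past the end of 'buffer'; depending on the frame layout,
    // the extra bytes can land right on top of 'i'.
    strcpy(buffer, "definitely more than eight bytes");

    printf("i = %d\n", i);  // undefined behaviour: may print anything at all
    return 0;
}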

Saving the world, one semi-colon at a time.

A semi-colon at the end of an if statement.

Why are you ignoring the state of the flag?? Why do you keep going into that code block? WHY??

Maybe the code isn't synced with the debugger! I'll rebuild. Nope, same problem. Stupid compiler/debugger/toolchain!
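For anyone who hasn't been bitten yet, the bug looks like this (minimal sketch):

#include <cstdio>

int main()
{
    bool flag = false;

    if (flag);    // stray semicolon: the 'if' now controls an empty statement
    {
        std::printf("runs anyway\n");  // just a bare block scope, so it always executes
    }
    return 0;
}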

if you think programming is like sex, you probably haven't done much of either. - capn_midnight

Pretty much anything I've done involving C++ has been riddled with gotchas. I hope I never have to use that language again.

This took me back about 4 years, to when I was working on my capstone uni project. A friend and I were getting linker errors, something along the lines of "somevariable already defined in somefile.obj". There were no compilation errors, and all our headers were #ifndef guarded, so we were scratching our heads for nearly four days.

The obvious thing, in retrospect, is that #ifndef guards only stop a header from being included twice in the same source file. If you (as we did) put an actual definition in a header, say an int someCount; that is not static, then every .cpp file that includes it creates its own copy of the variable, and at link time the duplicate definitions collide, giving an error like the one above.

Bottom line: don't put anything except declarations in your headers.
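A sketch of both the bug and the fix, reusing the someCount name from the post:

// counters.h -- the guard stops double inclusion within ONE .cpp file,
// but every .cpp that includes the header still compiles its own copy.
#ifndef COUNTERS_H
#define COUNTERS_H

int someCount;          // BAD: a definition; two .cpp files including this
                        // header means two definitions and a linker error

extern int otherCount;  // OK: a declaration only; the linker expects exactly
                        // one definition to exist somewhere

#endif // COUNTERS_H

// counters.cpp -- the single definition lives in exactly one source file:
// #include "counters.h"
// int otherCount = 0;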

This happened to me as well; I was banging my head on the desk after looking over my header guards for the n'th time, and then finally discovered what was going on with Google's help (I was touching C++ for the first time). It was especially puzzling since I didn't have variables in the header, but utility functions. Doing that works for class methods, since member functions defined inside the class declaration are implicitly inline, but not for free functions. Took me something like three or four days to figure out as well.
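The function version of the same trap, with the usual fix (a sketch with made-up names):

// util.h
#ifndef UTIL_H
#define UTIL_H

int twice(int x) { return 2 * x; }          // BAD: defined in every .cpp that includes this

inline int thrice(int x) { return 3 * x; }  // OK: 'inline' permits identical definitions
                                            // across translation units
#endif // UTIL_H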

“If I understand the standard right it is legal and safe to do this but the resulting value could be anything.”

In PHP, when I realized 0 == "pizza" is true: the loose == operator coerces the string to a number, "pizza" converts to 0, and the comparison succeeds.

In PHP, when I realized 0 == "pizza".

Is this the sort of thing that makes people say "TRWTF is PHP?"

