Quote:Original post by someboddy
(I just ignore them, like any other programmer)
I usually disable them, too. They are just annoying, and with many bugs the program will only crash every 6th or 7th run anyway, which is totally acceptable since I can tell my customers it's the operating system's fault. I hear that disabling warnings is standard practice in many industries to improve productivity, for example in the car manufacturing industry,
yay.
No, I was just kidding. No offense ;)
Real Programmers keep their warnings
always enabled, except when they have to work around a compiler bug or are entering an obfuscation contest; then they disable the warning locally. It's simply wise to enable warnings. In fact, producing a good warning is often harder for compiler writers than just emitting an error, because warnings sometimes flag code that is correct according to the standard but that points to something the programmer probably did not intend; appreciate their support! And ignoring a warning instead of fixing the underlying problem can mean that six months later you end up in weeks of debugging sessions, because the rare circumstances under which your code happened to run fine have vanished. There are thousands of reasons not to ignore them.