Help me strengthen this argument?



Not sure if this goes here or in the Lounge, or just what.
Quote:
 From a recent online conversation with another programmer who's new to C++:

 Him: Why would I want to use a const int instead of a #define?
 Me: Because it has a specific meaning, and will be interpreted by the compiler (which understands your code) rather than by the preprocessor (which is a dumb text-substitution engine).
 Him: Performance impact?
 Me: None either way.
 Him: OK, I'll stick with defines. Less typing methinks >_< and no pesky =
Any ideas how I can be more convincing about proper style? :s

Share on other sites
The only problem is that practical differences aren't really all that large, at least not until you start using namespaces. Maybe you should just let him continue using defines until he discovers the disadvantages for himself.

Personally I tend to use enums for integers by the way.

Share on other sites
Well, const obeys scoping and has less confusing errors. Like if you have the following code, the error may be a bit confusing at first:
#define test 3
// ...
int test = 10;

Share on other sites
Quote:
 Original post by doynaxThe only problem is that practical differences aren't really all that large, at least not until you start using namespaces.

Scoping doesn't just apply to namespaces. Consider the following code, which will work with a const int, but not a macro:
#define test 1
// ...
void fn() {
    int test = 23;
    // ...
}

EDIT: On a completely unrelated note, this is my 800th post and I've been registered here for exactly one year.

Share on other sites
Because one day his code will get tromped by the preprocessor and he will spend hours trying to figure out what the error is. Ask him what he would suspect as the error if his compiler gave him this error:
Error E2108 preprocessor.cpp 9: Improper use of typedef 'PROCESSOR_INFO' in function main()
Error E2379 preprocessor.cpp 9: Statement missing ; in function main()
*** 2 errors in Compile ***

Or
preprocessor.cpp: In function `int main()':
preprocessor.cpp:9: error: syntax error before numeric constant

Or
preprocessor.cpp(9) : error C2143: syntax error : missing ';' before 'constant'

for this line:
PROCESS_INFORMATION pi;
Enigma

Share on other sites
Quote:
Original post by Roboguy
Quote:
 Original post by doynaxThe only problem is that practical differences aren't really all that large, at least not until you start using namespaces.

Scoping doesn't just apply to namespaces.

I'm aware of that, but it simply isn't a problem he's likely to encounter until he starts using namespaces (or starts hiding constants in classes).

Share on other sites
To expand on your argument, const is subject to type-safety and the standard implicit conversion rules. #defines aren't necessarily bad; in some cases they have been the best choice for the problem I was working on, but in most cases const is 'better.' Also, as noted, #defines tend to introduce bugs that are unapparent at first glance, the kind that take hours or even days to find, and when you finally do find them, you kick yourself over how simple they are -- then you get to explain to your boss why you wasted all that time simply because you forgot that 'test' was #define'd as 3, as in Roboguy's example.

Share on other sites
#defines are probably the most benign case. If your friend starts thinking about inlining things with preprocessor macros, however, that's a terrible idea. It's best to nip this kind of thinking in the bud ASAP. For instance, consider this example of evilness:

// Return absolute value of k with macro (or does it?).
#define badAbsoluteValue(k)   ( (k) >= 0 ? (k) : -(k) )

// Return absolute value of k with inline function.
inline int safeAbsoluteValue(int k)
{
  return k >= 0 ? k : -k;
}

int func();

void codeBlock(int x)
{
  int result;
  result = badAbsoluteValue(x++);     // Spurious results; x incremented more than once
  result = badAbsoluteValue(func());  // Spurious results; func called more than once

  result = safeAbsoluteValue(x++);    // Correct results; x is incremented exactly once
  result = safeAbsoluteValue(func()); // Correct results; func is called exactly once
}

Share on other sites
Well, you can always try throwing in a few non-technical reasons (roughly in order of believability).

1a: Peer Pressure) All the cool C++ programmers use const int.
1b: Celebrity Appeal) Bjarne Stroustrup uses const int.
1c: Not Acting Like a Putz) Do you want to be similar in any way to these guys?

2: Employability) Technical interviewers will look out for that kind of thing and you'll lose points for using #define for constants.

3: Annoyance Factor) Using const ints won't piss you off when you accidentally tack an extra semicolon onto the end of a #define.

4a: Fear of Bodily Harm 1) You can tell him that I'd break his kneecaps if I caught him using #defines for constants in one of my projects.
4b: Fear of Bodily Harm 2) When he goes and adds a #define that someone else is using as an enum value name and it breaks their code, they may just decide that hanging is the only appropriate response.

5: Blood Pressure) const ints are lower in saturated fats than deep fried pork rinds.

6: Taint of Evil) Animals react poorly to people who use #define for constants. Especially cats. They can tell.

I guess a few of those actually sound like valid technical reasons, so you should probably just start off with number six.
