# Apparently #define is not C++

This topic is 4815 days old which is more than the 365 day threshold we allow for new replies. Please post a new topic.

## Recommended Posts

I've had only two lectures this semester at uni and I've already learned a lot of C++ stuff despite having done it for a year and a half. One of the things being: in C you would write: #define NUMBER 100 — apparently, if you're strict about standards, in C++ you would write: const unsigned int NUMBER = 100; Anyone do the latter? ace

##### Share on other sites
You could also write:

enum { NUMBER = 100 };

yep :)

##### Share on other sites
I generally try to avoid defines and stick to const, as you get type-checking and they're easier to debug.

##### Share on other sites
Yup. The thing with #define is that it happens before compilation. For numbers it is always better to have strong typing, but the preprocessor can come in useful. For example, "assert" simply doesn't appear in release builds, since in release it is defined to nothing (which is faster than an if statement...).

##### Share on other sites
The compiler can also [potentially] perform better optimizations with "const" than with "#define", because #define is handled by the preprocessor and is usually converted into a literal before actual compilation begins.

Usually, the clearer and more descriptive you are to the compiler, the better the optimizations it can make.

##### Share on other sites
Yes. The first thing you must do before writing a line of C++ is to write many lines of Java. Then come back and pretend you're using Java, instead of C with OOP tacked on. Or something equivalent - the point is to avoid using constructs that were appropriate to the C language but have been superseded by safer or more robust constructs in C++. For example, avoid using arrays - prefer std::vector unless optimisation or hardware access is necessary. Avoid pointers unless to existing, stack-allocated objects. Pointers should mostly be used for polymorphism purposes, such as in containers. If you really need to use a pointer locally, consider using an auto_ptr, thus avoiding worrying about destroying the object manually along all return paths.

Now, #define is still useful, but not in its previous forms - use inlines instead of macros, and consts instead of #define constants. However, for creating meta-language or taking shortcuts on class description, templates are often inadequate, so #defines are still appropriate. You probably don't need to worry about that until you're writing templates and can see the weaknesses of the template system.

##### Share on other sites

#define is a part of C++. C++ style dictates that you should prefer consts/enums to #define. It doesn't say that #define is wrong or not standard.

...

##### Share on other sites
Quote:
Original post by antareus: Misleading subject. #define is a part of C++. C++ style dictates that you should prefer consts/enums to #define. It doesn't say that #define is wrong or not standard.

Correct. There is nothing wrong, per se, with using #defines for constants. But it is considered a bad, unidiomatic C++ style of programming (in addition to what Simon mentions about optimizations).

##### Share on other sites
I believe the major compilers (at least MSVC and GCC) should be able to do compile-time replacement of a const int with a literal and also work out various arithmetic ops at compile time. Not sure though...

##### Share on other sites
Quote:
Original post by Promit: I believe the major compilers (at least MSVC and GCC) should be able to do compile-time replacement of a const int with a literal and also work out various arithmetic ops at compile time. Not sure though...

They may, but why not just take the safe path and use const? Explicitly declaring what you really mean can only help.

Another benefit of const vs #define is that you can embed your const variables in namespaces while #defines are "global". Name collisions are a pain.

##### Share on other sites
I've checked. const and #define generate exactly the same output (the same assembly). A simple expression with either is evaluated at compile time. So I'm using const.

##### Share on other sites
Quote:
Original post by Dmytry: I've checked. const and #define generate exactly the same output (the same assembly). A simple expression with either is evaluated at compile time. So I'm using const.

[nit] Unless the programmer takes the address of that constant, in which case the compiler will be forced to set aside memory for that constant, rather than behaving like a strongly typed #define. [/nit]

##### Share on other sites
They're kind of two different things. The preprocessor is for substituting text, and variables are, well, variables. Sure, you can use the preprocessor to change the value of something, but that isn't the only thing you can do with it, and it doesn't make it evil.

Anyway, if you are going to play around with the preprocessor, it's a good idea to be polite about it. In general, give macros uppercase names so you know they're not variables; if all macros are written like that, there's also no chance they'll shadow a variable with the same name. And undefine them when you're done with them, so they can't screw up something they're not supposed to.

##### Share on other sites
#define NUMBER 100
pro: quick.
con: doesn't respect namespace scoping.

const unsigned int NUMBER = 100;
pro: respects namespaces.
con: in C you need separate declarations - one in the header, one definition in a source file - to prevent the symbol from conflicting with itself (in C++ a namespace-scope const has internal linkage, so this isn't an issue).

enum { NUMBER = 100 };
pro: quick, respects namespaces.
con: stuck with ints.

All this said, I hate ALLCAPS, and only consider it a good naming convention for #defines for the exact reason that I hate it - it draws attention to the fact that the preprocessor is a brute-force text replacement tool and can cause serious problems (especially with macros, the --/++ operators, and all that other goodness).

Our brains usually do not read every letter of a word. They read the first few, the last few, and the varying heights of the letters in between. ALLCAPS removes the height differences, which makes text more of a pain - and slower - for our brains to parse.

EVER WONDER WHY LEGAL TEXT IS OFTEN WRITTEN IN ALLCAPS? My theory is that it's a tool used by lawyers to try and prevent people from reading it - or at least slow you down enough for the salesman to have some time to distract you from it.

##### Share on other sites
I use #define in Java. In fact, you can use it in any language you want.

The preprocessor is defined in the ANSI C and C++ standards (just search Google for "ANSI C++" to find the standard).

Skizz

##### Share on other sites
Quote:
Original post by ace_lovegrove: In C you would write: #define NUMBER 100 — apparently, if you're strict about standards, in C++ you would write: const unsigned int NUMBER = 100; Anyone do the latter?

Just note, the first is preferable in C. "const" means different things in C and C++. In C++ it means "constant"; in C it means "read only".
const unsigned int n = 10;

int main(void) {
    char array[n];
    return 0;
}

array is (I believe) a VLA in C99, an array in C++, and an error in C89. To my knowledge, const is preferred to #define in C++ for two reasons:

1) #define ignores namespaces (But you should be prefixing your #define's with your namespace anyway, so this just means you can't shorten it up with a using directive)

2) With const you're more likely to get a name instead of a magic number when working with the debugger (This is a QoI issue in both C and C++)

So far as I can tell, #define constants are typesafe. I'm willing to be proven wrong, but nobody's ever offered an example.
