Apparently #define is not C++

I believe the major compilers (at least MSVC and GCC) should be able to do compile-time replacement of a const int with a literal and also work out various arithmetic ops at compile time. Not sure though...
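For instance, a minimal sketch (the names are made up, and this assumes a typical optimizing build):

#define WIDTH 100
const int height = 3;

int area()
{
    return WIDTH * height; // an optimizing compiler folds this to the literal 300
}

Compiling with something like gcc -O2 -S and reading the assembly is an easy way to check.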
SlimDX | Ventspace Blog | Twitter | Diverse teams make better games. I am currently hiring capable C++ engine developers in Baltimore, MD.
Quote: Original post by Promit
I believe the major compilers (at least MSVC and GCC) should be able to do compile-time replacement of a const int with a literal and also work out various arithmetic ops at compile time. Not sure though...


They may, but why not just take the safe path and use const? Explicitly declaring what you really mean can only help.

Another benefit of const vs #define is that you can embed your const variables in namespaces while #defines are "global". Name collisions are a pain.
Stay Casual, Ken
Drunken Hyena
I've checked: const and #define generate exactly the same output (the same assembly). A simple expression with either is evaluated at compile time. So I'm using const.
Quote: Original post by Dmytry
I've checked: const and #define generate exactly the same output (the same assembly). A simple expression with either is evaluated at compile time. So I'm using const.


[nit] Unless the programmer takes the address of that constant, in which case the compiler will be forced to set aside memory for that constant, rather than behaving like a strongly typed #define. [/nit]
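A minimal sketch of that case ('limit' is a made-up name):

#include <cstdio>

const int limit = 42;

int main()
{
    // taking the address forces the compiler to give 'limit' real storage
    const int *p = &limit;
    std::printf("%d lives at %p\n", *p, static_cast<const void *>(p));
    return 0;
}

With a #define'd 42 there is nothing to point at; &42 doesn't compile.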
--Michael Fawcett
They're kind of two different things. The preprocessor is for substituting text, and variables are, well, variables. Sure, you can use the preprocessor to change the value of something, but that isn't the only thing you can do with it, and it doesn't make it evil.

Anyway, if you are going to play around with the preprocessor, it's a good idea to be polite about it. In general, give macros uppercase names so you know they're not variables; if every macro is written that way, there's no chance one will clobber a variable with the same name. And undefine them when you're done with them so they can't screw up something they're not supposed to.
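A quick sketch of that etiquette (the macro name is made up):

#define MAX_RETRIES 3             // uppercase: obviously a macro, not a variable

static int delays[MAX_RETRIES];   // use it for what it's for...

#undef MAX_RETRIES                // ...then undefine it so it can't clobber anything later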
Chess is played by three people. Two people play the game; the third provides moral support for the pawns. The object of the game is to kill your opponent by flinging captured pieces at his head. Since the only piece that can be killed is a pawn, the two armies agree to meet in a pawn-infested area (or even a pawn shop) and kill as many pawns as possible in the crossfire. If the game goes on for an hour, one player may legally attempt to gouge out the other player's eyes with his King.
#define NUMBER 100
pro: quick.
con: doesn't respect namespace scoping.

const unsigned int NUMBER = 100;
pro: respects namespaces.
con: in C (though not in C++, where a namespace-scope const has internal linkage) you need a declaration in the header and a definition in one source file to keep the symbol from conflicting with itself.

enum { NUMBER = 100 };
pro: quick, respects namespaces.
con: stuck with ints.
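A side-by-side sketch of the three (identifiers are illustrative):

#define TABLE_ROWS 100                // raw text substitution, visible everywhere

namespace cfg {
    const unsigned int kRows = 100;   // typed and scoped
    enum { kCols = 100 };             // scoped, always a compile-time int, no storage
}

int table[cfg::kRows][cfg::kCols];    // all three work as array bounds in C++
int scratch[TABLE_ROWS];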

All this said, I hate allcaps, and only consider it a good naming convention for #defines for the exact reason that I hate it - it draws attention to the fact that the preprocessor is a brute-force text replacement tool and can cause serious problems (especially with macros, the --/++ operators, and all that other goodness).

Our brains usually do not read every letter of a word. They read the first few, the last few, and the varying heights of the letters in between. ALLCAPS removes the height difference, which is more of a pain - and slower - for our brains to parse.

EVER WONDER WHY LEGAL TEXT IS OFTEN WRITTEN IN ALLCAPS? My theory is that it's a tool lawyers use to try to prevent people from reading it - or at least to slow you down enough for the salesman to have some time to distract you from it.
I use #define in Java. In fact, you can use it in any language you want.

The preprocessor is defined in the ANSI C and C++ standards (just search Google for "ANSI C++" to find the standard).

Skizz
Quote: Original post by ace_lovegrove
In C you would write:

#define NUMBER 100

Apparently, if you're strict about standards, in C++ you would write:

const unsigned int NUMBER = 100;

Anyone do the latter?


Just note, the first is preferable in C. "const" means different things in C and C++. In C++ it means "constant"; in C it means "read only".
const unsigned int n = 10;

int main(void)
{
    char array[n];
    return 0;
}

array is (I believe) a VLA in C99, an array in C++, and an error in C89. To my knowledge, const is preferred to #define in C++ for two reasons:

1) #define ignores namespaces (but you should be prefixing your #defines with your namespace anyway, so this just means you can't shorten things with a using directive)

2) With const you're more likely to get a name instead of a magic number when working in the debugger (this is a quality-of-implementation issue in both C and C++)
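For the C side of that trade-off, a minimal sketch (names are made up):

#define BUF_LEN 64
enum { MAX_USERS = 8 };

static char buffer[BUF_LEN];   /* fine in C89: the macro expands to the literal 64 */
static int ids[MAX_USERS];     /* fine too: enum constants are constant expressions in C */

/* static const int len = 16; static char x[len]; would be an error in C89 */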

So far as I can tell, #define constants are typesafe. I'm willing to be proven wrong, but nobody's ever offered an example.
