# Enumeration vs #define


## Recommended Posts

So as I was reading C++ Primer, as I always do these days, I ran across the enumerated type, and I've also known about #define for a while. I want to ask: what is the difference between the two? You can't change either, as they are both constant as I understand it, and each one assigns a name to a specific number, but do they have the same functionality? As usual, I'm sure the only difference shows up when you get into some really deep stuff and take advantage of the most advanced functionality, but I am still very curious. Thanks, Dbproguy

##### Share on other sites
#define is a text replacement, and enum is a type. Or, in other words, #define is dumb, and enum is smart.

Since a #define is only a text replacement, there is no checking for correctness. Enums add some error checking. For example, if you tell the compiler that a function takes a particular enum (even though the enum is internally just a number, too), it will not accept any random number, but only a constant that really is an enumerator of that type. C++0x also has a new "enum class" type which is even more strict.

##### Share on other sites
Maybe an example will help...

If you #define SOME_VALUE 2<<24-1 and use SOME_VALUE a few times later in your source, the preprocessor will just obtusely copy "2<<24-1" over every occurrence of SOME_VALUE. If there is a mistake in your macro definition, or if operator precedence interacts badly with the surrounding expression (this happens often when people forget parentheses around macros), it is very possible that this expands to complete nonsense. It might give compiler errors that you don't understand, or it might compile fine but not work properly.

Imagine for example:
```cpp
if(a > SOME_VALUE && a < SOME_VALUE+1) {...}
```
What does this evaluate to? Not immediately obvious... though it incidentially works.

On the other hand, if you write `enum blah { some_value = 2<<24-1 };`, then the compiler knows that some_value is a compile-time constant whose numeric value it will determine during compilation (or, if there is an error, it will complain). It will also assign a type to it. Why is assigning a type useful?

Imagine this:
```cpp
enum color { red, yellow, green };
...
void TrafficLight::Set(color to) { m_color = to; }
...
TrafficLight t;
t.Set(green);        // works fine
t.Set(53);           // does not compile; what's 53 supposed to be anyway?
#define blue 4
t.Set(blue);         // does not compile either
t.Set(green+yellow); // does not compile either
```

The compiler knows that it may only accept certain valid names for this function.

##### Share on other sites
This is safe:
```cpp
enum { constant = 10 + 3 };
```

This is not:
```cpp
#define constant 10 + 3
```

Redefining an enum will cause an error; redefining a macro, only a simple warning.

In Visual Studio at least, during debugging you can inspect the value of an enum, as opposed to that of a #define, which no longer exists after preprocessing.

##### Share on other sites
Constants made via #define are simple text substitutions, and have no type per se.
Enums, on the other hand, are typesafe (every enum defines a new type), and the compiler can proofread switch statements for unhandled values. AFAIK, they also optimize better than const ints. So...
```cpp
typedef enum {
    value1,
    value2,
    value3
} some_enum;

typedef enum {
    valueA,
    valueB,
    valueC
} some_other_enum;

typedef enum {
    valueArnie,
    valueBertha,
    valueC // Compiler generates an error here: we already used valueC in another enum
           // #defines would have created a subtle bug here
} some_conflicting_enum;

void some_func(some_enum foo) {
    switch(foo) {
        case value1: /*blah*/ break;
        case value2: /*more blah*/ break;
        // Compiler generates a warning here: value3 not used in switch statement
        // use a default statement to silence it
    }
}

void some_bad_func(some_other_enum bar) {
    some_func(bar); // Compiler generates a bona fide error
}
```

Also, one can have anonymous enums like this (I'll sneak my own question in here if you don't mind):
```cpp
class MyClass {
private:
    enum { state_uninit, state_ok, state_error } state;
    /*....*/
public:
    MyClass() {} // state holds which value now?
};

/* compare and contrast */
#define STATE_UNINIT 0
#define STATE_OK 1
#define STATE_ERROR 2

class MyOtherClass {
private:
    int state;
    /*....*/
public:
    MyOtherClass() { state = 4; } // Compiler accepts this without complaining. Bad Compiler!
};
```

Unless you brutally abuse casts, enum variables are constrained in the values they can contain.

##### Share on other sites
Quote:
Original post by RandomBystander: AFAIK, they also optimize better than const ints.

What makes you think that?

Quote:
Original post by RandomBystander: Also, one can have anonymous enums like this (I'll sneak my own question in here if you don't mind): *** Source Snippet Removed ***

If your sneaky question is which value an uninitialized enum has, it's simple to answer: any, very likely an invalid one.

An enum is a primitive type and needs to be initialized explicitly.

##### Share on other sites
The other difference, which I think is good design, is that enums can be scoped to (belong to) classes, giving more meaning to the data you are abstracting.

##### Share on other sites
You were getting to the point, but it goes deeper than that.

A #define, used as a constant, is roughly equivalent to a C or C++ 'const' variable except, of course, that it is not typed, that it is inserted through text replacement, and that its address cannot be taken via the & operator (because a #define never actually exists in memory at runtime).

An enumeration defines a category of related constants, which together constitute a new, distinct type. This means that type safety applies in full, and consequently that enumeration members from other enums and literal values cannot be assigned to the enum type, even when the constant values they represent may be identical to those in the target enumeration.

As an example, imagine that you wanted to represent the state of matter -- of which there are 4 generally accepted states: Solid, Liquid, Gas and Plasma.

With #define, you would do "#define SOLID 0", "#define LIQUID 1" and so on. Any function taking a "matter state" would really be taking some type that is readily converted from an integer, and nothing prevents the programmer from passing SOLID to any such function, even when it has nothing to do with these "matter states" -- in other words, you could very well multiply a Matrix by PLASMA, and the compiler is happy to make that happen, even though it makes no sense at all. Just as bad, nothing prevents your functions that deal with the state of matter from taking any random value: what happens when one of those functions tries to deal with -1, or 17 (both of which correspond to states of matter as yet undiscovered by science [grin])? And the final kicker is that you can't use the string "PLASMA" anywhere in your code, or it'll be replaced by "3" -- which is why such macros are traditionally in all-caps to begin with; it's simply a convention that helps avoid this shortcoming of the dumb text-replacement feature that is #define.

With enum, you would do (in C++):

```cpp
enum eMatterState {
    SOLID,
    LIQUID,
    GAS,
    PLASMA
};
```

In C, you'd wrap that up in an appropriate typedef, IIRC. Anyhow, now your functions dealing with the state of matter can take a parameter of type "eMatterState", so they can only take one of SOLID, LIQUID, GAS or PLASMA -- there is no way to pass -1 or 17, or even 0, 1, 2, or 3, without going through the eMatterState type. Also, no more multiplying that Matrix by PLASMA, unless you specifically define such an operation taking eMatterState as its parameter.

In short, enums are a way to define a new type which represents a set of acceptable values (which may be contiguous or sparse) that the compiler uses to enforce type safety in much the same way as classes do in C++ -- enumerations can be thought of as a "class" of related integer constants. This helps keep you, the programmer, from making silly mistakes by allowing the compiler to enforce not just type correctness, but logical correctness as well.

As someone else mentioned, since enums are primitive types, they are not initialized automatically, so you should always initialize them yourself. An uninitialized enum is the only way an enum variable can hold a value outside its set of acceptable values -- well, aside from abusing typecasts.

#define should almost always be avoided, and should only be used in cases where its "feature" of not being type-safe is actually useful. These cases are few and far between, and you should stick to const for stand-alone constant values, and enum for related constant values, whenever possible. Until you can understand clearly when such a use of #define is necessary, macros typically pose more risk than they are worth in non-trivial use cases.

[Edited by - Ravyne on November 5, 2009 5:25:21 PM]

##### Share on other sites
Quote:
Original post by Rattenhirn
Quote:
Original post by RandomBystander: AFAIK, they also optimize better than const ints.

What makes you think that?

It depends on the situation.

In general, the compiler will, most of the time, simply inline a const int value as if it were a literal int; however, it is free not to do so. This is because a const int may need to reside in memory, since its address can be taken via the '&' operator (though I wonder if the compiler is free to do away with that if its address is never taken).

A member of an enum, on the other hand, is not a variable, so the compiler can assume that its address cannot be taken, and therefore it need not reside in memory.

Now, it gets more interesting here because what the compiler can do becomes hardware dependent. Take the ARM architecture, for instance, which supports, IIRC, literals of only up to 5 bits. If your enum member is within the range 0 to 31, or -16 to 15, then it may be inserted inline, but if it falls outside this range (possibly even if any single member of the whole enum falls outside this range) then it must be loaded from memory, because ARM doesn't, as far as I know, have any other means to support literals of a larger range.

I'm inclined to agree that an enum allows the compiler more room to make optimization choices (to the extent that the underlying platform allows) versus, say, const ints. In general, the more contextual information the construct supplies, the better the compiler *may* be able to do -- enum supplies more context than const int, if ever so little. That said, the difference this makes in practice, even under the most favorable conditions, is probably negligible. I certainly wouldn't go about changing every 'const int' in your program to single-member enums [smile]

##### Share on other sites
Quote:
Original post by Ravyne: In general, the compiler will, most of the time, simply inline a const int value as if it were a literal int, however, it is free not to do so. This is because a const int must reside in memory because its address can be taken via the '&' operator (though I wonder if the compiler is free to do away with that if its address is never taken).

Yes, it is free to do that, and most do, since it's fairly trivial -- although I've seen some that don't, even though the architecture didn't need literal pools.
