
# enum questions


## Recommended Posts

First, is it okay to use enums in place of plain #defines? At one time, I thought that was their only use, but lately I've seen them being used in other ways. For example, I never label my enum:

enum { SomethingONE, SomethingELSE, SomethingOTHER, SomethingCOUNT };

I know it works, but is this how most programmers create constant defines? It's a pain in the ass doing

#define SomethingONE 0
#define SomethingELSE 1
#define SomethingOTHER 2

etc, etc. My enum accomplishes the same thing, but with less typing, counting, and maintenance. Is this how everyone else does it? I have some enums with 50-plus labels. It's all crunched and removed at compile time, right?

I also wanted to know if it's possible to create negative enums, i.e. an enum that counts backwards:

enum {
SomethingONE,   // would be 0
SomethingELSE,  // would be -1
SomethingOTHER, // would be -2
SomethingCOUNT  // would be -3
};

I know I can simply type "-enumlabel", but it's less readable. Thanks for any help or advice.

##### Share on other sites
I'm not sure, but I'm pretty sure that the preprocessor is evil. It doesn't have type safety, for one thing, and it's also less readable. Plus, when you make an enum, you're making a new data type, so you can effectively say

enum Blah
{
blah1,
blah2
};

Blah something = blah1;

This way you don't have to write int or whatever; you actually know what you're putting into it. Also, if you try to put anything else into it, the compiler will let you know. Go with enums.

##### Share on other sites
I'm wanting more of a hard-coded value: places where I would insert a manually typed number, but give it a name. For example, my character animation types are defined with enums. The enum labels hold the animation-type index within an array.

enum
{
ANIM_STAND,
ANIM_WALK,
ANIM_COUNT
};
..
CAnim AnimArray[ANIM_COUNT];
..
CurrentAnim = AnimArray[ANIM_WALK];

##### Share on other sites
You can use enums like that, but since you don't actually need a new type (which is what an enum gives you), I would go with const values.

const int ANIM_STAND = 0;
const int ANIM_WALK = 1;
const int ANIM_COUNT = 2;

One of the uses for const in C++ is to replace #defines for creating named constant values.

"Life is very short, and there''s no time for fussing and fighting my friend..."

##### Share on other sites
Your use of enums is consistent with good practices. If you want to have a set of identifiers to describe state, enums are useful and, in my opinion, a very good replacement for #defines.

##### Share on other sites
Actually, yes, constants and enumerations are good practice. The only real reason to use preprocessor macros for constants is conditional compilation. Namely, let's say I have a debugging method.
#define DEBUGMODE 1

#if DEBUGMODE
#define DebugLog( x ) fprintf(stderr, x )
#else
#define DebugLog( x )
#endif
You can't use enumerations to do this. Actually, some would tell me I should be using inline functions. It's all a matter of obsolescence and succession.

##### Share on other sites
Enums are good practice, especially when you don't really need to know what the actual numeric values are (even though you *do* know, because of the C/C++ standard).

##### Share on other sites
Hmm. const int ANIM_WALK = 1; seems as troublesome as #defines. The reason I switched to enums was to avoid having to manually update the numbers. When the numbers get into the 100s, it can become a real big pain in the ass to delete the 3rd entry. I usually just replace it with a "reserved", but sometimes that's not acceptable.

Enums are not preprocessor? How exactly do they work? Do they require global memory? Or are they replaced with real hard-coded values once the program is fully compiled? I don't want my thousands of global enum labels to be using up global memory.

Hmm. When a number is manually typed into code (i = j + 99), the 99 uses a register or some other temporary storage during the calculation, correct? I want my enums to be no different. Is this not how they work? If I define a list of enums in a global header, is that any different from declaring integers? If so, I need to switch back to #defines!

Thanks for all the info

[edited by - Jiia on March 25, 2004 5:20:37 PM]

##### Share on other sites
quote:
Original post by Jiia
Enums are not preprocessor? How exactly do they work? Do they require global memory? Or are they replaced with real hard-coded values once the program is fully compiled? I don't want my thousands of global enum labels to be using up global memory.

An enum is basically just a disguised int.

##### Share on other sites
So with an example such as

enum
{
NumberFive=5,
NumberSix=6
};

#define NumberEleven (NumberFive + NumberSix)

NumberEleven actually gets processed at runtime? I see. That sucks. Well, I'm glad I know that now. I did a profile on some methods a while back where two functions performed the same task, but one operated using globals. It wasn't pretty. Globals must have to run through some hard-core chains to get processed. I've hated them ever since.

Are there any tricks to defining macros for such things? Can I avoid typing out hundreds of number values for my #defines? I would give an example of how terrible it can become, but the post would advance 3 pages.

I know there are tons of games out there that have "hard-coded" states and index values. How do they manage this? With #defines,
enums, or something else? And if #defines are usually used, do they really do the #define NAMEA 0, #define NAMEB 1 deal?

##### Share on other sites
#defines are always interpreted at "preprocessor time" - *before* compilation begins.

##### Share on other sites
Actually, Jiia, it isn't like that, and it doesn't suck.

A preprocessor item is ALWAYS replaced at preprocess time, BUT constant math (i.e. ints, floats, enums, etc.) may be (and usually is) folded at compile time ... so there is no penalty ...

these following:

#define num1 1
const int num2 = 2;
enum
{
num3 = 3
};

int x = num1 * num2 * num3 * 4;

generates the EXACT SAME runtime code as:

int x = 24;

on any decent compiler, because each value used was known at compile time and was therefore folded by the compiler.

##### Share on other sites
Well, that's good news. But it still sucks that I'm declaring global variables for no reason. They don't need to be variables at all; they never change.

Oh well. It's still better than updating hundreds of numbers manually.

quote:
Original post by Zahlman
#defines are always interpreted at "preprocessor time" - *before* compilation begins.
Well, that's fine. But the only phases that concern me are *before my program runs* and after.

[edited by - Jiia on March 26, 2004 4:37:03 AM]

##### Share on other sites
quote:
Original post by Xai: ...on any decent compiler, because each value used was known at compile time and was therefore folded by the compiler.
I'm curious about this. If the compiler knows the enum isn't going to change, why exactly is it left in after compile time at all? What use is there in keeping that number stored in some hidden memory through my entire program's running course?

##### Share on other sites
It isn't, in many cases. There is no rule that the C++ compiler has to leave the constant anywhere in memory (outside the instructions using it), UNLESS you put the extern keyword on it, or export it.