enum questions

First, is it okay to use enums in place of plain #defines? At one time, I thought that was their only use, but lately I've seen them being used in other ways. For example, I never label my enum:

enum
{
    SomethingONE,
    SomethingELSE,
    SomethingOTHER,
    SomethingCOUNT
};

I know it works, but is this how most programmers create constant defines? It's a pain in the ass doing

#define SomethingONE 0
#define SomethingELSE 1
#define SomethingOTHER 2

etc., etc. My enum accomplishes the same deal, but with less typing, counting, and maintenance. Is this how everyone else does it? I have some enums ranging into the 50s of labels. It's all crunched down and removed at compile time, right?

I also wanted to know if it's possible to create negative enums. An enum that counts backwards:

enum
{
    SomethingONE,    // would be 0
    SomethingELSE,   // would be -1
    SomethingOTHER,  // would be -2
    SomethingCOUNT   // would be -3
};

I know I can simply type "-enumlabel", but it's less readable.

Thanks for any help or advice.
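For reference, a plain C or C++ enum only counts upward from the previous enumerator, so descending values need explicit initializers. A minimal sketch of what that backwards enum would have to look like (the Something* names are just the poster's placeholders):

enum
{
    SomethingONE   =  0,
    SomethingELSE  = -1,
    SomethingOTHER = -2,
    SomethingCOUNT = -3
};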
I'm not sure, but I'm pretty sure the preprocessor is evil. It doesn't have type safety, for one thing, and it's also less readable. Plus, when you make an enum, you're making a new data type, so you can effectively say

enum Blah
{
    blah1,
    blah2
};

Blah something = blah1;


This way you don't have to write int or whatever; you actually know what you're putting into it. Also, if you try to put anything else into it, the compiler will let you know. Go with enums.
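A quick sketch of that type safety, reusing the Blah enum above:

Blah something = blah1;              // fine: blah1 is a Blah
// Blah broken = 1;                  // compile error: int doesn't convert to Blah implicitly
Blah forced = static_cast<Blah>(1);  // going from int to Blah requires an explicit cast
int plain = blah2;                   // the reverse direction converts implicitly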
FTA, my 2D futuristic action MMORPG
I'm wanting more of a hard-coded value. Places where I would insert a manually typed number, but give it a name. For example, my character animation-types are defined with enums. The enum labels hold the animation-type index within an array.

enum
{
    ANIM_STAND,
    ANIM_WALK,
    ANIM_COUNT
};
...
CAnim AnimArray[ANIM_COUNT];
...
CurrentAnim = AnimArray[ANIM_WALK];
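One property of this pattern worth pointing out: because the enumerators start at 0 and count up automatically, a trailing ANIM_COUNT always stays equal to the number of real entries as animations are added or removed above it, so the array size never has to be updated by hand.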
You can use enums like that, but since you don't actually need a new type (which is what an enum gives you), I would go with const values.

const int ANIM_STAND = 0;
const int ANIM_WALK = 1;
const int ANIM_COUNT = 2;

One of the uses for const in C++ is to replace #defines for creating named constant values.

"Life is very short, and there''s no time for fussing and fighting my friend..."
"Life is very short, and there's no time for fussing and fighting my friend..."
Your use of enums is consistent with good practices. If you want to have a set of identifiers to describe state, enums are useful and, in my opinion, a very good replacement for #defines.
Actually, yes, constants and enumerations are good practice. The only real reason to use preprocessor macros for constants is conditional compilation. Namely, let's say I have a debug logging macro.
#define DEBUGMODE 1

#if DEBUGMODE
#define DebugLog( x ) fprintf( stderr, x )
#else
#define DebugLog( x )
#endif
You can't use enumerations to do this. Actually, some would tell me I should be using inline functions. It's all a matter of obsolescence and succession.
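A minimal sketch of that inline-function alternative, assuming a compile-time DebugMode flag and a single-string signature (a real version would want printf-style formatting):

#include <cstdio>

const bool DebugMode = true;

// When DebugMode is a constant false, the compiler can drop the
// fprintf call entirely, much like the empty macro above.
inline void DebugLog( const char* message )
{
    if( DebugMode )
        fprintf( stderr, "%s", message );
}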
william bubel
Enums are good practice, especially when you don't really need to know what the actual numeric values are (even though you *do* know, because of the C/C++ standard).
Hmm. const int ANIM_WALK = 1; seems as troublesome as #defines. The reason I switched to enums was to avoid having to manually update the numbers. When the numbers get into the hundreds, it can become a real big pain in the ass to delete the 3rd entry. I usually just replace it with a "reserved", but sometimes that's not acceptable.

Enums are not preprocessor? How exactly do they work? Do they require global memory? Or are they replaced with real hard-coded values once the program is fully compiled? I don't want my thousands of global enum labels to be using up global memory.

Hmm. When a number is manually typed into code (i = j + 99), the 99 does use a register or some type of temporary memory to compute the calculation, correct? I want my enums to be no different. Is this not how they work? If I define a list of enums in a global header, is that no different than declaring integers? If so, I need to switch back to #defines!

Thanks for all the info

[edited by - Jiia on March 25, 2004 5:20:37 PM]
quote: Original post by Jiia
Enums are not preprocessor? How exactly do they work? Do they require global memory? Or are they replaced with real hard-coded values once the program is fully compiled? I don't want my thousands of global enum labels to be using up global memory.


An enum is basically just a disguised int.
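A minimal sketch of what "disguised int" means in practice: enumerator names exist only at compile time, so they can appear anywhere an integer constant expression is required, and they occupy no storage of their own.

// No object named BufferSize is ever created; the name is
// replaced by the constant 64 during compilation.
enum { BufferSize = 64 };

char buffer[BufferSize]; // legal: array sizes must be compile-time constants

int Classify( int value )
{
    switch( value )
    {
    case BufferSize: // legal: case labels must be constant expressions
        return 1;
    default:
        return 0;
    }
}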
You are not the one beautiful and unique snowflake who, unlike the rest of us, doesn't have to go through the tedious and difficult process of science in order to establish the truth. You're as foolable as anyone else. And since you have taken no precautions to avoid fooling yourself, the self-evident fact that countless millions of humans before you have also fooled themselves leads me to the parsimonious belief that you have too. --Daniel Rutter
So with an example such as

enum
{
    NumberFive = 5,
    NumberSix = 6
};

#define NumberEleven (NumberFive + NumberSix)

NumberEleven actually gets processed at runtime? I see. That sucks. Well, I'm glad I know that now. I did a profile on some methods a while back where two functions performed the same task, but one operated using globals. It wasn't pretty. Globals must have to run through some hard-core chains to get processed. I've hated them ever since.

Are there any tricks to defining macros for such things? Can I avoid typing out hundreds of number values for my #defines? I would give an example of how terrible it can become, but the post would advance 3 pages.

I know there are tons of games out there that have "hard-coded" states and index values. How do they manage this? With #defines, enums, or something else? And if #defines are usually used, do they really do the #define NAMEA 0, #define NAMEB 1 deal?

