Archived

This topic is now archived and is closed to further replies.

Spartacus

Enum vs #define

Recommended Posts

Hey! Can someone tell me when it is appropriate to use an enumeration and when to use #define? Let's say I have a function called DrawText() which draws text on the screen, but to do that it needs some flags to determine whether the text is right justified or left justified or bold or italic... etc. How would it then be appropriate to define those flags? Would it be best to use an enumeration or just #define the flags? e.g.:
enum TEXTFLAGS
{
    TF_BOLD        = 0x0001,
    TF_ITALIC      = 0x0002,
    TF_LEFTJUSTIFY = 0x0004
    // ... etc.
};

or

#define TF_BOLD 0x0001
#define TF_ITALIC 0x0002
#define TF_LEFTJUSTIFY 0x0004
 
The Direct3D header uses enumerations for almost everything (render states, texture stage states, texture operations...) except for the D3D caps, which are all #defined. Why is that? Why aren't they put in an enumeration like everything else? Thanks! Spartacus

Real programmers don't document; if it was hard to write, it should be hard to understand.

With those enums you won't be able to use them as bitflags. You can't do that because when it comes time to pass them to a function, the function will expect a single TEXTFLAGS enum. If you want to get around it and your compiler supports nameless enums, then do

enum
{
    TF_BOLD        = 0x0001,
    TF_ITALIC      = 0x0002,
    TF_LEFTJUSTIFY = 0x0004
    // ... etc.
};

then your function can take any integral type that works with bitflags: char, int, whatever.

------------
- outRider -

#defines in header files should generally be avoided. That is because every file that includes a header with a certain #defined constant is not allowed to redefine that constant, and cannot even use the defined identifier in another context, because the preprocessor will replace it. #defines are always global, independent of namespaces and scope.

So if you need to define a range of related constants, use an enumeration. If the enumeration is only used by a certain class, put it into the definition of that class and make it a local enumeration. This way, you prevent cluttering up the global namespace. If you need a single constant, unrelated to other constants, define it as a const variable, preferably inside a class definition.

If a range of constants should be used as flags, as outRider pointed out, using an enumeration would be wrong. You could use #defines here, as the DirectX SDK headers do, but better yet, use const declarations. And again, preferably inside a class definition or at least a unique namespace. This way, nobody will ever have to worry when including your headers.

Source files (implementation files) are an entirely different topic. They will never be included anywhere, so feel free to clutter up the global namespace and go crazy with #defines - nobody will ever find out. As long as you can still maintain the code, it's all right. But remember that consts and enums aren't slower than #defines, and they are handled by the compiler, not by the preprocessor.

In general, constants should always be defined using the means provided by the language, not by the preprocessor. Macros are a different topic: they can really ease some tasks in implementation files, and they are definitely not "evil" or anything.

Last but not least, if you really have to use #defined constants in a header file (I can't see an urgent reason, though), make them as unique as possible. Think about using a common prefix as the DirectX SDK headers do.

That's it. Happy coding.

Hey, thanks for the help to both of you! For bitflags I guess I will just go ahead and #define the flags with a unique prefix, but I'll try to use enumerations wherever it's possible. Again, thanks a lot for the help!

Spartacus



Real programmers don't document; if it was hard to write, it should be hard to understand.


quote:
Original post by outRider
With those enums you won't be able to use them as bitflags. You can't do that because when it comes time to pass them to a function, the function will expect a single TEXTFLAGS enum. If you want to get around it and your compiler supports nameless enums, then do



That's funny... I use enums as bitflags no problem:


class Entity {

    enum Behaviours
    {
        doesCollide    = 1 << 0,
        doesUpdate     = 1 << 1,
        doesMakeCoffee = 1 << 2,
        doesWhatever   = 1 << 3,

        defaultBehaviour = doesCollide | doesMakeCoffee
    };

    Behaviours mBehaviour;

public:
    Entity()
        : mBehaviour(defaultBehaviour)
    { }
};



---- --- -- -
Blue programmer needs food badly. Blue programmer is about to die!

You're combining your enums within the definition, not passing them to the function straight. Doing it that way you'd have to pre-combine every combination a user might want... I sure as hell wouldn't want to do that for an enum of more than a couple of entries.


enum Bitflags
{
    bit1 = 1,
    bit2 = 2,
    bit3 = 4
};

void func(Bitflags flags)
{
}

int main()
{
    func(bit1 | bit2);

    return 0;
}


main.cpp(17) : error C2664: 'func' : cannot convert parameter 1 from 'const int' to 'enum Bitflags'
Conversion to enumeration type requires an explicit cast (static_cast, C-style cast or function-style cast)

You can do func((Bitflags)(bit1 | bit2)); but it's easier to just use an unnamed enum.

------------
- outRider -

[edited by - outRider on August 13, 2002 1:16:08 AM]

Yeah, you can only use them as bitflags if you convert them to a built-in type like int.


enum BitFlags
{
    flag1 = 0x01,
    flag2 = 0x02,
    flag3 = 0x04
};

void TakeBitFlagsInt( int flags )
{
}

void TakeBitFlags( BitFlags flags )
{
}

TakeBitFlags( flag1 | flag3 );     // Doesn't work
TakeBitFlagsInt( flag1 | flag3 );  // Works



[edited by - Origin on August 13, 2002 5:42:44 AM]
