#### Archived

This topic is now archived and is closed to further replies.

# Thoughts on defines and macros in C++, should they be used?

This topic is 5287 days old which is more than the 365 day threshold we allow for new replies. Please post a new topic.

## Recommended Posts

I was just wondering what people thought about using macros and defines in object-oriented programming and in C++. I am asking because someone I know at work is using macros to define macros and making entire files that are just macros. Shouldn't good OO design really remove the need for macros? I am not sure, just asking. [edited by - hellfire on September 1, 2003 2:05:29 AM]

##### Share on other sites
I use defines in my code, though I do try to keep their use to a minimum. There are just some things that I can't see a better way to do.

I find that defines make the code a LOT clearer.
For instance, in my current project I have networking code that has to recognize many different packet types. You tell me which is clearer?

if (Packet.Type == GENERIC_PACKET)
{
}
else if (Packet.Type == MOVE_PACKET)
{
}

OR

if (Packet.Type == 0)
{
}
else if (Packet.Type == 1)
{
}

Now the use of macros to make macros is another story altogether... Still don't think they should be done away with altogether though.

Webby

##### Share on other sites
There are some things macros can do that nothing else in C++ can. But making such heavy use of them is pretty suspicious. They're not THAT useful. Maybe he doesn't know about inline functions or templates -- together those remove many of the uses for macros. (edit: or 'const', as the post above mine shows)

As for OO.. it certainly doesn't solve every problem in programming, no matter how good your OO design is! There are many things that can be done a hell of a lot more elegantly with alternative approaches. OO and macros are such different things that you can't really say OO solves what macros do, or vice versa.

[edited by - civguy on September 1, 2003 2:49:53 AM]

##### Share on other sites
quote:
Original post by WebsiteWill
if (Packet.Type == GENERIC_PACKET)
{
}
else if (Packet.Type == MOVE_PACKET)
{
}

You don't need macros. You need const.

const int GENERIC_PACKET = 0;
const int MOVE_PACKET = 1;

Or, even better, an enum:

enum PacketIds { GENERIC_PACKET, MOVE_PACKET };

Now you don't even have to type the indices, since an enum automatically assigns increasing values starting at 0.

##### Share on other sites
I have three macros defined

#ifdef DEBUG
#define SETDEBUG(X) X
#else
#define SETDEBUG(X) /*X*/
#endif
#define nSETDEBUG(X) /*X*/
#define DEBUGOUT(X) {DebugSystem::Stream << X; DebugSystem::Stream.flush( );}

Then, at the beginning of my files I just define all the output macros I need, such as

#define VIDEO_TEST_LOG(X) SETDEBUG(DEBUGOUT(X))

In my code, I can call this anywhere:

VIDEO_TEST_LOG( "\n [video error] " << Object.name << " Failed at test #" << TestNumber )

When I don't need debug output anymore, I undefine DEBUG.
When I don't need a specific output anymore, I define it with nSETDEBUG instead, like this:

#define UNUSED_LOG(X) nSETDEBUG(DEBUGOUT(X))

This way, I can change a single letter to get it working again. And besides, all this takes is one line per output. Finally, since in the release version all of these are commented out, there's no need to worry about which are getting compiled into the final binary.

ToohrVyk

##### Share on other sites
I can see using macros in some of the cases you guys showed, and I can see where they could be replaced with other coding practices.

Okay, here is my real question: this guy, as far as I can tell, is using macros instead of using objects and inheritance. (I wish I could post the code, but I can't.) Does anyone think that is okay?

##### Share on other sites
quote:
Original post by ToohrVyk
I have three macros defined

#ifdef DEBUG
#define SETDEBUG(X) X
#else
#define SETDEBUG(X) /*X*/
#endif
#define nSETDEBUG(X) /*X*/
#define DEBUGOUT(X) {DebugSystem::Stream << X; DebugSystem::Stream.flush( );}

Then, at the beginning of my files I just define all the output macros I need, such as

#define VIDEO_TEST_LOG(X) SETDEBUG(DEBUGOUT(X))

In my code, I can call anywhere :

VIDEO_TEST_LOG( "\n [video error] " << Object.name << " Failed at test #" << TestNumber )

When I don't need debug output anymore, I undefine DEBUG.
When I don't need a specific output anymore, I define it with nSETDEBUG instead, like this:

#define UNUSED_LOG(X) nSETDEBUG(DEBUGOUT(X))

This way, I can change a single letter to get it working again. And besides, all this takes is one line per output. Finally, since in the release version all these are commented out, there's no need to worry about which are getting compiled into the final binary.

ToohrVyk

Why not have a debug class, put the debug functions in there, and make an instance of it in all the files that would use it? Or make it the base class? That way you just delete the instance of it, but you still have the class for other projects. I see how macros can be useful, but I can see how you wouldn't need one either.

[edited by - Hellfire on September 1, 2003 3:44:36 AM]

##### Share on other sites
quote:
Original post by hellfire

Why not have a debug class, put the debug functions in there, and make an instance of it in all the files that would use it? Or make it the base class? That way you just delete the instance of it, but you still have the class for other projects. I see how macros can be useful, but I can see how you wouldn't need one either.

If you did that you'd have calls to a debugging class spread throughout your application. When it came time to ship a production version of your program, it'd still have that debugging stuff in it, potentially slowing things down. Even if you later made it so all the debugging methods didn't actually do anything, you'd still have the overhead of all those needless method calls throughout your production release. You can't just delete the class; if you did that you'd have to go through and remove all references to it in the rest of your code.

Macros are handled entirely at the preprocessor level, before the compiler even comes into play. If you turn them off, they're not included in your binary at all. assert() is a good example of this.

C++ has reduced the need for macros dramatically, but when it comes to debug code versus production code, macros are still quite useful.

##### Share on other sites
quote:
Original post by hellfire
Okay, here is my real question, this guy as far as I can tell, is using macros instead of using objects and inhertance. ( I wish I could post the code, but I can't.) Does anyone think that is okay?
If he's doing code like this:

struct X {
  #define X_MEMBERS int blaa; int blee;
  X_MEMBERS
};
struct Y { // derived from X
  #define Y_MEMBERS int foo; int bar;
  X_MEMBERS
  Y_MEMBERS
};

instead of just writing:

struct X {
  int blaa;
  int blee;
};
struct Y : X {
  int foo;
  int bar;
};

Then he's doing something very wrong. But I can't really say, since I haven't seen his code.

[edited by - civguy on September 1, 2003 3:58:13 AM]

##### Share on other sites
Yes, he is doing something like that, but much worse, much uglier, and much more confusing. How did you see his code?

quote:
Original post by civguy
quote:
Original post by hellfire
Okay, here is my real question: this guy, as far as I can tell, is using macros instead of using objects and inheritance. (I wish I could post the code, but I can't.) Does anyone think that is okay?
If he's doing code like this:

struct X {
  #define X_MEMBERS int blaa; int blee;
  X_MEMBERS
};
struct Y { // derived from X
  #define Y_MEMBERS int foo; int bar;
  X_MEMBERS
  Y_MEMBERS
};

instead of just writing:

struct X {
  int blaa;
  int blee;
};
struct Y : X {
  int foo;
  int bar;
};

Then he's doing something very wrong. But I can't really say, since I haven't seen his code.

[edited by - civguy on September 1, 2003 3:58:13 AM]

##### Share on other sites
You could just remove the class from the project. I remove classes from projects. If you only use the class as a debug class and only use it where you used macros, then you can just remove the class like you remove the macros.
But to each their own.
quote:
Original post by tortoise
quote:
Original post by hellfire

Why not have a debug class, put the debug functions in there, and make an instance of it in all the files that would use it? Or make it the base class? That way you just delete the instance of it, but you still have the class for other projects. I see how macros can be useful, but I can see how you wouldn't need one either.

If you did that you'd have calls to a debugging class spread throughout your application. When it came time to ship a production version of your program, it'd still have that debugging stuff in it, potentially slowing things down. Even if you later made it so all the debugging methods didn't actually do anything, you'd still have the overhead of all those needless method calls throughout your production release. You can't just delete the class; if you did that you'd have to go through and remove all references to it in the rest of your code.

Macros are handled entirely at the preprocessor level, before the compiler even comes into play. If you turn them off, they're not included in your binary at all. assert() is a good example of this.

C++ has reduced the need for macros dramatically, but when it comes to debug code versus production code, macros are still quite useful.

##### Share on other sites
quote:
Original post by tortoise
quote:
Original post by hellfire

Why not have a debug class, put the debug functions in there, and make an instance of it in all the files that would use it? Or make it the base class? That way you just delete the instance of it, but you still have the class for other projects. I see how macros can be useful, but I can see how you wouldn't need one either.

If you did that you'd have calls to a debugging class spread throughout your application. When it came time to ship a production version of your program, it'd still have that debugging stuff in it, potentially slowing things down. Even if you later made it so all the debugging methods didn't actually do anything, you'd still have the overhead of all those needless method calls throughout your production release.

Slightly OT: .NET has a pretty nifty solution to this problem. If you mark a method with the ConditionalAttribute attribute, all calls to the method are removed by the compiler unless a specific preprocessor symbol is defined:

[Conditional("DEBUG")]
public void Debug( string message )
{
someFile.WriteLine( message );
}

logger.Debug( "In some method" );

In the above example, the Debug method and all calls to it will be stripped out of a build unless the DEBUG preprocessor symbol is defined.

--
AnkhSVN - A Visual Studio .NET Addin for the Subversion version control system.

##### Share on other sites
quote:
Original post by hellfire
You could just remove the class from the project. I remove classes from projects. If you only use the class as a debug class and only use it to where you used macros, than you can just remove the class like you remove the macros.
But to each their own.

What if your users find some bugs? What if you want to update the gameplay? What if.. what if..
Would you add all the removed function calls back to the source? Or would you only debug the new code? What if the new code creates a bug in the old code?

[...]
There's nothing wrong with defines, as long as you keep them clean, easy to read, easy to use.

.lick

[edited by - Pipo DeClown on September 1, 2003 3:43:51 PM]

##### Share on other sites
quote:
Original post by tortoise
If you did that you''d have calls to a debugging class spread throughout your application. When it came time to ship a production version of your program, it''d still have that debugging stuff in it, potentially slowing things down.

What if you inline the functions, so that Debug( "error" ) resolves to nothing? Does it still have overhead?

##### Share on other sites
what about this?

const axpReal64 axpMicro = 0.000001;
const axpReal64 axpKilo  = 1000.0;

#define MICRO axpMicro*
#define KILO  axpKilo*

axpReal64 gValue = KILO lValue;

one thing is bothering me though...

for "whole" numbers above 0 (like 1, 10, 100, 1000, etc.), would it be better to use a real or an int? i.e. we do not know which type will be used more often...

##### Share on other sites
quote:
Original post by psykr
What if you inline the functions, so that Debug( "error" ) resolves to nothing? Does it still have overhead?

inlined code still has a cost; it's still gotta work its way through the processor. You also have no guarantee anything is inlined; larger functions are likely not to get inlined despite your use of 'inline'. And on top of that, debugging calls often include writing to a log file and/or writing to stdout, two expensive tasks that inlining won't speed up much at all.

quote:

You could just remove the class from the project. I remove classes from projects. If you only use the class as a debug class and only use it where you used macros, then you can just remove the class like you remove the macros.
But to each their own.

Well yeah. But that isn't what you said originally, at least not how I read it. You seemed to suggest replacing the macros with a debug class. If you have a debug class that only gets called via macros, then that's pretty much the same thing.

[edited by - tortoise on September 1, 2003 8:32:33 PM]

##### Share on other sites
The one thing that is really nice about defines, for which there is still no good alternative, is nested comments (unlike /* */ comments, #if 0 blocks nest):

#if 0

just comment out the whole damn class

#if 0

crappy function

#endif

#endif

I still don't know of an alternative to this.

-timiscool999
"I like waffles. Especially with syrup." -me


##### Share on other sites
He removes the macros too. Add the instance back in if you want to debug. Adding and removing objects is very easy once the class is created. I am not saying that defines are bad; I just wanted to know some cases where defines could do something that OO code couldn't, and so far no one has shown me a case. I am not talking about which is more efficient or better by a few nanoseconds.
How is it different taking out macros versus a class that does the same thing?

quote:
Original post by Pipo DeClown
quote:
Original post by hellfire
You could just remove the class from the project. I remove classes from projects. If you only use the class as a debug class and only use it to where you used macros, than you can just remove the class like you remove the macros.
But to each their own.

What if your users find some bugs? What if you want to update the gameplay? What if.. what if..
Would you add all the removed function calls back to the source? Or would you only debug the new code? What if the new code creates a bug in the old code?

[...]
There's nothing wrong with defines, as long as you keep them clean, easy to read, easy to use.

.lick

[edited by - Pipo DeClown on September 1, 2003 3:43:51 PM]

##### Share on other sites
I only use #defines to control compilation.

##### Share on other sites
I seem to be getting sidetracked here. He is doing this and more: he is defining structs in structs, using token pasting to define and redefine structs and functions. Is this only bad from a readability standpoint? BTW, he wrote a 10-page doc on how to use "his" system, plus a simple tutorial app, and still people cannot follow what he is doing. That's bad, right?

quote:
Original post by civguy
quote:
Original post by hellfire
Okay, here is my real question: this guy, as far as I can tell, is using macros instead of using objects and inheritance. (I wish I could post the code, but I can't.) Does anyone think that is okay?
If he's doing code like this:

struct X {
  #define X_MEMBERS int blaa; int blee;
  X_MEMBERS
};
struct Y { // derived from X
  #define Y_MEMBERS int foo; int bar;
  X_MEMBERS
  Y_MEMBERS
};

instead of just writing:

struct X {
  int blaa;
  int blee;
};
struct Y : X {
  int foo;
  int bar;
};

Then he's doing something very wrong. But I can't really say, since I haven't seen his code.

[edited by - civguy on September 1, 2003 3:58:13 AM]

##### Share on other sites
anything that requires a tutorial to _understand_ is absolute shit, IMO.

That thing your friend has dreamed up sounds like a candidate.

COM is on that list. Hence my... aversion to DX...

##### Share on other sites
quote:
Original post by tortoise
inlined code still has a cost; it's still gotta work its way through the processor. You also have no guarantee anything is inlined; larger functions are likely not to get inlined despite your use of 'inline'. And on top of that, debugging calls often include writing to a log file and/or writing to stdout, two expensive tasks that inlining won't speed up much at all.

If it gets inlined, then no, there is no runtime overhead. Of course there's still a preprocessing overhead, which macros also have.

One thing I use macros for is passing things like __FUNCTION__ to a logger.

#define LOG(A) printf( "%s %s(%d)\n", A, __FUNCTION__, __LINE__ );

##### Share on other sites
In the case of debug macros, there's really no reason to manually remove them anyway. Assuming that each and every #ifdef DEBUG / #endif block you have contains only code that should be compiled in a debug build, that code is excluded entirely by compiling in release mode (or manually #undef'ing DEBUG, whichever the case may be).

My point here is the advantage that you NEVER have to remove the debug code, and disabling it takes extremely little effort. With a debug class, you'd either have to remove all references to it, which is tedious; wrap it in #ifdef DEBUG / #endif blocks, which defeats the purpose of this discussion; or change all the methods to no-ops, which, albeit easy, seems a little tacky compared to just not including the code.

Though the debug class thing could be made a tad easier (albeit less efficient and a bit more memory-wasting) by having an ABC (abstract base class) with the entire interface necessary for debugging. Then derive two classes from it: one that properly implements all of the virtual functions (and no others) and one that defines all the virtual functions as no-ops. Have the app hold a pointer to the base class, then (possibly in an #ifdef DEBUG / #endif block) construct the real class for debugging and the no-op class for release.

I came up with that solution off the top of my head. It seems like a viable way to switch between debug and release versions of code with extremely little effort and minimal use of macros.

-Auron

##### Share on other sites
What about using macros to define and create classes and objects? What are your thoughts on that little practice?

##### Share on other sites
quote:
Original post by hellfire
What about using macros to define and create classes and objects? What are your thoughts on that little practice?

Without seeing the code, my guess is that the person either hasn't learned about templates yet or is afraid to use them.

As others have said, macros have limited value in C++. There are a few useful tricks, but anything more than sporadic usage in serious code indicates ignorance or naiveté.