MBrand2

Why use #define?


While studying tutorials and books I have come across things like #define Num 100. I don't know what #define is, but it is fairly easy to figure out that it does something similar to int Num = 100;. So what is the advantage of using #define over declaring an int?

It's a preprocessor tag. Rather than creating a variable named Num that takes up space in the EXE, the preprocessor replaces all instances of the symbol Num with the #define'd value _before_ the code is compiled.

-me

Anything with a "#" is taken care of by the preprocessor. This is done before your compiler starts to compile. #define is just a replacement tag: #define num 100 just substitutes 100 every place it sees num. #define PI 3.14159 will set up PI for you; since we all understand what PI is, it just makes sense to use it. The other reason you use it is for values that might change sometime later in your coding. Say you want to send a network packet every 53 ms, so you set up #define SEND_PACKET_TIME 53. Later you realize you only need to send it every 200 ms, and you just have to change your #define SEND_PACKET_TIME in one place instead of going through all your code to make the changes.

theTroll

Quote:
Original post by MBrand2
While studying tutorials and books I have come across things like #define Num 100. I don't know what #define is, but it is fairly easy to figure out that it does something similar to int Num = 100;. So what is the advantage of using #define over declaring an int?


It is usually used in C, where, in the standard that most compilers have implemented (AFAIK), you cannot use a variable as an array size. The #define gives what would otherwise be an anonymous magic number some context. I gather that the latest version of C (C99) allows variables to be used as array sizes.

In C++, you would prefer to use a const int (or const float, whatever) where you might have been tempted to use a #define in C. Because #defines are dealt with during preprocessing, they cannot obey scope, and defines inside namespaces, etc. won't work quite as intended.

I should mention the "evil" macro is also a #define construct, something of the form:

#define min(x,y) (((x)<(y))?(x):(y))

which will return the minimum of any two values whose type supports the < operator. (The same thing can also be written as a template function.)
Since the code on the right side replaces the code on the left side, your code
min(1,2) gets replaced with (((1)<(2))?(1):(2)), which may be reduced later by the compiler, but maybe not.

#define is also useful with other preprocessor constructs.
This includes the standard header guards:
#ifndef MY_HEADER_H
#define MY_HEADER_H
....
#endif

but it can also help you choose code in your project at compile time.

#define USE_LOGGER
...
Logger()
{
#ifdef USE_LOGGER
....code....
#endif
}


Since the #ifdef and #endif are evaluated at compile time, the code between them is only compiled into the final exe if you #define USE_LOGGER.
If you comment out that #define, then all of a sudden the logging code is not even compiled,
whereas

static bool USE_LOGGER = true;
...
Logger()
{
if ( USE_LOGGER )
{
....code....
}
}


means that the if is evaluated at every call to Logger. So the #defines can save you computation time: the compiler will probably optimize out the empty function, where it will not optimize out the function with the regular if statement.
So the #define approach completely removes the overhead of the logger function.

Quote:
Original post by MBrand2
... So what is the advantage to using #define over declaring an int?


There is none. People still use it mostly out of habit -- there was no good alternative until a few years ago.

Except for a few situations, #define should generally be avoided. It works by substituting text, so it is a very good source of bugs, and it makes debugging more difficult.

One situation where it is useful is enabling and disabling code (as demonstrated by KulSeran), or when doing text tricks such as concatenating symbols.

It is occasionally useful for other things, such as turning debugging on and off :

#define DEBUG

(...)

#ifdef DEBUG
std::cout << "Debugging stuff" << std::endl;
#endif

For production, you can comment out the define, and your exe won't be cluttered with a bunch of if statements, as it would be if you had used a bool.

Guest Anonymous Poster
Quote:
Original post by JohnBolton
Quote:
Original post by MBrand2
... So what is the advantage to using #define over declaring an int?


There is none. People still use it mostly out of habit -- there was no good alternative until a few years ago.

Except for a few situations, #define should generally be avoided. It works by substituting text, so it is a very good source of bugs, and it makes debugging more difficult.

One situation where it is useful is enabling and disabling code (as demonstrated by KulSeran), or when doing text tricks such as concatenating symbols.


N.B. if you're going to use "int num = 100;" instead of "#define num 100", make sure you use "const int num = 100;"!

