# Global variables in comparison to #define


## Recommended Posts

As far as I can understand, if you use the preprocessor directive #define, for example #define money 100, it replaces all instances of "money" with 100. As far as I know it is global and can be used all over your code?

What is the point of using #define instead of a global variable? Don't they pretty much do the same thing?
What's the difference between the two?

##### Share on other sites
I'm assuming you mean C++.

A #define replaces *any* instance of "money" in your code, no matter whether it's a variable or a function, in any namespace or scope. You get no namespace, and name clashes are pretty much guaranteed unless you give it a very long and unique name like "MYPROJECT_MONEY". A const global can be shadowed in a scope by another declaration of "money", and you can even put it in a specific namespace to avoid clashing with other files declaring the same name.

Defines are uncontrollable and will find a way of leaking into places where you don't want them unless you give them very specific and ugly names. The windows.h header is a great example of this: you had better define WIN32_LEAN_AND_MEAN (and NOMINMAX) before including it and hope all the defines really get undefined.

They're only "global" in the sense that if you include a file with it, you'll include the define it contains as well. But the same goes for globally defined const values, so there's no difference there.

##### Share on other sites
So basically you're all saying that if I can use a global constant rather than a #define, I should?

##### Share on other sites

[quote]So basically you're all saying that if I can use a global constant rather than a #define, I should?[/quote]

You can only benefit by doing so. Const values can be defined in headers as well. If you need a global variable (god forbid), you'll have to declare it with the extern keyword in the header and define it in an implementation file.

##### Share on other sites
#define is just text replacement, and it does not care about the language. That can lead to some interesting abuses, and some interesting uses (the header include-guard trick). Other than that, use language features, as they won't bite you in the back.

like #define max did all the time for me...

if you don't plan to ctrl-r replace-all-text, don't use #define.

##### Share on other sites
Preprocessor macros can be useful for many things, but I would not use them for constant values because you lose type safety and conflicts are possible that could create really hard to find bugs.

##### Share on other sites
The above advice is all perfectly valid and I don't want my post to be misunderstood as a way to "get around" these faulty-macro points; there is no real substitute for inlined functions etc.

I just want to add some safety tips for those few times when you really do need a macro.

1. Naming macros something as generic as "MONEY" is asking for trouble. Because of the consequences of text replacement, you can end up with some very abstract and hard-to-trace errors if you use too-general names for your macros. The best way to combat this is to add a fake namespace prefix to your macros. For example, in my engine there are 16 projects, each with one namespace: lse, lss, lsm, lsg, etc. Within those projects, I replicate the namespaces within the macros: LSE_ELEMENTS( VAR ), LSG_OPENGL, LSG_DIRECTX11, etc.
2. The above not only reduces conflicts but also tells you two things: #1, is this macro from my own library? And #2, which library? LSG_ = L. Spiro Graphics library. Easy.
3. #undef macros as soon as they are no longer needed. Header guards etc. should never be undefined, but within translation units (.cpp files) you might have some macros inside functions to make some tasks easier. An example in my engine is "#define LSG_HANDLE2CBUF( HANDLE )", which, in DirectX 11 and DirectX 10, translates my custom handle into a custom cbuffer pointer, and is used only inside the CDirectX11CompiledShader and CDirectX10CompiledShader .cpp files. It is considered tidy to clean up after yourself, so an #undef at the bottom of the .cpp files is a good idea. I have heard rumors that macros can "leak" from one translation unit into another under some compilers, so this is a good idea in general to avoid bugs.
4. __ (2 underscores) is a prefix reserved for the system/compiler. If you want to make absolutely sure your macros will never conflict with anything, you could add some underscores in front, but make sure it is not just 2 underscores. At work we use 3.

L. Spiro

##### Share on other sites

[quote]__ (2 underscores) is a prefix reserved for the system/compiler. If you want to make absolutely sure your macros will never conflict with anything, you could add some underscores in front, but make sure it is not just 2 underscores. At work we use 3.[/quote]

Anything starting with two underscores or one underscore and a capital letter is reserved for the compiler. So, anything starting with three underscores is reserved (since it also starts with two underscores) and any capitalized macro that starts with any underscores is reserved. Also, there can't be any sequence of two underscores in the identifier, even if it's not at the start. The compiler is not likely to define a macro that starts with three underscores but it is still allowed to do so.

##### Share on other sites

[quote]so #undef at the bottom of the .CPP files is a good idea.[/quote]

I agree with most of #3 except for doing this, as I think it's going too far. I think if I saw it being done I'd say "WTF are they doing this for???" (and I think 99.99% of other programmers would say the same (what I'm trying to say is you'll just confuse other programmers for the most part with it)). I've never heard of a compiler needing this, and I think following it just on "rumor" is going waaay too far, asking for unnecessary mental overhead in developing. Additionally, if a compiler leaks macros/identifiers it shouldn't from one translation unit to another, it's worth reporting that bug to the compiler vendor, and expecting them to fix it.

[quote]
[quote name='L. Spiro' timestamp='1341672034' post='4956646']
__ (2 underscores) is a prefix reserved for the system/compiler. If you want to make absolutely sure your macros will never conflict with anything, you could add some underscores in front, but make sure it is not just 2 underscores. At work we use 3.
[/quote]
Anything starting with two underscores or one underscore and a capital letter is reserved for the compiler. So, anything starting with three underscores is reserved (since it also starts with two underscores) and any capitalized macro that starts with any underscores is reserved. Also, there can't be any sequence of two underscores in the identifier, even if it's not at the start. The compiler is not likely to define a macro that starts with three underscores but it is still allowed to do so.
[/quote]
+1. In addition: "Each name that begins with an underscore is reserved to the implementation for use as a name in the global namespace." So macros simply should never start with an underscore, and no variable in the global namespace should either, even if it's followed by a lower case letter.

##### Share on other sites

[quote]
[quote name='L. Spiro' timestamp='1341672034' post='4956646']
so #undef at the bottom of the .CPP files is a good idea.
[/quote]
I agree with most of #3 except for doing this, as I think it's going too far. I think if I saw it being done I'd say "WTF are they doing this for???" (and I think 99.99% of other programmers would say the same (what I'm trying to say is you'll just confuse other programmers for the most part with it)). I've never heard of a compiler needing this, and I think following it just on "rumor" is going waaay too far, asking for unnecessary mental overhead in developing. Additionally, if a compiler leaks macros/identifiers it shouldn't from one translation unit to another, it's worth reporting that bug to the compiler vendor, and expecting them to fix it.
[/quote]

Actually, #undefing your macros is still a good idea, in case someone gets antsy about build times and tries to deploy a unity build structure to your C/C++ project. Leaving macros defined all over the place can get incredibly painful in unity builds.

##### Share on other sites

[quote]
[quote name='Cornstalks' timestamp='1341707356' post='4956770']
[quote name='L. Spiro' timestamp='1341672034' post='4956646']
so #undef at the bottom of the .CPP files is a good idea.
[/quote]
I agree with most of #3 except for doing this, as I think it's going too far. I think if I saw it being done I'd say "WTF are they doing this for???" (and I think 99.99% of other programmers would say the same (what I'm trying to say is you'll just confuse other programmers for the most part with it)). I've never heard of a compiler needing this, and I think following it just on "rumor" is going waaay too far, asking for unnecessary mental overhead in developing. Additionally, if a compiler leaks macros/identifiers it shouldn't from one translation unit to another, it's worth reporting that bug to the compiler vendor, and expecting them to fix it.
[/quote]
Actually, #undefing your macros is still a good idea, in case someone gets antsy about build times and tries to deploy a unity build structure to your C/C++ project. Leaving macros defined all over the place can get incredibly painful in unity builds.
[/quote]
Are you and I talking about the same thing (#undef at the bottom of the .CPP files)? Like I said, I agree with most of #3 (cleaning up your macros is a good thing). But I think cleaning them up at the end of a source file is a waste of time and space, and I can't see how that would decrease compile times at all.

##### Share on other sites
The whole idea of a "unity build" is to #include several of your .cpp files into one "master" translation unit, which does in fact help with compile times in some cases. (My personal feeling is that unity builds are a bandage over terrible header and dependency management issues, but that's another debate.)

Consider the following code:

[source lang="cpp"]// Foo.cpp
#define bool int
bool FooFunction() { return 1; }

// Bar.cpp
bool BarFunction() { return true; }

// Unity.cpp
#include "foo.cpp"
#include "bar.cpp"
[/source]

This is typical of how unity builds are implemented. Clearly, in this example, you can expect the #define to cause havoc.

If you use unity builds, it's generally a very good idea to keep macros tightly scoped and #undef them as soon as possible. If that happens to be at the end of a .cpp file, so be it.

##### Share on other sites

*snip*

Ah, I see. I wasn't familiar with the term "unity build" (though I'm familiar with the concept; I'm more familiar with the term "amalgamation," thanks to SQLite) and had Unity (as in Unity3D) come to mind. Yes, I must agree then iff a unity build is being done. But L. Spiro was talking about macros spilling over from one translation unit to another, and in this normal workflow with multiple translation units I think it's pointless.

##### Share on other sites
I said to #undef them at the bottom, but that was not to be taken too literally. I personally #undef them at the earliest possible moment, which is almost always inside the same function in which they are created (even if I have a family of related functions in a row that redefine the same macro the same way). I only wanted to mention the most common case where people "leak" macros: you might #define some macro at the top of the .cpp file and then just let it go.

Definitely, if you are defining something inside a function, #undef at the end of the function, not the end of the file.

Also, having macros leak into other translation units (in a normal environment, not unity builds) is of course a special rare case, and is not the motivation for the #undef.
That is only a secondary point, since it is unlikely you will ever even encounter that.

L. Spiro

##### Share on other sites
#define: constant
global variable: variable

##### Share on other sites

[quote]
#define: constant
global variable: variable
[/quote]

*Sigh*

[source lang="cpp"]const int i = 123; // constant
int i = 123; // global variable
#define i 123 // technically creates a literal, not a constant, everywhere it is replaced, with no meaningful name for the compiler to use
[/source]

##### Share on other sites
In addition, it's worth noting that "const" has certain limitations; compare:
[source lang="cpp"]struct C
{
inline static int getval() {return 4;}
};
const int MAX=1024;
const int MIN=C::getval();
[/source]
"MAX" is a constant integral expression (can be used as an array-size in array declarations, as a case label in switch statements, etc.), while "MIN" is not.
See: http://www.devx.com/...tion/33327/1954

In C++11 there's a new declaration specifier, "constexpr", which allows you to solve this problem and, for example, do this:
[source lang="cpp"]constexpr int getDefaultArraySize (int multiplier)
{
return 10 * multiplier;
}
int my_array[ getDefaultArraySize( 3 ) ];
// perfectly legal, "getDefaultArraySize( 3 )" is a constant integral expression equal to 30 at compile-time
[/source]

See: http://www.cprogramm...-constexpr.html

More:
http://en.cppreferen...guage/constexpr
http://cpptruths.blo...texpr-meta.html
http://thenewcpp.wor...1/14/constexpr/
http://kaizer.se/wik...onstexpr_foldr/

##### Share on other sites

[quote]
[quote name='Acotoz' timestamp='1341721110' post='4956821']
#define: constant
global variable: variable
[/quote]
*Sigh*

const int i = 123; // constant
int i = 123; // global variable
#define i 123 // technically creates a literal, not a constant, everywhere it is replaced, with no meaningful name for the compiler to use
[/quote]

Alright, let's have another case here.

what happens if I do this?

#define CYCLE for (int i = 0; i < 25; i++)

CYCLE will be defined as that little for-statement, so it is not a literal, not a constant, and not a variable.

Good luck

##### Share on other sites
One important difference is that global variables are really variables: the compiler has to assume that you could change them, even if they are const (e.g. by modifying the binary, or by unlocking that memory segment via system calls, changing the value, and locking the segment again). That means you get a load, and possibly a cache miss; in cases where the value is just 0 or 1 the compiler could optimize the code, but with variables that's not fully legal.

one way around this is to use enums
[source lang="cpp"]enum { DEBUGOUTPUT_ENABLED = 0 };

// ...

void foo()
{
    if (DEBUGOUTPUT_ENABLED)
    {
        // ...
    }
}
[/source]

This doesn't work for floats, obviously, or for any non-primitive type, but for cases where you want to replace defines it can make quite a lot of sense to use enums.

##### Share on other sites
To the best of my knowledge modern compilers will try to inline const values into the code whenever possible. There are few cases where you actually need a physical variable, for example in a case like this:
[source lang="cpp"]const int myConst = 42;
// ...
myFunction(&myConst);
[/source]
where the signature is void myFunction(const int* value).

Edit: I just tried something like this for kicks:
[source lang="cpp"]const int myConst = 10;
[...]
*const_cast<int*>(&myConst) = 5;

for (int i = 0; i < myConst; ++i)
std::cout << i << " ";
[/source]
It just crashes in MSVC Release builds. And in http://stackoverflow...alue-of-a-const Andrew Khosravian claims to quote the standard which would reinforce my opinion.

##### Share on other sites
As I said, you need to unlock that segment via a system function. I did that around 10 years ago on Win95, if I recall correctly, to change const strings at startup time.

Note also that if you take the address, just naming it const gives the object internal linkage, resulting in a different address in every translation unit; to get a single shared address you need to declare it extern const in the header and define it in one translation unit. Unless you have just one uber/unity file, this can cause you a lot of trouble (e.g. if you push the address of a 'stop' flag into a queue and check for it in another translation unit).

Afaik, enums are rvalues, so you cannot make those kinds of mistakes.

I also suggest not overgeneralizing from one compiler and platform as proof of something. If you compile for e.g. ARM on the Game Boy Advance and you have a const int with more than 24 bits set (e.g. 0xffffffff), there is no instruction to create that 0xffffffff in one cycle, so the compiler starts to be 'smart' and loads the data from global address space, which usually costs you more than the instructions needed to create the 0xffffffff by computation. An enum can help (though even then compilers sometimes try to cluster those).

You're right that the const will be embedded on x86, though.

##### Share on other sites
So basically we can agree that "using a const is fine with a modern compiler on a desktop platform, and probably quite a few others"? Because once we move into the weirder corners you need extremely domain-specific knowledge anyway and cannot make any generalizations, since the intersection of what you can safely use on consoles, desktop and some of the embedded devices you encounter is basically a very primitive C no one would want to work with if they had any choice.

##### Share on other sites
or we can agree enums are always fine

whatever you prefer
