Defines or Constants

Recommended Posts

I have been reading a lot of other people's source and I have seen two things done, and I am not sure which is better. Sometimes I see something like this:
#define WINDOW_WIDTH 600
 
and sometimes I see this:
const int WINDOW_WIDTH = 600;
 
I was wondering which is better for defining constants. ---- Jemts "A Hooloovoo is a super-intelligent shade of the color blue." - Douglas Adams

I am not 100% sure of this but the way I think of it is:

#define simply replaces the name with the value wherever it is used, which imo feels pretty unsafe (kind of like doing a blind find-and-replace of one string with another across a whole file; it might go wrong somewhere).

With const int you are instead dealing with an "approved" variable, which I think helps the compiler (type checking, casting, etc.).
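For example, here is a minimal sketch of where the blind text replacement can bite (HALF_WIDTH and kHalfWidth are made-up names just for illustration): a macro has no type and no parentheses of its own, so it can interact badly with the operators around it.

#include <iostream>

#define HALF_WIDTH 600 / 2              // pasted in as raw text
const int kHalfWidth = 600 / 2;         // a real, typed value

int main()
{
    std::cout << 1000 / HALF_WIDTH << '\n';  // expands to 1000 / 600 / 2, prints 0
    std::cout << 1000 / kHalfWidth << '\n';  // prints 3 as expected
    return 0;
}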

As Ready4Dis said, both should work, and imo you should use the one that works best for you.

I use defines pretty liberally in most of my code; I can't ever remember using a const, or indeed needing one.

A defined macro like SCRNW would be replaced by the actual value at compile time, so it's kind of like hard-coding numbers into the file but giving them a name. I develop in VC++, which shows the defines in the source code as normal.

A const requires memory to store the value, whereas a macro is pasted into the code at every place its name appears in the source file, each instance adding to the size of the executable. So I suppose consts would be more memory efficient if you use the constant multiple times throughout a source file, because there is only one piece of memory that gets looked up, whereas defines are hard coded, taking up the memory needed to store the value as binary each time.
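A rough sketch of that difference (kWidth and WIDTH are made-up names): a const is a real object with storage that you can take the address of, while a define is only text.

const int kWidth = 600;     // an actual object with storage; you can point at it
#define WIDTH 600           // just text; every use is the literal 600

const int* p = &kWidth;     // fine
// const int* q = &WIDTH;   // would not compile: expands to &600, and you
                            // cannot take the address of a literal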

I feel as if I'm going round in circles, so I can't guarantee my explanation is correct. Anyone else care to add/amend?

[edited by - sckoobs on August 10, 2003 6:38:52 PM]

Guest Anonymous Poster
void SomeFunction()
{
    float SomeVar = 5.5;
}


If you use

const int SomeVar = 555;

you won't have any problems compiling or using the code, while

#define SomeVar 555

essentially tells C++

float 555 = 5.5;

which results in compile-time errors that can be very cryptic.
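Putting it together, a minimal sketch of the clash described above (the second version is commented out because it will not compile):

// Version 1: compiles fine, the local float just shadows the global const
const int SomeVar = 555;

void SomeFunction()
{
    float SomeVar = 5.5f;      // a new local variable named SomeVar
}

// Version 2: fails, because the preprocessor rewrites the function body
// #define SomeVar 555
// void SomeFunction()
// {
//     float SomeVar = 5.5f;   // becomes: float 555 = 5.5f;  -> cryptic error
// }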
