Archived

This topic is now archived and is closed to further replies.

Llamasoft.net

Another newbie (gah) question :-)

This topic is 6922 days old which is more than the 365 day threshold we allow for new replies. Please post a new topic.


Recommended Posts

Hey everyone again, just a quickie this time... Exactly *WHY* are #defines so good? I see a lot of snippets of people's code using them for constants like BALL_WIDTH X, but why not make a const int (or what have you) named BALL_WIDTH = X? Hmm? Are defines faster? Do they take less memory, or what? cheers all, Nick - Head Designer, Llamasoft.net -- Visit our website... Llamasoft.net Games, goodies, and ingenuity

#define is the old C way to define constants. The newer (and better) C++ way to define constants is with "const". Most people just use #define out of habit.

<(o)>

You could use const back in plain C too. The difference between const and #define is that a const object is still a variable, whereas with #define the preprocessor replaces the identifier with whatever was defined, wherever it appears. Also, with #define you can do more than just say #define BALL_WIDTH 5 -- you can define an entire block of code:

#define dummy_macro DrawSomething(); \
x = 5 + y; \
DoSomethingnow(x);

Domini

And the difference between const in C and const in C++ is that in C a const object is just a read-only variable (you can't use it where a compile-time constant is required, such as an array size), whereas in C++ a const int initialized with a constant expression is itself a compile-time constant.
And use inline functions instead of macros.

It's NOT a matter of preference! #define and const are totally different! #define simply replaces one piece of text with another before compilation, while const doesn't!!! For example:

#include <stdio.h>

#define PRINT_SOME_STUFF printf("Hello!");
#define PRINT_SOME_MORE printf("Hi Again!");

int main()
{
    PRINT_SOME_STUFF
    PRINT_SOME_MORE
    return 0;
}


This would work, since the preprocessor replaces every PRINT_SOME_STUFF with printf("Hello!"); and every other define with its associated text. Those replacements are made before compilation, so the code compiles fine. const can't do that. This:

const char stuff[] = "printf(\"Hello!\");";

int main()
{
    stuff
}


HA! Like hell this would work! You'd get errors galore! The string is just data; writing stuff on its own line doesn't make the compiler run what's inside it.

See, they are totally different.

PCMCIA - People Can't Memorize Computer Industry Acronyms
ISDN - It Still Does Nothing
APPLE - Arrogance Produces Profit-Losing Entity
SCSI - System Can't See It
DOS - Defunct Operating System
BASIC - Bill's Attempt to Seize Industry Control
IBM - I Blame Microsoft
DEC - Do Expect Cuts
CD-ROM - Consumer Device, Rendered Obsolete in Months
OS/2 - Obsolete Soon, Too.
WWW - World Wide Wait
MACINTOSH - Most Applications Crash; If Not, The Operating System Hangs

Everything beginning with a # is a preprocessor directive; everything else is handled by the compiler itself.

Visit our homepage: www.rarebyte.de.st

GA

Actually, the 'proper' replacement for #define constants in C++ is enum. To answer the original question though: #define'd constants are pasted directly into the code, whereas a variable may have to be loaded from memory, and the latter is slower.

-goltrpoat


--
Float like a butterfly, bite like a crocodile.

It kind of makes me ill seeing people put printf everywhere. Does anyone else feel a bit sick when they see printf? Maybe that's because I never learned C, only C++. cout seems so much more friendly.

Well well this is my sig
` Treize Khushrenada

Another big difference between const and #define is that unless you spell out the type in the define (with a literal suffix, say), the type follows from the form of the literal.

const short bob = 0; // this is a short
#define bob 0        // this is just the token 0 -- an int literal

(Note: don't put a semicolon after the 0 in the define, or it gets pasted into every use.) If you use the defined value, implicit conversions can occur without it being obvious. (Slowing things down!)

quote:
Original post by Treize

It kind of makes me ill seeing people put printf everywhere. Does anyone else feel a bit sick when they see printf? Maybe that's because I never learned C, only C++. cout seems so much more friendly.


Well I started out in C++ and then learned C, and now I can't stop using sprintf (vsprintf in Win32), because this function is so great!

And yes, I think cout is more friendly in the beginning, but after some time, once you have learned sprintf, you'll find it just as easy.

