
#define gridSz 16 versus const int gridSz = 16; ????


Old topic!
Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

10 replies to this topic

#1 simulator   Members   -  Reputation: 122


Posted 02 August 2001 - 11:23 AM

Hi - having recently discovered #defines, I'm wondering why anyone would use constant variables instead. What do you use? Am I missing something? Is there any difference, speed-wise? I would imagine #defines are better, because they become part of the compiled program, versus constant variables that have to be accessed separately? Thanks

Edited by - simulator on August 2, 2001 6:26:01 PM
Edited by - simulator on August 2, 2001 6:30:07 PM


#2 Buster   Members   -  Reputation: 100


Posted 02 August 2001 - 11:27 AM

Defines get substituted in before your program compiles.

You can even do things like this:

#define printg printf

Although I don't know why you'd want to!

#3 simulator   Members   -  Reputation: 122


Posted 02 August 2001 - 11:35 AM

Hi & thanks,

Yes, I know how defines work...
Perhaps constant variables would be those defined at run time, e.g.:


const long lPitch = /* run-time value */;

(i.e. where lPitch can only be determined once the program is running) (not sure this is even permitted... must try it)


thanx


--- yes, it works - so that must be why?

But still, even for plain, known-ahead-of-time constants, what do most people use: #define?

Edited by - simulator on August 2, 2001 6:42:59 PM

Edited by - simulator on August 2, 2001 6:49:10 PM

#4 Beer Hunter   Members   -  Reputation: 712


Posted 02 August 2001 - 11:41 PM

Constant values are placed inline in the same way that #defines are, so there's no speed difference. I use const for constants, and #define for other things, but both are quite acceptable. Do what looks nice to you.

#5 deadlinegrunt   Members   -  Reputation: 123


Posted 06 August 2001 - 08:55 PM

Don't forget about type safety either...

YAP-YFIO
-deadlinegrunt

#6 Normie   Members   -  Reputation: 122


Posted 07 August 2001 - 07:46 AM

Here's the difference:
Constants are just like normal variables, with the exception that they cannot be changed.

Defines are simply "substitutions" in code. Take an example:

#define NUMBER 23
...
array[NUMBER] = 113;

is literally exactly the same as

array[23] = 113;

However, defines are not type-checked. Therefore, a number of goofy problems may get right past the compiler and linker (not positive about that, but pretty damned sure). Take an example:

#define NUMBER "AKLSJFD"

array[NUMBER] = 113;

Obviously, "AKLSJFD" isn't a valid array index, but since it's defined in a macro, it might get by the compiler. (Again, not dead sure about this.)

To summarize: either one is fine, but just remember that #defines are less safe than consts.

-Normie

"But time flows like a river... and history repeats."
...
So...what about beaver dams?

#7 simulator   Members   -  Reputation: 122


Posted 08 August 2001 - 07:14 PM

Hmm- OK good points, thank you

#8 Beer Hunter   Members   -  Reputation: 712


Posted 08 August 2001 - 08:59 PM

Normie - you're actually mistaken there - #defines are, as you said, substitutions. So:

#define NUMBER "AKLSJFD"
array[NUMBER] = 113;

would become:

array["AKLSJFD"] = 113;

...which raises a compiler error. However:

#define INCREMENT(a, b) a += b

If you pass a char* and a float to that macro, you'll get an error which doesn't seem to make sense ("illegal use of floating point"). And that's the only problem I've ever seen coming from lack of type safety. It's not actually as bad as some people make out.

#9 brad_beveridge   Members   -  Reputation: 122


Posted 12 August 2001 - 02:48 PM

#defines are also nasty in that if you do something like

#define IncCheck(a,b) ((a++ < b++) ? a++ : b++)

and then

IncCheck(i, j)

both i and j will be incremented at least once, because the direct substitution expands to

((i++ < j++) ? i++ : j++)

The const keyword was introduced to allow the compiler to type-check arguments (you can't do that with macros), and the inline keyword was introduced to fix the expansion problems that macros suffer from. The only time I'd recommend using macros is conditional compilation, e.g.:

#if defined _DEBUG
printf("This is a debug version");
#endif

Brad

#10 Beer Hunter   Members   -  Reputation: 712


Posted 12 August 2001 - 09:50 PM

brad_beveridge: If you wrote that as an inline function, using the same code, you'd get the exact same logic error. I think what you meant was more like this:

#define Check(a, b) ((a > b) ? a : b)
Check(a++, b++);

Which would of course cause an error. But show me someone who would write something like that and not see the problem within a few minutes, and I'll show you someone who needs to learn the basics of C.

There are still advantages that macros have over inline functions. In some cases, it's just that it requires less code than the inline function, and in some cases, nothing can go wrong:

#define CREATEWINDOW(Title) CreateWindowEx(0, "randomclassname", Title, WS_VISIBLE, 0, 0, 640, 480, NULL, NULL, GetModuleHandle(NULL), NULL)

And although the following isn't the best example, there are some things that macros can do that inline functions cannot:

// Declare and clear a DirectX structure:
#define DECLARE_STRUCT(type, var) type var; ZeroMemory(&var, sizeof(var)); var.dwSize = sizeof(var)


In summary, use macros where they're appropriate, and inline functions where they're appropriate. A macro should use each parameter exactly once (to avoid the Check(a++, b++) problem), and should be commented to describe the parameters if they are not obvious.

#11 brad_beveridge   Members   -  Reputation: 122


Posted 13 August 2001 - 01:54 PM

Oops - my bad, Beer Hunter - you wrote what I meant to write. I agree that macros can be good, but they have to be used right. We use them heavily where I work to mask certain things (we use ANSI C); sometimes they're great, at other times a nightmare. Also, VC++ doesn't cope all that well with them, especially when debugging.

Brad



