deadlydog

#define???

Recommended Posts

Hi, I'm just learning DirectX programming from the book I bought, "Tricks of the Windows Game Programming Gurus", and for a lot of things the author uses C code. I've never taken C, though, only C++. He declares many things using #define. Now, I know what this does, but when I try changing his C code to C++, I get errors. For example, he declares:

// defines for windows
#define WINDOW_CLASS_NAME "WINCLASS1"

// default screen size
#define SCREEN_WIDTH  640   // size of screen
#define SCREEN_HEIGHT 480
#define SCREEN_BPP    8     // bits per pixel

I tried using things like:

string WINDOW_CLASS_NAME = "WINCLASS1";

and

const int SCREEN_WIDTH = 640;

and so on, except whenever I changed any of them, I would get errors. His code works fine the way it is, so I'm not sure what the problem is when I try to use C++ code instead. So my question is: is it possible to code this stuff using C++, and if so, how? What am I doing wrong?

Thnx - Dan

He who laughs, lasts
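
For illustration, a minimal sketch of direct C++ equivalents, assuming the surrounding Win32 code expects a plain const char* for the class name rather than a std::string:

// const variables carry a real type, unlike #define text substitution
const char WINDOW_CLASS_NAME[] = "WINCLASS1";  // a char array decays to const char*
const int  SCREEN_WIDTH  = 640;                // size of screen
const int  SCREEN_HEIGHT = 480;
const int  SCREEN_BPP    = 8;                  // bits per pixel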

Guest Anonymous Poster
#define is a C++ command too; it is used to create an unchangeable (program-wise) value that you can easily edit in your code if needed. They are not meant to be variables like you tried to make them.

Guest Anonymous Poster
First, here's an important engineering principle that will save you untold grief throughout your life:

"If it ain''t broke, don''t fix it."

Now to your problem. The STL string type is not equivalent to an array of chars, so you will get errors if a C function is expecting "char *" and you pass it a string. The "const int" substitution should work though -- what exactly is the error?
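
A small sketch of that mismatch; register_class here is a hypothetical stand-in for a C-style API such as the Win32 ones:

#include <cstdio>
#include <string>

// hypothetical C-style function expecting a char pointer
void register_class(const char* name) { std::printf("registering %s\n", name); }

int main() {
    std::string s = "WINCLASS1";
    // register_class(s);        // compile error: std::string does not convert to const char*
    register_class(s.c_str());   // OK: c_str() returns a const char*

    const char name[] = "WINCLASS1";
    register_class(name);        // OK: the array decays to const char*
}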

quote:
Original post by Anonymous Poster
#define is a C++ command too; it is used to create an unchangeable (program-wise) value that you can easily edit in your code if needed. They are not meant to be variables like you tried to make them.

#define is a relic from the days of C. It has its uses, but const variables in C++ are better, because they are type-safe. #define macros rely on dumb and blind text replacement.

This is not to say that they aren't useful - to C programmers. In C++, you're usually better off using const variables and inline functions.
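
A quick sketch of the contrast; MAX_MACRO and max_int are made-up names for the comparison:

#include <cstdio>

#define MAX_MACRO(a, b) ((a) > (b) ? (a) : (b))  // blind text replacement

inline int max_int(int a, int b) { return a > b ? a : b; }  // type-checked, each argument evaluated once

int main() {
    int i = 3;
    int m = MAX_MACRO(i++, 2);  // expands to ((i++) > (2) ? (i++) : (2)); i++ runs twice, so m == 4 and i == 5
    int n = max_int(i, 2);      // i is evaluated exactly once: n == 5
    std::printf("i=%d m=%d n=%d\n", i, m, n);
}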

As for the error, the second AP is right.

quote:
Original post by Miserable

quote:
Original post by Anonymous Poster
#define is a C++ command too; it is used to create an unchangeable (program-wise) value that you can easily edit in your code if needed. They are not meant to be variables like you tried to make them.

#define is a relic from the days of C. It has its uses, but const variables in C++ are better, because they are type-safe. #define macros rely on dumb and blind text replacement.

This is not to say that they aren't useful - to C programmers. In C++, you're usually better off using const variables and inline functions.

As for the error, the second AP is right.

I have a question, just as a newb to all of this. If #define does a text replacement, and that is a preprocessor action, won't the compiler still catch it?

For example:


  
#define MyCharacter x

char x = 'a';
int y = MyCharacter;

Won't the preprocessor replace "MyCharacter" with "x", and then won't the compiler still catch the error?

In this case, the compiler will see,


char x = 'a';
int y = x;


This is all handled by the pre-processor, so the compiler won't issue a warning.

It really is dumb text replacement.
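
A tiny demonstration of how dumb it is; BORDER is just an illustrative name:

#include <cstdio>

#define BORDER 4 + 4            // pure text: no parentheses, no type

int main() {
    int c = BORDER * 2;         // expands to 4 + 4 * 2, so c == 12, not 16
    std::printf("%d\n", c);     // prints 12
}

Running the file through the preprocessor alone (for example, g++ -E) shows the expanded text the compiler actually receives.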


But if the compiler sees:

int y = x;

Won't it say "Hmmm...well x is a char" and then shoot an error back? I don't mean to be difficult. I honestly don't know.

Guest Anonymous Poster
#define is very useful when used correctly. E.g., if you have a value that never changes throughout your code, it is more readable to use a #define, and it also gives you the ability to change every instance of that value by simply changing the #define itself.

As for the quote "#define is a relic from the days of C": I can verify that C is alive and well in the commercial games industry! Too many people get caught up with "C++ is better than C" etc., when people should be looking at the application of the language to their problem.

But anyway! You may very well be better off using const vars instead; I just wanted to point out the positive use of #defines from a newbie perspective.

quote:
Original post by Utwo
But if the compiler sees:

int y = x;

Won't it say "Hmmm...well x is a char" and then shoot an error back? I don't mean to be difficult. I honestly don't know.


It probably won't. char is often used as a small integer, so the conversion to another integer type is silent.
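
A two-line sketch of that silent conversion (the printed value assumes an ASCII system):

#include <cstdio>

int main() {
    char x = 'a';
    int  y = x;              // implicit char-to-int conversion; compiles without complaint
    std::printf("%d\n", y);  // prints 97, the ASCII code of 'a'
}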

Guest Anonymous Poster
...this should've been in my last post!...

The compiler will take the value stored in x (the ASCII value of 'a') and assign that to y, which is reasonable. Only in y it would be an int value rather than a char. Hope that makes sense :op
