Archived

This topic is now archived and is closed to further replies.

Orpheum

Macros suck!

Recommended Posts

I'm having some trouble with a macro and I can't figure it out. My macro is defined like this:

#define RGB16bit(r, g, b) (USHORT)((b / 8) + ((g / 4) << 5) + ((r / 8) << 11))

then it is called like so:

BYTE blue = /* blah blah */, green = /* yadda yadda */, red = /* foo foo */;
RGB16bit(red, green, blue);

When I try to compile, I get a syntax error: ')'. I know it is not a problem of mismatched parentheses, but when I take out the offending ')', I get a macro invocation error because it is expecting a closing paren!! I don't regularly use macros, so it's probably something stupid, but it has evaded me so far... any thoughts??

You say that you do this:

BYTE red = 10,
green = 20,
blue = 30;

BUT I think that you're doing this:

BYTE red = 10,
green = 20,
blue = 30

Note: I think you've missed the semicolon after 30. Take a careful look at your code to see whether the semicolon is missing. That's most likely the problem, even though it might be something else. If you do have the semicolon there, try a Rebuild All and see if it still does the same thing.

----------------------------------------------
That's just my 200 bucks' worth!

..-=gLaDiAtOr=-..

Yes, macros do suck.

For instance, one problem with your macro is that you haven't enclosed all of the arguments in parentheses. This can cause severe problems due to order of operations. For instance, if you tried:

RGB16bit (1+1, 2+2, 3+3)

you would get ((1+1/8) + ((2+2/4)<<5) + ((3+3/8)<<11))

Since they're all integers, the divisions bind before the additions, so you would really be getting RGB16bit (1, 2, 3) instead of RGB16bit (2, 4, 6).

Macros suck. I would use inline functions instead, but it's your pain.

I expect something else is going on, because I tried your code and it worked fine on my machine. Here's my test program, with parentheses around the arguments so you don't get the problem I described above:

typedef unsigned short USHORT;
#define RGB16bit(r, g, b) (USHORT)( ((b) / 8) + \
(((g) / 4) << 5) + \
(((r) / 8) << 11))
int main ()
{
USHORT a = RGB16bit (8, 16, 32);
return 0;
}

In the debugger, a evaluates to 0x0884, which looks right to me.

Maybe USHORT's not defined? Problems on the line above or below the #define? Did you forget a semicolon at the end of a class or struct? Look for those kinds of errors.

Better yet, use an inline function:
inline USHORT RGB16bit (USHORT r, USHORT g, USHORT b)
{
return (b / 8) + ((g / 4) << 5) + ((r / 8) << 11);
}

Guest Anonymous Poster
Solution: don't use macros. They are evil old-style C and should be cast into hell where they belong.
