why do they use caps for the return values?

Started by BiGF00T
9 comments, last by gumpy 18 years, 10 months ago
Hey, this might be a rather stupid question, but I've been wondering about it for a while now... I just read the D3D tutorial on the MSDN page, which is pretty easy to understand, BUT: I noticed that all the return values are VOID, INT etc. When I have a look at them it says "#define VOID void" or "typedef int INT". wtf? Is there a good reason to do this? I see no point or advantage in defining something as the same thing, except if one enjoys writing caps (which I don't!). I can understand people who define FALSE as 0 and TRUE as 1, but the int INT thing isn't very clear to me. It doesn't help you understand anything better, and pressing the shift key while typing doesn't make anything easier... Can anyone enlighten me? Thanks in advance :) BiGF00T
Now get down on your hands and knees and start repeating "Open Source Good, M$ Evil", smacking your head against the pavement after each repetition. Once you have completed your training you may change your first name to GNU/, to show that you are free from the slavery of the closed source world. -Michalson
As far as I know, it's for future compatibility, in case they one day decide to define INT as, say, a 64-bit int instead of the standard 32-bit one. It just gives them the option to change the underlying data types in the future.
They're supposed to be redefinable for compatibility with future systems and various other hacks...
INT and VOID may be pretty extreme examples (especially VOID, why on earth would you ever want to redefine that?) but I suppose Microsoft prefers to be consistent and wrap everything up in their own custom abstraction layer.

Quite a few types have already changed between Win16, Win32 and Win64. Microsoft's decision to keep ints as 32-bit on 64-bit platforms has probably clashed with other compilers already, so keeping INT as an explicit 32-bit type in public headers should help ensure binary compatibility.
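A rough sketch of the idea (this is a hypothetical illustration, not the actual windef.h/winnt.h contents; the macro and type names are just stand-ins for how such a layer can pin down sizes per platform):

// Hypothetical sketch: user code only sees the alias, the underlying
// type can change per target without touching user code.
#if defined(_WIN64)
    typedef __int64          INT_PTR;   // pointer-sized integer on 64-bit
    typedef unsigned __int64 UINT_PTR;
#else
    typedef int              INT_PTR;   // pointer-sized integer on 32-bit
    typedef unsigned int     UINT_PTR;
#endif

typedef int INT;    // stays 32-bit on both, so struct layouts don't shift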
Typically, #define macros and "constants" are put in all caps. This helps identify them as being defined using #define. Almost every type used in the Microsoft functions is a type created using #define or typedef. This basically means there is one place they need to modify when they want to change one of these. It also allows them to hide some of the Unicode/ASCII stuff.
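The Unicode/ANSI switching is exactly this kind of macro/typedef layer. A simplified sketch of the pattern (not the literal header text):

// Simplified version of how the Windows headers switch between
// ANSI and Unicode; not copied verbatim from <winnt.h>/<winuser.h>.
#ifdef UNICODE
    typedef wchar_t TCHAR;
    #define SetWindowText SetWindowTextW   // expands to the wide-char version
#else
    typedef char TCHAR;
    #define SetWindowText SetWindowTextA   // expands to the ANSI version
#endif

// User code just calls SetWindowText(hwnd, text) and gets whichever
// variant matches the build settings.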

"I can't believe I'm defending logic to a turing machine." - Kent Woolworth [Other Space]

OK, at least the INT thing makes sense now, and as you say, they want to keep it consistent. Later people would complain "why do I have to write >INT< but >void<? that makes no sense!!"
Thanks for your help ;) now I can sleep peacefully again :P

So if I just use the uppercase thingies in the future, I'll certainly have no problems, and if I ever find a machine where the macros aren't defined, I can redefine them...
Now get down on your hands and knees and start repeating "Open Source Good, M$ Evil", smacking your head against the pavement after each repetition. Once you have completed your training you may change your first name to GNU/, to show that you are free from the slavery of the closed source world. -Michalson
Quote:Original post by BiGF00T
OK, at least the INT thing makes sense now, and as you say, they want to keep it consistent. Later people would complain "why do I have to write >INT< but >void<? that makes no sense!!"
Well, someone probably reasoned that "Why should we define every single built-in type except void? That's just so inconsistent!!".
Yes, rather make them all look the same.
What would your advice be? Use the stuff I'm used to, or switch to the fancy new caps redefines? Or might there be situations where I should prefer one over the other?
Now get down on your hands and knees and start repeating "Open Source Good, M$ Evil", smacking your head against the pavement after each repetition. Once you have completed your training you may change your first name to GNU/, to show that you are free from the slavery of the closed source world. -Michalson
Quote:Original post by BiGF00T
what would your advice be? use the stuff I'm used to or switch to the fancy new caps redefines? or might there be times when I should prefer one over the other?
I'd say stay away from Windows' types unless you're interfacing directly with one of its APIs. There's really no guarantee about what will happen to Windows' types in the future, and they're so generic that they're virtually guaranteed to clash with other libraries/systems.

If you *really* need compatibility on that level (which you almost certainly don't, at least not for the majority of your code), then I advise you to create your own set of types with some fairly unique prefix.

For most of my own projects I define a set of types with exact sizes (you can base them on C99's stdint.h if you're lucky enough to have it) for various IO and for when I need 64-bit integers. For the rest I simply assume at least 32-bit ints (you won't encounter anything less on modern systems) and work with the built-in types. Oh, and size_t/ptrdiff_t/intptr_t can also do wonders for compatibility.
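Something along these lines, as a sketch (the "my_" prefix, the type names and the HAVE_STDINT_H config macro are all made up for the example; adjust the fallback branch to whatever your compilers actually provide):

// Project-local fixed-size type header. Pick a prefix that is unlikely
// to clash with other libraries.
#if defined(HAVE_STDINT_H)       // hypothetical config macro
    #include <stdint.h>
    typedef int32_t  my_i32;
    typedef uint32_t my_u32;
    typedef int64_t  my_i64;
    typedef uint64_t my_u64;
#else
    // Fallback guesses for compilers without stdint.h; verify these with
    // a compile-time size check on each target you care about.
    typedef int                my_i32;
    typedef unsigned int       my_u32;
    typedef long long          my_i64;
    typedef unsigned long long my_u64;
#endif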

Binary compatibility for libraries is a whole topic on its own. And using fixed-size types is just one of many messy things involved...
For your own stuff, I would recommend using typedef instead of #define. It's a lot cleaner, and is actually handled by the compiler (I believe), instead of the preprocessor.
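The classic illustration of why typedef behaves better than #define for type aliases (PCHAR_MACRO and pchar_t are made-up names for this example):

// #define is plain text substitution, so only the first variable below
// ends up being a pointer:
#define PCHAR_MACRO char*
PCHAR_MACRO a, b;        // expands to: char* a, b;  -> b is just a char

// typedef creates a real type name, so both variables are pointers:
typedef char* pchar_t;
pchar_t c, d;            // both c and d are char*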

Also, I would only use the Microsoft-named stuff when you absolutely need to. Otherwise, if you know what you have is the same as theirs, just cast to it.

"I can't believe I'm defending logic to a turing machine." - Kent Woolworth [Other Space]

Thanks to all of you. I never liked typing caps, so it'll be no problem to return to my usual habit of using the "original" lowercase types. If I ever encounter problems, I'll know what might be causing them.
Now get down on your hands and knees and start repeating "Open Source Good, M$ Evil", smacking your head against the pavement after each repetition. Once you have completed your training you may change your first name to GNU/, to show that you are free from the slavery of the closed source world. -Michalson

This topic is closed to new replies.
