
So what is the point of HDC and HWND?

As far as I know, it's because they might change the underlying type later. That it's an int now tells you nothing about the future. It's the same with WPARAM and LPARAM: they used to be WORD/LONG, but now they're UINT/LONG. I think I'm right, anyway.
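For reference, here is roughly how those typedefs have looked in the Windows headers over the years (paraphrased from memory rather than copied from any particular SDK, so treat the exact lines as approximate):

#include <windows.h>

// 16-bit Windows:   typedef WORD     WPARAM;   typedef LONG     LPARAM;
// Win32:            typedef UINT     WPARAM;   typedef LONG     LPARAM;
// 64-bit headers:   typedef UINT_PTR WPARAM;   typedef LONG_PTR LPARAM;

// A window procedure written against the typedef names compiles
// unchanged no matter which definition is in effect:
LRESULT CALLBACK WndProc(HWND hWnd, UINT msg, WPARAM wParam, LPARAM lParam);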

A polar bear is a rectangular bear after a coordinate transform.

As in many other cases, distinct types exist so that the system can identify which operations are valid on that piece of data. What you're asking is akin to asking "why are there chars? Why not just use shorts?"

As for what HWND and HDC do: HWND is a handle to a window, and HDC is a handle to a device context. A device context is the context into which the information held in the window can be drawn. For example, printing to a monochrome printer naturally requires a different device context than displaying on an SVGA monitor.
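To make that concrete, here is a minimal Win32 sketch of how the two relate; GetDC/ReleaseDC and the GDI calls are the standard API, but the function itself is only an illustration:

#include <windows.h>

void DrawSomething(HWND hWnd)
{
    // Ask Windows for a device context targeting this window's client area.
    HDC hdc = GetDC(hWnd);
    if (hdc == NULL)
        return;

    // Drawing goes through the HDC, not the HWND itself.
    TextOut(hdc, 10, 10, TEXT("Hello"), 5);
    Rectangle(hdc, 50, 50, 150, 100);

    // Hand the context back when finished with it.
    ReleaseDC(hWnd, hdc);
}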

HTH
-fel

You should get the book "Code Complete". It has a great chapter on the power of naming conventions and typedef-ing.


_________________Gecko___
Gecko Design

It makes for more readable, memorable, and meaningful code. If you see a function prototype that takes an HWND, you have a much better idea of what to pass it than if it took a plain int. The same goes for HDC or any of the other numerous typedef'd types Windows uses.

It also lets you change underlying data types quickly, without a lot of error-prone search-and-replace. Imagine you had a 16-bit graphics engine and wanted to upgrade to 32-bit: you'd have to go through replacing 'short's with 'int's, making sure you didn't change anything that actually needed to stay a short. If you instead wrote 'typedef short PIXEL' and used PIXEL wherever you wanted a 16-bit value to go to the screen, you could switch every instance to int just by changing that single typedef line.
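Something like this, where PIXEL and PutPixel are just names made up for the example:

typedef short PIXEL;      // 16-bit engine today
// typedef int PIXEL;     // switch to 32-bit by editing only this line

void PutPixel(PIXEL* framebuffer, int pitch, int x, int y, PIXEL colour)
{
    // Every call site is written against PIXEL, so none of them need
    // to change when the underlying type does.
    framebuffer[y * pitch + x] = colour;
}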

Why use typedef instead of #define? (typedef short PIXEL instead of #define short PIXEL). What's the difference? Which one is better?

/. Muzzafarath

And it would be
#define PIXEL short
not
#define short PIXEL
in case you wanted to use #defines (maybe you like buggy code), hehe. Typedefs are better than macros (#defines) in this context, and const is a better way to create a constant value than #define, because you can specify the type. Just my opinion, so write however you feel comfortable unless you are forced to do otherwise.
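A small sketch of the difference, using char* rather than PIXEL because pointers are where the textual substitution really bites (the names are made up for illustration):

#define PSTR_DEFINE char*        // plain text substitution by the preprocessor
typedef char* PSTR_TYPEDEF;      // a real type name the compiler understands

PSTR_DEFINE  a, b;   // expands to: char* a, b;  -- only a is a pointer, b is a char
PSTR_TYPEDEF c, d;   // both c and d are char*, as intended

// A typedef also respects scope and shows up by name in the debugger,
// whereas a #define is gone before the compiler ever sees it.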

Hargle
