So what is the point of HDC and HWND?

6 comments, last by King 24 years, 1 month ago
I really don't get the point of HWND. Why do you need it, and HDC along with it? Why can't you just use int? - king171@hotmail.com
- http://www.cfxweb.net/mxf/
As far as I know, it's because they might change the underlying type later. The fact that it's an int now tells you nothing about the future. Kind of like WPARAM and LPARAM: they used to be WORD and LONG, but now they're UINT and LONG. I think I'm right, anyway.
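For example, a message handler written against the typedef names rather than the raw integer types kept compiling when the headers changed. This is only a sketch (MyWndProc is a made-up name, and the exact underlying typedefs vary by SDK version):

#include <windows.h>

// Written against WPARAM/LPARAM instead of the raw integer types,
// so nothing here has to change when the underlying widths do.
LRESULT CALLBACK MyWndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg)
    {
    case WM_DESTROY:
        PostQuitMessage(0);   // quit the message loop when the window dies
        return 0;
    default:
        return DefWindowProc(hwnd, msg, wParam, lParam);
    }
}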

A polar bear is a rectangular bear after a coordinate transform.
As in many other cases, types are created so that the system can identify the operations that can be done with that piece of data. What you're asking is akin to asking "why are there chars? why not just use shorts?"

As for what HWND and HDC do: HWND is a handle to a window, and HDC is a handle to a device context. A device context is a context into which the information held in the window can be drawn. For example, if you are printing on a monochrome printer, you would naturally need a different device context than the one used to display the window on an SVGA monitor.
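As a rough sketch (DrawHello is just a made-up name here, and it assumes you already have a valid HWND from CreateWindow), the window handle gets you the handle to its device context, and the DC is what you actually draw on:

#include <windows.h>

void DrawHello(HWND hwnd)               // HWND says which window
{
    HDC hdc = GetDC(hwnd);              // HDC says where the drawing goes
    if (hdc != NULL)
    {
        TextOut(hdc, 10, 10, TEXT("Hello"), 5);
        ReleaseDC(hwnd, hdc);           // always hand the DC back
    }
}

The same TextOut call would work just as well on a printer DC obtained from CreateDC, which is the whole point of the abstraction.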

HTH
-fel
~ The opinions stated by this individual are the opinions of this individual and not the opinions of her company, any organization she might be part of, her parrot, or anyone else. ~
You should get the book "Code Complete". It has a great chapter on the power of naming conventions and typedef-ing.


_________________Gecko___
Gecko Design

It makes for more readable, memorable, and meaningful code. If you see a function prototype that takes an HWND, you have a better idea of what to pass it than if it took an int. The same goes for HDC or any of the other numerous typedef'd types used.

It also allows quick changing of underlying data types without a lot of error-prone search-and-replacing. Imagine you had a 16-bit graphics engine and wanted to upgrade to 32-bit: you'd have to go through replacing 'short's with 'int's, making sure you didn't change anything that actually needed to stay a short. If you instead wrote 'typedef short PIXEL' and used PIXEL wherever you wanted a 16-bit value to go to the screen, you could switch everything over to int just by changing that single typedef line.
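Something like this, roughly (PIXEL and ClearRow are illustrative names, not anything from the Windows headers):

typedef unsigned short PIXEL;        // 16-bit engine today
// typedef unsigned int PIXEL;       // flip this single line for 32-bit

void ClearRow(PIXEL* row, int width, PIXEL colour)
{
    for (int i = 0; i < width; ++i)
        row[i] = colour;             // every use of PIXEL follows the typedef
}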
Why use typedef instead of #define (typedef short PIXEL instead of #define short PIXEL)? What's the difference? Which one is better?

/. Muzzafarath
I'm reminded of the day my daughter came in, looked over my shoulder at some Perl 4 code, and said, "What is that, swearing?" - Larry Wall
Using a typedef means the compiler sees a real type name, with proper scoping and type checking, whereas a #define is just blind text substitution done by the preprocessor.
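The classic place where this bites you is pointer types (STRING_T and STRING_D are made-up names, just for illustration):

typedef char* STRING_T;
#define STRING_D char*

STRING_T a, b;   // a and b are both char*
STRING_D c, d;   // expands to: char* c, d;  so d is a plain char!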
Creativity is a bloody nuisance and an evil curse that will see to it that you die from stress and alcohol abuse at a very early age, that you piss off all your friends, break appointments, show up late, and have this strange bohemian urge (you know that decadent laid-back pimp-style way of life). The truly creative people I know all live lousy lives, never have time to see you, don't take care of themselves properly, have weird tastes in women and behave badly. They don't wash and they eat disgusting stuff, they are mentally unstable and are absolutely brilliant. (k10k)
And it would be
#define PIXEL short
not
#define short PIXEL
in case you wanted to use #defines (maybe you like buggy code), hehe. Typedefs are better than macros (#defines) in the above context, and const is a better way to create a constant value, since you can specify the type. Just my opinion, so write however you feel comfortable unless you are forced to do otherwise.
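For the constant case, the difference looks something like this (SCREEN_WIDTH is a made-up name):

#define SCREEN_WIDTH_MACRO 640     // no type at all, just text pasted in
const int SCREEN_WIDTH = 640;      // a real int: it obeys scope, shows up in
                                   // the debugger, and the compiler checks its type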

Hargle
