How do I ensure that WORD is always 2 bytes and DWORD always 4 bytes across different hosts?
I've been taught that different CPU architectures define different sizes for the data types.
So, how can I ensure that a variable has the right size?
Just to be clear, they're not talking about different CPU architectures in the sense of going from a Windows machine with an Intel processor to running the same program on an AMD processor. They mean going from an x86-family processor (like your PC) over to an ARM-family processor like your cell phone, or potentially moving from a 32-bit compiler to a 64-bit compiler. You have to actually change architectures, which means a whole lot of things change.
If you are running the same executable on all the different machines, the sizes will be the same.
The other thing to consider is padding inside the structure. Just because a field is 32 bits or 16 bits does not mean the fields are packed the same way. Padding inside structures is an implementation detail in C and C++, but a file format requires one specific layout. Fields could be aligned on byte boundaries, or 2-byte boundaries, or 4-byte boundaries, or 16-byte boundaries, or whatever else the compiler feels like doing. There are compiler-specific commands to adjust that.
Endianness will be another concern if you are going cross-platform. While the values are the same as far as your code is concerned (the value is 12345678), the actual byte pattern in memory can differ. On one platform a 16-bit value may be stored as AB, on another as BA. A 32-bit value may be ABCD or DCBA, or on some middle-endian hardware potentially BADC.
These concerns don't really apply if you are staying on a single architecture, such as building a program that only runs on 32-bit Windows, or only runs on 64-bit Windows.