Question about Potential Assumption

Started by Catafriggm. 3 comments, last by Catafriggm 18 years, 10 months ago
Would it be reasonable to expect size_t and ptrdiff_t to always reflect the effective word size (that is, what the program has access to) of the processor the program is being compiled for? By definition ptrdiff_t is signed and the size of a pointer; by definition size_t is unsigned, and logically it's also the size of a pointer. So an alternate form of my question is: is there ever any reasonable case where the size of a pointer would not be the word size of the target processor (and I need something non-hypothetical)?
Quote: Original post by Catafriggm
Would it be reasonable to expect size_t and ptrdiff_t to always reflect the effective word size (that is, what the program has access to) of the processor the program is being compiled for?

By definition ptrdiff_t is signed and the size of a pointer; by definition size_t is unsigned, and logically it's also the size of a pointer. So an alternate form of my question is: is there ever any reasonable case where the size of a pointer would not be the word size of the target processor (and I need something non-hypothetical)?


Well, the intention was that int would be the "natural" size of an integer on a given processor, so that should be your word size. void * is the most generic pointer type, and therefore generally the largest. On most systems you'll encounter all object pointers are the same size, but you can probably find systems where sizeof(void *) > sizeof(int *). I think the standard would even allow sizeof(int *) > sizeof(void *), but that would just waste space for the int pointer, so you'd only find it in a joke implementation.
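
As a quick sanity check, something along these lines (a minimal C sketch, not from anyone's post here) prints the sizes your particular compiler and target actually use; on 64-bit Windows, for example, int stays 32-bit while the pointers come out 64-bit:

/* Minimal sketch: report the sizes this compiler/target uses.
   The casts to unsigned long are just so plain %lu works everywhere. */
#include <stdio.h>

int main(void)
{
    printf("sizeof(int)    = %lu\n", (unsigned long)sizeof(int));
    printf("sizeof(long)   = %lu\n", (unsigned long)sizeof(long));
    printf("sizeof(int *)  = %lu\n", (unsigned long)sizeof(int *));
    printf("sizeof(void *) = %lu\n", (unsigned long)sizeof(void *));
    return 0;
}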

I think that size_t and ptrdiff_t are unlikely to reflect the effective word size of the processor. size_t is an unsigned type that can represent the size of the largest chunk of memory you can request. ptrdiff_t is a signed type that results from taking the difference between two pointers into the same object (if they don't point into the same object, the difference has no defined meaning). I would expect these to be at least unsigned long and long, respectively, if not larger types. In other words, not representative of the word size.
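
If you do decide to lean on the common case where size_t, ptrdiff_t, and void * are all the same width, you can at least make that assumption fail loudly at compile time. A sketch using the old negative-array-size trick (nothing guaranteed by the standard, just a build-time check of the assumption itself):

/* Sketch: these typedefs refuse to compile if the assumption is wrong,
   because the array size becomes -1. */
#include <stddef.h>

typedef char size_t_is_pointer_sized[sizeof(size_t) == sizeof(void *) ? 1 : -1];
typedef char ptrdiff_t_is_pointer_sized[sizeof(ptrdiff_t) == sizeof(void *) ? 1 : -1];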
Well, I've spent some time looking at this (among many other unrelated things) since posting this topic. VC++ relies entirely on WIN64 to determine what types ptrdiff_t and size_t are: they're 64-bit when WIN64 is defined, otherwise 32-bit (and since modern VC++ doesn't compile 16-bit code, those are the only two cases). I spent way too much time looking through glibc, and found that size_t and ptrdiff_t are based on constants built into GCC (__PTRDIFF_TYPE__ and __SIZE_TYPE__). I still need to find out what those expand to, and in what cases. But those are just a few specific cases.
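
For what it's worth, the shape of what those headers do is roughly the following (a paraphrase, not the actual VC++ or glibc source; my_size_t and my_ptrdiff_t are made-up names, and the compiler-defined spelling of the Windows macro is _WIN64):

/* Rough sketch of the two approaches described above, not the real headers. */
#if defined(_MSC_VER)
  #ifdef _WIN64                          /* 64-bit Windows target */
    typedef unsigned __int64 my_size_t;
    typedef __int64          my_ptrdiff_t;
  #else                                  /* 32-bit Windows target */
    typedef unsigned int     my_size_t;
    typedef int              my_ptrdiff_t;
  #endif
#elif defined(__GNUC__)
  /* GCC predefines these macros to spell out the underlying types it picked. */
  typedef __SIZE_TYPE__    my_size_t;
  typedef __PTRDIFF_TYPE__ my_ptrdiff_t;
#endif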
Why do you need to know?
GCC is really annoying because of its horrible file layout, but I THINK that __SIZE_TYPE__ and __PTRDIFF_TYPE__ are the word size.
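
One way to check without digging through the GCC source tree at all: GCC predefines those macros as textual type names, so you can stringize them and print the result (a small sketch; on a typical 32-bit x86 build it reports unsigned int / int, and on x86-64 it reports long unsigned int / long int):

/* Sketch: ask the compiler what __SIZE_TYPE__ and __PTRDIFF_TYPE__ expand to. */
#include <stdio.h>

#define STRINGIZE2(x) #x
#define STRINGIZE(x)  STRINGIZE2(x)

int main(void)
{
#if defined(__SIZE_TYPE__) && defined(__PTRDIFF_TYPE__)
    printf("__SIZE_TYPE__    = %s\n", STRINGIZE(__SIZE_TYPE__));
    printf("__PTRDIFF_TYPE__ = %s\n", STRINGIZE(__PTRDIFF_TYPE__));
#else
    printf("This compiler doesn't predefine __SIZE_TYPE__ / __PTRDIFF_TYPE__.\n");
#endif
    return 0;
}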

This topic is closed to new replies.
