# Why is sizeof(int) == 4 on an x64 system?


## Recommended Posts

```cpp
#include <iostream>
#include <tchar.h>

int _tmain(int argc, TCHAR** argv)
{
    std::cout << "sizeof(char)      = " << sizeof(char) << std::endl;
    std::cout << "sizeof(TCHAR)     = " << sizeof(TCHAR) << std::endl;
    std::cout << "sizeof(short)     = " << sizeof(short) << std::endl;
    std::cout << "sizeof(int)       = " << sizeof(int) << std::endl;
    std::cout << "sizeof(long)      = " << sizeof(long) << std::endl;
    std::cout << "sizeof(long long) = " << sizeof(long long) << std::endl;
    std::cout << "sizeof(__int64)   = " << sizeof(__int64) << std::endl;
    return 0;
}
```

Output:

```
sizeof(char)      = 1
sizeof(TCHAR)     = 2
sizeof(short)     = 2
sizeof(int)       = 4
sizeof(long)      = 4
sizeof(long long) = 8
sizeof(__int64)   = 8
```
The output does not change even if I compile with the x86 configuration. Why does the int type always have a 4-byte size? Shouldn't it be 8 bytes on an x64 system?

##### Share on other sites
Because too many programs assumed int would always be 32 bits, so it was kept that way for compatibility.

##### Share on other sites
No, nothing forces int to be the same width as the architecture's word size. Keeping int at 32 bits maintains the de facto standard width for the type and thereby improves cross-architecture compatibility.

##### Share on other sites
Ok, I understand why int stayed 32-bit: it was kept that way for compatibility reasons.

I wonder one thing: does summing two __int64 numbers take the same CPU time on x86 and x64 CPUs?

For example:

```cpp
__int64 num1, num2, num3;
num1 = 5;
num2 = 10;
num3 = num1 + num2;
```

Does this code take the same number of instruction cycles on x86 and x64 CPUs?
I mean, on an x64 system, is an __int64 still processed as a pair of 32-bit halves, or is it handled as a single 64-bit value?

##### Share on other sites
Given that it doesn't take the same number of cycles on all x86 processors, and doesn't take the same number of cycles on all x64 processors? No.

##### Share on other sites
That's not up to the language, that's up to the compiler. Compile it and look at the generated assembly.

##### Share on other sites
Perhaps you're confusing int with size_t, which does change. I wouldn't say int stayed 4 bytes for compatibility reasons; it's because the size of int is implementation-defined. There was no reason TO change it, therefore it wasn't changed.

##### Share on other sites
Quote:
Original post by cache_hit: I wouldn't say int stayed 4 bytes for compatibility reasons; it's because the size of int is implementation-defined. There was no reason TO change it, therefore it wasn't changed.
Of course there was a reason to change it. Just like there was a reason to change int from 16 bits to 32 bits once 32 bit processors rolled around. The C89 standard specified that "a 'plain' int object has the natural size suggested by the architecture of the execution environment". To the extent that an architecture with 64-bit registers suggests any natural size, it suggests a 64-bit size.

##### Share on other sites
Quote:
Original post by Battousai: Does this code take the same number of instruction cycles on x86 and x64 CPUs? I mean, on an x64 system, is an __int64 still processed as a pair of 32-bit halves, or is it handled as a single 64-bit value?

Quote:
Original post by Sneftel:
Quote:
Original post by cache_hit: I wouldn't say int stayed 4 bytes for compatibility reasons; it's because the size of int is implementation-defined. There was no reason TO change it, therefore it wasn't changed.
Of course there was a reason to change it. Just like there was a reason to change int from 16 bits to 32 bits once 32-bit processors rolled around. The C89 standard specified that "a 'plain' int object has the natural size suggested by the architecture of the execution environment". To the extent that an architecture with 64-bit registers suggests any natural size, it suggests a 64-bit size.

I completely agree. Why shouldn't the size of int extend to 64 bits? After all, programmers are supposed to write platform-independent code (of course there are exceptions). If the problem was compatibility, vendors could put a note on the product saying something like "Works only on 32-bit systems" and ship a 64-bit alternative. Actually, from what I see on the web, people are already doing this.

##### Share on other sites
Be aware that with GCC on x64 Linux or Mac, `long` is 8 bytes (the LP64 data model), whereas it stays 4 bytes on 64-bit Windows (the LLP64 model); `int` remains 4 bytes on both. The sizes of the basic types are implementation-defined.
