
32 vs 64 bit


Is there any reason not to switch all my code over to 64 bit?


* I hear that 64 bit can be slower for some reason, is this true?

* If I were to switch, would I just convert all my ints to int64s etc? Or keep them as small as makes sense?

* Would not using the 'native' type for the processor be faster? Similar to how bools are often stored as 4 bytes? (wouldn't this be 8 bytes for 64 bit processors?)


Thanks as usual guys!

As far as I know you don't have to explicitly "convert to 64 bit". The differentiation is mostly relevant to the compiler and what instruction set it uses. The only issues you might need to fix in your code are places where you made assumptions about the sizes of, say, size_t and pointers. Just compile your code as 64-bit...
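For example, code that stashes a pointer in a plain int compiles cleanly as 32-bit but silently truncates addresses as 64-bit. A minimal sketch of the kind of fix involved, assuming you really do need to round-trip a pointer through an integer (uintptr_t from <cstdint> is sized for exactly that):

[code]
#include <cstdint>
#include <cstdio>

int main() {
    int value = 42;
    int* p = &value;

    // Classic 32-bit-era assumption, broken on 64-bit because the pointer no
    // longer fits in a 32-bit int, so the round trip truncates the address:
    //   int handle = (int)p;  /* ... */  int* back = (int*)handle;

    // Portable version: uintptr_t is an unsigned integer type wide enough to hold a pointer.
    std::uintptr_t bits = reinterpret_cast<std::uintptr_t>(p);
    int* back = reinterpret_cast<int*>(bits);

    std::printf("%d\n", *back);  // prints 42 on both 32- and 64-bit builds
    return 0;
}
[/code]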


Is there any reason not to switch all my code over to 64 bit?


64-bit programs won't run on a 32-bit Operating System. However if your program requires more than about 2GB of RAM then it won't work on 32-bit anyway.

Many programs with a 64-bit version also have a 32-bit version.


* I hear that 64 bit can be slower for some reason, is this true?


Yes. For example, as pointers double in size you can get more cache misses with some data structures. Also, the compiler's 64-bit code generation may not be as well optimized as its 32-bit code generation.

However, 64-bit builds can also be quicker, since in 64-bit mode you get more registers. The only way to be sure is to test it.


* If I were to switch, would I just convert all my ints to int64s etc? Or keep them as small as makes sense?


Only convert where there's a clear benefit from doing so, or you'll make the cache miss problem mentioned above even worse.

Obviously you will need to convert to 64-bit types where you need to support quantities of data that would overflow a 32-bit value.


* Would not using the 'native' type for the processor be faster? Similar to how bools are often stored as 4 bytes? (wouldn't this be 8 bytes for 64 bit processors?)


No; 64-bit processors handle 32-bit data types just fine. Also, as I noted earlier, bigger data types tend to cause extra cache misses, so using smaller data types can make a program run faster even if the CPU can't process them quite as quickly as bigger types.

If you have a choice between using two 32-bit values or one 64-bit one though, then the single 64-bit value will normally be better. For example, chess programmers like 64-bit because you can fit one bit per square on the board into a single 64-bit register.
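To make that last example concrete, here is a minimal bitboard sketch (purely illustrative, not taken from any particular engine): the whole set of squares a knight attacks fits in one uint64_t, so it can be computed with a handful of register-wide shifts and masks.

[code]
#include <bitset>
#include <cstdint>
#include <cstdio>

using Bitboard = std::uint64_t;  // one bit per square: a1 = bit 0 ... h8 = bit 63

// Squares attacked by a knight on 'square', via shifts and wrap-around masks.
// (Illustrative only; real engines precompute a 64-entry lookup table.)
Bitboard knight_attacks(int square) {
    const Bitboard b      = Bitboard{1} << square;
    const Bitboard not_a  = 0xFEFEFEFEFEFEFEFEULL;  // everything but the a-file
    const Bitboard not_ab = 0xFCFCFCFCFCFCFCFCULL;
    const Bitboard not_h  = 0x7F7F7F7F7F7F7F7FULL;
    const Bitboard not_gh = 0x3F3F3F3F3F3F3F3FULL;
    return ((b << 17) & not_a)  | ((b << 15) & not_h)  |
           ((b << 10) & not_ab) | ((b <<  6) & not_gh) |
           ((b >> 17) & not_h)  | ((b >> 15) & not_a)  |
           ((b >>  6) & not_ab) | ((b >> 10) & not_gh);
}

int main() {
    // A knight on d4 (square index 27) attacks 8 squares; the whole set lives in one register.
    std::printf("%zu squares attacked\n", std::bitset<64>(knight_attacks(27)).count());
    return 0;
}
[/code]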

Indeed, unless you have made assumptions about variable sizes, in many cases simply switching to x64 mode will Just Work.

About the only thing which will definitely change, going to x64 mode, is pointer sizes. Everything else depends on the platform you are using, which includes the compiler.[s] For example, iirc, with MSVC 'int' stays 32bit and 'long' becomes 64bit. [/s] edit: apparently I did not recall correctly, see the post further down :) (However I still wouldn't assume this to be the case, and if I needed a precise/forced size then I'd either typedef my own types or use boost's sized types.)
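For what it's worth, the standard <cstdint> header now covers the same ground as a hand-rolled typedef header or Boost's sized types. A small sketch of which widths are pinned down and which vary with the target (the variable names are just illustrative):

[code]
#include <cstddef>
#include <cstdint>

// Exact widths: identical on 32- and 64-bit builds, safe to use in file formats.
std::int32_t frame_count = 0;   // always 32 bits
std::int64_t byte_offset = 0;   // always 64 bits

// Target-dependent widths: fine for in-memory use, but never assume their size.
std::size_t    element_count = 0;        // typically 32 bits on x86, 64 bits on x64
std::ptrdiff_t element_delta = 0;        // likewise
void*          some_pointer  = nullptr;  // 4 bytes on x86, 8 bytes on x64

static_assert(sizeof(std::int64_t) == 8, "exact-width types really are exact");
[/code]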


[quote name='James Leighe' timestamp='1323949139' post='4894143']
* I hear that 64 bit can be slower for some reason, is this true?


Yes. For example, as pointers double in size you can get more cache misses with some data structures. Also the compiler may not be as well optimized for 64-bit instructions as 32-bit ones.

However they can also be quicker, as in 64-bit mode you get more registers too. The only way to be sure is to test it.
[/quote]

Compilers are also free to make more assumptions about the hardware; MSVC, for example, will use SSE2 instructions where it can because all x64 processors support the instructions.

I totally agree with adam on the alignment issues of structures containing pointers. I once had a structure that was 128 bytes on 32-bit and 136 bytes on 64-bit just because of a single pointer that used up 8 bytes instead of 4 in 64-bit. Calculating the n-th element was significantly faster under 32-bit until I rearranged the variables in the struct.
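A sketch of that effect (hypothetical field names; the exact numbers depend on the compiler's alignment rules, but a typical x64 build pads the first layout to 24 bytes and the second to 16):

[code]
#include <cstdint>
#include <cstdio>

// Careless ordering: on a 64-bit build the 8-byte pointer needs 8-byte
// alignment, so the lone int before it is padded out to 8 bytes.
struct Padded {
    std::int32_t id;     // 4 bytes + 4 bytes padding (64-bit build)
    const char*  name;   // 8 bytes on 64-bit, 4 on 32-bit
    std::int32_t flags;  // 4 bytes + 4 bytes tail padding
};

// Same members, widest first: far less padding.
struct Packed {
    const char*  name;   // 8 bytes
    std::int32_t id;     // 4 bytes
    std::int32_t flags;  // 4 bytes, no padding needed
};

int main() {
    std::printf("Padded: %zu bytes, Packed: %zu bytes\n",
                sizeof(Padded), sizeof(Packed));  // e.g. 24 vs 16 on x64
    return 0;
}
[/code]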


Additionally, in the beginning when I switched from 32-bit to 64-bit I had some bugs with the printf function. Depending on whether you use Linux or Windows, you will need some new format specifiers to refer to 64-bit variables in the format string.
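The portable way around that is to use the <cinttypes> format macros rather than hard-coding "%lld" (glibc) or "%I64d" (older MSVC runtimes). A minimal sketch:

[code]
#include <cinttypes>  // PRId64 and friends (also pulls in <cstdint>)
#include <cstdio>

int main() {
    std::int64_t big = 5000000000LL;  // doesn't fit in 32 bits

    // PRId64 expands to the right conversion specifier for int64_t on the
    // current platform ("lld", "I64d", ...), so the same line works everywhere.
    std::printf("big = %" PRId64 "\n", big);
    return 0;
}
[/code]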

If writing C++, native types should be used only for OS-specific functionality.

For iteration, use size_t or std::allocator<T>::size_type, which is defined in all containers as well.

For container manipulation, prefer the <algorithm> functions, then iterators, then ::size_type, in that order.

For pointer manipulation, ptrdiff_t is defined in a similar manner.
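A small sketch of those conventions together (size_type for indexing, <algorithm> plus iterators where possible, ptrdiff_t for pointer distances), assuming nothing about whether the build is 32- or 64-bit:

[code]
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <vector>

int main() {
    std::vector<int> values = {3, 1, 4, 1, 5};

    // Indexed loop: size_type (size_t for std::vector) is correct on both targets.
    for (std::vector<int>::size_type i = 0; i < values.size(); ++i)
        values[i] *= 2;

    // Preferred: let <algorithm> and iterators hide the index type entirely.
    std::sort(values.begin(), values.end());

    // Pointer manipulation: the difference of two pointers is a ptrdiff_t,
    // which is wide enough on either target.
    const int* first = values.data();
    const int* last  = first + values.size();
    std::ptrdiff_t count = last - first;

    std::printf("%ld elements\n", static_cast<long>(count));  // 5 on both builds
    return 0;
}
[/code]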

Demotion of integer types is rarely used (using short instead of int for values under 32k has fallen out of favor). The choice here should be based on numeric range, not target platform, unless you're performing some really tricky optimizations for specific hardware.

Float vs. double is a mixed bag. Doubles on their own are likely to be faster and more accurate, but the choice will be dictated by memory use and choice of SIMD, so for many uses floats are the natural choice; to avoid excessive conversions, propagate that choice through the entire project. Otherwise, doubles are preferred.

Bit hacks are a no-go. Whether it's cramming pointers into ints, checking signs, or math optimizations, they would be one of the last types of optimization to perform, and it's highly unlikely they'd result in any meaningful improvement.

Another gotcha is serialization. Different padding, complications around unions, different sizes and lengths of types.


Obviously, all of the above applies to standalone C++ code only. Whatever OS requires will be quite a mess.


For example, iirc, with MSVC 'int' stays 32bit and 'long' becomes 64bit.

Microsoft Visual Studio adheres to LLP64, so int and long are always 32 bits. Only the size of pointers changes.
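A quick way to see which data model a given toolchain targets is simply to print the sizes:

[code]
#include <iostream>

int main() {
    // Under 64-bit MSVC (LLP64) this reports int = 4, long = 4, long long = 8, void* = 8.
    // Under 64-bit GCC/Clang on Linux (LP64), long is 8 as well.
    std::cout << "int:       " << sizeof(int)       << " bytes\n"
              << "long:      " << sizeof(long)      << " bytes\n"
              << "long long: " << sizeof(long long) << " bytes\n"
              << "void*:     " << sizeof(void*)     << " bytes\n";
    return 0;
}
[/code]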


L. Spiro


[quote name='phantom' timestamp='1323951311' post='4894151']
For example, iirc, with MSVC 'int' stays 32bit and 'long' becomes 64bit.

Microsoft Visual Studio adheres to LLP64, so int and long are always 32 bits. Only the size of pointers changes.


L. Spiro
[/quote]

I stand corrected :)

[font="Arial"]Microsoft once replied to an inquiry as to why they were reluctant to convert Visual Studio to 64 bits (paraphrased):
"Uh, you know, this is a horrible lot of work, and having done it with Excel, we're scared to do it again soon, as long as it works well just like this. 32 bit apps do run fine under 64bits too, so what is the issue! It's not like any program except maybe Excel where some people have thousands of sheets with tens of thousands of rows and columns needs more than 2GB anyway, either".

There is a grain of truth in that.

Although if you have always been pedantic about using correct types (which can be harder than one anticipates), pedantic about constants (yes, (size_t)-1 is not the same in 32 and 64 bits -- surprise surprise, nor are many other dangerous "common idioms", nor do most bitmasks work as expected... see the sketch below), and never made any assumptions about structure sizes etc., chances are indeed good it will just work.
Though you never know what any of your co-workers or third-party library implementors did. And what's worse, code can actually "work fine", i.e. not crash despite being broken.
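As a concrete sketch of that (size_t)-1 / bitmask class of surprise (the constant names are just illustrative):

[code]
#include <cstddef>
#include <cstdint>
#include <cstdio>

int main() {
    // (size_t)-1 is 0xFFFFFFFF in a 32-bit build but 0xFFFFFFFFFFFFFFFF in a
    // 64-bit one, so comparing it against a hard-coded 32-bit sentinel breaks.
    const std::size_t NOT_FOUND_32 = 0xFFFFFFFFu;  // 32-bit-era "not found" value
    const std::size_t not_found    = static_cast<std::size_t>(-1);
    std::printf("sentinel %s\n", not_found == NOT_FOUND_32
                                     ? "matches (32-bit build)"
                                     : "silently differs (64-bit build)");

    // Bitmask surprise: ~0u looks like an all-ones mask, but it is only 32 bits
    // wide, so the top half of a 64-bit value is silently cleared.
    std::uint64_t value  = 0x123456789ABCDEF0ULL;
    std::uint64_t masked = value & ~0u;
    std::printf("%llx\n", static_cast<unsigned long long>(masked));  // 9abcdef0
    return 0;
}
[/code]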

Despite that, I'm inclined to recommend moving to 64 bits, because the advantages outweigh the disadvantages. The much larger and uniform register set is a big advantage (some algorithms such as curve25519 run orders of magnitude (not just 20% or so) faster in 64 bits, only because the data fits into registers), and the practically unlimited address space can be a big relief. Also, a lot less guesswork and fewer extra code paths for CPU features are involved.

If you read The Old New Thing every now and then and see entries like "question about really stupid thing" and the answer reads "yeah that is historical, for compatibility with 16 bit Windows", you may get an idea of another good reason to move swiftly forward and do a clean cut, if you can afford it. Carrying your old luggage around forever is not always a good thing.


[font="Arial"]
"Uh, you know, this is a horrible lot of work, and having done it with Excel, we're scared to do it again soon, as long as it works well just like this. 32 bit apps do run fine under 64bits too, so what is the issue! It's not like any program except maybe Excel where some people have thousands of sheets with tens of thousands of rows and columns needs more than 2GB anyway, either".


They painted themselves into a corner, simple as that.

Developers would love to use more than 2000-era hardware; they just can't.

Shortcomings are also painfully visible when dealing with more than just the source editor, where things either take a long time or crash. Roslyn is effectively written from scratch, duplicating what VS does.

Mozilla recently finally broke VS to the point where they need to remove code to even have it build. Chrome went through this a long time ago. And this is some of the best-engineered software. Imagine how often VS breaks for typical enterprise bloatware.


[quote name='samoth' timestamp='1324030555' post='4894438']
[font="Arial"]
"Uh, you know, this is a horrible lot of work, and having done it with Excel, we're scared to do it again soon, as long as it works well just like this. 32 bit apps do run fine under 64bits too, so what is the issue! It's not like any program except maybe Excel where some people have thousands of sheets with tens of thousands of rows and columns needs more than 2GB anyway, either".


They painted themselves in corner, simple as that.

Developers would love to use more than 2000-era hardware, they just can't.

Shortcomings are also painfully visible when dealing with more than just source editor, where things either take a long time or crash. Roslyn is effectively written from scratch by duplicating what VS does.[/font]

Mozilla recently finally broke VS to the point where they need to remove code to even have it build. Chrome went through this long time ago. And this is some of best engineered software. Imagine how often VS breaks for typical enterprise bloatware.
[/quote]


Two things: there has been a 64-bit compiler for a long time; it's only the IDE that is 32-bit at this point, and really, is there a need for a 64-bit version? ... That's a lot of text files open!

Second, Mozilla broke the 32-bit compile (actually the link) with PGO enabled! Big difference. PGO effectively holds each OBJ file in memory, so obviously it is going to be a bit of a memory whore. The trade-off is better overall optimization than GCC, which works differently. If PGO wasn't enabled, there never would have been an issue.


Two things: there has been a 64-bit compiler for a long time; it's only the IDE that is 32-bit at this point, and really, is there a need for a 64-bit version? ... That's a lot of text files open!

Second, Mozilla broke the 32-bit compile (actually the link) with PGO enabled! Big difference. PGO effectively holds each OBJ file in memory, so obviously it is going to be a bit of a memory whore. The trade-off is better overall optimization than GCC, which works differently. If PGO wasn't enabled, there never would have been an issue.


I never claimed dark sorcery is the cause of the mentioned issues. Nor that witches should be burned as a solution. Nor that there is any kind of mystery behind it.


Why do I need to know about it? Why is a 32-bit memory limit even mentioned in the era of petabyte datasets?

Phones and some netbooks have 2GB memory and quad cores. A developer these days is likely to consider 16GB a mid-range machine.

Sapir-Whorf: tools determine the type of problems you can tackle, since they are incapable of expressing bigger concepts.


Why are build times still an issue? Why doesn't the IDE simply keep all versions of all changes in memory? It would only be 30GB, that's ~$120, allowing for instant recompilation of only the changed fragments and avoiding the disk bottleneck completely.

Programming tools are the only tools that have not changed in the last 30 years. Compared to what happened to CAD, we still have our equivalent of punch cards, only rendered in high resolution with anti-aliasing.

A modern CAD system, for example, keeps live track of all aspects, from technical details to paperwork, in a transacted, versioned ecosystem across an entire project (say, a small suburb with 5000 houses), from high-level concepts down to every nut and bolt on every hinge of every cabinet, cross-referenced with vendor specs and QA ISO certifications on a per-government basis, shared in real time between the tens of thousands of people involved in the project on any device, from high-end planning workstations to workers on site with iPads, with costs and timelines tracked across all involved parties.

We have git. Yay!

Ok, someone here has a substantially better PC than I do! :D

I mean, that sounds nice and all, but 30GB desktops aren't exactly the norm, or hell, even the fringe. I actually haven't personally seen a >16GB machine yet, outside of dedicated engineering boxes. Even then, at my prior workplace, the CATIA guys were still using machines WAYYYYY below that spec level.

I think calling a 16GB machine average might be more than a wee bit optimistic. Don't get me wrong, I would absolutely love it, but it just doesn't seem to be the case.


I mean, that sounds nice and all, but 30GB desktops aren't exactly the norm, or hell, even the fringe.


The funny thing is, when you read about "building a PC" on the net, you will always hear "don't get more than 8GB since xyz can't utilize it anyway", and on the other end apparently developers write software with the mindset "I can't use more than that since no one has such a machine." :D It's not like getting 16 or even 32GB will significantly increase the cost of the machine (in my case going from 8 to 16 was a 4% increase in cost).


I mean, that sounds nice and all, but 30GB desktops aren't exactly the norm, or hell, even the fringe. I actually haven't personally seen a > 16GB machine yet, outside of dedicated engineering boxes. Even then, at my prior workspace, the CATIA guys were still using machines WAYYYYY below that spec level.

It depends on the industry. I know a couple of music and film guys, and in that line of work, 16GB is considered woefully inadequate for a work machine - as a recent college grad you 'make do' with an 8-core 20GB machine until you can afford to upgrade to a 16-core 32GB machine.


[quote name='Serapth' timestamp='1324054043' post='4894528']
I mean, that sounds nice and all, but 30GB desktops aren't exactly the norm, or hell, even the fringe.


The funny thing is, when you read about "building a PC" on the net, you will always hear "don't get more than 8GB since xyz can't utilize it anyway", and on the other end apparently developers write software with the mindset "I can't use more than that since no one has such a machine." :D It's not like getting 16 or even 32GB will significantly increase the cost of the machine (in my case going from 8 to 16 was a 4% increase in cost).
[/quote]



OK, where are you people shopping???

I just checked out DELL, just for a frame of reference, and the prices are attached. It's by no means "cheap".

I personally only buy laptops anymore, which does make it a bit trickier, but still, even buying a 16GB machine seems to be a wee bit on the pricey side.


OK, where are you people shopping???

I just checked out DELL, just for a frame of reference, and the prices are attached. It's by no means "cheap".

Never, ever, *ever* buy RAM from OEMs. 16 GB of DDR3 RAM costs about $75.

Edit: Same goes for hard drives, graphics/RAID cards, you name it. OEM pricing is disgusting.


[quote name='Serapth' timestamp='1324054881' post='4894535']
OK, where are you people shopping???

I just checked out DELL, just for a frame of reference, and the prices are attached. It's by no means "cheap".

Never, ever, *ever* buy RAM from OEMs. 16 GB of DDR3 RAM costs about $75.

Edit: Same goes for hard drives, graphics/RAID cards, you name it. OEM pricing is disgusting.
[/quote]

Wow, that discrepancy is almost criminal. I haven't purchased a desktop in so long I hadn't realized that the difference had gotten that bad. The last time I considered building my own machine, it was simply cheaper to buy a whitebox instead.


Wow, that discrepancy is almost criminal. I haven't purchased a desktop in so long I hadn't realized that the difference had gotten that bad. The last time I considered building my own machine, it was simply cheaper to buy a whitebox instead.

My university has a policy that all IT purchases have to be from OEMs, so that they are serviced by OEM warranties. My back-of-the-envelope estimate suggests we could cut purchasing costs by 75% if we were to bring it in-house and build our own computers on site (even accounting for the salaries of the additional technical and support staff needed).


I mean, that sounds nice and all, but 30GB desktops aren't exactly the norm, or hell, even the fringe. I actually haven't personally seen a > 16GB machine yet, outside of dedicated engineering boxes.


Why? Because most software is still 32-bit. Because the tools and programming paradigms we have, from ints to VMs, are still 32-bit, stuck in 386-era architecture.

Read blogs from Google's Android developers. Dual Xeons and such, making it possible to develop an OS which takes 5 hours to build on lower-end machines.

I think calling a 16GB machine average might be more than a wee bit optimistic. Don't get me wrong, I would absolutely love it, but it just doesn't seem to be the case.[/quote]

It really wouldn't matter for most, since there is essentially no software that can make use of it. Even the software that does exist is a scaled-up version of 32-bit memory models.

What else could one do with 32/64/128 GB of RAM? How about making a 64GB RAM disk? Still a factor of 10 faster than SSDs and just perfect for, say, CI and testing.

Or run 8 VMs, with the ability to test live under multiple OSes. Very nice for web development (with several 30" screens obviously, or 24" if you're on a budget and can only spend $150 per).

"Think big, like the Americans" (Indy and Atlantis reference :)

OK, where are you people shopping???[/quote]

It's a random number I made up and threw out. My point was it's no longer necessary to mortgage a house to get that.

Part of the problem is also the perceived convenience of laptops.

----

Regarding RAM prices - they are at rock bottom right now, partly due to Apple (the iPad and iPhone use custom chips, and Apple sells more units than DELL does PCs). Current RAM manufacturing capacity has vast overproduction, which has pushed prices that low.

Disks on the other hand are going up fast due to lost manufacturing capacity.

----
My university has a policy that all IT purchases have to be from OEMs, so that they are serviced by OEM warranties.[/quote]

Once you try to run any non-trivial organization you realize such arrangements are almost necessary and that they save costs by a wide margin. Especially for something long-term, anything that spans the life of a single "model".

It requires bureaucracy and does limit certain choices, but it's the only manageable solution. Some of it also comes down to implementation; it's possible to make it a nightmare or a well-oiled process.
