32 vs 64 bit

18 comments, last by Antheus 12 years, 4 months ago

"Uh, you know, this is a horrible lot of work, and having done it with Excel, we're scared to do it again any time soon, as long as it works well just like this. 32-bit apps run fine under 64-bit too, so what is the issue? It's not like any program (except maybe Excel, where some people have thousands of sheets with tens of thousands of rows and columns) needs more than 2GB anyway."


They painted themselves into a corner, simple as that.

Developers would love to use more than 2000-era hardware; they just can't.

The shortcomings are also painfully visible when dealing with more than just the source editor, where things either take a long time or crash. Roslyn is effectively written from scratch, duplicating what VS already does.

Mozilla recently broke VS to the point where they need to remove code just to get it to build. Chrome went through this a long time ago. And this is some of the best-engineered software around. Imagine how often VS breaks for typical enterprise bloatware.

[quote name='samoth' timestamp='1324030555' post='4894438']
"Uh, you know, this is a horrible lot of work, and having done it with Excel, we're scared to do it again any time soon, as long as it works well just like this. 32-bit apps run fine under 64-bit too, so what is the issue? It's not like any program (except maybe Excel, where some people have thousands of sheets with tens of thousands of rows and columns) needs more than 2GB anyway."


They painted themselves into a corner, simple as that.

Developers would love to use more than 2000-era hardware; they just can't.

The shortcomings are also painfully visible when dealing with more than just the source editor, where things either take a long time or crash. Roslyn is effectively written from scratch, duplicating what VS already does.

Mozilla recently broke VS to the point where they need to remove code just to get it to build. Chrome went through this a long time ago. And this is some of the best-engineered software around. Imagine how often VS breaks for typical enterprise bloatware.
[/quote]


Two things. First, there has been a 64-bit compiler for a long time; it's only the IDE that is 32-bit at this point. And really, is there a need for a 64-bit version? That's a lot of text files open!

Second, Mozilla broke the 32-bit compile (actually the link) with PGO enabled. Big difference. PGO effectively holds every OBJ file in memory, so obviously it is going to be a memory hog. The trade-off is better overall optimization than GCC, which works differently. If PGO hadn't been enabled, there never would have been an issue.
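A back-of-envelope sketch of why that link step hits the wall: a 32-bit linker has a fixed user address space no matter how much RAM the machine has, and link-time PGO keeps the intermediate representation of every object file resident at once. All the project numbers below are hypothetical, chosen only to illustrate the order of magnitude:

```python
# Back-of-envelope sketch (all project numbers hypothetical) of why a
# 32-bit linker runs out of address space when PGO/LTCG keeps every
# object file's intermediate representation in memory at link time.

GiB = 1024 ** 3

# A 32-bit Windows process normally gets 2 GiB of user address space
# (more with large-address-aware flags), minus space already taken by
# the linker itself, its heaps, and mapped files.
user_address_space = 2 * GiB
usable = int(user_address_space * 0.8)  # rough allowance for overhead

# Hypothetical large C++ project, roughly browser-scale:
num_obj_files = 11_000
avg_ir_per_obj = 256 * 1024  # bytes of in-memory IR per object file

total_ir = num_obj_files * avg_ir_per_obj
print(f"IR held in memory: {total_ir / GiB:.2f} GiB")
print(f"fits in 32-bit linker: {total_ir < usable}")
```

With these made-up but plausible inputs the working set alone is ~2.7 GiB, which no amount of installed RAM can rescue a 32-bit process from.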


I never claimed dark sorcery is the cause of the mentioned issues. Nor that witches should be burned as a solution. Nor that there is any kind of mystery behind it.


Why do I need to know about it? Why is a 32-bit memory limit even mentioned in the era of petabyte datasets?

Phones and some netbooks have 2GB of memory and quad cores. A developer these days is likely to consider 16GB a mid-range machine.
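The limit being complained about is pointer width, not installed RAM: a 32-bit pointer can name at most 4 GiB of addresses regardless of the hardware. A minimal sketch of the arithmetic:

```python
import struct

# Width of a native pointer in the current Python build, in bits.
pointer_bits = struct.calcsize("P") * 8

def address_space(bits: int) -> int:
    """Total addressable bytes for a given pointer width."""
    return 2 ** bits

GiB = 1024 ** 3

# A 32-bit pointer can name at most 4 GiB of addresses, and a 32-bit
# Windows process is typically limited to 2 GiB of that for user data.
print(f"32-bit address space: {address_space(32) // GiB} GiB")   # 4 GiB
print(f"64-bit address space: {address_space(64) // GiB} GiB")   # 16 billion GiB
print(f"this interpreter is {pointer_bits}-bit")
```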

Sapir-Whorf: tools determine the type of problems you can tackle, since they are incapable of expressing bigger concepts.


Why are build times still an issue? Why doesn't the IDE simply keep all versions of all changes in memory? It would only be 30GB, that's ~$120 worth of RAM, allowing for instant recompilation of only the changed fragments and avoiding the disk bottleneck completely.
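The "recompile only changed fragments" idea above is essentially a content-addressed build cache: hash each source fragment, and reuse the previous result whenever the hash is unchanged. A minimal sketch, where the stand-in compiler function (here just uppercasing the source) is of course hypothetical:

```python
import hashlib

class InMemoryBuildCache:
    """Recompile only fragments whose content hash has changed."""

    def __init__(self, compile_fn):
        self.compile_fn = compile_fn
        self.cache = {}     # fragment name -> (content hash, compiled result)
        self.compiles = 0   # how many real compiles we performed

    def build(self, fragments):
        """fragments: dict of name -> source text. Returns name -> result."""
        results = {}
        for name, source in fragments.items():
            digest = hashlib.sha256(source.encode()).hexdigest()
            cached = self.cache.get(name)
            if cached and cached[0] == digest:
                results[name] = cached[1]          # unchanged: reuse
            else:
                result = self.compile_fn(source)   # changed: recompile
                self.compiles += 1
                self.cache[name] = (digest, result)
                results[name] = result
        return results

# Stand-in "compiler": just uppercases the source text.
cache = InMemoryBuildCache(lambda src: src.upper())

cache.build({"a.cpp": "int main(){}", "b.cpp": "void f(){}"})
cache.build({"a.cpp": "int main(){}", "b.cpp": "void f(){ g(); }"})
print(cache.compiles)  # 3: two initial compiles, then only b.cpp again
```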

Programming tools are the only tools that have not changed in the last 30 years. Compared to what happened to CAD, we still have our equivalent of punch cards, only rendered in high resolution with anti-aliasing.

A modern CAD system, for example, keeps live track of every aspect, from technical details to paperwork, in a transacted, versioned ecosystem spanning an entire project (say, a small suburb with 5000 houses): from high-level concepts down to every nut and bolt on every hinge of every cabinet, cross-referenced with vendor specs and QA ISO certifications on a per-government basis, shared in real time between the tens of thousands of people involved in the project on any device, from high-end planning workstations to workers on site with iPads, with costs and timelines tracked across all involved parties.

We have git. Yay!
Ok, someone here has a substantially better PC than I do! :D

I mean, that sounds nice and all, but 30GB desktops aren't exactly the norm, or hell, even the fringe. I actually haven't personally seen a >16GB machine yet, outside of dedicated engineering boxes. Even then, at my prior workplace, the CATIA guys were still using machines WAYYYYY below that spec level.

I think calling a 16GB machine average might be more than a wee bit optimistic. Don't get me wrong, I would absolutely love it, but it just doesn't seem to be the case.

[quote name='Serapth' timestamp='1324054043' post='4894528']
I mean, that sounds nice and all, but 30GB desktops aren't exactly the norm, or hell, even the fringe.
[/quote]

The funny thing is, when you read about "building a PC" on the net, you will always hear "don't get more than 8GB since xyz can't utilize it anyway", and on the other end, apparently developers write software with the mindset "I can't use more than that since no one has such a machine." :D It's not like getting 16 or even 32GB will significantly increase the cost of the machine (in my case, going from 8 to 16 was a 4% increase in cost).

[quote name='Serapth' timestamp='1324054043' post='4894528']
I mean, that sounds nice and all, but 30GB desktops aren't exactly the norm, or hell, even the fringe. I actually haven't personally seen a >16GB machine yet, outside of dedicated engineering boxes. Even then, at my prior workplace, the CATIA guys were still using machines WAYYYYY below that spec level.
[/quote]

It depends on the industry. I know a couple of music and film guys, and in that line of work, 16GB is considered woefully inadequate for a work machine - as a recent college grad you 'make do' with an 8-core 20GB machine until you can afford to upgrade to a 16-core 32GB machine.

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]


[quote name='Serapth' timestamp='1324054043' post='4894528']
I mean, that sounds nice and all, but 30GB desktops aren't exactly the norm, or hell, even the fringe.


The funny thing is, when you read about "building a PC" on the net, you will always hear "don't get more than 8GB since xyz can't utilize it anyway", and on the other end, apparently developers write software with the mindset "I can't use more than that since no one has such a machine." :D It's not like getting 16 or even 32GB will significantly increase the cost of the machine (in my case, going from 8 to 16 was a 4% increase in cost).
[/quote]



OK, where are you people shopping???

I just checked out Dell, just for a frame of reference, and the prices are attached. It's by no means "cheap".

I personally only buy laptops anymore, which does make it a bit trickier, but still, even buying a 16GB machine seems to be a wee bit on the pricey side.

[quote name='Serapth' timestamp='1324054881' post='4894535']
OK, where are you people shopping???

I just checked out Dell, just for a frame of reference, and the prices are attached. It's by no means "cheap".
[/quote]

Never, ever, *ever* buy RAM from OEMs. 16 GB of DDR3 RAM costs about $75.

Edit: Same goes for hard drives, graphics/RAID cards, you name it. OEM pricing is disgusting.



[quote name='Serapth' timestamp='1324054881' post='4894535']
OK, where are you people shopping???

I just checked out Dell, just for a frame of reference, and the prices are attached. It's by no means "cheap".

Never, ever, *ever* buy RAM from OEMs. 16 GB of DDR3 RAM costs about $75.

Edit: Same goes for hard drives, graphics/RAID cards, you name it. OEM pricing is disgusting.
[/quote]

Wow, that discrepancy is almost criminal. I haven't purchased a desktop in so long I hadn't realized the difference had gotten that bad. The last time I considered building my own machine, it was simply cheaper to buy a whitebox instead.

[quote]
Wow, that discrepancy is almost criminal. I haven't purchased a desktop in so long I hadn't realized the difference had gotten that bad. The last time I considered building my own machine, it was simply cheaper to buy a whitebox instead.
[/quote]

My university has a policy that all IT purchases have to be from OEMs, so that they are covered by OEM warranties. My back-of-the-envelope estimate suggests we could cut purchasing costs by 75% if we were to bring it in-house and build our own computers on site (even accounting for the salaries of the additional technical and support staff needed).
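For what it's worth, a 75%-savings estimate like that can pencil out once the OEM markup is large enough to cover extra staff. Every number below is hypothetical; plug in real quotes to check the claim for your own institution:

```python
# Back-of-envelope sketch of the 75% claim above. Every number here is
# hypothetical; substitute real quotes to check the estimate.

machines_per_year = 500
oem_price = 3000        # dollars per OEM workstation, with markup
parts_price = 550       # dollars for equivalent self-built parts
staff_cost = 100_000    # extra build/support salaries per year

oem_total = machines_per_year * oem_price
inhouse_total = machines_per_year * parts_price + staff_cost

savings = 1 - inhouse_total / oem_total
print(f"OEM: ${oem_total:,}  in-house: ${inhouse_total:,}")
print(f"savings: {savings:.0%}")
```

The key sensitivity is the parts-to-OEM price ratio: if OEM machines cost only ~2x the parts, the extra salaries eat most of the savings.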


This topic is closed to new replies.
