32bit/64bit Woes?

3TATUK2    714

I just upgraded from 32-bit Windows to x64... Now a program I run uses realloc(), and once the integer value of the returned address goes above ~2,000,000,000 it crashes. I suspect there's some gcc argument which will prevent this... any ideas?

frob    44916
Could be various issues.

One of the more likely causes is pointer truncation. It's a relatively common problem caused by a header mismatch, where something references the old 32-bit headers rather than the 64-bit ones, or by a linking or code-duplication issue where something uses the 32-bit version of a function.

A little bit of time with the debugger and the disassembly window can show you whether the pointer is getting truncated. Hopefully it's as simple as using a watch window and seeing a pointer suddenly change from a full 64-bit address to a 32-bit address. Stepping through the code with mixed disassembly enabled can also reveal the mismatch, such as seeing the 32-bit ecx register used where the 64-bit rcx should be.

satanir    1452


but the 32bit exe should run on 64bit through WoW64

Yes, but that has nothing to do with what frob said.

 


What exactly should i do

Debug your app. Easier said than done, I know, but you already know when and where the crash happens. You can break right before it crashes, switch to the disassembly view, and check what happens.

 

BTW, what exception are you getting?

3TATUK2    714

The problem is that realloc() returns NULL once the integer value of the pointer address goes above ~2 billion. Not sure why it climbs so high: either it grows "faster than it should" (the program only uses about 400MB), or it "jumps" from some low point to some high point.

satanir    1452


it either grows "faster than it should" (program only uses about 400MB) - or it "jumps" from some low point to some high point

realloc works fine. The reason the process uses more memory than you allocated is fragmentation. When you allocate a buffer, the allocation has to be contiguous, so depending on your allocation pattern you'll see some overhead ('some' can become 'extreme' sometimes).

Also, you can't really deduce how much memory your application consumes from an address, since allocations don't start at 0. The task manager will tell you exactly how much memory your app actually consumes.

 

Anyway, based on your description, I suspect you have a memory leak.

Ohforf sake    2052
No it does not.

Using realloc is the problem. If you constantly increase an array by a small amount (say 1), there is a good chance that the array cannot be grown in place because something else was allocated right behind it. realloc then allocates a completely new chunk of memory somewhere else and copies all the contents over. The prior location is then free for other use; however, since you keep increasing your array size, it will never again fit into that gap. Even worse, in some (not uncommon) situations this happens with every resize/realloc of the array. You end up with a crap ton of gaps that are basically "free" but can't be used. This is the fragmentation that satanir meant.
Using realloc in this fashion can easily result in your application requesting multiple GBs of memory from the OS even though you only need a few MBs or KBs. At some point you simply run out of virtual address space. That is what you are seeing.

Vortez    2714

Well, if the program is a 32-bit executable, here's your problem. Even if you're running on a 64-bit OS, you just can't allocate more than 2GB of memory. Even with the LARGEADDRESSAWARE flag, I think you'll be limited to 3GB instead. If you need that much memory, why not compile a 64-bit executable?

cr88192    1570

Well if is program is a 32 bit executable, here's your problem. ...

 

yes.

 

personally though, I am sticking with 32 bits for now, mostly until 32-bit OSes pretty much die off, since a 64-bit program will not work on a 32-bit computer.

I had at one point been building mostly for 64-bits, but this issue of "being friendly to 32-bit WinXP" brought things back.

 

 

memory use is still a challenge though, as 2-3GB doesn't go as far sometimes as one might think, and fragmentation can be an issue, especially if one tries to do lots of exact-size arrays.

 

one trick I have often used is that most variable-size structures (arrays/etc) are not sized exactly, but rather follow a curve like 4*1.5^n or 16*1.5^n, such that while there is often a little waste, fragmentation is reduced (when an array is reallocated, it is padded up to the next size on the curve).

 

this is sometimes combined with a bit of trickery, like keeping some amount of stuff compressed in RAM and decompressing it on demand, ...

interestingly, this turns out to not really be a new thing (it was fairly common back in the 16-bit era).

cr88192    1570

 

2-3GB doesn't go as far sometimes as one might think

 
[image: say-what-now.jpg]

 

 

 

basically, in my engine, voxel terrain and audio data were eating lots of RAM (previously, the thing kept running out of address space and crashing, but since has been reduced to a more stable 500-700MB, most of the time).

 

decided to leave out a longer description, but:

voxel terrain can eat lots of RAM (lots of largish 3D arrays of voxels);

to a lesser extent, so can things like PCM audio (at 44.1kHz 16 bit) and compressed video / textures / ...

 

so, while 2-3GB seems like it should be enough for almost anything "reasonable", it isn't too hard to run into the limit and then sit around trying to shave off memory use to keep everything fitting (or within a more reasonable limit).

 

basically, keeping voxel chunks, audio, ... compressed until needed, etc.

 

likewise, a person might need to do things like leave larger video files on disk and read in frames individually, rather than just reading the whole video into a memory buffer or putting it in an asset pack and bulk-loading it (like one might do for most other asset loading).

 

for example, while a person might bulk-load their textures and animated textures and sound-effects and similar, they would probably not do so with their cutscenes (which would be left on disk until needed).

 

also, there may be issues with trying to load or work with very-high-res images (ex: 8192 x 8192 or 16384 x 16384), which may have a hard time fitting in RAM (8192 x 8192 is hit or miss, 16384 x 16384 = no).

Edited by BGB

Khatharr    8812

basically, in my engine, voxel terrain and audio data were eating lots of RAM (previously, the thing kept running out of address space and crashing, but since has been reduced to a more stable 500-700MB, most of the time).

...

 

 

That's like venting the oceans into space and then saying, "We ran out of water because 1,400,000,000,000,000,000,000kg isn't as much as you may think."

cr88192    1570

 


basically, in my engine, voxel terrain and audio data were eating lots of RAM (previously, the thing kept running out of address space and crashing, but since has been reduced to a more stable 500-700MB, most of the time).

 

...

 

That's like venting the oceans into space and then saying, "We ran out of water because 1,400,000,000,000,000,000,000kg isn't as much as you may think."

 

 

granted.

 

though, for many developers, there is a mindset that RAM is a nearly boundless free resource.

 

except... when it is not.

 

like, while big arrays are fairly obvious memory wasters, often lots of memory can still be eaten up by "little things" as well (a buffer here, a buffer there, a few kB here and there, ...), and this may be less obvious.

Khatharr    8812
though, for many developers, there is a mindset that RAM is a nearly boundless free resource.
 

 

That's why we should have a government run program that abducts budding programmers and forces them to spend a year developing for old console systems.

King Mir    2490

though, for many developers, there is a mindset that RAM is a nearly boundless free resource.

 
That's why we should have a government run program that abducts budding programmers and forces them to spend a year developing for old console systems.

Then they'll start using old console tricks to save memory at the cost of code readability.

Hardware has changed, and with it acceptable use of memory. Budding programmers should learn on today's hardware, with modern and relevant resource considerations.

But it's true enough that no resource is limitless.

Khatharr    8812

 

 

Then they'll start using old console tricks to save memory at the cost of code readability.

...

 

 

Or possibly they'll start using discretion, planning, and responsibility (and maybe learn about portability) instead of using 800MB of memory to run a calculator. My point is not just that resources are limited. My point is that consuming resources simply because they're available is disrespectful to the end user, who paid money for those resources so that they would have more available. There's no reason (apart from laziness) for a program to consume more than it needs, and doing so tends to lead to bad design habits, such as the ubiquitous 'load everything now' screen.

