Why does Visual C++ have a limited default stack and heap reservation size (1 MiB)?

Started by
7 comments, last by DarkRadeon 12 years, 1 month ago
Yes, I only learned this today, after an entire afternoon spent with several C++ algorithm and data-structure samples crashing on Windows for no apparent reason -.-
One answer could be: "for optimization kind of stuff..."

...

I mean, are you kidding me, Microsoft?
Why don't you give me a simple message like "Hey you dumb-ass, you cannot allocate more than 1 MiB on the stack here; change the linker option"?
On several Linux machines everything worked fine, even with very large stack allocations (more than 100 MB), so what is the real reason?
I know that having a really big stack is generally neither common nor a good idea, but why does Microsoft set such a low limit without warning the user? ç___ç

edit: wow, same limitation for the heap :|

[quote]getting crash on Windows for unknown reason[/quote]

Were you not debugging? If you had been, you would have received a "Stack Overflow" exception...

[quote name='Alessio89' timestamp='1330539443' post='4917874']
getting crash on Windows for unknown reason


Were you not debugging? If you had been, you would have received a "Stack Overflow" exception...
[/quote]
Because I was "testing" on many machines with different IDEs and compilers, and only the Windows PCs had that problem. I knew the code was right, so I didn't think about debugging.
[quote]so what is the real reason?[/quote]

The stack is limited to avoid resource exhaustion: each thread's stack is a single contiguous reservation of address space, so with many threads an unbounded default would exhaust memory and address space too quickly.

[quote]wow, same limitation for the heap :|[/quote]

Huh? The heap isn't limited beyond what the OS can allocate; it's definitely not limited to 1 MB. Very large single allocations (1 GB+) may fail, but that's not unique to Windows.

[quote]only on Windows PCs I had that problem[/quote]

Unless you manually specify a stack size on other platforms, the limitations are the same. It's also perfectly possible to smash the stack without crashing, so not having problems elsewhere might just be the usual undefined behavior.

I'm guessing it has to do with something else, similar to this:

int bar[1024*1024*1024];
int main() {
    int foo[1024*1024*1024];
}

bar may get allocated inside one of the exe's sections, so size constraints become a problem again; I'm not sure if and what kind of limits apply there.
foo is obviously too large for the stack.

An application like that would require roughly 4 GiB of physical RAM per array (1024³ four-byte ints) and even more address space, which is impossible in a 32-bit process and problematic in general, but not necessarily impossible to achieve on 64-bit; it just takes a few linker switches.

Other OSes have the same constraints.

[quote]The stack is limited to avoid resource exhaustion: each thread's stack is a single contiguous reservation of address space, so with many threads an unbounded default would exhaust memory and address space too quickly.[/quote]

I know that, but the applications I tested never made big allocations (typically 10-100 MB).


[quote]Huh? The heap isn't limited beyond what the OS can allocate; it's definitely not limited to 1 MB. Very large single allocations (1 GB+) may fail, but that's not unique to Windows.[/quote]

In Visual Studio, the linker options include both a Stack Reserve Size and a Heap Reserve Size, and both default to 1 MB.


[quote]Unless you manually specify a stack size on other platforms, the limitations are the same. It's also perfectly possible to smash the stack without crashing, so not having problems elsewhere might just be the usual undefined behavior.[/quote]

I also worked with some Linux servers where I couldn't touch anything except submitting source code; the server's task was to compile and run it and send me back results like execution time, compilation time, etc.


[quote]
I'm guessing it has to do with something else, similar to this:

int bar[1024*1024*1024];
int main() {
    int foo[1024*1024*1024];
}

bar may get allocated inside one of the exe's sections, so size constraints become a problem again; I'm not sure if and what kind of limits apply there.
foo is obviously too large for the stack.
[/quote]


The biggest single "object" in the various programs was an array of one million integers, like int a[1000000]. On both Windows and Linux an int should be 4 bytes, so 4 million bytes, ~3.8 MB; not a big problem for machines equipped with at least 4 GB of RAM...


[quote]
An application like that would require roughly 4 GiB of physical RAM per array (1024³ four-byte ints) and even more address space, which is impossible in a 32-bit process and problematic in general, but not necessarily impossible to achieve on 64-bit; it just takes a few linker switches.

Other OSes have the same constraints.
[/quote]

I've actually been away from C++ for a while, but I noticed that making those arrays global or static solves that kind of problem...
Maybe I should review the C++ memory allocation model xD
Mmm, OK, now I understand that the stack reserve size is expressed in bytes, not in MB... with the correct value it works perfectly.
Don't allocate large arrays on the stack; it's not designed for that. Use the heap: new/delete, std::vector<T>, or similar.

In time the project grows, the ignorance of its devs it shows, with many a convoluted function, it plunges into deep compunction, the price of failure is high, Washu's mirth is nigh.

Regarding the heap option: it's important to actually read the documentation here, otherwise you run a strong risk of serious misunderstandings about Windows. This is particularly the case if you come from a background on another OS: there is sometimes no direct one-to-one mapping, and if you rely on what looks or seems similar you can easily go down an entirely wrong path.

In this case the relevant MSDN page is here: http://msdn.microsoft.com/en-us/library/ms810603.aspx

Note that for the heap, the size specifies initial reserve and commit sizes, not absolute limits:
[quote]... values specifying the amount of reserved and committed space initially needed by the application.[/quote]

Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.


[quote]Don't allocate large arrays on the stack; it's not designed for that. Use the heap: new/delete, std::vector<T>, or similar.[/quote]

Yes, I know that, but the source files were not written by me; I was only measuring the "cost" of the algorithms they used...


[quote]
Regarding the heap option: it's important to actually read the documentation here, otherwise you run a strong risk of serious misunderstandings about Windows. This is particularly the case if you come from a background on another OS: there is sometimes no direct one-to-one mapping, and if you rely on what looks or seems similar you can easily go down an entirely wrong path.

In this case the relevant MSDN page is here: http://msdn.microsof...y/ms810603.aspx

Note that for the heap, the size specifies initial reserve and commit sizes, not absolute limits:
... values specifying the amount of reserved and committed space initially needed by the application.
[/quote]
Thank you for the link :D
The code was written for various Linux machines (servers and a distributed network), so the author probably didn't care about the Windows memory model... Now it's all clear :)

Thank you guys.

This topic is closed to new replies.
