Win32 Error 8

smitty1276    560
Has anyone here had any experience resolving Win32 error 8: "Not enough storage is available to process this command"?

Some background... some people may recall past threads where I discussed some incredibly bizarre scenarios I'm trying to deal with, due to the obnoxious nature of the project I'm working on. Basically, we (me and a couple of coworkers) are trying to take some old C simulator-type code--written for Linux--and get it working within a component-based simulation framework on Win32. This typically involves writing an "interface layer" that serves as the middleman: if the framework wants to do something, it calls the interface, which calls down into the guts of the simulation code and makes it do what it has to do by whatever means necessary.

A major problem that is beginning to come up is that the old C sim code is VERY heavily reliant upon static data. A single huge static data structure--which contains member structures for all of the subsystem data--maintains almost all of the data for the entire aircraft and all of its subsystems... this really sucks. Furthermore, there are tons of local static variables (things like "int tf_firstpass" to check whether something still needs to be initialized). One of the things we eventually want to do is allow multiple instances of the aircraft to fly at the same time, which is not possible with the static data.

So I rewrote the interface to be very component-based and loaded from DLLs. Now, at runtime, for each instance of the aircraft I actually make a copy of each DLL, named for that aircraft instance, and load the copy, so every aircraft gets its own instance of the module and its own static data. Surprisingly, this works. (If anyone has better ideas, I'm all ears... I know this solution is hideous.) But after loading, unloading, loading, unloading, etc. a few times, the LoadLibrary call will fail for one particular DLL, and GetLastError() gives me error 8: "Not enough storage is available to process this command".

There isn't a ton of info online for this error, it seems. Does anyone know how to go about resolving it?

LessBread    1415
That you're getting the error after "loading, unloading, loading, unloading", and that it follows a call to LoadLibrary, suggests that you're running out of handles. Try recrafting your code so that each DLL is loaded and unloaded only once. Once a DLL is loaded, you can use GetModuleHandle to obtain its handle, unless you've already taken steps to keep each handle around after the DLL is loaded.

Here are a few links on the subject of handle limits

Maximum NT User Handles Per Process Is 10,000 in Windows 2000

Windows are not cheap objects

Does Windows have a limit of 2000 threads per process?

Although those links may not be specific to your situation, they do offer insight into how to think about the problem.


smitty1276    560
Thanks for the links. I'll check them out.

I should note that the reason I actually have to reload them occasionally is that I need the static data to be reset to its initial state. Unfortunately, the underlying sim code doesn't expose any clean interface for that sort of thing... in fact, most of the subsystems are actually initialized the first time you run them. The initialization and update logic are coupled together (using one of those static boolean flags to indicate the first pass). The horrible design of the sim code is causing all sorts of awkward situations.

Also, I currently have 15 or 20 components running, and it usually happens after the 3rd or 4th reload. So, in total I am only doing about 80 to 100 LoadLibrary calls. But I am calling FreeLibrary() to unload each component before reloading them, so they should be properly released... er, shouldn't they?

I'll go check out your links now. r++

LessBread    1415
Have you thought about making a copy of the static data and then working with the copy and simply recopying the original static data whenever a reset is needed?
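Something along these lines, sketched in plain C (sim_state_t and the globals here are stand-ins for the real 20+MB struct):

```c
/* Sketch of the snapshot/restore idea: copy the big static struct once
   after first initialization, then restore it whenever a "reset" is
   needed, instead of reloading the DLL. */
#include <string.h>

typedef struct {
    int    tf_firstpass;
    double fuel;
    /* ... 20+ MB of subsystem structures in reality ... */
} sim_state_t;

static sim_state_t g_sim;       /* the sim's real static data */
static sim_state_t g_pristine;  /* snapshot taken after first init */

void snapshot_initial_state(void)
{
    memcpy(&g_pristine, &g_sim, sizeof g_sim);
}

void reset_to_initial_state(void)
{
    memcpy(&g_sim, &g_pristine, sizeof g_sim);
}
```

One caveat: this only works cleanly if the struct holds no pointers into heap memory; a memcpy would restore the pointers but not whatever they point at.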

smitty1276    560
I've thought of a lot of things. The problem with the data copying is... prepare yourself for this, it's almost embarrassing :-) ... that the core data--the stuff that holds the sim state--is in one giganormous 20+MB structure, containing lord-knows-how-many smaller structures. Most subsystems have a structure for their actual "current state", as well as a "buffer in" and a "buffer out" structure. The problem is that this design is not entirely consistent, and--worst of all--most of the structure isn't used... but we can't say with any certainty which parts will never be used, which makes selectively copying pieces a bit more difficult.

We've written some code that analyzes changes, and a typical frame results in maybe 3k of actual modified data in the whole mess. There are some enormous sections that appear to NEVER change, but there may be some unique cases where this isn't true. Furthermore, the codebase comes to us from a team that may have reserved some of that space for particular "vehicular features" (I've got to be somewhat vague so I don't get in trouble) that may not yet be incorporated into the simulation. Since different versions of the vehicle--which we integrate every few months--may radically alter the composition of the underlying static data structure, we can't build too many assumptions about it into our approach.

In other words, it's an absolute nightmare. The sim code works fine for what it was intended to do, which is to run a standalone testbed simulation in a *nix environment, so I don't want to be too hard on those guys. But we are coercing it into doing something it wasn't meant to do. The guts of it are so thorough and so accurate that it is very much worth making it work.

Another problem with copying the static data is that we would like to have multiple instances running in the same environment and process at some point, so letting each of them have its own version of the data would be ideal. Making an ad hoc copy of the DLL and renaming it seems to be working great for the average use case. It's just when we're doing particular things that require a restart more than a few times that I'm having problems.

LessBread    1415
Wow. Sounds like a real mess.

Have you looked into memory mapping the data? (e.g. CreateFileMapping, MapViewOfFile) A DLL is best used for code. LoadLibrary basically maps the DLL into the address space of the process and performs any necessary fixups (base relocations and import-table patching). If the data doesn't need to be initialized as part of the load process, file mapping might be the way to go.

Here are a couple of articles on how LoadLibrary does its thing.

A tour of part of the Windows NT® loader code
What Goes On Inside Windows 2000: Solving the Mysteries of the Loader

Here's a tutorial on memory mapping large files: Memory Management for Large-File Editors

Extrarius    1412
Quote:
Original post by smitty1276
[...]A single huge static data structure[...]
If all the code uses a single static structure, it shouldn't be too difficult to remove the static and make every function take a parameter with the name the static used to have - that way, you can have as many instances as you want without having to go through the mess you talk about of loading new DLLs. Since you need to properly initialize stuff, you could keep the static struct's initialization but move it into a pseudo-constructor for use when you make a new instance of the data.
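Something like this (all names here are made up just to illustrate the shape of the refactor):

```c
/* Sketch of the static-to-parameter refactor: the struct that used to
   be "static" inside the subsystem becomes an explicit parameter, and
   the first-pass initialization moves into a pseudo-constructor. */
#include <stdlib.h>

typedef struct {
    int    tf_firstpass;
    double altitude;
} tf_state_t;

/* before: void tf_update(void) used a file-scope static tf_state_t */
void tf_update(tf_state_t *tf, double dt)
{
    if (tf->tf_firstpass) {         /* old first-pass logic, now per instance */
        tf->altitude = 0.0;
        tf->tf_firstpass = 0;
    }
    tf->altitude += 100.0 * dt;     /* stand-in for the real update math */
}

/* the "pseudo-constructor" */
tf_state_t *tf_create(void)
{
    tf_state_t *tf = calloc(1, sizeof *tf);
    if (tf)
        tf->tf_firstpass = 1;
    return tf;
}
```

Each instance then evolves independently, with no hidden shared state between aircraft.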

smitty1276    560
It's almost a million lines of code with hundreds or thousands of functions--each of which accesses multiple data structures--in a codebase that we do not maintain and that is updated every few months by those who do.

By the time we finished refactoring one version, several more would have already come out.

ApochPiQ    23062
Have you considered using a multi-process approach? That is, spawn a separate process for each instance of the underlying code, and then use a thin IPC layer to consolidate the data into your final application.

This may not be performant enough depending on your needs, but it'll magically solve the static-data issue without needing to do any dirty hackery with the DLLs.


(By the way, I agree with LessBread's guess - this sounds like a handle or other system-level resource problem. It may also be a stack/heap space issue at some level. Either way, it's not likely to be easy to solve unless you can mitigate the resource usage of each loaded instance in some way.)

smitty1276    560
ApochPiQ... the IPC layer idea is actually worth looking into. I've never actually directly dealt with that sort of thing, so I'm not sure how it would work, but I'll definitely look around. Can you suggest any good starting point for reading (of course I'll google)?

Regarding the "resource usage of each loaded instance"... what is that likely to be? The size of the loaded DLLs is, in total, about 10MB per instance. Then, of course, there is another ~22MB of memory allocated for the static data. What other resource usage--particularly memory--is likely to occur?

ApochPiQ    23062
Depends; if the code in the DLLs does anything at all at the OS level (creating windows, loading graphics or other resources, etc.), it will incur overhead in the OS for tracking those resources. For instance, any icon loaded creates a tracking entry at the OS level (which you access via an HICON handle). Obviously any heap allocations made by the DLL are relevant as well, plus the space needed for the DLL code itself.

It's possible there is some form of address-space fragmentation occurring from all the reloads. Look into DLL base address relocation for why this might be. It's possible that Windows runs out of contiguous free address space to map new DLL copies into. Approximately how many reloads can you perform before you encounter the error?



IPC is short for interprocess communication. Basically you would have multiple EXE instances running, each of which registers with a "central" EXE of some kind. The central code is responsible for the actual UI and so forth. The simulation is then run in the spinoff EXEs, and state from the simulation is sent to the UI/master layer via IPC. The advantage of this of course is that you don't have to write any special handling to add a new simulation instance; Windows will already take care of running the EXE as its own separate memory space and so on.

There are many IPC mechanisms available. A few options you might look into are shared memory (probably the best bet for raw performance), sockets, and named pipes. The main concern you will run into will be locking, particularly for shared memory; it's basically the same situation as multithreading. Sockets and pipes will eliminate the locking problem but introduce a bit of overhead. Depending on how much state you need to move around each frame, that might be prohibitive, although 3KB * number of instances isn't that much and should be feasible, since it's all local and not across the network (Windows has a much more efficient socket/pipe implementation for the local case where the data doesn't cross any network interfaces).

smitty1276    560
Quote:
It's possible that Windows runs out of consolidated open memory to locate new DLL copies in. Approximately how many reloads can you perform before you encounter the error?

It seems to typically fail after 3-5 reloads. So when I call FreeLibrary, doesn't it just mark that whole space as free, contiguous memory?

LessBread    1415
Are these DLLs ~20 MB in size? Typically DLLs are loaded into the process address space at a preferred base address. The size of the DLL could make that impossible--there may not be sufficient space available at that location--so the DLL gets loaded elsewhere. On top of that, it could be that reloads aren't occurring at the same address but at different addresses, taking up 20 MB chunks every time. At any rate, Dependency Walker (depends.exe) is a great help when it comes to DLLs. It will tell you where DLLs want to be loaded, and it comes with a profiler that will track LoadLibrary calls and log them to a file.

Regarding IPC, with 20 MB chunks you'll want to use memory mapping to share the data across processes. Using pipes etc. will be much too slow--probably slower than simply copying the data and using the copy, per my earlier suggestion.

With memory mapping you can leave the DLLs as DLLs and map them; you'll just need to figure out the offset into the file where the data you're interested in resides. The thing to remember with memory mapping is that it can let you write over the data, so you need to pay attention to the permission flags and to how you use the mapped view pointer.

Another helpful (and free) utility is Process Explorer. It can show you how many handles a process has open, and much more.


[Edited by - LessBread on March 21, 2007 3:02:27 PM]

smitty1276    560
Sweet, LessBread. That utility looks nice. Sorry to get back to this so late.

Also, to answer your question: the DLLs come to less than about 10MB all together, for 15 or 20 components... I'd say they average 250k apiece.

Antheus    2409
While I don't have experience with this, a Google search points to some examples of how this happens.

One of them deals with exhausting the TLS pool. Since you're dealing with lots of static data, there might be some form of automatic TLS allocation going on, and with this much data it seems like a likely cause.

But again, I don't know the internals of the DLL mechanism, or whether this actually happens; it just seems like a reasonable explanation.

But if the API surface is reasonably small, then some form of IPC would be the ideal solution. Just pass everything through a broker that spawns the executables as needed.

If regular usage of the API involves thousands of functions, this might not be an option.
