Object pooling and managed languages

1 comment, last by IkarusDowned 11 years, 5 months ago
I am considering implementing Object Pooling in my game. The game is written in a managed language (JavaScript). The goal of implementing Object Pooling is to reduce garbage collection pressure by reusing objects instead of constantly allocating new ones.
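
For context, this is roughly the kind of pool I have in mind (a minimal sketch; VectorPool, acquire and release are just placeholder names I made up for illustration):

// Minimal vector pool sketch: pre-allocates a few objects up front
// and hands them out instead of allocating new ones every frame.
class VectorPool {
  constructor(initialSize) {
    this.free = [];
    for (let i = 0; i < initialSize; i++) {
      this.free.push({ x: 0, y: 0 });
    }
  }

  // Reuse a free vector if one is available, otherwise allocate a new one.
  acquire(x, y) {
    const v = this.free.length > 0 ? this.free.pop() : { x: 0, y: 0 };
    v.x = x;
    v.y = y;
    return v;
  }

  // Hand a vector back so it can be reused instead of garbage collected.
  release(v) {
    this.free.push(v);
  }
}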

However, my worry is that Object Pooling would make things fragile. The danger I am particularly concerned about is that it is possible to unknowingly keep a reference to an object and manipulate it after it has been returned to the pool. This could cause bugs that would be very difficult to track down.

For example, imagine I have a pool of vectors and the following happens:

- ClassA gets a vector from ClassB.
- ClassB returns the vector sent to ClassA back to the pool.
- ClassC gets a vector from ClassB (the same object as stored by ClassA) and stores it.

The issue here is that one can never be sure that an object received from a pool will not be returned and recycled without one's knowledge.
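
To make the failure mode concrete, here is how that sequence plays out using the placeholder pool sketched above (the classes are collapsed into plain calls):

// `pool` is an instance of the VectorPool sketched above.
const pool = new VectorPool(1);

// ClassB hands a vector to ClassA, which stores it as its position.
const position = pool.acquire(10, 20);

// ClassB later returns that vector to the pool, unaware that
// ClassA still holds a reference to it.
pool.release(position);

// ClassC asks the pool for a vector and gets the very same object.
const velocity = pool.acquire(-5, 0);

console.log(velocity === position); // true: one object, two "owners"
console.log(position.x);            // -5, not 10: ClassA's data was silently clobbered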

I'm having difficulty deciding whether the risk is worth the reward, or if there is a way to mitigate it.

Any advice would be greatly appreciated.
For those interested in the topic, I asked the same question on the Javascript subreddit and had some interesting responses.

I am posting the link below for the sake of closure:

http://www.reddit.com/r/javascript/comments/12z99j/object_pooling_best_practices/
Howie,

I ran into a similar question using Java / Python and did some tests to figure out what the "best" thing I could do was. Here are my results, though they are highly dependent on my particular situation and I'm sure there are other opinions out there:

1) First, if you must use a language with GC (Garbage Collection) built in, check the target version of that language for its current memory-related performance issues. For example, Java 1.4 and below had some really buggy memory management, which meant it was often better to pool objects than to let the GC handle them. In later versions of Java that problem went away, and it was much better to let the GC do its work.

2) A common misunderstanding with most GC languages is that the collector will instantly destroy your data, or that the underlying memory itself will be "removed" from the process runtime when it gets collected. This is HIGHLY machine and language dependent, but given the way MOST operating systems work, and the way language-internal memory management works on top of that, the raw memory itself usually gets cached and reused (read up on how a buddy allocator works or how C's malloc() is often implemented, if curious).

3) Are you using multiple threads? If so, you have to make sure the pool is thread-safe. This introduces some serious overhead if you aren't careful.

4) The problem you mentioned is a MAJOR headache for debugging. To the point where I had to ask "is this worth it?" and do this analysis in the first place :P

My general opinion: for most GC languages, don't pool. I haven't tried it with JS, but that's my general feeling. There are a few exceptions:

1) Can you guarantee the scope of the objects? That is, if you can guarantee that the pool will only be used inside a certain block of code, then you can pre-generate the pool at the beginning of the block and release everything at the end (see the first sketch after this list). It becomes the programmer's responsibility to make sure that no references escape that block.

2) Large buffers of data that can be pre-calculated and stored should be (the second sketch below shows what I mean).

3) I have a tendency to pre-allocate raw bytes of data. If you KNOW you'll always use 100 bytes of data in a certain function, you might as well have it pre-allocated (my olde C / C++ mindset).
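
To illustrate (1), here's a rough JS sketch of what I mean by guaranteeing the scope: a per-frame "scratch" pool that is pre-generated once and reset at the end of every frame, so nothing is allowed to outlive the frame. FramePool, obtain and resetFrame are names I made up for the example:

// A frame-scoped pool: everything acquired during a frame is
// implicitly released when the frame ends. Holding a reference
// across frames is a programmer error by contract.
class FramePool {
  constructor(size) {
    this.items = [];
    for (let i = 0; i < size; i++) {
      this.items.push({ x: 0, y: 0 });
    }
    this.used = 0;
  }

  // Hand out the next pre-generated object for this frame.
  obtain() {
    if (this.used >= this.items.length) {
      this.items.push({ x: 0, y: 0 }); // grow if the initial guess was too small
    }
    return this.items[this.used++];
  }

  // Called once at the end of the frame: every object becomes free again.
  resetFrame() {
    this.used = 0;
  }
}

// Usage: all temporaries live only inside the update step.
const scratch = new FramePool(64);
function update() {
  const tmp = scratch.obtain();
  tmp.x = 1;
  tmp.y = 2;
  // ... do per-frame math with tmp ...
  scratch.resetFrame(); // nothing acquired this frame may be kept
}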
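
And for (2) and (3), a small sketch of pre-calculating data into a pre-allocated buffer in JS. The sine lookup table is just an arbitrary example of the idea:

// Pre-allocate one flat buffer up front (the "raw bytes" habit)
// and pre-calculate values into it, instead of recomputing or
// reallocating during gameplay.
const TABLE_SIZE = 1024;
const sinTable = new Float32Array(TABLE_SIZE);
for (let i = 0; i < TABLE_SIZE; i++) {
  sinTable[i] = Math.sin((i / TABLE_SIZE) * 2 * Math.PI);
}

// Cheap approximate lookup at runtime -- no allocation, no garbage.
function fastSin(radians) {
  const t = radians / (2 * Math.PI);
  const index = ((t - Math.floor(t)) * TABLE_SIZE) | 0;
  return sinTable[index];
}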
