Why Unreal and CryEngine overload global operator new / delete

Started by fabiozhang. 2 comments, last by fabiozhang 10 years, 11 months ago

Most programmers are advised not to overload the global operator new/delete, especially when you are writing a library for users. But I find that these game engines do it. Any reasons?

The most frequent (and excellent) reason to use a custom allocator is to assist in hunting bugs. If you are on a debug build, most good allocators will record where the allocation took place, record an optional comment to help in debugging, include a dead zone around the allocation to help track down overruns, and occasionally even halt execution when memory is accessed after being released.
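To make that concrete, here is a minimal sketch (not Unreal's or CryEngine's actual code) of a debug replacement for the global operators: it wraps every allocation in guard bytes (the "dead zone") and poisons freed memory. Real engine allocators additionally record call sites and comments; all constants and the block layout here are illustrative. Replacing just these two functions also covers new[]/delete[] and the nothrow forms, because their default implementations forward to these.

#include <cstddef>
#include <cstdio>
#include <cstdlib>
#include <cstring>
#include <new>

// Sizes are multiples of max_align_t's alignment so the returned pointer
// keeps the alignment guarantee that global operator new must provide.
static constexpr std::size_t kHeader = alignof(std::max_align_t); // holds the size
static constexpr std::size_t kGuard  = alignof(std::max_align_t); // dead zone
static constexpr unsigned char kGuardByte = 0xFD; // written around each block
static constexpr unsigned char kFreedByte = 0xDD; // written over freed memory

void* operator new(std::size_t size)
{
    // Layout: [header: size][front guard][user block][rear guard]
    unsigned char* raw = static_cast<unsigned char*>(
        std::malloc(kHeader + 2 * kGuard + size));
    if (!raw) throw std::bad_alloc();

    std::memcpy(raw, &size, sizeof size);
    std::memset(raw + kHeader, kGuardByte, kGuard);
    unsigned char* user = raw + kHeader + kGuard;
    std::memset(user + size, kGuardByte, kGuard);
    return user;
}

void operator delete(void* ptr) noexcept
{
    if (!ptr) return;
    unsigned char* user = static_cast<unsigned char*>(ptr);
    unsigned char* raw = user - kGuard - kHeader;
    std::size_t size;
    std::memcpy(&size, raw, sizeof size);

    // A corrupted guard means some code wrote past either end of the block.
    for (std::size_t i = 0; i < kGuard; ++i) {
        if (raw[kHeader + i] != kGuardByte || user[size + i] != kGuardByte) {
            std::fprintf(stderr, "heap corruption detected near %p\n", ptr);
            std::abort();
        }
    }
    std::memset(user, kFreedByte, size); // poison to expose use-after-free
    std::free(raw);
}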

Another good reason to use a custom allocator is to do things the standard allocators do not provide. Many standard allocators have poor performance for frequent small allocations, or they may not provide ways to get specific alignment, or they may not provide methods to simply discard the contents rather than following a traditional destruction and teardown, or whatever else you may have a need for.
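As a sketch of the small-allocation point, assuming a hypothetical FixedPool class: same-sized blocks come out of one slab through a free list, so an allocation is a couple of pointer moves, and the whole pool can be discarded with a single free instead of a per-object teardown.

#include <cstddef>
#include <cstdlib>
#include <new>

class FixedPool {
public:
    FixedPool(std::size_t blockSize, std::size_t blockCount)
        : blockSize_(blockSize < sizeof(void*) ? sizeof(void*) : blockSize),
          slab_(static_cast<char*>(std::malloc(blockSize_ * blockCount)))
    {
        if (!slab_) throw std::bad_alloc();
        // Thread every block onto the free list.
        for (std::size_t i = 0; i < blockCount; ++i) {
            void* block = slab_ + i * blockSize_;
            *static_cast<void**>(block) = freeList_;
            freeList_ = block;
        }
    }

    // "Discarding" the pool is one free; no per-object destruction happens.
    ~FixedPool() { std::free(slab_); }

    void* allocate()
    {
        if (!freeList_) return nullptr; // pool exhausted; caller decides what to do
        void* block = freeList_;
        freeList_ = *static_cast<void**>(block);
        return block;
    }

    void deallocate(void* block)
    {
        *static_cast<void**>(block) = freeList_;
        freeList_ = block;
    }

private:
    std::size_t blockSize_;
    char* slab_;
    void* freeList_ = nullptr;
};

A caller might construct FixedPool(64, 1024) for up to 1024 blocks of 64 bytes; the block size should be a multiple of the alignment required by the objects stored in it.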

Finally, in cross-platform development it is sometimes necessary to provide a memory manager that works the same on all systems.
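A minimal sketch of that idea, with illustrative EngineAlloc/EngineFree names: funnel every allocation through one engine-owned entry point and hide the per-platform differences (such as aligned-allocation APIs) behind it, so every system sees identical alignment and failure behavior.

#include <cstdlib>
#if defined(_WIN32)
#include <malloc.h>
#endif

void* EngineAlloc(std::size_t size, std::size_t alignment)
{
#if defined(_WIN32)
    return _aligned_malloc(size, alignment);
#else
    // std::aligned_alloc (C++17) requires size to be a multiple of alignment.
    size = (size + alignment - 1) / alignment * alignment;
    return std::aligned_alloc(alignment, size);
#endif
}

void EngineFree(void* ptr)
{
#if defined(_WIN32)
    _aligned_free(ptr);
#else
    std::free(ptr);
#endif
}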



Keeping that in mind, it is best for professional library creators to build libraries whose allocators can be replaced. That does not mean that you, as the library provider, replace them; it means that the consumer of the library has the ability to replace them with their own allocators to meet their needs.
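The usual shape of that pattern is a pair of function-pointer hooks the host application installs before using the library. The MyLib* names below are hypothetical, but many middleware APIs look much like this:

#include <cstdlib>

// Signatures for user-supplied hooks; userData lets the host thread its own
// context through without needing globals on its side.
using AllocFn = void* (*)(std::size_t size, void* userData);
using FreeFn  = void  (*)(void* ptr, void* userData);

static void* DefaultAlloc(std::size_t size, void*) { return std::malloc(size); }
static void  DefaultFree(void* ptr, void*)         { std::free(ptr); }

static AllocFn g_alloc    = DefaultAlloc;
static FreeFn  g_free     = DefaultFree;
static void*   g_userData = nullptr;

// Called once by the host application before it uses the library.
void MyLibSetAllocator(AllocFn allocFn, FreeFn freeFn, void* userData)
{
    g_alloc    = allocFn;
    g_free     = freeFn;
    g_userData = userData;
}

// Every allocation inside the library goes through these, never malloc/new.
void* MyLibAlloc(std::size_t size) { return g_alloc(size, g_userData); }
void  MyLibFree(void* ptr)         { g_free(ptr, g_userData); }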


On the other hand, a poorly written memory manager can cause no end of trouble. Bugs in the system can be insanely difficult to hunt down, and poor implementations can cause performance problems and other issues. Usually, if you have to ask questions about it, you probably aren't qualified to do it.

Consequently, overriding the allocators is something normally recommended AGAINST unless you have a specific need. For hobby work and small stuff the built-in debug libraries generally have pretty good debug support. If you do have a specific need, it is generally recommended to use an existing, well-written and debugged library.


Engines like the ones you mentioned gain all three of the benefits listed above, and they use solid implementations to avoid the potential problems listed above. That makes a custom allocator within their engine a reasonable decision for them, where it may not be a reasonable decision for a student, an inexperienced developer, or a hobbyist.

"especially when you are writing a library for users"

When writing a core game engine like Unreal/CryEngine, you're basically writing the very core of your application. In this sense, you're not making a library but the immovable foundation of your app, so you can do what you want. In this situation, you're making an application and other people are writing libraries for you (e.g. these engines will use Bink, FMod, Scaleform, SpeedTree, etc.), and it would be bad for those libraries to override a global function: only one definition of the global operators can exist in the final executable, so a library that replaced them couldn't be properly integrated into arbitrary applications.

As Frob mentions above, engines will override global memory functions primarily for tracking, profiling and debugging purposes.
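Putting the two answers together, here is a minimal, self-contained sketch (all names hypothetical, not any engine's real API) of that division of labor: the application replaces the global operators, the library only exposes hooks, and the application points those hooks at its own tracked heap.

#include <cstdio>
#include <cstdlib>
#include <new>

// --- Application side: one heap everything funnels into. A crude byte
// --- counter stands in for a real engine's tracking and profiling.
static std::size_t g_routedBytes = 0;

void* AppAlloc(std::size_t size)
{
    g_routedBytes += size;            // not thread-safe; illustration only
    return std::malloc(size);
}
void AppFree(void* ptr) { std::free(ptr); }

// --- The application (never a library) replaces the global operators...
void* operator new(std::size_t size)
{
    if (void* p = AppAlloc(size)) return p;
    throw std::bad_alloc();
}
void operator delete(void* ptr) noexcept { AppFree(ptr); }

// --- ...while a well-behaved library only exposes hooks the app can set.
using LibAllocFn = void* (*)(std::size_t);
using LibFreeFn  = void  (*)(void*);
static LibAllocFn g_libAlloc = std::malloc;
static LibFreeFn  g_libFree  = std::free;
void LibSetAllocator(LibAllocFn a, LibFreeFn f) { g_libAlloc = a; g_libFree = f; }

// A stand-in for some internal library allocation.
void* LibCreateObject()           { return g_libAlloc(16); }
void  LibDestroyObject(void* obj) { g_libFree(obj); }

int main()
{
    LibSetAllocator(AppAlloc, AppFree); // route the library into the app heap
    int* p = new int(42);               // goes through the global override
    void* obj = LibCreateObject();      // goes through the installed hooks
    LibDestroyObject(obj);
    delete p;
    std::printf("bytes routed through app heap: %zu\n", g_routedBytes);
}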



Thanks for your reply. I don't doubt the benefits of writing a custom allocator; I was just wondering why these game engines overload the global new/delete, since I thought overloading them for a specific base class would be better for a library.

However, I think I got my answer from Hodgman's reply. Thank you both.

