Why Unreal and CryEngine overload global operator new / delete



#1 fabiozhang   Members   -  Reputation: 103


Posted 21 May 2013 - 11:52 PM

Most programmers are advised not to overload the global operator new/delete, especially when you are writing a library for users. But I find that these game engines do it. Any reasons?
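
For context, a minimal sketch of what I mean by overloading the global operators. EngineAlloc/EngineFree are just placeholders that forward to malloc/free here; a real engine would route them into its own memory manager.

#include <cstdlib>
#include <new>

// Hypothetical stand-ins for an engine's memory manager.
static void* EngineAlloc(std::size_t size) { return std::malloc(size); }
static void  EngineFree(void* ptr)         { std::free(ptr); }

void* operator new(std::size_t size)
{
    if (size == 0)
        size = 1;                       // operator new must return a unique pointer
    if (void* p = EngineAlloc(size))
        return p;
    throw std::bad_alloc();
}

void* operator new[](std::size_t size) { return operator new(size); }

void operator delete(void* ptr) noexcept   { EngineFree(ptr); }
void operator delete[](void* ptr) noexcept { operator delete(ptr); }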

 




#2 frob   Moderators   -  Reputation: 22246


Posted 22 May 2013 - 12:20 AM

The most frequent (and excellent) reason to use a custom allocator is to assist in hunting bugs. In a debug build, most good allocators will record where the allocation took place, record an optional comment to help in debugging, include a dead zone around the allocation to help track down overruns, and sometimes even stop execution when memory is accessed after being released.
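
A rough sketch of that kind of debug bookkeeping, assuming a header stored in front of each allocation and a guard zone behind it. The names and layout are illustrative, not taken from any real engine.

#include <cassert>
#include <cstdlib>
#include <cstring>

struct DebugHeader {
    const char* file;     // source file of the allocation
    int         line;     // source line of the allocation
    const char* comment;  // optional note to help in debugging
    std::size_t size;     // user-visible size of the allocation
};

static const unsigned char kGuard[4] = { 0xFD, 0xFD, 0xFD, 0xFD };

void* DebugAlloc(std::size_t size, const char* file, int line,
                 const char* comment = nullptr)
{
    // layout: [DebugHeader][user payload][guard bytes]
    unsigned char* raw = static_cast<unsigned char*>(
        std::malloc(sizeof(DebugHeader) + size + sizeof(kGuard)));
    if (!raw)
        return nullptr;
    DebugHeader* h = reinterpret_cast<DebugHeader*>(raw);
    *h = DebugHeader{ file, line, comment, size };
    std::memcpy(raw + sizeof(DebugHeader) + size, kGuard, sizeof(kGuard));
    return raw + sizeof(DebugHeader);
}

void DebugFree(void* ptr)
{
    unsigned char* raw = static_cast<unsigned char*>(ptr) - sizeof(DebugHeader);
    DebugHeader* h = reinterpret_cast<DebugHeader*>(raw);
    // a damaged guard zone means something wrote past the end of the block
    assert(std::memcmp(raw + sizeof(DebugHeader) + h->size,
                       kGuard, sizeof(kGuard)) == 0);
    std::free(raw);
}

// Usually wired up through a macro so every call site is recorded automatically:
#define DEBUG_NEW(size) DebugAlloc((size), __FILE__, __LINE__)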

Another good reason to use a custom allocator is to do things the standard allocators do not provide. Many standard allocators have poor performance for frequent small allocations, or they may not provide ways to get specific alignment, or they may not provide methods to simply discard the contents rather than following a traditional destruction and teardown, or whatever else you may have a need for.
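
As one example of a capability many standard allocators lack, here is a hand-rolled aligned-allocation wrapper (e.g. 16-byte alignment for SIMD data). Purely illustrative, and it assumes alignment is a power of two.

#include <cstdint>
#include <cstdlib>

void* AlignedAlloc(std::size_t size, std::size_t alignment)
{
    // over-allocate: room for the padding plus a pointer back to the real block
    void* raw = std::malloc(size + alignment + sizeof(void*));
    if (!raw)
        return nullptr;
    std::uintptr_t base    = reinterpret_cast<std::uintptr_t>(raw) + sizeof(void*);
    std::uintptr_t aligned = (base + alignment - 1)
                             & ~(static_cast<std::uintptr_t>(alignment) - 1);
    reinterpret_cast<void**>(aligned)[-1] = raw;   // remember the original pointer
    return reinterpret_cast<void*>(aligned);
}

void AlignedFree(void* ptr)
{
    if (ptr)
        std::free(reinterpret_cast<void**>(ptr)[-1]);
}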

Finally, in cross-platform development it is sometimes necessary to provide a memory manager that works the same on all systems.



Keeping that in mind, it is best for professional library creators to build libraries that can have their allocators replaced. That does not mean that you as the library provider replace them; it means that the consumer of the library has the ability to replace them with their own allocators to meet their needs.
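
A sketch of that replaceable-allocator pattern, assuming a hypothetical library namespace mylib whose consumer can install its own hooks. The engine embedding the library would call mylib::SetAllocator(...) once during startup, before anything in the library allocates.

#include <cstdlib>

typedef void* (*AllocFn)(std::size_t size);
typedef void  (*FreeFn)(void* ptr);

namespace mylib {

    // default to the C runtime unless the consumer installs something else
    static AllocFn g_alloc = std::malloc;
    static FreeFn  g_free  = std::free;

    void SetAllocator(AllocFn alloc_fn, FreeFn free_fn)
    {
        g_alloc = alloc_fn;
        g_free  = free_fn;
    }

    // every internal allocation in the library goes through these two calls
    void* Alloc(std::size_t size) { return g_alloc(size); }
    void  Free(void* ptr)         { g_free(ptr); }

} // namespace mylib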


On the other hand, a poorly written memory manager can cause no end of trouble. Bugs in the system can be insanely difficult to hunt down, and poor implementations can cause performance problems and other issues. Usually, if you have to ask questions about it, then you probably aren't qualified to do it.

Consequently, overriding the allocators is something normally recommended AGAINST unless you have a specific need. For hobby work and small stuff the built-in debug libraries generally have pretty good debug support. If you do have a specific need, it is generally recommended to use an existing, well-written and debugged library.


Engines like the ones you mentioned gain all three of the benefits listed above, and they use solid implementations to avoid the potential problems listed above. That means their use of a custom allocator within their engine is a reasonable decision for them, where it may not be a reasonable decision for a student, an inexperienced developer, or a hobbyist.

Check out my book, Game Development with Unity, aimed at beginners who want to build fun games fast.

Also check out my personal website at bryanwagstaff.com, where I write about assorted stuff.


#3 Hodgman   Moderators   -  Reputation: 31100


Posted 22 May 2013 - 12:40 AM

"especially when you are writing a library for users"

When writing a core game engine like Unreal/CryEngine, you're basically writing the very core of your application. In this sense, you're not making a library but the immovable foundations of your app, so you can do what you want. In this situation, you're making an application and other people are writing libraries for you: these engines will use Bink, FMod, Scaleform, SpeedTree, and so on. It would be bad for those libraries to override a global function, because then they couldn't be properly integrated into applications.

 

As Frob mentions above, engines will override global memory functions primarily for tracking, profiling and debugging purposes.



#4 fabiozhang   Members   -  Reputation: 103


Posted 22 May 2013 - 06:12 AM



Thanks for your reply. I don't doubt the benefit of writing a custom allocator; I just wondered why these game engines overload the global new/delete, because I thought that overloading them for a specific base class would be better for a library.
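
For reference, the class-scoped alternative I had in mind looks roughly like this. PoolAlloc/PoolFree are hypothetical stand-ins for a custom allocator; only the one class hierarchy is affected, and the global allocator is left alone.

#include <cstdlib>
#include <new>

static void* PoolAlloc(std::size_t size) { return std::malloc(size); }  // stand-in
static void  PoolFree(void* ptr)         { std::free(ptr); }            // stand-in

class EngineObject {
public:
    virtual ~EngineObject() {}

    // only EngineObject and classes derived from it use the custom allocator
    static void* operator new(std::size_t size)
    {
        if (void* p = PoolAlloc(size))
            return p;
        throw std::bad_alloc();
    }
    static void operator delete(void* ptr) { PoolFree(ptr); }
};

class Enemy : public EngineObject {};   // also allocated through PoolAlloc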

However, I think I got my answer from Hodgman's reply. Thank you both.

Edited by fabiozhang, 22 May 2013 - 07:02 AM.




