There is no particular reason why developers should have to explicitly manage the lifetime of resources, just as they no longer explicitly manage the lifetime of memory.
I agree that in a managed language you need a different mindset than you do in C/C++/etc. However, I've had a bunch of bugs in my C# tool chain from being lazy with resource lifetimes.
e.g. when my asset build tool runs, at one point a file is opened for writing and asset data is written into it. Later on, that same file is opened for reading, because another asset depends on that data.
If I just open up some streams and trust the GC to clean up my resources, the second open-for-read operation quite often fails, because the open-for-write handle hasn't been cleaned up yet. This forces me either to use C#'s using blocks, or to write C-style cleanup code (with the modern twist of doing it in finally blocks, etc.). When there are actual requirements on the lifetimes of resources, relying on the GC's arbitrary lifetime management doesn't work :/
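A minimal sketch of that failure mode, with made-up file names and helper methods (not the actual tool's code): if WriteAsset just did `new FileStream(...)` and never disposed it, the FileStream constructor in ReadAsset would often throw an IOException because the OS still considers the write handle open.

```csharp
using System;
using System.IO;

static class AssetBuild
{
    public static void WriteAsset(string path, byte[] data)
    {
        // The using block closes the write handle deterministically at the
        // end of the block, rather than whenever the GC/finalizer eventually
        // gets around to it.
        using (var stream = new FileStream(path, FileMode.Create, FileAccess.Write))
        {
            stream.Write(data, 0, data.Length);
        }
    }

    public static byte[] ReadAsset(string path)
    {
        // Safe: the earlier write handle is guaranteed to be closed by now.
        using (var stream = new FileStream(path, FileMode.Open, FileAccess.Read))
        {
            var data = new byte[stream.Length];
            stream.Read(data, 0, data.Length);
            return data;
        }
    }
}
```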
Let's say there's a basic game entity class, and among its members is a sprite object. There's also a graphics manager class that simply cycles through all existing sprites and draws them to the screen. When an entity is created, it hands the graphics manager a reference to its sprite so the manager can draw it. Now, what happens when that game entity dies? Well, let's assume it's kept track of by a world manager or something like that which does the game logic. The world manager checks whether the entity has hit 0 HP, removes it from its list of living entities, and it eventually gets eaten by the GC.
It's a pretty small change to fix that. When the world manager removes the dead entity, it also calls:
sprite.Dispose(); // you're dead now, clean up please
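A hedged sketch of what that Dispose might do; the GraphicsManager.Unregister method and the registration in the constructor are assumptions made up for illustration, not code from the original post:

```csharp
using System;
using System.Collections.Generic;

class GraphicsManager
{
    private readonly List<Sprite> sprites = new List<Sprite>();
    public int Count => sprites.Count; // just for observing the example

    public void Register(Sprite s) => sprites.Add(s);
    public void Unregister(Sprite s) => sprites.Remove(s);
}

class Sprite : IDisposable
{
    private readonly GraphicsManager manager;

    public Sprite(GraphicsManager manager)
    {
        this.manager = manager;
        manager.Register(this); // the manager now holds a strong reference
    }

    public void Dispose()
    {
        // Explicitly break the manager -> sprite link; after this, nothing
        // keeps the sprite alive and the GC is free to reclaim it.
        manager.Unregister(this);
    }
}
```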
I personally wouldn't use a design like this in C# or in C++ with RAII, though. You're making the assumption that removing the sprite from this list means it will stop being drawn... I'd much prefer this chain reaction to be stated explicitly, rather than relying on the unwritten assumption that it will just happen by magic.
Now remember that the graphics manager still has a reference to its sprite, so the GC will never touch it.
If you were writing this in C++ with RAII smart pointers, you'd have the exact same problem. You'd solve that problem in both C++ and C# by having your graphics manager use a weak reference.
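A sketch of the weak-reference version in C#, with made-up Sprite/GraphicsManager shapes: the manager's list no longer keeps sprites alive, and entries whose sprite has been collected get pruned during the draw pass.

```csharp
using System;
using System.Collections.Generic;

class Sprite
{
    public int DrawCount; // just for observing the example
    public void Draw() => DrawCount++;
}

class GraphicsManager
{
    // WeakReference<T> does not keep the sprite alive: once the owning
    // entity (and everything else) drops its reference, the GC can collect it.
    private readonly List<WeakReference<Sprite>> sprites =
        new List<WeakReference<Sprite>>();

    public void Register(Sprite s) => sprites.Add(new WeakReference<Sprite>(s));

    public void DrawAll()
    {
        // Walk backwards so collected entries can be removed mid-loop.
        for (int i = sprites.Count - 1; i >= 0; --i)
        {
            if (sprites[i].TryGetTarget(out Sprite s))
                s.Draw();
            else
                sprites.RemoveAt(i); // sprite was garbage-collected; prune it
        }
    }
}
```

The C++ equivalent would be the manager holding std::weak_ptr&lt;Sprite&gt; and calling lock() before each draw.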
Edited by Hodgman, 17 August 2013 - 12:19 AM.