Overusing smart pointers?

31 comments, last by sundersoft 11 years, 9 months ago
I have a feeling that I might be overusing smart pointers. Does anyone know of a good article on when to use them and when not to?
Probably not.

I always use smart pointers, unless there's a good reason not to. A good reason can be performance in tight loops or something like that, but those cases are rare and should never leak out of your interfaces. Note that using smart pointers doesn't mean 'don't think about ownership', which often results in overusing shared pointers. Quite the opposite in fact: different types of smart pointers are a great way to explicitly state ownership of an object.
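To illustrate that last point, here is a rough sketch of how the pointer type itself can document the ownership policy (the Texture/Material/Renderer names are made up for the example):

[code]
#include <memory>
#include <utility>

struct Texture  {};

struct Material {
    // shared_ptr: several materials may legitimately keep the same texture alive.
    std::shared_ptr<Texture> diffuse;
    explicit Material(std::shared_ptr<Texture> t) : diffuse(std::move(t)) {}
};

struct Renderer {
    // unique_ptr: the renderer is the sole owner of its default material.
    std::unique_ptr<Material> defaultMaterial;
    Renderer() : defaultMaterial(new Material(std::make_shared<Texture>())) {}
};
[/code]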
Actually Stroustrup himself suggests using smart pointers as a "last resort", especially shared_ptr. The danger of introducing smart pointers as standard in C++ is, IMO, the urge people have to use them everywhere, and I believe that is wrong for a series of reasons, but the top one is: they really make C++ uglier than it already is; having this const shared_ptr<Node>& all over the place is really ugly.

C++ object lifetime should be managed with the following priority (a quick sketch of each option follows the list):

1) Stack based. This is what C++ was designed for. If you can make it stack based, do it. The cleanest and fastest code is obtained this way.
2) Clean ownership, heap based with new and delete. One guy creates the object, and the same guy is responsible for deleting it.
3) unique_ptr, which is like 2, but only if you are willing to write unique_ptr<blabla>, .get() and all that nonsense over and over again to avoid a single call to "delete" in your destructor.
4) shared_ptr. That is, when the ownership is _really_ shared, so you can't possibly predict the lifetime and ownership of an object because it is used and kept alive by different objects.
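To make those four options concrete, here is a minimal sketch, using made-up Mesh/Material types:

[code]
#include <memory>

struct Mesh     {};
struct Material {};

void option1()
{
    Mesh mesh;                           // 1) stack based: destroyed automatically at scope exit
}

struct Scene
{
    Mesh* mesh;
    Scene()  : mesh(new Mesh) {}         // 2) clear ownership: the same object that news it...
    ~Scene() { delete mesh; }            //    ...is responsible for deleting it
};

struct Scene2
{
    std::unique_ptr<Mesh> mesh;          // 3) unique_ptr: same ownership as 2,
    Scene2() : mesh(new Mesh) {}         //    but no explicit delete is needed
};

void option4()
{
    auto mat = std::make_shared<Material>();       // 4) genuinely shared ownership:
    std::shared_ptr<Material> alsoAnOwner = mat;   //    both pointers keep the Material alive
}
[/code]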

So while I can't possibly think of ever using option 3, I know there are scenarios where 4 is useful. I use shared_ptr for materials, for example, and it works OK.
Also, shared_ptr can create leaks that are REALLY hard to detect if you manage to create a reference loop.
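For reference, the classic reference-loop case looks something like this (Node is just an illustrative name); the usual fix is to break the cycle with weak_ptr:

[code]
#include <memory>

struct Node {
    std::shared_ptr<Node> child;
    std::weak_ptr<Node>   parent;   // if this were a shared_ptr, the cycle below would leak
};

void buildTree()
{
    auto root  = std::make_shared<Node>();
    auto child = std::make_shared<Node>();
    root->child   = child;
    child->parent = root;           // weak_ptr: does not keep root alive
}                                   // both nodes destroyed here; with a shared_ptr parent, neither would be
[/code]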

All the above is IMO IMO IMO... smart pointers just made it into the C++11 standard and there is still no book about best practices out yet, so the jury is still out on this one.

Stefano Casillo
TWITTER: [twitter]KunosStefano[/twitter]
AssettoCorsa - netKar PRO - Kunos Simulazioni


[quote]Actually Stroustrup himself suggests using smart pointers as a "last resort", especially shared_ptr. The danger of introducing smart pointers as standard in C++ is, IMO, the urge people have to use them everywhere, and I believe that is wrong for a series of reasons, but the top one is: they really make C++ uglier than it already is; having this const shared_ptr<Node>& all over the place is really ugly.[/quote]

IMHO, correctness should come before aesthetics every time. A typedef can be a big help, as can auto if available.
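For example (NodePtr is just an illustrative alias):

[code]
#include <memory>

struct Node {};

typedef std::shared_ptr<Node> NodePtr;        // C++11 alternative: using NodePtr = std::shared_ptr<Node>;

void addChild(const NodePtr& /*parent*/) {}   // reads better than const std::shared_ptr<Node>&

void example()
{
    auto node = std::make_shared<Node>();     // auto hides the spelled-out type entirely
    addChild(node);
}
[/code]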


[quote]C++ object lifetime should be managed with the following priority:

1) Stack based. This is what C++ was designed for. If you can make it stack based, do it. The cleanest and fastest code is obtained this way.[/quote]
I definitely agree with this.


[quote]2) Clean ownership, heap based with new and delete. One guy creates the object, and the same guy is responsible for deleting it.
3) unique_ptr, which is like 2, but only if you are willing to write unique_ptr<blabla>, .get() and all that nonsense over and over again to avoid a single call to "delete" in your destructor.[/quote]
If you've got unique_ptr, you've quite possibly got auto too. You get brevity with additional safety for free. And the more you standardise on smart pointers, the less you need .get().

Besides, without something like unique_ptr, one would typically be required to make more than one call to delete: wherever error handling occurs, at early returns, inside surrounding catch clauses, and so on.
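A rough sketch of that difference, with made-up names; every early exit from the raw-pointer version needs its own delete, while the unique_ptr version cleans up on every path, including exceptions:

[code]
#include <memory>

struct Resource { bool valid() const { return true; } };

bool processRaw()
{
    Resource* r = new Resource;
    if (!r->valid()) {
        delete r;                   // every early return needs its own cleanup
        return false;
    }
    // ... if anything in here throws, the Resource leaks ...
    delete r;
    return true;
}

bool processSafe()
{
    std::unique_ptr<Resource> r(new Resource);
    if (!r->valid())
        return false;               // freed automatically
    // ... if anything in here throws, it is still freed automatically ...
    return true;
}
[/code]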
I see a flamewar coming, let's keep it calm.

The problem with auto is that it can also be overused. You get safety and shorter lines of code, but it's also harder to grok for another human being who didn't write the code. You have no idea what type the variable holds unless you lean on advanced IDE tools. Furthermore, I'm starting to see programmers use auto for almost every single variable they declare (which is completely unnecessary, btw).
Using auto inside a template is incredibly useful because, by definition, templates don't know the types of the objects being passed into them until they are instantiated.
But IMHO, outside of a template, auto should only be used as a last resort, typically where it has a real benefit over explicitly declaring the type of the variable. This isn't a new problem: interpreted languages (Python, Lua) already have this issue. It's not a problem of safety, style, or aesthetics. It's a problem of deeply understanding what your (or someone else's) code is doing, which is very important in close-to-the-metal languages like Asm, C & C++.

In Python & Lua, this problem is alleviated because one can write "print( variable )" and they also have an interactive interpreter. In C++ we have none of that.
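A small illustration of both sides of the argument (the names here are made up):

[code]
#include <cstddef>
#include <map>
#include <string>
#include <vector>

template <typename Container>
void process(const Container& c)
{
    // Inside a template, auto is hard to avoid: the element type isn't known until instantiation.
    for (auto it = c.begin(); it != c.end(); ++it) { /* ... */ }
}

void caller()
{
    std::map<std::string, std::vector<int> > scores;

    // Here auto genuinely helps: the iterator type is long and tells the reader nothing new.
    auto it = scores.find("player1");

    // Here it mostly hides information from the next reader:
    auto count = scores.size();                // what is count, exactly?
    std::size_t sameCount = scores.size();     // the explicit type answers that immediately

    (void)it; (void)count; (void)sameCount;
}
[/code]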
I tend to rarely use smart pointers. When I was first introduced to them, I overused shared_ptr, and I would often end up with some interesting crashes in destructors on program exit. Then I started thinking more about why I actually needed them for my situation, and it turned out I didn't. I often have manager classes that are the sole place for creation and destruction of whatever resource they're managing. When I need a pointer to a resource, I ask the manager for it.

Not that there aren't any reasons to use smart pointers, but I tend to structure code in a way where I don't need them. Sometimes this can't be done, and sometimes it'd just be too much work to refactor the code. I know of a multi-million line program that is littered with shared pointers, and occasionally those developers will run into a problem with them.
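For what it's worth, a minimal sketch of that kind of manager (the texture names and layout are made up for the example):

[code]
#include <map>
#include <string>

struct Texture {};

// The manager is the single place where textures are created and destroyed;
// everyone else just borrows non-owning pointers and must not outlive the manager.
class TextureManager {
    std::map<std::string, Texture*> textures;
public:
    Texture* get(const std::string& name)
    {
        Texture*& slot = textures[name];   // value-initialised to null on first lookup
        if (!slot)
            slot = new Texture;            // create on first request
        return slot;                       // the caller does not own this
    }

    ~TextureManager()
    {
        for (auto& entry : textures)
            delete entry.second;           // the sole place of destruction
    }
};
[/code]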
shared_ptr can be overused; unique_ptr, on the other hand, has no overhead anyway (except maybe a syntactical one). But the benefits you get are a much lower chance of producing memory leaks, exception safety, etc. Also, the "overhead" of writing unique_ptr is often compensated by the fact that you don't even need an explicit destructor in lots of cases.
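For example, a class that holds its resource through a unique_ptr member usually needs no hand-written destructor at all (Image/Pixels are placeholder names):

[code]
#include <memory>

struct Pixels {};

class Image {
    std::unique_ptr<Pixels> data;
public:
    Image() : data(new Pixels) {}
    // No destructor and no delete needed: data cleans itself up, and the class is
    // automatically non-copyable instead of silently double-deleting a raw pointer.
};
[/code]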

Anyone who dismisses smart pointers with "all it saves you is a delete in the destructor" has apparently never heard of exception safety.
So far I have mostly used unique_ptr (or the old auto_ptr), because it's a simple way of getting exception safety when allocating something locally.

Can't think of any time when I've had to resort to shared_ptr; there always seems to be a better option.
Aah great, lots of good stuff here. I think my greatest problem is actually that I'm using the heap more than I should.
In my own code I tend to use local variables and standard containers the vast majority of the time, which makes explicit use of pointers rather infrequent. I think this style of programming has saved me from many headaches, and I strongly advocate it.

When I do have to use a pointer (e.g., for the return value of a factory), I often end up using a shared_ptr because I don't trust the caller of that code (usually me) not to mess up the ownership. But I don't feel very strongly about this choice and perhaps I'll change it in the future.
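Roughly what that looks like in my code (the names are illustrative):

[code]
#include <memory>
#include <string>

struct Enemy {
    explicit Enemy(const std::string& type) : type(type) {}
    std::string type;
};

// Returning shared_ptr means the caller can't forget to delete the result
// and can't delete it twice, however carelessly it is passed around.
std::shared_ptr<Enemy> createEnemy(const std::string& type)
{
    return std::make_shared<Enemy>(type);
}

void spawn()
{
    auto grunt = createEnemy("grunt");   // cleaned up automatically when the last owner goes away
}
[/code]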

This topic is closed to new replies.
