Private virtual functions, public virtual destructors, and protected non-virtual destructors

Started by
12 comments, last by Hodgman 10 years ago

I have 3 questions regarding the virtual keyword. The code speaks for itself, I have inserted the 3 questions as comments:


#include <iostream>

class Base
{
public:
    Base() {}
    void doThing() { this->doThingImpl(); }
protected:
    ~Base() {}
private:
    virtual void doThingImpl() {}
};

class Derived : public Base
{
public:
    Derived() {}
    ~Derived() {}  // #1 Why does this have to be virtual if it is not to be inherited from?
private:
    virtual void doThingImpl() {} // #2 Why do people make derived member functions virtual?
};

class Derived2 : public Base
{
public:
    Derived2() {}
    virtual ~Derived2() {}
private:
    virtual void doThingImpl() final {} // #3 Being a final member function, should this now be virtual or not?
};

int main()
{
    Derived* test = new Derived();
    delete test; // warning: deleting object of polymorphic class type 'Derived' which has
                 // non-virtual destructor might cause undefined behaviour [-Wdelete-non-virtual-dtor]

    Derived2* test2 = new Derived2();
    delete test2;
}

1: It doesn't, but there's nothing in the code saying that it can't be inherited from. The warning protects you from mistakes, but it can have false positives.

2: It is a virtual function whether it's marked so or not; in C++03 you're just making that explicit. In C++11 you can mark the function with "override" for better documentation of intent and error protection. See also below:

3: Final implies virtual, so it is redundant information. However, having virtual there makes it easy to use search functions to find virtual methods. So it comes down to style.
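A minimal sketch of points 2 and 3 in C++11 (names are illustrative, and the destructor is made public and virtual here so the objects can be deleted through a base pointer):

```cpp
class Shape
{
public:
    int doThing() { return doThingImpl(); } // public non-virtual interface
    virtual ~Shape() {}
private:
    virtual int doThingImpl() { return 0; }
};

class Circle : public Shape
{
private:
    // 'override' (C++11) documents intent and makes the compiler verify
    // that this really overrides a virtual function in the base class.
    int doThingImpl() override { return 1; }
};

class Square : public Shape
{
private:
    // 'final' implies virtual; also writing the 'virtual' keyword is pure style.
    int doThingImpl() final { return 2; }
};
```

Dispatch still goes through the vtable even though the overrides are private, because access control is checked at the call site (inside Shape::doThing), not at dispatch time.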


Also...

#1) The function signature became virtual in the base class. Anything derived from Base has an entry for it in the vtable, so calls through the base class will always look the function up, even if you mark it as final. Writing the keyword 'virtual' in front of it at this point is optional; it retains the virtual-ness from the base class. Explicitly marking it as virtual can be useful to other programmers so they don't need to look up the details elsewhere.

You can remove virtual-ness now that C++11 has introduced the "final" keyword to stop things being virtual below that level, but you probably don't want to do that to a destructor unless you want partially-destroyed objects. The benefit of losing virtual-ness is that the compiler can branch directly to the function instead of dereferencing a pointer into the vtable and branching from there. If you want to stop the inheritance chain, mark the class as final instead of the destructor.
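As a sketch (names are made up for illustration), sealing the class rather than the destructor:

```cpp
class Widget
{
public:
    virtual ~Widget() {}
    virtual int value() { return 1; }
};

// 'final' on the class stops the entire inheritance chain, which lets the
// compiler devirtualize calls made through a SealedWidget pointer/reference.
class SealedWidget final : public Widget
{
public:
    int value() override { return 2; }
};

// class Broken : public SealedWidget {}; // error: cannot derive from 'final' class
```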

#2) You make them virtual so you can modify the behavior in a child class.

One very frequent example of this is given in the OnPreUpdate, OnUpdate, and OnPostUpdate example above. Imagine you have a bunch of game objects, one of them is a turret. You probably want an update function to turn to face the target, so you can override OnUpdate() and put your behavior in there. If you look through common frameworks you will see a lot of adjustable behavior with OnXxx style virtual functions.
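A sketch of that OnXxx pattern (the Turret class and its angle member are invented for illustration):

```cpp
class GameObject
{
public:
    // Non-virtual public interface (the Template Method pattern):
    // the base class fixes the order, derived classes fill in the steps.
    void Update()
    {
        OnPreUpdate();
        OnUpdate();
        OnPostUpdate();
    }
    virtual ~GameObject() {}
protected:
    virtual void OnPreUpdate() {}
    virtual void OnUpdate() {}
    virtual void OnPostUpdate() {}
};

class Turret : public GameObject
{
public:
    float angle = 0.0f;
protected:
    void OnUpdate() override { angle += 0.1f; } // turn toward the target
};
```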

If a derived class is also intended to be derived from, for example, you have a base "SelfAimingTurret" class and it is designed as the base for "LaserTurret", "MissileTurret", "SlimeTurret", you can add new virtual functions that are used to specialize the derived classes.

#3) Final is new in C++11. When you use it the function continues to be in the vtable but no longer needs to be looked up in some cases.

The function is still virtual in some senses. The class and every derived class will still have a vtable entry for it, because the base functionality requires it: code inside Base that calls doThingImpl() must still be able to look the function up, and the entry for that class and its derived classes will continue to point at that function.

The function stops being virtual in other senses. That class and derived classes can use the function directly instead of looking up the address in the vtable. So if code inside Derived2 calls doThingImpl(), the compiler can branch to the function directly without the lookup.

B) Interfaces shouldn't be virtual, only implementations.

B is backwards. Base classes should be at least partially virtual, some people argue they should be pure virtual (aka abstract).

That article you linked to was advocating never having the public interface virtual (except the destructor), only the protected/private interface. The private interface can still be pure-virtual. "Guideline #1: Prefer to make interfaces nonvirtual, using Template Method."

The reason for the protected non-virtual and public virtual destructor is normally pretty clear. Base *p = new Derived(); delete p; If the base destructor is non-virtual the wrong destructor will be called. Hence the guideline that if a class is designed for inheritance then the destructor should be virtual.
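A sketch of why the base destructor must be virtual when deleting through a base pointer (the flag exists only so the effect is observable; names are illustrative):

```cpp
static bool derivedDtorRan = false;

class GoodBase
{
public:
    virtual ~GoodBase() {} // public and virtual: safe to delete via GoodBase*
};

class GoodDerived : public GoodBase
{
public:
    ~GoodDerived() { derivedDtorRan = true; }
};
```

With a non-virtual base destructor, `delete p` through a GoodBase* would skip ~GoodDerived() entirely and the behaviour would be undefined.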

Yep, I already do that part.
It's this part:

On the flip side, by making the destructor non-virtual and protected you make it so others cannot derive from the class.

...that was confusing me, because protected doesn't stop inheritance.

Making the destructor private makes it so others can't derive from the class. Making the destructor protected is giving them explicit permission to do so.
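Sketched out (names are illustrative):

```cpp
class MixinBase
{
public:
    int tag() const { return 42; }
protected:
    ~MixinBase() {} // protected and non-virtual: deriving is permitted,
                    // but outsiders cannot delete through a MixinBase*
};

class User : public MixinBase {}; // fine: derived classes can reach the dtor

// MixinBase* p = new User();
// delete p; // error: '~MixinBase' is protected within this context
```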


But what good is a class with a private destructor? If it's private, you can't call delete on it - so it's no good when using it directly. If it's private, you can't derive from it - so it's no good as a base class. A class with a private destructor seems unusable. What am I missing?

Both Frob and Herb Sutter's article are saying: "Guideline #4: A base class destructor should be either public and virtual, or protected and nonvirtual."
The need for virtual destructors I understand fine. What benefit does protected access give to non-virtual destructors?
@Erik Rufelt
I think the "two jobs" issue only applies to concrete base classes that can be extended by overriding virtual functions, i.e. implementation inheritance.

Abstract base classes (i.e. interface inheritance) don't have this problem.

The equivalent of the interface/implements keyword in C++ is to make an abstract base class, and inherit from it using public-virtual inheritance (or just public inheritance if you trust you're not going to get into any multiple-inheritance induced traps). Any other kind of inheritance is equivalent to the 'extends' keyword, and should be pretty rare.
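For example, a pure-virtual base used as an interface (names are made up):

```cpp
// Interface inheritance: no state and no concrete behaviour to partially
// override, so the "two jobs" problem doesn't arise.
class IRenderer
{
public:
    virtual ~IRenderer() {}  // public virtual dtor: deletable via IRenderer*
    virtual int Draw() = 0;  // pure virtual: the class is abstract
};

class GlRenderer : public IRenderer
{
public:
    int Draw() override { return 1; }
};
```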

@Servant - private destructors are great for all sorts of alternative interface/implementation schemes. E.g. With COM, you delete an object by calling a virtual Release function. Internally, it would be capable of calling delete this, while at the same time you can be sure that clients are not allowed to.
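A COM-flavoured sketch of that idea (heavily simplified and single-threaded; the flag exists only to make destruction observable):

```cpp
static bool released = false;

class RefCounted
{
public:
    static RefCounted* Create() { return new RefCounted(); }
    void AddRef()  { ++refs; }
    void Release() { if (--refs == 0) delete this; } // the only way to destroy it
private:
    RefCounted() : refs(1) {}
    ~RefCounted() { released = true; } // private: clients cannot call delete
    int refs;
};
```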

I also see a common implementation-hiding pattern where there is only ever one derived type (such as in cross-platform libs)
struct Foo : NoCreate { int Get(); };
struct Bar : NoCreate { Foo* Stuff(); };
//cpp
struct FooImp { int i; };
struct BarImp { FooImp f; };
int Foo::Get() { return ((FooImp*)this)->i; }
Foo* Bar::Stuff() { return (Foo*)&((BarImp*)this)->f; }
Where "NoCreate" had private constructors/destructors/assignment.
This makes the implementation ugly, but keeps the interface very clean.


@TheComet
Your base class destructor would normally be virtual in that case, so the compiler is trying to be helpful and warn you. Since you're using new/delete on the same concrete type, it's safe to ignore/disable that warning.

