
# Private virtual functions, public virtual destructors, and protected non-virtual destructors

## 13 posts in this topic

Most experienced C++ programmers should be familiar with all four of these detailed guidelines.

Thank you for the link, that helped explain it a lot.

Item 4 on the list covers this specific case. Seeing all the protected functions means the class is meant for inheritance, so it seems public and virtual is the rule.

Ah, I glanced at the article and didn't see any virtual functions (and Ctrl+F'd for 'abstract', 'inheritance', 'derived', etc., and didn't find any), so I thought you were saying that concrete classes not intended to be inherited from should have virtual destructors anyway.

What you and the article seem to be saying is: (correct me if I'm wrong)
A) Concrete classes should almost never be derived from.
B) Interfaces shouldn't be virtual, only implementations.
C) Abstract classes should always either have a protected destructor, or a public virtual destructor (which is what you initially said, and what the article says).

I'm still a bit confused about ([tt]C[/tt]) though - I get why destructors need to be virtual if they are to be derived, but I don't get why (and when) they should be protected if non-virtual.

Part of my confusion was that I never realized you could delete objects with private/protected destructors even from outside the class, and that private virtual functions can still be overridden (just not called) by derived classes.
Two interesting twists on class member accessibility that I have to learn better!
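A minimal sketch of that second twist (the class names here are mine, just for illustration): a derived class can override a private virtual even though it can never call the base version.

```cpp
#include <cassert>

class Machine {
public:
    int run() { return hook(); }        // public non-virtual entry point
private:
    virtual int hook() { return 1; }    // private, yet overridable by derived classes
};

class Robot : public Machine {
private:
    // This overrides Machine::hook even though hook is private in Machine.
    // Robot still cannot *call* Machine::hook directly.
    int hook() override { return 2; }
};
```

Calling run() through a Machine pointer then dispatches to Robot's override, even though Robot never had access to the base version.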

So is it good practice for concrete classes to enforce their concreteness by declaring the class 'final'?

And how does making the destructor non-virtual protected actually help avoid the crashes here:

```cpp
Base* b = new Derived;
delete b; // Base::~Base() had better be virtual!
```

...if you can still call delete on classes (such as base classes) with protected/private destructors?

What benefit does a protected destructor actually give (other than the inability to manually call Base::~Base())?

Overuse of virtual functions can quickly ruin game performance. An obvious symptom (if you are actually concerned about it) is a large number of zero-byte or four-byte functions in the symbol table. Usually these are virtual functions that do nothing; in many cases it is possible to completely eliminate the function call overhead.

It seems that's more of an issue of, "Let's make everything virtual, just in case we later want to override it", which is definitely a poor way of doing things!

I haven't learned the other lessons yet: no deriving from concrete classes, and non-virtual public interfaces with virtual (only when needed) implementations. I'll have to bear that in mind - I dislike changing coding styles mid-project, but it makes sense to me (or at least is starting to), so I'll try to apply this in my own code.

##### Share on other sites

Public and virtual, or protected and non-virtual - those are the options for C++ destructors.

Guideline #4: A **base class** destructor should be either public and virtual, or protected and nonvirtual

The bold bit is very important there.

If the class isn't meant to ever be inherited from, then public and nonvirtual destructors are the norm.
Polymorphism / inheritance is rare; therefore, public-virtual and protected-nonvirtual destructors should also be rare...

As for the complications, you can call "delete pFoo" when the object has a private destructor. What you cannot do is call the destructor directly.

Not on my compiler:

```cpp
class Foo { ~Foo() {} };

void test()
{
    Foo* f = new Foo;
    delete f; // error: 'Foo::~Foo' : cannot access private member declared in class 'Foo'
}
```

And how does making the destructor non-virtual protected actually help avoid the crashes here:

```cpp
Base* b = new Derived;
delete b; // Base::~Base() had better be virtual!
```
...if you can still call delete on classes (such as base classes) with protected/private destructors?

You can't. It avoids them at compile time, forcing you not to use delete on a base pointer.

```cpp
class Base { protected: ~Base() {} };
class Derived : public Base { public: ~Derived() {} };

void test()
{
    Base* b = new Derived;
    delete b;            // error: 'Base::~Base' : cannot access protected member declared in class 'Base'
    delete (Derived*)b;  // works
}
```

##### Share on other sites

What you and the article seem to be saying is: (correct me if I'm wrong)
A) Concrete classes should almost never be derived from.
B) Interfaces shouldn't be virtual, only implementations.
C) Abstract classes should always either have a protected destructor, or a public virtual destructor (which is what you initially said, and what the article says).

B is backwards. Base classes should be at least partially virtual, some people argue they should be pure virtual (aka abstract).

The reason for the protected non-virtual and public virtual destructor is normally pretty clear.  Base *p = new Derived(); delete p;  If the base destructor is non-virtual the wrong destructor will be called. Hence the guideline that if a class is designed for inheritance then the destructor should be virtual. On the flip side, by making the destructor non-virtual and protected you make it so others cannot derive from the class.
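A small sketch of that failure mode, with an instrumented destructor (the counter is my addition, just to make the effect observable):

```cpp
#include <cassert>

static int derivedDestroyed = 0;  // instrumentation: counts Derived destructions

class Base {
public:
    virtual ~Base() {}   // virtual: 'delete basePtr' now runs ~Derived first
};

class Derived : public Base {
public:
    ~Derived() { ++derivedDestroyed; }
};
```

With a non-virtual ~Base, deleting through a Base* would skip ~Derived entirely (undefined behaviour); with the virtual destructor the count comes out right.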

Overuse of virtual functions can quickly ruin game performance. An obvious symptom (if you are actually concerned about it) are a large number of zero-byte or four-byte functions in the symbol table. Usually these are virtual functions that do nothing; in many cases it is possible to completely eliminate the function call overhead.

It seems that's more of an issue of, "Let's make everything virtual, just in case we later want to override it", which is definitely a poor way of doing things!

No.  We really were using polymorphic behavior.  In that case it was a pattern of PreUpdate(), Update() and PostUpdate(). Nearly every class left the Pre and Post functions empty, and about half provided Update functions. This meant every frame we had about 20,000 empty function calls that could be eliminated.

I've seen the same mistake made on many games and some smaller engines.

Instead, make a private virtual function in the base class, and a flag that indicates it should be used. If you want to force them to use it, make it abstract.

```cpp
public:
    bool PreUpdate()  { /* lots of other code in base class */ ... if (mNeedsPreUpdate)  { OnPreUpdate();  } ... }
    bool Update()     { /* lots of other code in base class */ ... if (mNeedsUpdate)     { OnUpdate();     } ... }
    bool PostUpdate() { /* lots of other code in base class */ ... if (mNeedsPostUpdate) { OnPostUpdate(); } ... }

protected:
    bool mNeedsPreUpdate;
    bool mNeedsUpdate;
    bool mNeedsPostUpdate;

private:
    virtual void OnPreUpdate()  { mNeedsPreUpdate  = false; }
    virtual void OnUpdate()     { mNeedsUpdate     = false; }
    virtual void OnPostUpdate() { mNeedsPostUpdate = false; }

    virtual void AllClassesMustWriteThis() = 0;
```

I'm guessing, given your previous comments, that pattern may seem odd. Consider how much work it saves when you have 10,000 game objects in the world. The neat trick is that derived classes cannot call the private virtual functions, but they can override them. Similarly, private pure virtual functions mean the derived classes must override the function, but they cannot call it.
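To make the pattern concrete, here is a compilable sketch of one slice of it (the Turret class and the member initializers are my additions, not from the post above):

```cpp
#include <cassert>

class GameObject {
public:
    void Update() { if (mNeedsUpdate) { OnUpdate(); } }  // non-virtual driver
protected:
    bool mNeedsUpdate = true;
private:
    virtual void OnUpdate() { mNeedsUpdate = false; }    // default: opt out after first call
};

class Turret : public GameObject {
public:
    int facing = 0;
private:
    // Overrides the private virtual; leaves mNeedsUpdate set, so it runs every frame.
    void OnUpdate() override { ++facing; }
};
```

A plain GameObject pays for exactly one virtual call and then skips the hook forever, while the Turret keeps updating.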


##### Share on other sites

On the flip side, by making the destructor non-virtual and protected you make it so others cannot derive from the class.

Making the destructor private makes it so others can't derive from the class. Making the destructor protected is giving them explicit permission to do so.
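In other words (a sketch, with class names of my own choosing): a protected destructor still lets derived classes exist and be deleted through their own type, while blocking deletion through the base.

```cpp
#include <cassert>

static int concreteDestroyed = 0;  // instrumentation, mine

class NonDeletableBase {
protected:
    ~NonDeletableBase() {}  // protected: derived classes may destroy it, outsiders may not
};

class Concrete : public NonDeletableBase {
public:
    ~Concrete() { ++concreteDestroyed; }
};

// NonDeletableBase* p = new Concrete;
// delete p;  // would not compile: ~NonDeletableBase is protected
```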


##### Share on other sites

Also following up on why public virtual is a problematic thing, imagine this:

```cpp
class Base {
    ...
    virtual void DoStuff() { /* Do assorted things */ }
    ...
};

class Derived : public Base {
    ...
    virtual void DoStuff() { /* Do different assorted things */ }
    ...
};
```

Three months later you write this:

```cpp
Base *p = SomeAccessorMethod();
assert(p != NULL);
if (p != NULL) {
    p->DoStuff();
}
```

And to your horror, DoStuff does NOT do what you told it to do. Perhaps it was meant to clear the screen to green, and some other programmer made it clear the screen to blue. So you dig through the code and find that DoStuff has been overridden by 97 different child classes, and half of them have bastardized the meaning so much that the call is nearly unusable. Half change the screen color to various choices, four will open a dialog box, and one of them even ends the game. ... Okay, that has been taken to absurdity, but hopefully the point is clear.

Provide concrete functions to your public base classes. You can provide virtual behaviors for child classes to override when needed, and doing so prevents the class from being misused too egregiously, allowing you to still modify the base behavior without rewriting all the derived classes.
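One concrete benefit of that shape, sketched under names I've made up for illustration: the non-virtual public function can enforce the contract once, in the base, around whatever the hook does.

```cpp
#include <cassert>

class Screen {
public:
    // Non-virtual interface: clamping happens here no matter what derived classes do.
    int SetBrightness(int level) {
        if (level < 0)   level = 0;
        if (level > 100) level = 100;
        OnSetBrightness(level);     // the customization point only ever sees sanitized input
        return level;
    }
private:
    virtual void OnSetBrightness(int) {}  // derived classes override this
};
```

No override, however badly written, can be handed an out-of-range brightness, and the base behavior can change later without touching the derived classes.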


##### Share on other sites

On the flip side, by making the destructor non-virtual and protected you make it so others cannot derive from the class.

Making the destructor private makes it so others can't derive from the class. Making the destructor protected is giving them explicit permission to do so.

That's what I get for replying after midnight. With that kind of mistake, time for bed.


##### Share on other sites

Also, sorry about not including more of the previous conversation in the split.

It is about a base class and derived class pair, where the original poster (not Servant) had built some base classes with non-virtual public destructors, and the question was the meaning for the rules about moving to public virtual or non-public non-virtual, and some other small goodies for that base class.


##### Share on other sites

This post pretty much became a reasoning with myself where I sort of end up agreeing with the article linked... but it would be nice to hear if I went wrong somewhere. :)

And to your horror, DoStuff does NOT do what you told it to do. Perhaps it was meant to clear the screen to green, and some other programmer made it clear the screen to blue. So you dig through the code and find that DoStuff has been overridden by 97 different child classes, and half of them have bastardized the meaning so much that the call is nearly unusable. Half change the screen color to various choices, four will open a dialog box, and one of them even ends the game. ... Okay, that has been taken to absurdity, but hopefully the point is clear.

This isn't specific to public virtual functions. If that function delegates the work to an internal class, it could have 97 different internal classes instead.

I'm not arguing against the 4 points, just that this problem isn't solved by them (though they do tend to lessen its severity).

Also, the private virtual idiom is exactly the same as PIMPL but written in a different (a bit nicer) way.
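For comparison, here is a classic pimpl sketch (names are mine): as with the private virtual idiom, the public class only forwards, and the real work lives somewhere clients can't see.

```cpp
#include <memory>

class Widget {
public:
    Widget();
    int Value() const;
private:
    struct Impl;                 // normally defined only in the .cpp file
    std::unique_ptr<Impl> p;
};

// --- would live in the .cpp file ---
struct Widget::Impl { int value = 42; };
Widget::Widget() : p(new Impl) {}
int Widget::Value() const { return p->value; }
```

The difference is mechanical: pimpl forwards through a pointer to a hidden struct, the private virtual idiom forwards through the vtable to a hidden override.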

This idiom also removes pure virtual "interfaces" from C++, which may be good or bad depending on who you ask. When using such constructs I have seen performance problems like frob mentioned, but I still use them for a lot of things as it's so easy to understand and use. If nothing else I like that Intellisense doesn't list a bunch of private members on the auto-complete menu.

The difference is of course small, but say for example:

```cpp
class Reader {
public:
    virtual size_t readData(void *destination, size_t maxBytes) = 0;
};
```

```cpp
class Reader {
public:
    size_t readData(void *destination, size_t maxBytes) {
        return _readData(destination, maxBytes);
    }

private:
    virtual size_t _readData(void *destination, size_t maxBytes) = 0;
};
```

```cpp
class ReaderInterface {
private:
    virtual size_t _readData(void *destination, size_t maxBytes) = 0;
    friend class Reader;  // lets Reader call the private hook
};

class Reader : public ReaderInterface {
public:
    size_t readData(void *destination, size_t maxBytes) {
        return _readData(destination, maxBytes);
    }

private:
};
```


The second way is a faster way to write the third way.

The first way... I understand the points of the article... but it's somewhat appealing not to have to implement any methods when writing an "interface".

In a sense I feel the "two jobs" from the article (http://www.gotw.ca/publications/mill18.htm) is somewhat arbitrarily chosen:

The problem is that "simultaneously" part, because each virtual function is doing two jobs: It's specifying interface because it's public and therefore directly part of the interface Widget presents to the rest of the world; and it's specifying implementation detail, namely the internally customizable behavior, because it's virtual and therefore provides a hook for derived classes to replace the base implementation of that function (if any).

The second version already has two jobs. It specifies the interface and how the internal implementation is referenced.

You could write it like this instead:

```cpp
class ReaderInterface {
public:
    virtual size_t readData(void *destination, size_t maxBytes) = 0;
};

class Reader : public ReaderInterface {
public:
    size_t readData(void *destination, size_t maxBytes) {
        return _readData(destination, maxBytes);
    }

private:
    virtual size_t _readData(void *destination, size_t maxBytes) = 0;
};
```


I would agree with setting the arbitrary choice at the previous point rather than following the example I just gave though... but I don't see how that is necessarily always "better".

As C++ doesn't have an 'interface' keyword I guess this might simply be a case where the language-designers have chosen for us, C++ doesn't need and shouldn't have 'interfaces' in that sense. Therefore example 2 from above (matching the article) is the sweet spot and should always(?) be chosen.

I personally have a tendency to abuse virtual interfaces sometimes... which it might be time to change...


##### Share on other sites

I have 3 questions regarding the virtual keyword. The code speaks for itself, I have inserted the 3 questions as comments:

```cpp
#include <iostream>

class Base
{
public:
    Base() {}
    void doThing() { this->doThingImpl(); }
protected:
    ~Base() {}
private:
    virtual void doThingImpl() {}
};

class Derived : public Base
{
public:
    Derived() {}
    ~Derived() {}  // #1 Why does this have to be virtual if it is not to be inherited from?
private:
    virtual void doThingImpl() {}  // #2 Why do people make derived member functions virtual?
};

class Derived2 : public Base
{
public:
    Derived2() {}
    virtual ~Derived2() {}
private:
    virtual void doThingImpl() final {}  // #3 Being a final member function, should this now be virtual or not?
};

int main()
{
    Derived* test = new Derived();
    delete test;  // warning: deleting object of polymorphic class type 'Derived' which has
                  // non-virtual destructor might cause undefined behaviour [-Wdelete-non-virtual-dtor]

    Derived2* test2 = new Derived2();
    delete test2;
}
```

##### Share on other sites

I have 3 questions regarding the virtual keyword. The code speaks for itself, I have inserted the 3 questions as comments:

```cpp
#include <iostream>

class Base
{
public:
    Base() {}
    void doThing() { this->doThingImpl(); }
protected:
    ~Base() {}
private:
    virtual void doThingImpl() {}
};

class Derived : public Base
{
public:
    Derived() {}
    ~Derived() {}  // #1 Why does this have to be virtual if it is not to be inherited from?
private:
    virtual void doThingImpl() {}  // #2 Why do people make derived member functions virtual?
};

class Derived2 : public Base
{
public:
    Derived2() {}
    virtual ~Derived2() {}
private:
    virtual void doThingImpl() final {}  // #3 Being a final member function, should this now be virtual or not?
};

int main()
{
    Derived* test = new Derived();
    delete test;  // warning: deleting object of polymorphic class type 'Derived' which has
                  // non-virtual destructor might cause undefined behaviour [-Wdelete-non-virtual-dtor]

    Derived2* test2 = new Derived2();
    delete test2;
}
```

1: It doesn't, but there's nothing in the code saying that it can't be inherited from. The warning protects you from mistakes, but it can have false positives.

2: It is a virtual function whether marked so or not; in C++03 you're making that explicit. In C++11 you can mark the function as 'override' for better documentation of intent and error protection. See also below:

3: 'final' implies virtual, so it is redundant information. However, having 'virtual' there makes it easy to use search functions to find virtual methods, so it comes down to style.
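A short illustration of both points (the class names are mine): the 'final' override is still implicitly virtual and still dispatches through the base pointer, it just can't be overridden again.

```cpp
#include <cassert>

class A {
public:
    virtual int f() { return 1; }
};

class B : public A {
public:
    int f() final { return 2; }  // implicitly virtual (it overrides A::f), but sealed here
};

// class C : public B {
//     int f() override { return 3; }  // would not compile: B::f is final
// };
```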


##### Share on other sites

Also...

#1) The function signature became virtual in the base class. Anything derived from Base has the entry in the vtable, so the base class will always look the function up even if you mark it as final. Writing the keyword 'virtual' in front of it at this point is optional, it retains the virtual-ness from the base class. Explicitly marking it as virtual can be useful to other programmers so they don't need to look up the details elsewhere.

You can now remove virtual-ness since C++11 introduced the "final" keyword to stop making things virtual below that level, but you probably don't want to do that to a destructor unless you want partially-destroyed objects. The benefit of losing virtual-ness is that you can drop the extra vtable lookup and branch directly to the function, instead of dereferencing a pointer and branching there. If you want to stop the inheritance chain, mark the class as final instead of the destructor.

#2) You make them virtual so you can modify the behavior in a child class.

One very frequent example of this is given in the OnPreUpdate, OnUpdate, and OnPostUpdate example above. Imagine you have a bunch of game objects, one of them is a turret. You probably want an update function to turn to face the target, so you can override OnUpdate() and put your behavior in there. If you look through common frameworks you will see a lot of adjustable behavior with OnXxx style virtual functions.

If a derived class is also intended to be derived from, for example, you have a base "SelfAimingTurret" class and it is designed as the base for "LaserTurret", "MissileTurret", "SlimeTurret", you can add new virtual functions that are used to specialize the derived classes.

#3) Final is new in C++11. When you use it the function continues to be in the vtable but no longer needs to be looked up in some cases.

The function is still virtual in some senses. The class and every derived class keep a vtable entry for it, which continues to point to that function, because the base functionality requires it: when code inside Base calls doThingImpl(), it must still be able to look the function up.

The function stops being virtual in other senses: that class and its derived classes can use the function directly instead of looking up the address in the vtable. So if inside Derived2 you call doThingImpl(), the call does not need a lookup; it can branch to the function directly.


##### Share on other sites

B) Interfaces shouldn't be virtual, only implementations.

B is backwards. Base classes should be at least partially virtual, some people argue they should be pure virtual (aka abstract).

That article you linked to was advocating never having the public interface virtual (except the destructor), only the protected/private interface. The private interface can still be pure-virtual. "Guideline #1: Prefer to make interfaces nonvirtual, using Template Method."

The reason for the protected non-virtual and public virtual destructor is normally pretty clear.  Base *p = new Derived(); delete p;  If the base destructor is non-virtual the wrong destructor will be called. Hence the guideline that if a class is designed for inheritance then the destructor should be virtual.

Yep, I already do that part.
It's this part:

On the flip side, by making the destructor non-virtual and protected you make it so others cannot derive from the class.

...that was confusing me, because [tt]protected[/tt] doesn't stop inheritance.

Making the destructor private makes it so others can't derive from the class. Making the destructor protected is giving them explicit permission to do so.

But what good is a class with a private destructor? If it's private, you can't call delete on it - so it's no good when using it directly. If it's private, you can't derive from it - so it's no good as a base class. A class with a private destructor seems unusable. What am I missing?

Both Frob and Herb Sutter's article are saying, "Guideline #4: A base class destructor should be either public and virtual, or protected and nonvirtual.".
The need for virtual destructors I understand fine. What benefit does protected access give to non-virtual destructors?

##### Share on other sites
@Erik Rufelt
I think the "two jobs" issue only applies to concrete base classes that can be extended by overriding virtual functions, i.e. implementation inheritance.

Abstract base classes (i.e. interface inheritance) don't have this problem.

The equivalent of the interface/implements keywords in C++ is to make an abstract base class and inherit from it using public virtual inheritance (or just public inheritance if you trust you're not going to get into any multiple-inheritance-induced traps). Any other kind of inheritance is equivalent to the 'extends' keyword, and should be pretty rare.

@Servant - private destructors are great for all sorts of alternative interface/implementation schemes. E.g. With COM, you delete an object by calling a virtual Release function. Internally, it would be capable of calling delete this, while at the same time you can be sure that clients are not allowed to.
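A sketch of that Release pattern (not real COM; the names and the live-object counter are mine, added so the effect is observable):

```cpp
#include <cassert>

class ComObject {
public:
    ComObject() { ++liveCount; }
    void AddRef()  { ++mRefs; }
    void Release() { if (--mRefs == 0) delete this; }  // members may use the private dtor
    static int liveCount;  // instrumentation: how many objects currently exist
private:
    ~ComObject() { --liveCount; }  // private: clients can neither 'delete' nor derive
    int mRefs = 1;                 // a fresh object starts with one reference
};

int ComObject::liveCount = 0;
```

Client code that tries `delete p;` fails to compile, so the object's lifetime can only end through Release.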

I also see a common implementation-hiding pattern where there is only ever one derived type (such as in cross-platform libs):

```cpp
// header
struct Foo : NoCreate { int Get(); };
struct Bar : NoCreate { Foo* Stuff(); };

// cpp
struct FooImp { int i; };
struct BarImp { FooImp f; };
int  Foo::Get()   { return ((FooImp*)this)->i; }
Foo* Bar::Stuff() { return (Foo*)&((BarImp*)this)->f; }
```
This makes the implementation ugly, but keeps the interface very clean.

@TheComet
Your base class destructor would normally be virtual in that case, so it's trying to be helpful and warn you. Since you're using new/delete on the same type, it's safe to ignore/disable that warning.
