I definitely agree with "fail fast", but I disagree with your advice in this particular case. The problem is that passing an invalid pointer to delete doesn't always cause a crash right away (unless you're always running with app verifier, which isn't realistic). It can silently corrupt the heap and cause problems later, or it may cause no visible problems at all if something else has already been allocated at that memory location (not unlikely when an object of the same size is requested shortly after the delete, since there will be a nice hole in the heap right there).
The other thing is that it's not uncommon to have code that, say, resets the state of an object, which would include deleting any members that were new'd. In that case there is no logic error, and without nulling you would need to store additional state just to know whether the pointer is still valid. So why not just set it to null?
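To make that concrete, here's a minimal sketch of the reset-the-state case (Widget is a hypothetical class, not something from this thread). It leans on the fact that `delete` on a null pointer is a guaranteed no-op, so nulling after delete makes reset() safe to call any number of times:

```cpp
#include <iostream>

// Hypothetical example: Widget owns a heap-allocated member and can be
// reset back to an empty state at any time.
struct Widget {
    int* data = nullptr;

    void init(int v) {
        delete data;          // safe even if data is still nullptr
        data = new int(v);
    }

    void reset() {
        delete data;
        data = nullptr;       // without this, a second reset() double-deletes
    }

    ~Widget() { reset(); }    // destructor reuses the same path safely
};
```

Calling reset() twice in a row is fine here precisely because the pointer was nulled; no extra "is it valid?" flag is needed.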
OK, that's fair. I didn't think of the particular case of resetting state in a class, where you certainly should null out the pointer. I'm not really sure which case I had in mind where you would not null something out after deletion (automatic nulling being one of the things that makes unique_ptr so handy), but I was mostly thinking of destructors or temporary objects, since Renthalkx mentioned merely protection against double delete. I can't speak for all cases and compilers, but newer MSVC seems quite reliable at catching deletion of invalid memory - I got a few heap corruptions, but mostly just a crash on that very line. It might well be different in other cases.
It also makes it harder to diagnose issues, because you have no idea if the pointer is "valid" when inspecting it in the debugger.
That's not really an issue, in MSVC at least: once you inspect the pointer it will show you garbage data, which is enough to identify an invalid pointer - but null values are still easier to spot, I'll give you that.
I'm probably going to get chastised for this, but I actually PERSONALLY prefer to use raw pointers. I like having to manually take care of them. Not that smart pointers are bad. By all means they are a great thing, and should be used whenever necessary.
Honestly, I'm more curious - why? Since I started using unique_ptr I don't want to use anything else; I'm still in the process of converting my old code because it makes things so much safer and more reliable, and produces way cleaner code, unless I have very specific requirements for memory management. Is there anything in particular that you like about raw pointer management, or is it rather habit? ;)
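For comparison, here is roughly the same reset-the-state idea with std::unique_ptr (Gadget is an invented name for this sketch): both the manual delete and the manual nulling disappear, because reset() on the smart pointer releases the memory and nulls the handle in one step.

```cpp
#include <memory>

// Hypothetical example: same ownership pattern as a raw-pointer member,
// but unique_ptr handles delete, nulling, and the destructor for us.
struct Gadget {
    std::unique_ptr<int> data;

    void init(int v) {
        data = std::make_unique<int>(v);  // old value (if any) freed automatically
    }

    void reset() {
        data.reset();  // frees the int and sets the pointer to null
    }
    // no destructor needed: unique_ptr cleans up on its own
};
```

Double-delete and "forgot to null it" bugs simply can't happen here, which is most of the appeal.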
EDIT:
I was always under the assumption that it was best to set them to null for cases like this:
int *pX = new int(5);
delete pX;
if(pX)
{
    std::cout << *pX;
}
That is undefined behaviour and will most likely crash. However, setting pX to null after the delete would alleviate that, since the if check would then fail.
See Phil_t's reply above, but IMHO this is one of the cases where it doesn't really make sense to null out the pointer. You are creating a variable, then deleting it, and then trying to do something with it. That is clearly a logic error and should not be "protected" against by setting the variable to null after the delete - either your delete or the code after it is in the wrong place. OK, in a more complex case you might conditionally delete pX, but even then I feel there are better ways than just nulling pX after delete, like changing the execution paths so the later code is never even reached once the pointer has been deleted.
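As a hypothetical sketch of that last point - restructuring the flow instead of null-and-check - the function below (readValue is an invented name) keeps every use of the pointer before the delete, so no code after the delete can touch pX at all:

```cpp
// Hypothetical example: rather than deleting pX, nulling it, and checking
// it later, arrange the control flow so the dead pointer is unreachable.
int readValue(bool useHeap) {
    if (!useHeap)
        return -1;        // early out: pX never exists on this path

    int* pX = new int(5);
    int value = *pX;      // every use of pX happens before the delete
    delete pX;            // last statement to touch pX; nothing follows it
    return value;
}
```

No null check is needed because there is no code path where pX outlives its delete.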
Though, seeing Phil_t's remark about delete, you pretty much have only two options: fail silently, or risk undefined behaviour. It's pick your poison either way, unless you want to trust your compiler/debugger/platform to handle memory access violations in a somewhat reliable fashion.