"No Object" references in languages

Started by Krohm
3 comments, last by Krohm 14 years, 7 months ago
I like UnrealScript a lot, but there's one thing about it I simply cannot deal with: its habit of turning anything involving None into the zero value. While I can understand why they went for it, I would like to know more about your experiences with "null references" in various languages.

Personally, I have always found the "return 0" approach rather weak. Autovivification sounds like a real saviour at first, but I wonder how they manage to autovivify objects with non-trivial parameterized ctors.

The language I am using now ends up silently failing and returning zero (at least that's what it's supposed to do), and this bothers me constantly. In the beginning, I thought the C++/Java habit of throwing fatal exceptions was a bit too harsh for a scripting language, but now I think it could actually be the only approach that doesn't spread unwanted behaviour all around.

Am I the only one who thinks null/None deserves to be treated as more than a minor issue?

Previously "Krohm"

Quote: Original post by Krohm
I like UnrealScript a lot, but there's one thing about it I simply cannot deal with: its habit of turning anything involving None into the zero value. While I can understand why they went for it, I would like to know more about your experiences with "null references" in various languages.

Personally, I have always found the "return 0" approach rather weak. Autovivification sounds like a real saviour at first, but I wonder how they manage to autovivify objects with non-trivial parameterized ctors.

The language I am using now ends up silently failing and returning zero (at least that's what it's supposed to do), and this bothers me constantly. In the beginning, I thought the C++/Java habit of throwing fatal exceptions was a bit too harsh for a scripting language, but now I think it could actually be the only approach that doesn't spread unwanted behaviour all around.

Am I the only one who thinks null/None deserves to be treated as more than a minor issue?


Within my own apps, I use ret = 0 for success. If something fails, I return a specific retCode I can look for when there are no exception handlers available that do exactly what I want. But to answer your question: I, too, think that null/None should be treated as a value distinct from 0. I would not be able to cope with such a weak response to failure. I also use null to flag when a value has been processed - once processed, I reset the variable to null. It would be insane to treat the null value as equivalent to 0.
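Roughly, the shape of what I mean, sketched in Python purely for illustration (all the names here are made up):

# Illustrative sketch only: a sentinel return code for failures plus
# None used as an "already handled" marker, as described above.
SUCCESS = 0
ERR_NOT_FOUND = -404   # a hypothetical specific retCode to look for

def process(record):
    if record is None:
        return ERR_NOT_FOUND
    # ... do the real work on the record here ...
    return SUCCESS

pending = {"id": 1}
if process(pending) == SUCCESS:
    pending = None   # reset to None so it is never processed twice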
Quote: Original post by ddboarm
But to answer your question: I, too, think that null/None should be treated as a value distinct from 0.
I'm not sure I am explaining myself correctly, so here is a clarification in case other people want to write down a few lines.
As far as I know, UScript does not consider None to be 0 (it probably does internally). When None is accessed, it spawns an object in which every function is effectively implemented as { return 0; } and every data field is valid and initialized to 0. I think it's correct to say it autovivifies an object of the expected type, with a few twists.
I am not sure whether UScript considers None to be 0 in the sense of executing:

if (None == 0) {
    /* do this, always true */
}
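To make the idea concrete, here is a rough sketch in Python of such an autovivified "zero object" - the class and names are purely mine for illustration, not anything UScript actually does:

# Illustrative only: an object that swallows any field access or call
# and yields another "zero", roughly mimicking the behaviour described.
class ZeroObject:
    def __getattr__(self, name):
        # Any unknown field reads as another ZeroObject, so chained
        # access keeps "working" instead of failing.
        return ZeroObject()

    def __call__(self, *args, **kwargs):
        # Calling a "method" silently returns 0.
        return 0

    def __eq__(self, other):
        # Comparing against 0 succeeds, as in the snippet above.
        return other == 0 or isinstance(other, ZeroObject)

none_ref = ZeroObject()
print(none_ref.GetHealth())       # prints 0, no error raised
print(none_ref.Owner.GetName())   # still 0, even through a chain
print(none_ref == 0)              # True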

I've never bought into the "nullable types" hype in the first place, so mixing numeric types with nulls is not something I really see as a problem.

The real underlying question here is probably: "should a program be allowed to dereference NULL/null/None and live?"

Your point is taken. I also find abortive exceptions to be a considerably more resilient and error-proof option, which raises another question for me: what is the state of exception handling in scripting languages? It's a feature I have always considered more adequate for "real" languages. I really know close to nothing about Lua, ActionScript, Python and such.
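From the little I have gathered, Python, for one, seems to take the loud route: touching an attribute on None raises an AttributeError right at the offending line, and a script can catch it if it cares. Something like this, if I am reading it right:

# Dereferencing None in Python fails loudly instead of returning zero.
enemy = None

try:
    enemy.take_damage(10)        # attribute access on None
except AttributeError as err:
    # The failure surfaces here instead of propagating a silent zero
    # through the rest of the script.
    print("caught:", err)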

Previously "Krohm"

I guess what you're talking about is the Null Object pattern?

I can see places where using it would significantly simplify the code.

e.g.
attack(GetNearestEnemy())


If you had to check the return value of GetNearestEnemy to see whether there was actually an enemy nearby, that would be awkward. The one-liner above is a definite improvement over this:

enemy = GetNearestEnemy()
if enemy != Null
{
    attack(enemy)
}


But if you wanted feedback on whether the attack worked or not, that would be interesting to implement. Exception handling could be another approach, but that is cumbersome in the situations where you really don't care about the error. I think many scripting languages are made to simplify the scripts rather than to help you write rigorously correct code. Is there a way to get the best of both worlds? I don't know.
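For what it's worth, here is a rough sketch of the Null Object version in Python, with one way of getting that feedback back out of it - all the names are made up for illustration:

# Illustrative sketch of the Null Object pattern for the example above.
class Enemy:
    def __init__(self, name):
        self.name = name

    def receive_attack(self, damage):
        print(self.name, "takes", damage, "damage")
        return True                     # the attack landed

class NullEnemy(Enemy):
    def __init__(self):
        Enemy.__init__(self, "<nobody>")

    def receive_attack(self, damage):
        return False                    # silently absorbs the call

def GetNearestEnemy(enemies):
    # Return a NullEnemy instead of a null reference when nothing is
    # nearby, so callers never have to test for a missing object.
    return enemies[0] if enemies else NullEnemy()

def attack(enemy):
    return enemy.receive_attack(10)

# The call site stays a one-liner either way; the return value says
# whether the attack actually did anything.
print(attack(GetNearestEnemy([Enemy("orc")])))   # True
print(attack(GetNearestEnemy([])))               # False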
Quote: Original post by Kylotan
I guess what you're talking about is the Null Object pattern?
There seems to be a close relation between that pattern and my doubts. Thank you; I think it could be another rationale behind this behaviour.

I think your attack example touches the heart of the issue. For sure, attack(GameEnemy) should allow a nullref to be passed safely. I see good reasons why nil/null/NULL/None/nullref should be allowed in general, but it's the more specific cases that scare me a bit.
Quote: UDN3 UnrealScriptReference
Variables that refer to actors always either refer to a valid actor (any actor that actually exists in the level), or they contain the value None. None is equivalent to the C/C++ NULL pointer. However, in UnrealScript, it is safe to access variables and call functions with a None reference; the result is always zero.
When I first read this I thought it was a rather good solution - after all, UScript is spiffy, to say the least. I didn't consider the implications of a None object which does not satisfy its class invariants.
When I eventually had to, very bad things happened, and they happened very far from the code that originated the nil/null/NULL/None/nullref.
Scripts, as you say, must be easy, and this is why I was originally quite happy with this approach - but easy just to write? I don't know. I am not sure anymore. I think I may just be taking this too seriously.

I think I'll ask {1} for dereferencing nil/null/NULL/None/nullref to be turned into a fatal exception, as in Java (no, there is not going to be any form of catch block any time soon). Personally, I prefer things that break as soon as they start to go awry, in a way one cannot ignore, rather than breaking later and trying to hide the problem. Is this at least a rationale common to scripters? {2}
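A rough sketch (in Python, just to illustrate; the class is made up) of the behaviour I have in mind:

# Illustrative only: any access through a null reference raises at once,
# and with no handlers installed the script dies at the offending line.
class NullReference:
    def __getattr__(self, name):
        raise RuntimeError("null reference dereferenced: ." + name)

target = NullReference()
target.GetHealth()   # the script aborts here, not three calls later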


{1} I use UScript as an example since it's well known and implements the behaviour discussed; the request itself is not related to UScript!
{2} It looks like it's not, or UScript wouldn't use this - unless it's locked in by legacy, which is possible. I am also fairly sure UScript is rather more consistent than the mess I am using now, so the example is partially moot.

Previously "Krohm"

This topic is closed to new replies.
