Quote:Original post by Anonymous Poster
Ha! You fell into the trap ;) :P. The second one is better, because it has explicit binding. The first one would need to be written:
num = a + (b - (c * d / e))
in order to be as informative as the second.
As Strife mentioned, the order of operations comes into play and naturally implies what will be done first. Yes, operator overloading is just syntactic sugar, but when you get down to it, so is every other part of any high-level language. It's all just sugar that wraps the underlying processor instructions. Even assembly could be considered syntactic sugar, as it uses mnemonics (ADD, SUB, MUL, DIV, MOV, etc.) to hide the 1s and 0s that are actually fed to the processor. Our lives would be a lot harder without syntactic sugar.
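To make the "sugar" point concrete, here's a minimal sketch in C++ (Java deliberately omits the feature, so I can't show it there). The Rational type is made up purely for illustration; the point is that an overloaded operator is nothing but an ordinary function with a funny name, and you can even call it by that name directly:

#include <iostream>

// A toy fraction type, invented just for this example.
struct Rational {
    int num, den;
    Rational(int n, int d) : num(n), den(d) {}
};

// operator* is an ordinary function that happens to have a special name.
Rational operator*(const Rational& a, const Rational& b) {
    return Rational(a.num * b.num, a.den * b.den);
}

int main() {
    Rational x(1, 2), y(3, 4);
    Rational p = x * y;            // the sugared spelling
    Rational q = operator*(x, y);  // the very same function, called by name
    std::cout << p.num << "/" << p.den << "\n";  // prints 3/8
    std::cout << q.num << "/" << q.den << "\n";  // prints 3/8
}

Both lines compile to the same function call; the operator spelling is pure notation, just as ADD is notation for an opcode.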
Quote:Original post by Anonymous Poster
*
Has 5 standard meanings, just in basic programming, just off the top of my head. (multiply, cross-product, any-char, any-char-except-dot, FSM-0-or-more). I'm sure if I sat around all day I could think of 3 or 4 more common meanings.
The word 'multiply' has six standard meanings, just looking in my dictionary:
multiply¹
1. To increase the quantity, amount, or degree of.
2. Math. To determine the product of by multiplication.
3. To become more in number, amount, or degree; increase.
4. Math. To determine the product by multiplication.
5. To grow in number by procreation; propagate.
multiply²
So as to be multiple; in many ways.
I won't go into the word 'add', as I would have to sit here most of the day typing.
Quote:Original post by Anonymous Poster
Answer: all of them, if you're writing parsers. Because the "standardization" on the first operator used in grammars is so weak that all the above are used frequently :(. They also mean 6 different things, if you're writing parsers. Confused? You will be...
Speaking for myself, the symbols didn't confuse me anywhere near as much as the arguments against them. (No offense.)
Quote:Original post by Anonymous Poster
It's a nice idea to think that the use of simple operators is "easier to read", but it's an idea that any professional programmer ought to soon realise is a pipe dream: After more than 2000 years of history, Mathematicians *still* haven't managed to standardize on symbols, and they have a lot more to play with than programmers do. IMHO it is naive to think that a symbol means the same thing to different readers, and it's blatantly not true.
Unlike mathematical symbols, a programming language can easily be standardized, though that step hasn't been taken for Java yet. If you think about it, the mathematical symbols we use are far more universal than the English words that name them. Pick a programmer who doesn't speak English at random and ask him or her to interpret the following:
num = a + b - c * d / e
Then ask him/her to interpret this:
num = a.add(b.subtract(c.multiply(d).divide(e)))
Which one do you think s/he will understand more quickly? In my opinion, it's naive to think that an English word will mean the same thing to different readers.
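To see the difference side by side, here's a quick sketch (C++ again, with a made-up Value type whose add/subtract/multiply/divide methods are my own names, chosen to mirror the Java-style chaining above). The operator form gets its grouping from the standard precedence rules; the method-chain form has to spell out every grouping by hand:

#include <iostream>

// A hypothetical numeric wrapper -- a sketch, not any real library's API.
struct Value {
    double v;
    explicit Value(double x) : v(x) {}
    Value add(Value o)      const { return Value(v + o.v); }
    Value subtract(Value o) const { return Value(v - o.v); }
    Value multiply(Value o) const { return Value(v * o.v); }
    Value divide(Value o)   const { return Value(v / o.v); }
};

// The operators are thin sugar over the named methods.
Value operator+(Value a, Value b) { return a.add(b); }
Value operator-(Value a, Value b) { return a.subtract(b); }
Value operator*(Value a, Value b) { return a.multiply(b); }
Value operator/(Value a, Value b) { return a.divide(b); }

int main() {
    Value a(1), b(2), c(3), d(4), e(5);
    // Operator form: precedence gives a + b - ((c * d) / e).
    Value num1 = a + b - c * d / e;
    // Method-chain form: every grouping is spelled out by hand.
    Value num2 = a.add(b.subtract(c.multiply(d).divide(e)));
    std::cout << num1.v << " " << num2.v << "\n";  // both print 0.6
}

Whichever side you take, the two expressions at least compute the same value; the question is only which notation carries the intent better.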
I have nothing against the decision to leave operator overloading out of Java - it makes sense for that language. The technique itself, though, can be a very powerful tool, and it shouldn't be dismissed as something reserved for people with "poor OOP skill".