"Premature Optimization": A discussion

quote:Original post by petewood
The original quote is 'the love of money is a root of all kinds of evil', from 1 Timothy 6:10.

Donald Knuth's quote was 'Premature optimization is the root of all evil in programming'. And yes, he is a Christian. And no, he isn't saying that optimization is evil.


The better-known English quote is 'For the love of money is the root of all evil', from the King James Version.

Most people aren't going to know the NIV version.
quote:Original post by CheeseGrater
The better-known English quote is 'For the love of money is the root of all evil', from the King James Version.

Most people aren't going to know the NIV version.


The original Greek would be more correct, but most people wouldn't understand it. The King James translation actually translates it incorrectly. If you look at most of the other translations (except for Darby), they have 'a root' rather than 'the root'. The NIV was the first on the list and is a commonly used translation. But yes, most people will know it from the King James.

edit: The main distinction, though, that I was originally making was that it is the love of money, and not money itself.

In my opinion, optimization is removing redundant operations. In my experience, removing redundant operations tends to make code smaller and cleaner, as well as faster.

Of course, premature optimization is always bad. However, early optimization isn't always bad. It would probably be beneficial if that distinction were made more often in discussions on this topic. Choosing a faster algorithm over a slower one is optimization as well, as I see it.
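
For illustration, a minimal sketch of that last point (the example and names are mine, not from the post): both functions answer "does the container hold this key?", but the second trades an O(n) scan for an O(log n) lookup by requiring sorted data.

#include <algorithm>
#include <vector>

// O(n): scan every element until the key is found.
bool containsLinear(const std::vector<int>& v, int key)
{
    for (int x : v)
        if (x == key)
            return true;
    return false;
}

// O(log n): assumes the vector was sorted beforehand (e.g. with
// std::sort); halves the search range at each step.
bool containsSorted(const std::vector<int>& v, int key)
{
    return std::binary_search(v.begin(), v.end(), key);
}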

Optimization is part of both the design and coding phases. Of course, optimizations in the design need to be done before you can run a profiler on the realization of that design.

An argument for early (design/code) optimization is the consequences that later optimization may have. In an ideal world, everything would be in magical black boxes, but often, optimizing a piece of code changes how other code communicates with it, and so the changes propagate throughout the project.

I am all for: make it work, make it robust, make it fast, in that order. But, if speed is any concern at all, you will need to take it into consideration early, so you don't have to break, or rewrite large portions of, code when optimizing later. I do usually build one to throw away, in which optimizations (that require effort or lessen readability) are banned. Because of this, it is usually easier to actually throw it away.
quote:Original post by CWizard
In my opinion, optimization is removing redundant operations. In my experience, removing redundant operations tends to make code smaller and cleaner, as well as faster.


Refactoring is also the act of removing redundant operations. It also has to do with having a flexible development system with robust testing. Your design isn't set in stone, and you're not having to struggle against bad decisions made with limited understanding at the start of development.

It is my interpretation of Knuth and Hoare that flexibility is necessary and optimisation reduces flexibility.

Refactoring increases flexibility. It also enables you to introduce optimisations to a good design, rather than trying to introduce optimisations to get around deficiencies in a bad design. The process of refactoring, though, may take you through intermediate designs which are not optimal.

To introduce optimisations to a design which you are refactoring is likely to be a waste of time, as the design needs to stay flexible. You might find the specific optimised code is no longer necessary as functions and classes get combined, stripped apart, or removed completely.

See Martin Fowler: Yet Another Optimization Article.
quote:Original post by CWizard
In my opinion, optimization is removing redundant operations. In my experience, removing redundant operations tends to make code smaller and cleaner, as well as faster.
That's simply a wrong definition of optimization. Often optimizations make the code more obfuscated, harder to maintain, and longer. There are some trivial optimizations that come from removing redundant operations, but those don't even make the program run much faster, as the compiler may be smart enough to skip some of the useless calculations.
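
For illustration (a sketch of my own, not from the post): the "redundant" subexpression below is exactly the kind a modern optimizing compiler removes by itself (common subexpression elimination), so hand-removing it rarely buys measurable speed.

// Both functions typically compile to the same machine code:
// the compiler computes (x + y) once in either case.
float squaredSumA(float x, float y)
{
    return (x + y) * (x + y);   // "redundant" repeated x + y
}

float squaredSumB(float x, float y)
{
    float s = x + y;            // hand-hoisted version
    return s * s;
}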
quote:Original post by civguy
quote:Original post by CWizard
In my opinion, optimization is removing redundant operations. In my experience, removing redundant operations tends to make code smaller and cleaner, as well as faster.
That's simply a wrong definition of optimization. Often optimizations make the code more obfuscated, harder to maintain, and longer. There are some trivial optimizations that come from removing redundant operations, but those don't even make the program run much faster, as the compiler may be smart enough to skip some of the useless calculations.
Actually, I think that is the exact definition of optimization (as it relates to computation). Do you think that optimizing an algorithm means making it perform more operations?
quote:Original post by CWizard
Actually, I think that is the exact definition of optimization (as it relates to computation). Do you think that optimizing an algorithm means making it perform more operations?
It can do more, but less costly, operations as well. Or you can make your program more cache-friendly, sometimes by increasing the operation count. It's only about speed, not operation count.
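
A minimal sketch of the cache point (the particle layout is my example, not from the post): both loops perform the same additions, but the second walks memory contiguously, so it touches far fewer cache lines when only one field is needed.

#include <vector>

// Array-of-structs: each particle's fields sit together, so summing
// x drags all the unused fields through the cache as well.
struct Particle { float x, y, z, mass, age, ttl; };

float sumX_AoS(const std::vector<Particle>& ps)
{
    float sum = 0.0f;
    for (const Particle& p : ps)
        sum += p.x;             // strides 24 bytes per useful 4
    return sum;
}

// Struct-of-arrays: all the x values sit back to back in memory.
struct Particles { std::vector<float> x, y, z, mass, age, ttl; };

float sumX_SoA(const Particles& ps)
{
    float sum = 0.0f;
    for (float v : ps.x)
        sum += v;               // every fetched byte is used
    return sum;
}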

But this is just hairsplitting, since I actually interpreted your "removing redundant operations" wrongly because of your second sentence. I thought you meant higher-level operations: removing redundancy from the code, like replacing 100 similar statements with a for loop. That usually doesn't optimize, but it will make "code smaller and cleaner".

Now that I know you meant low-level operations, I just can't agree with you. All my experience says that code that takes fewer low-level operations for the same task tends to be longer and more complex, unless the initial code had some clear silliness in it (not so common for me, *cough*).

quote:Original post by civguy
It can do more, but less costly, operations as well. Or you can make your program more cache-friendly, sometimes by increasing the operation count. It's only about speed, not operation count.
Admittedly, it is quite complex on today's computers, but in theory it still stands: the goal is to perform as little work as possible (including cache misses etc.).

Consider mathematical formulas. I am very bad at coming up with examples (which I'll be demonstrating shortly), but take how we calculate an alpha-blended value (s = source, t = target, a = alpha, r = result):

r = s * (1 - a) + t * a;
r = s - s * a + t * a;
r = s + a * (t - s);

The last is what we'd use; it is more compact and has fewer and cheaper operations. The same sort of optimization can be applied to normal code as well, making it shorter, cleaner, more compact, and with fewer points of error.
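
As a minimal sketch of that rewrite (the function names are mine): both forms compute the same blend, but the factored one needs one multiply instead of two.

// Direct transcription of the definition: two multiplies.
float blendNaive(float s, float t, float a)
{
    return s * (1.0f - a) + t * a;
}

// Algebraically identical, factored form: one multiply.
float blendFactored(float s, float t, float a)
{
    return s + a * (t - s);
}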
quote:Original post by CWizard
The same sort of optimization can be applied to normal code as well; making it shorter, cleaner, more compact, and fewer points of errors.
But that's just linear interpolation, because it's so simple. It (almost) falls into the category of the initial code having a "clear silliness". Next you might want 4-point Bézier interpolation. Initially you might just type it in by its definition:

p = p0*(1-x)**3 + 3*p1*(1-x)**2*x + 3*p2*(1-x)*x**2 + p3*x**3

Short and clear. Then you want speed, so you use forward differencing or some other optimization algorithm. That one-liner suddenly expands to more like 10 lines, in which it isn't all that clear that it's calculating a Bézier curve.
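
To make the contrast concrete, here is a hedged sketch of both versions (the coefficients are standard forward differencing for a cubic; the names and step handling are my own): the naive function is a one-liner straight from the definition, while the forward-differencing version replaces the per-sample powers with three additions, at the cost of being much less obviously a Bézier curve.

// Naive: evaluate the definition at one parameter value x in [0, 1].
float bezierNaive(float p0, float p1, float p2, float p3, float x)
{
    float ix = 1.0f - x;
    return p0*ix*ix*ix + 3.0f*p1*ix*ix*x + 3.0f*p2*ix*x*x + p3*x*x*x;
}

// Forward differencing: rewrite as a*x^3 + b*x^2 + c*x + d, then
// step through 'steps' evenly spaced samples using additions only.
void bezierForwardDiff(float p0, float p1, float p2, float p3,
                       int steps, float* out)   // steps >= 2
{
    float a = p3 - 3.0f*p2 + 3.0f*p1 - p0;
    float b = 3.0f*(p0 - 2.0f*p1 + p2);
    float c = 3.0f*(p1 - p0);
    float d = p0;

    float h  = 1.0f / (steps - 1);            // parameter step size
    float p  = d;                             // value at x = 0
    float d1 = a*h*h*h + b*h*h + c*h;         // 1st difference at 0
    float d2 = 6.0f*a*h*h*h + 2.0f*b*h*h;     // 2nd difference at 0
    float d3 = 6.0f*a*h*h*h;                  // 3rd difference (constant)

    for (int i = 0; i < steps; ++i) {
        out[i] = p;
        p  += d1;   // three additions per sample instead of the
        d1 += d2;   // powers and multiplies of the naive version
        d2 += d3;
    }
}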

So while 10% of optimizations might make some particular lines 50% smaller, once you take in the 90% of optimizations that make the code 500% larger and more complicated (which is where the speed increase really comes from), there's a large net increase in complexity and code length.

Maybe you only do this 10% and leave it at that, I don't know. But for me, it's absurd to hear that optimizations tend to make code cleaner and shorter ('tend', as if that were common). If that were the case, I doubt Knuth would've even said his famous words.


