Lower bounds on the complexity of problems

Hello, everyone!

I had a session today with my colleagues to practice the defense for an important scholarship. One of the work packages in my project is determining a lower bound on the complexity of a specific new algorithm. My colleagues questioned whether I would be able to define bounds on the complexity of the problem before finding an algorithm that actually solves it.

Now I'm sure this must be possible because, taking sorting as an example, it doesn't take long to realise that one needs to examine at least every element to sort a whole array (hence linear complexity as an absolute lower bound). In practice, of course, the complexity of sorting is n log n on average. It seems like a conclusion one could arrive at before knowing even one sorting algorithm.

Is it possible to determine complexity bounds of a problem before discovering an actual algorithm? If so, did this happen before (like with sorting e.g.), so I could use that particular case to motivate my answer?

Thanks very much,
Jeroen
Even with your sorting example, you already partially knew the algorithm: you must iterate through all elements at least once, leading to an asymptotic lower bound of big-omega(n). Presupposing absolutely nothing about an algorithm would lead to big-omega(1), I suppose, but that's not terribly meaningful or accurate in most cases.

You could make somewhat general statements about classes of algorithms, given that you know at least the data they operate on (as in the sorting example above):

Vector-based algorithms will be big-omega(n).
Matrix-based algorithms will be big-omega(n^2).
etc.

Note that these are all asymptotic lower bounds, generally the least interesting of the bounds family. You wouldn't really be able to say anything about big-theta or big-O as far as I know, as determining an upper bound for an unknown algorithm seems impossible.
Yes. For the sorting example, in particular, it's proven that the lower bound on the number of comparisons required is proportional to n log n. http://en.wikipedia.org/wiki/Comparison_sort#Number_of_comparisons_required_to_sort_a_list

(This is only true for comparison-based sorts. Radix sort, for example, gets around this by re-framing the problem.)
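For completeness, here is a rough sketch of the standard decision-tree argument behind that bound (my paraphrase of the usual textbook proof, not a quote from the link): a comparison sort must distinguish all n! possible input orderings, so its decision tree has at least n! leaves, and the worst-case number of comparisons h is at least the height of that tree:

```latex
h \;\ge\; \log_2(n!) \;\ge\; \log_2\!\left(\left(\tfrac{n}{2}\right)^{n/2}\right)
  \;=\; \tfrac{n}{2}\,\log_2\!\tfrac{n}{2} \;=\; \Omega(n \log n).
```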
Thank you for your replies.

"Yes. For the sorting example, in particular, it's proven that the lower bound on the number of comparisons required is proportional to n log n."

Very good to know. But reading more on this, it seems that this kind of proof is often intractable. In the case of sorting, it was possible because a model could be constructed for the entire family of comparison-based algorithms.

So, do you think it was naive of me to suppose I could determine the complexity bounds before discovering the actual algorithm? Or should I keep defending this as a possibility in my project?

Jeroen
"So, do you think it was naive of me to suppose I could determine the complexity bounds before discovering the actual algorithm? Or should I keep defending this as a possibility in my project?"

Not necessarily naive: if you can determine the class of the (as yet undetermined) algorithm, you could prove an asymptotic lower bound for that class, as in the sorting example above. But, barring that, I don't believe it's generally possible beyond the most generic bound of big-omega(1).
If you can prove that some other well-known problem can be reduced to an instance of your problem, then your problem's complexity is at least that of the other problem. For instance, finding the convex hull of n points in 2D takes at least Omega(n*log(n)), because for an input of the form (x[j], x[j]^2) the output is a sorted list (or something that can be converted into a sorted list in O(n) time).
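
A minimal sketch of that reduction in Python (hypothetical code: it assumes a convex_hull routine that returns the hull vertices in counter-clockwise order, and that the input values are distinct):

```python
# Sorting reduced to 2D convex hull: lift each value x onto the parabola
# y = x^2, take the hull, and read the lower chain from left to right.
# `convex_hull` is an assumed routine returning the hull vertices in
# counter-clockwise order; the input values are assumed distinct.

def sort_via_hull(values, convex_hull):
    points = [(x, x * x) for x in values]   # every point lies on the hull
    hull = convex_hull(points)              # CCW order, arbitrary start
    # Rotate so the leftmost point comes first; from there, the
    # counter-clockwise walk visits the points in increasing x order.
    start = hull.index(min(hull))
    ordered = hull[start:] + hull[:start]
    return [x for (x, _) in ordered]
```

Since the pre- and post-processing are only O(n), a hull algorithm asymptotically faster than n log n would yield a sorting algorithm faster than n log n in a comparable model of computation, contradicting the comparison-sort bound discussed above.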

The other method I am aware of to prove lower bounds is the use of an adversary argument, but I can't find a good link with examples right now.
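
To make the idea concrete, here is a small self-contained illustration (my own hedged sketch, not from any particular reference): the classic adversary argument that deciding whether a key occurs in an unsorted array of n cells requires probing all n cells in the worst case. The adversary answers every probe with a non-key value; if the algorithm stops early and answers "absent", the adversary can still hide the key in an unprobed cell.

```python
# Hypothetical illustration of an adversary argument (not from the thread):
# any strategy deciding "is the key 1 present in an unsorted array of n
# cells?" must probe all n cells in the worst case.

def run_against_adversary(strategy, n):
    """strategy(probe, n) may call probe(i) to read cell i and must return
    True/False for "the key 1 is present". Returns the number of unprobed
    cells at the moment the strategy answered "absent" (0 means the
    adversary failed to fool it)."""
    probed = set()

    def probe(i):
        probed.add(i)
        return 0                 # adversary: "this cell does not hold the key"

    says_present = strategy(probe, n)
    unprobed = n - len(probed)
    # If the strategy said "absent" while cells remain unprobed, the adversary
    # can place the key 1 in one of them, making the answer wrong.
    return unprobed if not says_present else 0


# A strategy that gives up after checking only half the cells is always fooled.
lazy = lambda probe, n: any(probe(i) == 1 for i in range(n // 2))
assert run_against_adversary(lazy, 10) > 0
```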
