
# Lower bounds on the complexity of problems


5 replies to this topic

### #1 godmodder  Members   -  Reputation: 666


Posted 19 October 2012 - 09:13 AM

Hello, everyone!

I had a session today with my colleagues to practise the defense for an important scholarship. One of the work packages in my project is the determination of the lower bound complexity of a specific new algorithm. The colleagues questioned whether I would be able to define bounds on the complexity of the problem before finding an algorithm that actually solves the problem.

Now I'm sure this must be possible because, taking sorting as an example, it doesn't take long to realise that one needs to inspect at least every element to sort a whole array (hence linear complexity as the absolute lower bound). In practice, of course, the average sorting complexity will be n log n. It seems like a conclusion one could arrive at before knowing even one sorting algorithm.

Is it possible to determine complexity bounds of a problem before discovering an actual algorithm? If so, did this happen before (like with sorting e.g.), so I could use that particular case to motivate my answer?

Thanks very much,
Jeroen

Edited by godmodder, 19 October 2012 - 09:13 AM.

### #2 greenvertex  Members   -  Reputation: 510


Posted 19 October 2012 - 11:11 AM

Even with your sorting example, you already partially knew the algorithm: you must iterate through all elements at least once, leading to an asymptotic lower bound of big-omega(n). Presupposing absolutely nothing about an algorithm would lead to big-omega(1) I suppose, but that's not terribly meaningful or accurate in most cases.

You could make somewhat general statements about classes of algorithms given you know at least the data it operates on (as in sort above):

Vector-based algorithms will be big-omega(n).
Matrix-based algorithms will be big-omega(n^2).
etc.

Note that these are all asymptotic lower bounds, generally the least interesting of the bounds family. You wouldn't really be able to say anything about theta or O as far as I know, as determining an upper bound on an unknown algorithm is seemingly impossible.

### #3 osmanb  Crossbones+   -  Reputation: 1554


Posted 19 October 2012 - 02:12 PM

Yes. For the sorting example, in particular, it's proven that the lower bound on number of comparisons required is proportional to n log n. http://en.wikipedia.org/wiki/Comparison_sort#Number_of_comparisons_required_to_sort_a_list

(This is only true for comparison-based sorts. Radix sort, for example, gets around this by re-framing the problem.)
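The counting argument behind that bound is easy to check numerically: a comparison sort must distinguish all n! input orderings, and each comparison has only two outcomes, so at least ceil(log2(n!)) comparisons are required. A small sketch (the function name is just for illustration):

```python
import math

def min_comparisons(n):
    """Information-theoretic lower bound for comparison sorting:
    n! permutations must be distinguished, each comparison is a
    binary split, so at least ceil(log2(n!)) comparisons are needed."""
    return math.ceil(math.log2(math.factorial(n)))

# Compare the exact bound against the familiar n*log2(n) estimate.
for n in (4, 8, 16, 64):
    print(n, min_comparisons(n), math.ceil(n * math.log2(n)))
```

Since log2(n!) grows like n log n (Stirling's approximation), this confirms the asymptotic lower bound without reference to any particular sorting algorithm.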

### #4 godmodder  Members   -  Reputation: 666


Posted 20 October 2012 - 09:23 AM

> Yes. For the sorting example, in particular, it's proven that the lower bound on number of comparisons required is proportional to n log n.

Very good to know. But when I read more on this, it seemed that this kind of proof is often intractable. In the case of sorting, it was possible because a model could be constructed for the entire family of comparison-based algorithms.

So, do you think it was naive of me to suppose I could determine the complexity bounds before discovering the actual algorithm? Or should I keep defending this as a possibility in my project?

Jeroen

Edited by godmodder, 20 October 2012 - 09:25 AM.

### #5 greenvertex  Members   -  Reputation: 510


Posted 23 October 2012 - 10:39 AM

> So, do you think it was naive of me to suppose I could determine the complexity bounds before discovering the actual algorithm? Or should I keep defending this as a possibility in my project?

Not necessarily naive: if you can determine the class of some undetermined algorithm, you could prove an asymptotic lower bound for that class, as in the sorting example above. But, barring that, I don't believe it's generally possible beyond the most generic assumption of big-omega(1).

### #6 Álvaro  Crossbones+   -  Reputation: 12912


Posted 23 October 2012 - 02:30 PM

If you can prove that some other well-known problem reduces to an instance of your problem, then your problem's complexity is at least that of the other problem. For instance, finding the convex hull of points in 2D takes at least Omega(n*log(n)), because on the input (x[j], x[j]^2) the output would be a sorted list (or something that can be converted into the sorted list in time O(n)).
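To make the reduction concrete, here is a sketch in Python (the names are my own). Each number is lifted onto the parabola y = x^2, so every lifted point is a hull vertex, and the counter-clockwise hull starting at the leftmost point visits them in increasing x. The hull routine here is gift-wrapping, chosen because it does no sorting internally; in the argument it stands in for any black-box hull algorithm:

```python
def cross(o, a, b):
    """2D cross product of vectors o->a and o->b; > 0 means b lies to the left of o->a."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    """Gift-wrapping (Jarvis march), returning the hull counter-clockwise
    from the leftmost point. No internal sorting, so it is a fair stand-in
    for an arbitrary hull routine in the reduction."""
    if len(points) < 2:
        return points[:]
    start = min(points)
    hull = [start]
    while True:
        p = hull[-1]
        candidate = points[0] if points[0] != p else points[1]
        for r in points:
            # Keep the candidate such that no point is clockwise of p->candidate.
            if r != p and cross(p, candidate, r) < 0:
                candidate = r
        if candidate == start:
            break
        hull.append(candidate)
    return hull

def sort_via_hull(xs):
    """Sort distinct values by lifting them onto y = x^2 and reading
    the x-coordinates off the hull in order."""
    pts = [(x, x * x) for x in set(xs)]  # distinct values, for simplicity
    return [x for x, _ in convex_hull(pts)]

print(sort_via_hull([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 2, 3, 4, 5, 6, 9]
```

So if some hull algorithm ran faster than n log n, this wrapper would sort faster than the comparison lower bound allows, which is the contradiction the reduction exploits.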

The other method I am aware of to prove lower bounds is the use of an adversary argument, but I can't find a good link with examples right now.

Edited by alvaro, 23 October 2012 - 02:31 PM.

