How do you measure the running time of an algorithm? (O(n), exponential, etc.)

Started by
13 comments, last by Zahlman 17 years ago
Hi, that's the question: how do I know whether a function implementing some algorithm is O(n), O(n log n), O(n^2), etc.? For example, this algorithm for computing the Fibonacci sequence:

int recursiveFib(int n)
{
    if (n <= 1)
        return n;
    return recursiveFib(n - 1) + recursiveFib(n - 2);
}

I read that it will run in exponential time, but... why? How can I know when an algorithm can become that expensive?
You prove it.

In this particular situation, count the total number of calls made while computing fibo(n), and call that count T(n). So:

T(n) = 1 + T(n-1) + T(n-2)
T(0) = T(1) = 1


We can solve this by noticing that the following closed form satisfies the recurrence (with the convention fibo(0) = 0, fibo(1) = 1, as in the code):

T(n) = 2 fibo(n+1) - 1


And since we already know that fibo(n) is exponential, so is the complexity.
For n <= 1, you return n, which takes 1 call.
For n = 2, you call fib(1) and fib(0), which is 3 calls total.
For n = 3, you call fib(2) and fib(1), which is 5 calls total (1 + 3 + 1).
For n = 4, you call fib(3) and fib(2), which is 9 calls total (1 + 5 + 3).
For n = 5, you call fib(4) and fib(3), which is 15 calls total (1 + 9 + 5).

If you notice the pattern, the number of calls satisfies almost the same recurrence as the Fibonacci numbers themselves: each count is the sum of the two previous counts, plus one for the call itself. If you want an actual equation for the total number of calls, you'd need to solve that recurrence relation. To turn it into O-notation, you drop the constants and end up with O(x^n) for some x (here x is the golden ratio, roughly 1.618).

As a side note, you could decrease the complexity to linear (O(n)) using the simple technique of memoization, without altering your algorithm. In this case you could also rewrite it as a simple loop, but it's a good example of how easily memoization can greatly reduce running time.
"Walk not the trodden path, for it has borne it's burden." -John, Flying Monk
Quote:Original post by Extrarius
As a side note, you could decrease the complexity to linear (O(n)) using the simple technique of memoization, without altering your algorithm. In this case you could also rewrite it as a simple loop, but it's a good example of how easily memoization can greatly reduce running time.

I might note that the time complexity depends on how memoized results are stored and looked up. The actual complexity is O(N * M), where M is the cost of one memoization lookup. In this case it would indeed be O(N), because you can store the memoized results in a random-access array, giving O(N * 1) = O(N).
Sorry guys, but I still don't understand...

Quote:Original post by ToohrVyk
T(n) = 1 + T(n-1) + T(n-2)
T(0) = T(1) = 1


We can solve this by noticing that the following closed form satisfies the recurrence.

T(n) = 2 fibo(n+1) - 1




How did you get that?
Wouldn't it be T(0) = 1 + T(-1) + T(-2)?
T(0) = 1 - 3T?

What about, for example, the insertion sort:

insert(array a, int length, value) {
    int i = length - 1;
    while (i >= 0 && a[i] > value) {
        a[i + 1] = a[i];
        i = i - 1;
    }
    a[i + 1] = value;
}

insertionSort(array a, int length) {
    int i = 0;
    while (i < length) {
        insert(a, i, a[i]);
        i = i + 1;
    }
}

In the best case it will run in O(n); in the worst case it will run in O(n^2)... why?
Quote:Original post by BlackWind
How did you get that?
Wouldn't it be T(0) = 1 + T(-1) + T(-2)?
T(0) = 1 - 3T?


No. T(0) corresponds to a single call, which returns a value right away. T(0) and T(1) are base cases, given directly as 1; the recurrence T(n) = 1 + T(n-1) + T(n-2) only applies for n >= 2, so T(-1) and T(-2) never appear.
Quote:Original post by BlackWind

in the best case, it will run in O(n); in the worst case, it will run in O(n^2)... why?


insertionSort(array a, int length) {
    int i = 0;
    while (i < length) {
        insert(a, i, a[i]);
        i = i + 1;
    }
}


How many times does this get executed? Obviously, N times.

But, there's a catch. insert takes time too. How much?

insert(array a, int length, value) {
    int i = length - 1;
    while (i >= 0 && a[i] > value) {
        a[i + 1] = a[i];
        i = i - 1;
    }
    a[i + 1] = value;
}

Repeat from the end of the array towards the beginning, until you either reach the first element or find a value for which the condition fails.

Since we have no knowledge about the contents of the array, we can deduce that on average we'll do about n/2 iterations.

In the best case, "a[i] > value" will be false right away for i = length - 1, so we get O(1) complexity.

In the worst case, we'll always get down to i == 0. But since length changes with i, insert sees a different length each time: 1 on the first call, 2 on the second, ... up to n - 1. The average number of elements examined is n/2, but the constant is usually omitted.

So insert has complexity between O(1) and O(n).

Now we put both of those together, namely O(n) * O(1) and O(n) * O(n*n), and we get our lower and upper bounds of O(n) and O(n^2), with average case being O(n^2/2).
While people are giving you reasonable responses, the real answer is to familiarize yourself with the basics of CS algorithm theory. I used the book "Introduction to Algorithms" by Cormen, Leiserson, and Rivest. The very first chapter in the book covers this topic. The rest of the book is quite informative too.

If you can't make sense of that, you probably need to work on your math a bit first; I would bet understanding basic college Calculus is enough.

Geoff
Ok, now it's clearer to me.
I have another question:
Is it possible to write code that determines the complexity of an algorithm?
Or can it only be obtained by pure mathematical analysis?

Quote:Original post by Antheus
Now we put both of those together, namely O(n) * O(1) and O(n) * O(n*n), and we get our lower and upper bounds of O(n) and O(n^2), with average case being O(n^2/2).


for the worst case, shouldn't it be:
O(n) * O(n) = O(n^2)?
Quote:Original post by BlackWind
Ok, now it's clearer to me.
I have another question:
Is it possible to write code that determines the complexity of an algorithm?
Or can it only be obtained by pure mathematical analysis?

Quote:Original post by Antheus
Now we put both of those together, namely O(n) * O(1) and O(n) * O(n*n), and we get our lower and upper bounds of O(n) and O(n^2), with average case being O(n^2/2).


for the worst case, shouldn't it be:
O(n) * O(n) = O(n^2)?


Of course it is possible, though you might want to decide on a generic way to represent the algorithm first, as that would simplify the process greatly.
[size="1"]I don't suffer from insanity, I'm enjoying every minute of it.
The voices in my head may not be real, but they have some good ideas!

This topic is closed to new replies.
