How do you measure the time complexity of an algorithm? (O(n), exponential, etc.)


Hi, that's the question: how do I know whether a function implementing some algorithm is O(n), O(n log n), O(n^2), etc.? For example, take this algorithm for computing Fibonacci numbers:
int recursiveFib(int n)
{
    if (n <= 1)
        return n;
    return recursiveFib(n - 1) + recursiveFib(n - 2);
}


I read that it will run in exponential time, but... why? How can I tell when an algorithm will become that expensive?

Share on other sites
You prove it.

In this particular situation, count the number of recursive calls to compute fibo(n), and call that time T(n). So:

T(n) = 1 + T(n-1) + T(n-2)
T(0) = T(1) = 1

We can solve this by noticing that the following closed form satisfies the recurrence (check it by substituting into the equation above):

T(n) = 2 fibo(n+1) - 1

With the code's convention fibo(0) = 0 and fibo(1) = 1, both base cases check out: T(0) = 2·fibo(1) − 1 = 1 and T(1) = 2·fibo(2) − 1 = 1. And since fibo(n) grows exponentially (roughly as φ^n, with φ ≈ 1.618), so does the number of calls.
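A quick empirical check of that closed form (the instrumentation and the helper names here are mine, not from the original code):

```cpp
#include <cassert>

// Instrumented copy of recursiveFib: `calls` counts every invocation,
// i.e. the T(n) in the recurrence above.
static long long calls = 0;

int recursiveFib(int n) {
    ++calls;
    if (n <= 1) return n;
    return recursiveFib(n - 1) + recursiveFib(n - 2);
}

// Number of calls performed while computing fib(n).
long long countCalls(int n) {
    calls = 0;
    recursiveFib(n);
    return calls;
}

// Plain iterative Fibonacci, used only to check the closed form.
long long fib(int n) {
    long long a = 0, b = 1;
    while (n-- > 0) { long long t = a + b; a = b; b = t; }
    return a;
}
```

For instance, countCalls(5) gives 15, which matches 2·fib(6) − 1 = 2·8 − 1.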

Share on other sites
For n <= 1, you return n, which takes 1 call.
For n = 2, you call fib(1) and fib(0), which is 3 calls total.
For n = 3, you call fib(2) and fib(1), which is 5 calls total.
For n = 4, you call fib(3) and fib(2), which is 9 calls total.
For n = 5, you call fib(4) and fib(3), which is 15 calls total.

If you look at the pattern, the number of calls grows much like the Fibonacci numbers themselves: each count is the sum of the two previous counts, plus one for the call itself. If you want an exact equation for the total number of calls, you'd solve that recurrence relation. To turn it into O-notation, you drop the constants and end up with O(x^n) for some x (here x turns out to be the golden ratio, about 1.618).

As a side note, you could reduce the complexity to linear (O(n)) using the simple technique of memoization, without restructuring your algorithm. In this case you could also just write a loop, but it's a good example of how easily memoization can greatly reduce running time.
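For example, here's a minimal memoized version (a sketch; the array-based cache and the names are my own, not from the post above):

```cpp
#include <cassert>
#include <vector>

// Memoized Fibonacci: same recursive structure, but each fib(k) is
// computed only once and then looked up, so the total work is O(n).
long long memoFib(int n, std::vector<long long>& memo) {
    if (n <= 1) return n;
    if (memo[n] != -1) return memo[n];   // already computed: O(1) lookup
    memo[n] = memoFib(n - 1, memo) + memoFib(n - 2, memo);
    return memo[n];
}

long long fib(int n) {
    std::vector<long long> memo(n + 1, -1);  // -1 marks "not computed yet"
    return memoFib(n, memo);
}
```

Compare with the plain recursive version: fib(50) is instant here, while the unmemoized code would make billions of calls.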

Share on other sites
Quote:
 Original post by Extrarius
As a side note, you could decrease the complexity to a linear relationship (O(N)) using the simple techniques of memoization without altering your algorithm. In this case, you could also make it a simple loop, but it's a good example of how simple memoization can be to greatly reduce running time.

I might note that the time complexity depends on how the memoized results are stored and looked up. The actual complexity is O(N·M), where M is the cost of one memo lookup/store. In this case it is indeed O(N), because you can keep the memoized results in a random-access array, so M = O(1) and the total is O(N·1) = O(N).

Share on other sites
Sorry guys, but I still don't understand...

Quote:
 Original post by ToohrVyk
T(n) = 1 + T(n-1) + T(n-2)
T(0) = T(1) = 1
We can solve this by noticing that the following expression verifies this relationship.
T(n) = 2 fibo(n) - 1

How did you get that?
Wouldn't it be T(0) = 1 + T(-1) + T(-2)?
T(0) = 1 - 3T?

What about, for example, insertion sort:

insert(array a, int length, value) {
    int i = length - 1;
    while (i >= 0 && a[i] > value) {
        a[i + 1] = a[i];
        i = i - 1;
    }
    a[i + 1] = value;
}

insertionSort(array a, int length) {
    int i = 0;
    while (i < length) {
        insert(a, i, a[i]);
        i = i + 1;
    }
}

In the best case it will run in O(n); in the worst case it will run in O(n^2)... why?

Share on other sites
Quote:
 Original post by BlackWind
how did you get that?
wouldn't T(0) = 1 + T(-1) + T(-2)
T(0) = 1 - 3T?

No. T(0) corresponds to a single call, which returns a value right away. The recurrence 1 + T(n-1) + T(n-2) only applies for n >= 2; T(0) = T(1) = 1 are the base cases.

Share on other sites
Quote:
 Original post by BlackWind
in the best case, it will run at O(n), in the worst case, it will run at O(n^2)... why?

insertionSort(array a, int length) {
    int i = 0;
    while (i < length) {
        insert(a, i, a[i]);
        i = i + 1;
    }
}

How many times does this get executed? Obviously, N times.

But, there's a catch. insert takes time too. How much?

insert(array a, int length, value) {
    int i = length - 1;
    while (i >= 0 && a[i] > value) {
        a[i + 1] = a[i];
        i = i - 1;
    }
    a[i + 1] = value;
}

It repeats from the end of the sorted prefix towards the beginning, until it either reaches the first element or finds a value for which the loop condition fails.

Since we have no knowledge of the array's contents, on average we'll do about n/2 iterations.

In the best case, the condition "a[i] > value" is false immediately at i = length - 1 (the new value already belongs at the end), so insert runs in O(1).

In the worst case, we always run all the way down to i == 0. And since the length passed to insert grows with each call, the calls do 1, 2, ..., n-2, n-1 iterations; that's about n/2 elements per call on average, but the constant is usually omitted.

So insert has complexity between O(1) and O(n).

Now we put both of those together, namely O(n) * O(1) and O(n) * O(n*n), and we get our lower and upper bounds of O(n) and O(n^2), with average case being O(n^2/2).
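To see those bounds concretely, you can count the element shifts the sort performs on already-sorted versus reverse-sorted input. A sketch (the counting wrapper and its name are mine):

```cpp
#include <cassert>
#include <vector>

// Insertion sort that counts how many element shifts it performs.
// Sorted input: the inner while loop never runs -> 0 shifts, O(n) total.
// Reverse-sorted input: 1 + 2 + ... + (n-1) = n(n-1)/2 shifts -> O(n^2).
long long insertionSortShifts(std::vector<int> a) {
    long long shifts = 0;
    for (int i = 1; i < (int)a.size(); ++i) {
        int value = a[i];
        int j = i;
        while (j > 0 && a[j - 1] > value) {
            a[j] = a[j - 1];   // shift one element to the right
            --j;
            ++shifts;
        }
        a[j] = value;
    }
    return shifts;
}
```

For n = 5, sorted input gives 0 shifts and reverse-sorted input gives 5·4/2 = 10, exactly the n(n-1)/2 worst case.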

Share on other sites
While people are giving you reasonable responses, the real answer is to familiarize yourself with the basics of CS algorithm theory. I used the book "Introduction to Algorithms" by Cormen, Leiserson, and Rivest. The very first chapter in the book covers this topic. The rest of the book is quite informative too.

If you can't make sense of that, you probably need to work on your math a bit first; I would bet understanding basic college Calculus is enough.

Geoff

Share on other sites
OK, now it's clearer to me.
I have another question:
is it possible to write code that determines the complexity of an algorithm?
Or can it only be obtained by mathematical analysis?

Quote:
 Original post by Antheus
Now we put both of those together, namely O(n) * O(1) and O(n) * O(n*n), and we get our lower and upper bounds of O(n) and O(n^2), with average case being O(n^2/2).

For the worst case, shouldn't it be:
O(n) * O(n) = O(n^2)?

Share on other sites
Quote:
 Original post by BlackWind
OK, now it's clearer to me.
I have another question:
is it possible to write code that determines the complexity of an algorithm?
Or can it only be obtained by mathematical analysis?

Quote:
 Original post by Antheus
Now we put both of those together, namely O(n) * O(1) and O(n) * O(n*n), and we get our lower and upper bounds of O(n) and O(n^2), with average case being O(n^2/2).

For the worst case, shouldn't it be:
O(n) * O(n) = O(n^2)?

Of course it is possible, though you might want to decide on a generic way of writing the algorithm first, as that would simplify the process greatly.

Share on other sites
Quote:
 Original post by SimonForsman
Quote:
 Original post by BlackWind
OK, now it's clearer to me. I have another question: is it possible to write code that determines the complexity of an algorithm?

Of course it is possible, though you might want to decide on a generic way of writing the algorithm first, as that would simplify the process greatly.

Actually, it is not. We can trivially reduce this to the halting problem: if an algorithm has an order of complexity, then it must halt, because we can calculate a finite upper bound on the number of steps it takes (by just plugging the problem size and an appropriate finite constant into the order of complexity formula).

Share on other sites
So... it is not possible?
Well, thanks a lot for the replies...

Share on other sites
Quote:
 Original post by Zahlman
Quote:
 Original post by SimonForsman
Quote:
 Original post by BlackWind
OK, now it's clearer to me. I have another question: is it possible to write code that determines the complexity of an algorithm?

Of course it is possible, though you might want to decide on a generic way of writing the algorithm first, as that would simplify the process greatly.

Actually, it is not. We can trivially reduce this to the halting problem: if an algorithm has an order of complexity, then it must halt, because we can calculate a finite upper bound on the number of steps it takes (by just plugging the problem size and an appropriate finite constant into the order of complexity formula).

That only means it is impossible to compute the complexity of every algorithm, not that it is impossible to compute the complexity of any particular algorithm.

If the complexity can be calculated by a human, it is also possible for a computer program to do the same. (If the complexity can't be calculated at all, the algorithm is pretty darn useless anyway.)
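In that spirit, one thing a program _can_ do is estimate complexity empirically: count operations at two input sizes and look at the growth ratio. A sketch using a toy O(n^2) function (all the names here are mine, for illustration):

```cpp
#include <cassert>
#include <cmath>

// Toy "unknown" algorithm: counts its own basic operations for size n.
long long mysteryOps(int n) {
    long long ops = 0;
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < n; ++j)
            ++ops;               // O(n^2) work
    return ops;
}

// Doubling experiment: if ops(n) ~ c * n^k, then ops(2n)/ops(n) -> 2^k,
// so log2 of the ratio estimates the exponent k.
double estimateExponent(long long (*ops)(int), int n) {
    return std::log2(double(ops(2 * n)) / double(ops(n)));
}
```

This measures growth rather than proving a bound (and, per the halting-problem point above, no general analyzer exists), but a doubling experiment like this is often enough to catch an accidental O(n^2) in practice.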

Share on other sites
Quote:
 Original post by SimonForsman
[...](If the complexity can't be calculated the algoritm is pretty darn useless anyway)
Actually, there are some _very_ useful algorithms whose complexity can't be described - for example, a compiler that supports compile-time code execution (see Lisp, or C++ if the compiler doesn't impose a template recursion limit, etc.).

A universal Turing machine is a conceptually simple algorithm, but good luck calculating its general complexity (and no, O(∞) is not an answer =-)

