
# How to measure the time of an algorithm? (O(n), exponential, etc.)


## Recommended Posts

Hi, that's the question: how do I know whether a function implementing some algorithm is O(n), O(n log n), O(n²), etc.? For example, take this algorithm for computing the Fibonacci series:

```cpp
int recursiveFib(int n)
{
    if (n <= 1)
        return n;
    return recursiveFib(n - 1) + recursiveFib(n - 2);
}
```


I read that it runs in exponential time, but... why? How can I tell when an algorithm becomes that expensive?

---
You prove it.

In this particular situation, count the number of recursive calls to compute fibo(n), and call that time T(n). So:

T(n) = 1 + T(n-1) + T(n-2)
T(0) = T(1) = 1

We can solve this by noticing that the following expression satisfies the recurrence (using the code's convention fib(0) = 0, fib(1) = 1):

T(n) = 2 fib(n+1) - 1

And since we already know that fibo(n) is exponential, so is the complexity.

---
For n <= 1, you return n, which takes 1 call.
For n = 2, you call fib(1) and fib(0), which is 3 calls total.
For n = 3, you call fib(2) and fib(1), which is 5 calls total.
For n = 4, you call fib(3) and fib(2), which is 9 calls total.
For n = 5, you call fib(4) and fib(3), which is 15 calls total.

If you notice the pattern, each count is one more than the sum of the two previous counts - the same shape as the Fibonacci recurrence itself, so the totals grow like Fibonacci numbers. If you want an actual equation for the total number of calls, you'd need to solve that recurrence relation. To turn it into O-notation, you drop the constants and end up with O(x^n) for some x (here x is the golden ratio, about 1.618).

As a side note, you could decrease the complexity to linear (O(n)) using the simple technique of memoization, without altering your algorithm. In this case you could also just use a loop, but it's a good example of how easily memoization can greatly reduce running time.

---
Quote:
 Original post by Extrarius: As a side note, you could decrease the complexity to linear (O(n)) using the simple technique of memoization, without altering your algorithm. In this case you could also just use a loop, but it's a good example of how easily memoization can greatly reduce running time.

I might note that the time complexity depends on how the memoized results are stored and looked up. The actual complexity is O(N·M), where M is the cost of one memo lookup or store. In this case it would indeed be O(N), because you can keep the memoized results in a random-access array, giving O(N·1) = O(N).

---
Sorry guys, but I still don't understand...

Quote:
 Original post by ToohrVyk:
 T(n) = 1 + T(n-1) + T(n-2)
 T(0) = T(1) = 1
 We can solve this by noticing that the following expression satisfies the recurrence: T(n) = 2 fib(n+1) - 1

How did you get that?
Wouldn't it be T(0) = 1 + T(-1) + T(-2)?

What about, for example, insertion sort:

```cpp
void insert(int a[], int length, int value) {
    int i = length - 1;
    while (i >= 0 && a[i] > value) {
        a[i + 1] = a[i];
        i = i - 1;
    }
    a[i + 1] = value;
}

void insertionSort(int a[], int length) {
    int i = 0;
    while (i < length) {
        insert(a, i, a[i]);
        i = i + 1;
    }
}
```

In the best case it will run in O(n); in the worst case, it will run in O(n²)... why?

---
Quote:
 Original post by BlackWind: How did you get that? Wouldn't it be T(0) = 1 + T(-1) + T(-2)?

No. T(0) corresponds to a single call, which returns a value right away, so T(0) = 1. The recurrence T(n) = 1 + T(n-1) + T(n-2) only applies for n >= 2; T(0) and T(1) are the base cases.

---
Quote:
 Original post by BlackWind: In the best case it will run in O(n); in the worst case, it will run in O(n²)... why?

```cpp
insertionSort(array a, int length) {
    int i = 0;
    while (i < length) {
        insert(a, i, a[i]);
        i = i + 1;
    }
}
```

How many times does this loop body execute? Obviously, n times.

But, there's a catch. insert takes time too. How much?

```cpp
insert(array a, int length, value) {
    int i = length - 1;
    while (i >= 0 && a[i] > value) {
        a[i + 1] = a[i];
        i = i - 1;
    }
    a[i + 1] = value;
}
```

insert scans from the end of the already-sorted prefix towards the beginning, until it either reaches the first element or finds a value no greater than the one being inserted.

Since we know nothing about the array's contents, on average we'll do about half the possible iterations.

In the best case, "a[i] > value" is false right away at i = length - 1 (the input is already sorted), so insert does O(1) work.

In the worst case (input sorted in reverse), we always walk all the way down to i == 0. The length passed to insert grows with each call: 1 on the first call, 2 on the second, ..., n-1 on the last, so the average is about n/2; the constant is usually omitted.

So each call to insert costs between O(1) and O(n).

Now we put both of those together, namely O(n) * O(1) and O(n) * O(n*n), and we get our lower and upper bounds of O(n) and O(n^2), with average case being O(n^2/2).

---
While people are giving you reasonable responses, the real answer is to familiarize yourself with the basics of CS algorithm theory. I used the book "Introduction to Algorithms" by Cormen, Leiserson, and Rivest. The very first chapter in the book covers this topic. The rest of the book is quite informative too.

If you can't make sense of that, you probably need to work on your math a bit first; I would bet understanding basic college Calculus is enough.

Geoff

---
OK, now it's clearer to me.
I have another question:
Is it possible to write code that determines the complexity of an algorithm,
or can it only be obtained by mathematical analysis?

Quote:
 Original post by Antheus: Now we put both of those together, namely O(n) * O(1) and O(n) * O(n*n), and we get our lower and upper bounds of O(n) and O(n^2), with average case being O(n^2/2).

For the worst case, shouldn't it be:
O(n) * O(n) = O(n^2)?

---
Quote:
 Original post by BlackWind: Is it possible to write code that determines the complexity of an algorithm, or can it only be obtained by mathematical analysis?

Of course it is possible, though you might want to decide on a generic way to represent the algorithm first, as that would simplify the process greatly. (In full generality, automatically determining an arbitrary program's complexity runs into halting-problem territory, so in practice you either analyze by hand or measure empirically.)

