Need an Algorithm in O(n^n)

You could always make some really odd example which is O(n^n). Let's say you have a superkey of length n. Each symbol in the superkey is a normal key in itself which can take n distinct values. Try to brute-force the key by trying all combinations.

The only reason it's O(n^n) is that n is defined as the length of the superkey, though. If you had defined n as the sum of the lengths of all the normal keys, it would have ordinary brute-force key-cracking complexity.
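Here's a minimal sketch of that brute force (my own illustration; bruteForceSuperkey and the check callback are made-up names, not from the post). Trying every candidate visits exactly n^n combinations:

#include <cstddef>
#include <functional>
#include <vector>

// Tries every one of the n^n candidate superkeys (n symbols, each with n
// possible values) and returns true as soon as `check` accepts one.
bool bruteForceSuperkey(std::size_t n,
                        const std::function<bool(const std::vector<std::size_t>&)>& check) {
    std::vector<std::size_t> key(n, 0);    // current candidate, one symbol per position
    for (;;) {
        if (check(key)) return true;       // found the superkey
        std::size_t i = 0;                 // base-n "odometer" increment with carry
        while (i < n && ++key[i] == n) {
            key[i] = 0;
            ++i;
        }
        if (i == n) return false;          // wrapped around: all n^n candidates tried
    }
}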

But I can't come up with any practical O(n^n) algorithm.

EDIT: If you want to look at really fast-growing functions, take a look at http://en.wikipedia.org/wiki/Ackermann_function
O(n^n) algorithms:

insertion sort
merge sort
bogosort
binary search
strlen

the list goes on...
-Forcas
Quote:Original post by Forcas
O(n^n) algorithms:

insertion sort

Nope, O(n^2) worst case.

Quote:Original post by Forcas
merge sort

Nope, O(n log n) worst case.

Quote:Original post by Forcas
bogosort

Possibly.

Quote:Original post by Forcas
binary search

Nope, O(log n).

Quote:Original post by Forcas
strlen

Nope, O(n).
Forcas is right, though... they are all O(n^n). But it would be more interesting to see algorithms that are Θ(n^n); it's common to just ask for O, since some algorithms don't have a tight Θ bound on their running time. So I guess what we're really looking for is an algorithm for which O(n^n) is a tight bound, meaning it doesn't also have a smaller upper bound than O(n^n). Or we could say that we're looking for an algorithm that is NOT O(n!).

Just to clear things up: an algorithm that is O(1), O(n), O(n^2), O(n log n) or O(log n) is also O(n^n), since n^n grows faster than all of the others and is therefore an upper bound.
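A minimal sketch of something with a tight bound (my own construction, not from the thread): recurse to depth n with n-way branching, so the base case executes exactly n^n times and the running time is Θ(n^n).

#include <cstdint>

// Recurses to depth n, branching n ways at each level, so the base case
// executes exactly n^n times: the running time is Theta(n^n) by construction.
std::uint64_t countLeaves(std::uint64_t n, std::uint64_t depth = 0) {
    if (depth == n) return 1;               // one leaf per combination
    std::uint64_t total = 0;
    for (std::uint64_t i = 0; i < n; ++i)   // n branches at this level
        total += countLeaves(n, depth + 1);
    return total;                           // equals n^n
}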

[Edited by - UknowsI on July 17, 2006 8:05:12 AM]
There is an unlimited number of algorithms worse than O(n^n). You can easily invent one yourself around a function like the Ackermann function, which grows faster than exponentially. Also, algorithms that can't be proven to always terminate may be worse than O(n^n).

For example, write an algorithm that repeats a small calculation A(x, y) times for some parameters x and y, and you have an algorithm whose running time grows superexponentially with the problem size.
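A sketch of that idea (my own illustration; ackermann and superSlow are made-up names):

#include <cstdint>

// The Ackermann function grows faster than any primitive-recursive function;
// only tiny inputs (m <= 3, small n) terminate in reasonable time.
std::uint64_t ackermann(std::uint64_t m, std::uint64_t n) {
    if (m == 0) return n + 1;
    if (n == 0) return ackermann(m - 1, 1);
    return ackermann(m - 1, ackermann(m, n - 1));
}

// Loop a trivial O(1) calculation A(x, y) times, as described above:
// the running time grows superexponentially with x and y.
std::uint64_t superSlow(std::uint64_t x, std::uint64_t y) {
    std::uint64_t acc = 0;
    for (std::uint64_t i = 0, end = ackermann(x, y); i < end; ++i)
        ++acc;                              // the "small calculation"
    return acc;
}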

It's not necessarily useful (actually, it's likely to be rather close to a "Bogosort" type of algorithm), unless you're trying to analyze the effects of pessimization :)


Quote: "NOT O(n!)"
According to Sterling's* equation, the faculty function grows exponentially and thus within O(n^n).

EDIT: Not sure whether it was "Sterling" or some other equation used... but faculty still grows "only" exponentially. That said, I just realized I don't get your point at all... what was that about not O(n!)?
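For concreteness (my own worked bound, not from the original posts): n! = 1·2·...·n ≤ n·n·...·n = n^n, so n! = O(n^n) directly. And Stirling's approximation n! ~ sqrt(2πn)·(n/e)^n gives n!/n^n ~ sqrt(2πn)·e^(-n) → 0, so n! grows strictly more slowly than n^n and is not Θ(n^n).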

[Edited by - Thygrrr on July 20, 2006 4:08:37 PM]
My point about an algorithm that is not O(n!) was just to find a very fast-growing algorithm, but as you pointed out yourself, that's a trivial task if you include algorithms that might not terminate.
Quote:Original post by Roboguy
[...]
All of these are also O(n^n), and you should know that.
Quote:Original post by Promit
[...]
All of these are also O(n^n), and you should know that.


Merge sort:
Divide, Conquer, Combine

Divide: the divide step computes the middle of the subarray, which takes constant time: D(n) = O(1).
Conquer: recursively solve 2 subproblems, each of size n/2, which contributes 2T(n/2).
Combine: the merge procedure takes linear time: C(n) = O(n).

Now you add D(n) and C(n), so you add O(1) and O(n). This sum is a linear function of n, that is, O(n). Adding this to the 2T(n/2) term from the conquer step gives the recurrence for the worst-case running time T(n) of merge sort:
T(n) = { O(1)            if n = 1
       { 2T(n/2) + O(n)  if n > 1

which can be rewritten as
T(n) = { c               if n = 1
       { 2T(n/2) + cn    if n > 1

where c stands for the time to solve a problem of size 1 (and for the per-element cost of the divide and combine steps).

If we assume that n is an exact power of 2, then the root of the recursion tree costs cn and each of the 2 subtrees costs T(n/2). Adding this up recursively gives a total of cn lg n + cn, since the tree has a depth of lg n. This is O(n lg n).
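To tie the analysis to code, here's a minimal top-down merge sort (my own sketch, not from the original post), annotated with the D(n), 2T(n/2) and C(n) terms:

#include <cstddef>
#include <vector>

// Merges the sorted halves [lo, mid) and [mid, hi) of a into tmp, then
// copies them back: the combine step, C(n) = O(n).
static void merge(std::vector<int>& a, std::vector<int>& tmp,
                  std::size_t lo, std::size_t mid, std::size_t hi) {
    std::size_t i = lo, j = mid, k = lo;
    while (i < mid && j < hi) tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    while (i < mid) tmp[k++] = a[i++];
    while (j < hi)  tmp[k++] = a[j++];
    for (k = lo; k < hi; ++k) a[k] = tmp[k];
}

static void mergeSortRange(std::vector<int>& a, std::vector<int>& tmp,
                           std::size_t lo, std::size_t hi) {
    if (hi - lo < 2) return;                // n = 1: O(1)
    std::size_t mid = lo + (hi - lo) / 2;   // divide: D(n) = O(1)
    mergeSortRange(a, tmp, lo, mid);        // conquer: 2T(n/2)
    mergeSortRange(a, tmp, mid, hi);
    merge(a, tmp, lo, mid, hi);             // combine: C(n) = O(n)
}

void mergeSort(std::vector<int>& a) {
    std::vector<int> tmp(a.size());
    mergeSortRange(a, tmp, 0, a.size());
}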

;)

As for the other non-O(n^n) algorithms, need I continue? ;)

Greetings...
They are still O(n^n) ;)

From http://en.wikipedia.org/wiki/Big-O

"More precisely, it is used to describe an asymptotic upper bound"

"the O notation is commonly employed to describe an asymptotic tight bound, but tight bounds are more formally and properly denoted by the Θ (big Theta) "

Quote:Original post by _swx_
[...]


Yeah, I know, but where the hell is the Theta button when you need it :) I missed the "also" in the previous post, btw. However, algorithms that are O(n^n) don't really seem like a problem to find, but how about Θ(n^n)?

