running time

Started by Mike C. · 3 comments, last by jjd 18 years, 8 months ago
If something has a running time of O((log n)^2), is this asymptotically worse than O(n)? I'm pretty sure it is, but I guess it's usually better in practice, right? I just wanted to confirm.

mike

Mike C. · http://www.coolgroups.com/ · http://www.coolgroups.com/zoomer/ · http://www.coolgroups.com/ez/
A not-so-rigorous way to decide would be to just pick a really big N, like 10,000,000,000.

With base-10 logs, log(N) = 10, so (log N)^2 = 100. For this large N, log squared is far smaller than N itself, and for even larger N the gap only widens.
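A quick numeric check along these lines (a minimal Python sketch; the base-10 log matches the example above, and the sample values of N are arbitrary):

```python
import math

# Compare N against (log10 N)^2 for increasingly large N.
for exp in (3, 6, 10, 15):
    n = 10 ** exp
    log_sq = math.log10(n) ** 2
    print(f"N = 10^{exp:<2}  (log N)^2 = {log_sq:>6.0f}  "
          f"ratio N / (log N)^2 = {n / log_sq:.3g}")
```

The ratio grows without bound, which is exactly what "asymptotically better" means here.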
For low values of N (say N = 50, though the exact point varies with the particular problem), an algorithm that is O(N) is often faster in practice than one that is O((log N)^2), because the latter typically carries larger constant factors and setup costs.

Big-O notation is intended to describe how long something takes as N becomes large; for low values of N it's not very useful.

(In general, if there is an O(N^2) algorithm and an O(N log N) algorithm for the same problem, the O(N^2) one often has less pre-computation and is faster below some number of elements. The sketch below illustrates the same crossover idea for O(N) versus O((log N)^2).)
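For illustration, a small Python sketch of that crossover. The cost constants (c and setup) are made up purely for illustration; real values depend entirely on the algorithms involved:

```python
import math

# Hypothetical cost models: a cheap linear algorithm vs. a (log N)^2
# algorithm with a larger per-step constant and a fixed setup cost.
def linear_cost(n, c=1.0):
    return c * n

def logsq_cost(n, c=40.0, setup=200.0):
    return setup + c * math.log2(n) ** 2

# Show roughly where the (log N)^2 algorithm starts winning.
for n in (10, 100, 1_000, 10_000, 100_000):
    faster = "linear" if linear_cost(n) < logsq_cost(n) else "(log N)^2"
    print(f"N = {n:>7}: linear = {linear_cost(n):>9.0f}, "
          f"logsq = {logsq_cost(n):>7.0f}  ->  {faster} is cheaper")
```

With these made-up constants the crossover lands somewhere around N = 10,000; below that the linear algorithm wins, above it the (log N)^2 one does.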
O(N) vs O((log N)^2):

[plot comparing N and (log N)^2, originally hosted on ImageShack]

As you can see, for large values of N, O((log N)^2) is better.
John Bolton
Locomotive Games (THQ)
Current Project: Destroy All Humans (Wii). IN STORES NOW!
Alternatively, you can look at the ratio of the two terms as N approaches infinity, i.e.

lim (log(x))^2 / x as x -> infinity.

Using L'Hôpital's rule, this becomes

lim 2*log(x) / x as x -> infinity,

which is still an infinity-over-infinity form, so applying L'Hôpital's rule once more gives

lim 2 / x = 0 (as x -> infinity).

So x dominates (log(x))^2 in the limit, and thus O(x) grows more quickly than O((log(x))^2).
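Written out as a worked limit (just restating the steps above in LaTeX notation):

```latex
\lim_{x \to \infty} \frac{(\log x)^2}{x}
  \;\overset{\text{L'H}}{=}\;
\lim_{x \to \infty} \frac{2 \log x \cdot \tfrac{1}{x}}{1}
  = \lim_{x \to \infty} \frac{2 \log x}{x}
  \;\overset{\text{L'H}}{=}\;
\lim_{x \to \infty} \frac{2/x}{1}
  = 0 .
```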


-Josh

P.S. @John: how do you make those sexy sub/superscripts? [smile]

--www.physicaluncertainty.com
--linkedin
--irc.freenode.net#gdnet

This topic is closed to new replies.
