running time
If something has a running time of O( (log n)^2 ), is this asymptotically worse than O(n)? I'm pretty sure it is, but I guess it's usually better in practice, right? I just wanted to confirm.
mike
http://www.coolgroups.com/
A not-so-rigorous way to decide would be to just pick a really big N. Like 10,000,000,000.
log(N) = 10 (using base-10 logs here), so log(N)^2 = 100. For this large N, log squared is much smaller. It turns out that for larger N, the difference is even greater.
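That "pick a really big N" check is easy to run yourself; a quick sketch (base-10 logs, matching the post):

```python
import math

# Compare N against (log N)^2 for growing N: the gap widens fast.
for n in (10**4, 10**7, 10**10):
    log_sq = math.log10(n) ** 2
    print(f"N = {n:>14,}   (log N)^2 = {log_sq:6.1f}   N / (log N)^2 = {n / log_sq:.3g}")
```

At N = 10,000,000,000 the linear term is a hundred million times larger than the log-squared term.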
for low values of N [say for example N=50, but it varies with the particular problem], O(N) is generally faster than O((log N)^2)
big-O notation is intended to describe how long something takes as N becomes large; for low values of N it's not very useful
[in general, if there is an O(n^2) algorithm and an O(n log n) algorithm, the O(n^2) one has less pre-computation overhead (a smaller constant factor) and is faster for fewer than some number of elements]
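That crossover can be sketched numerically. The cost models below are made up for illustration: a constant factor of 1 for the linear algorithm and 50 for the log-squared one (both invented numbers), just to show how a larger constant lets the asymptotically worse algorithm win for small n:

```python
import math

# Hypothetical cost models: the (log n)^2 algorithm carries a much
# larger constant factor (50, an invented number for illustration),
# so the plain linear algorithm wins for small n.
def cost_linear(n):
    return 1 * n

def cost_logsq(n):
    return 50 * math.log2(n) ** 2

# Scan upward for the crossover point: below it the linear algorithm
# is cheaper, above it the (log n)^2 algorithm takes over for good.
n = 2
while cost_linear(n) <= cost_logsq(n):
    n += 1
print("crossover near n =", n)
```

With these particular constants the linear algorithm stays cheaper until n is in the thousands; with different constants the crossover moves, which is exactly the point about big-O hiding constant factors.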
Alternatively, you can look at the ratio of the two terms as N approaches infinity, i.e.
lim (log(x))^2 / x as x -> infinity.
Using L'Hopital's rule (differentiate the top and bottom), this becomes
lim 2*log(x) / x, and applying L'Hopital's rule once more gives lim 2/x = 0 (as x -> infinity).
So, x dominates (log(x))^2 in the limit, and thus x grows more quickly than (log(x))^2.
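The same conclusion shows up numerically; a quick sketch (natural logs here, which only changes the ratio by a constant factor):

```python
import math

# The ratio (log x)^2 / x should shrink toward 0 as x grows,
# matching the L'Hopital limit argument.
xs = (10**3, 10**6, 10**9)
ratios = [math.log(x) ** 2 / x for x in xs]
for x, r in zip(xs, ratios):
    print(f"x = {x:>10}   (log x)^2 / x = {r:.3e}")
```

Each successive ratio is smaller, consistent with the limit being 0.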
-Josh
P.S. @John: how do you make those sexy sub/superscripts? [smile]