Submitted RagNoRoc, Jan 05 2000 03:27 PM | Last updated Jan 05 2000 03:27 PM
A theoretical measure of the execution of an algorithm, usually the time or memory needed, given the problem size n, which is usually the number of items.
Informally, writing f(n) = O(g(n)) means that f(n) is eventually bounded above by some constant multiple of g(n). More formally, it means there are positive constants c and k such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ k. The values of c and k must be fixed for the function f and must not depend on n.
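As a small illustration of the definition, here is a sketch (in Python, with the example functions f(n) = 3n + 10 and g(n) = n chosen for illustration, not taken from the entry) that spot-checks whether given constants c and k satisfy 0 ≤ f(n) ≤ c·g(n) for all n ≥ k over a finite range. A finite check is evidence, not a proof, but it shows how the constants work.

```python
def witness_holds(f, g, c, k, upto=10_000):
    """Spot-check 0 <= f(n) <= c*g(n) for all k <= n <= upto.

    A finite check like this cannot prove f(n) = O(g(n)); it only
    tests whether the proposed witnesses c and k work on a range.
    """
    return all(0 <= f(n) <= c * g(n) for n in range(k, upto + 1))

# Example (hypothetical): f(n) = 3n + 10 is O(n).
f = lambda n: 3 * n + 10
g = lambda n: n

# c = 4, k = 10 works: 3n + 10 <= 4n exactly when n >= 10.
print(witness_holds(f, g, c=4, k=10))   # True

# c = 3 can never work: 3n + 10 > 3n for every n.
print(witness_holds(f, g, c=3, k=10))   # False
```

Note that the constants are not unique: c = 13, k = 1 also works for the same f and g, which is why the definition only asks that *some* fixed pair exists.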