How can I tell, just from the code itself and the definitions of the terms, whether an algorithm's run-time is one of these four types: constant, linear, logarithmic, or quadratic?
I did some research on the definitions of the four types of run-time, but I am putting them in my own words, so hopefully my phrasing is correct:
1) An algorithm's run-time is constant if the run-time stays the same no matter the size of the algorithm's input.
I have a question about this one: if the run-time is constant, why is it written O(1)? Why choose 1 of all numbers?
2) An algorithm's run-time is linear if the run-time increases in direct proportion to the input size (doubling the input roughly doubles the run-time).
3) An algorithm's run-time is logarithmic if the run-time increases in proportion to the logarithm of the input size. I have a question about this too: are we assuming the log of the input is to base 10, or is the base of the log arbitrary?
4) An algorithm's run-time is quadratic if the run-time increases in proportion to the square of the input size.
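To check my understanding of the four definitions, I wrote these toy functions (my own examples, assuming Python) where each shape of code typically produces one of the four run-times:

```python
def constant(items):          # O(1): work does not depend on len(items)
    return items[0]           # one indexing operation, regardless of size

def linear(items):            # O(n): one pass over the input
    total = 0
    for x in items:           # loop body runs len(items) times
        total += x
    return total

def logarithmic(n):           # O(log n): the problem size halves each step
    steps = 0
    while n > 1:
        n //= 2               # halving => about log2(n) iterations
        steps += 1
    return steps

def quadratic(items):         # O(n^2): a loop nested inside a loop
    count = 0
    for a in items:
        for b in items:       # inner loop runs len(items) times per outer pass
            count += 1
    return count
```

Is it right that spotting these shapes (no loop, one loop, a halving loop, nested loops) is the "trivial" way to classify the run-time from the code?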
On a side note, what is the difference between lg and log?
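To probe the base question myself, I tried this quick check (assuming Python's math module). It seems that any two log bases differ only by a constant factor, since log_a(n) = log_b(n) / log_b(a):

```python
import math

# log2(n) / log10(n) should be the same constant for every n,
# because log2(n) = log10(n) / log10(2).
for n in [10, 1000, 10**6]:
    ratio = math.log2(n) / math.log10(n)
    print(round(ratio, 6))   # same value each time: 1 / log10(2)
```

Is this why Big-O notation usually leaves the base off?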
Here's an example:
lg lg n = _(lg lg(n^3))   <- I think the example is taking the log of a log of n
What should n be assumed to be in order to evaluate the statement in a trivial way?
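If I'm reading the notation right (a log of a log, with lg taken as base 2 — my assumption), I tried probing the two sides numerically rather than assuming a particular n:

```python
import math

def lglg(x):                  # log of a log, assuming base 2 for "lg"
    return math.log2(math.log2(x))

# lglg(n**3) = log2(3 * log2(n)) = log2(3) + lglg(n),
# so the two sides seem to differ only by an additive constant.
for n in [16, 2**10, 2**20]:
    print(round(lglg(n**3) - lglg(n), 6))   # always ~log2(3) ~ 1.584963
```

Is this kind of numeric check a valid way to evaluate such statements, or is there a more rigorous trivial method?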
On a side note, I'm not familiar with this lg notation: what does lg mean?
Another example:
n*lg n = _(n*lg lg n)
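I tried the same numeric probing here (again assuming lg means base 2, which may be wrong). The ratio of the two sides reduces to lg(n) / lg(lg(n)), and it seems to keep growing as n grows:

```python
import math

# Compare n*lg n against n*lg lg n: their ratio is lg(n) / lg(lg(n)),
# which appears to grow without bound as n grows.
for n in [2**4, 2**16, 2**64]:
    ratio = (n * math.log2(n)) / (n * math.log2(math.log2(n)))
    print(round(ratio, 3))   # keeps increasing: 2.0, 4.0, ~10.667
```

Does that growing ratio settle which symbol belongs in the blank, or am I missing something about how these comparisons are supposed to be done?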