List in increasing order of complexity?
Possible Duplicate:
Plain English explanation of Big O
I am reading the "Introduction to Algorithms" book, but don't understand this: O(100), O(log(n)), O(n*log(n)), O(n^2), O(n^3)
OK, thanks. I didn't even know what it was, so I am going to read that Big O post now. But if anyone can explain this further in layman's terms, it would be much appreciated.
Thanks
Big O (simplifying) indicates how long a given algorithm takes to complete, where n is the size of the input.
For example:
O(100) -> will take 100 units to complete, no matter how large the input.
O(log(n)) -> will take log(n) units to complete
O(n^2) -> will take n^2 (n * n) units to complete
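As a rough illustration (these function names and examples are mine, not from the book), here is a minimal Python sketch of one operation from each of those classes:

    def constant_time(items):
        # O(1): indexing a list takes the same time regardless of its length
        return items[0]

    def logarithmic_time(sorted_items, target):
        # O(log(n)): binary search halves the remaining range at every step
        lo, hi = 0, len(sorted_items) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if sorted_items[mid] == target:
                return mid
            if sorted_items[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1

    def quadratic_time(items):
        # O(n^2): comparing every element with every other element
        pairs = []
        for a in items:
            for b in items:
                pairs.append((a, b))
        return pairs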
That is the big O notation, and here is an ordering of algorithm efficiency:

- O(1), not O(100) - constant time: whatever the input, the algorithm executes in constant time
- O(log(n)) - logarithmic time: as the input gets larger, so does the time, but by a decreasing amount
- O(n*log(n)) - linear times logarithmic: grows faster than linear, but not as fast as the following
- O(n^2), or generally O(n^k) where k is a constant - polynomial time, probably the worst of the feasible algorithms

There are worse classes, considered infeasible for anything but small inputs:

- O(k^n) - exponential
- O(n!) - factorial
- Algorithms that grow like an Ackermann function...
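To make that ordering concrete, here is a quick sketch (the numbers are illustrative step counts, not exact operation counts) that prints how fast each class grows:

    import math

    # Rough "step counts" for each complexity class at a few input sizes,
    # just to show how quickly the classes diverge.
    for n in (10, 100, 1000):
        print(f"n = {n}")
        print("  O(1)       : 1")
        print(f"  O(log n)   : {math.log2(n):.1f}")
        print(f"  O(n)       : {n}")
        print(f"  O(n log n) : {n * math.log2(n):.0f}")
        print(f"  O(n^2)     : {n ** 2}")
        print(f"  O(2^n)     : {float(2 ** n):.3e}")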
This notation is a rough guide. For example, some algorithms whose worst case is O(n^2) can perform faster on average than O(n*log(n)) algorithms - see quicksort and the sketch below.
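As an illustration of that point (this is just a simple textbook-style quicksort, not necessarily the exact variant referred to above), the sketch below runs in O(n*log(n)) on average, but its naive first-element pivot makes it degrade to O(n^2) on already-sorted input:

    def quicksort(items):
        # Average case O(n log n); worst case O(n^2) when the pivot splits
        # the list very unevenly (e.g. already-sorted input with this
        # first-element pivot choice).
        if len(items) <= 1:
            return items
        pivot = items[0]
        smaller = [x for x in items[1:] if x < pivot]
        larger = [x for x in items[1:] if x >= pivot]
        return quicksort(smaller) + [pivot] + quicksort(larger)

    print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]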
This notation is also an upper bound, meaning it describes a worst-case scenario. It can be used for space complexity or time complexity, where n is the size of the input provided.