Are there any O(1/n) algorithms?
Or anything else which is less than O(1)?
This question isn't as stupid as it might seem. At least theoretically, something such as O(1/n) is completely sensible when we take the mathematical definition of Big O notation:

f(x) = O(g(x)) <=> there exist constants C > 0 and x0 such that |f(x)| <= C * |g(x)| for all x > x0

Now substitute 1/x for g(x): it's obvious that the above definition still holds for some f.
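To see this concretely, here is a small numeric spot-check of the definition; the helper `big_o_holds` and the particular choice f(x) = 1/(2x) are illustrative, not part of any standard library:

```python
def big_o_holds(f, g, C, x0, xs):
    """Spot-check |f(x)| <= C * |g(x)| for every sampled x > x0."""
    return all(abs(f(x)) <= C * abs(g(x)) for x in xs if x > x0)

f = lambda x: 1 / (2 * x)  # a candidate f that should be O(1/x)
g = lambda x: 1 / x        # g(x) = 1/x
print(big_o_holds(f, g, C=1, x0=0, xs=range(1, 10_000)))  # True
```

A sampled check like this proves nothing by itself, of course, but it illustrates that the inequality in the definition is perfectly satisfiable with a decreasing g.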
For the purpose of estimating asymptotic run-time growth, however, this is less viable: a meaningful algorithm cannot get faster as its input grows. Sure, you can construct an arbitrary algorithm that fulfills this, e.g. the following one:
from time import sleep

def get_faster(data):
    # Sleep for a duration inversely proportional to the input size.
    how_long = (1 / len(data)) * 100000
    sleep(how_long)
Clearly, this function spends less time as the input size grows, at least until some limit enforced by the hardware (the precision of floating-point numbers, the minimum time sleep can wait, the time needed to process the arguments, etc.). This limit is then a constant lower bound, so in fact the above function still has runtime O(1).
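To make that hardware floor concrete, here is a sketch; MIN_SLEEP is a made-up constant standing in for whatever timer granularity the platform actually has:

```python
MIN_SLEEP = 1e-3  # hypothetical floor: shortest delay the platform honors

def effective_delay(n):
    """Delay get_faster would request, clamped at the hardware floor."""
    requested = (1 / n) * 100000
    return max(requested, MIN_SLEEP)

# The requested delay shrinks with n, but eventually hits the constant
# floor -- which is why the function is still O(1) overall.
print(effective_delay(10))     # 10000.0  (dominated by the 1/n term)
print(effective_delay(10**9))  # 0.001    (clamped to the floor)
```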
But there are in fact real-world algorithms whose runtime can decrease (at least partially) as the input size increases. Note that these algorithms still will not exhibit runtime behaviour below O(1). Still, they are interesting. Take, for example, the very simple text-search algorithm by Horspool: here the expected runtime decreases as the search pattern grows longer (though a longer haystack will once again increase the runtime).
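For reference, a minimal sketch of Horspool's algorithm (an illustrative implementation, not taken from any particular library). The shift table lets the search skip up to len(needle) characters per mismatch, which is exactly why longer patterns tend to make the expected runtime drop:

```python
def horspool_search(haystack, needle):
    """Boyer-Moore-Horspool: index of the first match, or -1."""
    m, n = len(needle), len(haystack)
    if m == 0:
        return 0
    # For each character of the pattern except the last, record how far
    # it is from the end; any other character allows a full skip of m.
    shift = {c: m - 1 - i for i, c in enumerate(needle[:-1])}
    i = 0
    while i + m <= n:
        if haystack[i:i + m] == needle:
            return i
        # Skip based on the character aligned with the pattern's end.
        i += shift.get(haystack[i + m - 1], m)
    return -1

print(horspool_search("the quick brown fox", "brown"))  # 10
```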
Yes.
There is precisely one algorithm with runtime O(1/n), the "empty" algorithm.
For an algorithm to be O(1/n) means that it executes asymptotically in fewer steps than an algorithm consisting of a single instruction. If it executes in fewer than one step for all n > n0, it must consist of precisely no instructions at all for those n. Since checking whether n > n0 costs at least one instruction, it must consist of no instructions for all n.
Summing up: the only algorithm which is O(1/n) is the empty algorithm, consisting of no instructions.
That's not possible. The definition of Big-O is a "not greater than" inequality:
A(n) = O(B(n))
<=>
there exist constants C > 0 and n0 > 0 such that
for all n > n0, A(n) <= C * B(n)
So B(n) is in fact an upper bound; therefore, if it decreases as n increases, the estimate does not change.