Analysis of algorithms
I am reading about algorithm analysis. Here is a text snippet from the book:
When n doubles, the running time goes up by a factor of 2 for linear programs, 4 for quadratic programs, and 8 for cubic programs. Programs that run in logarithmic time take only an additive constant longer when n doubles, and programs that run in O(n log n) take slightly more than twice as long to run under the same circumstances.
These increases can be hard to spot if the lower-order terms have relatively large coefficients and n is not large enough.
My question is: what does the author mean by "lower-order terms have relatively large coefficients"? Can anyone explain with an example?
Thanks!
When using O notation, you specify the largest term of the function that bounds your performance. For example, if the running time were always bounded by f(n) = c3·n^3 + c2·n^2 + c1·n + c0, you would say that it is O(n^3). The author is saying that when n is small, the coefficients may have a larger impact on performance than n does: if c2 were very large and c3 very small, then if you only go by the relative performance for specific small values of n, the performance may appear to be O(n^2) until n grows large enough for the n^3 term to dominate.
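To see this numerically, here is a minimal sketch with hypothetical coefficients (c3 = 0.001, c2 = 1000, chosen purely for illustration): the doubling ratio looks quadratic (about 4) for small n, and only approaches the cubic ratio (about 8) once the n^3 term takes over.

```python
# Hypothetical cost function with a tiny cubic coefficient and a huge
# quadratic one (illustrative values, not from the book).
def cost(n, c3=0.001, c2=1000.0, c1=1.0, c0=1.0):
    return c3 * n**3 + c2 * n**2 + c1 * n + c0

# Ratio of cost when n doubles: about 4 (looks quadratic) while c2*n^2
# dominates, approaching 8 (cubic) once c3*n^3 wins (here, n > 10**6).
for n in (10, 10_000, 10**6, 10**8):
    print(n, round(cost(2 * n) / cost(n), 2))  # ratios climb from ~4 toward ~8
```

With these numbers the crossover happens where 0.001·n^3 = 1000·n^2, i.e. n = 10^6, so benchmarks on any smaller inputs would look quadratic.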
Suppose your algorithm actually executes n^2 + 1000n computations when run on n elements. For n = 1 you need 1001 computations, and for n = 2 you need 2004. The difference from linear growth is tiny, and you can hardly spot the quadratic contribution!

Asymptotically, however, your algorithm takes O(n^2) steps, so asymptotically (as n gets large) doubling the input size quadruples your runtime. But for our small values, doubling from 1 to 2 did not quadruple the runtime! The lower-order term is n, and its coefficient (1000) is large compared to the coefficient of the leading-order term n^2 (which is 1).
This shows how the asymptotic complexity does not say anything about particular, especially small, values of n. It is merely a limiting statement about the behaviour as n gets large.
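The example above can be checked directly; this short sketch just evaluates n^2 + 1000n at doubling input sizes:

```python
def steps(n):
    # Total computations for the example algorithm: n^2 + 1000n.
    return n**2 + 1000 * n

print(steps(1), steps(2))  # 1001 2004 -- barely more than double
# The doubling ratio only approaches the asymptotic factor of 4
# once the n^2 term dwarfs the 1000n term:
for n in (1, 100, 10_000, 1_000_000):
    print(n, round(steps(2 * n) / steps(n), 3))
```

At n = 1 the ratio is about 2 (the linear term dominates); by n = 1,000,000 it is essentially 4, matching the O(n^2) prediction.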
Asymptotic notation refers to the bounds of the runtime as n → ∞. So a function that is O(n log n) may have an actual runtime of 0.1·n log n + 100000·n.
In this case, the 100000·n term is the "lower-order term". As n → ∞, this term is overpowered by the 0.1·n log n term.
However, as you can see, for small n the 100000·n term will dominate the runtime.
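To make "small n" concrete for this example (assuming a base-2 log for illustration; the base only shifts the numbers), the n log n term does not overtake the linear term until 0.1·log2(n) > 100000, i.e. n > 2^(10^6), which is astronomically large:

```python
import math

# Hypothetical runtime from the answer: 0.1*n*log(n) + 100000*n
# (log base 2 assumed here; another base only rescales the 0.1).
def runtime(n):
    return 0.1 * n * math.log2(n) + 100000 * n

# Fraction of the runtime contributed by the n log n term -- tiny for
# any realistic n, since it only wins once log2(n) exceeds 10**6:
for n in (10, 10**6, 10**12):
    frac = 0.1 * n * math.log2(n) / runtime(n)
    print(n, frac)
```

So although the function is O(n log n) asymptotically, at every practical input size it behaves like the linear term with the big coefficient.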