Exponentials: Little Oh

This question already has an answer here:

  • Difference between Big-O and Little-O Notation

  • f(n) = o(g(n)) means that f(n)/g(n) -> 0 as n -> infinity.

    For your problem, the claim requires a > 1 (for a <= 1 it is false). We want to show that (n^b)/(a^n) -> 0 as n -> infinity. Write a^n = sqrt(a)^n * sqrt(a)^n, so that (n^b)/(a^n) = ((n^b)/sqrt(a)^n) * (1/sqrt(a)^n). Let f(n) = (n^b)/sqrt(a)^n. This function first increases and then decreases, so it attains a maximum value M = max(f(n)). Therefore (n^b)/(a^n) <= M/(sqrt(a)^n). Since a > 1, we have sqrt(a) > 1, so sqrt(a)^n -> infinity as n -> infinity, and hence M/(sqrt(a)^n) -> 0. It follows that (n^b)/(a^n) -> 0 as n -> infinity, which is exactly n^b = o(a^n) by definition.
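
    If it helps to see this bound in action, here is a quick numeric sanity check (my own illustration, not part of the original argument; the choices a = 2 and b = 3 are arbitrary):

```python
import math

# Sanity check of the argument above; a = 2 and b = 3 are arbitrary choices.
a, b = 2.0, 3.0
r = math.sqrt(a)  # r = sqrt(a) > 1 because a > 1

# f(n) = n^b / sqrt(a)^n first increases and then decreases, so its maximum
# M can be found by scanning a modest range of n.
M = max(n**b / r**n for n in range(1, 200))

for n in (1, 10, 20, 50, 100):
    ratio = n**b / a**n  # the quantity that should tend to 0
    bound = M / r**n     # the upper bound M / sqrt(a)^n from the proof
    print(f"n={n:3d}  n^b/a^n = {ratio:.3e}  <=  M/sqrt(a)^n = {bound:.3e}")
```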


    (For simplicity I'll assume that all functions always return positive values. This is the case, for example, for functions measuring the run-time of an algorithm, as no algorithm runs in "negative" time.)


    First, a recap of big-O notation, to clear up a common misunderstanding:

    To say that f is O(g) means that f grows asymptotically at most as fast as g. More formally, treating both f and g as functions of a variable n, to say that f(n) is O(g(n)) means that there is a constant K, so that eventually, f(n) < K * g(n). The word "eventually" here means that there is some fixed value N (which is a function of K, f, and g), so that if n > N then f(n) < K * g(n).

    For example, the function f(n) = n + 2 is O(n^2). To see why, let K = 1. Then, if n > 10, we have n + 2 < n^2, so our conditions are satisfied. A few things to note (a quick numeric check follows this list):

  • For n = 1, we have f(n) = 3 and g(n) = 1, so f(n) < K * g(n) actually fails. That's ok! Remember, the inequality only needs to hold eventually, and it does not matter if the inequality fails for some small finite list of n.
  • We used K = 1, but we didn't need to. For example, K = 2 would also have worked. The important thing is that there is some value of K which gives us the inequality we want eventually.
  • We saw that n + 2 is O(n^2). This might look confusing, and you might say, "Wait, isn't n + 2 actually O(n)?" The answer is yes. n + 2 is O(n), O(n^2), O(n^3), O(n/3), etc.
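
    Here is that quick numeric check (my own sketch, not from the original answer): it verifies that f(n) = n + 2 falls below K * g(n), with g(n) = n^2 and K = 1, once n is large enough.

```python
# Sketch: check that f(n) = n + 2 < K * n^2 eventually holds with K = 1.
f = lambda n: n + 2
g = lambda n: n**2
K = 1

for n in range(1, 11):
    holds = f(n) < K * g(n)
    print(f"n={n:2d}  f(n)={f(n):3d}  K*g(n)={K * g(n):4d}  holds: {holds}")
# The inequality fails at n = 1 and n = 2 but holds for every n >= 3,
# which is all that big-O requires.
```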

  • Little-o notation is slightly different. Big-O notation, intuitively, says that if f is O(g), then f grows asymptotically at most as fast as g. Little-o notation says that if f is o(g), then f grows asymptotically strictly slower than g.

    Formally, f is o(g) if for any (let's say positive) choice of K, eventually the inequality f(n) < K * g(n) holds. So, for instance:

  • The function f(n) = n is not o(n). This is because, taking g(n) = n and K = 1, there is no value of n so that f(n) < K * g(n). Intuitively, f and g grow asymptotically at the same rate, so f does not grow strictly slower than g does.
  • The function f(n) = n is o(n^2). Why is this? Pick your favorite positive value of K. (To see the actual point, try to make K small, for example 0.001.) Imagine graphing the functions f(n) and K * g(n), where g(n) = n^2. One is a straight line through the origin with positive slope, and the other is a concave-up parabola through the origin. Eventually the parabola will be higher than the line, and will stay that way. (If you remember your pre-calc/calculus...) A small numeric sketch of this crossover follows this list.
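
    Here is that sketch (my own addition): for several values of K, it finds the point from which K * n^2 stays above n.

```python
# Sketch: for f(n) = n and g(n) = n^2, find the smallest n with
# f(n) < K * g(n); since n < K * n^2 is equivalent to n > 1/K,
# the inequality keeps holding for every larger n as well.
def crossover(K, limit=10**7):
    for n in range(1, limit):
        if n < K * n**2:
            return n
    return None

for K in (1.0, 0.1, 0.001):
    print(f"K={K}: f(n) < K * g(n) from n = {crossover(K)} onward")
```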

  • Now we get to your actual question: let f(n) = n^b and g(n) = a^n. You asked why f is o(g).

    Presumably, the author of the original statement treats a and b as constant, positive real numbers, and moreover a > 1 (if a <= 1 then the statement is false).

    The statement, in English, is:

    For any positive real number b, and any real number a > 1, the function n^b grows asymptotically strictly slower than a^n.

    This is an important thing to know if you are ever going to deal with algorithmic complexity. Put more simply: polynomials grow much slower than exponential functions. It isn't immediately obvious that this is true, and the proof is too long to write out here, so here is a reference:

    https://math.stackexchange.com/questions/55468/how-to-prove-that-exponential-grows-faster-than-polynomial

    You will probably need some comfort with math to be able to read any proof of this fact.
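
    That said, here is a compressed version of one standard argument (my own summary, not the linked proof verbatim):

```latex
% Take logarithms: for a > 1 and b > 0,
\[
  \frac{n^b}{a^n} = e^{\,b \ln n \,-\, n \ln a},
\]
% and since \ln a > 0 while \ln n grows slower than n, the exponent
\[
  b \ln n - n \ln a \longrightarrow -\infty \quad \text{as } n \to \infty,
\]
% so the ratio tends to 0, which is exactly n^b = o(a^n).
```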

    Good luck!


    The super-high-level meaning of the statement n^b is o(a^n) is just that exponential functions like a^n grow much faster than polynomial functions like n^b.

    The important thing to understand when looking at big O and little o notation is that they are both upper bounds. I'm guessing that's why you're confused. n^b is o(a^n) because the growth rate of a^n is much bigger. You could probably find a tighter little-o upper bound on n^b (one where the gap between the bound and the function is smaller), but a^n is still valid. It's also probably worth looking at the difference between Big O and little o.

    Remember that a function f is Big O of a function g if for some constant k > 0, you can find a minimum value of n beyond which f(n) ≤ k * g(n).

    A function f is little o of a function g if for any constant k > 0, you can find a minimum value of n beyond which f(n) ≤ k * g(n).

    Note that the little-o requirement is harder to fulfill, so if a function f is little o of a function g, it is also Big O of g; and little o tells you that g grows strictly faster than f, not merely at least as fast.

    In your example, if b is 3 and a is 2 and we set k to 1, we can work out the minimum value of n so that n^b ≤ k * a^n. In this case, the crossover is between 9 and 10: 9^3 = 729 while 1 * 2^9 = 512, so at n = 9, a^n is not yet greater than n^b; but 10^3 = 1000 while 1 * 2^10 = 1024, so a^n is now greater. Graphing these functions shows that a^n stays greater than n^b for every n ≥ 10. At this point we've only shown that n^b is Big O of a^n, since Big O only requires that for some value of k > 0 (we picked 1), a^n ≥ n^b from some minimum n onward (here the crossover is between 9 and 10).

    To show that n^b is little o of a^n, we have to show that for any k greater than 0, you can still find a minimum value of n beyond which k * a^n > n^b. For example, if you pick k = 0.5, the minimum of 10 we found earlier doesn't work, since 10^3 = 1000 but 0.5 * 2^10 = 512. We can, however, keep sliding the minimum for n out further and further; the smaller you make k, the bigger the minimum for n will be. Saying n^b is little o of a^n means that no matter how small you make k, we will always be able to find a big enough value of n so that n^b ≤ k * a^n. (A short sketch of this search follows below.)
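
    Here is that sketch (my own illustration, reusing b = 3 and a = 2 from the example above):

```python
# Sketch: for f(n) = n^3 and g(n) = 2^n, find the smallest n from which
# n^3 <= k * 2^n stays true, for shrinking values of k.
def min_n(k, limit=200):
    # n^3 / 2^n rises and then falls, so once the inequality holds past
    # the peak it never fails again; the threshold is one past the last
    # n at which it fails.
    last_fail = 0
    for n in range(1, limit):
        if n**3 > k * 2**n:
            last_fail = n
    return last_fail + 1

for k in (1.0, 0.5, 0.001):
    print(f"k={k}: n^3 <= k * 2^n for all n >= {min_n(k)}")
```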
