Algorithm

This question already has an answer here:

  • Big O, how do you calculate/approximate it?

  • In the first case sum++ executes 0 + 1 + ... + (n-1) times in total. If you apply the arithmetic progression formula, you get n(n-1)/2, which is O(n^2).
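
    The question's code isn't reproduced in this excerpt, but a sketch of what the first case presumably looks like (the inner loop bound i is an assumption, chosen because it produces the 0 + 1 + ... + (n-1) count described above):

    int sum = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < i; j++)   // runs i times on the i-th outer iteration
            sum++;                    // executed 0 + 1 + ... + (n-1) = n(n-1)/2 times in total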

    In the second case we'll have 0 + 1 + 4 + 9 + ... + (n-1)^2, which is the sum of the squares of the first n-1 numbers, and there's a formula for it: (n-1) n (2n-1) / 6, which is O(n^3).
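
    Similarly, the second case presumably bounds the inner loop by i*i (again a sketch, since the original code isn't shown in this excerpt):

    int sum = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < i * i; j++)   // runs i^2 times on the i-th outer iteration
            sum++;                        // executed 0 + 1 + 4 + ... + (n-1)^2 times in total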

    The last one is interesting. You can see, actually, that the innermost for loop is reached only when j is a multiple of i, so you can rewrite the program as follows:

    int n = 500;
    for (int i = 0; i < n; i++) {
        for (int m = 0; m < i; m++) {
            int j = m * i;                // j runs over the multiples of i: 0, i, 2i, ..., (i-1)*i
            for (int k = 0; k < j; k++)
                sum++;
        }
    }
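
    The original third snippet isn't shown in this excerpt; assuming it guards the innermost loop with a j % i == 0 check (which is what "only when j is a multiple of i" suggests), a quick sketch like this confirms that the rewrite counts the same number of increments:

    // Sketch, not from the original answer: compare the assumed original loop
    // (which skips every j that is not a multiple of i) with the rewritten one.
    public class CountCheck {
        public static void main(String[] args) {
            int n = 100;    // smaller than 500 so the check finishes quickly

            long original = 0;
            for (int i = 0; i < n; i++)
                for (int j = 0; j < i * i; j++)
                    if (j % i == 0)                  // innermost loop runs only for multiples of i
                        for (int k = 0; k < j; k++)
                            original++;

            long rewritten = 0;
            for (int i = 0; i < n; i++)
                for (int m = 0; m < i; m++) {
                    int j = m * i;                   // the multiples of i below i*i: 0, i, 2i, ..., (i-1)*i
                    for (int k = 0; k < j; k++)
                        rewritten++;
                }

            System.out.println(original + " == " + rewritten);   // both counts agree
        }
    }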
    

    It's easier to work with math notation: the total number of sum++ calls is the sum of m*i over all m from 0 to i-1 and all i from 0 to n-1.

    The formula is derived from the code by analysis: sum++ gets called j = m*i times in the innermost loop, the middle loop runs i times (once per value of m), and the outer loop runs n times. In the end, the problem boils down to a sum of cubes of the first n numbers plus lower-order terms (which do not affect the asymptotics), i.e. Θ(n^4).
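
    For completeness, here is a worked evaluation of that sum (not in the original answer, but it follows directly from the rewritten loop above):

    \[
    \sum_{i=0}^{n-1} \sum_{m=0}^{i-1} m\,i
    \;=\; \sum_{i=0}^{n-1} i \cdot \frac{(i-1)\,i}{2}
    \;=\; \frac{1}{2} \sum_{i=0}^{n-1} \left(i^{3} - i^{2}\right)
    \;=\; \frac{n^{4}}{8} + O(n^{3}),
    \]

    which is Θ(n^4), matching the tighter bound given in the next answer.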

    One final note: it looks obvious, but I'd like to show that, in general, the sum of the d-th powers of the first N natural numbers is Ω(N^(d+1)) (see Wikipedia for Big-Omega notation), that is, it grows no slower than N^(d+1). You can apply the same reasoning to prove that a stronger statement holds, namely that the sum belongs to Θ(N^(d+1)), which combines both Ω and O.
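
    The bounding argument behind that claim, sketched in standard notation (the original answer's derivation isn't shown in this excerpt):

    \[
    \sum_{k=1}^{N} k^{d} \;\le\; N \cdot N^{d} = N^{d+1}
    \qquad\text{and}\qquad
    \sum_{k=1}^{N} k^{d} \;\ge\; \sum_{k=\lceil N/2 \rceil}^{N} k^{d}
    \;\ge\; \frac{N}{2}\left(\frac{N}{2}\right)^{d} = \frac{N^{d+1}}{2^{d+1}},
    \]

    so the sum is both O(N^(d+1)) and Ω(N^(d+1)), hence Θ(N^(d+1)).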


    You are right for all but the last one, which has a tighter bound of O(n^4): note that the last for loop is only executed when j is a multiple of i. There are x / i multiples of i less than or equal to x, and i * i / i = i, so the last loop is only executed for i out of the i * i values of j.
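
    As a quick illustration of that counting argument (a sketch, not part of the original answer), one can count the multiples of i below i*i directly:

    // For each i, count how many j in [0, i*i) satisfy j % i == 0.
    for (int i = 1; i < 10; i++) {
        int multiples = 0;
        for (int j = 0; j < i * i; j++)
            if (j % i == 0)
                multiples++;
        System.out.println("i = " + i + ": " + multiples);   // prints exactly i for every i
    }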

    Note that big-oh gives an upper bound, so i*i vs n*n makes little difference. Strictly speaking, saying they are all O(n^2015) is also correct (because that is a valid upper bound), but it's hardly helpful, so in practice a tight bound is usually used.


    IVlad already gave the correct answer.

    I think what confuses you is the "Big Oh" definition.

  • N^2 is O(N^2)
  • (1/2)N^2 is O(N^2)
  • (1/2)N^2 + c*N + b is also O(N^2), given that c and b are constants independent of N (a worked check is sketched after this list)
  • Check the formal Big-O definition for the details
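
    For reference, the formal definition and a worked check of the third bullet (the constants chosen here are just one valid choice):

    \[
    f(N) = O(g(N)) \iff \exists\, M > 0,\ N_{0} \text{ such that } |f(N)| \le M\,|g(N)| \text{ for all } N \ge N_{0}.
    \]

    For f(N) = (1/2)N^2 + c*N + b, take N_0 = 1 and M = 1/2 + |c| + |b|; then for all N >= 1,

    \[
    \tfrac{1}{2}N^{2} + cN + b \;\le\; \tfrac{1}{2}N^{2} + |c|\,N^{2} + |b|\,N^{2} \;=\; M N^{2}.
    \]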
