Determining complexity for recursive functions (Big O notation)
I have a Computer Science midterm tomorrow and I need help determining the complexity of these recursive functions. I know how to solve simple cases, but I am still trying to learn how to solve these harder cases. These were just a few of the example problems that I could not figure out. Any help would be much appreciated and would greatly help in my studies. Thank you!
int recursiveFun1(int n)
{
    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun1(n-1);
}
int recursiveFun2(int n)
{
    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun2(n-5);
}
int recursiveFun3(int n)
{
    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun3(n/5);
}
void recursiveFun4(int n, int m, int o)
{
    if (n <= 0)
    {
        printf("%d, %d\n", m, o);
    }
    else
    {
        recursiveFun4(n-1, m+1, o);
        recursiveFun4(n-1, m, o+1);
    }
}
int recursiveFun5(int n)
{
    for (int i = 0; i < n; i += 2) {
        // do something
    }

    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun5(n-5);
}
The time complexity, in Big O notation, for each function, in numerical order:

1. O(n), often called linear: the function calls itself n times before reaching the base case.
2. O(n). (Actually it runs on the order of n/5 times, and O(n/5) = O(n).)
3. O(log(n)) (base 5), often called logarithmic. Big O notation and complexity analysis most often use base 2, but the base only changes a constant factor, so the result is the same.
4. O(2^n), or exponential, since each function call calls itself twice unless it has been recursed n times.
5. As for the last function, the for loop takes n/2 iterations since we're increasing by 2, and the recursion reduces n by 5 each call, so there are about n/5 recursive calls. Since the for loop runs inside every recursive call, the total work is roughly (n/5) * (n/2) = n^2/10. Due to asymptotic behavior and the worst-case upper bound that Big O is striving for, we are only interested in the largest term and drop constant factors, so it is O(n^2).
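If you want to double-check that last result empirically, here is a quick illustrative sketch (not part of the original problem; the counter name op_count and the test sizes are made up). It counts the loop iterations across all recursive calls, and the count should grow roughly 4x each time n doubles, as expected for quadratic growth:

#include <stdio.h>

static long op_count = 0; /* counts loop iterations across all recursive calls */

int recursiveFun5(int n)
{
    for (int i = 0; i < n; i += 2) {
        op_count++; /* stands in for the constant work in "do something" */
    }

    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun5(n - 5);
}

int main(void)
{
    for (int n = 100; n <= 800; n *= 2) {
        op_count = 0;
        recursiveFun5(n);
        /* op_count should roughly quadruple when n doubles */
        printf("n = %d, operations = %ld\n", n, op_count);
    }
    return 0;
}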
Good luck on your midterms ;)
For the case where n <= 0, T(n) = O(1). Therefore, the time complexity will depend on the case n > 0, which we consider in the parts below.
1.
T(n) = a + T(n - 1)
where a is some constant.
By induction:
T(n) = n * a + T(0) = n * a + b = O(n)
where a and b are constants.
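For a concrete (purely illustrative) case, unrolling n = 3 shows the same pattern:
T(3) = a + T(2) = 2a + T(1) = 3a + T(0) = 3a + b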
2.
T(n) = a + T(n - 5)
where a is some constant.
By induction:
T(n) = ceil(n / 5) * a + T(k) = ceil(n / 5) * a + b = O(n)
where a and b are constants and k <= 0.
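As an example (just unrolling the recurrence for a concrete value), with n = 12 the arguments go 12 -> 7 -> 2 -> -3, so T(12) = 3a + b, matching ceil(12 / 5) = 3.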
3.
T(n) = a + T(n / 5)
where a is some constant.
By induction:
T(n) = a * log5(n) + T(0) = a * log5(n) + b = O(log n)
where a and b are constants.
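The base of the logarithm does not matter for Big O, since log5(n) = log2(n) / log2(5), i.e. the two differ only by a constant factor. As a concrete check, for n = 125 the argument shrinks as 125 -> 25 -> 5 -> 1 -> 0, roughly log5(125) = 3 divisions plus a constant number of final steps.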
4.
T(n) = a + 2 * T(n - 1)
where a is some constant.
By induction:
T(n) = a + 2a + 4a + ... + 2^(n-1) * a + 2^n * T(0)
= a * (2^n - 1) + b * 2^n
= (a + b) * 2^n - a
= O(2^n)
where a and b are constants.
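Unrolling a small case (purely illustrative) makes the doubling visible:
T(3) = a + 2T(2) = a + 2a + 4T(1) = a + 2a + 4a + 8T(0) = 7a + 8b
which agrees with (a + b) * 2^3 - a.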
5.
T(n) = n / 2 + T(n - 5)
We can prove by induction that T(5k) >= T(5k - d) where d = 0, 1, 2, 3, 4.
Write n = 5m - b (m, b are integers; b = 0, 1, 2, 3, 4), then m = (n + b) / 5:
T(n) = T(5m - b) <= T(5m)
(Actually, to be more rigorous here, a new function T'(n) should be defined such that T'(5r - q) = T(5r) where q = 0, 1, 2, 3, 4. We know T(n) <= T'(n) as proven above. When we know that T'(n) is in O(f), which means there exist constants a, b so that T'(n) <= a * f(n) + b, we can derive that T(n) <= a * f(n) + b and hence T(n) is in O(f). This step is not really necessary, but it makes the reasoning easier when you don't have to deal with the remainder.)
Expanding T(5m):
T(5m) = 5m / 2 + T(5m - 5)
= 5m / 2 + 5(m - 1) / 2 + ... + 5 / 2 + T(0)
= (5m / 2 + 5 / 2) * m / 2 + T(0)
= O(m ^ 2) = O(n ^ 2)
Therefore, T(n) is O(n ^ 2).
One of the best ways I find for approximating the complexity of a recursive algorithm is drawing the recursion tree. Once you have the recursion tree:
Complexity = length of tree from root node to leaf node * number of leaf nodes
The first function will have a length of n and number of leaf nodes 1, so the complexity will be n * 1 = n.
The second function will have a length of n/5 and number of leaf nodes again 1, so the complexity will be n/5 * 1 = n/5. It should be approximated to n.
For the third function, since n is being divided by 5 on every recursive call, the length of the recursion tree will be log5(n), and the number of leaf nodes is again 1, so the complexity will be log5(n) * 1 = log5(n).
For the fourth function, since every node will have two child nodes, the number of leaf nodes will be equal to 2^n and the length of the recursion tree will be n, so the complexity will be (2^n) * n. But since n is insignificant in front of 2^n, it can be ignored and the complexity can be said to be simply 2^n.
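As a small worked check of that approximation (not in the original answer): for n = 3 the tree has depth 3 and 2^3 = 8 leaves, so the estimate gives 3 * 8 = 24, while the exact number of calls is 2^0 + 2^1 + 2^2 + 2^3 = 15. The estimate over-counts by roughly the factor n, but as noted above that factor is insignificant next to 2^n, so both lead to O(2^n).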
For the fifth function, there are two elements introducing the complexity: the complexity introduced by the recursive nature of the function and the complexity introduced by the for loop in each call. Doing the above calculation, the complexity introduced by the recursion will be ~ n and the complexity due to the for loop will be n. The total complexity will be n * n.
Note: This is a quick and dirty way of calculating complexity (nothing official!). Would love to hear feedback on this. Thanks.