Big-O complexity of a recursive solution
I have a simple recursive solution as below:
public int countPaths(int x, int y) {
    if (x == 0 && y == 0) {
        return 0;
    } else if (x == 0) {
        return 1;
    } else if (y == 0) {
        return 1;
    } else {
        // sum the paths coming from the cell above and from the cell to the left
        int count = countPaths(x - 1, y);
        count += countPaths(x, y - 1);
        return count;
    }
}
This is to solve the following problem from the book Cracking the Coding Interview:
Imagine a robot sitting on the upper left corner of an X by Y grid. The robot can only move in two directions: right and down. How many possible paths are there for the robot to go from (0,0) to (X,Y)?
I am trying to ascertain the run time complexity and I believe it is O(x+y). I arrived at this by drawing a recursion tree, for example for x=2 and y=2.
The maximum depth of this tree is (x+y) and the work done at each step is constant, so the maximum work done is (x+y) * c, and hence the run time complexity is O(x+y).
Question 1: Am I correct? I believe the upper bound I have calculated is not tight enough
Question 2: Next, if I were to improve the run time using memoization and hence avoid recomputing sub-problems, how would the run time complexity, as described by Big-O, change?
While it's true that the depth of the tree is O(x+y), there are increasingly many nodes at each layer, and it's the number of nodes that determines complexity, not the depth of the tree.
If you write down recurrence relations for the runtime, you get:
T(0, y) = T(x, 0) = 1
T(x, y) = T(x-1, y) + T(x, y-1) + 1
If you ignore the +1 on the second equation (which can only make the run-time better), you get the same function that your code was computing in the first place, which is choose(x+y, y).
For x=y, this is the central binomial coefficient, which is approximately 4^x/sqrt(pi*x), which for even moderately large values of x is large enough to make the algorithm useless.
With memoisation, you're doing a constant amount of work for each value of x and y, so the complexity is O(xy).
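To make the memoisation concrete, here is a minimal top-down sketch; the memo array, the -1 sentinel and the method names countPathsMemoized and paths are my own illustrative additions, not code from the question (for x, y > 0 it returns the same count as the original method). For a sense of scale: for x = y = 20 the naive version makes on the order of choose(40, 20) ≈ 1.4 * 10^11 calls, while this version fills at most (x+1)(y+1) table entries.

private int[][] memo;

// Hypothetical memoised variant of the question's method: each pair (x, y) is
// computed at most once, so the running time becomes O(xy).
public int countPathsMemoized(int x, int y) {
    memo = new int[x + 1][y + 1];
    for (int[] row : memo) {
        java.util.Arrays.fill(row, -1);   // -1 marks "not computed yet"
    }
    return paths(x, y);
}

private int paths(int x, int y) {
    if (x == 0 || y == 0) {
        return 1;                         // same border base cases as in the question
    }
    if (memo[x][y] == -1) {
        memo[x][y] = paths(x - 1, y) + paths(x, y - 1);   // each sub-problem solved once
    }
    return memo[x][y];
}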
If you evaluate the complexity in terms of the number of additions required to evaluate the count for a given pair (x, y), you get the recurrence

A(x, y) = A(x-1, y) + A(x, y-1) + 1,

with A(x, 0) = A(0, y) = 0.
Setting A(x, y) = P(x, y) - 1, the recurrence becomes

P(x, y) - 1 = P(x-1, y) - 1 + P(x, y-1) - 1 + 1,

or

P(x, y) = P(x-1, y) + P(x, y-1),

with P(x, 0) = P(0, y) = 1, which is the classical Pascal's triangle, and

A(x, y) = (x+y)!/(x! y!) - 1.
You can also work with the number of recursive function calls,

C(x, y) = C(x-1, y) + C(x, y-1) + 2,

with C(0, y) = C(x, 0) = 0. You solve it by setting C(x, y) = 2P(x, y) - 2, and get

C(x, y) = 2(x+y)!/(x! y!) - 2.
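If you want to check these closed forms numerically, you can instrument the question's method with two counters; the counter names, the class and the main method below are illustrative additions of mine, not part of the original code.

class CountPathsInstrumented {
    // Hypothetical instrumented copy of the question's method: 'additions' should
    // end up equal to (x+y)!/(x!y!) - 1 and 'calls' equal to 2(x+y)!/(x!y!) - 2.
    static long additions = 0;
    static long calls = 0;

    static int countPaths(int x, int y) {
        if (x == 0 || y == 0) {
            return 1;           // A(x,0) = A(0,y) = 0 and C(x,0) = C(0,y) = 0: no work here
        }
        calls += 2;             // this invocation makes two further recursive calls
        int count = countPaths(x - 1, y);
        count += countPaths(x, y - 1);
        additions += 1;         // one addition per internal node of the recursion tree
        return count;
    }

    public static void main(String[] args) {
        countPaths(3, 3);
        System.out.println(additions + " additions, " + calls + " calls");
        // prints "19 additions, 38 calls", i.e. 6!/(3!3!) - 1 = 19 and 2*20 - 2 = 38
    }
}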
As regards the asymptotic complexity, whether you count additions or calls makes no difference: there is no really simpler formula than O((x+y)!/(x! y!)).

With memoization, every evaluation (with x, y > 0) costs only one addition or two calls, and assuming constant time for storage and retrieval of a value, the total complexity is the much better O(xy).
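The O(xy) bound is perhaps easiest to see in a bottom-up formulation, where the double loop makes the xy additions explicit. This iterative variant (countPathsBottomUp) is my own sketch rather than code from the question; for x, y > 0 it returns the same count as the original method.

// Hypothetical bottom-up equivalent: fills an (x+1)-by-(y+1) table once,
// performing exactly x*y additions, i.e. O(xy) time and O(xy) space.
static int countPathsBottomUp(int x, int y) {
    int[][] paths = new int[x + 1][y + 1];
    for (int i = 0; i <= x; i++) {
        paths[i][0] = 1;        // only one path along the border row/column
    }
    for (int j = 0; j <= y; j++) {
        paths[0][j] = 1;
    }
    for (int i = 1; i <= x; i++) {
        for (int j = 1; j <= y; j++) {
            paths[i][j] = paths[i - 1][j] + paths[i][j - 1];
        }
    }
    return paths[x][y];
}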
Based on the valuable input from @Anonymous, we know the recurrence relation is:
T(x, y) = T(x-1, y) + T(x, y-1) + 1
Abusing the notation a little (which is fine in Big-O analysis), note that every call with x, y > 0 makes exactly two further recursive calls, and the arguments reach a base case after at most x + y steps, so the recursion tree is a binary tree of depth at most x + y:

T(x, y) <= 1 + 2 + 2^2 + ... + 2^(x+y) = O(2^(x+y))

So an upper bound on the run time complexity is

O(2^n) ; where n = x + y

which, as @Anonymous points out, is not tight: the exact number of calls grows like the binomial coefficient (x+y)!/(x!y!).
With memoization, I get it now (thanks @Anonymous): it should be O(xy).