What is the difference between T(n) (recurrence relations), Big O, and Big Theta?
I am wondering about this for my algorithms class. It seems unclear what the difference is between Big O, Big Theta, and recurrence relations (T(n)). For example: T(n) = 4T(n/3) + O(1)
A recurrence relation is a way of recursively defining a function. For example, the recurrence relation
T(n) = 4T(n / 3) + O(1)
says that the function is defined so that if you want to determine T(n), you can evaluate T(n / 3), multiply it by 4, and add in some term that's O(1). You can think of the recurrence relation as a way of describing how a function is defined without actually coming up with an explicit formula for it.
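To make that concrete, here is a minimal sketch that evaluates the recurrence directly. The constant term 1 standing in for O(1), the base case T(1) = 1, and the use of integer division for n/3 are all assumptions on my part; the recurrence itself doesn't pin any of them down.

```python
def T(n):
    """Evaluate T(n) = 4*T(n // 3) + c, taking c = 1 for the O(1) term.

    Base case T(n) = 1 for n <= 1 is an assumption; the recurrence
    doesn't specify it, and a different base case only changes T(n)
    by a constant factor, not its growth rate.
    """
    if n <= 1:
        return 1
    return 4 * T(n // 3) + 1

print(T(81))  # prints 341: T(3)=5, T(9)=21, T(27)=85, T(81)=341
```

Tabulating a few values like this is a common first step toward guessing a closed form, which you then verify by induction or with the Master Theorem.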
Once you have a recurrence relation, though, you'll usually want to figure out some way to get a sense for what kind of function it describes. Does it describe a linear function? Something quadratic? Something exponential?
Big-O notation and Big-Θ notation are tools for describing the long-term growth rates of functions and for categorizing functions into different groups by their growth rates. Big-O notation is used to give upper bounds on a function, while big-Θ notation is used to give both upper and lower bounds on a function.
For example, using the Master Theorem, we can claim that the recurrence you have above is such that T(n) = O(n^(log_3 4)). By this, we mean that the function that's implicitly described by T(n) grows approximately at the rate of the function n^(log_3 4). We can formalize this by introducing a formal definition of big-O notation.
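You can sanity-check the Master Theorem's answer numerically: if T(n) really grows like n^(log_3 4), then the ratio T(n) / n^(log_3 4) should stay bounded as n grows. This sketch assumes the same concrete choices as before (constant term 1, base case T(1) = 1):

```python
import math

def T(n):
    # T(n) = 4*T(n // 3) + 1, with assumed base case T(1) = 1.
    if n <= 1:
        return 1
    return 4 * T(n // 3) + 1

alpha = math.log(4, 3)  # log_3(4) ≈ 1.2619, the Master Theorem exponent

# Along powers of 3 the ratio settles toward a constant (here 4/3),
# which is what T(n) = Θ(n**alpha) predicts.
for k in range(1, 8):
    n = 3 ** k
    print(n, T(n) / n ** alpha)
```

The ratio converging to a positive constant (rather than blowing up or shrinking to 0) is exactly the behavior that big-Θ describes.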
When we say that T(n) = O(n^(log_3 4)), though, we're not precluding the possibility that T(n) is actually a much smaller function. For example, if T(n) = 5, it's true that T(n) = O(n^(log_3 4)), though it doesn't say much. If we make the stronger claim that T(n) = Θ(n^(log_3 4)), we're asserting that, in the long term, T(n) grows at the same rate as n^(log_3 4).
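This asymmetry between O and Θ can be seen numerically as well. A sketch, using the constant function f(n) = 5 from the example above: the upper-bound check passes easily, but the ratio f(n) / n^(log_3 4) shrinks toward 0, so no positive constant can serve as a lower bound, and the Θ claim fails.

```python
import math

alpha = math.log(4, 3)  # log_3(4)

def f(n):
    return 5  # a constant function

# f(n) = O(n**alpha): with witness constant c = 5, f(n) <= c * n**alpha
# for all n >= 1.
assert all(f(n) <= 5 * n ** alpha for n in range(1, 1000))

# But f(n) is NOT Θ(n**alpha): the ratio tends to 0, so no positive
# lower-bound constant works for large n.
print(f(1000) / 1000 ** alpha)  # already well below 0.01 and shrinking
```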
Hope this helps!