Big O Running Time for Different Data Structures
I've tried to come up with the Big O running time for the following operations on data structures. Are they correct?
Inserting n integers into an initially empty AVL Tree (best case) O(log n)
Inserting n integers into an initially empty AVL Tree (worst case) O(log n)
Inserting n integers into an initially empty binary search tree that does not enforce structure properties (best case) O(log n)
Inserting n integers into an initially empty binary search tree that does not enforce structure properties (worst case) O(n)
Also, an explanation as to why any of them are incorrect would be helpful.
This one in your list is incorrect:

Inserting n integers into an initially empty AVL Tree (best case) O(log n)

You can't access (and copy) n data items in fewer than n operations, because you have to read each item, so O(n) is the bare minimum for walking over n elements.

So my assumption is that you gave the correct O() for a single element (this is a bit off as well, because the best case is only reached on special inputs; your estimates are really the average case, not the best), so for the total operation described I'll multiply each one by O(n):
Inserting n integers into an initially empty AVL Tree (best case): O(n log n)
UPDATE: this is the average; the best time can be lower on special inputs.
Inserting n integers into an initially empty AVL Tree (worst case): O(n log n)
Inserting n integers into an initially empty binary search tree that does not enforce structure properties (best case): O(n log n)
UPDATE: this may depend on the implementation and on the sequence of integers, so the best case is O(n) (see comments).
Inserting n integers into an initially empty binary search tree that does not enforce structure properties (worst case): O(n^2)
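As a worked version of that "multiply by O(n)" step (my own addition, using the standard bound log(n!) = Θ(n log n)), the totals are really sums of per-insert costs:

$$
\sum_{i=1}^{n} \Theta(\log i) = \Theta(\log n!) = \Theta(n \log n)
\qquad
\sum_{i=1}^{n} \Theta(i) = \Theta(n^2)
$$

The first sum covers the cases where the tree holding i elements has height Θ(log i); the second covers the degenerate case where an insert may have to walk past all i elements already present.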
I am assuming here that you need the total time for inserting all the elements:
(1) Best case for an AVL tree: you never need to go below the root [i.e., all the elements are equal to the root], so it will be O(n) in total [you never need to descend more than one step, regardless of the tree's size]. O(1) per element.
(2) Worst case for an AVL tree: inserting n integers into an AVL tree is O(n log n). Each step is O(log T), where T is the size of the tree at that moment: O(log 1) + O(log 2) + ... + O(log n) = O(log 1 + log 2 + ... + log n) = O(log(1*2*...*n)) = O(n log n). So O(n log n) in total, O(log n) per element.
(3) With no structure properties enforced, best case: still O(n), for the same reason as (1). In the best case, every element you add is exactly equal to the root, so you never need to go down into the tree, no matter what its size is. O(1) per element.
(4) With no structure properties enforced, worst case: as said in other answers, finding the place for each element is O(n), so overall you get a worst-case time complexity of O(n^2). O(n) per element. (A small sketch after this list counts comparisons for cases (3) and (4).)
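Here is a minimal sketch (my own code, not from this answer) of a plain, unbalanced BST insert with a comparison counter, to make (3) and (4) concrete. It assumes duplicates are simply ignored, which matches the "all elements equal the root" best case described in (3):

```python
# Plain (unbalanced) BST insert that counts key comparisons.
# Duplicates are ignored, matching the best-case scenario in (3).

class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key, counter):
    """Insert key iteratively; counter[0] accumulates key comparisons."""
    if root is None:
        return Node(key)
    node = root
    while True:
        counter[0] += 1
        if key == node.key:              # duplicate found at once: O(1) step
            return root
        elif key < node.key:
            if node.left is None:
                node.left = Node(key)
                return root
            node = node.left
        else:
            if node.right is None:
                node.right = Node(key)
                return root
            node = node.right

def total_comparisons(values):
    counter = [0]
    root = None
    for v in values:
        root = insert(root, v, counter)
    return counter[0]

n = 1000
print(total_comparisons([7] * n))         # case (3): ~n comparisons, O(1) per element
print(total_comparisons(list(range(n))))  # case (4): ~n^2/2 comparisons, O(n) per element
```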
How can inserting n integers result in a big O of O(log n)? That doesn't make any sense. Surely reading the integers themselves would take at least O(n).
So for the unbalanced BST example, the worst case is where you insert a sorted list of numbers like 1, 2, 3, 4. Inserting 1 takes 0 time. Inserting 2 takes ~1 step. Inserting 3 takes ~2 steps. And so on. This amounts to 1 + 2 + 3 + ... + n = O(n^2).
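Spelling that sum out with its closed form (my notation): inserting the i-th element walks past the i-1 elements already in the chain, so

$$
\sum_{i=1}^{n} (i-1) \;=\; \frac{n(n-1)}{2} \;=\; O(n^2).
$$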
Similarly, in the best-case scenario, each subsequent insert takes log(i) time, so the total running time is log(1) + log(2) + log(3) + ... + log(n). What this evaluates to is not immediately obvious, but if you know a bit of calculus, you can see that this is (almost) the trapezoidal-rule approximation to the integral of log x from 1 to n. This is approximately n log n - n = O(n log n).
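For completeness, here is that approximation spelled out (using the natural logarithm; the base only changes a constant factor):

$$
\sum_{i=1}^{n} \ln i \;=\; \ln(n!) \;\approx\; \int_{1}^{n} \ln x \, dx \;=\; n \ln n - n + 1 \;=\; O(n \log n).
$$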
I'm sure you can do similar analysis for the AVL trees.
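In that spirit, here is a rough AVL sketch (my own code; the names are illustrative, not from any answer above) that counts key comparisons during insertion. Feeding it sorted input, the worst case for a plain BST, keeps the total near n*log2(n):

```python
# A hedged sketch of AVL insertion with a comparison counter, to check the
# O(n log n) claim empirically.  Duplicate keys are sent to the right subtree.
import math

class AVLNode:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None
        self.height = 1

def height(node):
    return node.height if node else 0

def update(node):
    node.height = 1 + max(height(node.left), height(node.right))

def balance(node):
    return height(node.left) - height(node.right)

def rotate_right(y):
    x = y.left
    y.left = x.right
    x.right = y
    update(y)
    update(x)
    return x

def rotate_left(x):
    y = x.right
    x.right = y.left
    y.left = x
    update(x)
    update(y)
    return y

def avl_insert(node, key, counter):
    """Insert key into the AVL subtree rooted at node; counter[0] counts comparisons."""
    if node is None:
        return AVLNode(key)
    counter[0] += 1
    if key < node.key:
        node.left = avl_insert(node.left, key, counter)
    else:
        node.right = avl_insert(node.right, key, counter)
    update(node)
    b = balance(node)
    if b > 1 and key < node.left.key:       # left-left: single right rotation
        return rotate_right(node)
    if b > 1:                               # left-right: double rotation
        node.left = rotate_left(node.left)
        return rotate_right(node)
    if b < -1 and key >= node.right.key:    # right-right: single left rotation
        return rotate_left(node)
    if b < -1:                              # right-left: double rotation
        node.right = rotate_right(node.right)
        return rotate_left(node)
    return node

counter = [0]
root = None
n = 1024
for v in range(n):          # sorted input: worst case for the plain BST above
    root = avl_insert(root, v, counter)

# The comparison total stays within a small constant of n*log2(n) (= 10240 here),
# far below the ~n^2/2 (about 524,000) a plain BST needs for the same sorted input.
print(counter[0], round(n * math.log2(n)))
```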