Heap Sort Space Complexity
I was just reading through Skiena's The Algorithm Design Manual, in particular the section on Heap Sort. He states that:
It is an in-place sort, meaning it uses no extra memory over the array containing the elements to be sorted
The algorithm in the book looks like this:
heapsort(item_type s[], int n)
{
    int i;
    priority_queue q;

    make_heap(&q, s, n);

    for (i = 0; i < n; i++)
        s[i] = extract_min(&q);
}
To me, it looks like in addition to the input array s
of items, we're creating a heap data structure in the priority_queue
variable. Doesn't this mean that the space complexity is O(n), and that we need approximately double the memory, not "no extra memory"?
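
If the priority_queue here is backed by its own array, the extra storage would look something like the sketch below (this is a guess at the layout; PQ_SIZE and the field names are assumptions, not the book's exact definition):

#define PQ_SIZE 100000          /* assumed capacity constant */
typedef int item_type;          /* stand-in for the book's item_type */

/* A priority queue that owns its own storage: make_heap() would copy all
   n items from s[] into q[], costing Theta(n) extra memory on top of the
   input array. */
typedef struct {
    item_type q[PQ_SIZE + 1];   /* separate array holding the heap */
    int n;                      /* number of items currently in the queue */
} priority_queue;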
The implementation of heapsort that you've described above sure doesn't look like it works in constant space for precisely the reason that you're worried about.
However, that doesn't mean it's not possible to implement heapsort in O(1) auxiliary space. Typically, an implementation of heapsort reorders the elements of the array so that the array itself implicitly stores a binary max-heap.

One nifty detail about max-heaps is that you dequeue by swapping the root node (the first array element) with the rightmost leaf (the last array element that's still part of the heap) and then bubbling that element downward. This means you can repeatedly dequeue from the max-heap by swapping the first array element with the rightmost slot of the array that hasn't yet been filled in, then bubbling the swapped element back into place. That requires only O(1) auxiliary storage space and is the typical way heapsort is implemented.
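
Here is a minimal sketch of that in-place scheme in C (the names heapsort_inplace, sift_down, and swap are mine for illustration, not taken from the book or any library):

#include <stddef.h>

/* Swap two array elements. */
static void swap(int *a, int *b)
{
    int tmp = *a;
    *a = *b;
    *b = tmp;
}

/* Restore the max-heap property by bubbling s[i] down within s[0..n-1]. */
static void sift_down(int s[], size_t n, size_t i)
{
    for (;;) {
        size_t largest = i;
        size_t left = 2 * i + 1;
        size_t right = 2 * i + 2;

        if (left < n && s[left] > s[largest])
            largest = left;
        if (right < n && s[right] > s[largest])
            largest = right;
        if (largest == i)
            return;

        swap(&s[i], &s[largest]);
        i = largest;
    }
}

/* In-place heapsort: builds a max-heap inside s[] itself, then repeatedly
   swaps the root (maximum) into the last unfilled slot and re-heapifies.
   Uses only O(1) auxiliary space. */
void heapsort_inplace(int s[], size_t n)
{
    if (n < 2)
        return;

    /* Build the heap bottom-up, starting from the last internal node. */
    for (size_t i = n / 2; i-- > 0; )
        sift_down(s, n, i);

    /* Repeatedly move the maximum into its final position. */
    for (size_t end = n - 1; end > 0; end--) {
        swap(&s[0], &s[end]);   /* root goes to the sorted suffix */
        sift_down(s, end, 0);   /* restore heap property in s[0..end-1] */
    }
}

The heap lives inside s[] itself, so the only extra storage is a few indices and the temporary used by swap, which is why this variant counts as in-place.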
If you're curious, I have my own implementation of heapsort on my personal site, which shows off how to do this.