Space complexity vs. time complexity trade-offs

I've been studying some sorting algorithms and have come across an inverse relationship between time and space complexity. For example, an algorithm like selection sort takes O(n^2) time but only requires constant space, as it can be done in place. An algorithm like merge sort, however, has O(n log n) time complexity but requires O(n) space.
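Here is a rough merge sort sketch I put together to make that concrete; the `merged` list is the O(n) auxiliary space that selection sort never needs:

```python
def merge_sort(a):
    """Sort a list, allocating an O(n) buffer during each merge."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    merged = []  # the O(n) auxiliary space
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])   # append whichever half has leftovers
    merged.extend(right[j:])
    return merged
```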

My questions are:

  • Is there a theorem or law that relates time and space complexity trade-offs to one another? Is this phenomenon only present in sorting algorithms, or does the trade-off persist in other problems as well?

  • Is it always a good idea to trade space complexity for time complexity, given the huge increase in modern RAM sizes? Or are there times when decreasing the time complexity makes the space complexity prohibitively large?


  • On your first question: no, there is not. This comes down to a case-by-case analysis of each algorithm; there is no mathematical formula that will calculate the space-time trade-off for you. And yes, the trade-off persists in other problems too. For example, consider maintaining running sums over an array, with updates in between. If you store the plain array in O(n) memory, retrieving the sum over a range costs O(n) time. But if you keep a segment tree, with O(4n) = O(n) space, you get O(log n) updates and range queries (see the segment-tree sketch at the end of this answer).

    No, it is not. The extra memory we have today won't automatically compensate for huge space complexity; performance ends up constrained by memory access costs, among other things.

    In some cases, a decrease in time complexity can only be achieved at the cost of relatively high space complexity. For example, if data is stored uncompressed, it takes more space, but access is faster than if the data were stored compressed (compression reduces the space the data takes, but you pay the cost of running the decompression algorithm on every access). Which solution you want to go for depends on the scenario you face.
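    As a toy illustration of that trade-off, here is a sketch using Python's standard zlib module (the printed sizes will vary with the input):

    ```python
    import zlib

    data = b"space-time trade-off " * 10_000

    # Storing the data compressed saves space...
    compressed = zlib.compress(data)
    print(len(data), len(compressed))  # compressed is far smaller here

    # ...but every access now pays the decompression cost.
    restored = zlib.decompress(compressed)
    assert restored == data
    ```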
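    Returning to the running-sum example from the first point, here is a minimal sketch, assuming a standard iterative segment tree (the class and method names are my own):

    ```python
    class SegmentTree:
        """Range-sum segment tree: O(n) space, O(log n) update and query."""

        def __init__(self, data):
            self.n = len(data)
            # The iterative form needs ~2n nodes (the recursive form typically reserves ~4n).
            self.tree = [0] * (2 * self.n)
            self.tree[self.n:] = data                # leaves hold the array
            for i in range(self.n - 1, 0, -1):       # build internal nodes
                self.tree[i] = self.tree[2 * i] + self.tree[2 * i + 1]

        def update(self, i, value):                  # O(log n)
            i += self.n
            self.tree[i] = value
            while i > 1:                             # recompute ancestors
                i //= 2
                self.tree[i] = self.tree[2 * i] + self.tree[2 * i + 1]

        def query(self, lo, hi):                     # sum over [lo, hi), O(log n)
            total = 0
            lo += self.n
            hi += self.n
            while lo < hi:
                if lo & 1:                           # lo is a right child
                    total += self.tree[lo]
                    lo += 1
                if hi & 1:                           # hi is a right child
                    hi -= 1
                    total += self.tree[hi]
                lo //= 2
                hi //= 2
            return total

    st = SegmentTree([1, 2, 3, 4])
    print(st.query(1, 3))   # 5
    st.update(2, 10)
    print(st.query(1, 3))   # 12
    ```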


    As far as I know, there is no definitive law relating time and space complexity trade-offs. There is, however, a tendency for all sorts of algorithmic problems to have multiple solutions, some requiring less time at the expense of space, and others requiring more space at the expense of time.

    When trying to optimize algorithms, it is very often the case that using more space, for example in the form of pre-calculations, leads to better time-wise performance. Studying time and space complexity can be helpful in observing this tendency, but it can also be misleading. To take your merge sort example: you can actually implement a merge sort that requires only a constant amount of memory; it can even be an in-place sort. While the space complexity is reduced and the time complexity stays the same, there is a performance hit due to large constant factors or lower-order terms (which are hidden by the O(n log n) notation).

    The most common optimization for speed is the use of lookup tables, sacrificing some amount of memory to avoid recalculation. Another example is data compression: consider the numerous image and audio file formats, each with its own benefits and drawbacks.
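    A toy example of such a lookup table (the names here are my own):

    ```python
    import math

    # Precompute sin() for whole degrees once: 360 floats of memory...
    SIN_TABLE = [math.sin(math.radians(d)) for d in range(360)]

    def fast_sin(degrees):
        # ...so each later call is an O(1) table lookup, not a trig evaluation.
        return SIN_TABLE[degrees % 360]
    ```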

    Regarding your second question: there are of course situations where a potential increase in performance would lead to an explosion of memory requirements. It isn't hard to find examples in video games, since they often push the limits of available computational resources.


    Answers:

  • No, there is no such theorem, not even for sorting. For example, heapsort runs in O(n log n) time with O(1) space complexity (see the heapsort sketch after this list).

  • There are a variety of techniques that explicitly trade space for time, memoization for example (sketched after this list). They are not free, and they are not always better. Remember, efficient memory use is not just about saving memory: it has benefits ranging from better locality of reference to less data transmitted over the wire. As an example, look at https://github.com/google/snappy for how, in real-world systems, people choose to use more CPU to save memory because it makes things faster.
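    A minimal heapsort sketch for the first point (my own illustration of the standard sift-down formulation, sorting in place with O(1) extra space):

    ```python
    def heapsort(a):
        """In-place heapsort: O(n log n) time, O(1) extra space."""
        def sift_down(root, end):
            # Push a[root] down until the max-heap property holds up to a[end].
            while 2 * root + 1 <= end:
                child = 2 * root + 1
                if child + 1 <= end and a[child] < a[child + 1]:
                    child += 1                   # pick the larger child
                if a[root] < a[child]:
                    a[root], a[child] = a[child], a[root]
                    root = child
                else:
                    return

        n = len(a)
        for start in range(n // 2 - 1, -1, -1):  # heapify the whole array
            sift_down(start, n - 1)
        for end in range(n - 1, 0, -1):          # move the current max to the back
            a[0], a[end] = a[end], a[0]
            sift_down(0, end - 1)
    ```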
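    And a memoization sketch for the second point, using Python's functools.lru_cache:

    ```python
    from functools import lru_cache

    @lru_cache(maxsize=None)  # trade O(n) cache memory...
    def fib(n):
        return n if n < 2 else fib(n - 1) + fib(n - 2)

    print(fib(200))  # ...for a drop from exponential to linear time
    ```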
