Are memory leaks ever ok?

Is it ever acceptable to have a memory leak in your C or C++ application?

What if you allocate some memory and use it until the very last line of code in your application (for example, in a global object's destructor)? As long as the memory consumption doesn't grow over time, is it OK to trust the OS to free the memory for you when your application terminates (on Windows, Mac, and Linux)? Would you even consider this a real memory leak if the memory was being used continuously until it was freed by the OS?

What if a third-party library forced this situation on you? Would you refuse to use that third-party library, no matter how great it otherwise might be?

I see only one practical disadvantage: these benign leaks will show up as false positives in memory-leak detection tools.
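
To make the pattern concrete, here is a minimal sketch of the kind of "benign leak" I mean. The buffer name and size are made up for illustration, and __lsan_ignore_object() is specific to Clang/GCC's LeakSanitizer, shown only as one possible way to silence the false positive:

#include <stdlib.h>
#include <sanitizer/lsan_interface.h>  /* LeakSanitizer only; not standard C */

static char *g_buffer;  /* allocated once, used for the whole run, never freed */

int main(void){
    g_buffer = malloc(4096);         /* size is illustrative */
    __lsan_ignore_object(g_buffer);  /* tell the leak checker this one is deliberate */
    /* ... use g_buffer until the very last line of the program ... */
    return 0;  /* the OS reclaims the block at process exit */
}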


No.

As professionals, the question we should be asking ourselves is not "Is it ever OK to do this?" but rather "Is there ever a good reason to do this?" And "hunting down that memory leak is a pain" isn't a good reason.

I like to keep things simple. And the simple rule is that my program should have no memory leaks.

That makes my life simple, too. If I detect a memory leak, I eliminate it, rather than run through some elaborate decision tree structure to determine whether it's an "acceptable" memory leak.

It's similar to compiler warnings – will the warning be fatal to my particular application? Maybe not.

But it's ultimately a matter of professional discipline. Tolerating compiler warnings, like tolerating memory leaks, is a bad habit that will ultimately bite me in the rear.

To take things to an extreme, would it ever be acceptable for a surgeon to leave some piece of operating equipment inside a patient?

Although it is possible that a circumstance could arise where the cost/risk of removing that piece of equipment exceeds the cost/risk of leaving it in, and there could be circumstances where it was harmless, if I saw this question posted on SurgeonOverflow.com and saw any answer other than "no," it would seriously undermine my confidence in the medical profession.

If a third-party library forced this situation on me, it would lead me to seriously question the overall quality of the library. It would be as if I test-drove a car and found a couple of loose washers and nuts in one of the cupholders – not a big deal in itself, but it betrays a lack of commitment to quality, so I would consider alternatives.


I don't consider it to be a memory leak unless the amount of memory being "used" keeps growing. Having some unreleased memory, while not ideal, is not a big problem unless the memory footprint keeps growing over time.


Let's get our definitions correct first. A memory leak occurs when memory is dynamically allocated, e.g. with malloc(), and all references to that memory are lost without a corresponding free(). An easy way to make one is like this:

#include <stdlib.h>
#define BLK ((size_t)1024)

int main(void){
    while(1){
        void * vp = malloc(BLK);  /* the previous block's address is lost here */
    }
}

Note that every time around the while(1) loop, 1024 (+ overhead) bytes are allocated and the new address is assigned to vp; there's no remaining pointer to the previously malloc'ed blocks. This program is guaranteed to run until the heap is exhausted, and there's no way to recover any of the malloc'ed memory. Memory is "leaking" out of the heap, never to be seen again.
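
By contrast, the same loop stops leaking the moment each block is freed before its pointer is overwritten – a minimal sketch:

#include <stdlib.h>
#define BLK ((size_t)1024)

int main(void){
    while(1){
        void * vp = malloc(BLK);
        /* ... use the block ... */
        free(vp);  /* the block is returned to the heap before its address is lost */
    }
}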

What you're describing, though, sounds like

#include <stdlib.h>
#define LOTS ((size_t)64 * 1024 * 1024)  /* illustrative size */

int main(void){
    void * vp = malloc(LOTS);
    // Go do something useful with vp
    return 0;  // the OS reclaims the allocation when the process exits
}

You allocate the memory and work with it until the program ends. This is not a memory leak; it doesn't impair the program, and all the memory is scavenged up automagically when the process terminates.
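
Leak checkers generally recognize this distinction, too. Valgrind, for instance, reports memory that is still pointed to at exit as "still reachable" rather than "definitely lost" (the program name here is a placeholder):

valgrind --leak-check=full ./myprogram

Blocks in the "still reachable" category are exactly the ones the question is about: never freed, but never orphaned either.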

Generally, though, you should avoid memory leaks. First, because – like altitude above you and fuel back at the hangar – memory that has leaked and can't be recovered is useless; second, it's a lot easier to write leak-free code from the start than it is to track down a leak later.
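
If you do keep a program-lifetime allocation, one cheap way to stay disciplined is to register its cleanup right where you allocate it. A minimal sketch using the standard atexit() hook (the names and size are made up for illustration):

#include <stdlib.h>

static void *g_pool;  /* hypothetical program-lifetime allocation */

static void free_pool(void){
    free(g_pool);  /* runs automatically on normal exit */
}

int main(void){
    g_pool = malloc(1024 * 1024);  /* size is illustrative */
    atexit(free_pool);             /* cleanup lives right next to the allocation */
    /* ... use g_pool for the life of the program ... */
    return 0;
}

This keeps leak detectors quiet and costs one function call at exit.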
