Large Object Heap Fragmentation, Issues with arrays
I am writing an analysis application in C# that has to deal with a lot of memory. I use ANTS Memory Profiler 7.4 to optimize my memory management. While doing so, I realized that all of the double[,] arrays I use (and I do need them) are placed on the LOH, although the largest of these arrays is about 24,000 bytes. As far as I know, objects should not be put there below 85,000 bytes.

Since I have several thousand instances of these double[,] arrays, I end up with a lot of memory fragmentation: about 25% of my total memory usage is free memory that I cannot use. Some of the arrays stored on the LOH are only 1,036 bytes in size. When I have to perform larger analyses, I sometimes get an OutOfMemoryException because of the massive memory loss caused by LOH fragmentation.
Does anyone know why this is happening, even though these arrays should not be large objects by definition?
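A minimal sketch (not from the original post) for checking where a given array landed: objects allocated directly on the LOH are logically part of generation 2, so GC.GetGeneration reports 2 for them immediately after allocation, while small-heap allocations start in generation 0. The array dimensions below are hypothetical, chosen to roughly match the ~1 KB size mentioned above.

```csharp
using System;

class LohCheck
{
    static void Main()
    {
        Console.WriteLine($"64-bit process: {Environment.Is64BitProcess}");

        // Hypothetical array roughly the size mentioned in the question (~1 KB).
        var small = new double[16, 8]; // 128 doubles, ~1,056 bytes including overhead

        // A freshly allocated object reporting generation 2 here was
        // placed directly on the LOH; generation 0 means small object heap.
        Console.WriteLine($"double[16,8] generation: {GC.GetGeneration(small)}");
    }
}
```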
The threshold size for putting arrays of doubles on the LOH is much lower than for other types. The reason is that objects on the LOH are always 64-bit aligned, and doubles benefit greatly from 64-bit alignment.
Note that this only affects programs running as 32-bit processes. In 64-bit processes, objects are always aligned on a 64-bit boundary anyway, so this LOH heuristic is not used.
The threshold size is 1000 doubles (about 8,000 bytes).
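A small sketch, assuming a 32-bit .NET Framework process, that makes this threshold visible: an array of 999 doubles should report generation 0 right after allocation, while an array of 1000 doubles should report generation 2, meaning it was allocated on the LOH.

```csharp
using System;

class DoubleArrayThreshold
{
    static void Main()
    {
        // On the 32-bit CLR, double[] arrays with 1000 or more elements
        // go straight to the Large Object Heap.
        var below = new double[999];
        var atThreshold = new double[1000];

        // Generation 0 => small object heap; generation 2 immediately
        // after allocation => Large Object Heap.
        Console.WriteLine($"double[999]  -> generation {GC.GetGeneration(below)}");
        Console.WriteLine($"double[1000] -> generation {GC.GetGeneration(atThreshold)}");
    }
}
```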
Also see https://connect.microsoft.com/VisualStudio/feedback/details/266330/