bad_alloc in C++ with 200 GB of memory available
I'm new to C++ and I'm studying compressive sensing, so I need to work with huge matrices. MATLAB is too slow for this, so I programmed my algorithm in C++.
The thing is that I store big arrays (around 100 MB-1 GB each), roughly 20 of them. Everything works fine while the process uses about 30 GB of memory, but as soon as it needs more than 40 GB it just stops. I think it's a memory problem. I tested it on Linux and Windows (64-bit OS, 64-bit MinGW compilers, 200 GB RAM, Intel Xeon). Is there any limitation?
size_t tm = n * m * l;
double *x = new double[tm];
I use around 20 arrays like this one; n, m ≈ 1000 and l ≈ 30 are typical sizes.
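For context, a minimal, self-contained version of this allocation pattern might look like the sketch below; the concrete values of n, m and l are placeholders taken from the sizes quoted above, and the size is computed in size_t so the multiplication stays in 64-bit arithmetic.

#include <cstddef>
#include <iostream>
#include <new>

int main() {
    // Placeholder sizes from the question: n, m ~ 1000, l ~ 30.
    const std::size_t n = 1000, m = 1000, l = 30;

    // Element count computed in size_t (64-bit on a 64-bit platform).
    const std::size_t tm = n * m * l;

    try {
        double *x = new double[tm];                       // ~240 MB here
        std::cout << "allocated " << tm * sizeof(double) << " bytes\n";
        delete[] x;
    } catch (const std::bad_alloc &) {
        std::cerr << "allocation of " << tm * sizeof(double)
                  << " bytes threw std::bad_alloc\n";
    }
}

With these example sizes each array is roughly 240 MB (1000 × 1000 × 30 × 8 bytes), so the problem only shows up once many such arrays are live at the same time.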
Thank you
20 arrays and a failure at around 40 GB of total memory use: that suggests the program breaks when a single array exceeds 2 GB. This should not happen; in a 64-bit address space, a 64-bit size_t should be used for object sizes. It appears that MinGW incorrectly uses a 31-bit size (i.e. it loses a bit to the sign as well).
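One way to test that hypothesis in isolation (this is a sketch of my own, not code from the question) is to attempt a single allocation just over 2 GB and see whether it throws:

#include <cstddef>
#include <iostream>
#include <new>

int main() {
    std::cout << "size_t is " << 8 * sizeof(std::size_t) << " bits\n";

    // One array just over 2 GB (2^31 bytes) of doubles. If the runtime
    // truncates allocation sizes to 31 bits, this throws std::bad_alloc
    // even though far more RAM is available.
    const std::size_t count = ((std::size_t)1 << 31) / sizeof(double) + 1;
    try {
        double *p = new double[count];
        p[count - 1] = 1.0;                // touch the far end
        std::cout << "a single >2 GB allocation works\n";
        delete[] p;
    } catch (const std::bad_alloc &) {
        std::cout << "a single >2 GB allocation fails\n";
    }
}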
I don't know exactly how you allocate memory, but this is perhaps fixable by bypassing the broken allocation routine and going straight to the OS allocator. E.g. on Windows you could call VirtualAlloc (skip HeapAlloc; it's not designed for such large allocations).
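A minimal sketch of what that could look like on Windows, assuming you just need one large flat array of doubles (the 3 GB figure is only an example):

// Windows-only sketch: get ~3 GB straight from the OS with VirtualAlloc,
// bypassing the C++ runtime's heap entirely. Error handling is minimal.
#include <windows.h>
#include <cstdio>

int main() {
    const SIZE_T bytes = 3ULL * 1024 * 1024 * 1024;   // 3 GB
    void *mem = VirtualAlloc(nullptr, bytes,
                             MEM_RESERVE | MEM_COMMIT, PAGE_READWRITE);
    if (mem == nullptr) {
        std::printf("VirtualAlloc failed, error %lu\n", GetLastError());
        return 1;
    }

    double *x = static_cast<double *>(mem);   // use it as a flat double array
    x[0] = 1.0;
    x[bytes / sizeof(double) - 1] = 2.0;      // touch the last element too

    VirtualFree(mem, 0, MEM_RELEASE);
    return 0;
}

On Linux, the direct-to-OS equivalent would be mmap with anonymous memory.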