Big array gives segmentation fault in C
I am really new to C, so I am sorry if this is an absolute beginner question, but I am getting a segmentation fault when building a large array. The relevant bits of what I am doing are:
unsigned long long ust_limit;
unsigned long long arr_size;
/* ust_limit gets value around here ... */
arr_size = ((ust_limit + 1) / 2) - 1;
unsigned long long numbs[(int)arr_size];
This works for some values of ust_limit, but when it gets above approximately 4,000,000 a segmentation fault occurs. What I want is to detect a possible segfault and fail gracefully. How can I know which values will cause a segmentation fault? And is this platform dependent?
You are most likely getting a stack overflow, since you are creating a very large array on the stack. To avoid this, allocate the memory dynamically:
unsigned long long *numbs = malloc(arr_size * sizeof(unsigned long long));
Later, when you are finished with the array, free it again:
free(numbs);
You are storing the array in the stack frame, whose size is limited. Use malloc instead:
unsigned long long *numbs = malloc(arr_size * sizeof(unsigned long long));
// don't forget to free after use
free(numbs);
You're consuming too much stack. The limit is platform dependent: the exact value depends on the OS, and on some operating systems it can be raised to some extent. For large amounts of memory, you should be using the heap, with malloc and/or calloc (and free).