Big array gives segmentation fault in C

I am really new to C, so I am sorry if this is an absolute beginner question, but I am getting a segmentation fault when building a large array. The relevant bits of what I am doing are:

unsigned long long ust_limit;
unsigned long long arr_size;

/* ust_limit gets value around here ... */

arr_size = ((ust_limit + 1) / 2) - 1;
unsigned long long numbs[(int)arr_size];

This works for some values of ust_limit, but when it gets above approximately 4,000,000 a segmentation fault occurs. What I want is to detect a possible segfault and fail gracefully. How can I know which values would cause a segmentation fault? And is this platform dependent?


You are most likely getting a stack overflow, since you are creating a very large array on the stack. To avoid this, allocate the memory dynamically:

unsigned long long *numbs = malloc(arr_size * sizeof(unsigned long long));

Later, when you are finished with the array, free it:

free(numbs);


You're consuming too much stack. The limit is platform dependent.

The exact limit depends on the OS, and on some operating systems it can be changed to some extent.

For large amounts of memory, you should be using the heap with malloc and/or calloc (and free).
