Segmentation fault on large array sizes

You’re probably just getting a stack overflow here. The array is too big to fit in your program’s stack region; the stack size limit for user-space code is typically 8 MiB on mainstream Linux and macOS systems and 1 MiB on Windows. (Normal C++ implementations use the call stack for automatic storage, i.e. non-static local variables, including arrays. This makes deallocating them happen for free when functions return or an exception propagates through them.)
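For instance, a local (automatic) array like the one below is a minimal sketch of the failure mode; the exact threshold depends on your OS and how deep the call stack already is:

void f() {
    int big[10000000];   // ~40 MB of automatic storage, far past any default stack limit
    big[0] = 1;          // touching it typically faults: stack overflow, reported as a segfault
}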

If you dynamically allocate the array instead, you should be fine, assuming your machine has enough memory.

int* array = new int[1000000];    // may throw std::bad_alloc

But remember that this will require you to delete[] the array manually to avoid memory leaks, even if your function exits via an exception. Manual new/delete is strongly discouraged in modern C++; prefer RAII.
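Here is a minimal sketch of that hazard, plus the usual RAII fix with std::unique_ptr (doWork is a hypothetical function that may throw):

#include <memory>

void doWork(int*);                   // hypothetical; may throw

void leaky() {
    int* array = new int[1000000];
    doWork(array);                   // if this throws, the delete[] below is skipped: leak
    delete[] array;
}

void safe() {
    auto array = std::make_unique<int[]>(1000000);  // elements zero-initialized
    doWork(array.get());             // on a throw, unique_ptr's destructor still frees the array
}                                    // freed automatically on normal return, too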


A better solution would be to use std::vector<int> array (cppreference). You can reserve space for 1000000 elements if you know how large it will grow. Or construct it at that size up front, which value-initializes the elements (i.e. zeroes the memory, unlike a plain C-style array declared with no initializer): std::vector<int> array(1000000).
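A short sketch contrasting the two options (variable names are just for illustration); note that the vector's buffer lives on the heap either way, so the stack limit is no longer a concern:

#include <vector>

void example() {
    std::vector<int> grown;
    grown.reserve(1000000);          // capacity for 1000000 ints; size() is still 0
    grown.push_back(42);             // no reallocation until size() exceeds the reserved capacity

    std::vector<int> sized(1000000); // size() == 1000000, every element zero-initialized
    sized[0] = 42;                   // elements are usable immediately
}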

When the std::vector object goes out of scope, its destructor will deallocate the storage for you, even if the scope is exited because an exception thrown in a callee is caught further up the stack.
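For example (processThatMightThrow is a hypothetical function that may throw):

#include <vector>

void processThatMightThrow(std::vector<int>&);  // hypothetical

void worker() {
    std::vector<int> array(1000000);
    processThatMightThrow(array);    // if this throws, array's destructor still runs
}                                    // storage freed here on both normal and exceptional exit

void caller() {
    try {
        worker();
    } catch (...) {
        // by the time control reaches here, the vector's memory has already been freed
    }
}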
