Re: Max size of data in C++ prog?
On Sun, 19 Dec 2004 12:32:12 +1100 Sam Watkins <swatkins@fastmail.fm> wrote:
there are two different data segments for a C or C++ program:
the stack and the heap.
Stack space is limited under Linux (commonly to 8 MB by default; see `ulimit -s`).
If you declare an array like this inside a function:
char c[10*1024*1024];
and write to all of it, it will overflow the stack and cause a segfault on Linux.
That was my problem.
The heap can get much bigger, you can malloc as much as you want on the heap
(up to the limits of VM and process address space) and it won't segfault, e.g.
size_t size = 100*1024*1024;
char *c = malloc(size);
size_t i;
for (i = 0; i < size; ++i)
    c[i] = 'x';
free(c);
Thanks, allocating space with malloc worked great. I get my character
array with 68000000 bytes (2000 x 34000) with no problems.
you need to:
#include <stdlib.h>
to use malloc (<malloc.h> also declares it on Linux, but <stdlib.h> is the standard header).
malloc.h must have been included by one of my other include files, as I
did not include it directly.
C++ allocates data on the heap when you use "new", and the STL containers
allocate their elements on the heap. For example these are ok:
char *c = new char[100*1024*1024];
// ...
delete[] c;
std::vector<int> v;
int i;
for (i = 0; i < 10000000; ++i)
    v.push_back(i);
Vectors are single-dimensional, I believe, so they won't work in this
case, but I will remember them for the future.
As Gregory pointed out (and I had forgotten to mention), global and static
variables are not on the stack either; they live in the program's data/BSS
segment, so they avoid the stack limit too.
I tried using a global array, but I got other errors about the
subscripts. I don't remember the errors offhand, and I didn't worry
about it, since I got things working with dynamic allocation.
Thanks, also, to Michael Marsh for pointing out that I really should
check the return value of malloc to make sure that I have a valid pointer.
--
Marc Shapiro
mshapiro_42@yahoo.com