
Re: Max size of data in C++ prog?



On Sun, 19 Dec 2004 12:32:12 +1100, Sam Watkins <swatkins@fastmail.fm> wrote:
> The heap can get much bigger, you can malloc as much as you want on the heap
> (up to the limits of VM and process address space) and it won't segfault, e.g.
> 
>   size_t size = 100*1024*1024;
>   char *c = malloc(size);
>   size_t i;
>   for (i = 0; i < size; ++i)
>         c[i] = 'x';
>   free(c);

That "up to the limits of..." is important, and the code above isn't
safe as written.  You should always check the return value of *alloc
to make sure it's non-NULL.  Otherwise you're back in the crap-shoot
of allocating on the stack and hoping there was enough space.
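A minimal sketch of what that check looks like (the helper name and
size are mine, not from the original post):

```cpp
#include <cstdlib>

// Allocate n bytes and fill them with 'x'.  Returns NULL on failure
// instead of dereferencing a bad pointer; the caller must free().
char *alloc_and_fill(size_t n) {
    char *c = static_cast<char *>(std::malloc(n));
    if (c == NULL)                 // always check *alloc's return value
        return NULL;
    for (size_t i = 0; i < n; ++i)
        c[i] = 'x';
    return c;
}
```

The caller then branches on the NULL case rather than hoping the
allocation worked.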

> C++ allocates data on the heap when you use "new", and the STL containers
> allocate their elements on the heap.  For example these are ok:
> 
>   char *c = new char[100*1024*1024];
>   // ...
>   delete[] c;

Ditto here, though new should throw std::bad_alloc if the memory
couldn't be allocated.  Older (pre-standard) implementations will
likely return 0 instead.
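Both failure modes can be handled; a sketch (function name and sizes
are illustrative):

```cpp
#include <cstddef>
#include <new>

// Standard new throws std::bad_alloc on failure; catch it and report
// success/failure to the caller instead of crashing.
bool try_big_alloc(std::size_t n) {
    try {
        char *c = new char[n];
        delete[] c;
        return true;
    } catch (const std::bad_alloc &) {
        return false;              // allocation failed; handle gracefully
    }
}

// Alternatively, new (std::nothrow) returns a null pointer on
// failure, malloc-style, so you can test the pointer directly.
bool try_big_alloc_nothrow(std::size_t n) {
    char *c = new (std::nothrow) char[n];
    if (c == 0)
        return false;
    delete[] c;
    return true;
}
```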

Incidentally, with C++ I tend to use vectors for a lot of arrays (you
can reserve() space for a certain number of items so that it won't
have to re-allocate) unless they're sparse, in which case I use maps. 
Appropriate try/catch blocks can then make sure that you're handling
funny memory conditions correctly.  In both cases you get to use the
same operator[] notation as for arrays, though the semantics differ:
map's operator[] allocates an entry for a missing key, while vector's
operator[] on an out-of-range index is undefined behavior (use at()
if you want a range check that throws).
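The two cases side by side, as a sketch (helper names are mine):

```cpp
#include <cstddef>
#include <map>
#include <vector>

// Dense case: reserve() does one allocation up front, so the
// push_back loop never has to re-allocate.
std::vector<char> make_dense(std::size_t n) {
    std::vector<char> v;
    v.reserve(n);
    for (std::size_t i = 0; i < n; ++i)
        v.push_back('x');
    return v;
}

// Sparse case: a map allocates only the entries actually touched;
// operator[] inserts a default value for a missing key.
std::map<long, char> make_sparse() {
    std::map<long, char> m;
    m[0] = 'a';
    m[100000000L] = 'b';           // no storage for anything in between
    return m;
}
```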

-- 
Michael A. Marsh
http://www.umiacs.umd.edu/~mmarsh
