
Re: Shouldn't debian be configured better by default ?



Ethan Benson <erbenson@alaska.net> writes:

> On 7/11/99 Sami Dalouche wrote:
> 
> >While I was cleaning my home directory, I saw this program that I compiled.
> >After that, I launched it and... My X became frozen and then crashed 
> >( I executed the program in an Xterm). I think it's because it used 
> >all the memory available...
> >I don't want to try, but what could have happened if I had run it from
> >a console? Would the system crash?
> 
> I find it surprising that this program caused this much damage...
> 
> I once tried to crash my Red Hat GNU/Linux system with 96MB of real 
> RAM and a 64MB swap partition. I had Netscape 4.6 go to a keyserver 
> and search for `michael' (a query for which that server returns a 
> couple thousand results in one complicated HTML page that ends up 
> being about 15MB in size). After a long time watching Netscape bloat, 
> eventually all memory was consumed, all swap and all real, and any 
> attempt to run even the smallest of utilities resulted in seg faults:
> 
> $ ps
> Segmentation fault
> :)
> 
> All I had to do was (slowly) hit the close box on Netscape and it 
> went away, and all was well, and I kept on adding to a 50+ day uptime, 
> IIRC.

Did you check all of your daemon programs to make sure they were still
running?  When something like that has happened to me, I almost always
find that various daemons have quit because they couldn't get memory
when they needed it, so I have decided that it is easier to just
reboot.  In my case, though, it is ImageMagick that has always caused
problems.  It is the only large program I know of that doesn't
check whether enough memory is actually available before grabbing it.
After I discovered that, I checked xv, gimp, and the netpbm tools, and
all of them abort with an error message if not enough memory is
available.

-- 
Carl Johnson		carlj@peak.org
