
Dying services due to low memory?



Hi there,

can anyone point me to a solution for the following problem?

I have several machines running as Internet servers, mainly for FTP and
HTTP. They're relatively low-end machines (a P100 and a 486-133 with 48
and 64 MB RAM, respectively). Every couple of days I have to restart
inetd or other stand-alone services (like syslogd, klogd, snmpd, or
apache).

I'm pretty sure the processes fail because memory usage is too high
(it's *definitely* not a hardware problem like failing RAM modules or
overclocked CPUs). Memory usage sits permanently at about 99%, while
swap usage is only a few percent. So apparently processes are dying
because they can't allocate "real" memory?!
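To illustrate what I mean by "memory usage": here is a minimal sketch
that sums MemFree, Buffers and Cached from /proc/meminfo (assuming the
"Key:  value kB" format; the exact fields differ between kernel
versions, so treat it only as a rough illustration, not how my boxes
are actually monitored):

    #!/usr/bin/env python
    # Sketch: estimate how much memory is really in use, counting
    # buffers and page cache as reclaimable.  Assumes /proc/meminfo
    # uses "Key:  value kB" lines (format varies between kernels).

    def meminfo():
        info = {}
        for line in open("/proc/meminfo"):
            parts = line.split()
            if len(parts) >= 2 and parts[0].endswith(":"):
                try:
                    info[parts[0][:-1]] = int(parts[1])  # value in kB
                except ValueError:
                    pass  # skip lines that are not "Key: number" pairs
        return info

    m = meminfo()
    total = m.get("MemTotal", 0)
    # memory that is free or only holding buffers/page cache
    reclaimable = m.get("MemFree", 0) + m.get("Buffers", 0) + m.get("Cached", 0)
    print("MemTotal:    %d kB" % total)
    print("Free+cache:  %d kB" % reclaimable)
    if total:
        print("Really used: %.1f%%" % (100.0 * (total - reclaimable) / total))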

Of course one work-around would be to reduce the number of concurrent
FTP users I allow, but I cannot easily do that. I simply cannot accept
that services die as easily as they do. Isn't there a way to prevent
this? I need high availability from these machines, and having to
constantly check and possibly restart services is not acceptable. :-(
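For now, about all I can think of is babysitting the daemons with a
little watchdog along these lines (a sketch only; the pidfile locations
and init script paths are just assumptions about a Debian-style layout):

    #!/usr/bin/env python
    # Watchdog sketch: restart a daemon when its pidfile process is gone.
    # Pidfiles and /etc/init.d paths are assumptions; adjust per system.
    import os, time

    SERVICES = {
        "inetd":   "/var/run/inetd.pid",
        "syslogd": "/var/run/syslogd.pid",
    }

    def alive(pidfile):
        try:
            pid = int(open(pidfile).read().strip())
            os.kill(pid, 0)   # signal 0 just checks the process exists
            return True
        except (IOError, OSError, ValueError):
            return False

    while True:
        for name, pidfile in SERVICES.items():
            if not alive(pidfile):
                os.system("/etc/init.d/%s restart" % name)
        time.sleep(60)        # poll once a minute

But obviously that just papers over the problem, which is why I'm
asking.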

Thanks,

Ralf


-- 
Sign the EU petition against SPAM:          L I N U X       .~.
http://www.politik-digital.de/spam/        The  Choice      /V\
                                            of a  GNU      /( )\
                                           Generation      ^^-^^


