Stephen Le wrote:
Hello, I was wondering if there was a way to limit the amount of memory used by any single process or user. I'd like to prevent Apache (and other server daemons) from consuming all available memory and thrashing my disk when everything else is forced to swap. For example, I'm currently testing the Coppermine image gallery software. It's written in PHP and uses ImageMagick. Whenever images are uploaded to the gallery, it spawns a number of fairly large /usr/bin/convert processes under the 'www-data' user name. These processes consume all available memory and cause disk thrashing that brings my server to a grinding halt. Is there a way to prevent this from happening? Thanks, Stephen Le
Hi! If you want to limit the memory used by PHP, set the memory limit in your php.ini, which will do the job. If you need to limit memory inside a Perl or CGI script, you can use sbox-dtc, available here:
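As a minimal sketch of the php.ini approach (the directive name and default are from standard PHP; the 32M value is just an example you should tune): note this only caps memory allocated by the PHP script itself, not external programs such as /usr/bin/convert that the script spawns.

```ini
; php.ini — cap the memory a single PHP script may allocate.
; A script exceeding this limit is killed with a fatal error
; instead of driving the whole server into swap.
memory_limit = 32M
```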
http://www.gplhost.com/?rub=softwares&sousrub=sbox

sbox is a CGI wrapper script that allows web site hosting services to safely grant CGI authoring privileges to untrusted clients. In addition to changing the process privileges of client scripts to match their owners, it goes beyond other wrappers by placing configurable ceilings on script resource usage, avoiding unintentional (as well as intentional) denial-of-service attacks. It also optionally allows the webmaster to place clients' CGI scripts in a chroot'ed shell restricted to the authors' home directories. If you use PHP as a CGI, then I think you won't have problems.

Thomas
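For the convert processes specifically, Apache's own resource-limit directives may be the more direct fix, since they apply to processes forked by Apache children (CGI scripts and anything they spawn). A hedged sketch for httpd.conf (the directives are standard Apache core; the byte values are illustrative assumptions, not recommendations):

```apache
# httpd.conf — per-process resource ceilings for processes forked
# by Apache children, e.g. CGI scripts and their /usr/bin/convert
# subprocesses. Values are soft then hard limits, in bytes.
RLimitMEM  67108864 134217728    # 64 MB soft, 128 MB hard
RLimitNPROC 20 30                # limit fork bombs as well
```

Separately, ImageMagick itself can be told to stay within a memory budget (for example via its MAGICK_MEMORY_LIMIT environment variable), which makes convert spill to disk-backed pixel caches instead of exhausting RAM.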