
Re: ITP: speedy cgi ("persistent" perl scripts)

On Fri, Jul 16, 1999 at 02:22:17PM -0700, Sam Horrocks wrote:
>  > ah. that is a problem. in that case, i can't package it for debian.  DSO
>  > modules are really the only option for debian - it doesn't make sense to
>  > compile in a non-standard option which not everyone will use.
>  I've put the DSO onto the todo list.

cool.  if/when it's done, i'll make sure i include it in the debian package.

> > one thing i am going to try doing with speedyCGI is to add
> > suexec-like functionality (test if setuid to username. if yes, then
> > test whether script lives under user's home directory. if not, log
> > security error and abort). the idea is that each vhost user would
> > get their own copy of the speedy binary which would be setuid to
> > them.
> Why not just use suexec and a single (non-setuid) speedy binary?
> Whatever is safe with regular perl should be just as safe with speedy.
> Or are you trying to do something beyond what suexec can do?

well, two reasons. one is to avoid the extra fork/exec of suexec before
the script even runs.

second is that suexec doesn't quite work the way i want it to, and is a
bit more paranoid than i need it to be - e.g. suexec has a hard-coded
cgi root dir. that doesn't fit my setup: each vhost user has their own
home directory, and everything (public_html/, cgi-bin/, www_logs/, etc)
lives under there. i want to use getpwuid to find the EUID's home
directory and abort only if the script isn't under it.

as you say, it might be simpler to just modify suexec to work as i want
it to...and following the KISS principle is generally the best option.

>  The memory limits have already been requested by someone else.  When
>  I looked into it, I couldn't find a portable way to do it.

yep. it's really easy for novice perl programmers to write loops that
use up all memory - causing random processes to start crashing...or
endless loops that use too much cpu and slow down the system to a crawl.

mostly it's bad programming, but protecting against that also helps to
protect against malicious programming.
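for what it's worth, the usual (if not fully portable) approach is
setrlimit() before exec'ing the backend - the catch being that
RLIMIT_AS isn't available on all unixes (some only honour RLIMIT_DATA,
which misses mmap'd memory), which is presumably the portability
problem. a rough sketch, with made-up name and illustrative values:

```c
#include <sys/resource.h>
#include <sys/time.h>

/* clamp the calling process's memory and cpu usage before exec'ing
 * the perl backend.  returns 0 on success, -1 on failure. */
int apply_limits(rlim_t mem_bytes, rlim_t cpu_seconds)
{
    struct rlimit rl;

    rl.rlim_cur = rl.rlim_max = mem_bytes;
#ifdef RLIMIT_AS
    if (setrlimit(RLIMIT_AS, &rl) != 0)      /* total address space */
        return -1;
#else
    if (setrlimit(RLIMIT_DATA, &rl) != 0)    /* data segment only */
        return -1;
#endif

    rl.rlim_cur = rl.rlim_max = cpu_seconds;
    return setrlimit(RLIMIT_CPU, &rl);       /* SIGXCPU kills runaway loops */
}
```

with this in place a runaway loop gets SIGXCPU (or malloc failures)
instead of dragging the whole box down - though the #ifdef above is
exactly the kind of per-platform mess that makes it hard to do portably.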


craig sanders
