Re: Cron jobs run out of memory?
On Fri, 25 Feb 2005 08:15:15 -0500
Michael Marsh <michael.a.marsh@gmail.com> wrote:
> On Fri, 25 Feb 2005 07:08:26 -0600, Jacob S <stormspotter@6texans.net>
> wrote:
> > I'm obviously no expert, but it sure seems like there is something
> > limiting the memory on cron jobs run as a user. Does anyone have
> > some tips on how I might track this down?
>
> Run the following as a cron job:
>
> #! /bin/bash
> ulimit -a
Thanks, I hadn't thought about ulimit.
When run manually, as a non-root user, ulimit -a outputs the following:
core file size        (blocks, -c) 0
data seg size         (kbytes, -d) unlimited
file size             (blocks, -f) unlimited
max locked memory     (kbytes, -l) unlimited
max memory size       (kbytes, -m) unlimited
open files                    (-n) 1024
pipe size          (512 bytes, -p) 8
stack size            (kbytes, -s) 8192
cpu time             (seconds, -t) unlimited
max user processes            (-u) unlimited
virtual memory        (kbytes, -v) unlimited
When run from a cron job, ulimit -a produces the following output:
core file size        (blocks, -c) 0
data seg size         (kbytes, -d) unlimited
file size             (blocks, -f) unlimited
max locked memory     (kbytes, -l) 32
max memory size       (kbytes, -m) unlimited
open files                    (-n) 1024
pipe size          (512 bytes, -p) 8
stack size            (kbytes, -s) 8192
cpu time             (seconds, -t) unlimited
max user processes            (-u) 4095
virtual memory        (kbytes, -v) unlimited
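(A quick way to pin down exactly which limits differ is to dump both
environments to files and diff them, for instance:)

```shell
# Dump the limits from an interactive shell...
ulimit -a > /tmp/limits.shell

# ...and from cron, via a temporary crontab entry such as:
#   * * * * * ulimit -a > /tmp/limits.cron

# Then only the differing lines are printed:
diff /tmp/limits.shell /tmp/limits.cron
```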
So it looks like I need to increase the 'max locked memory' limit for
cron jobs ('max user processes' is also lower there: 4095 instead of
unlimited). Except, aren't cron jobs run as the same user that owns the
crontab? How would I change the ulimit settings for cron jobs separately
from the limits I get at an interactive shell? (I'm guessing that only
root can change a user's ulimit settings -- is that right?)
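For what it's worth, a process can raise its own *soft* limits up to the
hard limit without root; only raising the hard limit needs root. So one
workaround may be to call the job through a wrapper script that bumps
the soft limit first. A minimal sketch (the job command is made up, and
if cron's *hard* locked-memory limit is already 32 kB this won't help):

```shell
#!/bin/bash
# wrapper.sh -- call this from the crontab instead of the job itself.
# Any user may raise a soft limit up to the hard limit; raising the
# hard limit requires root. If the hard limit from cron is already
# 32 kB, the fix belongs in /etc/security/limits.conf instead
# (pam_limits), assuming your cron is built with PAM support.

echo "locked memory before: soft=$(ulimit -S -l) hard=$(ulimit -H -l)"
ulimit -S -l "$(ulimit -H -l)"   # raise soft limit to the hard limit
echo "locked memory after:  soft=$(ulimit -S -l)"

# exec /path/to/real-job         # hypothetical job command
```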
Thanks,
Jacob