
Re: Cron jobs run out of memory?



On Sat, 12 Feb 2005 08:43:51 -0600
Jacob S <stormspotter@6Texans.net> wrote:

> I have a simple download script that runs from a cron job every
> morning to pull an audio stream from a url and then convert it to mp3
> (the script is attached to this e-mail). It runs fine the first couple
> times after a reboot, but then fails at various stages, leaving these
> messages in my mailbox:
> 
> /home/jacob/bin/download.sh: fork: Cannot allocate memory
> 
> /home/jacob/bin/download.sh: xrealloc: ../bash/subst.c:468: cannot
> reallocate 99178368 bytes (0 bytes allocated)
> 
> When it gives the fork error, it has usually made it through the
> mplayer section but not lame. When it gives the xrealloc error it
> doesn't even make it that far. However, when I run this script
> manually from a command line it works perfectly.
> 
> The computer is an Athlon XP 2200+ on an nForce2 chipset with 512MB
> of RAM and a 160GB SATA drive. Here is an example of the output from
> 'free' after the cron job has failed, but running the program manually
> worked fine:
> 
> $ free
>          total       used       free     shared    buffers  cached
> Mem:    516308     514288       2020          0       1712  101836
> -/+ buffers/cache: 410740     105568
> Swap:   497968     467868      30100
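
For reference, the attached script boils down to something like the
sketch below (simplified -- the real URL, filenames and mplayer/lame
options differ, so treat the details as placeholders):

#!/bin/bash
# Dump the audio stream to a WAV file with mplayer, then encode it
# to MP3 with lame. The URL and paths here are made up.
URL="http://example.com/stream"                  # placeholder
WAV=/home/jacob/audio/stream.wav
MP3=/home/jacob/audio/stream-$(date +%Y%m%d).mp3

# mplayer decodes the stream and writes it out as PCM/WAV;
# -endpos limits the capture length (the real value is a guess here)
mplayer -really-quiet -vo null -vc null -endpos 1:00:00 \
        -ao pcm:file="$WAV" "$URL"

# lame encodes the WAV to MP3, then the temporary WAV is removed
lame --quiet "$WAV" "$MP3" && rm -f "$WAV"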

At the suggestion of Uwe Dippel, I've been doing some playing around to
see how much RAM needs to be free before the cron job will complete
successfully.

Last night I closed some of the programs that would normally have stayed
open all night. But sure enough, I still got the "fork: Cannot allocate
memory" error in my mailbox from cron. Here's what free reported only
five seconds after the failure:

$ free
         total       used       free     shared    buffers  cached
Mem:    516308     200140     316168          0       1288   29320
-/+ buffers/cache: 169532     346776
Swap:   497968     274256     223712
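
If I'm reading the output right, the "free" column on the
-/+ buffers/cache line is just free + buffers + cached, i.e. what could
have gone to new processes once the kernel dropped its caches:

    316168 + 1288 + 29320 = 346776 kB  (roughly 338 MB)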

That seems to me like it should have been plenty of memory. After
running lame manually to convert the file, free reported this:

$ free
         total       used       free     shared    buffers  cached
Mem:    516308     385868     130440          0       3856  196952
-/+ buffers/cache: 185060     331248
Swap:   497968     270548     227420

I'm obviously no expert, but it sure seems like something is limiting
the memory available to cron jobs run as a user. Does anyone have tips
on how I might track this down?
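
One thing I'm thinking of trying (just an idea, not something I've
tested yet) is putting a few lines at the top of download.sh to log the
limits and memory state that the cron environment actually sees, so I
can compare them against my interactive shell:

# Log resource limits and memory as cron sees them.
# (/home/jacob/cron-limits.log is just a made-up path for the log.)
{
    date
    ulimit -a      # per-process limits in effect for this shell
    free           # system memory at the moment the job starts
} >> /home/jacob/cron-limits.log 2>&1

Then I can diff that against the same commands run from my login shell.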

TIA,
Jacob


