
Re: Looping Shell Scripts and System Load

On 2020-06-24 10:19, Martin McCormick wrote:
I wrote a shell script that unzips documents. I originally
wrote it so that it gets document #1, unzips it, then gets
document #2, and so on, and it does that just fine, so I wondered
if I could make it run faster by starting several processes at
once, each one unzipping a file.  It's certainly still running and
will eventually finish, but I created a monster because it starts
as many processes as there are items to unzip.

unarchive () {
  unzip "$1"
  return 0
}

mountpoint /mags >/dev/null || mount /mags
mountpoint /mags >/dev/null || exit 1
cd /mags
#rm -r -f *
for MEDIAFILE in "$MEDIADIR"/*; do
  dirname=`basename "$MEDIAFILE"`
  mkdir "$dirname"
  cd "$dirname"
  unarchive "$MEDIAFILE" &
  cd ..
done
cd ~
umount /mags
exit 0

	If there are 3 zipped files, it's probably going to be OK
and start 3 unzip processes.  This directory had 13 zip files;
the first 2 or 3 roared to life, and then things slowed down as
they all tried to run.

	I expected this, and I've been writing Unix shell scripts
for literally 31 years this summer, so it is no mystery: each
new job spawns a whole new set of processes to unzip the file it
is working on while all the others are still grinding away.

	Miscreants have been known to deliberately create
loops that keep starting processes until the system crashes.

	Fortunately, this is one of my own systems, but it made me
wonder, before I reinvent the wheel, whether there is a way to
make a shell script throttle itself based on the current load, so
it keeps slurping resources until the next iteration would start
too many and things begin to bog down.  When some of the earlier
or shorter processes finish, the loop can restart and kick off
more unzips until all are done.
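One common pattern for this in plain bash is a fixed-slot throttle: count running children and block with `wait -n` until one exits before starting another. A minimal sketch, assuming bash 4.3 or newer for `wait -n`; MAXJOBS=4 is an arbitrary cap, and the `sleep` is a stand-in for the real `unarchive "$MEDIAFILE" &` job:

```shell
#!/bin/bash
# Fixed-slot throttle: never more than MAXJOBS background jobs at once.
# "wait -n" (bash 4.3+) blocks until any one child exits.
MAXJOBS=4
running=0

for i in $(seq 1 13); do            # 13 jobs, like the directory described
    if [ "$running" -ge "$MAXJOBS" ]; then
        wait -n                     # block until one child finishes
        running=$((running - 1))
    fi
    sleep 0.2 &                     # stand-in for: unarchive "$MEDIAFILE" &
    running=$((running + 1))
done
wait                                # let the final batch finish
```

This doesn't look at the load average at all; it simply bounds concurrency, which is usually enough to keep the machine responsive.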

	Right now, uptime looks like:

  11:48:07 up 26 days, 23:10,  7 users,  load average: 16.15, 15.60, 10.65

	That's pretty loaded, so ideally one could start the
looping script and it would fire up processes until things got
really busy, then not allow any new processes to start until some
have stopped, so that cron and other system utilities don't stop
running, which is what happens when systems get too busy.
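A sketch of that load-based gate, assuming Linux (the 1-minute load average is the first field of /proc/loadavg); MAXLOAD=8 is an arbitrary threshold you would tune to the machine, the core count being a common choice:

```shell
#!/bin/bash
# Gate that refuses to launch another job while the 1-minute load
# average is at or above MAXLOAD.  Linux-specific (/proc/loadavg).
MAXLOAD=8

load_ok () {
    # Use the argument if given (handy for testing), otherwise read
    # the first field of /proc/loadavg, e.g. "16.15".
    local load=${1:-$(cut -d' ' -f1 /proc/loadavg)}
    [ "${load%.*}" -lt "$MAXLOAD" ]   # integer compare, fraction dropped
}

# In the unzip loop, before each "unarchive $MEDIAFILE &", one would add:
#     while ! load_ok; do sleep 5; done
```

Note the lag: load averages react slowly, so a burst of new jobs can slip through before the 1-minute figure climbs, which is why pairing this with a hard job cap is safer than using load alone.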

	Thanks for any constructive suggestions.

Martin McCormick   WB5AGZ

GNU Parallel looks like a possibility:
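A sketch of how it could apply here, assuming the GNU Parallel package is installed. For the quoted script the job line might look like `parallel --load 100% -j 4 unzip ::: "$MEDIADIR"/*.zip`: `--load` holds back new jobs while the load average is above the limit, and `-j` caps the number of simultaneous slots. A harmless self-contained demo:

```shell
#!/bin/bash
# Run 4 trivial jobs through GNU Parallel, at most 2 at a time.
# Guarded so the script degrades gracefully where parallel is absent.
if command -v parallel >/dev/null 2>&1; then
    seq 1 4 | parallel -j 2 'echo job {}'
else
    echo "GNU Parallel is not installed" >&2
fi
```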



