On Sun, Jan 09, 2011 at 10:05:43AM -0600, Stan Hoeppner wrote:
> #! /bin/sh
> for k in $(ls *.JPG); do convert $k -resize 1024 $k; done
>
> I use the above script to batch re-size digital camera photos after I
> dump them to my web server. With lots of new photos it takes a very
> long time, since the server is fairly old: even though it is a 2-way
> SMP machine, the script runs only one convert process at a time,
> serially, so it uses just one CPU. The convert program is part of the
> ImageMagick toolkit.
>
> How can I best modify this script so that it splits the overall job in
> half, running two simultaneous convert processes, one on each CPU?
> Such a script should cut the total run time in half, or nearly so,
> which would really be great.

You need GNU parallel: http://ftp.gnu.org/gnu/parallel/

From its home page (http://freshmeat.net/projects/parallel):

  GNU parallel is a shell tool for executing jobs in parallel locally
  or using remote computers. A job is typically a single command or a
  small script that has to be run for each of the lines in the input.
  The typical input is a list of files, a list of hosts, a list of
  users, a list of URLs, or a list of tables.

  If you use xargs today you will find GNU parallel very easy to use,
  as GNU parallel is written to have the same options as xargs. If you
  write loops in shell, you will find GNU parallel may be able to
  replace most of the loops and make them run faster by running several
  jobs in parallel. If you use ppss or pexec you will find GNU parallel
  will often make the command easier to read.

  GNU parallel makes sure output from the commands is the same output
  as you would get had you run the commands sequentially. This makes it
  possible to use output from GNU parallel as input for other programs.

-- 
Primary key fingerprint: 0FDA C36F F110 54F4 D42B D0EB 617D 396C 448B 31EB
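For reference, a minimal sketch of the two-at-a-time rewrite. With GNU
parallel installed it is a one-liner (shown in the comment); the runnable
demo below uses xargs -P instead, which GNU and BSD xargs both provide,
and substitutes echo for the real ImageMagick convert call so the sketch
works without any photos or ImageMagick present. Drop the echo to actually
resize.

```shell
#!/bin/sh
# With GNU parallel installed, the equivalent of the original loop is:
#   parallel -j 2 convert {} -resize 1024 {} ::: *.JPG
#
# The same two-jobs-at-once behaviour with xargs -P, demonstrated on
# stand-in files in a scratch directory so this script is self-contained:
cd "$(mktemp -d)"
touch a.JPG b.JPG c.JPG d.JPG    # stand-in "photos"

# -P 2 keeps two processes running; -I {} substitutes each filename.
# 'echo convert' only prints the commands that would run; remove the
# 'echo' (and install ImageMagick) to resize for real.
printf '%s\n' *.JPG | xargs -P 2 -I {} echo convert {} -resize 1024 {}
```

With -P 2 the job order is not guaranteed, which is fine here since each
convert invocation touches only its own file.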