On Sun, Sep 09, 2007 at 05:35:12PM -0400, Marty wrote:
> Andrew Sackville-West wrote:
>> On Sun, Sep 09, 2007 at 04:23:42PM -0400, Marty wrote:
>>> The following script seems to run abnormally slowly on a 400MHz Sarge
>>> system, getting only about one iteration per second in the while loop.
>>> It extracts md5sums from a 180k Packages file and makes an indices file.
>>> I've narrowed the slowdown down to the lines in the while loop starting
>>> with "search=..."
>>
>> how have you determined this?
>
> I checked the output rate by outputting to stdout (instead of piping to
> gzip after the "done" statement). I also timed it with the "time" command.

But that only tells you how long it takes to iterate through the loop and
get to the gzip command, not how much time is spent in each statement.
Something like:

    while read inputline
    do
        echo "input line is " $inputline
        search=`grep ...`
        echo "search is " $search
        if ...
            echo "we got a good search"
        fi
        ...
    done

so that you can see how long is actually spent on the creation of a value
for $search, how long in the if comparison, and so forth. You stated above
that the slowdown is in the "search=" lines, and I'm curious to know how
you determined that. Or did you do

    time search=`grep ...`

which could give meaningful output too?

...

>> FTR, you may do much better using something like awk to do this,
>> though I'm no script master, just an observation.
>
> I tried awk instead of cut, with no dramatic change.

Sorry, I meant replacing the whole operation with awk, which, while a
little heavy, might be a better solution than instantiating a whole bunch
of greps and cuts over and over. But that's just a guess, and your
solution should certainly work pretty easily; I see nothing in it that
would cause it to take a full second per iteration.

My brief testing (granted, on a much more powerful system) scrolled the
output by faster than I could hope to read. There was definitely no
noticeable delay anywhere in the process.

A
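To make the per-statement debugging idea above concrete, here is a minimal
runnable sketch. The original script is not quoted in full, so the Packages
stanza, the file name, and the grep pattern below are invented placeholders;
only the echo-each-intermediate-value technique comes from the post.

```shell
#!/bin/bash
# Build a tiny stand-in for a Packages file (invented demo data).
cat > /tmp/Packages.demo <<'EOF'
Package: foo
MD5sum: d41d8cd98f00b204e9800998ecf8427e
Package: bar
MD5sum: 0cc175b9c0f1b6a831c399e269772661
EOF

printf 'foo\nbar\n' | while read -r inputline; do
    echo "input line is: $inputline"
    # Grab the MD5sum line of the matching stanza (placeholder logic).
    # In bash you can also write: time search=$(grep ...)
    # to see how long just this statement takes.
    search=$(grep -A1 "^Package: $inputline$" /tmp/Packages.demo \
             | grep '^MD5sum:')
    echo "search is: $search"
done
```

If one echo line visibly lags behind the previous one, the statement
between them is the expensive one.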
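As a footnote to the "replace the whole operation with awk" suggestion:
since the original script and the exact indices format are not shown, the
sketch below is a guess at the shape of the idea, not the poster's actual
solution. It lets awk scan the Packages file once, instead of forking a
grep and a cut per input line, which is usually where the per-iteration
cost goes on a slow machine.

```shell
#!/bin/bash
# Invented demo stanzas in the Debian Packages format.
cat > /tmp/Packages.demo2 <<'EOF'
Package: foo
Filename: pool/main/f/foo/foo_1.0_all.deb
MD5sum: d41d8cd98f00b204e9800998ecf8427e

Package: bar
Filename: pool/main/b/bar/bar_2.0_all.deb
MD5sum: 0cc175b9c0f1b6a831c399e269772661
EOF

# One pass: remember the current package name, emit "package md5sum"
# whenever an MD5sum line appears. The output layout is hypothetical.
awk '/^Package: / { pkg = $2 }
     /^MD5sum: /  { print pkg, $2 }' /tmp/Packages.demo2
```

One awk process touching every line once is typically far cheaper than
thousands of short-lived grep/cut processes, especially on 2007-era
hardware where fork/exec overhead dominates.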