RituRaj wrote:
--- Matt Price <matt.price@utoronto.ca> wrote:

I have a directory that's gotten out of hand with several hundred files.
I'm looking for active files, and normally I would do ls -tr to find the
most recently-modified files -- but the list is so huge it's difficult.
So I tried:

    find . -maxdepth 1 -type f -atime -2
    find . -maxdepth 1 -type f -atime -2 -exec ls -ltr {} \;
    find . -maxdepth 1 -type f -atime -2 | xargs ls -ltr

is more efficient, and will probably produce more reliable (although not perfect) results.
That is, ls -t sorts its whole argument list at once, but -exec runs a separate ls for each file, so the resulting order depends entirely on the order in which find emits the names. If there are not too many files, the xargs approach will fire off a single ls process with all the names, thereby sorting the results in time order. (The "not perfect" caveat: past the argument-list limit, xargs splits the names across several ls invocations, each sorted independently.)
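A minimal one-liner sketch of the same idea (untested; -A approximates find's -atime -2 as days since last access, sorting on -M gives ls -tr's oldest-first mtime order, and glob "*" skips dotfiles):

    perl -e 'print "$_\n" for sort { -M $b <=> -M $a } grep { -f and -A $_ < 2 } glob "*"'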
If you have many files and need to guarantee the time order (in order to process the oldest or newest files first), the following should do the trick:
#! /usr/bin/perl -w
use strict;

use constant DELTA => 300;    # five minutes

my $dir = shift || '.';       # directory to scan, default to cwd
my $now = time;

opendir DIR, $dir or die "Cannot open directory $dir: $!\n";

# Keep entries modified within the last DELTA seconds, then sort
# them newest-first by mtime (field 9 of stat).
print "$_\n"
    for sort { (stat "$dir/$b")[9] <=> (stat "$dir/$a")[9] }
        grep {
            $_ ne '.' and $_ ne '..'
            and ( $now - (stat "$dir/$_")[9] ) <= DELTA
        } readdir DIR;

closedir DIR;

__END__

The sort could use a Guttman-Rosler Transform to avoid excessive statting, but I'm feeling lazy :)
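For the curious, a minimal sketch of that GRT version (untested): stat each file exactly once, prepend the mtime as a fixed-width string key to each name, sort the strings with the default string comparison, then strip the keys back off.

#! /usr/bin/perl -w
use strict;

use constant DELTA => 300;    # five minutes, as above

my $dir = shift || '.';
my $now = time;

opendir DIR, $dir or die "Cannot open directory $dir: $!\n";

# Guttman-Rosler Transform: build "key . name" strings so a plain
# string sort orders them by mtime -- one stat per file instead of
# two stats per comparison.
print "$_\n"
    for map { substr $_, 10 }               # strip the 10-digit key
        reverse sort                        # newest first, default cmp
        map {
            my $mtime = (stat "$dir/$_")[9];
            defined $mtime && ( $now - $mtime ) <= DELTA
                ? sprintf( '%010d', $mtime ) . $_
                : ();
        }
        grep { $_ ne '.' and $_ ne '..' } readdir DIR;

closedir DIR;

__END__

The win is that sort then does cheap string comparisons on precomputed keys rather than two stat calls per comparison.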
David