
Re: find/ls most recent files



RituRaj wrote:
--- Matt Price <matt.price@utoronto.ca> wrote:

I have a directory that's gotten out of hand with several hundred
files.  I'm looking for active files, and normally would do ls -tr
to find the most recently-modified files -- but the list is so huge
it's difficult.  So I tried:

find . -maxdepth 1 -type f -atime -2


find . -maxdepth 1 -type f -atime -2 -exec ls -ltr {} \;

find . -maxdepth 1 -type f -atime -2 | xargs ls -ltr

is more efficient, and will probably produce more reliable (though
still not perfect) results.

That is, ls -t only sorts the arguments of a single invocation, but
-exec runs a separate ls for each file, so the resulting order depends
entirely on the order in which find emits the names.  If there are not
too many files, the xargs approach will fire off a single ls process,
thereby sorting the whole set in time order; with a very long list,
however, xargs has to split the names across several ls invocations,
each sorted in isolation, which is why the result is still not
guaranteed.  (Filenames containing whitespace will also confuse a
plain pipe into xargs; GNU find's -print0 together with xargs -0
avoids that.  Note too that -atime tests last *access* time; since you
asked for recently-modified files, -mtime -2 is probably the test you
want.)

If you have many files and need to guarantee the time order (in order to process the oldest or newest files first, say), the following should do the trick:

#! /usr/bin/perl -w

use strict;
use constant DELTA => 300; # five minutes

my $dir = shift || '.';
my $now = time;

opendir DIR, $dir or die "Cannot open directory $dir: $!\n";

print "$_\n" for
    sort { (stat "$dir/$b")[9] <=> (stat "$dir/$a")[9] }
    grep {
            $_ ne '.'
        and $_ ne '..'
        and ($now - (stat "$dir/$_")[9]) <= DELTA
    }
    readdir DIR;

closedir DIR;
__END__
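
Save it as, say, recent.pl (any name will do) and run it with the
directory as an argument, e.g. perl recent.pl /path/to/dir; it prints
the entries modified within the last DELTA seconds, newest first.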

The sort could use a Guttman-Rosler Transform to avoid excessive statting, but I'm feeling lazy :)
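
For the curious, here's a rough sketch of what that might look like --
prepend the mtime to each name as a fixed-width big-endian key, let
the default lexical sort do the work, then strip the key off again, so
each entry is statted exactly once (same DELTA and argument
conventions as the script above):

#! /usr/bin/perl -w
# Guttman-Rosler Transform sketch: one stat per directory entry.

use strict;
use constant DELTA => 300; # five minutes

my $dir = shift || '.';
my $now = time;

opendir DIR, $dir or die "Cannot open directory $dir: $!\n";

print "$_\n" for
    map  { substr $_, 4 }                             # strip the 4-byte key
    reverse sort                                      # lexical sort; reversed => newest first
    grep { unpack('N', $_) >= $now - DELTA }          # keep recently-modified entries
    map  { pack('N', (stat "$dir/$_")[9] || 0) . $_ } # prepend packed mtime, one stat each
    grep { $_ ne '.' and $_ ne '..' }
    readdir DIR;

closedir DIR;
__END__

The packed 'N' (32-bit big-endian) keys compare bytewise in the same
order as the underlying mtimes, which is what makes the plain lexical
sort safe here.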

David


