
Re: How can I find identical files in a directory



On Fri, 23 Apr 2004 11:40:54 -0700
"Karsten M. Self" <kmself@ix.netcom.com> wrote:
>
> on Tue, Apr 20, 2004 at 03:39:25PM -0400, Chris Metzler
> (cmetzler@speakeasy.net) wrote:
> > On Tue, 20 Apr 2004 21:03:15 +0200
> > Wolfgang Pfeiffer <roto@gmx.net> wrote:
> > >
> > > My goal is to get easily rid of identical files on a system:
> > 
> > I did something like this once for a whole filesystem with a bash
> > script.  md5sum'ing *everything* is wasteful of time and cpu cycles,
> > since (probably) most of the things you'll md5sum won't have
> > duplicates.
> 
> However, most files *also* don't change frequently, so after an initial
> "big push" effort, you've got a list of hashes which you _can_ test
> against readily.  Numerous systems do checksum files anyway, such as
> tripwire, etc.
> 
> On a daily basis, you need only compute hashes on modified files.

Fair enough.  I was thinking in terms of having to do this but once
(which was my case); but if it's something you'll have to do multiple
times, then I agree that the overhead isn't a worry.
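
For the archives: one way to avoid md5summing everything is to compare
file sizes first, and only hash files whose size is shared with at least
one other file.  A rough, untested sketch of that idea (not my original
script; assumes GNU find/sort/uniq/xargs/md5sum, and filenames without
embedded newlines):

#!/bin/sh
# Report likely-duplicate files under $1 (default: current directory).
dir=${1:-.}

# Sizes that occur more than once:
find "$dir" -type f -printf '%s\n' | sort -n | uniq -d |
while read -r size; do
    # Hash only the files of a non-unique size.  (Re-running find for
    # each size is lazy; one pass storing size+path would scale better.)
    find "$dir" -type f -size "${size}c" -print0 | xargs -0 -r md5sum
done |
# Any checksum (first 32 characters) that repeats marks a duplicate set:
sort | uniq -w32 -D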
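
And the incremental scheme Karsten describes is roughly: keep the hash
list around, stamp each run, and on later runs re-hash only the files
modified since the stamp.  Another untested sketch -- the HASHLIST and
STAMP names are just placeholders for this example:

#!/bin/sh
HASHLIST=$HOME/.md5list
STAMP=$HOME/.md5stamp

if [ -f "$STAMP" ]; then
    # Later runs: hash only files changed since the last pass, and
    # append the fresh checksums to the stored list.
    find "$HOME" -type f -newer "$STAMP" -print0 \
        | xargs -0 -r md5sum >> "$HASHLIST"
else
    # First run -- the "big push": hash everything once.
    find "$HOME" -type f -print0 | xargs -0 -r md5sum > "$HASHLIST"
fi
touch "$STAMP"

# Duplicate candidates are then just repeated checksums in the list:
sort "$HASHLIST" | uniq -w32 -D

Entries for files that have since changed or been deleted go stale, so
the list would want an occasional pruning or full rebuild, but that's
the basic shape of it.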

-c


-- 
Chris Metzler			cmetzler@speakeasy.snip-me.net
		(remove "snip-me." to email)

"As a child I understood how to give; I have forgotten this grace since I
have become civilized." - Chief Luther Standing Bear
