On Tue, Apr 20, 2004 at 03:39:25PM -0400, Chris Metzler (cmetzler@speakeasy.net) wrote:
> On Tue, 20 Apr 2004 21:03:15 +0200
> Wolfgang Pfeiffer <roto@gmx.net> wrote:
> >
> > My goal is to get easily rid of identical files on a system:
>
> I did something like this once for a whole filesystem with a bash
> script. md5sum'ing *everything* is wasteful of time and cpu cycles,
> since (probably) most of the things you'll md5sum won't have duplicates.
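A cheap way around that waste is the size-first filter: only files that
share a byte size can possibly be identical, so you hash just those
candidates. A minimal sketch (the demo tree and paths are illustrative,
and GNU find's -printf is assumed):

```shell
#!/bin/sh
# Size-first filter: md5sum only files whose size collides with another.
# Demo tree is illustrative; GNU find/awk/coreutils assumed.
set -eu

DIR=$(mktemp -d)
printf 'aaaa\n' > "$DIR/x"                  # 5 bytes
printf 'bbbb\n' > "$DIR/y"                  # 5 bytes -- size collision
printf 'a much longer file\n' > "$DIR/z"    # unique size

SIZES=$(mktemp)
find "$DIR" -type f -printf '%s\t%p\n' > "$SIZES"

# First pass counts each size; second pass prints only paths whose
# size occurs more than once -- these are the only files worth hashing.
candidates=$(awk -F'\t' 'NR==FNR {n[$1]++; next} n[$1] > 1 {print $2}' \
                 "$SIZES" "$SIZES" | wc -l)
echo "files worth hashing: $candidates"

rm -rf "$DIR" "$SIZES"
```

Here only x and y would be fed to md5sum; z is skipped entirely because
no other file matches its size.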
However, most files *also* don't change frequently, so after an initial
"big push" effort, you've got a list of hashes which you _can_ test
against readily. Numerous systems checksum files anyway -- tripwire,
for example.
On a daily basis, you need only compute hashes on modified files.
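That incremental scheme might look like this sketch (cache location and
the demo tree are my own illustrations; GNU uniq's -w/--all-repeated and
GNU coreutils md5sum are assumed):

```shell
#!/bin/sh
# One full "big push" hashing pass, then later runs rehash only files
# modified since the stored list was written.
# Demo tree and cache path are illustrative; GNU coreutils assumed.
set -eu

DIR=$(mktemp -d)
CACHE=$(mktemp)

printf 'hello\n' > "$DIR/a.txt"
printf 'hello\n' > "$DIR/b.txt"             # duplicate of a.txt
printf 'world\n' > "$DIR/c.txt"

# Big push: hash everything once, sorted so equal digests sit together.
find "$DIR" -type f -exec md5sum {} + | sort > "$CACHE"

# Duplicates share the 32-character md5 digest in column one.
dups=$(uniq -w32 --all-repeated "$CACHE" | wc -l)
echo "duplicate entries: $dups"

# Daily run: only files newer than the cache need rehashing.
sleep 1                                     # ensure a strictly newer mtime
printf 'changed\n' > "$DIR/c.txt"
modified=$(find "$DIR" -type f -newer "$CACHE" | wc -l)
echo "files to rehash: $modified"

rm -rf "$DIR" "$CACHE"
```

The -newer test is what keeps the daily cost down: after the initial
pass, only the one touched file gets rehashed.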
Peace.
--
Karsten M. Self <kmself@ix.netcom.com> http://kmself.home.netcom.com/
What Part of "Gestalt" don't you understand?
How about outsourcing the Presidency?