
Re: A better backup (was: Re: Question re: splitting files)



Adam Klein <aklein@eskimo.com> writes:

> On Fri, Feb 13, 1998 at 08:25:01AM -0500, Daniel Martin at cush wrote:
> > <snip>
> > Also, a one-bit error can ruin all files recorded after the error -
> > not a great idea.  I really am surprised that there's no standard
> > backup method for debian that:
> > 1) backs up across multiple volumes
> > 2) provides checksums for each file, and
> > 3) allows compression on a file-by-file basis (i.e. allows creation of 
> > an uncompressed archive of compressed files, which is less susceptible 
> > to corruption than a compressed archive of uncompressed files)
> 
> Take a look at the 'afio' package.

I found afio just a few hours after posting, and it does seem that
afio meets requirements 1 and 3 above (which are really the two most
important).  However, it doesn't seem to meet requirement 2
(checksums) - at least, not by itself.  Its control-file facility
appears to be general enough to support something like checksums, but
I'm going to have to experiment before I understand exactly what one
does with control files and how to use them.  (For example, would it
be most appropriate to store the checksum for each file in a separate
control file, to write a control file of checksum data every n files,
or to write one big checksum file at the end of the run?)
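
To make that last option concrete, here is a rough sketch in Python
of the "one big checksum file at the end of the run" approach: hash
every file on the backup list, write the results to a manifest, then
hand the list (manifest included) to afio.  The afio flags are -o
(write an archive, reading pathnames from stdin) and -Z (compress
each file individually), as I read the man page; the manifest name,
archive path, and helper names are made up for illustration, and I
haven't tested the afio end of this.

    #!/usr/bin/env python3
    # Sketch only: build an MD5 manifest for a list of files, then
    # pipe the list (manifest appended) to afio for an archive of
    # individually compressed files.  Paths here are illustrative.
    import hashlib
    import subprocess
    import sys

    def md5_of(path):
        # Hash in 64 KB chunks so large files need not fit in memory.
        h = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    def backup(filelist, manifest="MD5SUMS", archive="/dev/tape"):
        paths = [line.strip() for line in open(filelist) if line.strip()]

        # One big checksum file at the end of the run, in the usual
        # "checksum  filename" format that md5sum understands.
        with open(manifest, "w") as out:
            for p in paths:
                out.write("%s  %s\n" % (md5_of(p), p))

        # afio -o writes an archive from pathnames given on stdin;
        # -Z compresses each file individually (requirement 3 above).
        proc = subprocess.Popen(["afio", "-oZ", archive],
                                stdin=subprocess.PIPE)
        proc.communicate(("\n".join(paths + [manifest]) + "\n").encode())

    if __name__ == "__main__":
        backup(sys.argv[1])

On restore you could then extract the archive and run 'md5sum -c
MD5SUMS' to check every file - not as slick as having afio verify
checksums itself via control files, but it would cover requirement 2.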

Has anyone already done something like this that they'd be willing to
share?  It doesn't seem like it would be too hard, but I see no
reason to reinvent the wheel.

