
Re: best practice for tar and gzip?



On Sun, Aug 12, 2001 at 02:46:42AM -0700, patrick q wrote:

> Hi,
> 
> I have a lot of important archives, up to 10,000 files per ~10 Meg
> tar.gz tarball, that I like to keep as safe as possible.
> 
> I test the archives when I create them, have backups, and off-site
> backups of backups, but I am worried about possible file corruption, ie
> propagating possibly corrupt files through the backup rotation.
>   
> Would it not be better to compress the files individually first and
> then tar them into an archive instead of the normal tar.gz operation,
> to have the best chance of recovering as many files as possible?
> 

Compress files individually. Afio does this, and it is the basis for the
tob (Tape Oriented Backup) utility. The following command should give
you a place to start:

afio -Zvo -b 10k <backup device> < filelist

See the man pages for further info. You can tar the compressed files all
together afterwards. The problem with the usual tar.gz approach is that
if there is an error in the gzipped archive, everything beyond that
point is usually lost, because the gzip stream cannot be resynchronized.
If each file is compressed individually first, corruption is confined to
the damaged member and the rest of the archive remains recoverable.
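If you'd rather stick with plain tar and gzip instead of afio, the same
idea can be sketched like this (file and directory names here are just
illustrative, not from the original post):

```shell
# Make a small demo tree to work on.
mkdir -p /tmp/demo/srcdir
printf 'hello\n' > /tmp/demo/srcdir/a.txt
printf 'world\n' > /tmp/demo/srcdir/b.txt

# Compress each file individually first ...
gzip -9 /tmp/demo/srcdir/*.txt

# ... then tar the already-compressed files together (note: no -z).
tar -C /tmp/demo -cf /tmp/demo/archive.tar srcdir

# A damaged member now costs only that one file; every other .gz can
# still be extracted, and each one can be verified on its own:
tar -tf /tmp/demo/archive.tar
gzip -t /tmp/demo/srcdir/a.txt.gz && echo "a.txt.gz OK"
```

The trade-off is a worse compression ratio, since gzip can no longer
exploit redundancy across files, but for important archives that is
usually an acceptable price for limiting the blast radius of corruption.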

Paul


