
Re: best practice for tar and gzip?



On Sun, 12 Aug 2001 03:46:46 PDT, "Karsten M. Self" writes:
>on Sun, Aug 12, 2001 at 02:46:42AM -0700, patrick q (rpmq@yahoo.com) wrote:
>> I have a lot of important archives, up to 10,000 files per ~10 Meg
>> tar.gz tarball, that I like to keep as safe as possible.
>>
>> I test the archives when I create them, have backups, and off-site
>> backups of backups, but I am worried about possible file corruption, ie
>> propagating possibly corrupt files through the backup rotation.
>> 
>> Would it not be better to compress the files individually first and
>> then tar them into an archive instead of the normal tar.gz operation,
>> to have the best chance of recovering as many files as possible?
<...>
>Your best bet is multiple, redundant, backups, with full verification.
<...>

AOL. ("Me too.")
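
That said, if one really wants to go the route from the original mail -- 
 gzip every file on its own and only then tar up the .gz files -- it could 
 look roughly like the untested Python sketch below; all paths and names 
 are made up for illustration.

  import gzip
  import shutil
  import tarfile
  from pathlib import Path

  SRC  = Path("/srv/data")            # made-up source tree
  WORK = Path("/var/tmp/gz-staging")  # made-up staging area
  WORK.mkdir(parents=True, exist_ok=True)

  # Compress every regular file on its own.
  for f in SRC.rglob("*"):
      if not f.is_file():
          continue
      out = WORK / f.relative_to(SRC)
      out = out.with_name(out.name + ".gz")
      out.parent.mkdir(parents=True, exist_ok=True)
      with open(f, "rb") as src, gzip.open(out, "wb") as dst:
          shutil.copyfileobj(src, dst)

  # Plain "w", not "w:gz" -- the members are already compressed, so a
  # bad block should cost one member, not the whole stream.
  with tarfile.open("/var/tmp/archive.tar", "w") as tar:
      tar.add(WORK, arcname=".")

Whether the worse compression ratio (one gzip stream per file instead of 
 one big one) is worth it is a different question.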

It may also be worth not building one archive (by whatever method) and 
 then propagating it through the whole cycle, but instead making one 
 archive to be stored on-site and a separate one to be stored off-site, 
 ideally at a different time and/or interval.
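
The point is to build each copy fresh from the live data instead of 
 copying archive A around to get archive B. A rough sketch of what each 
 of those independent runs could look like (Python again, target 
 directories invented for the example):

  import datetime
  import tarfile
  from pathlib import Path

  def build_archive(source, target_dir):
      # Create a fresh, timestamped tar.gz straight from the live data.
      stamp = datetime.date.today().isoformat()
      target = Path(target_dir) / ("backup-" + stamp + ".tar.gz")
      with tarfile.open(target, "w:gz") as tar:
          tar.add(source, arcname=Path(source).name)
      return target

  # Invoked from two separate cron jobs, e.g.
  #   nightly:      build_archive("/srv/data", "/mnt/offsite")
  #   around noon:  build_archive("/srv/data", "/mnt/onsite")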

I have several customers where a nightly backup goes off-site and 
 another one is done around noon and stored on-site. Off-site backups 
 are tar.bz2 (for space reasons), on-site backups are a plain dump. If 
 one method fails, there's always the other one.
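
Whatever the format, it pays to read the archive back right after 
 writing it -- that's the "full verification" Karsten mentions. A 
 minimal check for the bz2 case could look like this (Python sketch, 
 the archive name is invented):

  import tarfile

  def verify(archive):
      # Read every member back in full; any corrupt block raises an
      # exception from the bz2/tar layer.
      with tarfile.open(archive, "r:bz2") as tar:
          for member in tar.getmembers():
              if member.isfile():
                  f = tar.extractfile(member)
                  while f.read(1 << 20):  # 1 MiB chunks
                      pass

  verify("/mnt/offsite/backup-2001-08-12.tar.bz2")

Comparing checksums against the live files would be stronger still, but 
 even this catches a truncated or bit-rotted archive.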

cheers,
&rw
-- 
-- Prof:    So the American government went to IBM to come up with
--          a data encryption standard and they came up with ...
-- Student: EBCDIC!
----



