
Re: Extremely large level 1 backups with dump



Karl --

So on my first attempt, I realized that I needed to exclude the /media directory, or else the backup would try to back up the backup drive onto itself.  OK, that's fine.

On the second attempt, the backup got into the /proc directory, complained about some files disappearing, and then froze.
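What I'm trying now, roughly, is to keep the copy on one filesystem and also exclude the problem directories explicitly.  This is only a sketch, and the destination path is just a placeholder for wherever the backup drive mounts:

    # Stay on the source filesystem (-x keeps rsync out of /proc, /sys,
    # /media, etc. when they are separate mounts) and exclude them
    # explicitly as well, just in case.
    rsync -aAXv -x \
        --exclude='/proc/*' --exclude='/sys/*' --exclude='/dev/*' --exclude='/media/*' \
        / /media/backup/root/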

I don't have these problems on my work computer, where I use rsync, but there I only back up my home directory.  Here I'm trying to back up the entire filesystem (actually both the root and boot filesystems).  So it's a much larger world.

-PT

On Sun, Dec 5, 2010 at 7:15 PM, Peter Tenenbaum <peter.g.tenenbaum@gmail.com> wrote:
Well, after having some difficulty getting rsync to do exactly what I want, I've become convinced to try rsnapshot.  I'll let you know how it goes.
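In case it's useful, here is roughly what I'm planning to change from the sample rsnapshot.conf.  The paths and retention counts are just my first guesses, and the fields have to be separated by tabs, not spaces:

    config_version	1.2
    snapshot_root	/media/backup/snapshots/
    # how many snapshots of each kind to keep
    interval	daily	7
    interval	weekly	4
    # keep rsnapshot out of pseudo-filesystems and the backup drive itself
    exclude	/proc/*
    exclude	/sys/*
    exclude	/dev/*
    exclude	/media/*
    # what to back up
    backup	/	localhost/
    backup	/boot/	localhost/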

-PT


On Sat, Dec 4, 2010 at 4:14 PM, Peter Tenenbaum <peter.g.tenenbaum@gmail.com> wrote:
Jochen, Paul --

Having thought this over, I believe the best approach is simply a daily rsync --archive from my main hard drive to the backup drive.  While I understand that more sophisticated backup systems are often useful on a large system, the system in question is a home computer with only two users.  The set of files changes only slowly, and we never delete and rarely overwrite files, so there's no need to be able to, say, recover the version of a file from three days ago.  The backup system is mainly there for disaster recovery, with daily backups preferred just so that we don't lose many e-mail messages in the event of a catastrophic failure.
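Concretely, what I have in mind is nothing more than a root cron job along these lines.  This is a sketch only; the mount point under /media is a placeholder, and the excludes are just there to keep rsync out of the pseudo-filesystems and the backup drive itself:

    #!/bin/sh
    # /etc/cron.daily/mirror-backup -- plain daily mirror, no history.
    # --delete keeps the mirror an exact copy; drop it if you would rather
    # never remove anything from the backup.
    rsync -aAX -x --delete \
        --exclude='/proc/*' --exclude='/sys/*' --exclude='/dev/*' --exclude='/media/*' \
        / /media/backup/root/
    rsync -aAX --delete /boot/ /media/backup/boot/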

Do you concur that a simple rsync makes more sense in this context, or do you think that I would still benefit from using either the --link-dest option or rsnapshot?
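For what it's worth, my understanding of what --link-dest buys is roughly this (dates and paths invented for illustration): each day's run goes into a new dated directory, and anything unchanged since the previous run is hard-linked rather than copied, so every snapshot looks complete but only changed files take extra space.

    # Yesterday's snapshot serves as the link target for today's run.
    rsync -a --link-dest=/media/backup/2010-12-04/ \
        /home/ /media/backup/2010-12-05/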

-PT


