Re: Seeking Wisdom Concerning Backups
On 2/29/08, Kent West <westk@acu.edu> wrote:
I have a small server on which I need to back up the /home partition.
 I have a Barracuda Terastation Pro backup server sitting right next to
 it, connected via Ethernet.
 The problem is that the Terastation Pro only offers three connection
 methods: Windows Fileshare (Samba/smb/cifs), Apple Filesharing (AFS),
 and FTP.
 I came into the Linux world about the time that FTP was being deprecated
 in favor of SFTP and its variants, so I have a real skittishness about
 using plain FTP. AFS is irrelevant for me. And Samba, while slightly
 distasteful, would be okay, except for two problems:
 1. file permissions are not preserved when doing something like rsync, and
 2. tarballs get truncated at an apparent 2GB limit when using tar.
 I don't need anything fancy; just simple and reliable. I had put enough
 effort into learning tar and rsync to make them work (or so I thought!
 Is this 2GB limit when tarring over SMB documented anywhere?), but I
 kept running into show-stoppers like the ones above. I've been very
 frustrated that over the past year of off-and-on "I'm going to get
 serious now and find a solution" I've been unable to find something
 simple(!!!) and reliable. (And by "simple", I mean "easy to comprehend
 in two minutes", not "easy to implement after having mastered every
 command-line switch available".) So I've decided to finally give in and
 ask the big guns on this list.
 Before I spend any more effort trying to set up/learn some other system
 (Bacula, Amanda, whatever), do you folks want to give me any suggestions
 as to the best way to proceed?
 I want something:
 * simple
 * that will back up 10 - 40 GB of /home partition
 * preferably making a full backup every week or so with incrementals
 every day, tossing out old fulls/incrementals as new ones are made
 (see the rough sketch after this list for the sort of rotation I mean)
 * that will work over SMB or FTP securely enough that I can stomach it
 * that preserves directory structure/permissions
 * that doesn't run into arbitrary limits like 2GB (or works around them)
 * is automatic, so once set up, I don't have to think about it
 * does not require X or web server installation/tweaking to configure
 * does not require any sort of server piece (other than perhaps an
 /etc/init.d "service" installed as part of a quick and easy "aptitude
 install ...")
 * does not require fancy back-end stuff, like MySQL
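 For concreteness, here's the rough shape of the rotation I'm picturing,
 written as a cron-driven GNU tar script. The paths and names below are
 purely hypothetical; I haven't actually gotten this working against the
 Terastation, and a big /home is exactly where I hit the 2GB wall:

   #!/bin/sh
   # Sketch only: weekly full + daily incremental backups of /home using
   # GNU tar's --listed-incremental snapshot file. Assumes the Terastation
   # share is already mounted at /mnt/terastation (a made-up path).
   DEST=/mnt/terastation/backups
   SNAP=/var/backups/home.snar            # tar's incremental state file
   DAY=$(date +%a)                        # Sun, Mon, Tue, ... (English locale)
   if [ "$DAY" = "Sun" ]; then
       rm -f "$SNAP"                      # no state file => next dump is a full
       rm -f "$DEST"/home-*.tar.gz        # toss last week's set
   fi
   tar czf "$DEST/home-$DAY.tar.gz" --listed-incremental="$SNAP" /home

 That's the level of "simple" I mean, but I don't trust myself to have
 thought of all the gotchas (for one, this deletes last week's set before
 the new full has finished).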
 I know some of you experts see a solution immediately in using tar or
 rsync, and are thinking, "Well, if Kent had just done his research he'd
 know that if he'd XXX, then YYY...", but that's just it; I'm not a
 full-time researcher of how tar and rsync and Bacula work, and thus I'm
 throwing myself on the mercy of the community for a workable solution.
 (I suspect there may be a lot of people like me who know we need to be
 doing backups but can't find a 2nd-grade-easy system to accomplish the
 task.)
 Thanks for any suggestions/help!
 --
 Kent
 
Hey,
  If all is working except for the 2GB file limit (documented or otherwise), you can just use 'split' to break the archive into smaller parts:
eg:
  `split -b 2000m backup.tar.gz backup.tar.gz.`
or pipe tar straight to it, with tar writing the archive to stdout (`-f -`):
   `tar czf - ${your_dirs} | split -b 2000m - backup.tar.gz.`
  You can then join them by using 'cat':
  `cat backup.tar.gz.* > backup.tar.gz`
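For example, something along these lines (the mount point and file names
are just placeholders, adjust to taste):

  # back up /home straight onto the Terastation share in <2GB pieces,
  # so no single file ever reaches the limit (assumes the share is
  # mounted at /mnt/terastation, a made-up path)
  tar czf - /home | split -b 2000m - /mnt/terastation/home.tar.gz.

  # restore: join the pieces and unpack (run as root to keep ownership)
  cat /mnt/terastation/home.tar.gz.* | tar xzf - -C /

Since permissions and ownership live inside the tarball, the SMB
filesystem never has to preserve them itself.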
 cheers,
Owen.
--
If it ain't broke, fix it till it is.