On Wed, Dec 18, 2019 at 12:02:56PM -0500, rhkramer@gmail.com wrote:
> Aside / Admission: I don't backup all that I should and as often as I should,
> so I'm looking for ways to improve [...]
> Part of the reason for doing my own is that I don't want to be trapped into
> using a system that might disappear or change and leave me with a problem.
I just use rsync. The whole thing is driven from a minimalist script:
#!/bin/bash

home=${HOME:-~}
if test -z "$home" -o \! -d "$home" ; then
    echo "can't backup the homeless, sorry"
    exit 1
fi
# mirror the home directory under the mount point, minus the leading /
backup=/media/backup/${home#/}
rsync -av --delete --filter="merge $home/.backup/filter" "$home"/ "$backup"/
echo -n "syncing..."
sync
echo " done."
df -h
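For completeness, restoring is just rsync run in the other direction. This is a sketch of my own, not part of the script above (note there's no --delete when restoring, so nothing newer at the destination gets wiped by accident); the restore helper and the demo paths are invented for illustration:

```shell
#!/bin/sh
# Hypothetical restore helper -- not from the original post.
command -v rsync >/dev/null || exit 0

restore() {  # restore <backup-dir> <home-dir>
    rsync -av "$1"/ "$2"/
}

# demo on a throwaway tree instead of the real /media/backup:
b=$(mktemp -d); h=$(mktemp -d)
echo "hello" > "$b/file.txt"
restore "$b" "$h"
```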
I mount a USB stick (currently 128G) on /media/backup (the stick has a
LUKS-encrypted file system on it) and invoke backup.
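The mount/unmount steps around the backup aren't shown above; this is my own reconstruction of what they'd look like, with the device node and mapper name being assumptions. The DRYRUN switch defaults to on, so the commands are printed rather than executed:

```shell
#!/bin/sh
# Reconstructed mount/unmount wrapper -- device and mapper names are
# assumptions, not the poster's actual setup.
DRYRUN=${DRYRUN:-1}           # set DRYRUN= to actually run the commands
dev=/dev/sdb1                 # assumed device node of the USB stick
name=cryptbackup              # assumed mapper name

run() { if [ -n "$DRYRUN" ]; then echo "$@"; else "$@"; fi; }

run cryptsetup open "$dev" "$name"          # prompts for the LUKS passphrase
run mount /dev/mapper/"$name" /media/backup
run backup                                  # the script above
run umount /media/backup
run cryptsetup close "$name"
```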
The only not-quite-obvious thing is the option
--filter="merge $home/.backup/filter"
which controls what (not) to back up. That file is a list of excludes
(much shortened here) like so:
- /.cache/
[...much elided...]
- /.xsession-errors
- /tmp
dir-merge .backup-filter
The last line is interesting: it tells rsync to merge a file named
.backup-filter from each directory it visits -- so I can exclude huge
subdirectories I don't need to keep (e.g. because they are easy to
rebuild).
One example of that: I have a subdirectory virt, where I keep virtual
images and install media. Its virt/.backup-filter looks like this:
+ /.backup-filter
+ /notes
- /*
i.e. "just keep .backup-filter and notes, ignore the rest".
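The effect of that dir-merge rule can be seen in a throwaway demonstration (all paths below are invented for the demo; it silently skips if rsync isn't installed):

```shell
#!/bin/sh
# Demo of rsync's dir-merge filter on a scratch tree -- paths invented.
command -v rsync >/dev/null || exit 0
set -e
tmp=$(mktemp -d)
mkdir -p "$tmp/src/virt"
echo "keep me"  > "$tmp/src/virt/notes"
echo "big blob" > "$tmp/src/virt/disk.img"
printf '+ /.backup-filter\n+ /notes\n- /*\n' > "$tmp/src/virt/.backup-filter"

# dir-merge makes rsync pick up .backup-filter in every directory it visits:
rsync -a --filter='dir-merge .backup-filter' "$tmp/src/" "$tmp/dst/"
ls -A "$tmp/dst/virt"    # .backup-filter and notes made it; disk.img didn't
```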
This scheme has served me well over the last ten years. It does have its
limitations: it's sub-optimal with huge files, and it probably won't
scale well to huge amounts of data.
But it's easy to use and easy to understand.
Cheers
-- t