
Re: What tool can I use to make efficient incremental backups?



On Sun, 20 Aug 2017 02:05:46 -0400
Gene Heskett <gheskett@shentel.net> wrote:

> On Saturday 19 August 2017 23:07:01 Celejar wrote:
> 
> > On Thu, 17 Aug 2017 11:47:34 -0500
> >
> > Mario Castelán Castro <marioxcc.MT@yandex.com> wrote:
> > > Hello.
> > >
> > > Currently I use rsync to make the backups of my personal data,
> > > including some manually selected important files of system
> > > configuration. I keep old backups to be more safe from the scenario
> > > where I have deleted something important, I make a backup, and I
> > > only notice the deletion afterwards.
> > >
> > > Each backup snapshot is stored in its own directory. There is much
> > > redundancy between subsequent backups. I use the option
> > > "--link-dest" to make hard links and thus save space for files that
> > > are *identical* to an already-existing file in the backup
> > > repository, but this is still inefficient. Any change to a file,
> > > even to its metadata (permissions, modification time, etc.), will
> > > result in the file being saved as a whole, instead of as a delta.
> > >
> > > Can you suggest a more efficient alternative?
> >
> > There's Borg, which apparently has good deduplication. I've just
> > started using it, but it's a very sophisticated and quite popular
> > piece of software, judging by chatter in various internet threads.
> >
> > https://borgbackup.readthedocs.io/en/stable/
> >
> > Celejar
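Borg's space savings come from content-defined chunking: file data is split into chunks and each distinct chunk is stored once, keyed by its cryptographic hash, so two snapshots of mostly-identical data share storage at the chunk level. The toy sketch below illustrates the idea with only coreutils; it uses fixed-size chunks and hypothetical paths, whereas Borg itself uses variable, rolling-hash chunk boundaries:

```shell
#!/bin/sh
# Toy content-addressed chunk store: snapshots are manifests of chunk
# hashes, and identical chunks are stored only once.
set -e

store=$(mktemp -d)
mkdir "$store/chunks"

# put FILE NAME: split FILE into 4 KiB chunks, store each chunk under
# its SHA-256 hash, and record the hash list as NAME's manifest
put() {
  tmp=$(mktemp -d)
  split -b 4096 "$1" "$tmp/c."
  for c in "$tmp"/c.*; do
    h=$(sha256sum "$c" | cut -d' ' -f1)
    [ -e "$store/chunks/$h" ] || cp "$c" "$store/chunks/$h"
    echo "$h"
  done > "$store/$2.manifest"
  rm -rf "$tmp"
}

f="$store/input.txt"
printf 'hello world\n' > "$f"

# two snapshots of the same data: the chunk is stored only once
put "$f" snap1
put "$f" snap2

echo "chunks stored: $(ls "$store/chunks" | wc -l)"   # prints: chunks stored: 1
```

A real deduplicating backup tool adds encryption, compression, indexing, and chunk-boundary detection that survives insertions in the middle of a file, but the storage model is essentially this.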

> Amanda has quite intelligent ways to do that. I run it nightly and have 

[Snipped lots of miscellaneous, but seemingly irrelevant, discussion about Amanda's virtues.]

Amanda does deduplication? Link?

Celejar

