
Re: Backup Times on a Linux desktop



On 02/11/19 20:24, Konstantin Nebel wrote:
Hi,

this is basically a question about what you guys prefer and do. I have a Linux
desktop, and recently I decided to buy a Raspberry Pi 4 (great device); after
just a couple of days I don't know how I lived without it. So why the
Raspberry Pi?

In the past I decided, on purpose, not to do backups. I decided that the data
on my local computer is not important, to store my important stuff in a
Nextcloud instance I host myself, and to back that up. For a long time I was
just fine with that.

Now I've attached a 4 TB drive to my Pi, and I decided: what the heck, why not
do backups now?

So now I am thinking: how should I approach backups? Windows does backups
magically and reminds me when they haven't run for a while. I like that
attitude.

On Linux, with all that freedom of choice, it can be good and bad, because you
have to think about things :D

(SKIP THIS IF YOU DON'T WANT TO READ TOO MUCH) ;)
I could do the backup on logout, for example, but I am not sure whether that
would be annoying, so I'd like your opinion. Oh, and I like to turn off my
computer at night, so a backup running at night is not really an option unless
I use wake-on-LAN, run the backup, and then turn it off again. Right now I
dual-boot with Windows as the default (for games, shame on me), but I might
switch, because first, gaming on Linux is really becoming good, and second, I
could buy a second GPU for my Linux system and pass one GPU through to a
Windows VM running my games in 3D... especially after buying a Ryzen 3900X
(that's a monster of a CPU).

Whoever read to the end, I'm thankful and ready to hear your opinion.


Cheers
Konstantin


Hi Konstantin,
In my Linux experience I have found several solutions for backup.
First of all: rsync.

Scripted rsync is well suited to your situation. Remember that rsync alone is not a backup tool/system; it is very helpful when you need to sync files between hosts. On top of this you can use the --backup option, which saves the previous copy of each changed file in a different directory before it is overwritten by the new one. You can use SSH to add encryption during transfer. If you add a catalog and some configuration you can use it for multiple clients. In the past I ran my own scripted rsync backup tool, with a catalog, pre-job/post-job scripts, etc.


Then I encountered Bacula. Bacula is a beast: complex and hard to configure at first, but very powerful. It supports pooling, scheduling, mailing, encryption, multiple clients, pre-job/post-job scripts on both server and client, storage on tape or disk, its own cron-like scheduler that works very well, volume recycling, a client GUI, a Windows client, a web interface, and much more. I used it for several servers and it works great. In some situations I prefer to run rsync to a local machine before running the backup, because on large datasets the backup itself takes more time and more network bandwidth, on top of operations like stopping services, creating an LVM snapshot, etc. With large datasets rsync can sync files very quickly, so I only have to stop my service for a very short time and can then perform the backup locally on the synced dataset.


There are also other backup tools, like rsnapshot (based on rsync), which I think is the best solution for you. There are also Bareos (a fork of Bacula), Amanda, restic, duplicity, BackupPC, and Borg.

Borg seems very promising, but at the moment it only performs push backups, and I need pull. It offers deduplication, encryption, and much more.

One word on deduplication: it is a great feature for saving space, and with deduplication the compression operations (which can take a lot of time) are avoided. But remember that with deduplication across multiple backups, only one physical copy of each file is stored. So if that copy gets corrupted (for whatever reason), it is compromised in all previous backup jobs, and the file is lost. For this reason I try to avoid deduplication on important backup datasets.

My 2 cents.



