Re: Backing up a Debian system
On Wed, 31 Jul 2002, Matt Zimmerman wrote:
> On Wed, Jul 31, 2002 at 11:40:55AM -0500, Drew Scott Daniels wrote:
> > My Debian backup steps:
> > 1. I remove files and packages that I'm sure I don't want or need
> > (deborphan can help me figure this out. So could a nice tree of
> > dependencies, but I can't remember what program shows this nicely. Anyone
> > know of such a program?)
> apt-cache + graphviz. But be warned, the output will be larger, more
> complex and less useful than you expect.
The top level is usually the most interesting part. I think I remember a
GNOME or other graphical front end doing something like what I want. I also
know there's a program that keeps track of which packages you want and
automatically removes the undesired packages that creep in.
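A rough sketch of the apt-cache + graphviz approach (the package name is
just an example, and the exact dotty edge format is from memory, so treat
this as a starting point):

```shell
# Dump the dependency graph of one package in graphviz "dot" syntax.
apt-cache dotty bash > bash-deps.dot

# Render it (needs the graphviz package installed).
dot -Tps bash-deps.dot -o bash-deps.ps

# The top level is usually the interesting part: keep only the lines
# that start at the package itself.
grep '^"bash"' bash-deps.dot > bash-top.dot
```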
> > 10. Append "tar -af backupfile.tar " in a text file, at the beginning of
> > every line that lists a file to backup except the first one which I do
> > "tar -cf backupfile.tar" (tr may be helpful, but what's the proper
> > command? I'd prefer to avoid perl, but is it more common than tr? If so
> > what's the proper perl -e line?). I then make the text file executable and
> > execute it.
> Sounds like you want cpio, or tar --files-from.
That would be very helpful. I'd also like to get the list into good form
automatically. As I recall there was some extra blank space in some lines,
and some lines weren't files but titles; grep would help a bit.
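Something like this would do the cleanup and the single tar run (file names
here are made up, and the grep pattern assumes real entries start with "/"):

```shell
# Drop title lines and blanks (anything not starting with "/"),
# and strip trailing whitespace from each entry.
grep '^/' rawlist.txt | sed -e 's/[[:space:]]*$//' > list.txt

# One tar invocation for the whole list instead of one per file.
tar -cf backupfile.tar --files-from=list.txt
```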
> > 11. I run "bzip2 -9 backupfile.tar".
> You really want to do this in a pipeline to avoid wasting a lot of disk space.
It would, but the overhead of doing it line by line was too high. A pipe
would be better when available. A compression program better than bzip2
might also be useful (if you check the bzip2 web site, you'll see the
author sees much room for improvement in the archive format).
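The pipeline version might look like this (with list.txt being the cleaned
file list):

```shell
# tar writes the archive to stdout; bzip2 compresses it on the fly,
# so the uncompressed tar file never touches the disk.
tar -cf - --files-from=list.txt | bzip2 -9 > backupfile.tar.bz2
```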
If the person doing the backup is willing to take the overhead, they may
want to look at re-ordering the files in the tar archive to get better
compression. Order does make a difference, and can make a significant
difference in some cases. Using "file" to put alike files together may
help, but assuming a directory has common files in it is usually a poor
assumption.
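One cheap way to group alike files is to sort the list on the reversed
names, which clusters common extensions together (a crude stand-in for
sorting on "file" output):

```shell
# Reverse each line, sort, reverse back: files sharing an extension
# end up adjacent in the archive, which tends to compress better.
rev list.txt | sort | rev > sorted-list.txt
tar -cf - --files-from=sorted-list.txt | bzip2 -9 > backupfile.tar.bz2
```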
> > I would appreciate help in generating a program/script to automate these
> > steps (and getting it made into an uploaded package). Also help in
> > figuring out a good way to make an incremental backup would be very
> > useful.
> My backups are aimed toward reconstructing the system from a Debian archive
> + the backup. This means saving selections and package versions,
> /usr/local, /etc, /home, /var and any local directory hierarchies. I also
> back up my partition table and the like.
How do you save the package versions? Did I miss that in --get-selections?
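My guess at how the versions could be captured: --get-selections only
records names and install states, so a separate dpkg-query run (if your
dpkg is new enough to have it) would be needed:

```shell
# Package names and their install/deinstall states.
dpkg --get-selections > selections.txt

# Exact installed versions, one "name version" pair per line.
dpkg-query -W -f='${Package} ${Version}\n' > versions.txt
```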
My /etc collects quite a few default configuration files, so many of these
I can get back by reinstalling their packages. I also sometimes have files
that I forget under standard directories, or special modifications.
Ideally all of these would be documented, but some sysadmins inherit a
system, and some don't remember where they put their documentation. I like
to check all directories for non-package files.
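For that check, comparing dpkg's own file lists against what's on disk is
one approach (paths assume a stock dpkg layout; note that files generated
at install time by maintainer scripts will also show up as unowned):

```shell
# Every path dpkg installed, from its per-package file lists.
cat /var/lib/dpkg/info/*.list | sort -u > owned.txt

# Every regular file actually present under /etc.
find /etc -type f | sort > present.txt

# Lines in present.txt that are not in owned.txt: non-package files.
comm -23 present.txt owned.txt
```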