Re: Backups (was: /opt/ again)
On Wed, Sep 15, 1999 at 07:39:19AM +0000, Marc Haber wrote:
> > Considering one can install a fairly robust system (FreeBSD, Debian) over
> > FTP/NFS in under an hour
> If a broadband internet connection is available, yes. That doesn't
> apply to all sites.
Who said anything about an internet connection? Would you do NFS over the
internet? In any shop that cares about getting the machine up "fast", the
quickest way would be to get the bare system on, throw the box on the network,
and either mount a drive over NFS or pull the rest over FTP. At least, that is
how we did things in the shop I was in.
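To make that concrete, something like this (untested sketch; the server name, export path, and mount point are made up for illustration) is all the "throw it on the network" step amounts to once the bare system is on:

#!/usr/bin/env python
# Sketch of the "throw it on the network" step: mount an install/package
# mirror exported over NFS so the rest of the system can be pulled from it.
# The server name, export path, and mount point below are made up.
import subprocess

NFS_EXPORT = "installserver:/export/debian"   # hypothetical NFS export
MOUNT_POINT = "/mnt/install"                  # hypothetical mount point

def mount_install_share(export=NFS_EXPORT, mountpoint=MOUNT_POINT):
    """Mount the install share read-only; packages are then pulled from it."""
    subprocess.run(["mount", "-t", "nfs", "-o", "ro", export, mountpoint],
                   check=True)

if __name__ == "__main__":
    mount_install_share()

The FTP variant is the same idea, just pulling the files instead of mounting them.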
Of course, now we just have master disks for everything: we dupe the disk and
drop the data back in from backup. Takes us ~1/2 hour to replace a machine
and get it back up and running with the previous configuration.
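The dupe step itself is nothing fancy. A minimal sketch (the device names are hypothetical, and the restore-from-backup part is site-specific so I've left it out):

#!/usr/bin/env python
# Minimal sketch of the "dupe the master disk" step using dd.
# /dev/hda (master) and /dev/hdb (replacement disk) are hypothetical;
# adjust for whatever the machine actually has.
import subprocess

MASTER = "/dev/hda"   # disk holding the master image (assumption)
TARGET = "/dev/hdb"   # blank disk going into the replacement box (assumption)

def dupe_disk(master=MASTER, target=TARGET, block_size="1M"):
    """Copy the master disk onto the target with dd, then flush buffers."""
    subprocess.run(["dd", "if=" + master, "of=" + target, "bs=" + block_size],
                   check=True)
    subprocess.run(["sync"], check=True)

if __name__ == "__main__":
    dupe_disk()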
> The DDS-3 tape drive we use can back up and verify about 2 Gig per
> hour. That corresponds to a throughput of roughly 4.5 Mbps. In
> Germany you can either have 2 Mbps or 34 Mbps (the latter costing
> about 5 KEuro per month in line cost without even having it connected
> to an ISP). Thus, your approach is rarely applicable over here.
2 GB for the programs, which are stored on a variety of different media, or
2 GB for the data? I'd rather have 2 GB of data, thanks.
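For what it's worth, the quoted 4.5 Mbps figure checks out, give or take the definition of a gig. A quick Python check (assuming decimal gigabytes):

# Back-of-the-envelope check of the quoted DDS-3 figure:
# 2 GB backed up and verified per hour, expressed as a line rate.
bytes_per_hour = 2 * 10**9                      # 2 GB/hour, decimal gigabytes
mbps = bytes_per_hour * 8 / 3600.0 / 10**6
print("%.1f Mbps" % mbps)                       # ~4.4 Mbps, i.e. roughly 4.5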
> ack. I'd dispense with backing up /usr iff dpkg could re-install all
> packages that are known to be installed from /usr/state with a single
> call.
File a bug on it, then. I would like to see that too, having lost /usr once
on my server at home.
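In the meantime the pieces are already there, it just isn't one call. Something along these lines (sketch only; the selections file path is arbitrary, and apt-get does the actual fetching) would save the package list on the old system and replay it on the fresh install:

#!/usr/bin/env python
# Sketch: save the installed-package list and replay it on a fresh system,
# using dpkg --get-selections / --set-selections plus apt-get dselect-upgrade.
# The selections file path is arbitrary, chosen for illustration.
import subprocess

SELECTIONS = "/var/backups/package-selections"   # hypothetical location

def save_selections(path=SELECTIONS):
    """Dump the current package selections to a plain text file."""
    with open(path, "w") as out:
        subprocess.run(["dpkg", "--get-selections"], stdout=out, check=True)

def restore_selections(path=SELECTIONS):
    """Feed the saved selections back to dpkg and let apt fetch everything."""
    with open(path) as sel:
        subprocess.run(["dpkg", "--set-selections"], stdin=sel, check=True)
    subprocess.run(["apt-get", "-y", "dselect-upgrade"], check=True)

Whether that counts as the single call Marc wants, I'll leave to the bug report.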
> btw, if you do it this way, /usr/local has to be considered data, too.
Depends on what you have in there. If it is stuff that is easily replaced
from source, recompile. I'd back up the sources, not the programs themselves.
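If it helps, the "back up the sources" part can be as dumb as tarring up the local source trees (assuming they live somewhere like /usr/local/src; the paths here are just for illustration):

#!/usr/bin/env python
# Sketch: archive locally-built source trees instead of the installed
# binaries under /usr/local.  Source directory and archive path are
# assumptions, adjust to taste.
import tarfile
import time

SRC_DIR = "/usr/local/src"                              # assumption
ARCHIVE = "/var/backups/local-src-%s.tar.gz" % time.strftime("%Y%m%d")

def backup_sources(src_dir=SRC_DIR, archive=ARCHIVE):
    """Write a compressed tarball of the local source trees."""
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(src_dir, arcname="local-src")

if __name__ == "__main__":
    backup_sources()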
--
Steve C. Lamb | I'm your priest, I'm your shrink, I'm your
ICQ: 5107343 | main connection to the switchboard of souls.
-------------------------------+---------------------------------------------