
Best practices for updating systems over extremely slow links



I have a couple of Debian boxes in very remote areas that are connected back to our WAN via a 56 kbps satellite link.  Most of the time we have a constant stream of data going to and from those machines, so the link is saturated quite a bit.

I'm having all sorts of trouble getting apt to play nicely with the extremely slow link.  When I try to do an apt-get update, it seems to work for a while, then starts downloading whichever list it's currently on all over again.  I tried running apt-get update for about 24 hours and it never completely downloaded the amd64 main Packages.gz (around 7.5 MB); it would just keep starting over.  Sometimes it works, but more than 50% of the time it craps out.  Apt is configured to use a proxy server, and http::Timeout is set to 300 via apt.conf.
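For reference, the relevant bits of /etc/apt/apt.conf look roughly like this (the proxy host below is a placeholder, not our real one):

// /etc/apt/apt.conf -- rough sketch, proxy host is made up
Acquire::http::Proxy "http://proxy.example.internal:3128";
Acquire::http::Timeout "300";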

FWIW, I can reliably rsync files over the sat link without issue.  It takes a while for sure, getting about 0.75 - 1.5 KB/s, but the files do get there.  So it seems like whatever magic is baked into the rsync protocol to handle these slow links works a lot more reliably for me than the HTTP GETs that apt is using.  Running rsync with --bwlimit will work all day, I've found.
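The rsync invocation is roughly this (host and paths are placeholders); --partial is what lets an interrupted transfer pick up where it left off:

rsync -av --partial --bwlimit=1 some-file.deb remotebox:/tmp/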

I'm currently trying to build a list of debs that the system wants, using something like

apt-get dist-upgrade --allow-unauthenticated -y --print-uris | grep -o "'http[^']*'" | tr -d "'" > downloads

then wget'ing them locally and rsyncing them up to the remote (rough commands below).  Seems to be working so far, but the last failed apt-get update appears to have blown away the lists on the remote, and I can no longer see any pending package upgrades on the system.
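That part of the loop is roughly this (host and paths are placeholders); dropping the .debs into /var/cache/apt/archives on the remote should mean apt doesn't try to fetch them itself:

wget -i downloads -P debs/
rsync -av --partial --bwlimit=1 debs/ remotebox:/var/cache/apt/archives/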

I've also tried tarring up /var/lib/apt/lists/* from a known working system and rsyncing that up to the remote, to try to update the lists manually, I guess.  But that didn't seem to work either: after dropping the list files into /var/lib/apt/lists and running apt-get dist-upgrade, it still showed no pending updates.  Not sure why that would be.
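For what it's worth, the list copy was roughly this (host is a placeholder):

tar czf lists.tar.gz -C /var/lib/apt lists
rsync -av --partial --bwlimit=1 lists.tar.gz remotebox:/tmp/
ssh remotebox 'tar xzf /tmp/lists.tar.gz -C /var/lib/apt'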

So after all that,  here are my questions :)

1.  Are there any crappy-link tweaks I can use to help apt transfer data over a 1.5 KB/s link?

2.  In theory, if I wanted to transfer the apt-get update data via rsync, should I be able to tar up /var/lib/apt/lists/* and send that manually?  It didn't seem to work, but I would imagine there's more going on behind the scenes.

3.  I'm generally just curious what others have done to keep systems up to date in very remote places with limited pipes.


Worst case, if we had to burn a CD full of debs monthly and ship it out to the remote site, I guess that would work.  We also have our own custom repos with software that gets updated as well, but sometimes we need to push those updates out ASAP.  Also, there is only one machine at each remote site, so it's not a case of running approx to save X machines from all updating over the network at once.

Thanks guys.


