
Re: 404s on update/install



on Fri, Nov 16, 2001 at 02:12:45PM -0600, Donald R. Spoon (dspoon@satx.rr.com) wrote:
> Noah Meyerhans wrote:
> > 
> > --snip-- <
> > This is occurring because you're trying to update your system while the
> > mirror is itself updating its archive.
> > 
> > In the Debian mirroring scheme, Packages files often end up being
> > updated before the actual package files.  So it's not uncommon for a
> > Packages file to refer to foo_1.2.3-1_i386.deb before that package
> > actually exists in the archive.
> > 
> > This situation doesn't typically last long, and is corrected as soon as
> > the mirror has finished updating.
> > 
> > I have also seen this happen for longer periods of time in the event that
> > a mirror's disk fills up and its archive is not in a consistent state.
> > I know this has happened at least once with one of the machines that
> > comprises http.us.debian.org.
> > 
> > noah
> 
> Sounds quite plausible to me.  Thanks!
> 
> Is there any way to detect this situation "up-front" in the
> process?  The reason I ask is that I have about 5 computers I am
> trying to keep updated here.  Doing each one manually each day, coupled
> with the retries, is getting to be quite a PITA.  It is a good thing
> I am retired, but the wife is getting a little peeved. <g>
> 
> I have been toying with the idea of setting up a cron job to do the
> updates, but I haven't figured out a satisfactory way to detect this
> condition and abort the update then try again a bit later.
> 
> Maybe a script that checks for the presence of this error code and
> then exits without completing, plus a logged message to that effect?  I
> guess you could set the timing of the cron job that calls this script
> such that it gives the delay you want... dunno.
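
That's about the right idea.  A minimal sketch of it (the "Failed to
fetch"/404 pattern and the syslog tag are my guesses, not tested) might
look like:

    # abort (and log) if the mirror is mid-update; let cron retry later
    if apt-get -d -y dist-upgrade 2>&1 | tee /tmp/apt-upgrade.log \
        | grep -q -e '404' -e 'Failed to fetch'
    then
        logger -t apt-cron "mirror looks inconsistent; aborting, will retry"
        exit 1
    fi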

My own fix is as follows.

First, for your five systems, either a local mirror or (my preference) a
large squid cache tuned to hold large (e.g. 10-60 MB) files will speed
up sequential updates.  Set it up as a transparent proxy on your gateway
box.
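
The specific squid settings vary, but these are the knobs that matter;
the sizes, cache path, and gateway name below are illustrative guesses,
not tested values:

    # /etc/squid/squid.conf (fragment): let squid keep .deb-sized objects
    maximum_object_size 65536 KB                 # cache objects up to ~64 MB
    cache_dir ufs /var/spool/squid 1024 16 256   # ~1 GB of cache space

    # /etc/apt/apt.conf on each client, if you skip the transparent-proxy route
    Acquire::http::Proxy "http://gateway.example.lan:3128/";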

For a repeated, auto-retrying request, I run the following, currently
only from the command line, though I'm considering adding it to my
apt-get cron job:

    sleeptime=300           # seconds between retries; several minutes is sane
    apt-get update

    i=1                     # initialize a retry counter

    # Keep trying until the download succeeds
    while ! echo y | apt-get -d dist-upgrade
    do
        sleep $sleeptime    # wait for the problem to go away
        i=$(( i + 1 ))      # increment the counter
        apt-get update      # refresh the Packages lists, in case the mirror has moved on
    done
    echo "Iterations:  $i"

...which will continue until successful.  Set your 'sleeptime' to
something sane.  I figure 1-10 minutes is reasonable.
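
If you do hang this off cron, a crontab entry along these lines (the
script name, schedule, and log path are placeholders) is all it takes:

    # /etc/crontab: run the retry loop above, saved as a script,
    # every morning at 04:00, appending its output to a log
    0 4 * * *   root    /usr/local/sbin/apt-retry >> /var/log/apt-retry.log 2>&1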

If you want to let your other boxen know when the archive's been pulled,
you can set up a semaphore.  Actually, this doesn't much matter as once
you've pulled a particular file, the other boxes should pick it up from
the local cache.
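
If you do want the semaphore anyway, a flag file the other machines
check for is enough; the paths are arbitrary, and the file would need to
live somewhere they can all see (NFS, say):

    # on the box that pulls first, after the retry loop succeeds:
    touch /shared/apt-pulled

    # on each of the other boxes, from their own cron jobs:
    [ -f /shared/apt-pulled ] || exit 0   # not pulled yet; try again later
    apt-get update && echo y | apt-get -d dist-upgrade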

Peace.

-- 
Karsten M. Self <kmself@ix.netcom.com>       http://kmself.home.netcom.com/
 What part of "Gestalt" don't you understand?             Home of the brave
  http://gestalt-system.sourceforge.net/                   Land of the free
   Free Dmitry! Boycott Adobe! Repeal the DMCA! http://www.freesklyarov.org
Geek for Hire                     http://kmself.home.netcom.com/resume.html
