
Re: Offline APT



I got this reply off-list:

On Sat, 2001-09-01 at 12:40, Thomas Bleicher wrote:
> However, I'm in the same situation and I wrote a small Python script to
> split the file with all the needed packages for an upgrade into
> Zip-disk-sized wget scripts (and show me some statistics I'm interested in).
> 
> I do:
> 
> apt-get -qq --print-uris [upgrade | install <something>] > urls
> 
> and then run my script on the file urls.
> 
> When I come back with the filled Zip disks, I just copy all the downloaded
> files to /var/cache/apt/archives/. Then I can run "apt-get upgrade" and
> when the archive is scanned, all packages are found and installed,
> provided all have been downloaded successfully. If not, apt tells me
> what is still missing and I can start again ;)

This is what I did in the end, without the Python script.  I didn't want
to copy 200 MB of debs into archives/, and as the option to specify the
archive directory didn't work for me, I tried symlinking the entire
directory of downloaded debs into the archives directory.  That didn't
work either. :-(  In the end I had to copy them, but I'd prefer to have a
directory on my spare disk that APT uses as its cache.
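For reference, I believe the option in question is spelled
Dir::Cache::archives in apt.conf, and can be set either there or on the
command line, something like:

apt-get -o Dir::Cache::archives=/mnt/spare/apt-archives upgrade

(The path is just a placeholder.  One thing that may have caught me out is
that apt seems to want a partial/ subdirectory inside whatever directory it
is pointed at.  As I said above, I haven't managed to make this work, so
treat it as the documented form rather than a tested recipe.)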

Thomas, could you email me that script?  It sounds like what I want to
do, especially if I could get Dir::Cache::archives working.
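
In the meantime, here is a rough sketch of what I imagine such a script
might look like.  This is my own guess, not Thomas's code: it assumes the
usual --print-uris line format of 'URI' filename size checksum, roughly
95 MB of usable space per Zip disk, and it writes out numbered wget
scripts (wget-01.sh and so on):

#!/usr/bin/env python3
# Rough sketch: split "apt-get -qq --print-uris ..." output into
# Zip-disk-sized wget scripts.  Assumes each input line looks like:
#   'http://host/pool/foo_1.0_i386.deb' foo_1.0_i386.deb 123456 MD5Sum:...
import sys

ZIP_SIZE = 95 * 1024 * 1024   # guess at usable space on a 100 MB Zip disk

def parse(line):
    uri, filename, size = line.split()[:3]
    return uri.strip("'"), filename, int(size)

def main(urlfile):
    entries = [parse(l) for l in open(urlfile) if l.strip()]

    # Pack the downloads into chunks that each fit on one Zip disk.
    chunks, current, used = [], [], 0
    for uri, filename, size in entries:
        if current and used + size > ZIP_SIZE:
            chunks.append(current)
            current, used = [], 0
        current.append((uri, filename, size))
        used += size
    if current:
        chunks.append(current)

    # The statistics Thomas mentions: package count, total size, disk count.
    total = sum(size for _, _, size in entries)
    print("%d packages, %.1f MB, %d zip disks" %
          (len(entries), total / 1048576.0, len(chunks)))

    # Write one wget script per Zip disk.
    for i, chunk in enumerate(chunks, 1):
        name = "wget-%02d.sh" % i
        with open(name, "w") as f:
            f.write("#!/bin/sh\n")
            for uri, filename, size in chunk:
                f.write("wget -c -O '%s' '%s'\n" % (filename, uri))
        print("wrote %s (%.1f MB)" %
              (name, sum(s for _, _, s in chunk) / 1048576.0))

if __name__ == "__main__":
    main(sys.argv[1] if len(sys.argv) > 1 else "urls")

You would run it on the urls file, carry the wget-NN.sh scripts to a
connected machine, and run each one with a Zip disk mounted as the working
directory.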

Regards,
Ross Burton


