
Bug#1012173: apt: Locking timeout for archives



Hi,

(somewhat obsoleted by Julian's reply, but just so I haven't only wasted
 my time, let's waste yours as well by sending it anyway 😉)

On Tue, May 31, 2022 at 12:58:05PM +0200, Jouke Witteveen wrote:
> In our setup, we share an archives cache directory [Dir::Cache::archives]
> between multiple (virtual) machines. This turns out to be an effective
> way to save bandwidth and disk space. However, each machine may lock the
> archives directory and make apt unavailable for the other machines. We had

I am not sure we should encourage this setup. It feels a bit brittle,
especially as not every form of directory sharing even supports
sharing the locks (NFS, I think), so the setup silently works until
it doesn't.
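
(If you want to check whether your particular way of sharing actually
 propagates the locks, a quick test along these lines might help – apt's
 archive lock is, if I remember the code right, a plain fcntl() lock on
 the 'lock' file inside the archives directory, so the snippet mimics
 that. Run it on one machine, keep it running, and run it again on a
 second one: if the second run also reports success, the locks are not
 shared and the setup only appears to work.)

#!/usr/bin/env python3
# crude check: does the shared filesystem propagate fcntl() locks?
import fcntl
import os
import sys
import time

lockpath = sys.argv[1] if len(sys.argv) > 1 else "/var/cache/apt/archives/lock"
fd = os.open(lockpath, os.O_RDWR | os.O_CREAT, 0o640)
try:
    fcntl.lockf(fd, fcntl.LOCK_EX | fcntl.LOCK_NB)
except OSError:
    print("lock is already held elsewhere - locking works across the share")
    sys.exit(1)
print("lock acquired - keep this running and retry from another machine")
time.sleep(300)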

Sadly, I don't really have a good alternative suggestion. The idea I am
on/off playing with is constructing a partial mirror out of
/var/cache/apt/archives (and /var/lib/apt/lists) so that you could use
mirror+file as a source: it would prefer the reconstructed partial
file:/ mirror and otherwise fall back to the online https:// source.
That is nice and all, but certainly not as trivial to set up and
maintain as sharing /var/cache/apt/archives and pinky-promising to each
machine involved not to mess up too badly, given that you are basically
granting each of them full root access over the others.

(I take it that not copying files around is your goal, hence my talking
 about file:/, as that uses the files where they are rather than copying
 them into /var/cache/apt/archives. Otherwise, more "traditional" caching
 solutions are likely an easier/better fit, as Julian mentioned.)
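
(For reference, the sources side of that idea would look roughly like
 the following – all paths, names and the suite are made up for
 illustration, and the actually hard part, generating the Packages
 indices for the partial file:/ mirror, is not shown at all. If I
 recall the mirror method's priority handling correctly, the lower
 number is tried first and the online mirror only serves as fallback:)

/etc/apt/sources.list.d/shared.list:
  deb mirror+file:/etc/apt/mirrorlist.txt stable main

/etc/apt/mirrorlist.txt:
  file:/srv/apt/partial-mirror/debian/  priority:1
  https://deb.debian.org/debian/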


> 3. Implement per-download locking with a timeout as requested in #478961.

That isn't really going to work, as it's not just the download: you
also want to keep the file around until it's no longer "needed", so we
would need to hold onto potentially thousands of locks in a typical
full-upgrade.
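
(To put a rough number on that: each fcntl() lock needs its own open
 file descriptor for as long as the deb has to stick around, and the
 usual RLIMIT_NOFILE soft limit is 1024 per process, so a toy sketch
 like the following hits that wall rather quickly on a big upgrade:)

#!/usr/bin/env python3
# rough illustration: one shared lock per deb already in the cache,
# each needing its own fd until the package would be installed
import fcntl
import glob
import os
import resource

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"RLIMIT_NOFILE: soft={soft} hard={hard}")

held = []
for deb in glob.glob("/var/cache/apt/archives/*.deb"):
    fd = os.open(deb, os.O_RDONLY)
    fcntl.lockf(fd, fcntl.LOCK_SH | fcntl.LOCK_NB)  # "still needed" marker
    held.append(fd)  # must stay open for the lock to stay alive
print(f"holding {len(held)} per-file locks")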

If we just lock individual files while they are being downloaded, then
e.g. an 'apt clean' operation could still remove all the ones we have
already finished downloading. Similarly, 'apt' defaults to removing deb
files after installation nowadays, so the other machine that wanted to
reuse that file ends up failing to find it even though it was there a
second ago…
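
(On machines that intentionally share the cache you would probably want
 to turn that auto-removal off again. From memory the switch is
 APT::Keep-Downloaded-Packages – please double-check against apt.conf(5)
 and the apt changelog before relying on it:)

// /etc/apt/apt.conf.d/90keep-debs (filename made up)
// keep downloaded debs after a successful install instead of deleting them
APT::Keep-Downloaded-Packages "true";
// the 'apt' binary carries its default in the Binary::apt scope, so the
// scoped spelling might be needed for it to take effect there as well
Binary::apt::APT::Keep-Downloaded-Packages "true";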


Best regards

David Kalnischkies


