
Re: RFC: implementation of package pools



On Thu, 19 Oct 2000, Eray Ozkural wrote:

> > No. This is not ideal; it impedes the ability of the ftp team to
> > manipulate the archive. Packaging into sub dirs by source restores some of
> > this ability - in fact organizing by source may be a big win for them, it
> > is too early to tell for sure.
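The by-source layout mentioned above is easy to sketch. The following is a hypothetical illustration only: the `pool/` root, the section argument, and the four-character prefix rule for `lib*` packages are assumptions chosen for the example, not part of the proposal.

```python
def pool_dir(source, section="main"):
    """Return a by-source pool directory for a source package.

    Library packages get a four-character prefix (e.g. 'libp' for
    'libpng') so a single 'l' bucket does not grow unmanageably
    large; everything else is bucketed by its first letter.
    """
    prefix = source[:4] if source.startswith("lib") else source[:1]
    return f"pool/{section}/{prefix}/{source}"

print(pool_dir("apt"))     # pool/main/a/apt
print(pool_dir("libpng"))  # pool/main/libp/libpng
```

The point of such a rule is that a human (or the ftp team) can find a package's directory from its name alone, with no lookup table.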

> Yes, but isn't the idea to provide automated tools that will require
> minimal manual intervention? I guess tools can check for consistency or

We already have such tools, but they cannot be 100% effective no matter
how perfect you make them.

> Well, a tool can run a hashing function. Once the tools are stabilized,
> they will have no problems.

Complicating things like this only increases the chance that an error will
be made during some kind of critical intervention operation.

> A supporting argument: I'm using apt-cache search or apt-get, but avoiding
> manual manipulation of the apt or dpkg databases, because I have reliable
> automated tools.

Yet dpkg still exists, and the files are indeed stored in a text
format. Guess why?

I'm sorry, but advocating an opaque hash really ignores the requirements
of the project.
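To illustrate why a hash-keyed pool is opaque: the directory name carries no information anyone can act on without consulting a lookup table. A hypothetical sketch (MD5 and the two-character bucket depth are arbitrary example choices, not anything actually proposed):

```python
import hashlib

def hashed_pool_dir(filename, depth=2):
    """Bucket a package file under the leading hex digits of the
    MD5 of its name -- evenly distributed across directories, but
    meaningless to a human browsing the archive."""
    digest = hashlib.md5(filename.encode()).hexdigest()
    return f"pool/{digest[:depth]}/{filename}"

print(hashed_pool_dir("apt_0.3.19_i386.deb"))
```

Any manual intervention in such a tree first requires recomputing or looking up the hash, which is exactly the cost being objected to here.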

> For the current set of packages, this may be true. What I'm questioning
> is whether this will hold for 20,000 or 50,000 packages? What's the

Other things will break badly enough that we will never reach those sorts
of numbers; claiming we will is insane. Just to give you something to think
about: 20k packages would make APT require about 7 MB of RAM to operate,
and something like 40 MB of uncompressed Packages files. We just can't do
that.
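The back-of-envelope figures above can be checked. The per-package costs below are assumptions picked to match the quoted estimate (roughly 350 bytes of in-memory cache per package, ~2 KB per uncompressed Packages stanza), not measurements of APT itself:

```python
PACKAGES = 20_000
RAM_PER_PKG = 350     # bytes of in-memory metadata per package (assumed)
STANZA_SIZE = 2_000   # bytes per uncompressed Packages entry (assumed)

ram_mb = PACKAGES * RAM_PER_PKG / 1_000_000
index_mb = PACKAGES * STANZA_SIZE / 1_000_000
print(f"~{ram_mb:.0f} MB RAM, ~{index_mb:.0f} MB of Packages files")
# ~7 MB RAM, ~40 MB of Packages files
```

Both costs scale linearly with the package count, so 50k packages would be roughly 2.5 times worse again.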

> BTW, I really want to pack together:

This is already done or being done.

Jason
