
Re: RFS: http-replicator



On Thu, 19 Aug 2004 14:04:06 +0200, "Eduard Bloch" <edi@gmx.de> said:
> #include <hallo.h>
> * Gertjan van Zwieten [Thu, Aug 19 2004, 12:01:12PM]:
>
> > I admit I was not aware of this perl one-liner you're
> > referring to. I
>
> Such as... for x in *.deb; do perl -e 'read(STDIN,$a,2000);
> $a=~s,^(.*?)\r?\n\r?\n,,ms; print $a, <STDIN>' < $x > pure/$x; done
>
> > didn't find any of this on the website, which by the way seems to be
> > currently offline. Still, even with such script some problems
> > remain. For instance apt-cacher's cache can't be easily turned into
> > an offline cache, which can be useful when doing a fresh install.
> > With replicator it's as simple as generating a local Packages.gz
> > with dpkg-scanpackages.
>
> Okay. But I don't see it as a disadvantage, and even then it would be
> relatively easy to modify apt-cacher to use separate files for package
> contents and HTTP headers.
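
As for the one-liner: thanks, I hadn't seen it before. If I read it
correctly it simply drops the stored HTTP response headers (everything
up to the first blank line) from each cached file. Spread out for
readability it would look roughly like this; pure/ is just an example
output directory and has to exist beforehand:

    for x in *.deb; do
        perl -e '
            read(STDIN, $a, 2000);         # grab the first 2000 bytes
            $a =~ s,^(.*?)\r?\n\r?\n,,ms;  # cut the headers up to the blank line
            print $a, <STDIN>;             # pass the rest through unchanged
        ' < "$x" > "pure/$x"
    done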

Sure, apt-cacher can be modified to do this. But what do you conclude
from that? I mean, I can see it could be a reason to not start another
project, but http-replicator is already finished...
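
To make the offline-cache point concrete, what I have in mind is roughly
the following (the cache path is only an example):

    # turn the cache directory into a local APT repository
    cd /var/cache/http-replicator          # wherever the cached .debs live
    dpkg-scanpackages . /dev/null | gzip > Packages.gz

    # then point apt at it with a sources.list line such as
    #   deb file:/var/cache/http-replicator ./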

Gertjan


