Re: squid cache

On Mon, 27 Aug 2001 05:56, Matt Zimmerman wrote:
> On Sat, Oct 13, 2001 at 11:20:54PM +0200, Russell Coker wrote:
> > On Sat, 13 Oct 2001 01:43, Matt Zimmerman wrote:
> > > For a while, I was using a huge "percent" parameter to cause it to
> > > always bypass that check, and use the "min" age:
> >
> > That shouldn't be necessary though.  Squid tells the web server to only
> > deliver the document if it's newer than the cached copy.  For .deb files
> > that should never happen, so override-lastmod should only prevent that
> > check, saving 500 bytes of transfer but having no impact on the 1M file
> > transfer.
> It doesn't just save 500 bytes of transfer; it avoids the DNS lookup and
> connection setup entirely.  Since we know that these files won't change
> for a full day, this should be avoidable.

True.  But the DNS lookup isn't really an issue when you are requesting 100+ 
files from the same site.
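For reference, the "huge percent parameter" workaround quoted above corresponds 
to Squid's refresh_pattern directive.  A sketch (the pattern and the min/max 
ages here are illustrative values, not a tested recommendation):

```
# squid.conf fragment -- illustrative values only
# Syntax: refresh_pattern [-i] regex min percent max [options]
# .deb files never change once published, so a long minimum age plus
# override-lastmod makes Squid treat cached copies as fresh without
# revalidating against the upstream server's Last-Modified header.
refresh_pattern -i \.deb$ 1440 100% 10080 override-lastmod
```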

Does Squid support persistent connections to upstream sites?  If so then the 
overhead of checking the time stamps upstream shouldn't be significant.
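If I remember correctly, recent Squid releases do expose this as a pair of 
squid.conf directives (I'm not certain which version introduced them, so 
treat the names as an assumption to verify against your installed version):

```
# squid.conf fragment -- directive names assumed from recent Squid releases
# Reuse TCP connections to origin servers and to clients, so a
# revalidation request costs one round trip rather than a full
# DNS lookup plus TCP handshake.
server_persistent_connections on
client_persistent_connections on
```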

> For example, if I am bringing a new server online and my favorite mirror
> is unreachable, I can still get package lists, and install any packages
> that are already cached (which I should rightly be able to do).

True.  Is there any way to tell Squid to recognise a network outage and serve 
objects from the cache without checking expiry?
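The closest thing I'm aware of is Squid's offline_mode directive, which makes 
it serve whatever is cached without revalidating upstream -- though as far as 
I know it has to be toggled by hand (followed by squid -k reconfigure) rather 
than being triggered automatically when the network goes down:

```
# squid.conf fragment
# In offline mode Squid minimises contact with origin servers and
# answers from the cache even for stale objects; it does not detect
# outages on its own, you have to turn this on and off yourself.
offline_mode on
```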

http://www.coker.com.au/bonnie++/     Bonnie++ hard drive benchmark
http://www.coker.com.au/postal/       Postal SMTP/POP benchmark
http://www.coker.com.au/projects.html Projects I am working on
http://www.coker.com.au/~russell/     My home page
