Re: hardware/optimizations for a download-webserver
On Fri, 16 Jul 2004 at 20:53, Henrik Heil wrote:
> please excuse my general questions.
> A customer asked me to setup a dedicated webserver that will offer ~30
> files (each ~5MB) for download and is expected to receive a lot of
> traffic. Most of the users will have cable modems and their download
> speed should not drop below 50KB/sec.
> My questions are:
> What would be adequate hardware to handle, e.g., 50 (average) / 150
> (peak) concurrent downloads?
> What is the typical bottleneck in this setup?
> What optimizations should I apply to a standard woody or sarge
> installation? (anything kernelwise?)
Maybe I'm too optimistic, but I really don't think you will max out any
halfway decent server with this load...
30 x 5 MB gives you 150 MB of content. This should easily be cached in
RAM, even without anything like a ramdisk, as Linux does this by itself.
Disk I/O should not be a problem.
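A quick back-of-envelope check with the figures from the question (file count, file size, and per-client rate all taken from the original mail) also shows where the real limit is likely to sit: at roughly 60 Mbit/s peak, the network uplink is a more plausible bottleneck than the box itself.

```python
# All input figures are from the original question; 1 MB = 1024 KB here.
FILES = 30
FILE_SIZE_MB = 5          # ~5 MB per file
PEAK_DOWNLOADS = 150      # peak concurrent downloads
MIN_RATE_KBPS = 50        # minimum rate per client, KB/s

# Total content size -- this is what the page cache has to hold.
total_content_mb = FILES * FILE_SIZE_MB

# Aggregate throughput at peak, converted KB/s -> Mbit/s.
peak_throughput_kbps = PEAK_DOWNLOADS * MIN_RATE_KBPS
peak_throughput_mbit = peak_throughput_kbps * 1024 * 8 / 1_000_000

print(total_content_mb)      # 150 (MB of content, fits in RAM)
print(peak_throughput_mbit)  # 61.44 (Mbit/s at peak)
```

So even the peak load would saturate a 10 Mbit/s line many times over, while leaving any halfway decent machine idle.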
Furthermore, the content seems to be static, so there is no need for a
fast CPU.
150 concurrent downloads will be no problem for Apache, even with the
default settings. Only if you want to spawn more than 256 child
processes will you have to recompile and increase HARD_SERVER_LIMIT.
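For reference, the relevant Apache 1.3 directives would look something like this (a sketch only; note that the stock config ships with MaxClients 150, which sits exactly at the stated peak, so raising it a bit gives some headroom):

```
# /etc/apache/httpd.conf (Apache 1.3) -- example values, adjust to taste
StartServers      10
MinSpareServers   10
MaxSpareServers   30
MaxClients       200   # > 150 peak; must stay <= HARD_SERVER_LIMIT (256)
```

Anything up to HARD_SERVER_LIMIT needs only a config change and a restart; only beyond that would a recompile be required.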
Summary: Don't bother with tuning the server and don't even think about
setting up a cluster for something like this - definitely overkill. ;o)
I have a Debian box here that currently serves more than 160 requests
per second of dynamic content - no problem at all. The HTTP cluster next
to it is intended to handle WAY bigger loads...
Markus Oswald <email@example.com> \ Unix and Network Administration
Graz, AUSTRIA \ High Availability / Cluster
Mobile: +43 676 6485415 \ System Consulting
Fax: +43 316 428896 \ Web Development