
hardware/optimizations for a download-webserver

Please excuse my general questions.

A customer asked me to set up a dedicated webserver that will offer ~30 files (each ~5MB) for download and is expected to receive a lot of traffic. Most users will be on cable modems, and their download speed should not drop below 50KB/sec.

My questions are:
What would be adequate hardware to handle, say, 50 (average) / 150 (peak) concurrent downloads?
What is the typical bottleneck in this setup?
What optimizations should I apply to a standard woody or sarge installation? (anything kernel-wise?)
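
As a rough sanity check on the bottleneck question, the numbers in the post already point at the network uplink rather than CPU or disk (30 files of ~5MB fit entirely in RAM). A quick back-of-the-envelope, using the figures above:

```shell
# Back-of-the-envelope: aggregate bandwidth at peak
# (150 concurrent downloads and 50 KB/s are the figures from the post)
peak=150
rate=50
total_kbs=$((peak * rate))               # aggregate KB/s needed
total_mbit=$((total_kbs * 8 / 1000))     # rough conversion to Mbit/s
echo "need ~${total_kbs} KB/s = ~${total_mbit} Mbit/s"
```

So roughly 60 Mbit/s sustained at peak -- well beyond a typical colo's 10 Mbit port, and a large chunk of a 100 Mbit one. Whatever hardware is chosen, the uplink contract is likely the first thing to check.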

I have experience with less specialized servers (apache1.x/php4.x hosting on debian woody/sarge) but never really hit any limits with those.

I thought about:

- tuning apache (obviously) -- raising MinSpareServers/MaxSpareServers, AllowOverride None, FollowSymLinks, ...
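
  A minimal sketch of what I mean (Apache 1.3 prefork directives; the numbers are guesses to be tuned under load, not recommendations, and the directory path is made up):

  ```apache
  # Illustrative settings for a static-download-only Apache 1.3
  MinSpareServers     10
  MaxSpareServers     30
  StartServers        20
  MaxClients          200      # must cover the 150-connection peak plus headroom
  KeepAlive           Off      # one 5MB download per connection, no reuse expected

  <Directory /var/www/downloads>
      Options FollowSymLinks   # avoids a stat() of every symlink component
      AllowOverride None       # skips .htaccess lookups on every request
  </Directory>
  ```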

- putting the files on a ramdisk or using mod_mmap_static (only ~600MB altogether)
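
  For the ramdisk variant, a tmpfs mount would look something like this (paths and size are hypothetical; note tmpfs contents vanish on reboot, so the copy step has to run at boot). With only ~600MB of files the kernel page cache will keep everything hot after the first read anyway, so this may buy little:

  ```shell
  # ~600MB of files fits comfortably in a 768MB tmpfs
  mount -t tmpfs -o size=768m tmpfs /var/www/downloads
  cp /srv/files/* /var/www/downloads/

  # or persistently via /etc/fstab:
  # tmpfs  /var/www/downloads  tmpfs  size=768m  0  0
  ```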

- replacing apache with fnord (http://www.fefe.de/fnord/) or cthulhu (http://cthulhu.fnord.at/). Can anyone share experiences with these?
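
  As I understand it, fnord serves the current working directory and is meant to be started under ucspi-tcp's tcpserver (invocation below is my assumption from the docs, so check fnord's README; the -c flag caps concurrent connections):

  ```shell
  # assumed fnord invocation under tcpserver (verify against fnord's README)
  cd /var/www/downloads
  tcpserver -v -R -H -c 200 0 80 /usr/local/bin/fnord
  ```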

- (as a last resort) using two load-balancing servers with LVS (http://www.linuxvirtualserver.org/).
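
  In case it helps anyone comment: the LVS setup I have in mind is direct routing with ipvsadm, roughly like the sketch below (VIP and real-server addresses are placeholders; a real setup also needs the ARP-hiding of the VIP on the real servers):

  ```shell
  # LVS-DR sketch: one virtual service, two real servers
  ipvsadm -A -t 192.0.2.10:80 -s wlc              # virtual IP, weighted least-connections
  ipvsadm -a -t 192.0.2.10:80 -r 192.0.2.11:80 -g  # real server 1, direct routing
  ipvsadm -a -t 192.0.2.10:80 -r 192.0.2.12:80 -g  # real server 2, direct routing
  ```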


Henrik Heil, zweipol Coy & Heil GbR
