
Bug#244897: Problems with big files (size over 2 gigabytes)

Package: apache2
Severity: normal

I've been testing several servers for this kind of problem on a Pentium III
machine with kernel 2.6.5, and I found that many of them were not able to
serve files over 2 GB; apache2 is one of them.

To test this I used a real file of more than 2 GB along with a sparse one
of more than 4 GB and a small one:
     16 -rw-r--r--    1 root     root     5000000001 Apr 20 15:35 large
      4 -rw-r--r--    1 root     root            6 Apr 20 15:31 small
3075004 -rw-r--r--    1 root     root     3145728000 Apr 20 13:39 zeros
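For reference, equivalent test files can be recreated cheaply (a sketch; the file names match the listing above, and the sparse file takes almost no disk space because dd only seeks past the hole before writing a single byte):

```shell
# Six-byte small file, as in the listing above.
echo hello > small
# Sparse ~5 GB file: seek 5 000 000 000 bytes, then write one byte.
dd if=/dev/zero of=large bs=1 count=1 seek=5000000000 2>/dev/null
# A non-sparse 3 GB file would have to be written out in full, e.g.:
#   dd if=/dev/zero of=zeros bs=1M count=3000
ls -ls small large
```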

Apache2 is not able to create an index of any directory containing files
over 2 GB. In my case, with a couple of files over 2 GB present, I had to
move them away or the directory index came back empty.

If you try to get a file over 2 GB from apache2, all you get is a
"403 Forbidden" error, even though the file has the right permissions (the
same ones as the small file). The small file is served correctly, while
any of the big ones gets this reply:

GET /zeros HTTP/1.0

HTTP/1.1 403 Forbidden
Date: Tue, 20 Apr 2004 13:08:37 GMT
Server: Apache/2.0.49 (Debian GNU/Linux)
Content-Length: 415
Connection: close
Content-Type: text/html; charset=iso-8859-1

There is a similar bug in apache (#156972), although its behaviour differs
from the one apache2 presents; that bug report includes source code to
create sparse files for testing.


-- System Information:
Debian Release: testing/unstable
  APT prefers unstable
  APT policy: (500, 'unstable')
Architecture: i386 (i686)
Kernel: Linux 2.6.5
