[please don't steal threads]
[this doesn't belong on debian-devel]
On Thu, 2003-03-06 at 21:14, Patricio Vera S. wrote:
> Hi guys!,
>
> I'm a newbie in Linux and I have a problem: I need to handle about 50,000
> files in one directory. How does Linux handle this? How is the performance? The
> project is:
> I have 2,000 reports monthly, about 270,000 report pages monthly, so I
> think I'll save the reports in a
> YYYYMMDD/<name-of-report>/<name-of-report.num-page> directory tree
> structure.
It depends on the file system. ext2 will probably be dog slow; reiserfs
probably won't be. I don't know about others.
Usually, in this situation you add an extra directory level like
.../path/01/... and .../path/02/..., where the 01, 02, ... subdirectories
exist only to keep the number of files per directory down to something
like 1000.
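A rough sketch of that bucketing in Python (names and layout here are
just illustrative, not an existing tool):

    import os

    MAX_PER_DIR = 1000   # keep each directory to roughly this many entries

    def bucketed_path(base, report, page_num):
        # pages 0-999 go into 00/, 1000-1999 into 01/, and so on
        bucket = "%02d" % (page_num // MAX_PER_DIR)
        subdir = os.path.join(base, report, bucket)
        os.makedirs(subdir, exist_ok=True)
        return os.path.join(subdir, "%s.%d" % (report, page_num))

    # e.g. page 2345 of report "sales" for 2003-03-06 lands in
    #   20030306/sales/02/sales.2345
    print(bucketed_path("20030306", "sales", 2345))

With 270,000 pages a month that keeps every directory small enough that
even a filesystem without directory hashing stays responsive.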
(If you're asking for suggestions: personally, I'd prefer postgresql over
mysql - but that's mostly for ideological and personal reasons.)
cheers
-- vbi
--
P.S. All information contained in the above letter is false,
for reasons of military security.