Re: Slow Script

On Tue, Feb 03, 2009 at 09:02:52PM -0500, Chris Jones wrote:
> More seriously, when you are dealing with 32 million records, one major
> avenue for optimization is to keep disk access to a minimum. Disk access,
> IIRC, is measured in milliseconds, RAM access in nanoseconds or better.
> Do the math..

Given that the posted loop operates entirely on in-memory Perl arrays,
the OP is unlikely to be deliberately[1] accessing the disk during this
process.

[1] If it's a tied array, then it could have some magical disk
interaction behind it, but the OP doesn't appear to have reached a state
of Perl Enlightenment which would allow him to create or optimize magic
that deep.  The other possibility for disk access would be if the
dataset is larger than available RAM and it's getting paged in and out
from disk, which is just bad news for performance no matter how you
slice it.  Aside from those two cases, it looks very unlikely that I/O
would be the bottleneck here.
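As an aside, a tied array really can hide disk I/O behind ordinary array
syntax. Here's a minimal sketch using the core Tie::File module (the temp
file and its contents are purely illustrative), showing that what looks
like a plain array read or write is actually a file operation:

```perl
use strict;
use warnings;
use Tie::File;
use File::Temp qw(tempfile);

# Create a small throwaway file with three lines.
my ($fh, $fname) = tempfile();
print $fh "alpha\nbeta\ngamma\n";
close $fh;

# Tie an array to the file: from here on, every element access
# is really a read or write against the file on disk.
tie my @lines, 'Tie::File', $fname or die "tie failed: $!";

print "$lines[1]\n";    # prints "beta" -- fetched from disk, not RAM
$lines[2] = 'delta';    # rewrites the file's third line on disk

untie @lines;
```

A tight loop over millions of elements of such an array would hit the
disk on every iteration, which is exactly the kind of hidden cost the
quoted advice is warning about.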

Dave Sherohman
NomadNet, Inc.
