
Re: rm -rf is too slow on large files and directory structure (around 30000)



On Wed, Feb 15, 2012 at 4:51 PM, Clive Standbridge
<list-user@tgstandbridges.plus.com> wrote:
>> But may provide some benefit when removing a large number (30000) of
>> files (at least empty ones).
>>
>> cbell@circe:~/test$ time find rm -type f -exec rm {} \;
>>
>> real              0m48.127s
>> user              1m32.926s
>> sys               0m38.750s
>
> First thought - how much of that 48 seconds was spent on launching
> 30000 instances of rm? It would be instructive to try
>
>  time find rm -type f -exec rm {} \+
>
> or the more traditional xargs:
>
>  time find rm -type f -print0 | xargs -0 -r rm
>
> Both of those commands should minimise the number of rm instances.
> Similarly for unlink.

Here are the test results:

cbell@circe:~/test$ time find rm -type f -exec rm {} \+

real    0m0.953s
user    0m0.064s
sys     0m0.884s
cbell@circe:~/test$

cbell@circe:~/test$ time find rm -type f -print0 | xargs -0 -r rm

real    0m0.823s
user    0m0.080s
sys     0m0.824s
cbell@circe:~/test$
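
In case anyone wants to reproduce the numbers, something along these
lines should recreate the test tree (a rough sketch, not the exact
setup: the directory name "rm" matches the paths in the commands
above, and the 30000 empty files come from the subject line):

  mkdir rm
  cd rm
  seq 1 30000 | xargs touch   # xargs batches the names, so touch runs only a few times
  cd ..

Piping through xargs keeps the creation fast for the same reason the
deletion is fast: one process handles many files.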

That pretty much confirms your first thought: almost all of the
original 48 seconds went into launching 30000 separate rm processes.
It doesn't seem possible to run a similar test for unlink, since
unlink operates on only one file at a time. So rm, batched through
find's -exec ... + or through xargs as you suggested, does seem to be
the best way to go (at least for this test case).
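
The closest equivalent with unlink would be something like the
one-liner below (untested here); since unlink accepts exactly one
operand, find has to spawn one process per file, which is exactly the
slow pattern we started with:

  time find rm -type f -exec unlink {} \;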

-- 
Chris

