On Wed, Nov 12, 2008 at 3:44 AM, lee <lee@yun.yagibdah.de> wrote:
> Do you mean it is more likely that any one drive in the array fails
> when you have more drives, or do you mean that it is more likely for a
> drive in the array to fail when you have more drives? If drives fail
> more often when being used in an array with more drives, what makes
> them fail more often under those conditions?
It's purely a statistical property, not anything about being in a RAID
array. Suppose there's (say) a 5% chance for a given drive to fail on
a given day; then there's a 95% chance it won't fail.
If you have two drives, and the failures are independent, the chance
*both* won't fail is the chance of one not failing times the chance of
the other not failing -- 95% times 95%, or 90.25%.
With 24 drives, the chance of all of them surviving the day is .95^24,
or about 29.2%.
Of course, I just made those rates up; the survival chances of real
drives are much higher. But the logic holds: the more drives you're
watching, the luckier you'd have to be for none of them to be a dud.
-jeff