
OS tests: Linux proves to be one of the most stable



I enclose the abstract from a talk that Bart Miller will be giving
today in our Computer Sciences Department.  It is interesting to note
that in this series of tests Linux proved to be one of the most stable
versions of Unix, more stable than commercial versions.  He lists the
GNU utilities as a separate entry.  I take that to mean he tested the
GNU utilities compiled on systems other than GNU/Linux.

His web page
		    http://www.cs.wisc.edu/~bart/
may contain more information on the study.
-- 
Douglas Bates                            bates@stat.wisc.edu
Statistics Department                    608/262-2598
University of Wisconsin - Madison        http://www.stat.wisc.edu/~bates/
------- Start of forwarded message -------
Date: Mon, 13 Oct 1997 06:00:09 -0500 (CDT)
From: "CS Dept. Talks" <colloq@cs.wisc.edu>
Subject: Today's Events


2:30 pm, 2310 CS
Operating Systems and Networking Seminar: Barton P. Miller, University of
   Wisconsin, Madison, "Making Programs Explode: Using Simple Random Testing
   on Real Programs"

        In 1990, we published the results of a study of the reliability of
   standard UNIX utility programs. This study showed that by using simple
   (almost simplistic) random testing techniques, we could crash or hang
   25-33% of these utility programs. Recently, we repeated and significantly
   extended this study using the same basic techniques: subjecting programs
   to random input streams. A distressingly large number of UNIX utilities
   still crash with our tests.

        We tested a wide variety of utility programs on nine UNIX platforms.
   The programs were sent random input streams. We used a conservative and
   crude measure of reliability: a program is considered unreliable if it
   crashes with a core dump or hangs (infinite loop). We also used random
   testing to test X-Window applications and servers, network servers, and
   system library interfaces.
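
        By way of illustration, here is a minimal fuzzing harness of the
   kind the abstract describes. It is a sketch of the general technique (in
   Python), not code from the study: it pipes a random byte stream into a
   filter-style utility on stdin, counts death by a signal as a crash (the
   core-dump case) and a timeout as a hang. The /usr/bin/sort target, the
   input-size limit, and the 10-second timeout are illustrative assumptions.

    # fuzz.py - feed a utility random bytes on stdin and classify the result
    import random
    import subprocess

    def fuzz_once(target, max_bytes=100000, timeout=10):
        """Run `target` on one random input stream; return a verdict."""
        size = random.randrange(1, max_bytes)
        data = bytes(random.randrange(256) for _ in range(size))
        try:
            proc = subprocess.run(target, input=data,
                                  stdout=subprocess.DEVNULL,
                                  stderr=subprocess.DEVNULL,
                                  timeout=timeout)
        except subprocess.TimeoutExpired:
            return "hang"    # never finished: counted as an infinite loop
        if proc.returncode < 0:
            return "crash"   # killed by a signal, e.g. SIGSEGV (core dump)
        return "ok"          # exited on its own; any exit status counts

    if __name__ == "__main__":
        # Hypothetical target: any filter-style utility that reads stdin
        for trial in range(20):
            print(trial, fuzz_once(["/usr/bin/sort"]))

        On POSIX systems a negative return code from subprocess.run means
   the child was killed by a signal (SIGSEGV and friends), which is the
   case that produces a core dump.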

        The major results of this study are: (1) In the last five years, all
   previously-tested versions of UNIX made noticeable improvements in the
   reliability of their utilities. (2) But the failure rate of these systems
   is still distressingly high (from 18-23% in the recent study). (3) Even
   worse is that many of the same bugs that we reported in 1990 are still
   present in the recent code releases. (4) The failure rate of utilities on
   the commercial versions of UNIX that we tested (from Sun, IBM, SGI, DEC,
   and NeXT) ranged from 15 to 43%. (5) The failure rate of the utilities on
   the freely-distributed Linux version of UNIX was second-lowest, at 9%.
   (6) The failure rate of the public GNU utilities was the lowest in our
   study, at only 7%. (7) We could not crash network services on any of the
   versions of UNIX that we tested. (8) Almost 60% of the X-Window
   applications that we tested crash or hang on purely random input data
   streams (random binary data). More significant is that more than 25% of
   the applications crash or hang given random, but legal, X-event streams.
   (9) We could not crash the X servers on the versions of UNIX that we
   tested (i.e., by sending random data streams to the server).

------- End of forwarded message -------

