
Re: gimp1.2: gimp package suggest non-free software



On Wed, Nov 12, 2003 at 10:28:15AM -0600, Steve Langasek wrote:
> On Wed, Nov 12, 2003 at 12:41:10PM +0100, Roberto Suarez Soto wrote:
> > > Sure, keep lowering the quality of Debian in matter of freedom,
> > > there's no matter discussing that.
> > 	I don't see how making more packages available to our users is
> > "lowering the quality of Debian in matter of freedom".
> Oh, you think there's a positive correlation between quality and
> quantity, do you? ;)

Holding the average component quality constant, increasing the quantity
should increase the overall quality, yes. Take the overall quality of
Debian to be the chance it'll solve your problem. If the probability of
any given package addressing your problem is "p", and each package's
quality (ie, the probability that it solves the problems it addresses)
is q_i, then the probability of a given package solving your problem is
p * q_i, and the probability of it not solving your problem is
(1 - p * q_i). So if Q, the overall quality, is the probability of
Debian solving your problem, we have:

                 n
	1 - Q = prod( 1 - p * q_i )
                i=1

                     n
or	Q     = 1 - prod( 1 - p * q_i )
                    i=1
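
If you want to play with the numbers, here's a quick Python sketch of
that product (the p and q_i values are invented, of course):

	# overall quality Q = 1 - prod(1 - p * q_i), with made-up numbers
	p = 0.001                    # chance a given package addresses your problem
	qs = [0.9, 0.8, 0.95, 0.7]   # a few per-package qualities q_i, cycled
	n = 10000                    # number of packages
	q_i = [qs[i % len(qs)] for i in range(n)]

	miss = 1.0
	for q in q_i:
	    miss *= 1 - p * q        # probability this package doesn't solve it
	Q = 1 - miss
	print(Q)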

If we assume q_i is approximately constant, say q_i ~= q for all i,
that's about:

                     n
	Q    ~= 1 - prod( 1 - pq )
                    i=1

	Q    ~= 1 - (1-pq)^n

The relationship between quantity (n) and quality (q, and Q) is reasonably
clear here.
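
Numerically, with constant q (again, made-up values), you can watch Q
climb as n grows:

	# Q ~= 1 - (1 - p*q)**n, with invented p and q
	p, q = 0.001, 0.85
	for n in (1000, 5000, 10000, 20000):
	    print(n, 1 - (1 - p * q) ** n)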

But note that we're treating "Q" as "probability some package in Debian
can solve your problem". We really should be talking about "probability
that you can use Debian to solve your problem", which is slightly more
complicated, and should consider at least the following factors:

	(a) How much effort does it take to find and use the package?

	(b) Maybe I can't solve my problem with one package, but perhaps
	    I can put two or three or more packages together to solve it?

(b) is far too hard to worry about -- it reminds me of those horrible
problems in Physics where you have to worry about every possible path
particles could take to get from point A to point B. Yuck.

(a) is easier, though. It depends on two factors: how good our
descriptions and categorisations are, and how many packages we have. With
things like apt-cache search, and remotely respectable descriptions,
I'm inclined to think we can treat this as a constant effort to find
the packages that address the problem, followed by a linear effort to
look through those packages to weed out the broken ones. Hrm, my stats
are too weak to work the combined effort out from first principles.
Man, do I suck. Oh well. The expected number of packages that address
the problem is:

             n
	c = sum( i * C(n,i) * p^i * (1-p)^(n-i) )  =  n * p
	    i=0
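
(That's just the mean of a binomial distribution. A sanity check in
Python, with made-up n and p; math.comb needs Python 3.8 or later:)

	from math import comb
	n, p = 1000, 0.01
	c = sum(i * comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1))
	print(c, n * p)   # both come out as ~10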

There are two approaches we could take then: one is to say "well, sure,
you get 100 possible packages, but if the first one you look at solves
your problem, then you'll stop", and factor in q_i calculations. I tend
to look through all the packages anyway to find the one that _best_
solves the problem for me, though, so the effort for me is proportional
to c (and hence to n), and the probability of success is 1-(1-q)^c.

So, the work that goes into it is proportional to:

	 n
	sum( i * C(n,i) * p^i * (1-p)^(n-i) )  =  n * p
	i=0

and the expected payoff is:

	~ 1 - (1-pq)^n
or
	1 - (1-q)^c
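
(Comparing the two payoff estimates numerically, with the same invented
p and q as above, and c = n*p:)

	p, q = 0.001, 0.85
	for n in (1000, 5000, 10000):
	    c = n * p
	    print(n, 1 - (1 - p * q) ** n, 1 - (1 - q) ** c)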

Hopefully this analysis will be helpful in evaluating future ITPs.

Cheers,
aj

-- 
Anthony Towns <aj@humbug.org.au> <http://azure.humbug.org.au/~aj/>
I don't speak for anyone save myself. GPG signed mail preferred.

Australian DMCA (the Digital Agenda Amendments) Under Review!
	-- http://azure.humbug.org.au/~aj/blog/copyright/digitalagenda
