
Re: Python modules for every supported version



On Wed, Jun 16, 2004 at 06:07:10PM -0400, Jim Penny wrote:
> On Wed, 16 Jun 2004 14:27:23 -0700
> Matt Zimmerman <mdz@debian.org> wrote:
> > I still do not see why supporting N versions of Python should require
> > N+1 binary packages (or even N).  Why can't they be byte-compiled
> > after installation for all available versions of Python?
> 
> Suppose package foo is installed.  It requires byte compilation.  When
> installed, python2.2 is on the user's box.  It is possible to discover
> the set of pythons on a user's machine, but somewhat nasty.  
> 
> Now, suppose that python2.3 is later installed. How does the
> installation procedure for python2.3 discover that it can (and should)
> bytecompile foo?

It's (morally) no harder than installing menu after installing packages
with menu and menu-method files, or installing a new window manager and
generating its menu. This is a solved problem, and menu is not the only
example of this sort of thing; somebody simply needs to write the code
for python.
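
For illustration only, here's a rough sketch (written for today's Python, and
not any existing Debian interface; how module directories get registered with
the hook is left as an assumption) of the sort of helper such a hook could
call. Each installed interpreter has to do its own compilation, since bytecode
is version-specific:

    #!/usr/bin/env python
    # Hypothetical hook: byte-compile a directory of Python modules for
    # every pythonX.Y interpreter found on the system.  The registration
    # of module directories is assumed, not specified here.
    import glob
    import subprocess
    import sys

    def installed_pythons():
        # e.g. /usr/bin/python2.2, /usr/bin/python2.3, ...
        return sorted(glob.glob('/usr/bin/python[0-9].[0-9]'))

    def byte_compile(module_dir):
        for interpreter in installed_pythons():
            # compileall walks the tree and writes .pyc files for this
            # interpreter; -q keeps the output quiet during installation.
            subprocess.call([interpreter, '-m', 'compileall', '-q', module_dir])

    if __name__ == '__main__':
        byte_compile(sys.argv[1])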

> Certainly it is a much nastier problem, and will, at best, lead to a
> very slow installation process for python2.3.

pythonX.Y already byte-compiles its entire standard library on installation,
which is a substantial number of modules. You'd have to have an awful lot of
add-on Python modules installed before byte-compiling those as well made a
huge difference.

Also, I understood that byte-compilation was purely an optimization: a module
without a .pyc is still perfectly importable, Python just has to recompile it
at each import (it can't normally cache the result in a system directory it
isn't allowed to write to). If you feel that doing it at installation time is
too slow, we could always opt *not* to do it for modules that are deemed
unlikely to be performance-critical (and, of course, people could byte-compile
those themselves if we miss some out). It's a straightforward
convenience/performance trade-off, except that the convenience is largely the
developer's.
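
To spell out the "do it themselves" part: byte-compiling a missed module by
hand is a one-liner with the standard library, where 'foo.py' just stands in
for whichever module someone cares about:

    import py_compile
    # Writes the cached bytecode for foo.py (foo.pyc on the Pythons under
    # discussion here; __pycache__/... on Python 3).
    py_compile.compile('foo.py')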

Cheers,

-- 
Colin Watson                                  [cjwatson@flatline.org.uk]


