On Thu, Jan 26, 2006 at 03:26:25PM +0100, Matthias Klose wrote:
> While preparing some example packages to experiment with
> python-central and python-support, I did see some issues with both
> proposals, in that the dependencies are not fulfilled for every python
> version that both packaging systems claim to support. Feedback is
> welcome.
> For an example see python-pmw (only one binary-all package with the
> same name is built by the source package):
> Package: python-pmw
> Depends: python-tk, python (>= 2.3), python (<< 2.4)
> which, when packaged with one of the packaging systems, becomes:
> Package: python-pmw
> Depends: python-tk, python (>= 2.3)
> Trying to use python-pmw with a python version which is not the
> default will fail if the pythonX.Y-tk package is not installed for
> that version. To generalize, every binary-all python library package
> that depends on a binary-arch package (containing an extension
> module) has this problem.
> Looking at an application using python-pmw (e.g. pymol):
> Package: pymol
> Depends: python (>= 2.3), python-pmw
> The package will still work after an upgrade of the default python
> version. Assuming that an application package uses a specific python
> version, e.g.:
> Depends: ${python:Depends}, python-pmw
> -->
> Depends: python2.3, python-pmw
I think this is invariably going to be the wrong thing to do, just as it is
today. If a package Depends: python2.3, it should also Depend:
python2.3-pmw, not python-pmw. I really don't see any other way to ensure
consistent dependencies except not *providing* a python2.3 package, which
would seem to be an even less satisfactory solution in the general case.
So if python-pmw is intended to contain the bits that make pmw available to
python2.3, it must Provide: python2.3-pmw. When the python-pmw package
ceases to support python2.3, it should drop the Provides; then dpkg will
refuse to upgrade python-pmw without removing the application, which is the
situation we want.
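To make that concrete, a rough sketch of the control stanzas I have in mind
(the exact Depends line is whatever the infrastructure generates; this is
just the shape of it):

  Package: python-pmw
  Depends: python-tk, python (>= 2.3)
  Provides: python2.3-pmw

  Package: pymol
  Depends: python2.3, python2.3-pmw

When python-pmw later drops python2.3 support, the Provides entry goes away
and the conflict with pymol becomes visible to dpkg, as described above.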
> the package dependencies may not be fulfilled anymore. That could be
> solved by adding python2.3-tk to python:Depends. Either the package
> maintainer has to do that explicitly, or dh_python has to find these.
Is dh_python going to find these goodies for us in other cases? Sorry, I've
never used dh_python.
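Whichever tool ends up doing it, I presume the generated result would have
to look roughly like this (purely illustrative, with both supported versions
spelled out):

  Package: python-pmw
  Depends: python-tk, python2.3-tk, python2.4-tk, python (>= 2.3)

i.e. the per-version -tk dependencies get added alongside the unversioned
one, whether by hand or by a helper.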
> The issues (and questions) are:
> - The packaging infrastructure forces the installation of the default
> python version; it's not possible to install a non-default version on
> its own (if at least one package using the infrastructure is
> installed).
> At least that's one thing I can live with; others as well?
Yep, I think that's a reasonable requirement.
> - As outlined above, we cannot enforce correct dependencies with the
> proposed packaging infrastructure (both variants). That is only the
> case when using a non-default python version. AFAICS this would
> violate the Debian policy. Should there be an exception?
No; see above comments.
> - A packaging infrastructure not supporting binary-arch modules
> covers about 50 out of 200 source packages providing python modules
> and extensions (that number may not be accurate, just counting
> packages using the python- and pythonX.Y- prefixes).
> That number can be raised if extension modules for all supported
> python versions are made available in one package (at least for the
> version we transition from and for the version we transition to).
> This approach has its limitations, e.g. python2.3-tk and
> python2.4-tk are built from separate sources and cannot be put in
> one binary package. It does help for packages like zopeinterface
> and twisted, where only very small extension modules are put in
> one package supporting more than one python version. Even larger
> extension modules could be packaged this way for at least the
> time of a python transition to support both old and new versions (a
> package like pygtk2 comes to mind, having many rdepends).
> We still have the limitation that every python module depending
> on a pythonX.Y-foo binary-arch package cannot use the packaging
> infrastructure.
Two comments here. First, I don't think all python extensions *should* be
packaged in a single binary package for all versions; I seem to recall the
Ubuntu wiki page on the topic conceded this point. Second, I don't think
that there's any a priori reason why the python packaging infrastructure
cannot or should not cope with splitting binaries up into python2.x-foo
packages, in an automated fashion.
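To illustrate that second point, a hypothetical layout ('foo' is a made-up
package name) that automated splitting could spit out from a single source:

  Package: python2.3-foo
  Architecture: any
  Depends: python2.3, ${shlibs:Depends}

  Package: python2.4-foo
  Architecture: any
  Depends: python2.4, ${shlibs:Depends}

Nothing there that a helper couldn't generate from a template once it knows
the list of supported python versions.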
Given that this is something we're likely to want to support for a while,
then, I think it would be better if efforts were focused first on making
this happen, with the merging of binary extensions into a single package
sorted out later.
> - AFAICS the proposed packaging infrastructure doesn't help the
> migration of a new python default version to testing much. It does
> help maintainers of these 50 source packages, but still requires
> uploads of the other 150 packages (potentially introducing
> dependencies on newly uploaded libs). Supporting more than one
> python version for binary-arch packages does raise that number.
Yes, that's a fair point. But by now, I think more time has been spent
talking about how to improve things, and/or waiting for an implementation to
happen, than the entire python2.4 transition should have taken. :) Even
having 1/4 fewer packages that need to be managed in the transition will
surely be of help. Having binNMUable, all-in-one binary packages for python
extensions would be icing on the cake -- it's certainly far more ambitious
than anything I initially had in mind when starting to poke at python
policy.
So at this point, I would be inclined to suggest that the byte-compiling
stuff be implemented as a first pass, then we can start the python2.4
transition with what we have, and the Magic Dancing Python Extensions<tm>
can be figured out in parallel to this.
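To be clear about what I mean by "the byte-compiling stuff": just the
install-time compilation, whatever interface the chosen helper ends up
exposing. Purely as a sketch of the shape of it (paths, python version and
package name are made up, and this is not the actual python-central or
python-support interface), a postinst could boil down to:

  #!/bin/sh
  set -e
  # sketch only: byte-compile this package's modules for python2.3
  # when the package is configured
  if [ "$1" = configure ]; then
      python2.3 /usr/lib/python2.3/compileall.py -q \
          /usr/lib/python2.3/site-packages/pmw
  fi

with matching cleanup of the .pyc files in prerm.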
> - Just another proposal could be to keep the package structure
> python-foo, python2.3-foo, python2.4-foo, put all arch-independent
> files into python-foo, using the proposed python infrastructure
> to promote the packages to each python version and putting
> extension modules into the pythonX.Y packages. In case of
> binary-all modules, the pythonX.Y packages are just dependency
> packages.
> That proposal does address the concern of putting the source in
> only one package, avoiding code duplication in binary packages,
> but still requires a new upload of the package for a python
> transition, so it doesn't support a migration to testing very well.
> There are some packages using this kind of setup.
I don't think we need to avoid *all* reuploads of packages during a
transition. Anyway, I guess this is largely equivalent to what I suggested
above (allowing both the existing approach to binary extensions, and your
enhanced automation).
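As I read that last proposal, a pure-python module 'foo' (again a made-up
name) would end up looking something like:

  Package: python-foo
  Architecture: all
  (the actual modules live here)

  Package: python2.3-foo
  Architecture: all
  Depends: python2.3, python-foo
  (dependency package only)

  Package: python2.4-foo
  Architecture: all
  Depends: python2.4, python-foo
  (dependency package only)

which is indeed pretty close to what several packages in the archive already
do today.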
> Can we neglect the dependency issues for modules available for
> non-default python versions, seeing these just as an aid for doing a
> transition, and require packages explicitly using a non-default
> version to add these dependencies themselves?
If you mean, not provide automation for such packages, that seems like a
reasonable approach until someone demands support for it and/or provides a
patch. :)
Cheers,
--
Steve Langasek                   Give me a lever long enough and a Free OS
Debian Developer                   to set it on, and I can move the world.
vorlon@debian.org                                   http://www.debian.org/