Re: some issues with the proposals for the python packaging infrastructure
On Thursday, 26 January 2006 at 15:26 +0100, Matthias Klose wrote:
> While preparing some example packages to experiment with
> python-central and python-support, I did see some issues with both
> proposals, in that the dependencies are not fulfilled for every python
> version that both packaging systems claim to support. Feedback is
> welcome.
>
> For an example see python-pmw (only one binary-all package with the
> same name is built by the source package):
>
> Package: python-pmw
> Depends: python-tk, python (>= 2.3), python (<< 2.4)
>
> which, when packaged with one of the packaging systems, becomes:
>
> Package: python-pmw
> Depends: python-tk, python (>= 2.3)
>
> Trying to use python-pmw with a python version which is not the
> default will fail if the pythonX.Y-tk package is not installed for
> that version. To generalize, every binary-all python library package
> depending on a binary-arch package (containing an extension module)
> has this problem.
This is indeed a very problematic issue, and such a case will always
fail by design with all implementations that have been proposed.
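The effect of the rewritten Depends line can be made concrete with a small sketch. This is not dpkg's actual version-comparison code (dpkg's version semantics are richer); it only models `>=` and `<<` on dotted numeric versions, which is enough for values like "2.3" and "2.4":

```python
# Illustrative sketch, NOT dpkg's comparison algorithm: shows how
# dropping the upper bound "python (<< 2.4)" widens the set of
# python versions the dependency accepts.

def accepts(version, constraints):
    """Return True if `version` satisfies every (op, bound) constraint."""
    v = tuple(int(x) for x in version.split("."))
    for op, bound in constraints:
        b = tuple(int(x) for x in bound.split("."))
        if op == ">=" and not v >= b:
            return False
        if op == "<<" and not v < b:  # "<<" is dpkg's strictly-less-than
            return False
    return True

original = [(">=", "2.3"), ("<<", "2.4")]  # python (>= 2.3), python (<< 2.4)
rewritten = [(">=", "2.3")]                # python (>= 2.3)

print(accepts("2.4", original))   # False: 2.4 correctly excluded
print(accepts("2.4", rewritten))  # True: 2.4 now allowed, yet python2.4-tk
                                  # may well be missing
```

The rewritten form is exactly what lets python-pmw be installed for python2.4 without anything pulling in the tk extension for that version.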
[snip]
> - The packaging infrastructure forces the installation of the default
> python version; it's not possible to install a non-default version on
> its own (if at least one package using the infrastructure is
> installed).
> At least that's one thing I can live with; others as well?
I don't think it's a big issue either.
> - As outlined above, we cannot enforce correct dependencies with the
> proposed packaging infrastructure (both variants). That is only the
> case when using a non-default python version. AFAICS this would
> violate the Debian policy. Should there be an exception?
This would become a nightmare for maintainers. For example, a
maintainer needing to use python2.4-pmw would have to work out by
himself that he needs to depend on:
python2.4, python-pmw, python2.4-tk
(^^^^^^ or python2.4-pmw if there's a Provides:)
This makes dependencies look inconsistent, and it makes it impossible
to easily distinguish which packages don't have correct dependencies.
That's too high a price to pay for only a small part of public modules.
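Concretely, such a maintainer would end up writing something like the following stanza (the package name here is invented for illustration):

```
Package: myapp
Depends: python2.4, python-pmw, python2.4-tk
```

Nothing in this list signals that python2.4-tk is only there because python-pmw needs it under python2.4; a reader cannot tell a deliberate dependency from a worked-around one.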
> - A packaging infrastructure not supporting binary-arch modules
> covers about 50 out of 200 source packages providing python modules
> and extensions (that number may not be accurate, just counting
> packages using the python- and pythonX.Y- prefixes).
>
> That number can be raised, if extension modules for all supported
> python versions are made available in one package (at least for the
> version we transition from and for the version we transition to).
> This approach has its limitations; e.g. python2.3-tk and
> python2.4-tk are built from separate sources and cannot be put in
> one binary package. It does help for packages like zopeinterface
> and twisted, where only very small extension modules are put in
> one package supporting more than one python version. Even larger
> extension modules could be packaged this way for at least the
> duration of a python transition, to support both old and new versions (a
> package like pygtk2 comes to mind, having many rdepends).
>
> We still have the limitation that every python module depending
> on a pythonX.Y-foo binary-arch package cannot use the packaging
> infrastructure.
I really believe mixing binary-arch modules for several python versions
is a slippery slope. We should avoid such horrors whenever possible.
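What "mixing binary-arch modules for several python versions" amounts to is one binary package shipping one .so per interpreter, in each version's own search path. A small sketch of that layout, using the /usr/lib/pythonX.Y/site-packages convention of that era (the module name "foo" is hypothetical):

```python
# Sketch: a single binary package carrying an extension module for
# several python versions would install one copy per version-specific
# site-packages directory. Each interpreter only searches its own
# directory, so the copies coexist.

def extension_paths(module, versions):
    """Install paths for one .so per supported python version."""
    return ["/usr/lib/python%s/site-packages/%s.so" % (v, module)
            for v in versions]

for path in extension_paths("foo", ["2.3", "2.4"]):
    print(path)
```

The duplication this implies for anything but tiny extensions is part of why the approach scales poorly.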
> - AFAICS the proposed packaging infrastructure doesn't help the
> migration of a new python default version to testing much. It does
> help maintainers of these 50 source packages, but still requires
> uploads of the other 150 packages (potentially introducing
> dependencies on newly uploaded libs). Supporting more than one
> python version for binary-arch packages does raise that number.
But makes dependency management a nightmare.
> - Just another proposal could be to keep the package structure
> python-foo, python2.3-foo, python2.4-foo, put all arch independent
> files into python-foo, using the proposed python infrastructure
> to promote the packages to each python version and putting
> extension modules into the pythonX.Y packages. In case of
> binary-all modules, the pythonX.Y packages are just dependency
> packages.
> That proposal does address the concern of putting the source in
> only one package, avoiding code duplication in binary packages,
> but still requires a new upload of the package for a python
> transition, so it does not support a migration to testing very well.
> There are some packages using this kind of setup.
But again, they make dependencies look inconsistent. In this case,
depending on pythonX.Y-foo isn't sufficient to have the foo module
available.
> Can we neglect the dependency issues for modules available for
> non-default python versions, seeing these just as an aid for doing a
> transition and require packages explicitly using a non-default
> version to add these dependencies themselves?
I don't think so. This would sacrifice the packaging simplicity and
clean dependencies on the altar of the testing scripts. While it is a
good idea to help testing scripts whenever possible, it shouldn't be a
goal by itself.
After this new input and some thinking, I'm afraid that none of the
implementations that have been proposed (including the one I wrote)
looks satisfactory for public modules, whether they are binary or
python-only modules.
There is still a situation we can improve easily, though: private
modules. Currently, they have to migrate with python transitions, and
this is only because of byte-compilation. The python-support way of
doing things should still be fine for them, and it can reduce the number
of packages to migrate at once, without complicating anyone's work.
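The reason byte-compilation ties a private module to one python version is that a .pyc file starts with a magic number specific to the interpreter that produced it; another version refuses the file and must recompile. The mechanism can be observed directly (this sketch uses the modern py_compile/importlib API rather than the 2.3-era imp module, but the magic-number scheme is the same idea):

```python
# Sketch: a byte-compiled file carries the compiling interpreter's
# magic number in its first 4 bytes, so bytecode from one python
# version is unusable by another. Paths here are temporary.
import importlib.util
import os
import py_compile
import tempfile

src = os.path.join(tempfile.mkdtemp(), "mymod.py")
with open(src, "w") as f:
    f.write("x = 42\n")

pyc = py_compile.compile(src)  # bytecode for *this* interpreter
with open(pyc, "rb") as f:
    magic = f.read(4)

# Matches only the running interpreter; any other version would
# reject this .pyc and have to recompile the source.
print(magic == importlib.util.MAGIC_NUMBER)  # True
```

Re-byte-compiling on the installed system, as python-support does, is what decouples such packages from the compile-time python version.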
--
.''`. Josselin Mouette /\./\
: :' : josselin.mouette@ens-lyon.org
`. `' joss@debian.org
`- Debian GNU/Linux -- The power of freedom