Bug#1107726: unblock: pytorch{,-cuda}/2.6.0+dfsg-8 (pre-approval)
On Fri, 2025-06-13 at 19:20 +0200, Santiago Vila wrote:
> >
> > We need CUDA >= 12.4 to fix the src:pytorch-cuda FTBFS #1105066,
>
> Hi. Not a release manager, but am I right to think that if we
> do nothing and just wait for CUDA 12.4 to enter testing, the bug
> would be solved as well?
You are right. If pytorch-cuda were in the main section, that would
be true. But it is in contrib, with non-free dependencies.
> Is there a reason why it really needs a rebuild?
> (For example, being statically linked)
We could ask the release team to schedule a binNMU, but the buildds
would refuse to build it because CUDA comes from the non-free section.
The build process also pulls nvidia-cudnn from the internet. So even
if we set XS-Autobuild: Yes and allow the non-free CUDA build
dependency on the buildds, the no-network restriction during sbuild
would still break the installation of another key dependency,
nvidia-cudnn [1].
The only way to update the package is to manually rebuild the binary
packages for every single supported architecture and then upload the
source and binary packages together.
All of this is due to one non-free dependency, plus another non-free
dependency that has to be downloaded from the internet during the build.
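To illustrate, the manual route looks roughly like this (a sketch,
assuming a trixie environment with contrib and non-free enabled,
working network access, and amd64 as the example architecture; the
exact invocations are up to whoever uploads):

    # full source + binary build on one architecture, e.g. amd64
    apt-get build-dep pytorch-cuda
    dpkg-buildpackage -us -uc
    debsign ../pytorch-cuda_2.6.0+dfsg-8_amd64.changes
    dput ftp-master ../pytorch-cuda_2.6.0+dfsg-8_amd64.changes

    # then a binary-only build on each remaining supported architecture
    dpkg-buildpackage -B -us -uc
    debsign ../pytorch-cuda_2.6.0+dfsg-8_<arch>.changes
    dput ftp-master ../pytorch-cuda_2.6.0+dfsg-8_<arch>.changes

The point is that every architecture needs a hands-on build outside
the buildd network.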
We should not leave it as is. The bug will resolve automatically
once CUDA 12.4 migrates, but the fact that the binary package
bin:pytorch-cuda was built against CUDA 12.2 would remain unchanged.
We'd better get it rebuilt against the latest CUDA version.
> (I understand the "correctness" of the updated build-dependency,
> but once trixie is stable, we no longer support building packages
> in trixie with build-dependencies that are not in trixie).
The CUDA package maintainer plans to get CUDA 12.4 into trixie.
With CUDA < 12.4 the package will FTBFS anyway, so the case you
mention will not happen.
[1] The bin:nvidia-cudnn package is an installer script I wrote that
    downloads a redistributable copy of NVIDIA's cuDNN library and
    installs it into the system path. The cuDNN license does not even
    seem suitable for non-free (though I last checked many years ago).