
xmonad failure

Hello, I tried to rebuild the whole Haskell stack on my computer with
dht, so I ran:

dht make-all ... <wait a few days :)>

I ended up with this error:

✓ lab/threadscope_0.2.11.1-3_amd64.changes done
✓ lab/uuagc_0.9.52.1-2_amd64.changes done
✓ lab/libghc-yi-keymap-emacs-dev_0.18.0-1_amd64.deb done
✓ lab/libghc-yi-keymap-emacs-prof_0.18.0-1_amd64.deb done
✓ lab/libghc-yi-keymap-vim-dev_0.18.0-1_amd64.deb done
✓ lab/libghc-yi-keymap-vim-prof_0.18.0-1_amd64.deb done
✓ lab/yi_0.18.0-2_amd64.build done
✓ lab/yi_0.18.0-2_amd64.changes done
make-all: Error when running Shake build system:
* all
* lab/xmonad_0.15-2_amd64.changes

So I looked at the xmonad build log and found this (the chroot is a clean sid):


Processing triggers for man-db (2.9.1-1) ...
Processing triggers for libc-bin (2.30-4) ...
(I)StdLoaders: Parsing and normalizing...
(I)Packages: Parsing Packages file -...
(I)Format822: total packages 59176
(I)Distcheck: Cudf Universe: 59176 packages
(I)Distcheck: --checkonly specified, consider all packages as background packages
(I)Distcheck: Solving...
output-version: 1.2
native-architecture: amd64
  package: sbuild-build-depends-main-dummy
  version: 0.invalid.0
  architecture: amd64
  status: broken
      package: libghc-pandoc-dev
      version: 2.5-3+b1
      architecture: amd64
      unsat-dependency: libghc-http-dev-4000.3.14-a3455:amd64
         package: sbuild-build-depends-main-dummy
         version: 0.invalid.0
         architecture: amd64
         depends: libghc-pandoc-dev:amd64 (>= 1.10)
background-packages: 59175
foreground-packages: 1
total-packages: 59176
broken-packages: 1


But I am a bit lost, because on my unstable computer I can install libghc-pandoc-dev without any issue.

During the build with dht, the packages already rebuilt are used to produce the new ones.


So it seems that the version produced via dht is different from the one in Debian unstable.
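If I read the unsat-dependency correctly, the suffix after libghc-http-dev-4000.3.14- is the GHC ABI hash baked into the virtual package name (this is my assumption about how the hashed Provides works), so a small illustration of what I mean:

```shell
# Assumption: libghc-*-dev packages Provide a virtual package whose last
# dash-separated field is the GHC ABI hash; a local rebuild with a
# different GHC or different dependency versions would get a new hash.
virt="libghc-http-dev-4000.3.14-a3455"   # name from the distcheck output
hash="${virt##*-}"                       # keep only the part after the last '-'
echo "ABI hash the archive pandoc wants: ${hash}"
```

If that guess is right, the libghc-http-dev I rebuilt locally Provides a different hash than a3455, so the archive's libghc-pandoc-dev 2.5-3+b1 can no longer be installed alongside it.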

Do you know if this is normal, and how can I solve this issue?

Thanks for your help.

