
Bug#1033837: unblock: devscripts/2.23.3



Package: release.debian.org
Severity: normal
User: release.debian.org@packages.debian.org
Usertags: unblock
X-Debbugs-Cc: devscripts@packages.debian.org
Control: affects -1 + src:devscripts

Please unblock package devscripts

[ Reason ]
2.23.3 is a bugfix-only release, including one RC bug fix.
It also fixes the generation of the package's extended description.
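For context, the description fix in debian/rules removes a stray slash from the awk range expression used to extract the tool list from README. A minimal sketch of the fixed pipeline follows; the sample README content below is hypothetical, only its marker lines mirror the real file:

```shell
# Hypothetical README excerpt mimicking the structure debian/rules parses
cat > README.sample <<'EOF'
Preamble that must not appear in the description.

- annotate-output: run a command and prepend timestamps to its output
- debchange: edit the changelog from the command line

  exim script for sorting mail
Trailing text that must not appear either.
EOF

# The fixed extraction: print everything between the two marker lines,
# drop blank lines, and indent each remaining line by one space
result="$(awk '/^- annotate-output/,/^  exim script for sorting/' README.sample \
  | sed -e '/^[[:space:]]*$/d' -e 's/^/ /g')"
printf '%s\n' "$result"
```

(The real rule additionally doubles the `$` for make and passes the result to dh_gencontrol via -V.)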

[ Tests ]
Most of the changes include tests, and autopkgtest passes cleanly.

[ Risks ]
The changes are quite trivial by themselves.

[ Checklist ]
  [x] all changes are documented in the d/changelog
  [x] I reviewed all changes and I approve them
  [x] attach debdiff against the package in testing

unblock devscripts/2.23.3

-- 
regards,
                        Mattia Rizzolo

GPG Key: 66AE 2B4A FCCF 3F52 DA18  4D18 4B04 3FCD B944 4540      .''`.
More about me:  https://mapreri.org                             : :'  :
Launchpad user: https://launchpad.net/~mapreri                  `. `'`
Debian QA page: https://qa.debian.org/developer.php?login=mattia  `-
diffstat for devscripts-2.23.2 devscripts-2.23.3

 README                                            |    2 
 debian/changelog                                  |   29 
 debian/control                                    |    5 
 debian/rules                                      |    2 
 lib/Devscripts/Salsa/Hooks.pm                     |  246 +------
 lib/Devscripts/Salsa/check_repo.pm                |   16 
 po4a/po/de.po                                     |    5 
 po4a/po/devscripts.pot                            |    5 
 po4a/po/fr.po                                     |    8 
 po4a/po/pt.po                                     |    7 
 scripts/bts.pl                                    |    2 
 scripts/deb-janitor                               |    6 
 scripts/debootsnap                                |   37 -
 scripts/debootsnap.py                             |  694 ++++++++++++++++++++++
 scripts/devscripts/test/test_debootsnap.py        |   56 +
 scripts/devscripts/test/test_suspicious_source.py |   41 +
 scripts/edit-patch.sh                             |   12 
 scripts/sadt                                      |    2 
 scripts/suspicious-source                         |    2 
 19 files changed, 929 insertions(+), 248 deletions(-)

diff -Nru devscripts-2.23.2/debian/changelog devscripts-2.23.3/debian/changelog
--- devscripts-2.23.2/debian/changelog	2023-02-19 00:56:21.000000000 +0100
+++ devscripts-2.23.3/debian/changelog	2023-03-15 23:52:52.000000000 +0100
@@ -1,3 +1,32 @@
+devscripts (2.23.3) unstable; urgency=medium
+
+  [ Samuel Henrique ]
+  * Fix generation of the extended description (Closes: #1032337)
+
+  [ Benjamin Drung ]
+  * Fix complaints from pylint 2.16.2
+  * suspicious-source: Fix MIME type name for Python code
+  * Add myself to uploaders
+
+  [ Zixing Liu ]
+  * Salsa/check_repo: avoid dependency on Digest::MD5::File (LP: #2007279)
+  * Salsa/Hooks: using if-elsif chains to avoid Switch which is a deprecated
+    package (LP: #2007279)
+
+  [ Johannes Schauer Marin Rodrigues ]
+  * debootsnap:
+    - check to make sure that equivs-build, apt-ftparchive, mmdebstrap,
+      apt-get and dpkg-name exist
+    - allow reading package list from a file
+
+  [ Rémy Martin ]
+  * edit-patch: Fix failure on creating new patch (LP: #1222364)
+
+  [ Paul Wise ]
+  * bts: Fix mangled UTF-8 name
+
+ -- Benjamin Drung <bdrung@debian.org>  Wed, 15 Mar 2023 23:52:52 +0100
+
 devscripts (2.23.2) unstable; urgency=medium
 
   * Team upload.
diff -Nru devscripts-2.23.2/debian/control devscripts-2.23.3/debian/control
--- devscripts-2.23.2/debian/control	2023-02-05 01:14:44.000000000 +0100
+++ devscripts-2.23.3/debian/control	2023-03-15 23:36:26.000000000 +0100
@@ -4,6 +4,7 @@
 Maintainer: Devscripts Maintainers <devscripts@packages.debian.org>
 Uploaders:
  Mattia Rizzolo <mattia@debian.org>,
+ Benjamin Drung <bdrung@debian.org>,
 Build-Depends:
  autodep8 <!nocheck>,
  bash-completion,
@@ -18,7 +19,6 @@
  gnupg <!nocheck> | gnupg2 <!nocheck>,
  help2man,
  isort <!nocheck>,
- libdigest-md5-file-perl <!nocheck>,
  libdistro-info-perl <!nocheck>,
  libdpkg-perl <!nocheck>,
  libfile-desktopentry-perl <!nocheck>,
@@ -32,7 +32,6 @@
  liblist-compare-perl <!nocheck>,
  libmoo-perl <!nocheck>,
  libstring-shellquote-perl <!nocheck>,
- libswitch-perl <!nocheck>,
  libtest-output-perl <!nocheck>,
  libtimedate-perl <!nocheck>,
  libtry-tiny-perl <!nocheck>,
@@ -78,14 +77,12 @@
  file,
  gnupg | gnupg2,
  gpgv | gpgv2,
- libdigest-md5-file-perl,
  libfile-dirlist-perl,
  libfile-homedir-perl,
  libfile-touch-perl,
  libfile-which-perl,
  libipc-run-perl,
  libmoo-perl,
- libswitch-perl,
  libwww-perl,
  patchutils,
  sensible-utils,
diff -Nru devscripts-2.23.2/debian/rules devscripts-2.23.3/debian/rules
--- devscripts-2.23.2/debian/rules	2023-02-10 09:57:47.000000000 +0100
+++ devscripts-2.23.3/debian/rules	2023-02-27 16:46:00.000000000 +0100
@@ -14,4 +14,4 @@
 
 override_dh_gencontrol:
 	dh_gencontrol -- $(SUBSTVARS) \
-		-V"devscripts:LongDesc=$$(cat README| awk '/^- annotate-output/,/^  exim script for sorting//'|sed -e '/^[[:space:]]*$$/d' -e 's/^/ /g')"
+		-V"devscripts:LongDesc=$$(cat README | awk '/^- annotate-output/,/^  exim script for sorting/' | sed -e '/^[[:space:]]*$$/d' -e 's/^/ /g')"
diff -Nru devscripts-2.23.2/lib/Devscripts/Salsa/check_repo.pm devscripts-2.23.3/lib/Devscripts/Salsa/check_repo.pm
--- devscripts-2.23.2/lib/Devscripts/Salsa/check_repo.pm	2023-02-05 00:33:58.000000000 +0100
+++ devscripts-2.23.3/lib/Devscripts/Salsa/check_repo.pm	2023-03-15 16:51:50.000000000 +0100
@@ -3,7 +3,9 @@
 
 use strict;
 use Devscripts::Output;
-use Digest::MD5::File qw(file_md5_hex url_md5_hex);
+use Digest::MD5  qw(md5_hex);
+use Digest::file qw(digest_file_hex);
+use LWP::UserAgent;
 use Moo::Role;
 
 with "Devscripts::Salsa::Repo";
@@ -14,6 +16,14 @@
     return $res;
 }
 
+sub _url_md5_hex {
+    my $res = LWP::UserAgent->new->get(shift());
+    if (!$res->is_success) {
+        return undef;
+    }
+    return Digest::MD5::md5_hex($res->content);
+}
+
 sub _check_repo {
     my ($self, @reponames) = @_;
     my $res = 0;
@@ -101,11 +111,11 @@
             my ($md5_file, $md5_url) = "";
             if ($prms_multipart{avatar}) {
                 ds_verbose "Calculating local checksum";
-                $md5_file = file_md5_hex($prms_multipart{avatar})
+                $md5_file = digest_file_hex($prms_multipart{avatar}, "MD5")
                   or die "$prms_multipart{avatar} failed md5: $!";
                 if ($project->{avatar_url}) {
                     ds_verbose "Calculating remote checksum";
-                    $md5_url = url_md5_hex($project->{avatar_url})
+                    $md5_url = _url_md5_hex($project->{avatar_url})
                       or die "$project->{avatar_url} failed md5: $!";
                 }
                 push @err, "Will set the avatar to be: $prms_multipart{avatar}"
diff -Nru devscripts-2.23.2/lib/Devscripts/Salsa/Hooks.pm devscripts-2.23.3/lib/Devscripts/Salsa/Hooks.pm
--- devscripts-2.23.2/lib/Devscripts/Salsa/Hooks.pm	2023-02-05 00:33:58.000000000 +0100
+++ devscripts-2.23.3/lib/Devscripts/Salsa/Hooks.pm	2023-03-15 16:51:50.000000000 +0100
@@ -4,7 +4,6 @@
 use strict;
 use Devscripts::Output;
 use Moo::Role;
-use Switch;
 
 sub add_hooks {
     my ($self, $repo_id, $repo) = @_;
@@ -187,6 +186,23 @@
     return $res;
 }
 
+sub _check_config {
+    my ($config, $key_name, $config_name, $can_be_private, $res_ref) = @_;
+    if (!$config) { return undef; }
+    for ($config) {
+        if ($can_be_private && ($_ eq "private")) { push @$res_ref, $key_name => "private"; }
+        elsif (qr/y(es)?|true|enabled?/) {
+            push @$res_ref, $key_name => "enabled";
+        }
+        elsif (qr/no?|false|disabled?/) {
+            push @$res_ref, $key_name => "disabled";
+        }
+        else {
+            print "error with SALSA_$config_name";
+        }
+    }
+}
+
 sub desc {
     my ($self, $repo) = @_;
     my @res = ();
@@ -200,217 +216,25 @@
     if ($self->config->build_timeout) {
         push @res, build_timeout => $self->config->build_timeout;
     }
-    if ($self->config->issues) {
-        switch ($self->config->issues) {
-            case "private" { push @res, issues_access_level => "private"; }
-            case qr/y(es)?|true|enabled?/ {
-                push @res, issues_access_level => "enabled";
-            }
-            case qr/no?|false|disabled?/ {
-                push @res, issues_access_level => "disabled";
-            } else {
-                print "error with SALSA_ENABLE_ISSUES";
-            }
-        }
-    }
-    if ($self->config->repo) {
-        switch ($self->config->repo) {
-            case "private" { push @res, repository_access_level => "private"; }
-            case qr/y(es)?|true|enabled?/ {
-                push @res, repository_access_level => "enabled";
-            }
-            case qr/no?|false|disabled?/ {
-                push @res, repository_access_level => "disabled";
-            } else {
-                print "error with SALSA_ENABLE_REPO";
-            }
-        }
-    }
-    if ($self->config->mr) {
-        switch ($self->config->mr) {
-            case "private" {
-                push @res, merge_requests_access_level => "private";
-            }
-            case qr/y(es)?|true|enabled?/ {
-                push @res, merge_requests_access_level => "enabled";
-            }
-            case qr/no?|false|disabled?/ {
-                push @res, merge_requests_access_level => "disabled";
-            } else {
-                print "error with SALSA_ENABLE_MR";
-            }
-        }
-    }
 
-    if ($self->config->forks) {
-        switch ($self->config->forks) {
-            case "private" { push @res, forking_access_level => "private"; }
-            case qr/y(es)?|true|enabled?/ {
-                push @res, forking_access_level => "enabled";
-            }
-            case qr/no?|false|disabled?/ {
-                push @res, forking_access_level => "disabled";
-            } else {
-                print "error with SALSA_ENABLE_FORKS";
-            }
-        }
-    }
-    if ($self->config->lfs) {
-        switch ($self->config->lfs) {
-            case qr/y(es)?|true|enabled?/ {
-                push @res, lfs_enabled => "enabled";
-            }
-            case qr/no?|false|disabled?/ {
-                push @res, lfs_enabled => "disabled";
-            } else {
-                print "error with SALSA_ENABLE_LFS";
-            }
-        }
-    }
-    if ($self->config->packages) {
-        switch ($self->config->packages) {
-            case qr/y(es)?|true|enabled?/ {
-                push @res, packages_enabled => "enabled";
-            }
-            case qr/no?|false|disabled?/ {
-                push @res, packages_enabled => "disabled";
-            } else {
-                print "error with SALSA_ENABLE_PACKAGES";
-            }
-        }
-    }
-    if ($self->config->jobs) {
-        switch ($self->config->jobs) {
-            case "private" { push @res, builds_access_level => "private"; }
-            case qr/y(es)?|true|enabled?/ {
-                push @res, builds_access_level => "enabled";
-            }
-            case qr/no?|false|disabled?/ {
-                push @res, builds_access_level => "disabled";
-            } else {
-                print "error with SALSA_ENABLE_JOBS";
-            }
-        }
-    }
-    if ($self->config->container) {
-        switch ($self->config->container) {
-            case "private" {
-                push @res, container_registry_access_level => "private";
-            }
-            case qr/y(es)?|true|enabled?/ {
-                push @res, container_registry_access_level => "enabled";
-            }
-            case qr/no?|false|disabled?/ {
-                push @res, container_registry_access_level => "disabled";
-            } else {
-                print "error with SALSA_ENABLE_CONTAINER";
-            }
-        }
-    }
-    if ($self->config->analytics) {
-        switch ($self->config->analytics) {
-            case "private" { push @res, analytics_access_level => "private"; }
-            case qr/y(es)?|true|enabled?/ {
-                push @res, analytics_access_level => "enabled";
-            }
-            case qr/no?|false|disabled?/ {
-                push @res, analytics_access_level => "disabled";
-            } else {
-                print "error with SALSA_ENABLE_ANALYTICS";
-            }
-        }
-    }
-    if ($self->config->requirements) {
-        switch ($self->config->requirements) {
-            case "private" {
-                push @res, requirements_access_level => "private";
-            }
-            case qr/y(es)?|true|enabled?/ {
-                push @res, requirements_access_level => "enabled";
-            }
-            case qr/no?|false|disabled?/ {
-                push @res, requirements_access_level => "disabled";
-            } else {
-                print "error with SALSA_ENABLE_REQUIREMENTS";
-            }
-        }
-    }
-    if ($self->config->wiki) {
-        switch ($self->config->wiki) {
-            case "private" { push @res, wiki_access_level => "private"; }
-            case qr/y(es)?|true|enabled?/ {
-                push @res, wiki_access_level => "enabled";
-            }
-            case qr/no?|false|disabled?/ {
-                push @res, wiki_access_level => "disabled";
-            } else {
-                print "error with SALSA_ENABLE_WIKI";
-            }
-        }
-    }
-    if ($self->config->snippets) {
-        switch ($self->config->snippets) {
-            case "private" { push @res, snippets_access_level => "private"; }
-            case qr/y(es)?|true|enabled?/ {
-                push @res, snippets_access_level => "enabled";
-            }
-            case qr/no?|false|disabled?/ {
-                push @res, snippets_access_level => "disabled";
-            } else {
-                print "error with SALSA_ENABLE_SNIPPETS";
-            }
-        }
-    }
-    if ($self->config->pages) {
-        switch ($self->config->pages) {
-            case "private" { push @res, pages_access_level => "private"; }
-            case qr/y(es)?|true|enabled?/ {
-                push @res, pages_access_level => "enabled";
-            }
-            case qr/no?|false|disabled?/ {
-                push @res, pages_access_level => "disabled";
-            } else {
-                print "error with SALSA_ENABLE_PAGES";
-            }
-        }
-    }
-    if ($self->config->releases) {
-        switch ($self->config->releases) {
-            case "private" { push @res, releases_access_level => "private"; }
-            case qr/y(es)?|true|enabled?/ {
-                push @res, releases_access_level => "enabled";
-            }
-            case qr/no?|false|disabled?/ {
-                push @res, releases_access_level => "disabled";
-            } else {
-                print "error with SALSA_ENABLE_RELEASES";
-            }
-        }
-    }
-    if ($self->config->auto_devops) {
-        switch ($self->config->auto_devops) {
-            case qr/y(es)?|true|enabled?/ {
-                push @res, auto_devops_enabled => "enabled";
-            }
-            case qr/no?|false|disabled?/ {
-                push @res, auto_devops_enabled => "disabled";
-            } else {
-                print "error with SALSA_ENABLE_AUTO_DEVOPS";
-            }
-        }
-    }
-    if ($self->config->request_acc) {
-        switch ($self->config->request_acc) {
-            case qr/y(es)?|true|enabled?/ {
-                push @res, request_access_enabled => "enabled";
-            }
-            case qr/no?|false|disabled?/ {
-                push @res, request_access_enabled => "disabled";
-            } else {
-                print "error with SALSA_ENABLE_REQUEST_ACC";
-            }
-        }
-    }
+    #              config value                 key name                           config name      has private
+    _check_config( $self->config->issues,       "issues_access_level",             "ENABLE_ISSUES",       1, \@res );
+    _check_config( $self->config->repo,         "repository_access_level",         "ENABLE_REPO",         1, \@res );
+    _check_config( $self->config->mr,           "merge_requests_access_level",     "ENABLE_MR",           1, \@res );
+    _check_config( $self->config->forks,        "forking_access_level",            "ENABLE_FORKS",        1, \@res );
+    _check_config( $self->config->lfs,          "lfs_enabled",                     "ENABLE_LFS",          0, \@res );
+    _check_config( $self->config->packages,     "packages_enabled",                "ENABLE_PACKAGES",     0, \@res );
+    _check_config( $self->config->jobs,         "builds_access_level",             "ENABLE_JOBS",         1, \@res );
+    _check_config( $self->config->container,    "container_registry_access_level", "ENABLE_CONTAINER",    1, \@res );
+    _check_config( $self->config->analytics,    "analytics_access_level",          "ENABLE_ANALYTICS",    1, \@res );
+    _check_config( $self->config->requirements, "requirements_access_level",       "ENABLE_REQUIREMENTS", 1, \@res );
+    _check_config( $self->config->wiki,         "wiki_access_level",               "ENABLE_WIKI",         1, \@res );
+    _check_config( $self->config->snippets,     "snippets_access_level",           "ENABLE_SNIPPETS",     1, \@res );
+    _check_config( $self->config->pages,        "pages_access_level",              "ENABLE_PAGES",        1, \@res );
+    _check_config( $self->config->releases,     "releases_access_level",           "ENABLE_RELEASES",     1, \@res );
+    _check_config( $self->config->auto_devops,  "auto_devops_enabled",             "ENABLE_AUTO_DEVOPS",  0, \@res );
+    _check_config( $self->config->request_acc,  "request_access_enabled",          "ENABLE_REQUEST_ACC",  0, \@res );
+
     if ($self->config->disable_remove_branch) {
         push @res, remove_source_branch_after_merge => 0;
     } elsif ($self->config->enable_remove_branch) {
diff -Nru devscripts-2.23.2/po4a/po/de.po devscripts-2.23.3/po4a/po/de.po
--- devscripts-2.23.2/po4a/po/de.po	2023-02-19 00:55:44.000000000 +0100
+++ devscripts-2.23.3/po4a/po/de.po	2023-03-15 23:52:50.000000000 +0100
@@ -7,7 +7,7 @@
 msgstr ""
 "Project-Id-Version: devscripts 2.18.9\n"
 "Report-Msgid-Bugs-To: devscripts@packages.debian.org\n"
-"POT-Creation-Date: 2023-02-19 00:34+0100\n"
+"POT-Creation-Date: 2023-03-15 23:43+0100\n"
 "PO-Revision-Date: 2020-04-25 23:04+0200\n"
 "Last-Translator: Chris Leick <c.leick@vollbio.de>\n"
 "Language-Team: de <debian-l10n-german@lists.debian.org>\n"
@@ -14068,7 +14068,8 @@
 "exactly the requested selection of packages. This can be used to re-create a "
 "chroot from the past, for example to reproduce a bug. The tool is also used "
 "by debrebuild to build a package in a chroot with build dependencies in the "
-"same version as recorded in the buildinfo file. [python3-pycurl, mmdebstrap]"
+"same version as recorded in the buildinfo file. [apt-utils, dpkg-dev, "
+"equivs, mmdebstrap, python3-pycurl]"
 msgstr ""
 
 #. type: IP
diff -Nru devscripts-2.23.2/po4a/po/devscripts.pot devscripts-2.23.3/po4a/po/devscripts.pot
--- devscripts-2.23.2/po4a/po/devscripts.pot	2023-02-19 00:56:21.000000000 +0100
+++ devscripts-2.23.3/po4a/po/devscripts.pot	2023-03-15 23:52:52.000000000 +0100
@@ -7,7 +7,7 @@
 msgid ""
 msgstr ""
 "Project-Id-Version: PACKAGE VERSION\n"
-"POT-Creation-Date: 2023-02-19 00:34+0100\n"
+"POT-Creation-Date: 2023-03-15 23:43+0100\n"
 "PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
 "Last-Translator: FULL NAME <EMAIL@ADDRESS>\n"
 "Language-Team: LANGUAGE <LL@li.org>\n"
@@ -11228,7 +11228,8 @@
 "exactly the requested selection of packages. This can be used to re-create a "
 "chroot from the past, for example to reproduce a bug. The tool is also used "
 "by debrebuild to build a package in a chroot with build dependencies in the "
-"same version as recorded in the buildinfo file. [python3-pycurl, mmdebstrap]"
+"same version as recorded in the buildinfo file. [apt-utils, dpkg-dev, "
+"equivs, mmdebstrap, python3-pycurl]"
 msgstr ""
 
 #. type: IP
diff -Nru devscripts-2.23.2/po4a/po/fr.po devscripts-2.23.3/po4a/po/fr.po
--- devscripts-2.23.2/po4a/po/fr.po	2023-02-19 00:55:44.000000000 +0100
+++ devscripts-2.23.3/po4a/po/fr.po	2023-03-15 23:52:50.000000000 +0100
@@ -14,7 +14,7 @@
 msgid ""
 msgstr ""
 "Project-Id-Version: devscripts\n"
-"POT-Creation-Date: 2023-02-19 00:34+0100\n"
+"POT-Creation-Date: 2023-03-15 23:43+0100\n"
 "PO-Revision-Date: 2023-02-10 18:09+0400\n"
 "Last-Translator: Xavier Guimard <yadd@debian.org>\n"
 "Language-Team: French <debian-l10n-french@lists.debian.org>\n"
@@ -14052,14 +14052,16 @@
 "exactly the requested selection of packages. This can be used to re-create a "
 "chroot from the past, for example to reproduce a bug. The tool is also used "
 "by debrebuild to build a package in a chroot with build dependencies in the "
-"same version as recorded in the buildinfo file. [python3-pycurl, mmdebstrap]"
+"same version as recorded in the buildinfo file. [apt-utils, dpkg-dev, "
+"equivs, mmdebstrap, python3-pycurl]"
 msgstr ""
 "Combine  debootstrap and snapshot.debian.org pour créer un B<chroot> "
 "contenant exactement la sélection de paquets demandés. Ceci peut être "
 "utilisé pour recréer un chroot passé, par exemple pour reproduire un bogue. "
 "Cet outil est également utilisé par B<debrebuild> pour construire un paquet "
 "dont les dépendances de construction sont les mêmes que celles enregistrées "
-"dans le fichier buildinfo. [python3-pycurl, mmdebstrap]"
+"dans le fichier buildinfo. [apt-utils, dpkg-dev, equivs, mmdebstrap, "
+"python3-pycurl]"
 
 #. type: IP
 #: ../doc/devscripts.1:75
diff -Nru devscripts-2.23.2/po4a/po/pt.po devscripts-2.23.3/po4a/po/pt.po
--- devscripts-2.23.2/po4a/po/pt.po	2023-02-19 00:55:44.000000000 +0100
+++ devscripts-2.23.3/po4a/po/pt.po	2023-03-15 23:52:50.000000000 +0100
@@ -6,7 +6,7 @@
 msgid ""
 msgstr ""
 "Project-Id-Version: devscripts 2.22.2\n"
-"POT-Creation-Date: 2023-02-19 00:34+0100\n"
+"POT-Creation-Date: 2023-03-15 23:43+0100\n"
 "PO-Revision-Date: 2022-09-04 23:47+0100\n"
 "Last-Translator: Américo Monteiro <a_monteiro@gmx.com>\n"
 "Language-Team: Portuguese <>\n"
@@ -13819,14 +13819,15 @@
 "exactly the requested selection of packages. This can be used to re-create a "
 "chroot from the past, for example to reproduce a bug. The tool is also used "
 "by debrebuild to build a package in a chroot with build dependencies in the "
-"same version as recorded in the buildinfo file. [python3-pycurl, mmdebstrap]"
+"same version as recorded in the buildinfo file. [apt-utils, dpkg-dev, "
+"equivs, mmdebstrap, python3-pycurl]"
 msgstr ""
 "Combina debootstrap e snapshot.debian.org para criar uma chroot que contém "
 "exactamente a selecção de pacotes requerida. Isto pode ser usado para re-"
 "criar uma chroot do passado, por exemplo para reproduzir um bug. A "
 "ferramenta é também usada pelo debrebuild para compilar um pacote numa "
 "chroot com dependências de compilação na mesma versão como registado no "
-"ficheiro buildinfo. [python3-pycurl, mmdebstrap]"
+"ficheiro buildinfo. [apt-utils, dpkg-dev, equivs, mmdebstrap, python3-pycurl]"
 
 #. type: IP
 #: ../doc/devscripts.1:75
diff -Nru devscripts-2.23.2/README devscripts-2.23.3/README
--- devscripts-2.23.2/README	2023-02-05 00:33:58.000000000 +0100
+++ devscripts-2.23.3/README	2023-03-15 23:35:37.000000000 +0100
@@ -113,7 +113,7 @@
   to re-create a chroot from the past, for example to reproduce a bug. The
   tool is also used by debrebuild to build a package in a chroot with build
   dependencies in the same version as recorded in the buildinfo file.
-  [python3-pycurl, mmdebstrap]
+  [apt-utils, dpkg-dev, equivs, mmdebstrap, python3-pycurl]
 
 - debpkg: A wrapper for dpkg used by debi to allow convenient testing
   of packages.  For debpkg to work, it needs to be made setuid root,
diff -Nru devscripts-2.23.2/scripts/bts.pl devscripts-2.23.3/scripts/bts.pl
--- devscripts-2.23.2/scripts/bts.pl	2023-02-05 00:33:58.000000000 +0100
+++ devscripts-2.23.3/scripts/bts.pl	2023-03-15 23:42:05.000000000 +0100
@@ -3000,7 +3000,7 @@
 
 # The following routines are taken from a patched version of MIME::Words
 # posted at http://mail.nl.linux.org/linux-utf8/2002-01/msg00242.html
-# by Richard =?utf-8?B?xIxlcGFz?= (Chepas) <rch@richard.eu.org>
+# by Richard Čepas (Chepas) <rch@richard.eu.org>
 
 sub MIME_encode_B {
     my $str = shift;
diff -Nru devscripts-2.23.2/scripts/deb-janitor devscripts-2.23.3/scripts/deb-janitor
--- devscripts-2.23.2/scripts/deb-janitor	2023-02-05 00:33:58.000000000 +0100
+++ devscripts-2.23.3/scripts/deb-janitor	2023-03-01 00:23:53.000000000 +0100
@@ -102,8 +102,7 @@
         if err.code == 404:
             raise MissingDiffError(err.read().decode()) from err
         raise err
-    else:
-        return data
+    return data
 
 
 def merge(
@@ -299,8 +298,7 @@
         except MissingDiffError as err:
             logging.fatal("%s", err.args[0])
             return 1
-        else:
-            return 0
+        return 0
     if args.subcommand == "merge":
         source = _get_local_source()
         return merge(source, args.campaign, api_url=args.api_url)
diff -Nru devscripts-2.23.2/scripts/debootsnap devscripts-2.23.3/scripts/debootsnap
--- devscripts-2.23.2/scripts/debootsnap	2023-02-18 23:51:19.000000000 +0100
+++ devscripts-2.23.3/scripts/debootsnap	2023-03-15 23:35:37.000000000 +0100
@@ -28,6 +28,7 @@
 import dataclasses
 import http.server
 import os
+import pathlib
 import re
 import shutil
 import socketserver
@@ -59,6 +60,10 @@
     pass
 
 
+class RetryCountExceeded(Exception):
+    pass
+
+
 # pylint: disable=c-extension-no-member
 class Proxy(http.server.SimpleHTTPRequestHandler):
     last_request = None
@@ -189,7 +194,7 @@
                 # restart from the beginning or otherwise, the result might
                 # include a varnish cache error message
         else:
-            raise Exception("failed too often...")
+            raise RetryCountExceeded("failed too often...")
 
 
 @dataclasses.dataclass
@@ -225,6 +230,12 @@
 def parse_pkgs(val):
     if val == "-":
         val = sys.stdin.read()
+    if val.startswith("./") or val.startswith("/"):
+        val = pathlib.Path(val)
+        if not val.exists():
+            print(f"{val} does not exist", file=sys.stderr)
+            sys.exit(1)
+        val = val.read_text(encoding="utf8")
     pkgs = []
     pattern = re.compile(
         r"""
@@ -248,7 +259,7 @@
     return [pkgs]
 
 
-def parse_args():
+def parse_args(args: list[str]) -> argparse.Namespace:
     parser = argparse.ArgumentParser(
         formatter_class=argparse.RawDescriptionHelpFormatter,
         description="""\
@@ -310,7 +321,8 @@
         type=parse_pkgs,
         help="list of packages, optional architecture and version, separated "
         "by comma or linebreak. Read list from standard input if value is "
-        '"-". The option can be specified multiple times. Package name, '
+        '"-". Read list from a file if value starts with "./" or "/". The '
+        "option can be specified multiple times. Package name, "
         "version and architecture are separated by one or more characters "
         "that are not legal in the respective adjacent field. Leading and "
         "trailing illegal characters are allowed. Example: "
@@ -325,7 +337,7 @@
     parser.add_argument(
         "output", nargs="?", default="-", help="path to output chroot tarball"
     )
-    return parser.parse_args()
+    return parser.parse_args(args)
 
 
 def query_metasnap(pkgsleft, archive, nativearch):
@@ -612,8 +624,8 @@
                 shutil.move(tmpdir2 + "/" + debs[0], tmpdirname + "/cache")
 
 
-def main():
-    args = parse_args()
+def main(arguments: list[str]) -> None:
+    args = parse_args(arguments)
     if args.packages:
         pkgs = [v for sublist in args.packages for v in sublist]
         if args.architecture is None:
@@ -646,6 +658,17 @@
         if a != nativearch:
             foreignarches.add(a)
 
+    for tool in [
+        "equivs-build",
+        "apt-ftparchive",
+        "mmdebstrap",
+        "apt-get",
+        "dpkg-name",
+    ]:
+        if shutil.which(tool) is None:
+            print(f"{tool} is required but not installed", file=sys.stderr)
+            sys.exit(1)
+
     sources = compute_sources(pkgs, nativearch, args.ignore_notfound)
 
     if args.sources_list_only:
@@ -668,4 +691,4 @@
 
 
 if __name__ == "__main__":
-    main()
+    main(sys.argv[1:])
diff -Nru devscripts-2.23.2/scripts/debootsnap.py devscripts-2.23.3/scripts/debootsnap.py
--- devscripts-2.23.2/scripts/debootsnap.py	1970-01-01 01:00:00.000000000 +0100
+++ devscripts-2.23.3/scripts/debootsnap.py	2023-03-15 23:35:37.000000000 +0100
@@ -0,0 +1,694 @@
+#!/usr/bin/env python3
+#
+# Copyright 2021 Johannes Schauer Marin Rodrigues <josch@debian.org>
+#
+# Permission is hereby granted, free of charge, to any person obtaining a copy
+# of this software and associated documentation files (the "Software"), to deal
+# in the Software without restriction, including without limitation the rights
+# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+# copies of the Software, and to permit persons to whom the Software is
+# furnished to do so, subject to the following conditions:
+#
+# The above copyright notice and this permission notice shall be included in
+# all copies or substantial portions of the Software.
+
+# This tool is similar to debootstrap but is able to recreate a chroot
+# containing precisely the given package and version selection. The package
+# list is expected on standard input and may be of the format produced by:
+#
+#     dpkg-query --showformat '${binary:Package}=${Version}\n' --show
+
+# The name was suggested by Adrian Bunk as a portmanteau of debootstrap and
+# snapshot.debian.org.
+
+# TODO: Adress invalid names
+# pylint: disable=invalid-name
+
+import argparse
+import dataclasses
+import http.server
+import os
+import pathlib
+import re
+import shutil
+import socketserver
+import subprocess
+import sys
+import tempfile
+import threading
+import time
+from collections import defaultdict
+from contextlib import contextmanager
+from functools import partial
+from http import HTTPStatus
+from operator import itemgetter
+
+import pycurl
+import requests
+from debian.deb822 import BuildInfo
+
+
+class MyHTTPException(Exception):
+    pass
+
+
+class MyHTTP404Exception(Exception):
+    pass
+
+
+class MyHTTPTimeoutException(Exception):
+    pass
+
+
+class RetryCountExceeded(Exception):
+    pass
+
+
+# pylint: disable=c-extension-no-member
+class Proxy(http.server.SimpleHTTPRequestHandler):
+    last_request = None
+    maxretries = 10
+
+    def do_GET(self):  # pylint: disable=too-many-branches,too-many-statements
+        # check validity and extract the timestamp
+        url = "http://snapshot.debian.org/" + self.path
+        start = None
+        state = ""
+        written = 0
+        for retrynum in range(self.maxretries):
+            try:
+                c = pycurl.Curl()
+                c.setopt(c.URL, url)
+                # even 100 kB/s is too much sometimes
+                c.setopt(c.MAX_RECV_SPEED_LARGE, 1000 * 1024)  # bytes per second
+                c.setopt(c.CONNECTTIMEOUT, 30)  # the default is 300
+                # sometimes, curl stalls forever and even ctrl+c doesn't work
+                start = time.time()
+
+                def progress(*_):
+                    # a download must not last more than 10 minutes
+                    # with 100 kB/s this means files cannot be larger than 62MB
+                    if time.time() - start > 10 * 60:
+                        print("transfer took too long")
+                        # the code will not see this exception but instead get a
+                        # pycurl.error
+                        raise MyHTTPTimeoutException(url)
+
+                c.setopt(pycurl.NOPROGRESS, 0)
+                c.setopt(pycurl.XFERINFOFUNCTION, progress)
+                # $ host snapshot.debian.org
+                # snapshot.debian.org has address 185.17.185.185
+                # snapshot.debian.org has address 193.62.202.27
+                # c.setopt(c.RESOLVE, ["snapshot.debian.org:80:185.17.185.185"])
+                if written > 0:
+                    c.setopt(pycurl.RESUME_FROM, written)
+
+                def writer_cb(data):
+                    assert state == "headers sent", state
+                    nonlocal written
+                    written += len(data)
+                    return self.wfile.write(data)
+
+                c.setopt(c.WRITEFUNCTION, writer_cb)
+
+                # using a header callback allows us to send headers of our own
+                # with the correct content-length value out without having to
+                # wait for perform() to finish
+                def header_cb(line):
+                    nonlocal state
+                    # if this is a retry, then the headers have already been
+                    # sent and there is nothing to do
+                    if state == "headers sent":
+                        return
+                    # HTTP standard specifies that headers are encoded in iso-8859-1
+                    line = line.decode("iso-8859-1").rstrip()
+                    # the first try must be a http 200
+                    if line == "HTTP/1.1 200 OK":
+                        assert state == ""
+                        self.send_response(HTTPStatus.OK)
+                        state = "http200 sent"
+                        return
+                    # the header is done
+                    if line == "":
+                        assert state == "length sent"
+                        self.end_headers()
+                        state = "headers sent"
+                        return
+                    field, value = line.split(":", 1)
+                    field = field.strip().lower()
+                    value = value.strip()
+                    # we are only interested in content-length
+                    if field != "content-length":
+                        return
+                    assert state == "http200 sent"
+                    self.send_header("Content-Length", value)
+                    state = "length sent"
+
+                c.setopt(c.HEADERFUNCTION, header_cb)
+                c.perform()
+                if c.getinfo(c.RESPONSE_CODE) == 404:
+                    raise MyHTTP404Exception(f"got HTTP 404 for {url}")
+                if c.getinfo(c.RESPONSE_CODE) not in [200, 206]:
+                    raise MyHTTPException(
+                        f"got HTTP {c.getinfo(c.RESPONSE_CODE)} for {url}"
+                    )
+                c.close()
+                # if the request finished too quickly, sleep the remaining time
+                # seconds/request -> requests/hour:
+                # 3    1020
+                # 2.5  1384
+                # 2.4  1408
+                # 2    1466
+                # 1.5  2267
+                seconds_per_request = 1.5
+                if self.last_request is not None:
+                    sleep_time = seconds_per_request - (time.time() - self.last_request)
+                    if sleep_time > 0:
+                        time.sleep(sleep_time)
+                self.last_request = time.time()
+                break
+            except pycurl.error as e:
+                code, _ = e.args
+                if code in [
+                    pycurl.E_PARTIAL_FILE,
+                    pycurl.E_COULDNT_CONNECT,
+                    pycurl.E_ABORTED_BY_CALLBACK,
+                ]:
+                    if retrynum == self.maxretries - 1:
+                        break
+                    if code == pycurl.E_ABORTED_BY_CALLBACK:
+                        # callback was aborted due to timeout
+                        pass
+                    sleep_time = 4 ** (retrynum + 1)
+                    print(f"retrying after {sleep_time} s...")
+                    time.sleep(sleep_time)
+                    continue
+                raise
+            except MyHTTPException as e:
+                print("got HTTP error:", repr(e))
+                if retrynum == self.maxretries - 1:
+                    break
+                sleep_time = 4 ** (retrynum + 1)
+                print(f"retrying after {sleep_time} s...")
+                time.sleep(sleep_time)
+                # restart from the beginning or otherwise, the result might
+                # include a varnish cache error message
+        else:
+            raise RetryCountExceeded("failed too often...")
+
+
+@dataclasses.dataclass
+class Source:
+    archive: str
+    timestamp: str
+    suite: str
+    components: list[str]
+
+    def deb_line(self, host: str = "snapshot.debian.org") -> str:
+        return (
+            f"deb [check-valid-until=no] http://{host}/archive/{self.archive}"
+            f"/{self.timestamp}/ {self.suite} {' '.join(self.components)}\n"
+        )
+
+
+def parse_buildinfo(val):
+    with open(val, encoding="utf8") as f:
+        buildinfo = BuildInfo(f)
+    pkgs = []
+    for dep in buildinfo.relations["installed-build-depends"]:
+        assert len(dep) == 1
+        dep = dep[0]
+        assert dep["arch"] is None
+        assert dep["restrictions"] is None
+        assert len(dep["version"]) == 2
+        rel, version = dep["version"]
+        assert rel == "="
+        pkgs.append((dep["name"], dep["archqual"], version))
+    return pkgs, buildinfo.get("Build-Architecture")
+
+
+def parse_pkgs(val):
+    if val == "-":
+        val = sys.stdin.read()
+    if val.startswith("./") or val.startswith("/"):
+        val = pathlib.Path(val)
+        if not val.exists():
+            print(f"{val} does not exist", file=sys.stderr)
+            sys.exit(1)
+        val = val.read_text(encoding="utf8")
+    pkgs = []
+    pattern = re.compile(
+        r"""
+            ^[^a-z0-9]*                    # garbage at the beginning
+            ([a-z0-9][a-z0-9+.-]+)         # package name
+            (?:[^a-z0-9+.-]+([a-z0-9-]+))? # optional architecture
+            [^A-Za-z0-9.+~:-]+             # separator before the version
+            ([A-Za-z0-9.+~:-]+)            # version
+            [^A-Za-z0-9.+~:-]*$            # garbage at the end
+            """,
+        re.VERBOSE,
+    )
+    for line in re.split(r"[,\r\n]+", val):
+        if not line:
+            continue
+        match = pattern.fullmatch(line)
+        if match is None:
+            print(f"cannot parse: {line}", file=sys.stderr)
+            sys.exit(1)
+        pkgs.append(match.groups())
+    return [pkgs]
+
+
+def parse_args(args: list[str]) -> argparse.Namespace:
+    parser = argparse.ArgumentParser(
+        formatter_class=argparse.RawDescriptionHelpFormatter,
+        description="""\
+
+Combines debootstrap and snapshot.debian.org to create a chroot with exact
+package versions from the past either to reproduce bugs or to test source
+package reproducibility.
+
+To obtain a list of packages run the following command on one machine:
+
+    $ dpkg-query --showformat '${binary:Package}=${Version}\\n' --show
+
+And pass the output to debootsnap with the --packages argument. The result
+will be a chroot tarball with precisely the package versions as they were
+found on the system that ran dpkg-query.
+""",
+        epilog="""\
+
+*EXAMPLES*
+
+On one system run:
+
+    $ dpkg-query --showformat '${binary:Package}=${Version}\\n' --show > pkglist
+
+Then copy over "pkglist" and on another system run:
+
+    $ debootsnap --pkgs=./pkglist chroot.tar
+
+Or use a buildinfo file as input:
+
+    $ debootsnap --buildinfo=./package.buildinfo chroot.tar
+
+""",
+    )
+    parser.add_argument(
+        "--architecture",
+        "--nativearch",
+        help="native architecture of the chroot. Ignored if --buildinfo is"
+        " used. Foreign architectures are inferred from the package list."
+        " Not required if packages are architecture qualified.",
+    )
+    parser.add_argument(
+        "--ignore-notfound",
+        action="store_true",
+        help="only warn about packages that cannot be found on "
+        "snapshot.debian.org instead of exiting",
+    )
+    group = parser.add_mutually_exclusive_group(required=True)
+    group.add_argument(
+        "--buildinfo",
+        type=parse_buildinfo,
+        help="use packages from a buildinfo file. Read buildinfo file from "
+        'standard input if value is "-".',
+    )
+    group.add_argument(
+        "--packages",
+        "--pkgs",
+        action="extend",
+        type=parse_pkgs,
+        help="list of packages, optional architecture and version, separated "
+        "by comma or linebreak. Read list from standard input if value is "
+        '"-". Read list from a file if value starts with "./" or "/". The '
+        "option can be specified multiple times. Package name, "
+        "version and architecture are separated by one or more characters "
+        "that are not legal in the respective adjacent field. Leading and "
+        "trailing illegal characters are allowed. Example: "
+        "pkg1:arch=ver1,pkg2:arch=ver2",
+    )
+    parser.add_argument(
+        "--sources-list-only",
+        action="store_true",
+        help="only query metasnap.debian.net and print the sources.list "
+        "needed to create the chroot, then exit",
+    )
+    parser.add_argument(
+        "output", nargs="?", default="-", help="path to output chroot tarball"
+    )
+    return parser.parse_args(args)
+
+
+def query_metasnap(pkgsleft, archive, nativearch):
+    handled_pkgs = set(pkgsleft)
+    r = requests.post(
+        "http://metasnap.debian.net/cgi-bin/api",
+        files={
+            "archive": archive,
+            "arch": nativearch,
+            "pkgs": ",".join([n + ":" + a + "=" + v for n, a, v in handled_pkgs]),
+        },
+        timeout=60,
+    )
+    if r.status_code == 404:
+        for line in r.text.splitlines():
+            n, a, v = line.split()
+            handled_pkgs.remove((n, a, v))
+        r = requests.post(
+            "http://metasnap.debian.net/cgi-bin/api",
+            files={
+                "archive": archive,
+                "arch": nativearch,
+                "pkgs": ",".join([n + ":" + a + "=" + v for n, a, v in handled_pkgs]),
+            },
+            timeout=60,
+        )
+    assert r.status_code == 200, r.text
+
+    suite2pkgs = defaultdict(set)
+    pkg2range = {}
+    for line in r.text.splitlines():
+        n, a, v, s, c, b, e = line.split()
+        assert (n, a, v) in handled_pkgs
+        suite2pkgs[s].add((n, a, v))
+        # this will only keep one range of packages with multiple
+        # ranges but we don't care because we only need one
+        pkg2range[((n, a, v), s)] = (c, b, e)
+
+    return handled_pkgs, suite2pkgs, pkg2range
+
+
+def comp_ts(ranges):
+    last = "19700101T000000Z"  # impossibly early date
+    res = []
+    for c, b, e in ranges:
+        if last >= b:
+            # add the component the current timestamp needs
+            res[-1][1].add(c)
+            continue
+        # add new timestamp with initial component
+        last = e
+        res.append((last, set([c])))
+    return res
+
+
+def compute_sources(pkgs, nativearch, ignore_notfound) -> list[Source]:
+    sources = []
+    pkgsleft = set(pkgs)
+    for archive in [
+        "debian",
+        "debian-debug",
+        "debian-security",
+        "debian-ports",
+        "debian-volatile",
+        "debian-backports",
+    ]:
+        if len(pkgsleft) == 0:
+            break
+
+        handled_pkgs, suite2pkgs, pkg2range = query_metasnap(
+            pkgsleft, archive, nativearch
+        )
+
+        # greedy algorithm:
+        # pick the suite covering most packages first
+        while len(handled_pkgs) > 0:
+            bestsuite = sorted(suite2pkgs.items(), key=lambda v: len(v[1]))[-1][0]
+            ranges = [pkg2range[nav, bestsuite] for nav in suite2pkgs[bestsuite]]
+            # sort by end-time
+            ranges.sort(key=itemgetter(2))
+
+            for ts, comps in comp_ts(ranges):
+                sources.append(Source(archive, ts, bestsuite, comps))
+
+            for nav in suite2pkgs[bestsuite]:
+                handled_pkgs.remove(nav)
+                pkgsleft.remove(nav)
+                for suite in suite2pkgs:
+                    if suite == bestsuite:
+                        continue
+                    if nav in suite2pkgs[suite]:
+                        suite2pkgs[suite].remove(nav)
+            del suite2pkgs[bestsuite]
+    if pkgsleft:
+        print("cannot find:", file=sys.stderr)
+        print(
+            "\n".join([f"{pkg[0]}:{pkg[1]}={pkg[2]}" for pkg in pkgsleft]),
+            file=sys.stderr,
+        )
+        if not ignore_notfound:
+            sys.exit(1)
+
+    return sources
+
+
+def create_repo(tmpdirname, pkgs):
+    with open(tmpdirname + "/control", "w", encoding="utf8") as f:
+
+        def pkg2name(n, a, v):
+            if a is None:
+                return f"{n} (= {v})"
+            return f"{n}:{a} (= {v})"
+
+        f.write("Package: debootsnap-dummy\n")
+        f.write(f"Depends: {', '.join([pkg2name(*pkg) for pkg in pkgs])}\n")
+    subprocess.check_call(
+        ["equivs-build", tmpdirname + "/control"], cwd=tmpdirname + "/cache"
+    )
+
+    packages_content = subprocess.check_output(
+        ["apt-ftparchive", "packages", "."], cwd=tmpdirname + "/cache"
+    )
+    with open(tmpdirname + "/cache/Packages", "wb") as f:
+        f.write(packages_content)
+    release_content = subprocess.check_output(
+        [
+            "apt-ftparchive",
+            "release",
+            "-oAPT::FTPArchive::Release::Suite=dummysuite",
+            ".",
+        ],
+        cwd=tmpdirname + "/cache",
+    )
+    with open(tmpdirname + "/cache/Release", "wb") as f:
+        f.write(release_content)
+
+
+@contextmanager
+def serve_repo(tmpdirname):
+    httpd = http.server.HTTPServer(
+        ("localhost", 0),
+        partial(http.server.SimpleHTTPRequestHandler, directory=tmpdirname + "/cache"),
+    )
+    # run server in a new thread
+    server_thread = threading.Thread(target=httpd.serve_forever)
+    server_thread.daemon = True
+    # start thread
+    server_thread.start()
+    # retrieve port (in case it was generated automatically)
+    _, port = httpd.server_address
+    try:
+        yield port
+    finally:
+        httpd.shutdown()
+        httpd.server_close()
+        server_thread.join()
+
+
+def run_mmdebstrap(
+    tmpdirname, sources: list[Source], nativearch, foreignarches, output
+):
+    with open(tmpdirname + "/sources.list", "w", encoding="utf8") as f:
+        for source in sources:
+            f.write(source.deb_line())
+    # we serve the directory via http instead of using a copy:// mirror
+    # because the temporary directory is not accessible to the unshared
+    # user
+    with serve_repo(tmpdirname) as port:
+        cmd = [
+            "mmdebstrap",
+            f"--architectures={','.join([nativearch] + list(foreignarches))}",
+            "--variant=essential",
+            "--include=debootsnap-dummy",
+            '--aptopt=Apt::Key::gpgvcommand "/usr/libexec/mmdebstrap/gpgvnoexpkeysig"',
+            '--customize-hook=chroot "$1" dpkg -r debootsnap-dummy',
+            '--customize-hook=chroot "$1" dpkg-query --showformat '
+            "'${binary:Package}=${Version}\\n' --show > \"$1/pkglist\"",
+            "--customize-hook=download /pkglist ./pkglist",
+            '--customize-hook=rm "$1/pkglist"',
+            "--customize-hook=upload sources.list /etc/apt/sources.list",
+            "dummysuite",
+            output,
+            f"deb [trusted=yes] http://localhost:{port}/ ./",
+        ]
+        subprocess.check_call(cmd, cwd=tmpdirname)
+
+    newpkgs = set()
+    with open(tmpdirname + "/pkglist", encoding="utf8") as f:
+        for line in f:
+            line = line.rstrip()
+            n, v = line.split("=")
+            a = nativearch
+            if ":" in n:
+                n, a = n.split(":")
+            newpkgs.add((n, a, v))
+
+    return newpkgs
+
+
+@contextmanager
+def proxy_snapshot(tmpdirname):
+    httpd = socketserver.TCPServer(
+        # the default address family for socketserver is AF_INET so we
+        # explicitly bind to ipv4 localhost
+        ("localhost", 0),
+        partial(Proxy, directory=tmpdirname + "/cache"),
+    )
+    # run server in a new thread
+    server_thread = threading.Thread(target=httpd.serve_forever)
+    server_thread.daemon = True
+    # start thread
+    server_thread.start()
+    # retrieve port (in case it was generated automatically)
+    _, port = httpd.server_address
+    try:
+        yield port
+    finally:
+        httpd.shutdown()
+        httpd.server_close()
+        server_thread.join()
+
+
+def download_packages(
+    tmpdirname, sources: list[Source], pkgs, nativearch, foreignarches
+):
+    for d in [
+        "/etc/apt/apt.conf.d",
+        "/etc/apt/sources.list.d",
+        "/etc/apt/preferences.d",
+        "/var/cache/apt",
+        "/var/lib/apt/lists/partial",
+        "/var/lib/dpkg",
+    ]:
+        os.makedirs(tmpdirname + "/" + d)
+    # apt-get update requires /var/lib/dpkg/status
+    with open(tmpdirname + "/var/lib/dpkg/status", "w", encoding="utf8") as f:
+        pass
+    with open(tmpdirname + "/apt.conf", "w", encoding="utf8") as f:
+        f.write(f'Apt::Architecture "{nativearch}";\n')
+        f.write("Apt::Architectures { " + f'"{nativearch}"; ')
+        for a in foreignarches:
+            f.write(f'"{a}"; ')
+        f.write("};\n")
+        f.write('Dir "' + tmpdirname + '";\n')
+        f.write('Dir::Etc::Trusted "/etc/apt/trusted.gpg";\n')
+        f.write('Dir::Etc::TrustedParts "/usr/share/keyrings/";\n')
+        f.write('Acquire::Languages "none";\n')
+        # f.write("Acquire::http::Dl-Limit \"1000\";\n")
+        # f.write("Acquire::https::Dl-Limit \"1000\";\n")
+        f.write('Acquire::Retries "5";\n')
+        # ignore expired signatures
+        f.write('Apt::Key::gpgvcommand "/usr/libexec/mmdebstrap/gpgvnoexpkeysig";\n')
+
+    os.makedirs(tmpdirname + "/cache")
+
+    with proxy_snapshot(tmpdirname) as port:
+        with open(tmpdirname + "/etc/apt/sources.list", "w", encoding="utf8") as f:
+            for source in sources:
+                f.write(source.deb_line(f"localhost:{port}"))
+        subprocess.check_call(
+            ["apt-get", "update", "--error-on=any"],
+            env={"APT_CONFIG": tmpdirname + "/apt.conf"},
+        )
+        for i, nav in enumerate(pkgs):
+            print(f"{i + 1} of {len(pkgs)}")
+            with tempfile.TemporaryDirectory() as tmpdir2:
+                subprocess.check_call(
+                    ["apt-get", "download", "--yes", f"{nav[0]}:{nav[1]}={nav[2]}"],
+                    cwd=tmpdir2,
+                    env={"APT_CONFIG": tmpdirname + "/apt.conf"},
+                )
+                debs = os.listdir(tmpdir2)
+                assert len(debs) == 1
+                # Normalize the package name to how it appears in the archive.
+                # Mainly this removes the epoch from the filename, see
+                # https://bugs.debian.org/645895
+                # This avoids apt bugs connected with a percent sign in the
+                # filename as they occasionally appear, for example as
+                # introduced in apt 2.1.15 and later fixed by DonKult:
+                # https://salsa.debian.org/apt-team/apt/-/merge_requests/175
+                subprocess.check_call(["dpkg-name", tmpdir2 + "/" + debs[0]])
+                debs = os.listdir(tmpdir2)
+                assert len(debs) == 1
+                shutil.move(tmpdir2 + "/" + debs[0], tmpdirname + "/cache")
+
+
+def main(arguments: list[str]) -> None:
+    args = parse_args(arguments)
+    if args.packages:
+        pkgs = [v for sublist in args.packages for v in sublist]
+        if args.architecture is None:
+            arches = {a for _, a, _ in pkgs if a is not None}
+            if len(arches) == 0:
+                print("packages are not architecture qualified", file=sys.stderr)
+                print(
+                    "use --architecture to set the native architecture", file=sys.stderr
+                )
+                sys.exit(1)
+            elif len(arches) > 1:
+                print("more than one architecture in the package list", file=sys.stderr)
+                print(
+                    "use --architecture to set the native architecture", file=sys.stderr
+                )
+                sys.exit(1)
+            nativearch = arches.pop()
+            assert arches == set()
+        else:
+            nativearch = args.architecture
+    else:
+        pkgs, nativearch = args.buildinfo
+    # unknown architectures are the native architecture
+    pkgs = [(n, a if a is not None else nativearch, v) for n, a, v in pkgs]
+    # make package list unique
+    pkgs = list(set(pkgs))
+    # compute foreign architectures
+    foreignarches = set()
+    for _, a, _ in pkgs:
+        if a != nativearch:
+            foreignarches.add(a)
+
+    for tool in [
+        "equivs-build",
+        "apt-ftparchive",
+        "mmdebstrap",
+        "apt-get",
+        "dpkg-name",
+    ]:
+        if shutil.which(tool) is None:
+            print(f"{tool} is required but not installed", file=sys.stderr)
+            sys.exit(1)
+
+    sources = compute_sources(pkgs, nativearch, args.ignore_notfound)
+
+    if args.sources_list_only:
+        for source in sources:
+            print(source.deb_line(), end="")
+        sys.exit(0)
+
+    with tempfile.TemporaryDirectory() as tmpdirname:
+        download_packages(tmpdirname, sources, pkgs, nativearch, foreignarches)
+
+        create_repo(tmpdirname, pkgs)
+
+        newpkgs = run_mmdebstrap(
+            tmpdirname, sources, nativearch, foreignarches, args.output
+        )
+
+    # make sure that the installed packages match the requested package
+    # list
+    assert set(newpkgs) == set(pkgs)
+
+
+if __name__ == "__main__":
+    main(sys.argv[1:])
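
Stepping outside the patch for a moment: the timestamp selection in compute_sources() relies on the comp_ts() greedy merge above. Here is a minimal standalone restatement of that logic, using hypothetical snapshot ranges (the timestamps below are made up):

```python
# Sketch of the comp_ts() greedy merge used by debootsnap above: given
# (component, begin, end) snapshot ranges sorted by end timestamp, pick the
# minimal set of timestamps covering all ranges, collecting the components
# each chosen timestamp must serve.
def comp_ts(ranges):
    last = "19700101T000000Z"  # earlier than any real snapshot timestamp
    res = []
    for c, b, e in ranges:
        if last >= b:
            # the previously chosen timestamp already falls inside this
            # range; it just needs component c as well
            res[-1][1].add(c)
            continue
        # otherwise this range needs a new timestamp; its end works because
        # the ranges are sorted by end
        last = e
        res.append((last, {c}))
    return res


# hypothetical ranges, pre-sorted by end timestamp as compute_sources() does
ranges = [
    ("main", "20230101T000000Z", "20230110T000000Z"),
    ("contrib", "20230105T000000Z", "20230120T000000Z"),
    ("main", "20230115T000000Z", "20230125T000000Z"),
]
print(comp_ts(ranges))
```

The first two ranges overlap, so one timestamp with both components covers them; the third needs a second timestamp.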
diff -Nru devscripts-2.23.2/scripts/devscripts/test/test_debootsnap.py devscripts-2.23.3/scripts/devscripts/test/test_debootsnap.py
--- devscripts-2.23.2/scripts/devscripts/test/test_debootsnap.py	1970-01-01 01:00:00.000000000 +0100
+++ devscripts-2.23.3/scripts/devscripts/test/test_debootsnap.py	2023-03-15 23:35:37.000000000 +0100
@@ -0,0 +1,56 @@
+# Copyright (C) 2023, Benjamin Drung <bdrung@debian.org>
+#
+# Permission to use, copy, modify, and/or distribute this software for any
+# purpose with or without fee is hereby granted, provided that the above
+# copyright notice and this permission notice appear in all copies.
+#
+# THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH
+# REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY
+# AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,
+# INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM
+# LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR
+# OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR
+# PERFORMANCE OF THIS SOFTWARE.
+
+"""Test debootsnap script."""
+
+import contextlib
+import io
+import tempfile
+import unittest
+import unittest.mock
+
+from debootsnap import main, parse_pkgs
+
+
+class TestDebootsnap(unittest.TestCase):
+    """Test debootsnap script."""
+
+    @unittest.mock.patch("shutil.which")
+    def test_missing_tools(self, which_mock) -> None:
+        """Test debootsnap fails cleanly if required binaries are missing."""
+        which_mock.return_value = None
+        stderr = io.StringIO()
+        with contextlib.redirect_stderr(stderr):
+            with self.assertRaisesRegex(SystemExit, "1"):
+                main(["--packages=pkg1:arch=ver1", "chroot.tar"])
+        self.assertEqual(
+            stderr.getvalue(), "equivs-build is required but not installed\n"
+        )
+        which_mock.assert_called_once_with("equivs-build")
+
+    def test_parse_pkgs_from_file(self) -> None:
+        """Test parse_pkgs() for a given file name."""
+        with tempfile.NamedTemporaryFile(mode="w", prefix="devscripts-") as pkgfile:
+            pkgfile.write("pkg1:arch=ver1\npkg2:arch=ver2\n")
+            pkgfile.flush()
+            pkgs = parse_pkgs(pkgfile.name)
+        self.assertEqual(pkgs, [[("pkg1", "arch", "ver1"), ("pkg2", "arch", "ver2")]])
+
+    def test_parse_pkgs_missing_file(self) -> None:
+        """Test parse_pkgs() for a missing file name."""
+        stderr = io.StringIO()
+        with contextlib.redirect_stderr(stderr):
+            with self.assertRaisesRegex(SystemExit, "1"):
+                parse_pkgs("/non-existing/pkgfile")
+        self.assertEqual(stderr.getvalue(), "/non-existing/pkgfile does not exist\n")
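
The pkg1:arch=ver1 syntax these tests exercise comes from the verbose regular expression in parse_pkgs(). A standalone sketch of the same pattern (the package names and versions below are made up):

```python
import re

# The verbose pattern parse_pkgs() uses: package name, an optional
# architecture qualifier, and a version, separated by whatever characters
# are illegal in the adjacent field.
pattern = re.compile(
    r"""
        ^[^a-z0-9]*                    # garbage at the beginning
        ([a-z0-9][a-z0-9+.-]+)         # package name
        (?:[^a-z0-9+.-]+([a-z0-9-]+))? # optional architecture
        [^A-Za-z0-9.+~:-]+             # separator before the version
        ([A-Za-z0-9.+~:-]+)            # version
        [^A-Za-z0-9.+~:-]*$            # garbage at the end
        """,
    re.VERBOSE,
)

# architecture-qualified entry, then an unqualified one with an epoch
print(pattern.fullmatch("pkg1:amd64=1.0-1").groups())
print(pattern.fullmatch("  pkg2 2:1.5+dfsg-1  ").groups())
```

When no architecture qualifier is present the second group is None, which parse_pkgs()'s callers later replace with the native architecture.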
diff -Nru devscripts-2.23.2/scripts/devscripts/test/test_suspicious_source.py devscripts-2.23.3/scripts/devscripts/test/test_suspicious_source.py
--- devscripts-2.23.2/scripts/devscripts/test/test_suspicious_source.py	1970-01-01 01:00:00.000000000 +0100
+++ devscripts-2.23.3/scripts/devscripts/test/test_suspicious_source.py	2023-03-02 15:33:09.000000000 +0100
@@ -0,0 +1,41 @@
+# Copyright (C) 2023, Benjamin Drung <bdrung@debian.org>
+#
+# Permission to use, copy, modify, and/or distribute this software for any
+# purpose with or without fee is hereby granted, provided that the above
+# copyright notice and this permission notice appear in all copies.
+#
+# THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH
+# REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY
+# AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,
+# INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM
+# LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR
+# OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR
+# PERFORMANCE OF THIS SOFTWARE.
+
+"""Test suspicious-source script."""
+
+import pathlib
+import subprocess
+import tempfile
+import unittest
+
+
+class TestSuspiciousSource(unittest.TestCase):
+    """Test suspicious-source script."""
+
+    @staticmethod
+    def _run_suspicious_source(directory: str) -> str:
+        suspicious_source = subprocess.run(
+            ["./suspicious-source", "-d", directory],
+            check=True,
+            stdout=subprocess.PIPE,
+            text=True,
+        )
+        return suspicious_source.stdout.strip()
+
+    def test_python_script(self) -> None:
+        """Test not complaining about Python code."""
+        with tempfile.TemporaryDirectory(prefix="devscripts-") as tmpdir:
+            python_file = pathlib.Path(tmpdir) / "example.py"
+            python_file.write_text("#!/usr/bin/python3\nprint('hello world')\n")
+            self.assertEqual(self._run_suspicious_source(tmpdir), "")
diff -Nru devscripts-2.23.2/scripts/edit-patch.sh devscripts-2.23.3/scripts/edit-patch.sh
--- devscripts-2.23.2/scripts/edit-patch.sh	2023-02-05 00:33:58.000000000 +0100
+++ devscripts-2.23.3/scripts/edit-patch.sh	2023-03-15 23:37:22.000000000 +0100
@@ -122,8 +122,10 @@
 
 edit_patch_quilt() {
     export QUILT_PATCHES=debian/patches
-    top_patch=$(quilt top)
-    echo "Top patch: $top_patch"
+    if [ -e $QUILT_PATCHES ]; then
+        top_patch=$(quilt top)
+        echo "Top patch: $top_patch"
+    fi
     if [ -e $PREFIX/$1 ]; then
         # if it's an existing patch and we are at the end of the stack,
         # go back at the beginning
@@ -141,8 +143,10 @@
     # use a sub-shell
     quilt shell
     quilt refresh
-    echo "Reverting quilt back to $top_patch"
-    quilt pop $top_patch
+    if [ -n "$top_patch" ]; then
+        echo "Reverting quilt back to $top_patch"
+        quilt pop $top_patch
+    fi
     vcs_add $PREFIX/$1 $PREFIX/series
 }
 
diff -Nru devscripts-2.23.2/scripts/sadt devscripts-2.23.3/scripts/sadt
--- devscripts-2.23.2/scripts/sadt	2023-02-18 23:50:46.000000000 +0100
+++ devscripts-2.23.3/scripts/sadt	2023-03-01 00:23:53.000000000 +0100
@@ -346,7 +346,7 @@
                 for package in packages:
                     or_clauses += [
                         stripped_or_clause
-                        + [dict(name=package, version=None, arch=None)]
+                        + [{"name": package, "version": None, "arch": None}]
                     ]
             else:
                 or_clauses += [or_clause]
diff -Nru devscripts-2.23.2/scripts/suspicious-source devscripts-2.23.3/scripts/suspicious-source
--- devscripts-2.23.2/scripts/suspicious-source	2023-02-05 00:33:58.000000000 +0100
+++ devscripts-2.23.3/scripts/suspicious-source	2023-03-02 15:33:09.000000000 +0100
@@ -75,8 +75,8 @@
     "text/x-perl",
     "text/x-php",
     "text/x-po",
-    "text/x-python",
     "text/x-ruby",
+    "text/x-script.python",
     "text/x-shellscript",
     "text/x-tex",
     "text/x-texinfo",
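
The swapped whitelist entry tracks newer libmagic, which reports Python sources as text/x-script.python rather than text/x-python. A quick sketch to see what the local file(1) reports, assuming the utility is installed (with a labeled fallback when it is not):

```python
import pathlib
import shutil
import subprocess
import tempfile

# Assumes the file(1) utility is available; newer libmagic releases classify
# Python sources as text/x-script.python, which is why the whitelist entry
# in the hunk above changed.
with tempfile.TemporaryDirectory() as tmpdir:
    script = pathlib.Path(tmpdir) / "example.py"
    script.write_text("#!/usr/bin/python3\nprint('hello world')\n")
    if shutil.which("file"):
        mime = subprocess.run(
            ["file", "--brief", "--mime-type", str(script)],
            check=True,
            stdout=subprocess.PIPE,
            text=True,
        ).stdout.strip()
    else:
        # hedged fallback so the sketch stays runnable without file(1)
        mime = "text/x-script.python (assumed)"
    print(mime)
```

The exact value depends on the installed libmagic version, which is why suspicious-source now whitelists the newer spelling.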
