
[britney] ITM: britney-fixes-2016-03-merge-round-1



Hi,

I have bundled a series of patches and have selected a subset for
merging into master now.

 * Please review the attached patch series (or branch at [1]).
 * The patch series requires an updated version of the test suite.
   - Please check out the "britney-fixes-2016-03-merge-round-1" branch
     in your britney2-tests repository[2].
 * Deadline for comments is Monday the 11th of April.

While the majority of the patches are code cleanup / refactoring,
there are a couple of highlights (in no particular order):

 * Crash fix #815995 (Patch 0010)
   - Minor tweak to this version compared to the one attached in the
     bug.
 * Versioned provides support #786803 (Patches 0008 + 0009)
   - NB: Punted on multi-arch'ified provides for now.
 * Partial support for reading packages from a standard mirror
   layout (Patch 0020)
   - It is missing the part where various ("non-mirror") input data
     files (e.g. BugsV, Dates) are moved *out* of the mirror.  I intend
     to look into that in a future patch series.
   - This is two of AJ's commits squashed, rebased and ported to
     python3.
 * Stop migration of cruft binaries no longer in testing (0016)
   - It prevents cruft from re-entering testing (with IGNORE_CRUFT=1).
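
As a rough illustration of what versioned provides support (#786803)
entails, here is a minimal, self-contained sketch of the matching
problem.  This is not britney's actual code; `parse_provides` and
`provides_satisfies` are illustrative names:

```python
import re

def parse_provides(clause):
    """Parse 'foo (= 1.0)' or plain 'foo' into (name, version-or-None)."""
    m = re.match(r'^\s*([^\s(]+)(?:\s*\(\s*=\s*([^)]+?)\s*\))?\s*$', clause)
    return (m.group(1), m.group(2))

def provides_satisfies(provided, dep_name, dep_version=None):
    """Can the (name, version) provide satisfy a dependency on dep_name?"""
    name, version = provided
    if name != dep_name:
        return False
    if dep_version is None:
        return True   # unversioned dependency: any provider works
    if version is None:
        return False  # a versioned dependency is never satisfied by an
                      # unversioned provide
    # NB: real code must use proper Debian version comparison (e.g.
    # apt_pkg.version_compare); string equality only sketches the
    # exact-match ('=') case that versioned Provides declare.
    return version == dep_version
```
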


In the "coming soon" department, I also have a patch series that
improves the excuses.yaml file considerably.  If you are interested,
you can take a peek at [3].  I will issue a separate "ITM" mail for
those patches when I am ready.

Thanks,
~Niels

[1]
https://anonscm.debian.org/cgit/users/nthykier/britney.git/log/?h=britney-fixes-2016-03-merge-round-1

[2]
https://anonscm.debian.org/cgit/collab-maint/britney2-tests.git/log/?h=britney-fixes-2016-03-merge-round-1

[3]
https://anonscm.debian.org/cgit/users/nthykier/britney.git/log/?h=britney-fixes-2016-03

From c0919a807f0624dc4ce1c1008614412f3bf64969 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Sun, 17 Jan 2016 10:43:26 +0000
Subject: [PATCH 01/33] britney.py: Add all_binaries to store binaries by
 pkg_id

It can be used to A) make the mismatch check more efficient and B)
share identical binaries between suites.

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 76 ++++++++++++++++++++++++++++++--------------------------------
 1 file changed, 37 insertions(+), 39 deletions(-)

diff --git a/britney.py b/britney.py
index 9c7580a..9fbd53e 100755
--- a/britney.py
+++ b/britney.py
@@ -213,6 +213,19 @@ from consts import (VERSION, SECTION, BINARIES, MAINTAINER, FAKESRC,
 __author__ = 'Fabio Tranchitella and the Debian Release Team'
 __version__ = '2.0'
 
+# NB: ESSENTIAL deliberately skipped as the 2011 and 2012
+# parts of the live-data tests require it (britney merges
+# this field correctly from the unstable version where
+# available)
+check_field_name = dict((globals()[fn], fn) for fn in
+                         (
+                          "SOURCE SOURCEVER ARCHITECTURE MULTIARCH" +
+                          " DEPENDS CONFLICTS PROVIDES"
+                         ).split()
+                        )
+
+check_fields = sorted(check_field_name)
+
 class Britney(object):
     """Britney, the Debian testing updater script
     
@@ -260,6 +273,7 @@ class Britney(object):
                 print('\n'.join('%4d %s' % (len(nuninst[x]), x) for x in self.options.architectures))
                 return
 
+        self.all_binaries = {}
         # read the source and binary packages for the involved distributions
         self.sources['testing'] = self.read_sources(self.options.testing)
         self.sources['unstable'] = self.read_sources(self.options.unstable)
@@ -287,8 +301,6 @@ class Britney(object):
                 # here.
                 self.binaries['pu'][arch] = ({}, {})
 
-            self._check_mismatches(arch)
-
         self.__log("Compiling Installability tester", type="I")
         self._build_installability_tester(self.options.architectures)
 
@@ -329,43 +341,23 @@ class Britney(object):
         self.urgencies = self.read_urgencies(self.options.testing)
         self.excuses = []
 
-    def _check_mismatches(self, arch):
-        suites = [s for s in self.binaries if arch in self.binaries[s]]
-
-        # NB: ESSENTIAL deliberately skipped as the 2011 and 2012
-        # parts of the live-data tests require it (britney merges
-        # this field correctly from the unstable version where
-        # available)
-        check_field_name = dict( (globals()[fn], fn) for fn in
-               ("SOURCE SOURCEVER ARCHITECTURE MULTIARCH"
-                 + " DEPENDS CONFLICTS PROVIDES").split() )
-        check_fields = check_field_name.keys()
-
-        any_mismatch = False
-        for s1, s2 in product(suites, suites):
-            if s1 >= s2: continue
-            s1_pkgs = self.binaries[s1][arch][0]
-            s2_pkgs = self.binaries[s2][arch][0]
-            pkgs = set(s1_pkgs) & set(s2_pkgs)
-            for p in pkgs:
-                if s1_pkgs[p][VERSION] != s2_pkgs[p][VERSION]: continue
-                bad = []
-                for f in check_fields:
-                    if s1_pkgs[p][f] != s2_pkgs[p][f]:
-                        bad.append((f, s1_pkgs[p][f], s2_pkgs[p][f]))
-
-                if bad:
-                    any_mismatch = True
-                    self.__log("Mismatch found %s %s %s differs in %s vs %s" % (
-                        p, s1_pkgs[p][VERSION], arch, s1, s2), type="E")
-                    for f, v1, v2 in bad:
-                        self.__log(" ... %s %s != %s" % (check_field_name[f], v1, v2))
-
-        # test suite doesn't appreciate aborts of this nature
-        if any_mismatch:
-            self.__log("Mismatches found, exiting.", type="I")
-            sys.exit(1)
-        return
+    def merge_pkg_entries(self, package, parch, pkg_entry1, pkg_entry2,
+                          check_fields=check_fields, check_field_name=check_field_name):
+        bad = []
+        for f in check_fields:
+            if pkg_entry1[f] != pkg_entry2[f]:
+                bad.append((f, pkg_entry1[f], pkg_entry2[f]))
+
+        if bad:
+            self.__log("Mismatch found %s %s %s differs" % (
+                package, pkg_entry1[VERSION], parch), type="E")
+            for f, v1, v2 in bad:
+                self.__log(" ... %s %s != %s" % (check_field_name[f], v1, v2))
+            raise ValueError("Invalid data set")
+
+        # Merge ESSENTIAL if necessary
+        if pkg_entry2[ESSENTIAL]:
+            pkg_entry1[ESSENTIAL] = True
 
     def __parse_arguments(self):
         """Parse the command line arguments
@@ -618,6 +610,7 @@ class Britney(object):
         packages = {}
         provides = {}
         sources = self.sources
+        all_binaries = self.all_binaries
 
         filename = os.path.join(basedir, "Packages_%s" % arch)
         self.__log("Loading binary packages from %s" % filename)
@@ -639,6 +632,7 @@ class Britney(object):
                 continue
             pkg = intern(pkg)
             version = intern(version)
+            pkg_id = (pkg, version, arch)
 
             # Merge Pre-Depends with Depends and Conflicts with
             # Breaks. Britney is not interested in the "finer
@@ -707,6 +701,10 @@ class Britney(object):
 
             # add the resulting dictionary to the package list
             packages[pkg] = dpkg
+            if pkg_id in all_binaries:
+                self.merge_pkg_entries(pkg, arch, all_binaries[pkg_id], dpkg)
+            else:
+                all_binaries[pkg_id] = dpkg
 
         # return a tuple with the list of real and virtual packages
         return (packages, provides)
-- 
2.8.0.rc3
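
The core idea of patch 0001, a shared table keyed by (name, version,
arch) that every later suite must agree with on the checked fields, can
be sketched as follows.  The helper name, field names and error message
are illustrative, not britney's data model:

```python
# Fields that must be identical for the same pkg_id across suites
# (illustrative subset; the patch checks SOURCE, SOURCEVER, etc.).
CHECK_FIELDS = ("source", "architecture", "depends", "provides")

def register_binary(all_binaries, pkg_id, entry):
    """Register entry under pkg_id, or verify it matches the existing one."""
    if pkg_id not in all_binaries:
        all_binaries[pkg_id] = entry
        return entry
    existing = all_binaries[pkg_id]
    bad = [(f, existing[f], entry[f])
           for f in CHECK_FIELDS if existing[f] != entry[f]]
    if bad:
        raise ValueError("Mismatch for %s/%s/%s: %r" % (pkg_id + (bad,)))
    # Merge Essential like the patch does: once essential, always essential.
    if entry.get("essential"):
        existing["essential"] = True
    return existing
```

Sharing one entry per pkg_id is what later makes it cheap to store bare
pkg_ids in the undo log (patch 0002) instead of whole package records.
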

From 9a2da823999b691fcdc3ae3b9fb18bef699119a7 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Sun, 17 Jan 2016 10:56:47 +0000
Subject: [PATCH 02/33] Store the pkg_id in undo instead of the package info

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py      | 15 +++++++--------
 britney_util.py |  4 ++--
 2 files changed, 9 insertions(+), 10 deletions(-)

diff --git a/britney.py b/britney.py
index 9fbd53e..ab2436c 100755
--- a/britney.py
+++ b/britney.py
@@ -2079,7 +2079,7 @@ class Britney(object):
 
                     pkg_data = binaries_t_a[binary]
                     # save the old binary for undo
-                    undo['binaries'][p] = pkg_data
+                    undo['binaries'][p] = rm_pkg_id
                     if pkey not in eqv_set:
                         # all the reverse dependencies are affected by
                         # the change
@@ -2110,9 +2110,9 @@ class Britney(object):
         # updates but not supported as a manual hint
         elif item.package in packages_t[item.architecture][0]:
             binaries_t_a = packages_t[item.architecture][0]
-            undo['binaries'][item.package + "/" + item.architecture] = binaries_t_a[item.package]
             version = binaries_t_a[item.package][VERSION]
             pkg_id = (item.package, version, item.architecture)
+            undo['binaries'][item.package + "/" + item.architecture] = pkg_id
             affected.add(pkg_id)
             affected.update(inst_tester.reverse_dependencies_of(pkg_id))
             del binaries_t_a[item.package]
@@ -2138,10 +2138,10 @@ class Britney(object):
                 # all of its reverse dependencies as affected
                 if binary in binaries_t_a:
                     old_pkg_data = binaries_t_a[binary]
-                    # save the old binary package
-                    undo['binaries'][p] = old_pkg_data
                     old_version = old_pkg_data[VERSION]
                     old_pkg_id = (binary, old_version, parch)
+                    # save the old binary package
+                    undo['binaries'][p] = old_pkg_id
                     if not equivalent_replacement:
                         # all the reverse dependencies are affected by
                         # the change
@@ -2162,8 +2162,7 @@ class Britney(object):
                     # by this function
                     for (tundo, tpkg) in hint_undo:
                         if p in tundo['binaries']:
-                            pv = tundo['binaries'][p][VERSION]
-                            tpkg_id = (p, pv, parch)
+                            tpkg_id = tundo['binaries'][p]
                             affected.update(inst_tester.reverse_dependencies_of(tpkg_id))
 
                 # add/update the binary package from the source suite
@@ -2289,7 +2288,7 @@ class Britney(object):
         # check if the action improved the uninstallability counters
         if not is_accepted and automatic_revert:
             undo_copy = list(reversed(undo_list))
-            undo_changes(undo_copy, self._inst_tester, self.sources, self.binaries)
+            undo_changes(undo_copy, self._inst_tester, self.sources, self.binaries, self.all_binaries)
 
         return (is_accepted, nuninst_after, undo_list, arch)
 
@@ -2490,7 +2489,7 @@ class Britney(object):
             if not lundo: return
             lundo.reverse()
 
-            undo_changes(lundo, self._inst_tester, self.sources, self.binaries)
+            undo_changes(lundo, self._inst_tester, self.sources, self.binaries, self.all_binaries)
 
 
     def assert_nuninst_is_correct(self):
diff --git a/britney_util.py b/britney_util.py
index 8a21a23..1cb96d6 100644
--- a/britney_util.py
+++ b/britney_util.py
@@ -114,7 +114,7 @@ def iter_except(func, exception, first=None):
         pass
 
 
-def undo_changes(lundo, inst_tester, sources, binaries,
+def undo_changes(lundo, inst_tester, sources, binaries, all_binary_packages,
                  BINARIES=BINARIES):
     """Undoes one or more changes to testing
 
@@ -171,7 +171,7 @@ def undo_changes(lundo, inst_tester, sources, binaries,
                 if p in binaries_t_a:
                     rmpkgdata = binaries_t_a[p]
                     inst_tester.remove_testing_binary((binary, rmpkgdata[VERSION], arch))
-                pkgdata = undo['binaries'][p]
+                pkgdata = all_binary_packages[undo['binaries'][p]]
                 binaries_t_a[binary] = pkgdata
                 inst_tester.add_testing_binary((binary, pkgdata[VERSION], arch))
 
-- 
2.8.0.rc3
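
Conceptually, patch 0002's change looks like this (a sketch with
invented helper names; britney's real undo log tracks several more kinds
of entries):

```python
def record_removal(undo, key, pkg_id):
    """Log only the small (name, version, arch) tuple, not the whole entry."""
    undo['binaries'][key] = pkg_id

def undo_removal(undo, key, binaries_t_a, all_binaries):
    """Restore the removed binary by looking its pkg_id up in the
    shared all_binaries table."""
    pkg_id = undo['binaries'][key]
    name = pkg_id[0]
    binaries_t_a[name] = all_binaries[pkg_id]
    return pkg_id
```
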

From d497ef9f2dc287b7e7b05602f84c85080ffdc23a Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Sun, 17 Jan 2016 13:28:52 +0000
Subject: [PATCH 03/33] Use sets in the provides-table

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 14 ++++++--------
 1 file changed, 6 insertions(+), 8 deletions(-)

diff --git a/britney.py b/britney.py
index ab2436c..7c0985c 100755
--- a/britney.py
+++ b/britney.py
@@ -608,7 +608,7 @@ class Britney(object):
         """
 
         packages = {}
-        provides = {}
+        provides = defaultdict(set)
         sources = self.sources
         all_binaries = self.all_binaries
 
@@ -693,9 +693,7 @@ class Britney(object):
             if dpkg[PROVIDES]:
                 parts = [p.strip() for p in dpkg[PROVIDES].split(",")]
                 for p in parts:
-                    if p not in provides:
-                        provides[p] = []
-                    provides[p].append(pkg)
+                    provides[p].add(pkg)
                 dpkg[PROVIDES] = parts
             else: dpkg[PROVIDES] = []
 
@@ -2091,7 +2089,7 @@ class Britney(object):
                     for j in pkg_data[PROVIDES]:
                         key = j + "/" + parch
                         if key not in undo['virtual']:
-                            undo['virtual'][key] = provides_t_a[j][:]
+                            undo['virtual'][key] = provides_t_a[j].copy()
                         provides_t_a[j].remove(binary)
                         if not provides_t_a[j]:
                             del provides_t_a[j]
@@ -2174,10 +2172,10 @@ class Britney(object):
                     key = j + "/" + parch
                     if j not in provides_t_a:
                         undo['nvirtual'].append(key)
-                        provides_t_a[j] = []
+                        provides_t_a[j] = set()
                     elif key not in undo['virtual']:
-                        undo['virtual'][key] = provides_t_a[j][:]
-                    provides_t_a[j].append(binary)
+                        undo['virtual'][key] = provides_t_a[j].copy()
+                    provides_t_a[j].add(binary)
                 if not equivalent_replacement:
                     # all the reverse dependencies are affected by the change
                     affected.add(updated_pkg_id)
-- 
2.8.0.rc3
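
The shape of the new provides table can be sketched like so
(illustrative helper, assuming an input mapping of package name to the
virtual names it provides): membership tests and additions become O(1),
duplicates are impossible, and undo snapshots use `set.copy()` instead
of list slicing.

```python
from collections import defaultdict

def build_provides(packages):
    """packages: {binary name: iterable of provided virtual names}."""
    provides = defaultdict(set)
    for pkg, provided in packages.items():
        for virtual in provided:
            provides[virtual].add(pkg)  # set.add: no duplicate providers
    return provides
```
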

From 51908eb4de528695d954507602142eb30d2f2873 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Sun, 17 Jan 2016 14:33:43 +0000
Subject: [PATCH 04/33] inst-tester: split _pick_choice into two

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 installability/tester.py | 158 +++++++++++++++++++++++++----------------------
 1 file changed, 85 insertions(+), 73 deletions(-)

diff --git a/installability/tester.py b/installability/tester.py
index 986dde4..b87ae6e 100644
--- a/installability/tester.py
+++ b/installability/tester.py
@@ -328,11 +328,11 @@ class InstallabilityTester(object):
         #
         # * A package is installable if never and musts are disjointed
         #   and both check and choices are empty.
-        #   - exception: _pick_choice may determine the installability
+        #   - exception: resolve_choice may determine the installability
         #     of t via recursion (calls _check_inst).  In this case
         #     check and choices are not (always) empty.
 
-        def _pick_choice(rebuild, set=set, len=len):
+        def _prune_choices(rebuild, set=set, len=len):
             """Picks a choice from choices and updates rebuild.
 
             Prunes the choices and updates "rebuild" to reflect the
@@ -382,75 +382,14 @@ class InstallabilityTester(object):
                     # all alternatives would violate the conflicts or are uninstallable
                     # => package is not installable
                     stats.choice_presolved += 1
-                    return None
+                    return False
 
                 # The choice is still deferred
                 rebuild.add(frozenset(remain))
 
-            if check or not rebuild:
-                return False
-
-            choice = iter(rebuild.pop())
-            last = next(choice) # pick one to go last
-            for p in choice:
-                musts_copy = musts.copy()
-                never_tmp = set()
-                choices_tmp = set()
-                check_tmp = set([p])
-                if not self._check_loop(universe, testing, eqv_table,
-                                        stats, musts_copy, never_tmp,
-                                        cbroken, choices_tmp,
-                                        check_tmp):
-                    # p cannot be chosen/is broken (unlikely, but ...)
-                    continue
-
-                # Test if we can pick p without any consequences.
-                # - when we can, we avoid a backtrack point.
-                if never_tmp <= never and choices_tmp <= rebuild:
-                    # we can pick p without picking up new conflicts
-                    # or unresolved choices.  Therefore we commit to
-                    # using p.
-                    #
-                    # NB: Optimally, we would go to the start of this
-                    # routine, but to conserve stack-space, we return
-                    # and expect to be called again later.
-                    musts.update(musts_copy)
-                    stats.choice_resolved_without_restore_point += 1
-                    return False
-
-                if not musts.isdisjoint(never_tmp):
-                    # If we pick p, we will definitely end up making
-                    # t uninstallable, so p is a no-go.
-                    continue
+            return True
 
-                stats.backtrace_restore_point_created += 1
-                # We are not sure that p is safe, setup a backtrack
-                # point and recurse.
-                never_tmp |= never
-                choices_tmp |= rebuild
-                if self._check_inst(p, musts_copy, never_tmp,
-                                    choices_tmp):
-                    # Success, p was a valid choice and made it all
-                    # installable
-                    return True
-
-                # If we get here, we failed to find something that
-                # would satisfy choice (without breaking the
-                # installability of t).  This means p cannot be used
-                # to satisfy the dependencies, so pretend to conflict
-                # with it - hopefully it will reduce future choices.
-                never.add(p)
-                stats.backtrace_restore_point_used += 1
-
-            # Optimization for the last case; avoid the recursive call
-            # and just assume the last will lead to a solution.  If it
-            # doesn't there is no solution and if it does, we don't
-            # have to back-track anyway.
-            check.add(last)
-            musts.add(last)
-            stats.backtrace_last_option += 1
-            return False
-        # END _pick_choice
+        # END _prune_choices
 
         while check:
             if not check_loop(choices, check):
@@ -459,15 +398,20 @@ class InstallabilityTester(object):
 
             if choices:
                 rebuild = set()
-                # We have to "guess" now, which is always fun, but not cheap
-                r = _pick_choice(rebuild)
-                if r is None:
+
+                if not _prune_choices(rebuild):
                     verdict = False
                     break
-                if r:
-                    # The recursive call have already updated the
-                    # cache so there is not point in doing it again.
-                    return True
+
+                while not check and rebuild:
+                    # We have to "guess" now, which is always fun, but not cheap. We
+                    # stop guessing:
+                    # - once we run out of choices to make (obviously), OR
+                    # - if one of the choices exhaust all but one option
+                    if self.resolve_choice(check, musts, never, rebuild):
+                        # The recursive call have already updated the
+                        # cache so there is not point in doing it again.
+                        return True
                 choices = rebuild
 
         if verdict:
@@ -479,6 +423,74 @@ class InstallabilityTester(object):
 
         return verdict
 
+    def resolve_choice(self, check, musts, never, choices):
+        universe = self._universe
+        testing = self._testing
+        eqv_table = self._eqv_table
+        stats = self._stats
+        cbroken = self._cache_broken
+
+        choice = iter(choices.pop())
+        last = next(choice) # pick one to go last
+        for p in choice:
+            musts_copy = musts.copy()
+            never_tmp = set()
+            choices_tmp = set()
+            check_tmp = set([p])
+            if not self._check_loop(universe, testing, eqv_table,
+                                    stats, musts_copy, never_tmp,
+                                    cbroken, choices_tmp,
+                                    check_tmp):
+                # p cannot be chosen/is broken (unlikely, but ...)
+                continue
+
+            # Test if we can pick p without any consequences.
+            # - when we can, we avoid a backtrack point.
+            if never_tmp <= never and choices_tmp <= choices:
+                # we can pick p without picking up new conflicts
+                # or unresolved choices.  Therefore we commit to
+                # using p.
+                #
+                # NB: Optimally, we would go to the start of this
+                # routine, but to conserve stack-space, we return
+                # and expect to be called again later.
+                musts.update(musts_copy)
+                stats.choice_resolved_without_restore_point += 1
+                return False
+
+            if not musts.isdisjoint(never_tmp):
+                # If we pick p, we will definitely end up making
+                # t uninstallable, so p is a no-go.
+                continue
+
+            stats.backtrace_restore_point_created += 1
+            # We are not sure that p is safe, setup a backtrack
+            # point and recurse.
+            never_tmp |= never
+            choices_tmp |= choices
+            if self._check_inst(p, musts_copy, never_tmp,
+                                choices_tmp):
+                # Success, p was a valid choice and made it all
+                # installable
+                return True
+
+            # If we get here, we failed to find something that
+            # would satisfy choice (without breaking the
+            # installability of t).  This means p cannot be used
+            # to satisfy the dependencies, so pretend to conflict
+            # with it - hopefully it will reduce future choices.
+            never.add(p)
+            stats.backtrace_restore_point_used += 1
+
+        # Optimization for the last case; avoid the recursive call
+        # and just assume the last will lead to a solution.  If it
+        # doesn't there is no solution and if it does, we don't
+        # have to back-track anyway.
+        check.add(last)
+        musts.add(last)
+        stats.backtrace_last_option += 1
+        return False
+
     def _check_loop(self, universe, testing, eqv_table, stats, musts, never,
                     cbroken, choices, check, len=len,
                     frozenset=frozenset):
-- 
2.8.0.rc3
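
A heavily simplified, self-contained sketch of the two-phase structure
this patch introduces (prune choices first, then resolve the remainder
with backtracking) may help; `prune_choices` and `resolve` are
illustrative names, and the sketch omits the equivalence tables,
statistics, caching and the "pick one to go last" optimization that the
real tester maintains:

```python
def prune_choices(choices, musts, never):
    """Drop alternatives in `never`; return None if a choice empties out."""
    rebuild = set()
    for alternatives in choices:
        remain = frozenset(a for a in alternatives if a not in never)
        if not remain:
            return None              # no alternative survives: unsolvable
        if len(remain) == 1:
            musts.update(remain)     # only one option left: forced pick
        else:
            rebuild.add(remain)
    return rebuild

def resolve(choices, musts, never):
    """Pick one alternative per choice; True if a consistent pick exists."""
    choices = prune_choices(choices, musts, never)
    if choices is None or not musts.isdisjoint(never):
        return False
    if not choices:
        return True                  # nothing left to guess
    rest = set(choices)
    alternatives = rest.pop()
    for p in alternatives:
        if resolve(rest, musts | {p}, never):
            musts.add(p)
            return True
        never = never | {p}          # failed pick: treat as a conflict
    return False
```
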

From c22a9ceb9266b5d3b396db462ecf6e3abdddbf0d Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Sun, 17 Jan 2016 15:14:14 +0000
Subject: [PATCH 05/33] inst-tester: Move loop into resolve_choices

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 installability/tester.py | 124 ++++++++++++++++++++++++-----------------------
 1 file changed, 63 insertions(+), 61 deletions(-)

diff --git a/installability/tester.py b/installability/tester.py
index b87ae6e..406922d 100644
--- a/installability/tester.py
+++ b/installability/tester.py
@@ -328,7 +328,7 @@ class InstallabilityTester(object):
         #
         # * A package is installable if never and musts are disjointed
         #   and both check and choices are empty.
-        #   - exception: resolve_choice may determine the installability
+        #   - exception: resolve_choices may determine the installability
         #     of t via recursion (calls _check_inst).  In this case
         #     check and choices are not (always) empty.
 
@@ -403,12 +403,12 @@ class InstallabilityTester(object):
                     verdict = False
                     break
 
-                while not check and rebuild:
+                if not check and rebuild:
                     # We have to "guess" now, which is always fun, but not cheap. We
                     # stop guessing:
                     # - once we run out of choices to make (obviously), OR
                     # - if one of the choices exhaust all but one option
-                    if self.resolve_choice(check, musts, never, rebuild):
+                    if self.resolve_choices(check, musts, never, rebuild):
                         # The recursive call have already updated the
                         # cache so there is not point in doing it again.
                         return True
@@ -423,73 +423,75 @@ class InstallabilityTester(object):
 
         return verdict
 
-    def resolve_choice(self, check, musts, never, choices):
+    def resolve_choices(self, check, musts, never, choices):
         universe = self._universe
         testing = self._testing
         eqv_table = self._eqv_table
         stats = self._stats
         cbroken = self._cache_broken
 
-        choice = iter(choices.pop())
-        last = next(choice) # pick one to go last
-        for p in choice:
-            musts_copy = musts.copy()
-            never_tmp = set()
-            choices_tmp = set()
-            check_tmp = set([p])
-            if not self._check_loop(universe, testing, eqv_table,
-                                    stats, musts_copy, never_tmp,
-                                    cbroken, choices_tmp,
-                                    check_tmp):
-                # p cannot be chosen/is broken (unlikely, but ...)
-                continue
-
-            # Test if we can pick p without any consequences.
-            # - when we can, we avoid a backtrack point.
-            if never_tmp <= never and choices_tmp <= choices:
-                # we can pick p without picking up new conflicts
-                # or unresolved choices.  Therefore we commit to
-                # using p.
-                #
-                # NB: Optimally, we would go to the start of this
-                # routine, but to conserve stack-space, we return
-                # and expect to be called again later.
-                musts.update(musts_copy)
-                stats.choice_resolved_without_restore_point += 1
-                return False
+        while choices:
+            choice_options = choices.pop()
+
+            choice = iter(choice_options)
+            last = next(choice)  # pick one to go last
+            solved = False
+            for p in choice:
+                musts_copy = musts.copy()
+                never_tmp = set()
+                choices_tmp = set()
+                check_tmp = set([p])
+                if not self._check_loop(universe, testing, eqv_table,
+                                        stats, musts_copy, never_tmp,
+                                        cbroken, choices_tmp,
+                                        check_tmp):
+                    # p cannot be chosen/is broken (unlikely, but ...)
+                    continue
 
-            if not musts.isdisjoint(never_tmp):
-                # If we pick p, we will definitely end up making
-                # t uninstallable, so p is a no-go.
-                continue
+                # Test if we can pick p without any consequences.
+                # - when we can, we avoid a backtrack point.
+                if never_tmp <= never and choices_tmp <= choices:
+                    # we can pick p without picking up new conflicts
+                    # or unresolved choices.  Therefore we commit to
+                    # using p.
+                    musts.update(musts_copy)
+                    stats.choice_resolved_without_restore_point += 1
+                    solved = True
+                    break
 
-            stats.backtrace_restore_point_created += 1
-            # We are not sure that p is safe, setup a backtrack
-            # point and recurse.
-            never_tmp |= never
-            choices_tmp |= choices
-            if self._check_inst(p, musts_copy, never_tmp,
-                                choices_tmp):
-                # Success, p was a valid choice and made it all
-                # installable
-                return True
+                if not musts.isdisjoint(never_tmp):
+                    # If we pick p, we will definitely end up making
+                    # t uninstallable, so p is a no-go.
+                    continue
 
-            # If we get here, we failed to find something that
-            # would satisfy choice (without breaking the
-            # installability of t).  This means p cannot be used
-            # to satisfy the dependencies, so pretend to conflict
-            # with it - hopefully it will reduce future choices.
-            never.add(p)
-            stats.backtrace_restore_point_used += 1
-
-        # Optimization for the last case; avoid the recursive call
-        # and just assume the last will lead to a solution.  If it
-        # doesn't there is no solution and if it does, we don't
-        # have to back-track anyway.
-        check.add(last)
-        musts.add(last)
-        stats.backtrace_last_option += 1
-        return False
+                stats.backtrace_restore_point_created += 1
+                # We are not sure that p is safe, setup a backtrack
+                # point and recurse.
+                never_tmp |= never
+                choices_tmp |= choices
+                if self._check_inst(p, musts_copy, never_tmp,
+                                    choices_tmp):
+                    # Success, p was a valid choice and made it all
+                    # installable
+                    return True
+
+                # If we get here, we failed to find something that
+                # would satisfy choice (without breaking the
+                # installability of t).  This means p cannot be used
+                # to satisfy the dependencies, so pretend to conflict
+                # with it - hopefully it will reduce future choices.
+                never.add(p)
+                stats.backtrace_restore_point_used += 1
+
+            if not solved:
+                # Optimization for the last case; avoid the recursive call
+                # and just assume the last will lead to a solution.  If it
+                # doesn't there is no solution and if it does, we don't
+                # have to back-track anyway.
+                check.add(last)
+                musts.add(last)
+                stats.backtrace_last_option += 1
+                return False
 
     def _check_loop(self, universe, testing, eqv_table, stats, musts, never,
                     cbroken, choices, check, len=len,
-- 
2.8.0.rc3

From b2c77cb0603528b7728b091e923ec8f6cf86b466 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Sun, 17 Jan 2016 18:35:09 +0000
Subject: [PATCH 06/33] inst-tester: Add missing params to doc strings

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 installability/tester.py | 19 ++++++++++++-------
 1 file changed, 12 insertions(+), 7 deletions(-)

diff --git a/installability/tester.py b/installability/tester.py
index 406922d..1d03480 100644
--- a/installability/tester.py
+++ b/installability/tester.py
@@ -105,15 +105,17 @@ class InstallabilityTester(object):
     def stats(self):
         return self._stats
 
-    def are_equivalent(self, p1, p2):
-        """Test if p1 and p2 are equivalent
+    def are_equivalent(self, pkg_id1, pkg_id2):
+        """Test if pkg_id1 and pkg_id2 are equivalent
 
-        Returns True if p1 and p2 have the same "signature" in
+        :param pkg_id1: The id of the first package
+        :param pkg_id2: The id of the second package
+        :return: True if pkg_id1 and pkg_id2 have the same "signature" in
         the package dependency graph (i.e. relations can not tell
-        them apart semantically except for their name)
+        them apart semantically except for their name). Otherwise False
         """
         eqv_table = self._eqv_table
-        return p1 in eqv_table and p2 in eqv_table[p1]
+        return pkg_id1 in eqv_table and pkg_id2 in eqv_table[pkg_id1]
 
     def reverse_dependencies_of(self, pkg_id):
         """Returns the set of reverse dependencies of a given package
@@ -162,6 +164,8 @@ class InstallabilityTester(object):
 
         If the package is not known, this method will throw an
         KeyError.
+
+        :param pkg_id: The id of the package
         """
 
         if pkg_id not in self._universe:
@@ -188,8 +192,9 @@ class InstallabilityTester(object):
     def remove_testing_binary(self, pkg_id):
         """Remove a binary from "testing"
 
+        :param pkg_id: The id of the package
         If the package is not known, this method will throw an
-        Keyrror.
+        KeyError.
         """
 
         if pkg_id not in self._universe:
@@ -219,6 +224,7 @@ class InstallabilityTester(object):
         The package is assumed to be in "testing" and only packages in
         "testing" can be used to satisfy relations.
 
+        :param pkg_id: The id of the package
         Returns True iff the package is installable.
         Returns False otherwise.
         """
@@ -239,7 +245,6 @@ class InstallabilityTester(object):
         self._stats.cache_misses += 1
         return self._check_inst(pkg_id)
 
-
     def _check_inst(self, t, musts=None, never=None, choices=None):
         # See the explanation of musts, never and choices below.
 
-- 
2.8.0.rc3

From 275f8acc6a1e47ce07cf1266185fd5e2aeb9e3a1 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Sun, 17 Jan 2016 18:35:50 +0000
Subject: [PATCH 07/33] inst-tester: Use short-hand syntax for new sets

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 installability/tester.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/installability/tester.py b/installability/tester.py
index 1d03480..23818e5 100644
--- a/installability/tester.py
+++ b/installability/tester.py
@@ -293,7 +293,7 @@ class InstallabilityTester(object):
             choices = set()
 
         # The subset of musts we haven't checked yet.
-        check = set([t])
+        check = {t}
 
         if len(musts) == 1:
             # Include the essential packages in testing as a starting point.
@@ -445,7 +445,7 @@ class InstallabilityTester(object):
                 musts_copy = musts.copy()
                 never_tmp = set()
                 choices_tmp = set()
-                check_tmp = set([p])
+                check_tmp = {p}
                 if not self._check_loop(universe, testing, eqv_table,
                                         stats, musts_copy, never_tmp,
                                         cbroken, choices_tmp,
-- 
2.8.0.rc3

From b0d794609771cdad1a1644e4228dc51f622ba637 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Tue, 22 Mar 2016 15:12:53 +0000
Subject: [PATCH 08/33] Partially support versioned provides

With this patch, Britney will correctly parse (and deparse) a
versioned Provides.  Furthermore, she will allow it to satisfy any
unversioned dependency on the provided package.

This is the easy half of #786803.

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py      | 29 ++++++++++++++++++++++-------
 britney_util.py | 15 ++++++++++++++-
 2 files changed, 36 insertions(+), 8 deletions(-)

diff --git a/britney.py b/britney.py
index 7c0985c..a75be5d 100755
--- a/britney.py
+++ b/britney.py
@@ -691,11 +691,24 @@ class Britney(object):
 
             # register virtual packages and real packages that provide them
             if dpkg[PROVIDES]:
-                parts = [p.strip() for p in dpkg[PROVIDES].split(",")]
-                for p in parts:
-                    provides[p].add(pkg)
-                dpkg[PROVIDES] = parts
-            else: dpkg[PROVIDES] = []
+                parts = apt_pkg.parse_depends(dpkg[PROVIDES], False)
+                nprov = []
+                for or_clause in parts:
+                    if len(or_clause) != 1:
+                        msg = "Ignoring invalid provides in %s: Alternatives [%s]" % (str(pkg_id), str(or_clause))
+                        self.__log(msg, type='W')
+                        continue
+                    for part in or_clause:
+                        provided, version, op = part
+                        if op != '' and op != '=':
+                            msg = "Ignoring invalid provides in %s: %s (%s %s)" % (str(pkg_id), provided, op, version)
+                            self.__log(msg, type='W')
+                            continue
+                        provides[provided].add(pkg)
+                        nprov.append(part)
+                dpkg[PROVIDES] = nprov
+            else:
+                dpkg[PROVIDES] = []
 
             # add the resulting dictionary to the package list
             packages[pkg] = dpkg
@@ -2086,7 +2099,8 @@ class Britney(object):
                         affected.update(inst_tester.negative_dependencies_of(rm_pkg_id))
 
                     # remove the provided virtual packages
-                    for j in pkg_data[PROVIDES]:
+                    for prov_rel in pkg_data[PROVIDES]:
+                        j = prov_rel[0]
                         key = j + "/" + parch
                         if key not in undo['virtual']:
                             undo['virtual'][key] = provides_t_a[j].copy()
@@ -2168,7 +2182,8 @@ class Britney(object):
                 binaries_t_a[binary] = new_pkg_data
                 inst_tester.add_testing_binary(updated_pkg_id)
                 # register new provided packages
-                for j in new_pkg_data[PROVIDES]:
+                for prov_rel in new_pkg_data[PROVIDES]:
+                    j = prov_rel[0]
                     key = j + "/" + parch
                     if j not in provides_t_a:
                         undo['nvirtual'].append(key)
diff --git a/britney_util.py b/britney_util.py
index 1cb96d6..6363c58 100644
--- a/britney_util.py
+++ b/britney_util.py
@@ -418,6 +418,19 @@ def write_sources(sources_s, filename):
            f.write(output + "\n\n")
 
 
+def relation_atom_to_string(atom):
+    """Take a parsed dependency and turn it into a string
+    """
+    pkg, version, rel_op = atom
+    if rel_op != '':
+        if rel_op in ('<', '>'):
+            # APT translates "<<" and ">>" into "<" and ">".  We have
+            # to deparse those back into the original form.
+            rel_op += rel_op
+        return "%s (%s %s)" % (pkg, rel_op, version)
+    return pkg
+
+
 def write_controlfiles(sources, packages, suite, basedir):
     """Write the control files
 
@@ -462,7 +475,7 @@ def write_controlfiles(sources, packages, suite, basedir):
                         output += (k + ": " + source + "\n")
                     elif key == PROVIDES:
                         if bin_data[key]:
-                            output += (k + ": " + ", ".join(bin_data[key]) + "\n")
+                            output += (k + ": " + ", ".join(relation_atom_to_string(p) for p in bin_data[key]) + "\n")
                     elif key == ESSENTIAL:
                         if bin_data[key]:
                             output += (k + ": " + " yes\n")
-- 
2.8.0.rc3
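
The deparsing helper added to britney_util.py is self-contained; a sketch of its round-trip behaviour (the function body below mirrors the hunk above, assuming the `(name, version, op)` atom shape produced by apt_pkg.parse_depends):

```python
def relation_atom_to_string(atom):
    """Turn a parsed relation atom (name, version, op) back into
    Debian control-file syntax."""
    pkg, version, rel_op = atom
    if rel_op != '':
        if rel_op in ('<', '>'):
            # APT parses the strict operators "<<" and ">>" into "<"
            # and ">"; deparse them back into the control-file form.
            rel_op += rel_op
        return "%s (%s %s)" % (pkg, rel_op, version)
    return pkg
```

An unversioned atom deparses to the bare name, a versioned one to `name (op version)`, and `'<'`/`'>'` come back out as `<<`/`>>`.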

From 8971097ae5f0c58440d0f0c512f618052847b320 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Tue, 22 Mar 2016 15:28:39 +0000
Subject: [PATCH 09/33] Support versioned provides (without multi-arch)

Closes: #786803

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 31 +++++++++++++++++++------------
 1 file changed, 19 insertions(+), 12 deletions(-)

diff --git a/britney.py b/britney.py
index a75be5d..3133138 100755
--- a/britney.py
+++ b/britney.py
@@ -699,12 +699,15 @@ class Britney(object):
                         self.__log(msg, type='W')
                         continue
                     for part in or_clause:
-                        provided, version, op = part
+                        provided, provided_version, op = part
                         if op != '' and op != '=':
-                            msg = "Ignoring invalid provides in %s: %s (%s %s)" % (str(pkg_id), provided, op, version)
+                            msg = "Ignoring invalid provides in %s: %s (%s %s)" % (str(pkg_id), provided, op, provided_version)
                             self.__log(msg, type='W')
                             continue
-                        provides[provided].add(pkg)
+                        provided = intern(provided)
+                        provided_version = intern(provided_version)
+                        part = (provided, provided_version, intern(op))
+                        provides[provided].add((pkg, provided_version))
                         nprov.append(part)
                 dpkg[PROVIDES] = nprov
             else:
@@ -994,13 +997,19 @@ class Britney(object):
                         packages.append(name)
 
             # look for the package in the virtual packages list and loop on them
-            for prov in provides_s_a.get(name, []):
-                if prov not in binaries_s_a: continue
+            for prov, prov_version in provides_s_a.get(name, []):
+                if prov not in binaries_s_a:
+                    continue
                 # A provides only satisfies:
                 # - an unversioned dependency (per Policy Manual §7.5)
                 # - a dependency without an architecture qualifier
                 #   (per analysis of apt code)
-                if op == '' and version == '' and archqual is None:
+                if archqual is not None:
+                    # Punt on this case - these days, APT and dpkg might actually agree on
+                    # this.
+                    continue
+                if (op == '' and version == '') or \
+                   (prov_version != '' and apt_pkg.check_dep(prov_version, op, version)):
                     packages.append(prov)
 
         return packages
@@ -2099,12 +2108,11 @@ class Britney(object):
                         affected.update(inst_tester.negative_dependencies_of(rm_pkg_id))
 
                     # remove the provided virtual packages
-                    for prov_rel in pkg_data[PROVIDES]:
-                        j = prov_rel[0]
+                    for j, prov_version, _ in pkg_data[PROVIDES]:
                         key = j + "/" + parch
                         if key not in undo['virtual']:
                             undo['virtual'][key] = provides_t_a[j].copy()
-                        provides_t_a[j].remove(binary)
+                        provides_t_a[j].remove((binary, prov_version))
                         if not provides_t_a[j]:
                             del provides_t_a[j]
                     # finally, remove the binary package
@@ -2182,15 +2190,14 @@ class Britney(object):
                 binaries_t_a[binary] = new_pkg_data
                 inst_tester.add_testing_binary(updated_pkg_id)
                 # register new provided packages
-                for prov_rel in new_pkg_data[PROVIDES]:
-                    j = prov_rel[0]
+                for j, prov_version, _ in new_pkg_data[PROVIDES]:
                     key = j + "/" + parch
                     if j not in provides_t_a:
                         undo['nvirtual'].append(key)
                         provides_t_a[j] = set()
                     elif key not in undo['virtual']:
                         undo['virtual'][key] = provides_t_a[j].copy()
-                    provides_t_a[j].add(binary)
+                    provides_t_a[j].add((binary, prov_version))
                 if not equivalent_replacement:
                     # all the reverse dependencies are affected by the change
                     affected.add(updated_pkg_id)
-- 
2.8.0.rc3
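
The satisfaction rule the patch implements can be restated compactly: an architecture-qualified dependency is never satisfied by a provide (punted, as in the patch), an unversioned dependency is satisfied by any provide, and a versioned dependency needs a versioned provide whose version passes the comparison. A minimal sketch follows; `check_dep` here is a toy stand-in handling plain dotted versions only - the real code uses apt_pkg.check_dep and Debian version semantics.

```python
def _toy_cmp(v1, v2):
    # Toy comparison for plain dotted versions only (NOT Debian
    # version semantics; Britney uses apt_pkg for this).
    a = [int(x) for x in v1.split('.')]
    b = [int(x) for x in v2.split('.')]
    return (a > b) - (a < b)

def check_dep(have, op, want):
    c = _toy_cmp(have, want)
    return {'=': c == 0, '>=': c >= 0, '<=': c <= 0,
            '>': c > 0, '<': c < 0}[op]

def provides_satisfies(dep_op, dep_version, dep_archqual, prov_version):
    """Decide whether a (possibly versioned) Provides satisfies a
    dependency, mirroring the rules in the patch above."""
    if dep_archqual is not None:
        # Punt on architecture-qualified dependencies, as the patch does.
        return False
    if dep_op == '' and dep_version == '':
        # An unversioned dependency is satisfied by any provide
        # (per Policy Manual §7.5).
        return True
    # A versioned dependency needs a versioned provide whose version
    # satisfies the relation.
    return prov_version != '' and check_dep(prov_version, dep_op, dep_version)
```

So `Depends: foo (>= 1.0)` is satisfied by `Provides: foo (= 1.2)` but not by a bare `Provides: foo`.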

From 672b08407c6df14262a0654d20f07765653afdab Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Tue, 22 Mar 2016 20:37:48 +0000
Subject: [PATCH 10/33] britney: Work around bug 815995

This bug is triggered by a corner case involving:

 * source orig providing liborig1 and orig-doc in testing
 * source orig providing liborig2 and orig-doc in unstable
 * source hijack providing liborig2 and orig-doc in both testing and unstable,
   where hijack's binaries have a higher version than those of
   "orig".

The arch:all packages are needed to trigger this, because Britney
flags an arch:any package as "out of date" and stops the migration
there.  However, she is more lenient with arch:all packages.

What happens is that Britney realises that src:orig needs to be updated
in testing (to remove liborig1).  This leaves src:orig with no
binaries in testing (as the orig-doc from hijack is used) and it is
therefore removed as an obsolete source.  The obsolete removal then
explodes because Britney also tries to remove the liborig1 package,
which is no longer there.

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 4 ++++
 1 file changed, 4 insertions(+)

diff --git a/britney.py b/britney.py
index 3133138..e96c6e5 100755
--- a/britney.py
+++ b/britney.py
@@ -1931,6 +1931,10 @@ class Britney(object):
                         and parch != migration_architecture):
                         continue
 
+                    # Work around #815995
+                    if migration_architecture == 'source' and is_removal and binary not in binaries_t[parch][0]:
+                        continue
+
                     if (not include_hijacked
                         and binaries_t[parch][0][binary][SOURCE] != source_name):
                         continue
-- 
2.8.0.rc3

From 03e6ae788cce6121569eb8746e174ce96e2a7631 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Wed, 23 Mar 2016 09:55:07 +0000
Subject: [PATCH 11/33] Remove all calls to "same_source" - they were overkill

The same_source is supposed to compare two versions and (if needed)
"massage" a binNMU version into a source version.  This extra feature
of same_source happens to be unused (and not generally applicable):

 1) We always compare two source versions, so there is never a
    binNMU version in the first place.
 2) binary versions are *not* always equal to their source version
    (even with the binNMU suffix stripped).  This happens when
    packages use "dpkg-gencontrol -v<version>".

Note this causes results from some live-data tests to change, because
there has been a sourceful upload with a binNMU version.  It was
intended as a binNMU, but was uploaded with the source as well.  As
Britney no longer works around this issue, she now removes the
affected packages in the end (as their source version does not match
the version in testing).

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py      | 47 +++++++++++++++++++++--------------------------
 britney_util.py | 32 ++------------------------------
 2 files changed, 23 insertions(+), 56 deletions(-)

diff --git a/britney.py b/britney.py
index e96c6e5..4ab7e18 100755
--- a/britney.py
+++ b/britney.py
@@ -199,7 +199,7 @@ from installability.builder import InstallabilityTesterBuilder
 from excuse import Excuse
 from migrationitem import MigrationItem
 from hints import HintCollection
-from britney_util import (old_libraries_format, same_source, undo_changes,
+from britney_util import (old_libraries_format, undo_changes,
                           compute_reverse_tree,
                           read_nuninst, write_nuninst, write_heidi,
                           eval_uninst, newly_uninst, make_migrationitem,
@@ -1112,7 +1112,7 @@ class Britney(object):
         self.excuses.append(excuse)
         return True
 
-    def should_upgrade_srcarch(self, src, arch, suite, same_source=same_source):
+    def should_upgrade_srcarch(self, src, arch, suite):
         """Check if a set of binary packages should be upgraded
 
         This method checks if the binary packages produced by the source
@@ -1122,8 +1122,6 @@ class Britney(object):
         It returns False if the given packages don't need to be upgraded,
         True otherwise. In the former case, a new excuse is appended to
         the object attribute excuses.
-
-        same_source is an optimization to avoid "load global".
         """
         # retrieve the source packages for testing and suite
         source_t = self.sources['testing'][src]
@@ -1140,7 +1138,7 @@ class Britney(object):
         # version in testing, then stop here and return False
         # (as a side effect, a removal may generate such excuses for both the source
         # package and its binary packages on each architecture)
-        for hint in [ x for x in self.hints.search('remove', package=src) if same_source(source_t[VERSION], x.version) ]:
+        for hint in [x for x in self.hints.search('remove', package=src) if source_t[VERSION] == x.version]:
             excuse.addhtml("Removal request by %s" % (hint.user))
             excuse.addhtml("Trying to remove package, not update it")
             excuse.addhtml("Not considered")
@@ -1170,13 +1168,13 @@ class Britney(object):
 
             # if the new binary package is not from the same source as the testing one, then skip it
             # this implies that this binary migration is part of a source migration
-            if same_source(source_u[VERSION], pkgsv) and not same_source(source_t[VERSION], pkgsv):
+            if source_u[VERSION] == pkgsv and source_t[VERSION] != pkgsv:
                 anywrongver = True
                 excuse.addhtml("From wrong source: %s %s (%s not %s)" % (pkg_name, binary_u[VERSION], pkgsv, source_t[VERSION]))
                 continue
 
             # cruft in unstable
-            if not same_source(source_u[VERSION], pkgsv) and not same_source(source_t[VERSION], pkgsv):
+            if source_u[VERSION] != pkgsv and source_t[VERSION] != pkgsv:
                 if self.options.ignore_cruft:
                     excuse.addhtml("Old cruft: %s %s (but ignoring cruft, so nevermind)" % (pkg_name, pkgsv))
                 else:
@@ -1186,7 +1184,7 @@ class Britney(object):
 
             # if the source package has been updated in unstable and this is a binary migration, skip it
             # (the binaries are now out-of-date)
-            if same_source(source_t[VERSION], pkgsv) and source_t[VERSION] != source_u[VERSION]:
+            if source_t[VERSION] == pkgsv and source_t[VERSION] != source_u[VERSION]:
                 anywrongver = True
                 excuse.addhtml("From wrong source: %s %s (%s not %s)" % (pkg_name, binary_u[VERSION], pkgsv, source_u[VERSION]))
                 continue
@@ -1219,7 +1217,7 @@ class Britney(object):
         # package is not fake, then check what packages should be removed
         if not anywrongver and (anyworthdoing or not self.sources[suite][src][FAKESRC]):
             srcv = self.sources[suite][src][VERSION]
-            ssrc = same_source(source_t[VERSION], srcv)
+            ssrc = source_t[VERSION] == srcv
             # if this is a binary-only migration via *pu, we never want to try
             # removing binary packages
             if not (ssrc and suite != 'unstable'):
@@ -1264,7 +1262,7 @@ class Britney(object):
         # otherwise, return False
         return False
 
-    def should_upgrade_src(self, src, suite, same_source=same_source):
+    def should_upgrade_src(self, src, suite):
         """Check if source package should be upgraded
 
         This method checks if a source package should be upgraded. The analysis
@@ -1274,8 +1272,6 @@ class Britney(object):
         It returns False if the given package doesn't need to be upgraded,
         True otherwise. In the former case, a new excuse is appended to
         the object attribute excuses.
-
-        same_source is an opt to avoid "load global".
         """
 
         # retrieve the source packages for testing (if available) and suite
@@ -1320,8 +1316,8 @@ class Britney(object):
         # if there is a `remove' hint and the requested version is the same as the
         # version in testing, then stop here and return False
         for item in self.hints.search('remove', package=src):
-            if source_t and same_source(source_t[VERSION], item.version) or \
-               same_source(source_u[VERSION], item.version):
+            if source_t and source_t[VERSION] == item.version or \
+               source_u[VERSION] == item.version:
                 excuse.addhtml("Removal request by %s" % (item.user))
                 excuse.addhtml("Trying to remove package, not update it")
                 excuse.addreason("remove")
@@ -1346,7 +1342,7 @@ class Britney(object):
             unblock_cmd = "un" + block_cmd
             unblocks = self.hints.search(unblock_cmd, package=src)
 
-            if unblocks and unblocks[0].version is not None and same_source(unblocks[0].version, source_u[VERSION]):
+            if unblocks and unblocks[0].version is not None and unblocks[0].version == source_u[VERSION]:
                 if suite == 'unstable' or block_cmd == 'block-udeb':
                     excuse.addhtml("Ignoring %s request by %s, due to %s request by %s" %
                                    (block_cmd, blocked[block_cmd].user, unblock_cmd, unblocks[0].user))
@@ -1380,22 +1376,22 @@ class Britney(object):
         if suite == 'unstable':
             if src not in self.dates:
                 self.dates[src] = (source_u[VERSION], self.date_now)
-            elif not same_source(self.dates[src][0], source_u[VERSION]):
+            elif self.dates[src][0] != source_u[VERSION]:
                 self.dates[src] = (source_u[VERSION], self.date_now)
 
             days_old = self.date_now - self.dates[src][1]
             min_days = self.MINDAYS[urgency]
 
-            for age_days_hint in [ x for x in self.hints.search('age-days', package=src) if \
-               same_source(source_u[VERSION], x.version) ]:
+            for age_days_hint in [x for x in self.hints.search('age-days', package=src)
+                                  if source_u[VERSION] == x.version]:
                 excuse.addhtml("Overriding age needed from %d days to %d by %s" % (min_days,
                     int(age_days_hint.days), age_days_hint.user))
                 min_days = int(age_days_hint.days)
 
             excuse.setdaysold(days_old, min_days)
             if days_old < min_days:
-                urgent_hints = [ x for x in self.hints.search('urgent', package=src) if \
-                   same_source(source_u[VERSION], x.version) ]
+                urgent_hints = [x for x in self.hints.search('urgent', package=src)
+                                if source_u[VERSION] == x.version]
                 if urgent_hints:
                     excuse.addhtml("Too young, but urgency pushed by %s" % (urgent_hints[0].user))
                 else:
@@ -1458,7 +1454,7 @@ class Britney(object):
                 # if it wasn't built by the same source, it is out-of-date
                 # if there is at least one binary on this arch which is
                 # up-to-date, there is a build on this arch
-                if not same_source(source_u[VERSION], pkgsv):
+                if source_u[VERSION] != pkgsv:
                     if pkgsv not in oodbins:
                         oodbins[pkgsv] = []
                     oodbins[pkgsv].append(pkg)
@@ -1561,7 +1557,7 @@ class Britney(object):
                         "though it fixes more than it introduces, whine at debian-release)" % pkg)
 
         # check if there is a `force' hint for this package, which allows it to go in even if it is not updateable
-        forces = [ x for x in self.hints.search('force', package=src) if same_source(source_u[VERSION], x.version) ]
+        forces = [x for x in self.hints.search('force', package=src) if source_u[VERSION] == x.version]
         if forces:
             excuse.dontinvalidate = True
         if not update_candidate and forces:
@@ -1637,14 +1633,12 @@ class Britney(object):
                     exclookup[x].is_valid = False
             i = i + 1
  
-    def write_excuses(self, same_source=same_source):
+    def write_excuses(self):
         """Produce and write the update excuses
 
         This method handles the update excuses generation: the packages are
         looked at to determine whether they are valid candidates. For the details
         of this procedure, please refer to the module docstring.
-
-        same_source is an opt to avoid "load global".
         """
 
         self.__log("Update Excuses generation started", type="I")
@@ -1704,7 +1698,8 @@ class Britney(object):
 
             # check if the version specified in the hint is the same as the considered package
             tsrcv = sources['testing'][src][VERSION]
-            if not same_source(tsrcv, item.version): continue
+            if tsrcv != item.version:
+                continue
 
             # add the removal of the package to upgrade_me and build a new excuse
             upgrade_me.append("-%s" % (src))
diff --git a/britney_util.py b/britney_util.py
index 6363c58..372609a 100644
--- a/britney_util.py
+++ b/britney_util.py
@@ -26,7 +26,6 @@ from functools import partial
 from datetime import datetime
 from itertools import chain, repeat, filterfalse
 import os
-import re
 import time
 import yaml
 
@@ -37,31 +36,6 @@ from consts import (VERSION, BINARIES, PROVIDES, DEPENDS, CONFLICTS,
                     SOURCE, SOURCEVER, MAINTAINER, MULTIARCH,
                     ESSENTIAL)
 
-binnmu_re = re.compile(r'^(.*)\+b\d+$')
-
-def same_source(sv1, sv2, binnmu_re=binnmu_re):
-    """Check if two version numbers are built from the same source
-
-    This method returns a boolean value which is true if the two
-    version numbers specified as parameters are built from the same
-    source. The main use of this code is to detect binary-NMU.
-
-    binnmu_re is an optimization to avoid "load global".
-    """
-    if sv1 == sv2:
-        return 1
-
-    m = binnmu_re.match(sv1)
-    if m: sv1 = m.group(1)
-    m = binnmu_re.match(sv2)
-    if m: sv2 = m.group(1)
-
-    if sv1 == sv2:
-        return 1
-
-    return 0
-
-
 def ifilter_except(container, iterable=None):
     """Filter out elements in container
 
@@ -486,7 +460,7 @@ def write_controlfiles(sources, packages, suite, basedir):
     write_sources(sources_s, os.path.join(basedir, 'Sources'))
 
 
-def old_libraries(sources, packages, fucked_arches=frozenset(), same_source=same_source):
+def old_libraries(sources, packages, fucked_arches=frozenset()):
     """Detect old libraries left in testing for smooth transitions
 
     This method detects old libraries which are in testing but no
@@ -497,8 +471,6 @@ def old_libraries(sources, packages, fucked_arches=frozenset(), same_source=same
     For "fucked" architectures, outdated binaries are allowed to be in
     testing, so they are only added to the removal list if they are no longer
     in unstable.
-
-    same_source is an optimisation to avoid "load global".
     """
     sources_t = sources['testing']
     testing = packages['testing']
@@ -507,7 +479,7 @@ def old_libraries(sources, packages, fucked_arches=frozenset(), same_source=same
     for arch in testing:
         for pkg_name in testing[arch][0]:
             pkg = testing[arch][0][pkg_name]
-            if not same_source(sources_t[pkg[SOURCE]][VERSION], pkg[SOURCEVER]) and \
+            if sources_t[pkg[SOURCE]][VERSION] != pkg[SOURCEVER] and \
                 (arch not in fucked_arches or pkg_name not in unstable[arch][0]):
                 migration = "-" + "/".join((pkg_name, arch, pkg[SOURCEVER]))
                 removals.append(MigrationItem(migration))
-- 
2.8.0.rc3
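
For reference, the removed helper considered two versions "from the same source" if they were equal after stripping a binNMU suffix (+bN). A sketch of it, reconstructed from the removed britney_util.py lines:

```python
import re

_BINNMU_RE = re.compile(r'^(.*)\+b\d+$')

def _strip_binnmu(version):
    # Strip a binNMU suffix such as "+b2", if present.
    m = _BINNMU_RE.match(version)
    return m.group(1) if m else version

def same_source(sv1, sv2):
    """The removed helper: versions match after binNMU stripping."""
    return _strip_binnmu(sv1) == _strip_binnmu(sv2)
```

This illustrates both points in the commit message: every remaining caller compared two source versions, where plain `==` suffices, and for a package built with "dpkg-gencontrol -v<version>" the binary version need not resemble the source version at all, so suffix-stripping cannot recover it.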

From bb642a7edb58259b09573bd04c032cde2bdd3ed9 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Wed, 23 Mar 2016 15:10:08 +0000
Subject: [PATCH 12/33] Use pkg_id instead of pkg/arch in BINARIES

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py      | 38 ++++++++++++++++++--------------------
 britney_util.py |  7 +++----
 2 files changed, 21 insertions(+), 24 deletions(-)

diff --git a/britney.py b/britney.py
index 4ab7e18..639a602 100755
--- a/britney.py
+++ b/britney.py
@@ -674,7 +674,6 @@ class Britney(object):
                 if "(" in source:
                     dpkg[SOURCEVER] = intern(source[source.find("(")+1:source.find(")")])
 
-            pkgarch = "%s/%s" % (pkg,arch)
             # if the source package is available in the distribution, then register this binary package
             if dpkg[SOURCE] in sources[distribution]:
                 # There may be multiple versions of any arch:all packages
@@ -683,11 +682,11 @@ class Britney(object):
                 # source -> binary mapping once. It doesn't matter which
                 # of the versions we include as only the package name and
                 # architecture are recorded.
-                if pkgarch not in sources[distribution][dpkg[SOURCE]][BINARIES]:
-                    sources[distribution][dpkg[SOURCE]][BINARIES].append(pkgarch)
+                if pkg_id not in sources[distribution][dpkg[SOURCE]][BINARIES]:
+                    sources[distribution][dpkg[SOURCE]][BINARIES].append(pkg_id)
             # if the source package doesn't exist, create a fake one
             else:
-                sources[distribution][dpkg[SOURCE]] = [dpkg[SOURCEVER], 'faux', [pkgarch], None, True]
+                sources[distribution][dpkg[SOURCE]] = [dpkg[SOURCEVER], 'faux', [pkg_id], None, True]
 
             # register virtual packages and real packages that provide them
             if dpkg[PROVIDES]:
@@ -1151,11 +1150,11 @@ class Britney(object):
         anyworthdoing = False
 
         # for every binary package produced by this source in unstable for this architecture
-        for pkg in sorted(filter(lambda x: x.endswith("/" + arch), source_u[BINARIES]), key=lambda x: x.split("/")[0]):
-            pkg_name = pkg.split("/")[0]
+        for pkg_id in sorted(x for x in source_u[BINARIES] if x[2] == arch):
+            pkg_name = pkg_id[0]
 
             # retrieve the testing (if present) and unstable corresponding binary packages
-            binary_t = pkg in source_t[BINARIES] and self.binaries['testing'][arch][0][pkg_name] or None
+            binary_t = pkg_name in self.binaries['testing'][arch][0] and self.binaries['testing'][arch][0][pkg_name] or None
             binary_u = self.binaries[suite][arch][0][pkg_name]
 
             # this is the source version for the new binary package
@@ -1228,7 +1227,8 @@ class Britney(object):
                                                         arch,
                                                         False)
 
-                for pkg in sorted(x.split("/")[0] for x in source_data[BINARIES] if x.endswith("/"+arch)):
+                for pkg_id in sorted(x for x in source_data[BINARIES] if x[2] == arch):
+                    pkg = pkg_id[0]
                     # if the package is architecture-independent, then ignore it
                     tpkg_data = self.binaries['testing'][arch][0][pkg]
                     if tpkg_data[ARCHITECTURE] == 'all':
@@ -1245,8 +1245,7 @@ class Britney(object):
                             # it "interesting" on its own.  This case happens quite often with smooth updatable
                             # packages, where the old binary "survives" a full run because it still has
                             # reverse dependencies.
-                            name = (pkg, tpkg_data[VERSION], tpkg_data[ARCHITECTURE])
-                            if name not in smoothbins:
+                            if pkg_id not in smoothbins:
                                 anyworthdoing = True
 
         # if there is nothing wrong and there is something worth doing, this is a valid candidate
@@ -1407,7 +1406,7 @@ class Britney(object):
                 # if the package in testing has no binaries on this
                 # architecture, it can't be out-of-date
                 if not any(x for x in self.sources["testing"][src][BINARIES]
-                           if x.endswith("/"+arch) and self.binaries["testing"][arch][0][x.split("/")[0]][ARCHITECTURE] != 'all'):
+                           if x[2] == arch and self.binaries["testing"][arch][0][x[0]][ARCHITECTURE] != 'all'):
                     continue
                     
                 # if the (t-)p-u package has produced any binaries on
@@ -1443,7 +1442,8 @@ class Britney(object):
             oodbins = {}
             uptodatebins = False
             # for every binary package produced by this source in the suite for this architecture
-            for pkg in sorted(x.split("/")[0] for x in self.sources[suite][src][BINARIES] if x.endswith("/"+arch)):
+            for pkg_id in sorted(x for x in self.sources[suite][src][BINARIES] if x[2] == arch):
+                pkg = pkg_id[0]
                 if pkg not in pkgs: pkgs[pkg] = []
                 pkgs[pkg].append(arch)
 
@@ -1920,8 +1920,8 @@ class Britney(object):
                 # remove all the binaries
 
                 # first, build a list of eligible binaries
-                for p in source_data[BINARIES]:
-                    binary, parch = p.split("/")
+                for pkg_id in source_data[BINARIES]:
+                    binary, _, parch = pkg_id
                     if (migration_architecture != 'source'
                         and parch != migration_architecture):
                         continue
@@ -1933,8 +1933,7 @@ class Britney(object):
                     if (not include_hijacked
                         and binaries_t[parch][0][binary][SOURCE] != source_name):
                         continue
-                    version = binaries_t[parch][0][binary][VERSION]
-                    bins.append((binary, version, parch))
+                    bins.append(pkg_id)
 
                 for pkg_id in bins:
                     binary, _, parch = pkg_id
@@ -2008,11 +2007,10 @@ class Britney(object):
         # add the new binary packages (if we are not removing)
         if not is_removal:
             source_data = sources[suite][source_name]
-            for p in source_data[BINARIES]:
-                binary, parch = p.split("/")
+            for pkg_id in source_data[BINARIES]:
+                binary, _, parch = pkg_id
                 if migration_architecture not in ['source', parch]:
                     continue
-                version = self.binaries[suite][parch][0][binary][VERSION]
 
                 if (not include_hijacked
                     and self.binaries[suite][parch][0][binary][SOURCE] != source_name):
@@ -2027,7 +2025,7 @@ class Britney(object):
                                 rms.remove((rm_b, rm_v, rm_p))
                     continue
 
-                adds.add((binary, version, parch))
+                adds.add(pkg_id)
 
         return (adds, rms, smoothbins)
 
diff --git a/britney_util.py b/britney_util.py
index 372609a..4ac2c67 100644
--- a/britney_util.py
+++ b/britney_util.py
@@ -123,12 +123,11 @@ def undo_changes(lundo, inst_tester, sources, binaries, all_binary_packages,
     # undo all new binaries (consequence of the above)
     for (undo, item) in lundo:
         if not item.is_removal and item.package in sources[item.suite]:
-            for p in sources[item.suite][item.package][BINARIES]:
-                binary, arch = p.split("/")
+            for pkg_id in sources[item.suite][item.package][BINARIES]:
+                binary, _, arch = pkg_id
                 if item.architecture in ['source', arch]:
-                    version = binaries["testing"][arch][0][binary][VERSION]
                     del binaries["testing"][arch][0][binary]
-                    inst_tester.remove_testing_binary((binary, version, arch))
+                    inst_tester.remove_testing_binary(pkg_id)
 
 
     # STEP 3
-- 
2.8.0.rc3
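The hunks above switch binary identifiers from "name/arch" strings to (name, version, arch) tuples. A standalone sketch of why that helps (toy data, not britney's real structures):

```python
# Hypothetical illustration: with tuple package ids, filtering and
# sorting no longer need repeated string splitting.
binaries = [
    ("libfoo1", "1.2-1", "amd64"),
    ("foo-doc", "1.2-1", "all"),
    ("libfoo1", "1.2-1", "i386"),
]

# Old style: "name/arch" strings required .split("/") everywhere.
old_style = ["%s/%s" % (b[0], b[2]) for b in binaries]
on_amd64_old = sorted(
    x.split("/")[0] for x in old_style if x.endswith("/amd64"))

# New style: tuples are filtered and compared directly, and also
# carry the version, so no extra table lookup is needed for it.
on_amd64_new = sorted(x for x in binaries if x[2] == "amd64")

print(on_amd64_old)                   # ['libfoo1']
print([x[0] for x in on_amd64_new])   # ['libfoo1']
```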

From 765d6c1ea0fc8b904df79d0b20ee0bc16efec51a Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Wed, 23 Mar 2016 15:21:27 +0000
Subject: [PATCH 13/33] Refactor some code to avoid unnecessary table
 lookups/checks

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 36 +++++++++++++++++++-----------------
 1 file changed, 19 insertions(+), 17 deletions(-)

diff --git a/britney.py b/britney.py
index 639a602..6f5e235 100755
--- a/britney.py
+++ b/britney.py
@@ -1149,13 +1149,16 @@ class Britney(object):
         anywrongver = False
         anyworthdoing = False
 
+        packages_t_a = self.binaries['testing'][arch][0]
+        packages_s_a = self.binaries[suite][arch][0]
+
         # for every binary package produced by this source in unstable for this architecture
         for pkg_id in sorted(x for x in source_u[BINARIES] if x[2] == arch):
             pkg_name = pkg_id[0]
 
             # retrieve the testing (if present) and unstable corresponding binary packages
-            binary_t = pkg_name in self.binaries['testing'][arch][0] and self.binaries['testing'][arch][0][pkg_name] or None
-            binary_u = self.binaries[suite][arch][0][pkg_name]
+            binary_t = pkg_name in packages_t_a and packages_t_a[pkg_name] or None
+            binary_u = packages_s_a[pkg_name]
 
             # this is the source version for the new binary package
             pkgsv = binary_u[SOURCEVER]
@@ -1214,28 +1217,27 @@ class Britney(object):
 
         # if there is nothing wrong and there is something worth doing or the source
         # package is not fake, then check what packages should be removed
-        if not anywrongver and (anyworthdoing or not self.sources[suite][src][FAKESRC]):
-            srcv = self.sources[suite][src][VERSION]
+        if not anywrongver and (anyworthdoing or not source_u[FAKESRC]):
+            srcv = source_u[VERSION]
             ssrc = source_t[VERSION] == srcv
             # if this is a binary-only migration via *pu, we never want to try
             # removing binary packages
             if not (ssrc and suite != 'unstable'):
                 # for every binary package produced by this source in testing for this architecture
-                source_data = self.sources['testing'][src]
                 _, _, smoothbins = self._compute_groups(src,
                                                         "unstable",
                                                         arch,
                                                         False)
 
-                for pkg_id in sorted(x for x in source_data[BINARIES] if x[2] == arch):
+                for pkg_id in sorted(x for x in source_t[BINARIES] if x[2] == arch):
                     pkg = pkg_id[0]
                     # if the package is architecture-independent, then ignore it
-                    tpkg_data = self.binaries['testing'][arch][0][pkg]
+                    tpkg_data = packages_t_a[pkg]
                     if tpkg_data[ARCHITECTURE] == 'all':
                         excuse.addhtml("Ignoring removal of %s as it is arch: all" % (pkg))
                         continue
                     # if the package is not produced by the new source package, then remove it from testing
-                    if pkg not in self.binaries[suite][arch][0]:
+                    if pkg not in packages_s_a:
                         excuse.addhtml("Removed binary: %s %s" % (pkg, tpkg_data[VERSION]))
                         # the removed binary is only interesting if this is a binary-only migration,
                         # as otherwise the updated source will already cause the binary packages
@@ -1397,16 +1399,16 @@ class Britney(object):
                     update_candidate = False
                     excuse.addreason("age")
 
-        if suite in ['pu', 'tpu']:
+        all_binaries = self.all_binaries
+
+        if suite in ('pu', 'tpu') and source_t:
             # o-o-d(ish) checks for (t-)p-u
+            # This only makes sense if the package is actually in testing.
             for arch in self.options.architectures:
-                if src not in self.sources["testing"]:
-                    continue
-                    
                 # if the package in testing has no binaries on this
                 # architecture, it can't be out-of-date
-                if not any(x for x in self.sources["testing"][src][BINARIES]
-                           if x[2] == arch and self.binaries["testing"][arch][0][x[0]][ARCHITECTURE] != 'all'):
+                if not any(x for x in source_t[BINARIES]
+                           if x[2] == arch and all_binaries[x][ARCHITECTURE] != 'all'):
                     continue
                     
                 # if the (t-)p-u package has produced any binaries on
@@ -1442,13 +1444,13 @@ class Britney(object):
             oodbins = {}
             uptodatebins = False
             # for every binary package produced by this source in the suite for this architecture
-            for pkg_id in sorted(x for x in self.sources[suite][src][BINARIES] if x[2] == arch):
+            for pkg_id in sorted(x for x in source_u[BINARIES] if x[2] == arch):
                 pkg = pkg_id[0]
                 if pkg not in pkgs: pkgs[pkg] = []
                 pkgs[pkg].append(arch)
 
                 # retrieve the binary package and its source version
-                binary_u = self.binaries[suite][arch][0][pkg]
+                binary_u = all_binaries[pkg_id]
                 pkgsv = binary_u[SOURCEVER]
 
                 # if it wasn't built by the same source, it is out-of-date
@@ -1512,7 +1514,7 @@ class Britney(object):
                     excuse.addhtml(text)
 
         # if the source package has no binaries, set update_candidate to False to block the update
-        if len(self.sources[suite][src][BINARIES]) == 0:
+        if not source_u[BINARIES]:
             excuse.addhtml("%s has no binaries on any arch" % src)
             excuse.addreason("no-binaries")
             update_candidate = False
-- 
2.8.0.rc3

From 9160fedcf457d230cc4c553253bf551cd8e8bb23 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Wed, 23 Mar 2016 16:26:04 +0000
Subject: [PATCH 14/33] britney.py: Remove unused named parameter

The "include_hijacked" parameter of "_compute_groups" was always
false, so there was little point in having it.

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 11 ++++-------
 1 file changed, 4 insertions(+), 7 deletions(-)

diff --git a/britney.py b/britney.py
index 6f5e235..59d05c3 100755
--- a/britney.py
+++ b/britney.py
@@ -1855,7 +1855,7 @@ class Britney(object):
 
 
     def _compute_groups(self, source_name, suite, migration_architecture,
-                        is_removal, include_hijacked=False,
+                        is_removal,
                         allow_smooth_updates=True,
                         removals=frozenset()):
         """Compute the groups of binaries being migrated by item
@@ -1874,8 +1874,6 @@ class Britney(object):
           architecture).  [Same as item.architecture, where available]
         * "is_removal" is a boolean determining if this is a removal
            or not [Same as item.is_removal, where available]
-        * "include_hijacked" determines whether hijacked binaries should
-          be included in results or not. (defaults: False)
         * "allow_smooth_updates" is a boolean determing whether smooth-
           updates are permitted in this migration.  When set to False,
           the "smoothbins" return value will always be the empty set.
@@ -1932,8 +1930,8 @@ class Britney(object):
                     if migration_architecture == 'source' and is_removal and binary not in binaries_t[parch][0]:
                         continue
 
-                    if (not include_hijacked
-                        and binaries_t[parch][0][binary][SOURCE] != source_name):
+                    # Do not include hijacked binaries
+                    if binaries_t[parch][0][binary][SOURCE] != source_name:
                         continue
                     bins.append(pkg_id)
 
@@ -2014,8 +2012,7 @@ class Britney(object):
                 if migration_architecture not in ['source', parch]:
                     continue
 
-                if (not include_hijacked
-                    and self.binaries[suite][parch][0][binary][SOURCE] != source_name):
+                if self.binaries[suite][parch][0][binary][SOURCE] != source_name:
                     # This binary package has been hijacked by some other source.
                     # So don't add it as part of this update.
                     #
-- 
2.8.0.rc3
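The effect of dropping an always-default parameter, as patch 14 does, can be shown on a toy function (names below are illustrative, not britney's API):

```python
# Before: the flag was threaded through every caller but never set,
# so the branch it guards is dead code.
def compute_groups_before(binaries, source_name, include_hijacked=False):
    return [b for b, src in binaries.items()
            if include_hijacked or src == source_name]

# After: the constant is folded away and the intent kept as a comment.
def compute_groups_after(binaries, source_name):
    # Do not include hijacked binaries (built from a different source).
    return [b for b, src in binaries.items() if src == source_name]

bins = {"libfoo1": "foo", "libbar1": "bar-src"}
assert compute_groups_before(bins, "foo") == compute_groups_after(bins, "foo")
```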

From ab6df65fdf5a2dee4f1ed086fc7e15f17adf42be Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Wed, 23 Mar 2016 16:52:33 +0000
Subject: [PATCH 15/33] britney_util: Remove some unused imports

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney_util.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/britney_util.py b/britney_util.py
index 4ac2c67..9401df6 100644
--- a/britney_util.py
+++ b/britney_util.py
@@ -21,10 +21,9 @@
 # GNU General Public License for more details.
 
 
-import apt_pkg
 from functools import partial
 from datetime import datetime
-from itertools import chain, repeat, filterfalse
+from itertools import filterfalse
 import os
 import time
 import yaml
@@ -36,6 +35,7 @@ from consts import (VERSION, BINARIES, PROVIDES, DEPENDS, CONFLICTS,
                     SOURCE, SOURCEVER, MAINTAINER, MULTIARCH,
                     ESSENTIAL)
 
+
 def ifilter_except(container, iterable=None):
     """Filter out elements in container
 
-- 
2.8.0.rc3

From c34d03ca498f826e51027617e2ae7fa3bedcdc7c Mon Sep 17 00:00:00 2001
From: Emilio Pozuelo Monfort <pochu@debian.org>
Date: Wed, 23 Mar 2016 20:07:25 +0000
Subject: [PATCH 16/33] Fix migration for sources with old cruft that isn't in
 testing

A source package can have old cruft binaries that are no longer in
testing and cannot migrate, because they depend on old libraries or
packages that have since been removed from testing.

There is no point in trying to migrate old cruft that has already
been removed from testing, so don't do that.

Signed-off-by: Emilio Pozuelo Monfort <pochu@debian.org>
Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 6 ++++++
 1 file changed, 6 insertions(+)

diff --git a/britney.py b/britney.py
index 59d05c3..f806047 100755
--- a/britney.py
+++ b/britney.py
@@ -2024,6 +2024,12 @@ class Britney(object):
                                 rms.remove((rm_b, rm_v, rm_p))
                     continue
 
+                # Don't add the binary if it is old cruft that is no longer in testing
+                if (parch not in self.options.fucked_arches and
+                    source_data[VERSION] != self.binaries[suite][parch][0][binary][SOURCEVER] and
+                    binary not in binaries_t[parch][0]):
+                    continue
+
                 adds.add(pkg_id)
 
         return (adds, rms, smoothbins)
-- 
2.8.0.rc3
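The new guard in patch 16 boils down to a simple filter; a hedged sketch with made-up data (real britney also exempts architectures in fucked_arches):

```python
# Skip "cruft" binaries: built from an older source version and no
# longer present in testing, so there is nothing to migrate.
source_version = "2.0-1"
binaries_in_suite = {
    # binary name -> source version it was built from
    "libnew2": "2.0-1",
    "libold1": "1.0-1",   # old cruft
}
binaries_in_testing = {"libnew2"}

adds = set()
for binary, built_from in binaries_in_suite.items():
    if built_from != source_version and binary not in binaries_in_testing:
        continue
    adds.add(binary)

print(sorted(adds))   # ['libnew2']
```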

From ec6f24a68c9c40358db071aa2af8a49e638ccb55 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Thu, 24 Mar 2016 07:43:08 +0000
Subject: [PATCH 17/33] britney.py: Avoid creating empty lists for provides

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/britney.py b/britney.py
index f806047..6623c7c 100755
--- a/britney.py
+++ b/britney.py
@@ -965,7 +965,7 @@ class Britney(object):
     # Utility methods for package analysis
     # ------------------------------------
 
-    def get_dependency_solvers(self, block, packages_s_a):
+    def get_dependency_solvers(self, block, packages_s_a, empty_set=frozenset()):
         """Find the packages which satisfy a dependency block
 
         This method returns the list of packages which satisfy a dependency
@@ -996,7 +996,7 @@ class Britney(object):
                         packages.append(name)
 
             # look for the package in the virtual packages list and loop on them
-            for prov, prov_version in provides_s_a.get(name, []):
+            for prov, prov_version in provides_s_a.get(name, empty_set):
                 if prov not in binaries_s_a:
                     continue
                 # A provides only satisfies:
-- 
2.8.0.rc3
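Patch 17's micro-optimisation is a common Python idiom: share one immutable empty container as a default instead of allocating a fresh list on every miss. A standalone sketch (toy provides table):

```python
# dict.get(name, []) builds a new empty list each time the key is
# missing; a frozenset default argument is created exactly once at
# function definition time and reused for every call.
def solvers(provides_table, name, empty_set=frozenset()):
    return provides_table.get(name, empty_set)

provides = {"mail-transport-agent": {("postfix", "3.5")}}

print(solvers(provides, "mail-transport-agent"))
print(solvers(provides, "no-such-virtual"))   # the shared frozenset()
```

This is safe precisely because the default is immutable; the classic mutable-default-argument pitfall does not apply to frozenset.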

From 34d05dc5824b767386ab03bd27f301ea52669a0a Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Thu, 24 Mar 2016 07:45:04 +0000
Subject: [PATCH 18/33] britney.py: assert provides table is up to date

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/britney.py b/britney.py
index 6623c7c..71bbb89 100755
--- a/britney.py
+++ b/britney.py
@@ -997,8 +997,7 @@ class Britney(object):
 
             # look for the package in the virtual packages list and loop on them
             for prov, prov_version in provides_s_a.get(name, empty_set):
-                if prov not in binaries_s_a:
-                    continue
+                assert prov in binaries_s_a
                 # A provides only satisfies:
                 # - an unversioned dependency (per Policy Manual §7.5)
                 # - a dependency without an architecture qualifier
-- 
2.8.0.rc3

From d850bfd947768f2fd2a2589233e8445e8474164c Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Thu, 24 Mar 2016 07:51:36 +0000
Subject: [PATCH 19/33] Refactor some local expressions

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 7 ++-----
 1 file changed, 2 insertions(+), 5 deletions(-)

diff --git a/britney.py b/britney.py
index 71bbb89..670877a 100755
--- a/britney.py
+++ b/britney.py
@@ -1583,10 +1583,9 @@ class Britney(object):
         This method returns a dictionary where the keys are the package names
         and the values are the excuse names which depend on it.
         """
-        res = {}
+        res = defaultdict(list)
         for exc in self.excuses:
             for d in exc.deps:
-                if d not in res: res[d] = []
                 res[d].append(exc.name)
         return res
 
@@ -1598,9 +1597,7 @@ class Britney(object):
         `valid' and `invalid' excuses.
         """
         # build a lookup-by-name map
-        exclookup = {}
-        for e in self.excuses:
-            exclookup[e.name] = e
+        exclookup = {e.name: e for e in self.excuses}
 
         # build the reverse dependencies
         revdeps = self.reversed_exc_deps()
-- 
2.8.0.rc3
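The two rewrites in patch 19 are standard simplifications; here they are side by side on toy excuse data:

```python
from collections import defaultdict

excuses = [("src-a", ["libc6"]), ("src-b", ["libc6", "zlib1g"])]

# defaultdict(list) removes the "if d not in res: res[d] = []" dance.
res = defaultdict(list)
for name, deps in excuses:
    for d in deps:
        res[d].append(name)

# A dict comprehension replaces the manual lookup-table loop.
exclookup = {name: deps for name, deps in excuses}

print(dict(res))   # {'libc6': ['src-a', 'src-b'], 'zlib1g': ['src-b']}
print(exclookup["src-a"])   # ['libc6']
```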

From ce8adacd55e915b03b2acb0eb83824a7454fc58d Mon Sep 17 00:00:00 2001
From: Anthony Towns <aj@erisian.com.au>
Date: Mon, 27 Apr 2015 02:13:27 +1000
Subject: [PATCH 20/33] britney.py: Add support for multiple components

Adds a --components command line argument (and corresponding config file
option). If specified, package info is expected to be in the usual Debian
mirror layout, i.e.:

   testing/source/Sources
   testing/binary-${ARCH}/Packages

(nthykier: Squashed, rebased and did some porting to Python3)
Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 141 ++++++++++++++++++++++++++++++++++++++++++-------------------
 1 file changed, 98 insertions(+), 43 deletions(-)

diff --git a/britney.py b/britney.py
index 670877a..c6c32b1 100755
--- a/britney.py
+++ b/britney.py
@@ -387,6 +387,8 @@ class Britney(object):
                                help="do not build the non-installability status, use the cache from file")
         parser.add_option("", "--print-uninst", action="store_true", dest="print_uninst", default=False,
                                help="just print a summary of uninstallable packages")
+        parser.add_option("", "--components", action="store", dest="components",
+                               help="Sources/Packages are laid out by the listed components (comma-separated)")
         (self.options, self.args) = parser.parse_args()
         
         # integrity checks
@@ -417,6 +419,11 @@ class Britney(object):
                          not getattr(self.options, k.lower()):
                         setattr(self.options, k.lower(), v)
 
+        if getattr(self.options, "components", None):
+            self.options.components = [s.strip() for s in self.options.components.split(",")]
+        else:
+            self.options.components = None
+
         if not hasattr(self.options, "heidi_delta_output"):
             self.options.heidi_delta_output = self.options.heidi_output + "Delta"
 
@@ -543,19 +550,10 @@ class Britney(object):
     # Data reading/writing methods
     # ----------------------------
 
-    def read_sources(self, basedir, intern=sys.intern):
-        """Read the list of source packages from the specified directory
-        
-        The source packages are read from the `Sources' file within the
-        directory specified as `basedir' parameter. Considering the
-        large amount of memory needed, not all the fields are loaded
-        in memory. The available fields are Version, Maintainer and Section.
+    def _read_sources_file(self, filename, sources=None, intern=sys.intern):
+        if sources is None:
+            sources = {}
 
-        The method returns a list where every item represents a source
-        package as a dictionary.
-        """
-        sources = {}
-        filename = os.path.join(basedir, "Sources")
         self.__log("Loading source packages from %s" % filename)
 
         with open(filename, encoding='utf-8') as f:
@@ -583,38 +581,37 @@ class Britney(object):
                            ]
         return sources
 
-    def read_binaries(self, basedir, distribution, arch, intern=sys.intern):
-        """Read the list of binary packages from the specified directory
-        
-        The binary packages are read from the `Packages_${arch}' files
-        within the directory specified as `basedir' parameter, replacing
-        ${arch} with the value of the arch parameter. Considering the
-        large amount of memory needed, not all the fields are loaded
-        in memory. The available fields are Version, Source, Multi-Arch,
-        Depends, Conflicts, Provides and Architecture.
-        
-        After reading the packages, reverse dependencies are computed
-        and saved in the `rdepends' keys, and the `Provides' field is
-        used to populate the virtual packages list.
+    def read_sources(self, basedir):
+        """Read the list of source packages from the specified directory
 
-        The dependencies are parsed with the apt_pkg.parse_depends method,
-        and they are stored both as the format of its return value and
-        text.
+        The source packages are read from the `Sources' file within the
+        directory specified as `basedir' parameter. Considering the
+        large amount of memory needed, not all the fields are loaded
+        in memory. The available fields are Version, Maintainer and Section.
 
-        The method returns a tuple. The first element is a list where
-        every item represents a binary package as a dictionary; the second
-        element is a dictionary which maps virtual packages to real
-        packages that provide them.
+        The method returns a list where every item represents a source
+        package as a dictionary.
         """
 
-        packages = {}
-        provides = defaultdict(set)
-        sources = self.sources
-        all_binaries = self.all_binaries
+        if self.options.components:
+            sources = {}
+            for component in self.options.components:
+                filename = os.path.join(basedir, component, "source", "Sources")
+                self._read_sources_file(filename, sources)
+        else:
+            filename = os.path.join(basedir, "Sources")
+            sources = self._read_sources_file(filename)
 
-        filename = os.path.join(basedir, "Packages_%s" % arch)
+        return sources
+
+    def _read_packages_file(self, filename, arch, srcdist, packages=None, intern=sys.intern):
         self.__log("Loading binary packages from %s" % filename)
 
+        if packages is None:
+            packages = {}
+
+        all_binaries = self.all_binaries
+
         with open(filename, encoding='utf-8') as f:
             Packages = apt_pkg.TagFile(f)
         get_field = Packages.section.get
@@ -675,20 +672,19 @@ class Britney(object):
                     dpkg[SOURCEVER] = intern(source[source.find("(")+1:source.find(")")])
 
             # if the source package is available in the distribution, then register this binary package
-            if dpkg[SOURCE] in sources[distribution]:
+            if dpkg[SOURCE] in srcdist:
                 # There may be multiple versions of any arch:all packages
                 # (in unstable) if some architectures have out-of-date
                 # binaries.  We only want to include the package in the
                 # source -> binary mapping once. It doesn't matter which
                 # of the versions we include as only the package name and
                 # architecture are recorded.
-                if pkg_id not in sources[distribution][dpkg[SOURCE]][BINARIES]:
-                    sources[distribution][dpkg[SOURCE]][BINARIES].append(pkg_id)
+                if pkg_id not in srcdist[dpkg[SOURCE]][BINARIES]:
+                    srcdist[dpkg[SOURCE]][BINARIES].append(pkg_id)
             # if the source package doesn't exist, create a fake one
             else:
-                sources[distribution][dpkg[SOURCE]] = [dpkg[SOURCEVER], 'faux', [pkg_id], None, True]
+                srcdist[dpkg[SOURCE]] = [dpkg[SOURCEVER], 'faux', [pkg_id], None, True]
 
-            # register virtual packages and real packages that provide them
             if dpkg[PROVIDES]:
                 parts = apt_pkg.parse_depends(dpkg[PROVIDES], False)
                 nprov = []
@@ -706,7 +702,6 @@ class Britney(object):
                         provided = intern(provided)
                         provided_version = intern(provided_version)
                         part = (provided, provided_version, intern(op))
-                        provides[provided].add((pkg, provided_version))
                         nprov.append(part)
                 dpkg[PROVIDES] = nprov
             else:
@@ -719,6 +714,66 @@ class Britney(object):
             else:
                 all_binaries[pkg_id] = dpkg
 
+            # add the resulting dictionary to the package list
+            packages[pkg] = dpkg
+
+        return packages
+
+    def read_binaries(self, basedir, distribution, arch):
+        """Read the list of binary packages from the specified directory
+
+        The binary packages are read from the `Packages' files for `arch'.
+
+        If components are specified, the files
+        for each component are loaded according to the usual Debian mirror
+        layout.
+
+        If no components are specified, a single file named
+        `Packages_${arch}' is expected to be within the directory
+        specified as `basedir' parameter, replacing ${arch} with the
+        value of the arch parameter.
+
+        Considering the
+        large amount of memory needed, not all the fields are loaded
+        in memory. The available fields are Version, Source, Multi-Arch,
+        Depends, Conflicts, Provides and Architecture.
+
+        After reading the packages, reverse dependencies are computed
+        and saved in the `rdepends' keys, and the `Provides' field is
+        used to populate the virtual packages list.
+
+        The dependencies are parsed with the apt_pkg.parse_depends method,
+        and they are stored both as the format of its return value and
+        text.
+
+        The method returns a tuple. The first element is a list where
+        every item represents a binary package as a dictionary; the second
+        element is a dictionary which maps virtual packages to real
+        packages that provide them.
+        """
+
+        if self.options.components:
+            packages = {}
+            for component in self.options.components:
+                filename = os.path.join(basedir,
+                             component, "binary-%s" % arch, "Packages")
+                self._read_packages_file(filename, arch,
+                      self.sources[distribution], packages)
+        else:
+            filename = os.path.join(basedir, "Packages_%s" % arch)
+            packages = self._read_packages_file(filename, arch,
+                             self.sources[distribution])
+
+        # create provides
+        provides = defaultdict(set)
+
+        for pkg, dpkg in packages.items():
+            # register virtual packages and real packages that provide
+            # them
+            for provided_pkg, provided_version, _ in dpkg[PROVIDES]:
+                provides[provided_pkg].add((pkg, provided_version))
+
+
         # return a tuple with the list of real and virtual packages
         return (packages, provides)
 
-- 
2.8.0.rc3
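The path selection logic patch 20 introduces can be sketched in isolation (the helper name is invented; the layouts match the patch):

```python
import os

# With --components, read the standard Debian mirror layout
# (<basedir>/<component>/binary-<arch>/Packages); without it, fall
# back to britney's flat "Packages_<arch>" files.
def packages_files(basedir, arch, components=None):
    if components:
        return [os.path.join(basedir, c, "binary-%s" % arch, "Packages")
                for c in components]
    return [os.path.join(basedir, "Packages_%s" % arch)]

print(packages_files("testing", "amd64", ["main", "contrib"]))
print(packages_files("testing", "amd64"))
```

Reading each component's file into one shared dictionary, as _read_packages_file does, is what lets the rest of britney stay oblivious to how many components exist.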

From 8d13e4489a767334fb8f8c68db485d1a152a8a7e Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Thu, 24 Mar 2016 15:08:18 +0000
Subject: [PATCH 21/33] Make components and --control-files mutually exclusive

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 6 ++++++
 1 file changed, 6 insertions(+)

diff --git a/britney.py b/britney.py
index c6c32b1..99f9078 100755
--- a/britney.py
+++ b/britney.py
@@ -424,6 +424,12 @@ class Britney(object):
         else:
             self.options.components = None
 
+        if self.options.control_files and self.options.components:
+            # We cannot regenerate the control files correctly when reading from an
+            # actual mirror (we don't know which package goes in which component, etc.).
+            self.__log("Cannot use --control-files with mirror-layout (components)!", type="E")
+            sys.exit(1)
+
         if not hasattr(self.options, "heidi_delta_output"):
             self.options.heidi_delta_output = self.options.heidi_output + "Delta"
 
-- 
2.8.0.rc3
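Patch 21's startup check is a plain mutual-exclusion guard; a sketch with invented scaffolding (britney logs and exits, the sketch just reports):

```python
class Options:
    def __init__(self, control_files=False, components=None):
        self.control_files = control_files
        self.components = components

def check_options(options):
    # Control files cannot be regenerated when reading a real mirror,
    # since we no longer know which package belongs to which component.
    if options.control_files and options.components:
        return "Cannot use --control-files with mirror-layout (components)!"
    return None

print(check_options(Options(control_files=True, components=["main"])))
print(check_options(Options(control_files=True)))   # None
```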

From 154e16a8746f98aafa6e908846baffcfd91d5f9a Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Thu, 24 Mar 2016 15:59:41 +0000
Subject: [PATCH 22/33] britney.py: Remove redundant open before TagFile

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 6 ++----
 1 file changed, 2 insertions(+), 4 deletions(-)

diff --git a/britney.py b/britney.py
index 99f9078..50ddde5 100755
--- a/britney.py
+++ b/britney.py
@@ -562,8 +562,7 @@ class Britney(object):
 
         self.__log("Loading source packages from %s" % filename)
 
-        with open(filename, encoding='utf-8') as f:
-            Packages = apt_pkg.TagFile(f)
+        Packages = apt_pkg.TagFile(filename)
         get_field = Packages.section.get
         step = Packages.step
 
@@ -618,8 +617,7 @@ class Britney(object):
 
         all_binaries = self.all_binaries
 
-        with open(filename, encoding='utf-8') as f:
-            Packages = apt_pkg.TagFile(f)
+        Packages = apt_pkg.TagFile(filename)
         get_field = Packages.section.get
         step = Packages.step
 
-- 
2.8.0.rc3

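For reviewers less familiar with apt_pkg: TagFile accepts a filename directly and handles decoding itself, which is what makes the wrapping open() in the removed lines redundant. As context only (this is not part of the series; the helper below is a toy of my own naming), a pure-Python stand-in for the stanza parsing that TagFile performs on Debian control-style files:

```python
def parse_stanzas(text):
    """Split Debian control-style text into stanzas (dicts of fields).

    A toy stand-in for apt_pkg.TagFile: stanzas are separated by blank
    lines, and a line starting with whitespace continues the previous
    field.  Real TagFile handles more edge cases than this sketch.
    """
    stanzas = []
    fields = {}
    last = None
    for line in text.splitlines():
        if not line.strip():
            # Blank line ends the current stanza (if any).
            if fields:
                stanzas.append(fields)
                fields, last = {}, None
            continue
        if line[0] in ' \t' and last:
            # Continuation line: fold into the previous field.
            fields[last] += '\n' + line.strip()
        elif ':' in line:
            key, _, value = line.partition(':')
            last = key.strip()
            fields[last] = value.strip()
    if fields:
        stanzas.append(fields)
    return stanzas
```

The real code then walks the file with `Packages.step()` and reads fields via `Packages.section.get`, but the data model is the same: one dict of fields per stanza.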
From efe351edbbe7054b574f0fe7b792e7e50ce3fa4b Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Fri, 25 Mar 2016 07:43:46 +0000
Subject: [PATCH 23/33] britney.py: Rename __log to log

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 114 ++++++++++++++++++++++++++++++-------------------------------
 1 file changed, 57 insertions(+), 57 deletions(-)

diff --git a/britney.py b/britney.py
index 50ddde5..c8b9388 100755
--- a/britney.py
+++ b/britney.py
@@ -267,7 +267,7 @@ class Britney(object):
             self.hints = self.read_hints(self.options.unstable)
 
         if self.options.nuninst_cache:
-            self.__log("Not building the list of non-installable packages, as requested", type="I")
+            self.log("Not building the list of non-installable packages, as requested", type="I")
             if self.options.print_uninst:
                 print('* summary')
                 print('\n'.join('%4d %s' % (len(nuninst[x]), x) for x in self.options.architectures))
@@ -301,18 +301,18 @@ class Britney(object):
                 # here.
                 self.binaries['pu'][arch] = ({}, {})
 
-        self.__log("Compiling Installability tester", type="I")
+        self.log("Compiling Installability tester", type="I")
         self._build_installability_tester(self.options.architectures)
 
         if not self.options.nuninst_cache:
-            self.__log("Building the list of non-installable packages for the full archive", type="I")
+            self.log("Building the list of non-installable packages for the full archive", type="I")
             nuninst = {}
             self._inst_tester.compute_testing_installability()
             for arch in self.options.architectures:
-                self.__log("> Checking for non-installable packages for architecture %s" % arch, type="I")
+                self.log("> Checking for non-installable packages for architecture %s" % arch, type="I")
                 result = self.get_nuninst(arch, build=True)
                 nuninst.update(result)
-                self.__log("> Found %d non-installable packages" % len(nuninst[arch]), type="I")
+                self.log("> Found %d non-installable packages" % len(nuninst[arch]), type="I")
                 if self.options.print_uninst:
                     self.nuninst_arch_report(nuninst, arch)
 
@@ -324,12 +324,12 @@ class Britney(object):
                 write_nuninst(self.options.noninst_status, nuninst)
 
             stats = self._inst_tester.compute_stats()
-            self.__log("> Installability tester statistics (per architecture)", type="I")
+            self.log("> Installability tester statistics (per architecture)", type="I")
             for arch in self.options.architectures:
                 arch_stat = stats[arch]
-                self.__log(">  %s" % arch, type="I")
+                self.log(">  %s" % arch, type="I")
                 for stat in arch_stat.stat_summary():
-                    self.__log(">  - %s" % stat, type="I")
+                    self.log(">  - %s" % stat, type="I")
 
         # read the release-critical bug summaries for testing and unstable
         self.bugs = {'unstable': self.read_bugs(self.options.unstable),
@@ -349,10 +349,10 @@ class Britney(object):
                 bad.append((f, pkg_entry1[f], pkg_entry2[f]))
 
         if bad:
-            self.__log("Mismatch found %s %s %s differs" % (
+            self.log("Mismatch found %s %s %s differs" % (
                 package, pkg_entry1[VERSION], parch), type="E")
             for f, v1, v2 in bad:
-                self.__log(" ... %s %s != %s" % (check_field_name[f], v1, v2))
+                self.log(" ... %s %s != %s" % (check_field_name[f], v1, v2))
             raise ValueError("Invalid data set")
 
         # Merge ESSENTIAL if necessary
@@ -393,11 +393,11 @@ class Britney(object):
         
         # integrity checks
         if self.options.nuninst_cache and self.options.print_uninst:
-            self.__log("nuninst_cache and print_uninst are mutually exclusive!", type="E")
+            self.log("nuninst_cache and print_uninst are mutually exclusive!", type="E")
             sys.exit(1)
         # if the configuration file exists, then read it and set the additional options
         elif not os.path.isfile(self.options.config):
-            self.__log("Unable to read the configuration file (%s), exiting!" % self.options.config, type="E")
+            self.log("Unable to read the configuration file (%s), exiting!" % self.options.config, type="E")
             sys.exit(1)
 
         # minimum days for unstable-testing transition and the list of hints
@@ -427,7 +427,7 @@ class Britney(object):
         if self.options.control_files and self.options.components:
             # We cannot regenerate the control files correctly when reading from an
             # actual mirror (we don't know which package goes in which component, etc.).
-            self.__log("Cannot use --control-files with mirror-layout (components)!", type="E")
+            self.log("Cannot use --control-files with mirror-layout (components)!", type="E")
             sys.exit(1)
 
         if not hasattr(self.options, "heidi_delta_output"):
@@ -452,7 +452,7 @@ class Britney(object):
             self.options.ignore_cruft == "0":
             self.options.ignore_cruft = False
 
-    def __log(self, msg, type="I"):
+    def log(self, msg, type="I"):
         """Print info messages according to verbosity level
         
         An easy-and-simple log method which prints messages to the standard
@@ -560,7 +560,7 @@ class Britney(object):
         if sources is None:
             sources = {}
 
-        self.__log("Loading source packages from %s" % filename)
+        self.log("Loading source packages from %s" % filename)
 
         Packages = apt_pkg.TagFile(filename)
         get_field = Packages.section.get
@@ -610,7 +610,7 @@ class Britney(object):
         return sources
 
     def _read_packages_file(self, filename, arch, srcdist, packages=None, intern=sys.intern):
-        self.__log("Loading binary packages from %s" % filename)
+        self.log("Loading binary packages from %s" % filename)
 
         if packages is None:
             packages = {}
@@ -695,13 +695,13 @@ class Britney(object):
                 for or_clause in parts:
                     if len(or_clause) != 1:
                         msg = "Ignoring invalid provides in %s: Alternatives [%s]" % (str(pkg_id), str(or_clause))
-                        self.__log(msg, type='W')
+                        self.log(msg, type='W')
                         continue
                     for part in or_clause:
                         provided, provided_version, op = part
                         if op != '' and op != '=':
                             msg = "Ignoring invalid provides in %s: %s (%s %s)" % (str(pkg_id), provided, op, version)
-                            self.__log(msg, type='W')
+                            self.log(msg, type='W')
                             continue
                         provided = intern(provided)
                         provided_version = intern(provided_version)
@@ -795,11 +795,11 @@ class Britney(object):
         """
         bugs = defaultdict(list)
         filename = os.path.join(basedir, "BugsV")
-        self.__log("Loading RC bugs data from %s" % filename)
+        self.log("Loading RC bugs data from %s" % filename)
         for line in open(filename, encoding='ascii'):
             l = line.split()
             if len(l) != 2:
-                self.__log("Malformed line found in line %s" % (line), type='W')
+                self.log("Malformed line found in line %s" % (line), type='W')
                 continue
             pkg = l[0]
             bugs[pkg] += l[1].split(",")
@@ -865,14 +865,14 @@ class Britney(object):
         """
         dates = {}
         filename = os.path.join(basedir, "Dates")
-        self.__log("Loading upload data from %s" % filename)
+        self.log("Loading upload data from %s" % filename)
         for line in open(filename, encoding='ascii'):
             l = line.split()
             if len(l) != 3: continue
             try:
                 dates[l[0]] = (l[1], int(l[2]))
             except ValueError:
-                self.__log("Dates, unable to parse \"%s\"" % line, type="E")
+                self.log("Dates, unable to parse \"%s\"" % line, type="E")
         return dates
 
     def write_dates(self, basedir, dates):
@@ -882,7 +882,7 @@ class Britney(object):
         read_dates.
         """
         filename = os.path.join(basedir, "Dates")
-        self.__log("Writing upload data to %s" % filename)
+        self.log("Writing upload data to %s" % filename)
         with open(filename, 'w', encoding='utf-8') as f:
             for pkg in sorted(dates):
                 f.write("%s %s %d\n" % ((pkg,) + dates[pkg]))
@@ -904,7 +904,7 @@ class Britney(object):
 
         urgencies = {}
         filename = os.path.join(basedir, "Urgency")
-        self.__log("Loading upload urgencies from %s" % filename)
+        self.log("Loading upload urgencies from %s" % filename)
         for line in open(filename, errors='surrogateescape', encoding='ascii'):
             l = line.split()
             if len(l) != 3: continue
@@ -957,9 +957,9 @@ class Britney(object):
             else:
                 filename = os.path.join(basedir, "Hints", who)
                 if not os.path.isfile(filename):
-                    self.__log("Cannot read hints list from %s, no such file!" % filename, type="E")
+                    self.log("Cannot read hints list from %s, no such file!" % filename, type="E")
                     continue
-                self.__log("Loading hints list from %s" % filename)
+                self.log("Loading hints list from %s" % filename)
                 with open(filename, encoding='utf-8') as f:
                     lines = f.readlines()
             for line in lines:
@@ -977,7 +977,7 @@ class Britney(object):
                     continue
                 elif len(l) == 1:
                     # All current hints require at least one argument
-                    self.__log("Malformed hint found in %s: '%s'" % (filename, line), type="W")
+                    self.log("Malformed hint found in %s: '%s'" % (filename, line), type="W")
                 elif l[0] in ["approve", "block", "block-all", "block-udeb", "unblock", "unblock-udeb", "force", "urgent", "remove"]:
                     if l[0] == 'approve': l[0] = 'unblock'
                     for package in l[1:]:
@@ -998,16 +998,16 @@ class Britney(object):
                     if x in ['unblock', 'unblock-udeb']:
                         if apt_pkg.version_compare(hint2.version, hint.version) < 0:
                             # This hint is for a newer version, so discard the old one
-                            self.__log("Overriding %s[%s] = ('%s', '%s') with ('%s', '%s')" %
+                            self.log("Overriding %s[%s] = ('%s', '%s') with ('%s', '%s')" %
                                (x, package, hint2.version, hint2.user, hint.version, hint.user), type="W")
                             hint2.set_active(False)
                         else:
                             # This hint is for an older version, so ignore it in favour of the new one
-                            self.__log("Ignoring %s[%s] = ('%s', '%s'), ('%s', '%s') is higher or equal" %
+                            self.log("Ignoring %s[%s] = ('%s', '%s'), ('%s', '%s') is higher or equal" %
                                (x, package, hint.version, hint.user, hint2.version, hint2.user), type="W")
                             hint.set_active(False)
                     else:
-                        self.__log("Overriding %s[%s] = ('%s', '%s', '%s') with ('%s', '%s', '%s')" %
+                        self.log("Overriding %s[%s] = ('%s', '%s', '%s') with ('%s', '%s', '%s')" %
                            (x, package, hint2.version, hint2.user, hint2.days,
                             hint.version, hint.user, hint.days), type="W")
                         hint2.set_active(False)
@@ -1016,7 +1016,7 @@ class Britney(object):
 
         # Sanity check the hints hash
         if len(hints["block"]) == 0 and len(hints["block-udeb"]) == 0:
-            self.__log("WARNING: No block hints at all, not even udeb ones!", type="W")
+            self.log("WARNING: No block hints at all, not even udeb ones!", type="W")
 
         return hints
 
@@ -1698,7 +1698,7 @@ class Britney(object):
         of this procedure, please refer to the module docstring.
         """
 
-        self.__log("Update Excuses generation started", type="I")
+        self.log("Update Excuses generation started", type="I")
 
         # list of local methods and variables (for better performance)
         sources = self.sources
@@ -1813,15 +1813,15 @@ class Britney(object):
 
         # write excuses to the output file
         if not self.options.dry_run:
-            self.__log("> Writing Excuses to %s" % self.options.excuses_output, type="I")
+            self.log("> Writing Excuses to %s" % self.options.excuses_output, type="I")
             write_excuses(self.excuses, self.options.excuses_output,
                           output_format="legacy-html")
             if hasattr(self.options, 'excuses_yaml_output'):
-                self.__log("> Writing YAML Excuses to %s" % self.options.excuses_yaml_output, type="I")
+                self.log("> Writing YAML Excuses to %s" % self.options.excuses_yaml_output, type="I")
                 write_excuses(self.excuses, self.options.excuses_yaml_output,
                           output_format="yaml")
 
-        self.__log("Update Excuses generation completed", type="I")
+        self.log("Update Excuses generation completed", type="I")
 
     # Upgrade run
     # -----------
@@ -2570,27 +2570,27 @@ class Britney(object):
 
 
     def assert_nuninst_is_correct(self):
-        self.__log("> Update complete - Verifying non-installability counters", type="I")
+        self.log("> Update complete - Verifying non-installability counters", type="I")
 
         cached_nuninst = self.nuninst_orig
         self._inst_tester.compute_testing_installability()
         computed_nuninst = self.get_nuninst(build=True)
         if cached_nuninst != computed_nuninst:
-            self.__log("==================== NUNINST OUT OF SYNC =========================", type="E")
+            self.log("==================== NUNINST OUT OF SYNC =========================", type="E")
             for arch in self.options.architectures:
                 expected_nuninst = set(cached_nuninst[arch])
                 actual_nuninst = set(computed_nuninst[arch])
                 false_negatives = actual_nuninst - expected_nuninst
                 false_positives = expected_nuninst - actual_nuninst
                 if false_negatives:
-                    self.__log(" %s - unnoticed nuninst: %s" % (arch, str(false_negatives)), type="E")
+                    self.log(" %s - unnoticed nuninst: %s" % (arch, str(false_negatives)), type="E")
                 if false_positives:
-                    self.__log(" %s - invalid nuninst: %s" % (arch, str(false_positives)), type="E")
-                self.__log(" %s - actual nuninst: %s" % (arch, str(actual_nuninst)), type="I")
-                self.__log("==================== NUNINST OUT OF SYNC =========================", type="E")
+                    self.log(" %s - invalid nuninst: %s" % (arch, str(false_positives)), type="E")
+                self.log(" %s - actual nuninst: %s" % (arch, str(actual_nuninst)), type="I")
+                self.log("==================== NUNINST OUT OF SYNC =========================", type="E")
             raise AssertionError("NUNINST OUT OF SYNC")
 
-        self.__log("> All non-installability counters are ok", type="I")
+        self.log("> All non-installability counters are ok", type="I")
 
 
     def upgrade_testing(self):
@@ -2601,11 +2601,11 @@ class Britney(object):
         commands.
         """
 
-        self.__log("Starting the upgrade test", type="I")
+        self.log("Starting the upgrade test", type="I")
         self.output_write("Generated on: %s\n" % (time.strftime("%Y.%m.%d %H:%M:%S %z", time.gmtime(time.time()))))
         self.output_write("Arch order is: %s\n" % ", ".join(self.options.architectures))
 
-        self.__log("> Calculating current uninstallability counters", type="I")
+        self.log("> Calculating current uninstallability counters", type="I")
         self.nuninst_orig = self.get_nuninst()
         # nuninst_orig may get updated during the upgrade process
         self.nuninst_orig_save = self.get_nuninst()
@@ -2660,7 +2660,7 @@ class Britney(object):
         # obsolete source packages
         # a package is obsolete if none of the binary packages in testing
         # are built by it
-        self.__log("> Removing obsolete source packages from testing", type="I")
+        self.log("> Removing obsolete source packages from testing", type="I")
         # local copies for performance
         sources = self.sources['testing']
         binaries = self.binaries['testing']
@@ -2677,7 +2677,7 @@ class Britney(object):
 
         # smooth updates
         if self.options.smooth_updates:
-            self.__log("> Removing old packages left in testing from smooth updates", type="I")
+            self.log("> Removing old packages left in testing from smooth updates", type="I")
             removals = old_libraries(self.sources, self.binaries, self.options.fucked_arches)
             if removals:
                 self.output_write("Removing packages left in testing for smooth updates (%d):\n%s" % \
@@ -2696,7 +2696,7 @@ class Britney(object):
         if not self.options.dry_run:
             # re-write control files
             if self.options.control_files:
-                self.__log("Writing new testing control files to %s" %
+                self.log("Writing new testing control files to %s" %
                            self.options.testing)
                 write_controlfiles(self.sources, self.binaries,
                                    'testing', self.options.testing)
@@ -2708,20 +2708,20 @@ class Britney(object):
                 self.write_dates(self.options.testing, self.dates)
 
             # write HeidiResult
-            self.__log("Writing Heidi results to %s" % self.options.heidi_output)
+            self.log("Writing Heidi results to %s" % self.options.heidi_output)
             write_heidi(self.options.heidi_output, self.sources["testing"],
                         self.binaries["testing"])
 
-            self.__log("Writing delta to %s" % self.options.heidi_delta_output)
+            self.log("Writing delta to %s" % self.options.heidi_delta_output)
             write_heidi_delta(self.options.heidi_delta_output,
                               self.all_selected)
 
 
         self.printuninstchange()
-        self.__log("Test completed!", type="I")
+        self.log("Test completed!", type="I")
 
     def printuninstchange(self):
-        self.__log("Checking for newly uninstallable packages", type="I")
+        self.log("Checking for newly uninstallable packages", type="I")
         text = eval_uninst(self.options.architectures, newly_uninst(
                         self.nuninst_orig_save, self.nuninst_orig))
 
@@ -2735,7 +2735,7 @@ class Britney(object):
         This method provides a command line interface for the release team to
         try hints and evaluate the results.
         """
-        self.__log("> Calculating current uninstallability counters", type="I")
+        self.log("> Calculating current uninstallability counters", type="I")
         self.nuninst_orig = self.get_nuninst()
         self.nuninst_orig_save = self.get_nuninst()
 
@@ -2781,7 +2781,7 @@ class Britney(object):
         try:
             readline.write_history_file(histfile)
         except IOError as e:
-            self.__log("Could not write %s: %s" % (histfile, e), type="W")
+            self.log("Could not write %s: %s" % (histfile, e), type="W")
 
     def do_hint(self, hinttype, who, pkgvers):
         """Process hints
@@ -2795,7 +2795,7 @@ class Britney(object):
         else:
             _pkgvers = pkgvers
 
-        self.__log("> Processing '%s' hint from %s" % (hinttype, who), type="I")
+        self.log("> Processing '%s' hint from %s" % (hinttype, who), type="I")
         self.output_write("Trying %s from %s: %s\n" % (hinttype, who, " ".join("%s/%s" % (x.uvname, x.version) for x in _pkgvers)))
 
         ok = True
@@ -2851,7 +2851,7 @@ class Britney(object):
         excuses relationships. If they build a circular dependency, which we already
         know as not-working with the standard do_all algorithm, try to `easy` them.
         """
-        self.__log("> Processing hints from the auto hinter",
+        self.log("> Processing hints from the auto hinter",
                    type="I")
 
         # consider only excuses which are valid candidates
@@ -2968,9 +2968,9 @@ class Britney(object):
             else:
                 self.upgrade_testing()
 
-            self.__log('> Stats from the installability tester', type="I")
+            self.log('> Stats from the installability tester', type="I")
             for stat in self._inst_tester.stats.stats():
-                self.__log('>   %s' % stat, type="I")
+                self.log('>   %s' % stat, type="I")
 
     def _installability_test(self, pkg_name, pkg_id, broken, to_check, nuninst_arch):
         """Test for installability of a package on an architecture
-- 
2.8.0.rc3

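A side note on why this rename is more than cosmetic: a leading double underscore triggers Python name mangling, so a method named __log is only reachable under its mangled name from outside the defining class. The next patch has another object (the new hint parser) log through the Britney instance, which the mangled name would prevent. An illustrative sketch (class bodies trimmed down, not the real britney.py):

```python
class Britney:
    def __log(self, msg):
        # Stored on the class as _Britney__log due to name mangling.
        return "LOG: " + msg

    def log(self, msg):
        # Public name, callable from collaborating classes.
        return "LOG: " + msg


class HintParser:
    def __init__(self, britney):
        self._britney = britney

    def log(self, msg):
        # self._britney.__log(msg) would raise AttributeError here:
        # inside HintParser it mangles to _HintParser__log.
        return self._britney.log(msg)
```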
From b0cf611d2eec0bf4455b28ea0dab3010de1ebd20 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Fri, 25 Mar 2016 07:49:31 +0000
Subject: [PATCH 24/33] Move some of the hint parsing into hints.py

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py |  34 ++++----------------
 hints.py   | 105 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++-----
 2 files changed, 104 insertions(+), 35 deletions(-)

diff --git a/britney.py b/britney.py
index c8b9388..09a3acb 100755
--- a/britney.py
+++ b/britney.py
@@ -198,7 +198,7 @@ from urllib.parse import quote
 from installability.builder import InstallabilityTesterBuilder
 from excuse import Excuse
 from migrationitem import MigrationItem
-from hints import HintCollection
+from hints import HintParser
 from britney_util import (old_libraries_format, undo_changes,
                           compute_reverse_tree,
                           read_nuninst, write_nuninst, write_heidi,
@@ -948,12 +948,13 @@ class Britney(object):
         The method returns a dictionary where the key is the command, and
         the value is the list of affected packages.
         """
-        hints = HintCollection()
+        hint_parser = HintParser(self)
 
         for who in self.HINTS.keys():
             if who == 'command-line':
                 lines = self.options.hints and self.options.hints.split(';') or ()
                 filename = '<cmd-line>'
+                hint_parser.parse_hints(who, self.HINTS[who], filename, lines)
             else:
                 filename = os.path.join(basedir, "Hints", who)
                 if not os.path.isfile(filename):
@@ -961,32 +962,9 @@ class Britney(object):
                     continue
                 self.log("Loading hints list from %s" % filename)
                 with open(filename, encoding='utf-8') as f:
-                    lines = f.readlines()
-            for line in lines:
-                line = line.strip()
-                if line == "": continue
-                l = line.split()
-                if l[0] == 'finished':
-                    break
-                if l[0] == 'remark':
-                    # Ignore "no-op" hint, the sole purpose of which is to be
-                    # found by hint grep (and show up in "d"'s
-                    # output).
-                    continue
-                elif l[0] not in self.HINTS[who]:
-                    continue
-                elif len(l) == 1:
-                    # All current hints require at least one argument
-                    self.log("Malformed hint found in %s: '%s'" % (filename, line), type="W")
-                elif l[0] in ["approve", "block", "block-all", "block-udeb", "unblock", "unblock-udeb", "force", "urgent", "remove"]:
-                    if l[0] == 'approve': l[0] = 'unblock'
-                    for package in l[1:]:
-                        hints.add_hint('%s %s' % (l[0], package), who)
-                elif l[0] in ["age-days"]:
-                    for package in l[2:]:
-                        hints.add_hint('%s %s %s' % (l[0], l[1], package), who)
-                else:
-                    hints.add_hint(l, who)
+                    hint_parser.parse_hints(who, self.HINTS[who], filename, f)
+
+        hints = hint_parser.hints
 
         for x in ["block", "block-all", "block-udeb", "unblock", "unblock-udeb", "force", "urgent", "remove", "age-days"]:
             z = {}
diff --git a/hints.py b/hints.py
index 6643257..48efdb1 100644
--- a/hints.py
+++ b/hints.py
@@ -16,6 +16,11 @@ from __future__ import print_function
 
 from migrationitem import MigrationItem
 
+
+class MalformedHintException(Exception):
+    pass
+
+
 class HintCollection(object):
     def __init__(self):
         self._hints = []
@@ -35,10 +40,8 @@ class HintCollection(object):
                ]
 
     def add_hint(self, hint, user):
-        try:
-            self._hints.append(Hint(hint, user))
-        except AssertionError:
-            print("Ignoring broken hint %r from %s" % (hint, user))
+        self._hints.append(Hint(hint, user))
+
 
 class Hint(object):
     NO_VERSION = [ 'block', 'block-all', 'block-udeb' ]
@@ -48,7 +51,7 @@ class Hint(object):
         self._user = user
         self._active = True
         self._days = None
-        if isinstance(hint, list):
+        if isinstance(hint, list) or isinstance(hint, tuple):
             self._type = hint[0]
             self._packages = hint[1:]
         else:
@@ -71,9 +74,11 @@ class Hint(object):
     def check(self):
         for package in self.packages:
             if self.type in self.__class__.NO_VERSION:
-                assert package.version is None, package
+                if package.version is not None:
+                    raise MalformedHintException("\"%s\" needs unversioned packages, got \"%s\"" % (self.type, package))
             else:
-                assert package.version is not None, package
+                if package.version is None:
+                    raise MalformedHintException("\"%s\" needs versioned packages, got \"%s\"" % (self.type, package))
 
     def set_active(self, active):
         self._active = active
@@ -125,3 +130,89 @@ class Hint(object):
         else:
             return None
 
+
+def age_day_hint(hints, who, hint_name, new_age, *args):
+    for package in args:
+        hints.add_hint('%s %s %s' % (hint_name, new_age, package), who)
+
+
+def split_into_one_hint_per_package(hints, who, hint_name, *args):
+    for package in args:
+        hints.add_hint('%s %s' % (hint_name, package), who)
+
+
+def single_hint_taking_list_of_packages(hints, who, *args):
+    hints.add_hint(args, who)
+
+
+class HintParser(object):
+
+    def __init__(self, britney):
+        self._britney = britney
+        self.hints = HintCollection()
+        self._hint_table = {
+            'remark': (0, lambda *x: None),
+
+            # Migration grouping hints
+            'easy': (2, single_hint_taking_list_of_packages), # Easy needs at least 2 to make sense
+            'force-hint': (1, single_hint_taking_list_of_packages),
+            'hint': (1, single_hint_taking_list_of_packages),
+
+            # Age / urgent
+            'urgent': (1, split_into_one_hint_per_package),
+            'age-days': (2, age_day_hint),
+
+            # Block / freeze related hints
+            'block': (1, split_into_one_hint_per_package),
+            'block-all': (1, split_into_one_hint_per_package),
+            'block-udeb': (1, split_into_one_hint_per_package),
+            'unblock': (1, split_into_one_hint_per_package),
+            'unblock-udeb': (1, split_into_one_hint_per_package),
+
+            # Other
+            'remove': (1, split_into_one_hint_per_package),
+            'force': (1, split_into_one_hint_per_package),
+        }
+        self._aliases = {
+            'approve': 'unblock',
+        }
+
+    def parse_hints(self, who, permitted_hints, filename, lines):
+        hint_table = self._hint_table
+        line_no = 0
+        hints = self.hints
+        aliases = self._aliases
+        for line in lines:
+            line = line.strip()
+            line_no += 1
+            if line == "" or line.startswith('#'):
+                continue
+            l = line.split()
+            hint_name = l[0]
+            if hint_name in aliases:
+                hint_name = aliases[hint_name]
+                l[0] = hint_name
+            if hint_name == 'finished':
+                break
+            if hint_name not in hint_table:
+                self.log("Unknown hint found in %s (line %d): '%s'" % (filename, line_no, line), type="W")
+                continue
+            if hint_name not in permitted_hints:
+                reason = 'The hint is not a part of the permitted hints for ' + who
+                self.log("Ignoring \"%s\" hint from %s found in %s (line %d): %s" % (
+                    hint_name, who, filename, line_no, reason), type="I")
+                continue
+            min_args, hint_parser_impl = hint_table[hint_name]
+            if len(l) - 1 < min_args:
+                self.log("Malformed hint found in %s (line %d): Needs at least %d argument(s), got %d" % (
+                    filename, line_no, min_args, len(l) - 1), type="W")
+                continue
+            try:
+                hint_parser_impl(hints, who, *l)
+            except MalformedHintException as e:
+                self.log("Malformed hint found in %s (line %d): \"%s\"" % (
+                    filename, line_no, e.args[0]), type="W")
+                continue
+
+    def log(self, msg, type="I"):
+        self._britney.log(msg, type=type)
-- 
2.8.0.rc3

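The shape of the new parser in hints.py is a dispatch table mapping each hint name to a pair of (minimum argument count, handler), with an alias table applied first. A condensed sketch of that structure (simplified names and handlers for illustration, not the actual code):

```python
def split_per_package(collected, who, hint_name, *packages):
    # One hint per package argument, e.g. "unblock foo/1.0 bar/2.0"
    # becomes two separate unblock hints.
    for pkg in packages:
        collected.append((who, '%s %s' % (hint_name, pkg)))


def one_hint_many_packages(collected, who, *args):
    # A single hint covering all listed packages, e.g. "easy a/1 b/2".
    collected.append((who, ' '.join(args)))


HINT_TABLE = {
    'unblock': (1, split_per_package),
    'easy': (2, one_hint_many_packages),  # needs >= 2 to make sense
}
ALIASES = {'approve': 'unblock'}


def parse_lines(who, lines):
    collected = []
    for line in lines:
        l = line.strip().split()
        if not l or l[0].startswith('#'):
            continue
        name = ALIASES.get(l[0], l[0])
        l[0] = name
        if name == 'finished':
            break
        entry = HINT_TABLE.get(name)
        if entry is None:
            continue  # unknown hint; the real code logs a warning
        min_args, handler = entry
        if len(l) - 1 < min_args:
            continue  # malformed; the real code logs a warning
        handler(collected, who, *l)
    return collected
```

Compared with the old if/elif chain in read_hints, adding a hint type is now a one-line table entry, and the minimum-argument check is enforced uniformly instead of per-branch.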
From d1c00231a0d7e996ee79e3da27381a56680a2019 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Fri, 25 Mar 2016 09:18:35 +0000
Subject: [PATCH 25/33] Make excuses a dict rather than a list

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py      | 61 +++++++++++++++++++++++++++------------------------------
 britney_util.py | 15 +++++++-------
 2 files changed, 36 insertions(+), 40 deletions(-)

diff --git a/britney.py b/britney.py
index 09a3acb..bd59770 100755
--- a/britney.py
+++ b/britney.py
@@ -260,6 +260,7 @@ class Britney(object):
         self.sources = {}
         self.binaries = {}
         self.all_selected = []
+        self.excuses = {}
 
         try:
             self.hints = self.read_hints(self.options.hintsdir)
@@ -339,7 +340,6 @@ class Britney(object):
         # read additional data
         self.dates = self.read_dates(self.options.testing)
         self.urgencies = self.read_urgencies(self.options.testing)
-        self.excuses = []
 
     def merge_pkg_entries(self, package, parch, pkg_entry1, pkg_entry2,
                           check_fields=check_fields, check_field_name=check_field_name):
@@ -1140,11 +1140,11 @@ class Britney(object):
                 "(check https://release.debian.org/testing/freeze_policy.html if update is needed)" % hint.user)
             excuse.addhtml("Not considered")
             excuse.addreason("block")
-            self.excuses.append(excuse)
+            self.excuses[excuse.name] = excuse
             return False
 
         excuse.is_valid = True
-        self.excuses.append(excuse)
+        self.excuses[excuse.name] = excuse
         return True
 
     def should_upgrade_srcarch(self, src, arch, suite):
@@ -1178,7 +1178,7 @@ class Britney(object):
             excuse.addhtml("Trying to remove package, not update it")
             excuse.addhtml("Not considered")
             excuse.addreason("remove")
-            self.excuses.append(excuse)
+            self.excuses[excuse.name] = excuse
             return False
 
         # the starting point is that there is nothing wrong and nothing worth doing
@@ -1289,12 +1289,12 @@ class Britney(object):
         # if there is nothing wrong and there is something worth doing, this is a valid candidate
         if not anywrongver and anyworthdoing:
             excuse.is_valid = True
-            self.excuses.append(excuse)
+            self.excuses[excuse.name] = excuse
             return True
         # else if there is something worth doing (but something wrong, too) this package won't be considered
         elif anyworthdoing:
             excuse.addhtml("Not considered")
-            self.excuses.append(excuse)
+            self.excuses[excuse.name] = excuse
 
         # otherwise, return False
         return False
@@ -1334,7 +1334,7 @@ class Britney(object):
         # if the version in unstable is older, then stop here with a warning in the excuse and return False
         if source_t and apt_pkg.version_compare(source_u[VERSION], source_t[VERSION]) < 0:
             excuse.addhtml("ALERT: %s is newer in testing (%s %s)" % (src, source_t[VERSION], source_u[VERSION]))
-            self.excuses.append(excuse)
+            self.excuses[excuse.name] = excuse
             excuse.addreason("newerintesting")
             return False
 
@@ -1611,7 +1611,7 @@ class Britney(object):
             # TODO
             excuse.addhtml("Not considered")
 
-        self.excuses.append(excuse)
+        self.excuses[excuse.name] = excuse
         return update_candidate
 
     def reversed_exc_deps(self):
@@ -1621,7 +1621,7 @@ class Britney(object):
         and the values are the excuse names which depend on it.
         """
         res = defaultdict(list)
-        for exc in self.excuses:
+        for exc in self.excuses.values():
             for d in exc.deps:
                 res[d].append(exc.name)
         return res
@@ -1633,8 +1633,7 @@ class Britney(object):
         on invalid excuses. The two parameters contains the list of
         `valid' and `invalid' excuses.
         """
-        # build a lookup-by-name map
-        exclookup = {e.name: e for e in self.excuses}
+        excuses = self.excuses
 
         # build the reverse dependencies
         revdeps = self.reversed_exc_deps()
@@ -1653,19 +1652,19 @@ class Britney(object):
             # loop on the reverse dependencies
             for x in revdeps[invalid[i]]:
                 # if the item is valid and it is marked as `dontinvalidate', skip the item
-                if x in valid and exclookup[x].dontinvalidate:
+                if x in valid and excuses[x].dontinvalidate:
                     continue
 
                 # otherwise, invalidate the dependency and mark as invalidated and
                 # remove the depending excuses
-                exclookup[x].invalidate_dep(invalid[i])
+                excuses[x].invalidate_dep(invalid[i])
                 if x in valid:
                     p = valid.index(x)
                     invalid.append(valid.pop(p))
-                    exclookup[x].addhtml("Invalidated by dependency")
-                    exclookup[x].addhtml("Not considered")
-                    exclookup[x].addreason("depends")
-                    exclookup[x].is_valid = False
+                    excuses[x].addhtml("Invalidated by dependency")
+                    excuses[x].addhtml("Not considered")
+                    excuses[x].addreason("depends")
+                    excuses[x].is_valid = False
             i = i + 1
  
     def write_excuses(self):
@@ -1689,7 +1688,7 @@ class Britney(object):
         # if a package is going to be removed, it will have a "-" prefix
         upgrade_me = []
 
-        self.excuses = []
+        excuses = self.excuses = {}
 
         # for every source package in testing, check if it should be removed
         for pkg in sources['testing']:
@@ -1743,16 +1742,13 @@ class Britney(object):
             excuse.addhtml("Removal request by %s" % (item.user))
             excuse.addhtml("Package is broken, will try to remove")
             excuse.addreason("remove")
-            self.excuses.append(excuse)
-
-        # sort the excuses by daysold and name
-        self.excuses.sort(key=lambda x: x.sortkey())
+            excuses[excuse.name] = excuse
 
         # extract the not considered packages, which are in the excuses but not in upgrade_me
-        unconsidered = [e.name for e in self.excuses if e.name not in upgrade_me]
+        unconsidered = [ename for ename in excuses if ename not in upgrade_me]
 
         # invalidate impossible excuses
-        for e in self.excuses:
+        for e in excuses.values():
             # parts[0] == package name
             # parts[1] == optional architecture
             parts = e.name.split('/')
@@ -1792,12 +1788,13 @@ class Britney(object):
         # write excuses to the output file
         if not self.options.dry_run:
             self.log("> Writing Excuses to %s" % self.options.excuses_output, type="I")
-            write_excuses(self.excuses, self.options.excuses_output,
+            sorted_excuses = sorted(excuses.values(), key=lambda x: x.sortkey())
+            write_excuses(sorted_excuses, self.options.excuses_output,
                           output_format="legacy-html")
             if hasattr(self.options, 'excuses_yaml_output'):
                 self.log("> Writing YAML Excuses to %s" % self.options.excuses_yaml_output, type="I")
-                write_excuses(self.excuses, self.options.excuses_yaml_output,
-                          output_format="yaml")
+                write_excuses(sorted_excuses, self.options.excuses_yaml_output,
+                              output_format="yaml")
 
         self.log("Update Excuses generation completed", type="I")
 
@@ -2834,12 +2831,12 @@ class Britney(object):
 
         # consider only excuses which are valid candidates
         valid_excuses = frozenset(y.uvname for y in self.upgrade_me)
-        excuses = dict((x.name, x) for x in self.excuses if x.name in valid_excuses)
-        excuses_deps = dict((name, set(excuse.deps)) for name, excuse in excuses.items())
+        excuses = self.excuses
+        excuses_deps = dict((name, set(excuse.deps)) for name, excuse in excuses.items() if name in valid_excuses)
         sources_t = self.sources['testing']
 
         def find_related(e, hint, circular_first=False):
-            if e not in excuses:
+            if e not in valid_excuses:
                 return False
             excuse = excuses[e]
             if e in sources_t and sources_t[e][VERSION] == excuse.ver[1]:
@@ -2858,7 +2855,7 @@ class Britney(object):
         candidates = []
         mincands = []
         seen_hints = set()
-        for e in excuses:
+        for e in valid_excuses:
             excuse = excuses[e]
             if e in sources_t and sources_t[e][VERSION] == excuse.ver[1]:
                 continue
@@ -2878,7 +2875,7 @@ class Britney(object):
 
                 for item, ver in items:
                     # excuses which depend on "item" or are depended on by it
-                    new_items = [(x, excuses[x].ver[1]) for x in excuses if \
+                    new_items = [(x, excuses[x].ver[1]) for x in valid_excuses if \
                        (item in excuses_deps[x] or x in excuses_deps[item]) \
                        and (x, excuses[x].ver[1]) not in seen_items]
                     items.extend(new_items)
diff --git a/britney_util.py b/britney_util.py
index 9401df6..eb6a529 100644
--- a/britney_util.py
+++ b/britney_util.py
@@ -342,7 +342,7 @@ def make_migrationitem(package, sources, VERSION=VERSION):
     return MigrationItem("%s/%s" % (item.uvname, sources[item.suite][item.package][VERSION]))
 
 
-def write_excuses(excuses, dest_file, output_format="yaml"):
+def write_excuses(excuselist, dest_file, output_format="yaml"):
     """Write the excuses to dest_file
 
     Writes a list of excuses in a specified output_format to the
@@ -351,12 +351,11 @@ def write_excuses(excuses, dest_file, output_format="yaml"):
     """
     if output_format == "yaml":
         with open(dest_file, 'w', encoding='utf-8') as f:
-            excuselist = []
-            for e in excuses:
-                excuselist.append(e.excusedata())
-            excusesdata = {}
-            excusesdata["sources"] = excuselist
-            excusesdata["generated-date"] = datetime.utcnow()
+            edatalist = [e.excusedata() for e in excuselist]
+            excusesdata = {
+                'sources': edatalist,
+                'generated-date': datetime.utcnow(),
+            }
             f.write(yaml.dump(excusesdata, default_flow_style=False, allow_unicode=True))
     elif output_format == "legacy-html":
         with open(dest_file, 'w', encoding='utf-8') as f:
@@ -365,7 +364,7 @@ def write_excuses(excuses, dest_file, output_format="yaml"):
             f.write("<meta http-equiv=\"Content-Type\" content=\"text/html;charset=utf-8\"></head><body>\n")
             f.write("<p>Generated: " + time.strftime("%Y.%m.%d %H:%M:%S %z", time.gmtime(time.time())) + "</p>\n")
             f.write("<ul>\n")
-            for e in excuses:
+            for e in excuselist:
                 f.write("<li>%s" % e.html())
             f.write("</ul></body></html>\n")
     else:
-- 
2.8.0.rc3
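For reviewers, a minimal, hypothetical sketch of the pattern PATCH 25 introduces (the class and sort key below are invented, not britney's real Excuse interface): keeping excuses in a dict keyed by name gives O(1) lookup, makes the separate `{e.name: e}` lookup map redundant, and defers sorting to output time.

```python
# Hypothetical sketch; the real Excuse class in britney is richer.
class Excuse:
    def __init__(self, name, daysold=0):
        self.name = name
        self.daysold = daysold

    def sortkey(self):
        # invented key: oldest excuses first, then by name
        return (-self.daysold, self.name)

excuses = {}
for name, age in [("bar", 3), ("foo", 10)]:
    e = Excuse(name, age)
    excuses[e.name] = e        # replaces self.excuses.append(e)

# lookup by name no longer needs a separate lookup map
assert excuses["foo"].daysold == 10

# sorting happens only when writing the output files
ordered = sorted(excuses.values(), key=lambda x: x.sortkey())
assert [e.name for e in ordered] == ["foo", "bar"]
```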

From 9f5e3bbf4cfd9dad4cf64bdafd0817f15487436a Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Fri, 25 Mar 2016 12:20:09 +0000
Subject: [PATCH 26/33] Move two installability testing functions to
 britney_util

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py      | 59 ++-------------------------------------------------------
 britney_util.py | 57 +++++++++++++++++++++++++++++++++++++++++++++++++++++++
 2 files changed, 59 insertions(+), 57 deletions(-)

diff --git a/britney.py b/britney.py
index bd59770..c0c4cfd 100755
--- a/britney.py
+++ b/britney.py
@@ -205,7 +205,7 @@ from britney_util import (old_libraries_format, undo_changes,
                           eval_uninst, newly_uninst, make_migrationitem,
                           write_excuses, write_heidi_delta, write_controlfiles,
                           old_libraries, is_nuninst_asgood_generous,
-                          clone_nuninst)
+                          clone_nuninst, check_installability)
 from consts import (VERSION, SECTION, BINARIES, MAINTAINER, FAKESRC,
                    SOURCE, SOURCEVER, ARCHITECTURE, DEPENDS, CONFLICTS,
                    PROVIDES, MULTIARCH, ESSENTIAL)
@@ -2245,33 +2245,6 @@ class Britney(object):
         # return the package name, the suite, the list of affected packages and the undo dictionary
         return (affected, undo)
 
-
-    def _check_packages(self, binaries, arch, affected, check_archall, nuninst):
-        broken = nuninst[arch + "+all"]
-        to_check = []
-
-        # broken packages (first round)
-        for pkg_id in (x for x in affected if x[2] == arch):
-            name, version, parch = pkg_id
-            if name not in binaries[parch][0]:
-                continue
-            pkgdata = binaries[parch][0][name]
-            if version != pkgdata[VERSION]:
-                # Not the version in testing right now, ignore
-                continue
-            actual_arch = pkgdata[ARCHITECTURE]
-            nuninst_arch = None
-            # only check arch:all packages if requested
-            if check_archall or actual_arch != 'all':
-                nuninst_arch = nuninst[parch]
-            elif actual_arch == 'all':
-                nuninst[parch].discard(name)
-            self._installability_test(name, pkg_id, broken, to_check, nuninst_arch)
-
-        # We have always overshot the affected set, so to_check does not
-        # contain anything new.
-        assert affected.issuperset(to_check)
-
     def try_migration(self, actions, nuninst_now, lundo=None, automatic_revert=True):
         is_accepted = True
         affected_architectures = set()
@@ -2327,7 +2300,7 @@ class Britney(object):
         for arch in affected_architectures:
             check_archall = arch in nobreakall_arches
 
-            self._check_packages(packages_t, arch, affected, check_archall, nuninst_after)
+            check_installability(self._inst_tester, packages_t, arch, affected, check_archall, nuninst_after)
 
             # if the uninstallability counter is worse than before, break the loop
             if automatic_revert and len(nuninst_after[arch]) > len(nuninst_now[arch]):
@@ -2947,34 +2920,6 @@ class Britney(object):
             for stat in self._inst_tester.stats.stats():
                 self.log('>   %s' % stat, type="I")
 
-    def _installability_test(self, pkg_name, pkg_id, broken, to_check, nuninst_arch):
-        """Test for installability of a package on an architecture
-
-        (pkg_name, pkg_version, pkg_arch) is the package to check.
-
-        broken is the set of broken packages.  If p changes
-        installability (e.g. goes from uninstallable to installable),
-        broken will be updated accordingly.  Furthermore, p will be
-        added to "to_check" for futher processing.
-
-        If nuninst_arch is not None then it also updated in the same
-        way as broken is.
-        """
-        r = self._inst_tester.is_installable(pkg_id)
-        if not r:
-            # not installable
-            if pkg_name not in broken:
-                broken.add(pkg_name)
-                to_check.append(pkg_id)
-            if nuninst_arch is not None and pkg_name not in nuninst_arch:
-                nuninst_arch.add(pkg_name)
-        else:
-            if pkg_name in broken:
-                to_check.append(pkg_id)
-                broken.remove(pkg_name)
-            if nuninst_arch is not None and pkg_name in nuninst_arch:
-                nuninst_arch.remove(pkg_name)
-
 
 if __name__ == '__main__':
     Britney().main()
diff --git a/britney_util.py b/britney_util.py
index eb6a529..5da1905 100644
--- a/britney_util.py
+++ b/britney_util.py
@@ -518,3 +518,60 @@ def clone_nuninst(nuninst, packages_s, architectures):
         clone[arch] = set(x for x in nuninst[arch] if x in packages_s[arch][0])
         clone[arch + "+all"] = set(x for x in nuninst[arch + "+all"] if x in packages_s[arch][0])
     return clone
+
+
+def test_installability(inst_tester, pkg_name, pkg_id, broken, to_check, nuninst_arch):
+    """Test for installability of a package on an architecture
+
+    (pkg_name, pkg_version, pkg_arch) is the package to check.
+
+    broken is the set of broken packages.  If p changes
+    installability (e.g. goes from uninstallable to installable),
+    broken will be updated accordingly.  Furthermore, p will be
+    added to "to_check" for futher processing.
+
+    If nuninst_arch is not None then it also updated in the same
+    way as broken is.
+    """
+    r = inst_tester.is_installable(pkg_id)
+    if not r:
+        # not installable
+        if pkg_name not in broken:
+            broken.add(pkg_name)
+            to_check.append(pkg_id)
+        if nuninst_arch is not None and pkg_name not in nuninst_arch:
+            nuninst_arch.add(pkg_name)
+    else:
+        if pkg_name in broken:
+            to_check.append(pkg_id)
+            broken.remove(pkg_name)
+        if nuninst_arch is not None and pkg_name in nuninst_arch:
+            nuninst_arch.remove(pkg_name)
+
+
+def check_installability(inst_tester, binaries, arch, affected, check_archall, nuninst):
+    broken = nuninst[arch + "+all"]
+    to_check = []
+    packages_t_a = binaries[arch][0]
+
+    # broken packages (first round)
+    for pkg_id in (x for x in affected if x[2] == arch):
+        name, version, parch = pkg_id
+        if name not in packages_t_a:
+            continue
+        pkgdata = packages_t_a[name]
+        if version != pkgdata[VERSION]:
+            # Not the version in testing right now, ignore
+            continue
+        actual_arch = pkgdata[ARCHITECTURE]
+        nuninst_arch = None
+        # only check arch:all packages if requested
+        if check_archall or actual_arch != 'all':
+            nuninst_arch = nuninst[parch]
+        elif actual_arch == 'all':
+            nuninst[parch].discard(name)
+        test_installability(inst_tester, name, pkg_id, broken, to_check, nuninst_arch)
+
+    # We have always overshot the affected set, so to_check does not
+    # contain anything new.
+    assert affected.issuperset(to_check)
-- 
2.8.0.rc3

From e977bc117325cb4377947c55ae19d3a51d3f3e2c Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Fri, 25 Mar 2016 12:33:33 +0000
Subject: [PATCH 27/33] britney.py: Log when old libs are present but not
 removed

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)

diff --git a/britney.py b/britney.py
index c0c4cfd..8b9eaa4 100755
--- a/britney.py
+++ b/britney.py
@@ -2624,16 +2624,17 @@ class Britney(object):
             self.do_all(actions=removals)
 
         # smooth updates
+        removals = old_libraries(self.sources, self.binaries, self.options.fucked_arches)
         if self.options.smooth_updates:
             self.log("> Removing old packages left in testing from smooth updates", type="I")
-            removals = old_libraries(self.sources, self.binaries, self.options.fucked_arches)
             if removals:
                 self.output_write("Removing packages left in testing for smooth updates (%d):\n%s" % \
                     (len(removals), old_libraries_format(removals)))
                 self.do_all(actions=removals)
                 removals = old_libraries(self.sources, self.binaries, self.options.fucked_arches)
         else:
-            removals = ()
+            self.log("> Not removing old packages left in testing from smooth updates (smooth-updates disabled)",
+                     type="I")
 
         self.output_write("List of old libraries in testing (%d):\n%s" % \
              (len(removals), old_libraries_format(removals)))
-- 
2.8.0.rc3

From 58caa08ab6b9a82bb789c065e29a1f9c7c0c9ac5 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Fri, 25 Mar 2016 12:38:36 +0000
Subject: [PATCH 28/33] britney.py: Remove commented-out print debugging

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 2 --
 1 file changed, 2 deletions(-)

diff --git a/britney.py b/britney.py
index 8b9eaa4..9bd6537 100755
--- a/britney.py
+++ b/britney.py
@@ -2098,8 +2098,6 @@ class Britney(object):
                                                item.architecture,
                                                item.is_removal,
                                                removals=removals)
-        #print("+++ %s" % (sorted(updates)))
-        #print("--- %s" % (sorted(rms)))
 
         # remove all binary packages (if the source already exists)
         if item.architecture == 'source' or not item.is_removal:
-- 
2.8.0.rc3

From 22777655af319e6bddec9ea0d2009ad5b0f44d44 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Sat, 26 Mar 2016 11:37:55 +0000
Subject: [PATCH 29/33] Replace some string concat+split with tuples

Replace some of the:

 p = binary + "/" + parch
 binary, parch = p.split("/")

with regular tuples.

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py      | 15 +++++++--------
 britney_util.py |  6 +++---
 2 files changed, 10 insertions(+), 11 deletions(-)

diff --git a/britney.py b/britney.py
index 9bd6537..f9616a6 100755
--- a/britney.py
+++ b/britney.py
@@ -2123,7 +2123,7 @@ class Britney(object):
                 # remove all the binaries which aren't being smooth updated
                 for rm_pkg_id in rms:
                     binary, version, parch = rm_pkg_id
-                    p = binary + "/" + parch
+                    p = (binary, parch)
                     binaries_t_a, provides_t_a = packages_t[parch]
                     pkey = (binary, parch)
 
@@ -2139,7 +2139,7 @@ class Britney(object):
 
                     # remove the provided virtual packages
                     for j, prov_version, _ in pkg_data[PROVIDES]:
-                        key = j + "/" + parch
+                        key = (j, parch)
                         if key not in undo['virtual']:
                             undo['virtual'][key] = provides_t_a[j].copy()
                         provides_t_a[j].remove((binary, prov_version))
@@ -2162,7 +2162,7 @@ class Britney(object):
             binaries_t_a = packages_t[item.architecture][0]
             version = binaries_t_a[item.package][VERSION]
             pkg_id = (item.package, version, item.architecture)
-            undo['binaries'][item.package + "/" + item.architecture] = pkg_id
+            undo['binaries'][(item.package, item.architecture)] = pkg_id
             affected.add(pkg_id)
             affected.update(inst_tester.reverse_dependencies_of(pkg_id))
             del binaries_t_a[item.package]
@@ -2174,7 +2174,6 @@ class Britney(object):
 
             for updated_pkg_id in updates:
                 binary, new_version, parch = updated_pkg_id
-                p = "%s/%s" % (binary, parch)
                 key = (binary, parch)
                 binaries_t_a, provides_t_a = packages_t[parch]
                 equivalent_replacement = key in eqv_set
@@ -2191,7 +2190,7 @@ class Britney(object):
                     old_version = old_pkg_data[VERSION]
                     old_pkg_id = (binary, old_version, parch)
                     # save the old binary package
-                    undo['binaries'][p] = old_pkg_id
+                    undo['binaries'][key] = old_pkg_id
                     if not equivalent_replacement:
                         # all the reverse dependencies are affected by
                         # the change
@@ -2211,8 +2210,8 @@ class Britney(object):
                     # ignored as their reverse trees are already handled
                     # by this function
                     for (tundo, tpkg) in hint_undo:
-                        if p in tundo['binaries']:
-                            tpkg_id = tundo['binaries'][p]
+                        if key in tundo['binaries']:
+                            tpkg_id = tundo['binaries'][key]
                             affected.update(inst_tester.reverse_dependencies_of(tpkg_id))
 
                 # add/update the binary package from the source suite
@@ -2221,7 +2220,7 @@ class Britney(object):
                 inst_tester.add_testing_binary(updated_pkg_id)
                 # register new provided packages
                 for j, prov_version, _ in new_pkg_data[PROVIDES]:
-                    key = j + "/" + parch
+                    key = (j, parch)
                     if j not in provides_t_a:
                         undo['nvirtual'].append(key)
                         provides_t_a[j] = set()
diff --git a/britney_util.py b/britney_util.py
index 5da1905..ca472e1 100644
--- a/britney_util.py
+++ b/britney_util.py
@@ -134,7 +134,7 @@ def undo_changes(lundo, inst_tester, sources, binaries, all_binary_packages,
     # undo all other binary package changes (except virtual packages)
     for (undo, item) in lundo:
         for p in undo['binaries']:
-            binary, arch = p.split("/")
+            binary, arch = p
             if binary[0] == "-":
                 version = binaries["testing"][arch][0][binary][VERSION]
                 del binaries['testing'][arch][0][binary[1:]]
@@ -152,10 +152,10 @@ def undo_changes(lundo, inst_tester, sources, binaries, all_binary_packages,
     # undo all changes to virtual packages
     for (undo, item) in lundo:
         for p in undo['nvirtual']:
-            j, arch = p.split("/")
+            j, arch = p
             del binaries['testing'][arch][1][j]
         for p in undo['virtual']:
-            j, arch = p.split("/")
+            j, arch = p
             if j[0] == '-':
                 del binaries['testing'][arch][1][j[1:]]
             else:
-- 
2.8.0.rc3
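A small sketch (with invented package data) of why the tuple keys in PATCH 29 are preferable to `"binary/parch"` strings: the tuple avoids the concat/split round-trip and unpacks without any string parsing.

```python
# Old style: build a string key, then split it back apart later.
p = "dpkg" + "/" + "amd64"
binary, parch = p.split("/")

# New style: the tuple itself is the dict key; unpacking is direct.
undo_binaries = {}
key = ("dpkg", "amd64")
undo_binaries[key] = ("dpkg", "1.18.4", "amd64")   # invented pkg_id
binary, parch = key
assert (binary, parch) == ("dpkg", "amd64")
```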

From a911af1f239e2b4c1e95f9481fce55e253a50d34 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Sat, 26 Mar 2016 14:46:44 +0000
Subject: [PATCH 30/33] Remove unused named parameter

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 installability/tester.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/installability/tester.py b/installability/tester.py
index 23818e5..7643596 100644
--- a/installability/tester.py
+++ b/installability/tester.py
@@ -337,7 +337,7 @@ class InstallabilityTester(object):
         #     of t via recursion (calls _check_inst).  In this case
         #     check and choices are not (always) empty.
 
-        def _prune_choices(rebuild, set=set, len=len):
+        def _prune_choices(rebuild, len=len):
             """Picks a choice from choices and updates rebuild.
 
             Prunes the choices and updates "rebuild" to reflect the
-- 
2.8.0.rc3

From 2d42774f8cb5710710a9ebedd1b6309aa6642be4 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Sat, 26 Mar 2016 18:03:12 +0000
Subject: [PATCH 31/33] Remove redundant assert

We asserted that a list of packages was a subset of the "affected"
set.  This was a given, since the list was seeded *only* with packages
from affected (and only if their installability changed).

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney_util.py | 13 +++----------
 1 file changed, 3 insertions(+), 10 deletions(-)

diff --git a/britney_util.py b/britney_util.py
index ca472e1..b99c388 100644
--- a/britney_util.py
+++ b/britney_util.py
@@ -520,15 +520,14 @@ def clone_nuninst(nuninst, packages_s, architectures):
     return clone
 
 
-def test_installability(inst_tester, pkg_name, pkg_id, broken, to_check, nuninst_arch):
+def test_installability(inst_tester, pkg_name, pkg_id, broken, nuninst_arch):
     """Test for installability of a package on an architecture
 
     (pkg_name, pkg_version, pkg_arch) is the package to check.
 
     broken is the set of broken packages.  If p changes
     installability (e.g. goes from uninstallable to installable),
-    broken will be updated accordingly.  Furthermore, p will be
-    added to "to_check" for futher processing.
+    broken will be updated accordingly.
 
     If nuninst_arch is not None then it also updated in the same
     way as broken is.
@@ -538,12 +537,10 @@ def test_installability(inst_tester, pkg_name, pkg_id, broken, to_check, nuninst
         # not installable
         if pkg_name not in broken:
             broken.add(pkg_name)
-            to_check.append(pkg_id)
         if nuninst_arch is not None and pkg_name not in nuninst_arch:
             nuninst_arch.add(pkg_name)
     else:
         if pkg_name in broken:
-            to_check.append(pkg_id)
             broken.remove(pkg_name)
         if nuninst_arch is not None and pkg_name in nuninst_arch:
             nuninst_arch.remove(pkg_name)
@@ -551,7 +548,6 @@ def test_installability(inst_tester, pkg_name, pkg_id, broken, to_check, nuninst
 
 def check_installability(inst_tester, binaries, arch, affected, check_archall, nuninst):
     broken = nuninst[arch + "+all"]
-    to_check = []
     packages_t_a = binaries[arch][0]
 
     # broken packages (first round)
@@ -570,8 +566,5 @@ def check_installability(inst_tester, binaries, arch, affected, check_archall, n
             nuninst_arch = nuninst[parch]
         elif actual_arch == 'all':
             nuninst[parch].discard(name)
-        test_installability(inst_tester, name, pkg_id, broken, to_check, nuninst_arch)
+        test_installability(inst_tester, name, pkg_id, broken, nuninst_arch)
 
-    # We have always overshot the affected set, so to_check does not
-    # contain anything new.
-    assert affected.issuperset(to_check)
-- 
2.8.0.rc3
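To illustrate the reasoning behind PATCH 31 (package ids invented): `to_check` is only ever extended with elements taken from `affected`, so the removed assert could never fire.

```python
affected = {("a", "1", "amd64"), ("b", "2", "amd64"), ("c", "3", "i386")}
to_check = []
for pkg_id in affected:        # the only place to_check grows
    if pkg_id[2] == "amd64":   # items may be filtered out, never invented
        to_check.append(pkg_id)
# trivially true by construction -- hence the assert was redundant
assert set(to_check) <= affected
```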

From e1bb8db4c64f8ed6dee1959b417fc2a1ff787f0d Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Sat, 26 Mar 2016 18:07:37 +0000
Subject: [PATCH 32/33] Remove two redundant ifs duplicating an earlier if

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney_util.py | 9 ++++-----
 1 file changed, 4 insertions(+), 5 deletions(-)

diff --git a/britney_util.py b/britney_util.py
index b99c388..2e366ec 100644
--- a/britney_util.py
+++ b/britney_util.py
@@ -429,7 +429,8 @@ def write_controlfiles(sources, packages, suite, basedir):
                 output = "Package: %s\n" % pkg
                 bin_data = binaries[pkg]
                 for key, k in key_pairs:
-                    if not bin_data[key]: continue
+                    if not bin_data[key]:
+                        continue
                     if key == SOURCE:
                         src = bin_data[SOURCE]
                         if sources_s[src][MAINTAINER]:
@@ -446,11 +447,9 @@ def write_controlfiles(sources, packages, suite, basedir):
                                 source = src
                         output += (k + ": " + source + "\n")
                     elif key == PROVIDES:
-                        if bin_data[key]:
-                            output += (k + ": " + ", ".join(relation_atom_to_string(p) for p in bin_data[key]) + "\n")
+                        output += (k + ": " + ", ".join(relation_atom_to_string(p) for p in bin_data[key]) + "\n")
                     elif key == ESSENTIAL:
-                        if bin_data[key]:
-                            output += (k + ": " + " yes\n")
+                        output += (k + ": " + " yes\n")
                     else:
                         output += (k + ": " + bin_data[key] + "\n")
                 f.write(output + "\n")
-- 
2.8.0.rc3

From 47cc4c0a110be90e2e5e3c1a23f7c6ce24d76114 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Sat, 26 Mar 2016 18:42:52 +0000
Subject: [PATCH 33/33] Avoid some unnecessary effort in computing "affected"

 * A package being removed is *not* affected
   - It will just be filtered out later in "check_packages"
 * Since all transitive reverse dependencies will be added to
   "affected" at the end of the method, there is no reason to
   find the immediate set of reverse dependencies of a package
   if the package itself is added as well.

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 7 +------
 1 file changed, 1 insertion(+), 6 deletions(-)

diff --git a/britney.py b/britney.py
index f9616a6..365fe3c 100755
--- a/britney.py
+++ b/britney.py
@@ -2133,7 +2133,6 @@ class Britney(object):
                     if pkey not in eqv_set:
                         # all the reverse dependencies are affected by
                         # the change
-
                         affected.update(inst_tester.reverse_dependencies_of(rm_pkg_id))
                         affected.update(inst_tester.negative_dependencies_of(rm_pkg_id))
 
@@ -2163,7 +2162,6 @@ class Britney(object):
             version = binaries_t_a[item.package][VERSION]
             pkg_id = (item.package, version, item.architecture)
             undo['binaries'][(item.package, item.architecture)] = pkg_id
-            affected.add(pkg_id)
             affected.update(inst_tester.reverse_dependencies_of(pkg_id))
             del binaries_t_a[item.package]
             inst_tester.remove_testing_binary(pkg_id)
@@ -2192,10 +2190,8 @@ class Britney(object):
                     # save the old binary package
                     undo['binaries'][key] = old_pkg_id
                     if not equivalent_replacement:
-                        # all the reverse dependencies are affected by
-                        # the change
-                        affected.update(inst_tester.reverse_dependencies_of(old_pkg_id))
                         # all the reverse conflicts
+                        affected.update(inst_tester.reverse_dependencies_of(old_pkg_id))
                         affected.update(inst_tester.negative_dependencies_of(old_pkg_id))
                     inst_tester.remove_testing_binary(old_pkg_id)
                 elif hint_undo:
@@ -2230,7 +2226,6 @@ class Britney(object):
                 if not equivalent_replacement:
                     # all the reverse dependencies are affected by the change
                     affected.add(updated_pkg_id)
-                    affected.update(inst_tester.reverse_dependencies_of(updated_pkg_id))
                     affected.update(inst_tester.negative_dependencies_of(updated_pkg_id))
 
             # add/update the source package
-- 
2.8.0.rc3

Attachment: signature.asc
Description: OpenPGP digital signature

