
[ITM] Britney2 - Mirror layout support, improved excuses.yaml and constraints/faux package support



Hi,

I have created another Britney branch that I would like to merge into
master[1].

 * Proposed deadline for review: 2016-05-18 (~1½ weeks from now)

Key highlights are:

 * Full support for regular mirrors (as read-only data sources)
   - Albeit still lacking partial suite support.
   - Notably database files are moved "out" of "testing" and "unstable"
     data directories.

 * More machine-parsable excuses in excuses.yaml
   - Now including missing builds and (most) dependency issues

 * All current "faux package" use cases are supported directly in
   Britney.
   - Notably constraints offer a stronger guarantee for keeping
     packages installable.

 * [code-style] Binary packages are now namedtuples and (handled as)
   immutable objects.
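
For consumers of the improved excuses.yaml, processing could look roughly
like this (a sketch only; the 'source'/'is-candidate'/'reason' field names
are assumptions about the schema, not a guaranteed format):

```python
# Sketch of consuming a machine-parsable excuses.yaml; the field names
# ('source', 'is-candidate', 'reason') are illustrative assumptions.
# A real consumer would start from something like:
#   excuses = yaml.safe_load(open('excuses.yaml'))['sources']

def blocked_items(excuses):
    """Return (source, reasons) pairs for items that are not migration candidates."""
    return [(e['source'], e.get('reason', []))
            for e in excuses
            if not e.get('is-candidate', False)]

example = [
    {'source': 'foo', 'is-candidate': True, 'reason': []},
    {'source': 'bar', 'is-candidate': False, 'reason': ['missing-build']},
]
print(blocked_items(example))
```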


ALONG WITH THE MERGE
====================

As a part of the merge, I intend to /also/ change the britney1
repository.  Notably, I want to:

 * Migrate the faux packages to use Britney's "native" faux package
   support and constraints.
   - This will obsolete most of britney1.git/fauxpkg

 * Move the location of the age, urgency and bugs files.
   - They would be moved to their new names and be placed in e.g.
     /srv/release.debian.org/britney/state

I am also considering removing the "pkglist" and having Britney2 read the
data directly from the mirror itself.  It will not have any adverse
effects on the ability to do extra runs/re-runs, but we would have to
remove the "--control-files" argument.
  On the plus side, we would be able to reclaim all of
"~release/britney/var/data*", which is about 2.7GB.

TEST SUITE
==========

There are no regressions or test result improvements in the current test
suite. However, I have added new tests to cover some of the new
features. They are available from:

https://anonscm.debian.org/cgit/collab-maint/britney2-tests.git/log/?h=britney-fixes-2016-03-merge-round2

CONSTRAINTS VS. FAUX PACKAGES
=============================

Previously we have used faux packages for two purposes:

 * Create fake packages to satisfy dependencies for packages in
   non-free/contrib that depend on packages not in Debian
   - Like vendor specific machine configuration packages

 * To ensure certain packages were always present and installable
   in testing.
   - This includes basically d-i packages plus all of the task packages.

In the patch series, I have made these use-cases distinct and the
feature for the latter is called a "constraint" (bike-shedding welcome).
Noteworthy features of constraints:

 * Migrations will be rolled back if they cause regressions on
   constraints (for non-BREAK_ARCHES).
   - Even if they would improve nuninst counters.
   - Accordingly, Britney will never do an "uninstallability trade"
     at the cost of a constraint violation.

 * Constraints work like ratchets: Once satisfied, they will be
   enforced.
   - Each constraint is checked in isolation and Britney will not trade
     one constraint for another (although we could implement that feature
     if you think it has value).

 * Constraints are still subject to BREAK_ARCHES and violations that
   occur only on BREAK_ARCHES will be ignored (like we do with nuninst
   counters).
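
In pseudo-Python, the ratchet semantics above could be sketched as follows
(illustrative only, not Britney's actual code; violations are tracked per
architecture so that BREAK_ARCHES can be masked out):

```python
# Illustrative sketch of the "ratchet" check: a constraint that was
# satisfied before a migration must still be satisfied afterwards,
# otherwise the migration is rolled back.  Each constraint is checked
# in isolation (no trading one constraint for another).

def constraints_regressed(before, after, break_arches=frozenset()):
    """Return the constraints satisfied before but violated after.

    `before`/`after` map constraint name -> set of architectures on
    which the constraint is violated (empty set == satisfied).
    Violations confined to break architectures are ignored.
    """
    regressions = []
    for name, old_broken in before.items():
        new_broken = after.get(name, set()) - break_arches
        if not (old_broken - break_arches) and new_broken:
            regressions.append(name)
    return regressions

before = {'debian-installer': set(), 'task-desktop': {'kfreebsd-i386'}}
after = {'debian-installer': {'armel'}, 'task-desktop': {'kfreebsd-i386'}}
print(constraints_regressed(before, after, break_arches={'kfreebsd-i386'}))
```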

For now, I have opted for implementing constraints as actual (fake)
packages for two reasons:

 * It was simple and easily permitted reuse of existing optimisations to
   avoid unnecessary installability testing.

 * It meant that the output format would remain compatible (and we
   didn't have to patch testing.pl, etc. to learn about constraints).

That said, I do not necessarily intend to keep this implementation
detail forever.
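
For illustration, implementing a constraint as a fake package boils down to
synthesizing a binary package entry whose dependency field encodes the
constraint; a minimal sketch (the `FakeBinary` fields and helper below are
hypothetical, not Britney's exact namedtuple layout):

```python
from collections import namedtuple

# Minimal sketch (field names are illustrative, not Britney's exact
# binary-package layout) of synthesizing a fake binary package whose
# Depends field encodes a constraint: the fake package is "installable"
# exactly when all of its dependency targets are, so the existing
# installability tester can check it like any real package.

FakeBinary = namedtuple('FakeBinary', ['name', 'version', 'architecture', 'depends'])

def make_constraint_package(name, arch, required_packages):
    # One fake binary per (constraint, architecture).
    return FakeBinary(
        name='%s-constraint' % name,
        version='1',
        architecture=arch,
        depends=', '.join(required_packages),
    )

pkg = make_constraint_package('keep-installable', 'amd64',
                              ['debian-installer', 'task-desktop'])
print(pkg.depends)
```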

Thanks,
~Niels


[1] Also available via:

https://anonscm.debian.org/cgit/users/nthykier/britney.git/log/?h=britney-fixes-2016-03

(Named after when the branch started rather than anything sensible)
From d6c980fea3b6eb49daa2fc5a87f1b776946c65b0 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Fri, 25 Mar 2016 15:23:34 +0000
Subject: [PATCH 01/28] Move age-handling into a separate file

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py           | 167 ++++++++---------------------------
 excuse.py            |   1 +
 policies/__init__.py |   0
 policies/policy.py   | 242 +++++++++++++++++++++++++++++++++++++++++++++++++++
 4 files changed, 281 insertions(+), 129 deletions(-)
 create mode 100644 policies/__init__.py
 create mode 100644 policies/policy.py

diff --git a/britney.py b/britney.py
index 7518b60..928315d 100755
--- a/britney.py
+++ b/britney.py
@@ -57,7 +57,7 @@ Other than source and binary packages, Britney loads the following data:
     of a source package (see Britney.read_dates).
 
   * Urgencies, which contains the urgency of the upload of a given
-    version of a source package (see Britney.read_urgencies).
+    version of a source package (see AgePolicy._read_urgencies).
 
   * Hints, which contains lists of commands which modify the standard behaviour
     of Britney (see Britney.read_hints).
@@ -206,6 +206,7 @@ from britney_util import (old_libraries_format, undo_changes,
                           write_excuses, write_heidi_delta, write_controlfiles,
                           old_libraries, is_nuninst_asgood_generous,
                           clone_nuninst, check_installability)
+from policies.policy import AgePolicy, PolicyVerdict
 from consts import (VERSION, SECTION, BINARIES, MAINTAINER, FAKESRC,
                    SOURCE, SOURCEVER, ARCHITECTURE, DEPENDS, CONFLICTS,
                    PROVIDES, MULTIARCH, ESSENTIAL)
@@ -248,10 +249,9 @@ class Britney(object):
         This method initializes and populates the data lists, which contain all
         the information needed by the other methods of the class.
         """
-        # britney's "day" begins at 3pm
-        self.date_now = int(((time.time() / (60*60)) - 15) / 24)
 
         # parse the command line arguments
+        self.policies = []
         self.__parse_arguments()
         MigrationItem.set_architectures(self.options.architectures)
 
@@ -336,10 +336,9 @@ class Britney(object):
         self.bugs = {'unstable': self.read_bugs(self.options.unstable),
                      'testing': self.read_bugs(self.options.testing),}
         self.normalize_bugs()
-
-        # read additional data
-        self.dates = self.read_dates(self.options.testing)
-        self.urgencies = self.read_urgencies(self.options.testing)
+        for policy in self.policies:
+            policy.hints = self.hints
+            policy.initialise(self)
 
     def merge_pkg_entries(self, package, parch, pkg_entry1, pkg_entry2,
                           check_fields=check_fields, check_field_name=check_field_name):
@@ -402,7 +401,7 @@ class Britney(object):
 
         # minimum days for unstable-testing transition and the list of hints
         # are handled as an ad-hoc case
-        self.MINDAYS = {}
+        MINDAYS = {}
         self.HINTS = {'command-line': self.HINTS_ALL}
         with open(self.options.config, encoding='utf-8') as config:
             for line in config:
@@ -411,7 +410,7 @@ class Britney(object):
                     k = k.strip()
                     v = v.strip()
                     if k.startswith("MINDAYS_"):
-                        self.MINDAYS[k.split("_")[1].lower()] = int(v)
+                        MINDAYS[k.split("_")[1].lower()] = int(v)
                     elif k.startswith("HINTS_"):
                         self.HINTS[k.split("_")[1].lower()] = \
                             reduce(lambda x,y: x+y, [hasattr(self, "HINTS_" + i) and getattr(self, "HINTS_" + i) or (i,) for i in v.split()])
@@ -452,6 +451,8 @@ class Britney(object):
             self.options.ignore_cruft == "0":
             self.options.ignore_cruft = False
 
+        self.policies.append(AgePolicy(self.options, MINDAYS))
+
     def log(self, msg, type="I"):
         """Print info messages according to verbosity level
         
@@ -863,90 +864,6 @@ class Britney(object):
             if maxvert is None:
                 self.bugs['testing'][pkg] = []
 
-    def read_dates(self, basedir):
-        """Read the upload date for the packages from the specified directory
-        
-        The upload dates are read from the `Dates' file within the directory
-        specified as `basedir' parameter. The file contains rows with the
-        format:
-
-        <package-name> <version> <date-of-upload>
-
-        The dates are expressed as the number of days from 1970-01-01.
-
-        The method returns a dictionary where the key is the binary package
-        name and the value is a tuple with two items, the version and the date.
-        """
-        dates = {}
-        filename = os.path.join(basedir, "Dates")
-        self.log("Loading upload data from %s" % filename)
-        for line in open(filename, encoding='ascii'):
-            l = line.split()
-            if len(l) != 3: continue
-            try:
-                dates[l[0]] = (l[1], int(l[2]))
-            except ValueError:
-                self.log("Dates, unable to parse \"%s\"" % line, type="E")
-        return dates
-
-    def write_dates(self, basedir, dates):
-        """Write the upload date for the packages to the specified directory
-
-        For a more detailed explanation of the format, please check the method
-        read_dates.
-        """
-        filename = os.path.join(basedir, "Dates")
-        self.log("Writing upload data to %s" % filename)
-        with open(filename, 'w', encoding='utf-8') as f:
-            for pkg in sorted(dates):
-                f.write("%s %s %d\n" % ((pkg,) + dates[pkg]))
-
-
-    def read_urgencies(self, basedir):
-        """Read the upload urgency of the packages from the specified directory
-        
-        The upload urgencies are read from the `Urgency' file within the
-        directory specified as `basedir' parameter. The file contains rows
-        with the format:
-
-        <package-name> <version> <urgency>
-
-        The method returns a dictionary where the key is the binary package
-        name and the value is the greatest urgency from the versions of the
-        package that are higher then the testing one.
-        """
-
-        urgencies = {}
-        filename = os.path.join(basedir, "Urgency")
-        self.log("Loading upload urgencies from %s" % filename)
-        for line in open(filename, errors='surrogateescape', encoding='ascii'):
-            l = line.split()
-            if len(l) != 3: continue
-
-            # read the minimum days associated with the urgencies
-            urgency_old = urgencies.get(l[0], None)
-            mindays_old = self.MINDAYS.get(urgency_old, 1000)
-            mindays_new = self.MINDAYS.get(l[2], self.MINDAYS[self.options.default_urgency])
-
-            # if the new urgency is lower (so the min days are higher), do nothing
-            if mindays_old <= mindays_new:
-                continue
-
-            # if the package exists in testing and it is more recent, do nothing
-            tsrcv = self.sources['testing'].get(l[0], None)
-            if tsrcv and apt_pkg.version_compare(tsrcv[VERSION], l[1]) >= 0:
-                continue
-
-            # if the package doesn't exist in unstable or it is older, do nothing
-            usrcv = self.sources['unstable'].get(l[0], None)
-            if not usrcv or apt_pkg.version_compare(usrcv[VERSION], l[1]) < 0:
-                continue
-
-            # update the urgency for the package
-            urgencies[l[0]] = l[2]
-
-        return urgencies
-
     def read_hints(self, basedir):
         """Read the hint commands from the specified directory
         
@@ -1357,13 +1274,6 @@ class Britney(object):
             excuse.addhtml("%s source package doesn't exist" % (src))
             update_candidate = False
 
-        # retrieve the urgency for the upload, ignoring it if this is a NEW package (not present in testing)
-        urgency = self.urgencies.get(src, self.options.default_urgency)
-        if not source_t:
-            if self.MINDAYS[urgency] < self.MINDAYS[self.options.default_urgency]:
-                excuse.addhtml("Ignoring %s urgency setting for NEW package" % (urgency))
-                urgency = self.options.default_urgency
-
         # if there is a `remove' hint and the requested version is the same as the
         # version in testing, then stop here and return False
         for item in self.hints.search('remove', package=src):
@@ -1424,30 +1334,32 @@ class Britney(object):
         # permanence in unstable before updating testing; if the source package is too young,
         # the check fails and we set update_candidate to False to block the update; consider
         # the age-days hint, if specified for the package
-        if suite == 'unstable':
-            if src not in self.dates:
-                self.dates[src] = (source_u[VERSION], self.date_now)
-            elif self.dates[src][0] != source_u[VERSION]:
-                self.dates[src] = (source_u[VERSION], self.date_now)
-
-            days_old = self.date_now - self.dates[src][1]
-            min_days = self.MINDAYS[urgency]
-
-            for age_days_hint in [x for x in self.hints.search('age-days', package=src)
-                                  if source_u[VERSION] == x.version]:
-                excuse.addhtml("Overriding age needed from %d days to %d by %s" % (min_days,
-                    int(age_days_hint.days), age_days_hint.user))
-                min_days = int(age_days_hint.days)
-
-            excuse.setdaysold(days_old, min_days)
-            if days_old < min_days:
-                urgent_hints = [x for x in self.hints.search('urgent', package=src)
-                                if source_u[VERSION] == x.version]
-                if urgent_hints:
-                    excuse.addhtml("Too young, but urgency pushed by %s" % (urgent_hints[0].user))
+        policy_info = excuse.policy_info
+        policy_verdict = PolicyVerdict.PASS
+        for policy in self.policies:
+            if suite in policy.applicable_suites:
+                v = policy.apply_policy(policy_info, suite, src, source_t, source_u)
+                if v.value > policy_verdict.value:
+                    policy_verdict = v
+
+        if policy_verdict.is_rejected:
+            update_candidate = False
+
+        # Joggle some things into excuses
+        # - remove once the YAML is the canonical source for this information
+        if 'age' in policy_info:
+            age_info = policy_info['age']
+            age_hint = age_info.get('age-requirement-reduced', None)
+            age_min_req = age_info['age-requirement']
+            if age_hint:
+                new_req = age_hint['new-requirement']
+                who = age_hint['changed-by']
+                if age_hint['new-requirement']:
+                    excuse.addhtml("Overriding age needed from %d days to %d by %s" % (
+                        age_min_req, age_hint['new-requirement'], who))
                 else:
-                    update_candidate = False
-                    excuse.addreason("age")
+                    excuse.addhtml("Too young, but urgency pushed by %s" % who)
+            excuse.setdaysold(age_info['current-age'], age_min_req)
 
         all_binaries = self.all_binaries
 
@@ -1560,7 +1472,7 @@ class Britney(object):
                         excuse.addreason("build-arch")
                         excuse.addreason("build-arch-%s" % arch)
 
-                if self.date_now != self.dates[src][1]:
+                if 'age' in policy_info and policy_info['age']['current-age']:
                     excuse.addhtml(text)
 
         # if the source package has no binaries, set update_candidate to False to block the update
@@ -2656,11 +2568,8 @@ class Britney(object):
                 write_controlfiles(self.sources, self.binaries,
                                    'testing', self.options.testing)
 
-            # write dates
-            try:
-                self.write_dates(self.options.outputdir, self.dates)
-            except AttributeError:
-                self.write_dates(self.options.testing, self.dates)
+            for policy in self.policies:
+                policy.save_state(self)
 
             # write HeidiResult
             self.log("Writing Heidi results to %s" % self.options.heidi_output)
diff --git a/excuse.py b/excuse.py
index 8942bef..e12318e 100644
--- a/excuse.py
+++ b/excuse.py
@@ -59,6 +59,7 @@ class Excuse(object):
         self.oldbugs = set()
         self.reason = {}
         self.htmlline = []
+        self.policy_info = {}
 
     def sortkey(self):
         if self.daysold == None:
diff --git a/policies/__init__.py b/policies/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/policies/policy.py b/policies/policy.py
new file mode 100644
index 0000000..ed6d6a2
--- /dev/null
+++ b/policies/policy.py
@@ -0,0 +1,242 @@
+from abc import abstractmethod
+from enum import Enum, unique
+import apt_pkg
+import os
+import time
+
+from consts import VERSION
+
+
+@unique
+class PolicyVerdict(Enum):
+    """Verdict of a policy applied to a migration item"""
+    """
+    The migration item passed the policy.
+    """
+    PASS = 1
+    """
+    The policy was completely overruled by a hint.
+    """
+    PASS_HINTED = 2
+    """
+    The migration item did not pass the policy, but the failure is believed
+    to be temporary
+    """
+    REJECTED_TEMPORARILY = 3
+    """
+    The migration item did not pass the policy and the failure is believed
+    to be uncorrectable (i.e. a hint or a new version is needed)
+    """
+    REJECTED_PERMANENTLY = 4
+
+    @property
+    def is_rejected(self):
+        return self.name.startswith('REJECTED')
+
+
+class BasePolicy(object):
+
+    def __init__(self, options, applicable_suites):
+        self.options = options
+        self.applicable_suites = applicable_suites
+        self.hints = None
+
+    # FIXME: use a proper logging framework
+    def log(self, msg, type="I"):
+        """Print info messages according to verbosity level
+
+        An easy-and-simple log method which prints messages to the standard
+        output. The type parameter controls the urgency of the message, and
+        can be equal to `I' for `Information', `W' for `Warning' and `E' for
+        `Error'. Warnings and errors are always printed, and information is
+        printed only if verbose logging is enabled.
+        """
+        if self.options.verbose or type in ("E", "W"):
+            print("%s: [%s] - %s" % (type, time.asctime(), msg))
+
+    def initialise(self, britney):
+        """Called once to make the policy initialise any data structures
+
+        This is useful for e.g. parsing files or other "heavy do-once" work.
+        """
+        pass
+
+    def save_state(self, britney):
+        """Called once at the end of the run to make the policy save any persistent data
+
+        Note this will *not* be called for "dry-runs" as such runs should not change
+        the state.
+        """
+        pass
+
+    @abstractmethod
+    def apply_policy(self, policy_info, suite, source_name, source_data_tdist, source_data_srcdist):
+        pass
+
+
+class AgePolicy(BasePolicy):
+    """Configurable Aging policy for source migrations
+
+    The AgePolicy will let packages stay in the source suite for a pre-defined
+    number of days before letting them migrate (based on their urgency, if any).
+
+    The AgePolicy's decision is influenced by the following:
+
+    State files:
+     * ${TESTING}/Urgency: File containing urgencies for source packages.
+       Note that urgencies are "sticky" and the most "urgent" urgency will be
+       used (i.e. the one with lowest age-requirements).
+       - This file needs to be updated externally, if the policy should take
+         urgencies into consideration.  If empty (or not updated), the policy
+         will simply use the default urgency (see the "Config" section below)
+       - In Debian, these values are taken from the .changes file, but that is
+         not a requirement for Britney.
+     * ${TESTING}/Dates: File containing the age of all source packages.
+       - The policy will automatically update this file.
+    Config:
+     * DEFAULT_URGENCY: Name of the urgency used for packages without an urgency
+       (or for unknown urgencies).  Will also be used to set the "minimum"
+       aging requirements for packages not in the target suite.
+     * MINDAYS_<URGENCY>: The age-requirements in days for packages with the
+       given urgency.
+       - Commonly used urgencies are: low, medium, high, emergency, critical
+    Hints:
+     * urgent <source>/<version>: Disregard the age requirements for a given
+       source/version.
+     * age-days X <source>/<version>: Set the age requirements for a given
+       source/version to X days.  Note that X can exceed the highest
+       age-requirement normally given.
+
+    """
+
+    def __init__(self, options, mindays):
+        super().__init__(options, {'unstable'})
+        self._min_days = mindays
+        if options.default_urgency not in mindays:
+            raise ValueError("Missing age-requirement for default urgency (MINDAYS_%s)" % options.default_urgency)
+        self._min_days_default = mindays[options.default_urgency]
+        # britney's "day" begins at 3pm
+        self._date_now = int(((time.time() / (60*60)) - 15) / 24)
+        self._dates = {}
+        self._urgencies = {}
+
+    def initialise(self, britney):
+        super().initialise(britney)
+        self._read_dates_file()
+        self._read_urgencies_file(britney)
+
+    def save_state(self, britney):
+        super().save_state(britney)
+        self._write_dates_file()
+
+    def apply_policy(self, policy_info, suite, source_name, source_data_tdist, source_data_srcdist):
+        # retrieve the urgency for the upload, ignoring it if this is a NEW package (not present in testing)
+        urgency = self._urgencies.get(source_name, self.options.default_urgency)
+        if 'age' not in policy_info:
+            policy_info['age'] = age_info = {}
+        else:
+            age_info = policy_info['age']
+
+        if urgency not in self._min_days:
+            age_info['unknown-urgency'] = urgency
+            urgency = self.options.default_urgency
+
+        if not source_data_tdist:
+            if self._min_days[urgency] < self._min_days_default:
+                age_info['urgency-reduced'] = {
+                    'from': urgency,
+                    'to': self.options.default_urgency,
+                }
+                urgency = self.options.default_urgency
+
+        if source_name not in self._dates:
+            self._dates[source_name] = (source_data_srcdist[VERSION], self._date_now)
+        elif self._dates[source_name][0] != source_data_srcdist[VERSION]:
+            self._dates[source_name] = (source_data_srcdist[VERSION], self._date_now)
+
+        days_old = self._date_now - self._dates[source_name][1]
+        min_days = self._min_days[urgency]
+        age_info['age-requirement'] = min_days
+        age_info['current-age'] = days_old
+
+        for age_days_hint in [x for x in self.hints.search('age-days', package=source_name)
+                              if source_data_srcdist[VERSION] == x.version]:
+            new_req = int(age_days_hint.days)
+            age_info['age-requirement-reduced'] = {
+                'new-requirement': new_req,
+                'changed-by': age_days_hint.user
+            }
+            min_days = new_req
+
+        if days_old < min_days:
+            urgent_hints = [x for x in self.hints.search('urgent', package=source_name)
+                            if source_data_srcdist[VERSION] == x.version]
+            if urgent_hints:
+                age_info['age-requirement-reduced'] = {
+                    'new-requirement': 0,
+                    'changed-by': urgent_hints[0].user
+                }
+                return PolicyVerdict.PASS_HINTED
+            else:
+                return PolicyVerdict.REJECTED_TEMPORARILY
+
+        return PolicyVerdict.PASS
+
+    def _read_dates_file(self):
+        """Parse the dates file"""
+        dates = self._dates
+        filename = os.path.join(self.options.testing, 'Dates')
+        with open(filename, encoding='utf-8') as fd:
+            for line in fd:
+                # <source> <version> <date>
+                l = line.split()
+                if len(l) != 3:
+                    continue
+                try:
+                    dates[l[0]] = (l[1], int(l[2]))
+                except ValueError:
+                    pass
+
+    def _read_urgencies_file(self, britney):
+        urgencies = self._urgencies
+        filename = os.path.join(self.options.testing, 'Urgency')
+        min_days_default = self._min_days_default
+        with open(filename, errors='surrogateescape', encoding='ascii') as fd:
+            for line in fd:
+                # <source> <version> <urgency>
+                l = line.split()
+                if len(l) != 3:
+                    continue
+
+                # read the minimum days associated with the urgencies
+                urgency_old = urgencies.get(l[0], None)
+                mindays_old = self._min_days.get(urgency_old, 1000)
+                mindays_new = self._min_days.get(l[2], min_days_default)
+
+                # if the new urgency is lower (so the min days are higher), do nothing
+                if mindays_old <= mindays_new:
+                    continue
+
+                # if the package exists in testing and it is more recent, do nothing
+                tsrcv = britney.sources['testing'].get(l[0], None)
+                if tsrcv and apt_pkg.version_compare(tsrcv[VERSION], l[1]) >= 0:
+                    continue
+
+                # if the package doesn't exist in unstable or it is older, do nothing
+                usrcv = britney.sources['unstable'].get(l[0], None)
+                if not usrcv or apt_pkg.version_compare(usrcv[VERSION], l[1]) < 0:
+                    continue
+
+                # update the urgency for the package
+                urgencies[l[0]] = l[2]
+
+    def _write_dates_file(self):
+        dates = self._dates
+        directory = self.options.testing
+        filename = os.path.join(directory, 'Dates')
+        filename_tmp = os.path.join(directory, 'Dates_new')
+        with open(filename_tmp, 'w', encoding='utf-8') as fd:
+            for pkg in sorted(dates):
+                version, date = dates[pkg]
+                fd.write("%s %s %d\n" % (pkg, version, date))
+        os.rename(filename_tmp, filename)
-- 
2.8.1

From 012f3ef4c75c679e3085efac29be04cea1bd76d1 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Sat, 26 Mar 2016 08:16:39 +0000
Subject: [PATCH 02/28] Move RC bug handling into a policy

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py         | 144 +++++++++++------------------------------------------
 policies/policy.py |  98 +++++++++++++++++++++++++++++++++++-
 2 files changed, 126 insertions(+), 116 deletions(-)

diff --git a/britney.py b/britney.py
index 928315d..a07c1db 100755
--- a/britney.py
+++ b/britney.py
@@ -51,7 +51,7 @@ and Britney.read_binaries).
 Other than source and binary packages, Britney loads the following data:
 
   * BugsV, which contains the list of release-critical bugs for a given
-    version of a source or binary package (see Britney.read_bugs).
+    version of a source or binary package (see RCBugPolicy.read_bugs).
 
   * Dates, which contains the date of the upload of a given version 
     of a source package (see Britney.read_dates).
@@ -206,7 +206,7 @@ from britney_util import (old_libraries_format, undo_changes,
                           write_excuses, write_heidi_delta, write_controlfiles,
                           old_libraries, is_nuninst_asgood_generous,
                           clone_nuninst, check_installability)
-from policies.policy import AgePolicy, PolicyVerdict
+from policies.policy import AgePolicy, RCBugPolicy, PolicyVerdict
 from consts import (VERSION, SECTION, BINARIES, MAINTAINER, FAKESRC,
                    SOURCE, SOURCEVER, ARCHITECTURE, DEPENDS, CONFLICTS,
                    PROVIDES, MULTIARCH, ESSENTIAL)
@@ -332,10 +332,6 @@ class Britney(object):
                 for stat in arch_stat.stat_summary():
                     self.log(">  - %s" % stat, type="I")
 
-        # read the release-critical bug summaries for testing and unstable
-        self.bugs = {'unstable': self.read_bugs(self.options.unstable),
-                     'testing': self.read_bugs(self.options.testing),}
-        self.normalize_bugs()
         for policy in self.policies:
             policy.hints = self.hints
             policy.initialise(self)
@@ -452,6 +448,7 @@ class Britney(object):
             self.options.ignore_cruft = False
 
         self.policies.append(AgePolicy(self.options, MINDAYS))
+        self.policies.append(RCBugPolicy(self.options))
 
     def log(self, msg, type="I"):
         """Print info messages according to verbosity level
@@ -792,78 +789,9 @@ class Britney(object):
             for provided_pkg, provided_version, _ in dpkg[PROVIDES]:
                 provides[provided_pkg].add((pkg, provided_version))
 
-
         # return a tuple with the list of real and virtual packages
         return (packages, provides)
 
-    def read_bugs(self, basedir):
-        """Read the release critical bug summary from the specified directory
-        
-        The RC bug summaries are read from the `BugsV' file within the
-        directory specified in the `basedir' parameter. The file contains
-        rows with the format:
-
-        <package-name> <bug number>[,<bug number>...]
-
-        The method returns a dictionary where the key is the binary package
-        name and the value is the list of open RC bugs for it.
-        """
-        bugs = defaultdict(list)
-        filename = os.path.join(basedir, "BugsV")
-        self.log("Loading RC bugs data from %s" % filename)
-        for line in open(filename, encoding='ascii'):
-            l = line.split()
-            if len(l) != 2:
-                self.log("Malformed line found in line %s" % (line), type='W')
-                continue
-            pkg = l[0]
-            bugs[pkg] += l[1].split(",")
-        return bugs
-
-    def __maxver(self, pkg, dist):
-        """Return the maximum version for a given package name
-        
-        This method returns None if the specified source package
-        is not available in the `dist' distribution. If the package
-        exists, then it returns the maximum version between the
-        source package and its binary packages.
-        """
-        maxver = None
-        if pkg in self.sources[dist]:
-            maxver = self.sources[dist][pkg][VERSION]
-        for arch in self.options.architectures:
-            if pkg not in self.binaries[dist][arch][0]: continue
-            pkgv = self.binaries[dist][arch][0][pkg][VERSION]
-            if maxver is None or apt_pkg.version_compare(pkgv, maxver) > 0:
-                maxver = pkgv
-        return maxver
-
-    def normalize_bugs(self):
-        """Normalize the release critical bug summaries for testing and unstable
-        
-        The method doesn't return any value: it directly modifies the
-        object attribute `bugs'.
-        """
-        # loop on all the package names from testing and unstable bug summaries
-        for pkg in set(chain(self.bugs['testing'], self.bugs['unstable'])):
-
-            # make sure that the key is present in both dictionaries
-            if pkg not in self.bugs['testing']:
-                self.bugs['testing'][pkg] = []
-            elif pkg not in self.bugs['unstable']:
-                self.bugs['unstable'][pkg] = []
-
-            if pkg.startswith("src:"):
-                pkg = pkg[4:]
-
-            # retrieve the maximum version of the package in testing:
-            maxvert = self.__maxver(pkg, 'testing')
-
-            # if the package is not available in testing, then reset
-            # the list of RC bugs
-            if maxvert is None:
-                self.bugs['testing'][pkg] = []
-
     def read_hints(self, basedir):
         """Read the hint commands from the specified directory
         
@@ -1361,6 +1289,31 @@ class Britney(object):
                     excuse.addhtml("Too young, but urgency pushed by %s" % who)
             excuse.setdaysold(age_info['current-age'], age_min_req)
 
+        # if the suite is unstable, then we have to check the release-critical bug lists before
+        # updating testing; if the unstable package has RC bugs that do not apply to the testing
+        # one, the check fails and we set update_candidate to False to block the update
+        if 'rc-bugs' in policy_info:
+            rcbugs_info = policy_info['rc-bugs']
+            new_bugs = rcbugs_info['unique-source-bugs']
+            old_bugs = rcbugs_info['unique-target-bugs']
+
+            excuse.setbugs(old_bugs, new_bugs)
+
+            if new_bugs:
+                excuse.addhtml("%s <a href=\"http://bugs.debian.org/cgi-bin/pkgreport.cgi?"; \
+                               "pkg=src:%s&sev-inc=critical&sev-inc=grave&sev-inc=serious\" " \
+                               "target=\"_blank\">has new bugs</a>!" % (src, quote(src)))
+                excuse.addhtml("Updating %s introduces new bugs: %s" % (src, ", ".join(
+                    ["<a href=\"http://bugs.debian.org/%s\";>#%s</a>" % (quote(a), a) for a in new_bugs])))
+                update_candidate = False
+
+            if old_bugs:
+                excuse.addhtml("Updating %s fixes old bugs: %s" % (src, ", ".join(
+                    ["<a href=\"http://bugs.debian.org/%s\";>#%s</a>" % (quote(a), a) for a in old_bugs])))
+            if new_bugs and len(old_bugs) > len(new_bugs):
+                excuse.addhtml("%s introduces new bugs, so still ignored (even "
+                               "though it fixes more than it introduces, whine at debian-release)" % src)
+
         all_binaries = self.all_binaries
 
         if suite in ('pu', 'tpu') and source_t:
@@ -1372,7 +1325,7 @@ class Britney(object):
                 if not any(x for x in source_t[BINARIES]
                            if x[2] == arch and all_binaries[x][ARCHITECTURE] != 'all'):
                     continue
-                    
+
                 # if the (t-)p-u package has produced any binaries on
                 # this architecture then we assume it's ok. this allows for
                 # uploads to (t-)p-u which intentionally drop binary
@@ -1481,45 +1434,6 @@ class Britney(object):
             excuse.addreason("no-binaries")
             update_candidate = False
 
-        # if the suite is unstable, then we have to check the release-critical bug lists before
-        # updating testing; if the unstable package has RC bugs that do not apply to the testing
-        # one, the check fails and we set update_candidate to False to block the update
-        if suite == 'unstable':
-            for pkg in pkgs:
-                bugs_t = []
-                bugs_u = []
-                if pkg in self.bugs['testing']:
-                    bugs_t.extend(self.bugs['testing'][pkg])
-                if pkg in self.bugs['unstable']:
-                    bugs_u.extend(self.bugs['unstable'][pkg])
-                if 'source' in pkgs[pkg]:
-                    spkg = "src:%s" % (pkg)
-                    if spkg in self.bugs['testing']:
-                        bugs_t.extend(self.bugs['testing'][spkg])
-                    if spkg in self.bugs['unstable']:
-                        bugs_u.extend(self.bugs['unstable'][spkg])
- 
-                new_bugs = sorted(set(bugs_u).difference(bugs_t))
-                old_bugs = sorted(set(bugs_t).difference(bugs_u))
-
-                excuse.setbugs(old_bugs,new_bugs)
-
-                if len(new_bugs) > 0:
-                    excuse.addhtml("%s (%s) <a href=\"http://bugs.debian.org/cgi-bin/pkgreport.cgi?"; \
-                        "which=pkg&data=%s&sev-inc=critical&sev-inc=grave&sev-inc=serious\" " \
-                        "target=\"_blank\">has new bugs</a>!" % (pkg, ", ".join(pkgs[pkg]), quote(pkg)))
-                    excuse.addhtml("Updating %s introduces new bugs: %s" % (pkg, ", ".join(
-                        ["<a href=\"http://bugs.debian.org/%s\";>#%s</a>" % (quote(a), a) for a in new_bugs])))
-                    update_candidate = False
-                    excuse.addreason("buggy")
-
-                if len(old_bugs) > 0:
-                    excuse.addhtml("Updating %s fixes old bugs: %s" % (pkg, ", ".join(
-                        ["<a href=\"http://bugs.debian.org/%s\";>#%s</a>" % (quote(a), a) for a in old_bugs])))
-                if len(old_bugs) > len(new_bugs) and len(new_bugs) > 0:
-                    excuse.addhtml("%s introduces new bugs, so still ignored (even "
-                        "though it fixes more than it introduces, whine at debian-release)" % pkg)
-
         # check if there is a `force' hint for this package, which allows it to go in even if it is not updateable
         forces = [x for x in self.hints.search('force', package=src) if source_u[VERSION] == x.version]
         if forces:
diff --git a/policies/policy.py b/policies/policy.py
index ed6d6a2..12665fd 100644
--- a/policies/policy.py
+++ b/policies/policy.py
@@ -4,7 +4,7 @@ import apt_pkg
 import os
 import time
 
-from consts import VERSION
+from consts import VERSION, BINARIES
 
 
 @unique
@@ -240,3 +240,99 @@ class AgePolicy(BasePolicy):
                 version, date = dates[pkg]
                 fd.write("%s %s %d\n" % (pkg, version, date))
         os.rename(filename_tmp, filename)
+
+
+class RCBugPolicy(BasePolicy):
+    """RC bug regression policy for source migrations
+
+    The RCBugPolicy will read provided list of RC bugs and block any
+    source upload that would introduce a *new* RC bug in the target
+    suite.
+
+    The RCBugPolicy's decision is influenced by the following:
+
+    State files:
+     * ${UNSTABLE}/BugsV: File containing RC bugs for packages in the
+      source suite.
+       - This file needs to be updated externally.
+     * ${TESTING}/BugsV: File containing RC bugs for packages in the
+       target suite.
+       - This file needs to be updated externally.
+    """
+
+    def __init__(self, options):
+        super().__init__(options, {'unstable'})
+        self._bugs = {}
+
+    def initialise(self, britney):
+        super().initialise(britney)
+        self._bugs['unstable'] = self._read_bugs(self.options.unstable)
+        self._bugs['testing'] = self._read_bugs(self.options.testing)
+
+    def apply_policy(self, policy_info, suite, source_name, source_data_tdist, source_data_srcdist):
+        # compare the RC bug lists of the source (and its binaries) in the target and source suites
+        if 'rc-bugs' not in policy_info:
+            policy_info['rc-bugs'] = rcbugs_info = {}
+        else:
+            rcbugs_info = policy_info['rc-bugs']
+
+        bugs_t = set()
+        bugs_u = set()
+
+        for src_key in (source_name, 'src:%s' % source_name):
+            if source_data_tdist and src_key in self._bugs['testing']:
+                bugs_t.update(self._bugs['testing'][src_key])
+            if src_key in self._bugs['unstable']:
+                bugs_u.update(self._bugs['unstable'][src_key])
+
+        for pkg, _, _ in source_data_srcdist[BINARIES]:
+            if pkg in self._bugs['unstable']:
+                bugs_u |= self._bugs['unstable'][pkg]
+        if source_data_tdist:
+            for pkg, _, _ in source_data_tdist[BINARIES]:
+                if pkg in self._bugs['testing']:
+                    bugs_t |= self._bugs['testing'][pkg]
+
+        # If a package is not in testing, it has no RC bugs by
+        # definition.  Unfortunately, the live data is not always
+        # accurate (e.g. live-2011-12-13 suggests that obdgpslogger
+        # had the same bug in testing and unstable, but obdgpslogger
+        # was not in testing at that time).
+        # - For the curious, obdgpslogger was removed on that day
+        #   and the BTS probably had not caught up with that fact.
+        #   (https://tracker.debian.org/news/415935)
+        assert not bugs_t or source_data_tdist, "%s had bugs in testing but is not in testing" % source_name
+
+        rcbugs_info['shared-bugs'] = sorted(bugs_u & bugs_t)
+        rcbugs_info['unique-source-bugs'] = sorted(bugs_u - bugs_t)
+        rcbugs_info['unique-target-bugs'] = sorted(bugs_t - bugs_u)
+
+        if not bugs_u or bugs_u <= bugs_t:
+            return PolicyVerdict.PASS
+        return PolicyVerdict.REJECTED_PERMANENTLY
+
+    def _read_bugs(self, basedir):
+        """Read the release critical bug summary from the specified directory
+
+        The RC bug summaries are read from the `BugsV' file within the
+        directory specified in the `basedir' parameter. The file contains
+        rows with the format:
+
+        <package-name> <bug number>[,<bug number>...]
+
+        The method returns a dictionary where the key is the binary package
+        name and the value is the list of open RC bugs for it.
+        """
+        bugs = {}
+        filename = os.path.join(basedir, "BugsV")
+        self.log("Loading RC bugs data from %s" % filename)
+        for line in open(filename, encoding='ascii'):
+            l = line.split()
+            if len(l) != 2:
+                self.log("Malformed line found in line %s" % (line), type='W')
+                continue
+            pkg = l[0]
+            if pkg not in bugs:
+                bugs[pkg] = set()
+            bugs[pkg].update(l[1].split(","))
+        return bugs
-- 
2.8.1
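
The set arithmetic at the heart of the new RCBugPolicy is easy to illustrate
outside Britney. A minimal sketch of the BugsV parsing and bug classification
(helper names are mine for illustration, not Britney API):

```python
def read_bugs(lines):
    """Parse "<package-name> <bug>[,<bug>...]" rows into {pkg: set(bugs)}."""
    bugs = {}
    for line in lines:
        parts = line.split()
        if len(parts) != 2:
            continue  # the real code logs a warning for malformed lines
        bugs.setdefault(parts[0], set()).update(parts[1].split(","))
    return bugs


def classify_bugs(bugs_u, bugs_t):
    """Split the bug sets the way apply_policy fills rcbugs_info."""
    return {
        'shared-bugs': sorted(bugs_u & bugs_t),
        'unique-source-bugs': sorted(bugs_u - bugs_t),  # regressions: block
        'unique-target-bugs': sorted(bugs_t - bugs_u),  # fixed by migrating
    }
```

A migration passes only when `unique-source-bugs` is empty, i.e. when
`bugs_u <= bugs_t`.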

From 19bca38117f133ab7de519e162e5c2f03c4337f5 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Sat, 7 May 2016 15:16:16 +0000
Subject: [PATCH 03/28] Move Dates into a new state-dir

Partly solves GH#2.

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.conf       |  5 +++++
 policies/policy.py | 28 +++++++++++++++++++++++-----
 2 files changed, 28 insertions(+), 5 deletions(-)

diff --git a/britney.conf b/britney.conf
index 434a4f9..8137fb0 100644
--- a/britney.conf
+++ b/britney.conf
@@ -13,6 +13,11 @@ EXCUSES_YAML_OUTPUT = /srv/release.debian.org/britney/var/data-b2/output/excuses
 UPGRADE_OUTPUT    = /srv/release.debian.org/britney/var/data-b2/output/output.txt
 HEIDI_OUTPUT      = /srv/release.debian.org/britney/var/data-b2/output/HeidiResult
 
+# Directory for input files that Britney updates herself
+# (e.g. aging information) or that need regular external
+# updates (e.g. urgency information).
+STATE_DIR          = /srv/release.debian.org/britney/state
+
 # List of release architectures
 ARCHITECTURES     = i386 amd64 arm64 armel armhf mips mipsel powerpc ppc64el s390x
 
diff --git a/policies/policy.py b/policies/policy.py
index 12665fd..565e4ae 100644
--- a/policies/policy.py
+++ b/policies/policy.py
@@ -91,7 +91,8 @@ class AgePolicy(BasePolicy):
          will simply use the default urgency (see the "Config" section below)
        - In Debian, these values are taken from the .changes file, but that is
          not a requirement for Britney.
-     * ${TESTING}/Dates: File containing the age of all source packages.
+     * ${STATE_DIR}/age-policy-dates: File containing the age of all source
+       packages.
        - The policy will automatically update this file.
     Config:
      * DEFAULT_URGENCY: Name of the urgency used for packages without an urgency
@@ -185,7 +186,14 @@ class AgePolicy(BasePolicy):
     def _read_dates_file(self):
         """Parse the dates file"""
         dates = self._dates
-        filename = os.path.join(self.options.testing, 'Dates')
+        fallback_filename = os.path.join(self.options.testing, 'Dates')
+        try:
+            filename = os.path.join(self.options.state_dir, 'age-policy-dates')
+            if not os.path.exists(filename) and os.path.exists(fallback_filename):
+                filename = fallback_filename
+        except AttributeError:
+            filename = fallback_filename
+
         with open(filename, encoding='utf-8') as fd:
             for line in fd:
                 # <source> <version> <date>
@@ -232,14 +240,24 @@ class AgePolicy(BasePolicy):
 
     def _write_dates_file(self):
         dates = self._dates
-        directory = self.options.testing
-        filename = os.path.join(directory, 'Dates')
-        filename_tmp = os.path.join(directory, 'Dates_new')
+        try:
+            directory = self.options.state_dir
+            basename = 'age-policy-dates'
+            old_file = os.path.join(self.options.testing, 'Dates')
+        except AttributeError:
+            directory = self.options.testing
+            basename = 'Dates'
+            old_file = None
+        filename = os.path.join(directory, basename)
+        filename_tmp = os.path.join(directory, '%s_new' % basename)
         with open(filename_tmp, 'w', encoding='utf-8') as fd:
             for pkg in sorted(dates):
                 version, date = dates[pkg]
                 fd.write("%s %s %d\n" % (pkg, version, date))
         os.rename(filename_tmp, filename)
+        if old_file is not None and os.path.exists(old_file):
+            self.log("Removing old age-policy-dates file %s" % old_file)
+            os.unlink(old_file)
 
 
 class RCBugPolicy(BasePolicy):
-- 
2.8.1

From 5b2e478f28eac21031550fb7291392e0fba6f6e2 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Sat, 7 May 2016 15:45:52 +0000
Subject: [PATCH 04/28] Prefer Urgencies file from the state-dir

Partly solves GH#2

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 policies/policy.py | 15 +++++++++++----
 1 file changed, 11 insertions(+), 4 deletions(-)

diff --git a/policies/policy.py b/policies/policy.py
index 565e4ae..8abae64 100644
--- a/policies/policy.py
+++ b/policies/policy.py
@@ -83,9 +83,9 @@ class AgePolicy(BasePolicy):
     The AgePolicy's decision is influenced by the following:
 
     State files:
-     * ${TESTING}/Urgency: File containing urgencies for source packages.
-       Note that urgencies are "sticky" and the most "urgent" urgency will be
-       used (i.e. the one with lowest age-requirements).
+     * ${STATE_DIR}/age-policy-urgencies: File containing urgencies for source
+       packages. Note that urgencies are "sticky" and the most "urgent" urgency
+       will be used (i.e. the one with lowest age-requirements).
        - This file needs to be updated externally, if the policy should take
          urgencies into consideration.  If empty (or not updated), the policy
          will simply use the default urgency (see the "Config" section below)
@@ -207,8 +207,15 @@ class AgePolicy(BasePolicy):
 
     def _read_urgencies_file(self, britney):
         urgencies = self._urgencies
-        filename = os.path.join(self.options.testing, 'Urgency')
         min_days_default = self._min_days_default
+        fallback_filename = os.path.join(self.options.testing, 'Urgency')
+        try:
+            filename = os.path.join(self.options.state_dir, 'age-policy-urgencies')
+            if not os.path.exists(filename) and os.path.exists(fallback_filename):
+                filename = fallback_filename
+        except AttributeError:
+            filename = fallback_filename
+
         with open(filename, errors='surrogateescape', encoding='ascii') as fd:
             for line in fd:
                 # <source> <version> <urgency>
-- 
2.8.1

From 98808d10e0476385d6db42a976ff08691b649bfe Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Sat, 7 May 2016 15:52:56 +0000
Subject: [PATCH 05/28] Prefer bugs files from the state-dir

Closes GH#2.

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 policies/policy.py | 33 +++++++++++++++++++++------------
 1 file changed, 21 insertions(+), 12 deletions(-)

diff --git a/policies/policy.py b/policies/policy.py
index 8abae64..1b266a4 100644
--- a/policies/policy.py
+++ b/policies/policy.py
@@ -277,11 +277,11 @@ class RCBugPolicy(BasePolicy):
     The RCBugPolicy's decision is influenced by the following:
 
     State files:
-     * ${UNSTABLE}/BugsV: File containing RC bugs for packages in the
-      source suite.
+     * ${STATE_DIR}/rc-bugs-unstable: File containing RC bugs for packages in
+       the source suite.
        - This file needs to be updated externally.
-     * ${TESTING}/BugsV: File containing RC bugs for packages in the
-       target suite.
+     * ${STATE_DIR}/rc-bugs-testing: File containing RC bugs for packages in
+       the target suite.
        - This file needs to be updated externally.
     """
 
@@ -291,8 +291,20 @@ class RCBugPolicy(BasePolicy):
 
     def initialise(self, britney):
         super().initialise(britney)
-        self._bugs['unstable'] = self._read_bugs(self.options.unstable)
-        self._bugs['testing'] = self._read_bugs(self.options.testing)
+        fallback_unstable = os.path.join(self.options.unstable, 'BugsV')
+        fallback_testing = os.path.join(self.options.testing, 'BugsV')
+        try:
+            filename_unstable = os.path.join(self.options.state_dir, 'rc-bugs-unstable')
+            filename_testing = os.path.join(self.options.state_dir, 'rc-bugs-testing')
+            if not os.path.exists(filename_unstable) and not os.path.exists(filename_testing) and \
+               os.path.exists(fallback_unstable) and os.path.exists(fallback_testing):
+                filename_unstable = fallback_unstable
+                filename_testing = fallback_testing
+        except AttributeError:
+            filename_unstable = fallback_unstable
+            filename_testing = fallback_testing
+        self._bugs['unstable'] = self._read_bugs(filename_unstable)
+        self._bugs['testing'] = self._read_bugs(filename_testing)
 
     def apply_policy(self, policy_info, suite, source_name, source_data_tdist, source_data_srcdist):
         # retrieve the urgency for the upload, ignoring it if this is a NEW package (not present in testing)
@@ -336,12 +348,10 @@ class RCBugPolicy(BasePolicy):
             return PolicyVerdict.PASS
         return PolicyVerdict.REJECTED_PERMANENTLY
 
-    def _read_bugs(self, basedir):
-        """Read the release critical bug summary from the specified directory
+    def _read_bugs(self, filename):
+        """Read the release critical bug summary from the specified file
 
-        The RC bug summaries are read from the `BugsV' file within the
-        directory specified in the `basedir' parameter. The file contains
-        rows with the format:
+        The file contains rows with the format:
 
         <package-name> <bug number>[,<bug number>...]
 
@@ -349,7 +359,6 @@ class RCBugPolicy(BasePolicy):
         name and the value is the list of open RC bugs for it.
         """
         bugs = {}
-        filename = os.path.join(basedir, "BugsV")
         self.log("Loading RC bugs data from %s" % filename)
         for line in open(filename, encoding='ascii'):
             l = line.split()
-- 
2.8.1

From 883c4f45414a81116c81e80ddfadb927cfcd57df Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Mon, 28 Mar 2016 06:22:46 +0000
Subject: [PATCH 06/28] Optimise a few hints.search calls

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py         | 4 ++--
 policies/policy.py | 8 ++++----
 2 files changed, 6 insertions(+), 6 deletions(-)

diff --git a/britney.py b/britney.py
index a07c1db..b63a2d6 100755
--- a/britney.py
+++ b/britney.py
@@ -1032,7 +1032,7 @@ class Britney(object):
         # version in testing, then stop here and return False
         # (as a side effect, a removal may generate such excuses for both the source
         # package and its binary packages on each architecture)
-        for hint in [x for x in self.hints.search('remove', package=src) if source_t[VERSION] == x.version]:
+        for hint in self.hints.search('remove', package=src, version=source_t[VERSION]):
             excuse.addhtml("Removal request by %s" % (hint.user))
             excuse.addhtml("Trying to remove package, not update it")
             excuse.addhtml("Not considered")
@@ -1435,7 +1435,7 @@ class Britney(object):
             update_candidate = False
 
         # check if there is a `force' hint for this package, which allows it to go in even if it is not updateable
-        forces = [x for x in self.hints.search('force', package=src) if source_u[VERSION] == x.version]
+        forces = self.hints.search('force', package=src, version=source_u[VERSION])
         if forces:
             excuse.dontinvalidate = True
         if not update_candidate and forces:
diff --git a/policies/policy.py b/policies/policy.py
index 1b266a4..1f874ab 100644
--- a/policies/policy.py
+++ b/policies/policy.py
@@ -160,8 +160,8 @@ class AgePolicy(BasePolicy):
         age_info['age-requirement'] = min_days
         age_info['current-age'] = days_old
 
-        for age_days_hint in [x for x in self.hints.search('age-days', package=source_name)
-                              if source_data_srcdist[VERSION] == x.version]:
+        for age_days_hint in self.hints.search('age-days', package=source_name,
+                                               version=source_data_srcdist[VERSION]):
             new_req = int(age_days_hint.days)
             age_info['age-requirement-reduced'] = {
                 'new-requirement': new_req,
@@ -170,8 +170,8 @@ class AgePolicy(BasePolicy):
             min_days = new_req
 
         if days_old < min_days:
-            urgent_hints = [x for x in self.hints.search('urgent', package=source_name)
-                            if source_data_srcdist[VERSION] == x.version]
+            urgent_hints = self.hints.search('urgent', package=source_name,
+                                             version=source_data_srcdist[VERSION])
             if urgent_hints:
                 age_info['age-requirement-reduced'] = {
                     'new-requirement': 0,
-- 
2.8.1

From b18ff0ec71b7447131477fd656d0c48a28ec0571 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Mon, 28 Mar 2016 06:37:46 +0000
Subject: [PATCH 07/28] Add policy-info to YAML excuses

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 excuse.py | 6 ++----
 1 file changed, 2 insertions(+), 4 deletions(-)

diff --git a/excuse.py b/excuse.py
index e12318e..b5e2c07 100644
--- a/excuse.py
+++ b/excuse.py
@@ -216,10 +216,8 @@ class Excuse(object):
         excusedata["source"] = self.name
         excusedata["old-version"] = self.ver[0]
         excusedata["new-version"] = self.ver[1]
-        excusedata["age"] = self.daysold
-        excusedata["age-needed"] = self.mindays
-        excusedata["new-bugs"] = sorted(self.newbugs)
-        excusedata["old-bugs"] = sorted(self.oldbugs)
+        if self.policy_info:
+            excusedata['policy_info'] = self.policy_info
         if self.forced:
             excusedata["forced-reason"] = sorted(list(self.reason.keys()))
             excusedata["reason"] = []
-- 
2.8.1

From 8411b50256ac14196f639928d1a93fe7d8913836 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Mon, 28 Mar 2016 06:44:33 +0000
Subject: [PATCH 08/28] excuses.yaml: Make maint + component info machine
 parsable

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 excuse.py | 26 +++++++-------------------
 1 file changed, 7 insertions(+), 19 deletions(-)

diff --git a/excuse.py b/excuse.py
index b5e2c07..824dd95 100644
--- a/excuse.py
+++ b/excuse.py
@@ -173,24 +173,10 @@ class Excuse(object):
         """"adding reason"""
         self.reason[reason] = 1
 
-    # TODO merge with html()
-    def text(self):
+    # TODO: remove
+    def _text(self):
         """Render the excuse in text"""
         res = []
-        res.append("%s (%s to %s)" % \
-            (self.name, self.ver[0], self.ver[1]))
-        if self.maint:
-            maint = self.maint
-            res.append("Maintainer: %s" % maint)
-        if self.section and self.section.find("/") > -1:
-            res.append("Section: %s" % (self.section))
-        if self.daysold != None:
-            if self.daysold < self.mindays:
-                res.append(("Too young, only %d of %d days old" %
-                (self.daysold, self.mindays)))
-            else:
-                res.append(("%d days old (needed %d days)" %
-                (self.daysold, self.mindays)))
         for x in self.htmlline:
             res.append("" + x + "")
         lastdep = ""
@@ -205,17 +191,19 @@ class Excuse(object):
         for (n,a) in self.break_deps:
             if n not in self.deps:
                 res.append("Ignoring %s depends: %s" % (a, n))
-        if self.is_valid:
-            res.append("Valid candidate")
         return res
 
     def excusedata(self):
         """Render the excuse in as key-value data"""
         excusedata = {}
-        excusedata["excuses"] = self.text()
+        excusedata["excuses"] = self._text()
         excusedata["source"] = self.name
         excusedata["old-version"] = self.ver[0]
         excusedata["new-version"] = self.ver[1]
+        if self.maint:
+            excusedata['maintainer'] = self.maint
+        if self.section and self.section.find("/") > -1:
+            excusedata['component'] = self.section.split('/')[0]
         if self.policy_info:
             excusedata['policy_info'] = self.policy_info
         if self.forced:
-- 
2.8.1
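
The component extraction in patch 08 is just a split on the section field.
A sketch; note the "main" default for slash-less sections is my reading of
Debian convention, not something the patch does (it simply omits the
'component' key in that case):

```python
def component_of(section):
    """Return the archive component encoded in a "component/section"
    value; sections without a slash conventionally live in main."""
    if section and "/" in section:
        return section.split("/")[0]
    return "main"
```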

From 613587fb3309290b671fd779d7b7d902e61b859c Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Mon, 28 Mar 2016 06:47:45 +0000
Subject: [PATCH 09/28] Remove "Not considered" note from excuses.yaml

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 8 --------
 excuse.py  | 2 ++
 2 files changed, 2 insertions(+), 8 deletions(-)

diff --git a/britney.py b/britney.py
index b63a2d6..264dd2a 100755
--- a/britney.py
+++ b/britney.py
@@ -997,7 +997,6 @@ class Britney(object):
         for hint in self.hints.search('block', package=pkg, removal=True):
             excuse.addhtml("Not touching package, as requested by %s "
                 "(check https://release.debian.org/testing/freeze_policy.html if update is needed)" % hint.user)
-            excuse.addhtml("Not considered")
             excuse.addreason("block")
             self.excuses[excuse.name] = excuse
             return False
@@ -1035,7 +1034,6 @@ class Britney(object):
         for hint in self.hints.search('remove', package=src, version=source_t[VERSION]):
             excuse.addhtml("Removal request by %s" % (hint.user))
             excuse.addhtml("Trying to remove package, not update it")
-            excuse.addhtml("Not considered")
             excuse.addreason("remove")
             self.excuses[excuse.name] = excuse
             return False
@@ -1152,7 +1150,6 @@ class Britney(object):
             return True
         # else if there is something worth doing (but something wrong, too) this package won't be considered
         elif anyworthdoing:
-            excuse.addhtml("Not considered")
             self.excuses[excuse.name] = excuse
 
         # otherwise, return False
@@ -1446,10 +1443,6 @@ class Britney(object):
         # if the package can be updated, it is a valid candidate
         if update_candidate:
             excuse.is_valid = True
-        # else it won't be considered
-        else:
-            # TODO
-            excuse.addhtml("Not considered")
 
         self.excuses[excuse.name] = excuse
         return update_candidate
@@ -1502,7 +1495,6 @@ class Britney(object):
                     p = valid.index(x)
                     invalid.append(valid.pop(p))
                     excuses[x].addhtml("Invalidated by dependency")
-                    excuses[x].addhtml("Not considered")
                     excuses[x].addreason("depends")
                     excuses[x].is_valid = False
             i = i + 1
diff --git a/excuse.py b/excuse.py
index 824dd95..52e4ea4 100644
--- a/excuse.py
+++ b/excuse.py
@@ -161,6 +161,8 @@ class Excuse(object):
                 res += "<li>Ignoring %s depends: <a href=\"#%s\">%s</a>\n" % (a, n, n)
         if self.is_valid:
             res += "<li>Valid candidate\n"
+        else:
+            res += "<li>Not considered\n"
         res = res + "</ul>\n"
         return res
 
-- 
2.8.1

From 728f33cf8863d82bd6cd3ec49176506e7e9e5b6b Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Mon, 28 Mar 2016 06:55:02 +0000
Subject: [PATCH 10/28] excuses.yaml: Distinguish between source and item names

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 excuse.py | 8 +++++++-
 1 file changed, 7 insertions(+), 1 deletion(-)

diff --git a/excuse.py b/excuse.py
index 52e4ea4..e338f06 100644
--- a/excuse.py
+++ b/excuse.py
@@ -197,9 +197,15 @@ class Excuse(object):
 
     def excusedata(self):
         """Render the excuse in as key-value data"""
+        source = self.name
+        if '/' in source:
+            source = source.split("/")[0]
+        if source[0] == '-':
+            source = source[1:]
         excusedata = {}
         excusedata["excuses"] = self._text()
-        excusedata["source"] = self.name
+        excusedata["item-name"] = self.name
+        excusedata["source"] = source
         excusedata["old-version"] = self.ver[0]
         excusedata["new-version"] = self.ver[1]
         if self.maint:
-- 
2.8.1
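
The item-name normalisation in patch 10 can be read as a tiny pure function
(hypothetical name): drop the `/<architecture>` suffix of per-arch items and
the leading `-` that marks removal items.

```python
def source_of(item_name):
    """Map a migration item name to its bare source package name,
    as patch 10 does when emitting excuses.yaml."""
    source = item_name.split("/")[0]  # "pkg/arch" -> "pkg"
    if source.startswith("-"):        # "-pkg" marks a removal item
        source = source[1:]
    return source
```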

From ebb1e9c060bf2ebb157d79b41210f12d2f793c4c Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Mon, 28 Mar 2016 09:37:02 +0000
Subject: [PATCH 11/28] Move missing-builds/cruft to excuses

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 18 ++++++------------
 excuse.py  | 24 ++++++++++++++++++++++++
 2 files changed, 30 insertions(+), 12 deletions(-)

diff --git a/britney.py b/britney.py
index 264dd2a..e160d74 100755
--- a/britney.py
+++ b/britney.py
@@ -1340,12 +1340,10 @@ class Britney(object):
 
                 if arch in self.options.fucked_arches:
                     text = text + " (but %s isn't keeping up, so never mind)" % (arch)
+                    excuse.missing_build_on_ood_arch(arch)
                 else:
                     update_candidate = False
-                    excuse.addreason("arch")
-                    excuse.addreason("arch-%s" % arch)
-                    excuse.addreason("build-arch")
-                    excuse.addreason("build-arch-%s" % arch)
+                    excuse.missing_build_on_arch(arch)
 
                 excuse.addhtml(text)
 
@@ -1372,6 +1370,7 @@ class Britney(object):
                     if pkgsv not in oodbins:
                         oodbins[pkgsv] = []
                     oodbins[pkgsv].append(pkg)
+                    excuse.add_old_binary(pkg, pkgsv)
                     continue
                 else:
                     # if the binary is arch all, it doesn't count as
@@ -1405,22 +1404,17 @@ class Britney(object):
 
                 if arch in self.options.fucked_arches:
                     text = text + " (but %s isn't keeping up, so nevermind)" % (arch)
+                    if not uptodatebins:
+                        excuse.missing_build_on_ood_arch(arch)
                 else:
                     if uptodatebins:
-                        excuse.addreason("cruft-arch")
-                        excuse.addreason("cruft-arch-%s" % arch)
                         if self.options.ignore_cruft:
                             text = text + " (but ignoring cruft, so nevermind)"
                         else:
                             update_candidate = False
-                            excuse.addreason("arch")
-                            excuse.addreason("arch-%s" % arch)
                     else:
                         update_candidate = False
-                        excuse.addreason("arch")
-                        excuse.addreason("arch-%s" % arch)
-                        excuse.addreason("build-arch")
-                        excuse.addreason("build-arch-%s" % arch)
+                        excuse.missing_build_on_arch(arch)
 
                 if 'age' in policy_info and policy_info['age']['current-age']:
                     excuse.addhtml(text)
diff --git a/excuse.py b/excuse.py
index e338f06..e1da7c3 100644
--- a/excuse.py
+++ b/excuse.py
@@ -14,8 +14,10 @@
 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 # GNU General Public License for more details.
 
+from collections import defaultdict
 import re
 
+
 class Excuse(object):
     """Excuse class
     
@@ -59,6 +61,9 @@ class Excuse(object):
         self.oldbugs = set()
         self.reason = {}
         self.htmlline = []
+        self.missing_builds = set()
+        self.missing_builds_ood_arch = set()
+        self.old_binaries = defaultdict(set)
         self.policy_info = {}
 
     def sortkey(self):
@@ -130,6 +135,18 @@ class Excuse(object):
         """Add a note in HTML"""
         self.htmlline.append(note)
 
+    def missing_build_on_arch(self, arch):
+        """Note that the item is missing a build on a given architecture"""
+        self.missing_builds.add(arch)
+
+    def missing_build_on_ood_arch(self, arch):
+        """Note that the item is missing a build on a given "out of date" architecture"""
+        self.missing_builds_ood_arch.add(arch)
+
+    def add_old_binary(self, binary, from_source_version):
+        """Denote that an old binary ("cruft") is available from a previous source version"""
+        self.old_binaries[from_source_version].add(binary)
+
     def html(self):
         """Render the excuse in HTML"""
         res = "<a id=\"%s\" name=\"%s\">%s</a> (%s to %s)\n<ul>\n" % \
@@ -214,6 +231,13 @@ class Excuse(object):
             excusedata['component'] = self.section.split('/')[0]
         if self.policy_info:
             excusedata['policy_info'] = self.policy_info
+        if self.missing_builds or self.missing_builds_ood_arch:
+            excusedata['missing-builds'] = {
+                'on-architectures': sorted(self.missing_builds),
+                'on-unimportant-architectures': sorted(self.missing_builds_ood_arch),
+            }
+        if self.old_binaries:
+            excusedata['old-binaries'] = {x: sorted(self.old_binaries[x]) for x in self.old_binaries}
         if self.forced:
             excusedata["forced-reason"] = sorted(list(self.reason.keys()))
             excusedata["reason"] = []
-- 
2.8.1
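To illustrate the shape of the data this patch adds to excuses.yaml, here is a self-contained sketch of the collection and serialization logic (the class is a trimmed, hypothetical stand-in for Excuse; field and key names follow the diff):

```python
from collections import defaultdict

class ExcuseSketch:
    """Minimal stand-in for the Excuse fields added in this patch."""
    def __init__(self):
        self.missing_builds = set()
        self.missing_builds_ood_arch = set()
        self.old_binaries = defaultdict(set)

    def missing_build_on_arch(self, arch):
        self.missing_builds.add(arch)

    def missing_build_on_ood_arch(self, arch):
        self.missing_builds_ood_arch.add(arch)

    def add_old_binary(self, binary, from_source_version):
        self.old_binaries[from_source_version].add(binary)

    def excusedata(self):
        # Sets are sorted on output so the YAML is stable across runs.
        data = {}
        if self.missing_builds or self.missing_builds_ood_arch:
            data['missing-builds'] = {
                'on-architectures': sorted(self.missing_builds),
                'on-unimportant-architectures': sorted(self.missing_builds_ood_arch),
            }
        if self.old_binaries:
            data['old-binaries'] = {v: sorted(b) for v, b in self.old_binaries.items()}
        return data

e = ExcuseSketch()
e.missing_build_on_arch('mips')
e.missing_build_on_arch('armel')
e.add_old_binary('libfoo1', '1.0-1')
print(e.excusedata())
```

The defaultdict(set) keeps cruft binaries grouped per old source version with duplicates collapsed for free.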

From ff56048cb057866a85a148c15a822d7c4e590d18 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Mon, 28 Mar 2016 10:09:58 +0000
Subject: [PATCH 12/28] Add block/unblock-hint info to excuses.yaml

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 15 +++++++++++----
 excuse.py  | 19 +++++++++++++++++++
 2 files changed, 30 insertions(+), 4 deletions(-)

diff --git a/britney.py b/britney.py
index e160d74..28bf13e 100755
--- a/britney.py
+++ b/britney.py
@@ -1214,12 +1214,18 @@ class Britney(object):
         for hint in self.hints.search(package=src):
             if hint.type == 'block':
                 blocked['block'] = hint
+                excuse.add_hint(hint)
             if hint.type == 'block-udeb':
                 blocked['block-udeb'] = hint
-        for hint in self.hints.search(type='block-all', package='source'):
-            blocked.setdefault('block', hint)
-        if suite in ['pu', 'tpu']:
+                excuse.add_hint(hint)
+        if 'block' not in blocked:
+            for hint in self.hints.search(type='block-all', package='source'):
+                blocked['block'] = hint
+                excuse.add_hint(hint)
+                break
+        if suite in ('pu', 'tpu'):
             blocked['block'] = '%s-block' % (suite)
+            excuse.needs_approval = True
 
         # if the source is blocked, then look for an `unblock' hint; the unblock request
         # is processed only if the specified version is correct. If a package is blocked
@@ -1229,7 +1235,8 @@ class Britney(object):
             unblocks = self.hints.search(unblock_cmd, package=src)
 
             if unblocks and unblocks[0].version is not None and unblocks[0].version == source_u[VERSION]:
-                if suite == 'unstable' or block_cmd == 'block-udeb':
+                excuse.add_hint(unblocks[0])
+                if block_cmd == 'block-udeb' or not excuse.needs_approval:
                     excuse.addhtml("Ignoring %s request by %s, due to %s request by %s" %
                                    (block_cmd, blocked[block_cmd].user, unblock_cmd, unblocks[0].user))
                 else:
diff --git a/excuse.py b/excuse.py
index e1da7c3..0760841 100644
--- a/excuse.py
+++ b/excuse.py
@@ -50,6 +50,8 @@ class Excuse(object):
         self.section = None
         self._is_valid = False
         self._dontinvalidate = False
+        self.needs_approval = False
+        self.hints = []
         self.forced = False
 
         self.invalid_deps = []
@@ -147,6 +149,9 @@ class Excuse(object):
         """Denote that an old binary ("cruft") is available from a previous source version"""
         self.old_binaries[from_source_version].add(binary)
 
+    def add_hint(self, hint):
+        self.hints.append(hint)
+
     def html(self):
         """Render the excuse in HTML"""
         res = "<a id=\"%s\" name=\"%s\">%s</a> (%s to %s)\n<ul>\n" % \
@@ -236,6 +241,20 @@ class Excuse(object):
                 'on-architectures': sorted(self.missing_builds),
                 'on-unimportant-architectures': sorted(self.missing_builds_ood_arch),
             }
+        if self.needs_approval:
+            status = 'not-approved'
+            for h in self.hints:
+                if h.type == 'unblock':
+                    status = 'approved'
+                    break
+            excusedata['manual-approval-status'] = status
+        if self.hints:
+            hint_info = [{
+                             'hint-type': h.type,
+                             'hint-from': h.user,
+                         } for h in self.hints]
+
+            excusedata['hints'] = hint_info
         if self.old_binaries:
             excusedata['old-binaries'] = {x: sorted(self.old_binaries[x]) for x in self.old_binaries}
         if self.forced:
-- 
2.8.1
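The manual-approval logic this patch adds is compact enough to sketch in isolation (the Hint namedtuple and the function name are hypothetical stand-ins for britney's real hint objects):

```python
from collections import namedtuple

# Hypothetical stand-in for britney's hint objects (type + who issued it).
Hint = namedtuple('Hint', ['type', 'user'])

def approval_status(hints, needs_approval):
    """Mirror of the excuses.yaml logic: a (t-)p-u item becomes
    'approved' once any unblock hint is recorded, else 'not-approved'.
    Items that never needed approval get no status at all."""
    if not needs_approval:
        return None
    if any(h.type == 'unblock' for h in hints):
        return 'approved'
    return 'not-approved'

hints = [Hint('block', 'freeze'), Hint('unblock', 'jcristau')]
print(approval_status(hints, needs_approval=True))
```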

From 8449d1add2f328fd565de59d7a48565488eec05b Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Mon, 28 Mar 2016 11:17:25 +0000
Subject: [PATCH 13/28] Remove redundant notes about arch:all packages

In an architecture-only migration, we currently see two notes for
every arch:all package:

 * Ignoring "new" arch:all package
 * Ignoring "removal" of arch:all package

But a closer look shows that the arch:all packages are generally the
same in both testing and the source suite.

This commit removes these notes when the arch:all packages are the
same in both suites.

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 8 ++++++--
 1 file changed, 6 insertions(+), 2 deletions(-)

diff --git a/britney.py b/britney.py
index 28bf13e..cc14a8c 100755
--- a/britney.py
+++ b/britney.py
@@ -1058,7 +1058,9 @@ class Britney(object):
 
             # if the new binary package is architecture-independent, then skip it
             if binary_u[ARCHITECTURE] == 'all':
-                excuse.addhtml("Ignoring %s %s (from %s) as it is arch: all" % (pkg_name, binary_u[VERSION], pkgsv))
+                if pkg_id not in source_t[BINARIES]:
+                    # only add a note if the arch:all does not match the expected version
+                    excuse.addhtml("Ignoring %s %s (from %s) as it is arch: all" % (pkg_name, binary_u[VERSION], pkgsv))
                 continue
 
             # if the new binary package is not from the same source as the testing one, then skip it
@@ -1127,7 +1129,9 @@ class Britney(object):
                     # if the package is architecture-independent, then ignore it
                     tpkg_data = packages_t_a[pkg]
                     if tpkg_data[ARCHITECTURE] == 'all':
-                        excuse.addhtml("Ignoring removal of %s as it is arch: all" % (pkg))
+                        if pkg_id not in source_u[BINARIES]:
+                            # only add a note if the arch:all does not match the expected version
+                            excuse.addhtml("Ignoring removal of %s as it is arch: all" % (pkg))
                         continue
                     # if the package is not produced by the new source package, then remove it from testing
                     if pkg not in packages_s_a:
-- 
2.8.1
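The guard this patch adds can be sketched as a standalone helper (pkg ids are (name, version, arch) tuples, as elsewhere in britney; the function name is hypothetical):

```python
def arch_all_notes(source_t_binaries, new_arch_all_pkgs):
    """Emit an 'ignoring' note only for arch:all packages whose
    (name, version, arch) id is not already present in testing;
    identical arch:all binaries in both suites stay silent."""
    notes = []
    for pkg_id in new_arch_all_pkgs:
        if pkg_id not in source_t_binaries:
            name, version, _ = pkg_id
            notes.append("Ignoring %s %s as it is arch: all" % (name, version))
    return notes

testing = {('foo-doc', '1.0-1', 'all')}
incoming = [('foo-doc', '1.0-1', 'all'), ('foo-doc', '1.0-2', 'all')]
print(arch_all_notes(testing, incoming))
```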

From 6f835e08b0c1b3f2f3f10c9109564d62f1c49e59 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Wed, 30 Mar 2016 19:21:25 +0000
Subject: [PATCH 14/28] excuse: present dependency info in a machine parseable
 way

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 excuse.py | 23 +++++++++++------------
 1 file changed, 11 insertions(+), 12 deletions(-)

diff --git a/excuse.py b/excuse.py
index 0760841..6eb59af 100644
--- a/excuse.py
+++ b/excuse.py
@@ -203,18 +203,6 @@ class Excuse(object):
         res = []
         for x in self.htmlline:
             res.append("" + x + "")
-        lastdep = ""
-        for x in sorted(self.deps, key=lambda x: x.split('/')[0]):
-            dep = x.split('/')[0]
-            if dep == lastdep: continue
-            lastdep = dep
-            if x in self.invalid_deps:
-                res.append("Depends: %s %s (not considered)" % (self.name, dep))
-            else:
-                res.append("Depends: %s %s" % (self.name, dep))
-        for (n,a) in self.break_deps:
-            if n not in self.deps:
-                res.append("Ignoring %s depends: %s" % (a, n))
         return res
 
     def excusedata(self):
@@ -241,6 +229,17 @@ class Excuse(object):
                 'on-architectures': sorted(self.missing_builds),
                 'on-unimportant-architectures': sorted(self.missing_builds_ood_arch),
             }
+        if self.deps or self.invalid_deps or self.break_deps:
+            excusedata['dependencies'] = dep_data = {}
+            migrate_after = sorted(x for x in self.deps if x not in self.invalid_deps)
+            break_deps = [x for x, _ in self.break_deps if x not in self.deps]
+
+            if self.invalid_deps:
+                dep_data['blocked-by'] = sorted(self.invalid_deps)
+            if migrate_after:
+                dep_data['migrate-after'] = migrate_after
+            if break_deps:
+                dep_data['unimportant-dependencies'] = sorted(break_deps)
         if self.needs_approval:
             status = 'not-approved'
             for h in self.hints:
-- 
2.8.1
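The new 'dependencies' mapping can be reproduced in isolation (the function name is a hypothetical stand-in; the input shapes follow the diff: deps/invalid_deps hold source names, break_deps holds (name, arch) pairs):

```python
def dependency_data(deps, invalid_deps, break_deps):
    """Build the 'dependencies' mapping emitted in excuses.yaml."""
    dep_data = {}
    # Valid (still pending) dependencies migrate together with this item.
    migrate_after = sorted(x for x in deps if x not in invalid_deps)
    # Broken deps not tracked as regular deps are demoted to "unimportant".
    unimportant = [x for x, _ in break_deps if x not in deps]
    if invalid_deps:
        dep_data['blocked-by'] = sorted(invalid_deps)
    if migrate_after:
        dep_data['migrate-after'] = migrate_after
    if unimportant:
        dep_data['unimportant-dependencies'] = sorted(unimportant)
    return dep_data

print(dependency_data(['glibc', 'gcc-5'], ['gcc-5'], [('old-lib', 'armel')]))
```

Empty keys are omitted entirely, so consumers only see the categories that apply to a given excuse.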

From faef597249f137c0fd464caec6bc61da9dfa96cb Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Wed, 6 Apr 2016 20:17:27 +0000
Subject: [PATCH 15/28] read_pkgs: Refactor source+src_version parsing

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 29 ++++++++++++++++-------------
 1 file changed, 16 insertions(+), 13 deletions(-)

diff --git a/britney.py b/britney.py
index cc14a8c..d65cbea 100755
--- a/britney.py
+++ b/britney.py
@@ -668,10 +668,20 @@ class Britney(object):
             breaks = get_field('Breaks')
             if breaks:
                 final_conflicts_list.append(breaks)
+
+            source = pkg
+            source_version = version
+            # retrieve the name and the version of the source package
+            source_raw = get_field('Source')
+            if source_raw:
+                source = intern(source_raw.split(" ")[0])
+                if "(" in source_raw:
+                    source_version = intern(source_raw[source_raw.find("(")+1:source_raw.find(")")])
+
             dpkg = [version,
                     intern(get_field('Section')),
-                    pkg,
-                    version,
+                    source,
+                    source_version,
                     intern(get_field('Architecture')),
                     get_field('Multi-Arch'),
                     deps,
@@ -680,26 +690,19 @@ class Britney(object):
                     ess,
                    ]
 
-            # retrieve the name and the version of the source package
-            source = get_field('Source')
-            if source:
-                dpkg[SOURCE] = intern(source.split(" ")[0])
-                if "(" in source:
-                    dpkg[SOURCEVER] = intern(source[source.find("(")+1:source.find(")")])
-
             # if the source package is available in the distribution, then register this binary package
-            if dpkg[SOURCE] in srcdist:
+            if source in srcdist:
                 # There may be multiple versions of any arch:all packages
                 # (in unstable) if some architectures have out-of-date
                 # binaries.  We only want to include the package in the
                 # source -> binary mapping once. It doesn't matter which
                 # of the versions we include as only the package name and
                 # architecture are recorded.
-                if pkg_id not in srcdist[dpkg[SOURCE]][BINARIES]:
-                    srcdist[dpkg[SOURCE]][BINARIES].append(pkg_id)
+                if pkg_id not in srcdist[source][BINARIES]:
+                    srcdist[source][BINARIES].append(pkg_id)
             # if the source package doesn't exist, create a fake one
             else:
-                srcdist[dpkg[SOURCE]] = [dpkg[SOURCEVER], 'faux', [pkg_id], None, True]
+                srcdist[source] = [source_version, 'faux', [pkg_id], None, True]
 
             if dpkg[PROVIDES]:
                 parts = apt_pkg.parse_depends(dpkg[PROVIDES], False)
-- 
2.8.1

From 995e99d6a369fb8f48a3875373757e806d2af706 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Wed, 6 Apr 2016 20:20:38 +0000
Subject: [PATCH 16/28] read_pkgs: Parse Provides a bit earlier

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 47 ++++++++++++++++++++++++-----------------------
 1 file changed, 24 insertions(+), 23 deletions(-)

diff --git a/britney.py b/britney.py
index d65cbea..04948f3 100755
--- a/britney.py
+++ b/britney.py
@@ -678,6 +678,29 @@ class Britney(object):
                 if "(" in source_raw:
                     source_version = intern(source_raw[source_raw.find("(")+1:source_raw.find(")")])
 
+            provides_raw = get_field('Provides')
+            if provides_raw:
+                parts = apt_pkg.parse_depends(provides_raw, False)
+                nprov = []
+                for or_clause in parts:
+                    if len(or_clause) != 1:
+                        msg = "Ignoring invalid provides in %s: Alternatives [%s]" % (str(pkg_id), str(or_clause))
+                        self.log(msg, type='W')
+                        continue
+                    for part in or_clause:
+                        provided, provided_version, op = part
+                        if op != '' and op != '=':
+                            msg = "Ignoring invalid provides in %s: %s (%s %s)" % (str(pkg_id), provided, op, version)
+                            self.log(msg, type='W')
+                            continue
+                        provided = intern(provided)
+                        provided_version = intern(provided_version)
+                        part = (provided, provided_version, intern(op))
+                        nprov.append(part)
+                provides = nprov
+            else:
+                provides = []
+
             dpkg = [version,
                     intern(get_field('Section')),
                     source,
@@ -686,7 +709,7 @@ class Britney(object):
                     get_field('Multi-Arch'),
                     deps,
                     ', '.join(final_conflicts_list) or None,
-                    get_field('Provides'),
+                    provides,
                     ess,
                    ]
 
@@ -704,28 +727,6 @@ class Britney(object):
             else:
                 srcdist[source] = [source_version, 'faux', [pkg_id], None, True]
 
-            if dpkg[PROVIDES]:
-                parts = apt_pkg.parse_depends(dpkg[PROVIDES], False)
-                nprov = []
-                for or_clause in parts:
-                    if len(or_clause) != 1:
-                        msg = "Ignoring invalid provides in %s: Alternatives [%s]" % (str(pkg_id), str(or_clause))
-                        self.log(msg, type='W')
-                        continue
-                    for part in or_clause:
-                        provided, provided_version, op = part
-                        if op != '' and op != '=':
-                            msg = "Ignoring invalid provides in %s: %s (%s %s)" % (str(pkg_id), provided, op, version)
-                            self.log(msg, type='W')
-                            continue
-                        provided = intern(provided)
-                        provided_version = intern(provided_version)
-                        part = (provided, provided_version, intern(op))
-                        nprov.append(part)
-                dpkg[PROVIDES] = nprov
-            else:
-                dpkg[PROVIDES] = []
-
             # add the resulting dictionary to the package list
             packages[pkg] = dpkg
             if pkg_id in all_binaries:
-- 
2.8.1

From 59272ee95b88509adfe82e411d1015242d535348 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Wed, 6 Apr 2016 20:25:03 +0000
Subject: [PATCH 17/28] Re-order parsing of suites to avoid changing binary
 pkgs

In the next commit, the binary packages will be turned into
namedtuples and will become immutable objects.

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 7 ++++---
 1 file changed, 4 insertions(+), 3 deletions(-)

diff --git a/britney.py b/britney.py
index 04948f3..4329115 100755
--- a/britney.py
+++ b/britney.py
@@ -291,7 +291,6 @@ class Britney(object):
         self.binaries['pu'] = {}
 
         for arch in self.options.architectures:
-            self.binaries['testing'][arch] = self.read_binaries(self.options.testing, "testing", arch)
             self.binaries['unstable'][arch] = self.read_binaries(self.options.unstable, "unstable", arch)
             self.binaries['tpu'][arch] = self.read_binaries(self.options.tpu, "tpu", arch)
             if hasattr(self.options, 'pu'):
@@ -301,6 +300,9 @@ class Britney(object):
                 # properly initialised, so insert two empty dicts
                 # here.
                 self.binaries['pu'][arch] = ({}, {})
+            # Load testing last as some live-data tests have more complete information in
+            # unstable
+            self.binaries['testing'][arch] = self.read_binaries(self.options.testing, "testing", arch)
 
         self.log("Compiling Installability tester", type="I")
         self._build_installability_tester(self.options.architectures)
@@ -351,8 +353,7 @@ class Britney(object):
             raise ValueError("Invalid data set")
 
         # Merge ESSENTIAL if necessary
-        if pkg_entry2[ESSENTIAL]:
-            pkg_entry1[ESSENTIAL] = True
+        assert pkg_entry1[ESSENTIAL] or not pkg_entry2[ESSENTIAL]
 
     def __parse_arguments(self):
         """Parse the command line arguments
-- 
2.8.1

From a914847c4d47aa72b062c26f69cca7a02f85fab1 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Wed, 6 Apr 2016 20:30:03 +0000
Subject: [PATCH 18/28] Make binary packages a namedtuple

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 19 ++++++++++++++++---
 1 file changed, 16 insertions(+), 3 deletions(-)

diff --git a/britney.py b/britney.py
index 4329115..74326be 100755
--- a/britney.py
+++ b/britney.py
@@ -188,7 +188,7 @@ import optparse
 
 import apt_pkg
 
-from collections import defaultdict
+from collections import defaultdict, namedtuple
 from functools import reduce
 from itertools import chain, product
 from operator import attrgetter
@@ -227,6 +227,19 @@ check_field_name = dict((globals()[fn], fn) for fn in
 
 check_fields = sorted(check_field_name)
 
+BinaryPackage = namedtuple('BinaryPackage', [
+                               'version',
+                               'section',
+                               'source',
+                               'source_version',
+                               'architecture',
+                               'multi_arch',
+                               'depends',
+                               'conflicts',
+                               'provides',
+                               'is_essential',
+                           ])
+
 class Britney(object):
     """Britney, the Debian testing updater script
     
@@ -702,7 +715,7 @@ class Britney(object):
             else:
                 provides = []
 
-            dpkg = [version,
+            dpkg = BinaryPackage(version,
                     intern(get_field('Section')),
                     source,
                     source_version,
@@ -712,7 +725,7 @@ class Britney(object):
                     ', '.join(final_conflicts_list) or None,
                     provides,
                     ess,
-                   ]
+                   )
 
             # if the source package is available in the distribution, then register this binary package
             if source in srcdist:
-- 
2.8.1

From 0c848ec58257d2848a88a926a2b47e6ca92d04e6 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Wed, 6 Apr 2016 20:49:40 +0000
Subject: [PATCH 19/28] Prefer pkg.foo to pkg[FOO] for binary packages

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py      | 97 ++++++++++++++++++++++++++++-----------------------------
 britney_util.py | 42 ++++++++++++-------------
 2 files changed, 69 insertions(+), 70 deletions(-)

diff --git a/britney.py b/britney.py
index 74326be..dcee419 100755
--- a/britney.py
+++ b/britney.py
@@ -208,8 +208,8 @@ from britney_util import (old_libraries_format, undo_changes,
                           clone_nuninst, check_installability)
 from policies.policy import AgePolicy, RCBugPolicy, PolicyVerdict
 from consts import (VERSION, SECTION, BINARIES, MAINTAINER, FAKESRC,
-                   SOURCE, SOURCEVER, ARCHITECTURE, DEPENDS, CONFLICTS,
-                   PROVIDES, MULTIARCH, ESSENTIAL)
+                   SOURCE, SOURCEVER, ARCHITECTURE, CONFLICTS, DEPENDS,
+                   PROVIDES, MULTIARCH)
 
 __author__ = 'Fabio Tranchitella and the Debian Release Team'
 __version__ = '2.0'
@@ -360,13 +360,13 @@ class Britney(object):
 
         if bad:
             self.log("Mismatch found %s %s %s differs" % (
-                package, pkg_entry1[VERSION], parch), type="E")
+                package, pkg_entry1.version, parch), type="E")
             for f, v1, v2 in bad:
                 self.log(" ... %s %s != %s" % (check_field_name[f], v1, v2))
             raise ValueError("Invalid data set")
 
         # Merge ESSENTIAL if necessary
-        assert pkg_entry1[ESSENTIAL] or not pkg_entry2[ESSENTIAL]
+        assert pkg_entry1.is_essential or not pkg_entry2.is_essential
 
     def __parse_arguments(self):
         """Parse the command line arguments
@@ -487,10 +487,9 @@ class Britney(object):
             testing = (dist == 'testing')
             for pkgname in binaries[dist][arch][0]:
                 pkgdata = binaries[dist][arch][0][pkgname]
-                version = pkgdata[VERSION]
+                version = pkgdata.version
                 t = (pkgname, version, arch)
-                essential = pkgdata[ESSENTIAL]
-                if not builder.add_binary(t, essential=essential,
+                if not builder.add_binary(t, essential=pkgdata.is_essential,
                                           in_testing=testing):
                     continue
 
@@ -499,11 +498,11 @@ class Britney(object):
                 possible_dep_ranges = {}
 
                 # We do not differentiate between depends and pre-depends
-                if pkgdata[DEPENDS]:
-                    depends.extend(apt_pkg.parse_depends(pkgdata[DEPENDS], False))
+                if pkgdata.depends:
+                    depends.extend(apt_pkg.parse_depends(pkgdata.depends, False))
 
-                if pkgdata[CONFLICTS]:
-                    conflicts = apt_pkg.parse_depends(pkgdata[CONFLICTS], False)
+                if pkgdata.conflicts:
+                    conflicts = apt_pkg.parse_depends(pkgdata.conflicts, False)
 
                 with builder.relation_builder(t) as relations:
 
@@ -521,7 +520,7 @@ class Britney(object):
                                     # the package name extracted from the field and it is therefore
                                     # not interned.
                                     pdata = dep_packages_s_a[0][p]
-                                    pt = (sys.intern(p), pdata[VERSION], arch)
+                                    pt = (sys.intern(p), pdata.version, arch)
                                     if dep:
                                         sat.add(pt)
                                     elif t != pt:
@@ -905,8 +904,8 @@ class Britney(object):
                 package = binaries_s_a[name]
                 # check the versioned dependency and architecture qualifier
                 # (if present)
-                if (op == '' and version == '') or apt_pkg.check_dep(package[VERSION], op, version):
-                    if archqual is None or (archqual == 'any' and package[MULTIARCH] == 'allowed'):
+                if (op == '' and version == '') or apt_pkg.check_dep(package.version, op, version):
+                    if archqual is None or (archqual == 'any' and package.multi_arch == 'allowed'):
                         packages.append(name)
 
             # look for the package in the virtual packages list and loop on them
@@ -946,9 +945,9 @@ class Britney(object):
         get_dependency_solvers = self.get_dependency_solvers
 
         # analyze the dependency fields (if present)
-        if not binary_u[DEPENDS]:
+        deps = binary_u.depends
+        if not deps:
             return
-        deps = binary_u[DEPENDS]
 
         # for every dependency block (formed as conjunction of disjunction)
         for block, block_txt in zip(parse_depends(deps, False), deps.split(',')):
@@ -958,12 +957,12 @@ class Britney(object):
                 for p in packages:
                     if p not in package_s_a[0]:
                         continue
-                    excuse.add_sane_dep(package_s_a[0][p][SOURCE])
+                    excuse.add_sane_dep(package_s_a[0][p].source)
                 continue
 
             # check if the block can be satisfied in the source suite, and list the solving packages
             packages = get_dependency_solvers(block, package_s_a)
-            packages = [package_s_a[0][p][SOURCE] for p in packages]
+            packages = [package_s_a[0][p].source for p in packages]
 
             # if the dependency can be satisfied by the same source package, skip the block:
             # obviously both binary packages will enter testing together
@@ -1072,20 +1071,20 @@ class Britney(object):
             binary_u = packages_s_a[pkg_name]
 
             # this is the source version for the new binary package
-            pkgsv = binary_u[SOURCEVER]
+            pkgsv = binary_u.source_version
 
             # if the new binary package is architecture-independent, then skip it
-            if binary_u[ARCHITECTURE] == 'all':
+            if binary_u.architecture == 'all':
                 if pkg_id not in source_t[BINARIES]:
                     # only add a note if the arch:all does not match the expected version
-                    excuse.addhtml("Ignoring %s %s (from %s) as it is arch: all" % (pkg_name, binary_u[VERSION], pkgsv))
+                    excuse.addhtml("Ignoring %s %s (from %s) as it is arch: all" % (pkg_name, binary_u.version, pkgsv))
                 continue
 
             # if the new binary package is not from the same source as the testing one, then skip it
             # this implies that this binary migration is part of a source migration
             if source_u[VERSION] == pkgsv and source_t[VERSION] != pkgsv:
                 anywrongver = True
-                excuse.addhtml("From wrong source: %s %s (%s not %s)" % (pkg_name, binary_u[VERSION], pkgsv, source_t[VERSION]))
+                excuse.addhtml("From wrong source: %s %s (%s not %s)" % (pkg_name, binary_u.version, pkgsv, source_t[VERSION]))
                 continue
 
             # cruft in unstable
@@ -1101,7 +1100,7 @@ class Britney(object):
             # (the binaries are now out-of-date)
             if source_t[VERSION] == pkgsv and source_t[VERSION] != source_u[VERSION]:
                 anywrongver = True
-                excuse.addhtml("From wrong source: %s %s (%s not %s)" % (pkg_name, binary_u[VERSION], pkgsv, source_u[VERSION]))
+                excuse.addhtml("From wrong source: %s %s (%s not %s)" % (pkg_name, binary_u.version, pkgsv, source_u[VERSION]))
                 continue
 
             # find unsatisfied dependencies for the new binary package
@@ -1110,22 +1109,22 @@ class Britney(object):
             # if the binary is not present in testing, then it is a new binary;
             # in this case, there is something worth doing
             if not binary_t:
-                excuse.addhtml("New binary: %s (%s)" % (pkg_name, binary_u[VERSION]))
+                excuse.addhtml("New binary: %s (%s)" % (pkg_name, binary_u.version))
                 anyworthdoing = True
                 continue
 
             # at this point, the binary package is present in testing, so we can compare
             # the versions of the packages ...
-            vcompare = apt_pkg.version_compare(binary_t[VERSION], binary_u[VERSION])
+            vcompare = apt_pkg.version_compare(binary_t.version, binary_u.version)
 
             # ... if updating would mean downgrading, then stop here: there is something wrong
             if vcompare > 0:
                 anywrongver = True
-                excuse.addhtml("Not downgrading: %s (%s to %s)" % (pkg_name, binary_t[VERSION], binary_u[VERSION]))
+                excuse.addhtml("Not downgrading: %s (%s to %s)" % (pkg_name, binary_t.version, binary_u.version))
                 break
             # ... if updating would mean upgrading, then there is something worth doing
             elif vcompare < 0:
-                excuse.addhtml("Updated binary: %s (%s to %s)" % (pkg_name, binary_t[VERSION], binary_u[VERSION]))
+                excuse.addhtml("Updated binary: %s (%s to %s)" % (pkg_name, binary_t.version, binary_u.version))
                 anyworthdoing = True
 
         # if there is nothing wrong and there is something worth doing or the source
@@ -1146,14 +1145,14 @@ class Britney(object):
                     pkg = pkg_id[0]
                     # if the package is architecture-independent, then ignore it
                     tpkg_data = packages_t_a[pkg]
-                    if tpkg_data[ARCHITECTURE] == 'all':
+                    if tpkg_data.architecture == 'all':
                         if pkg_id not in source_u[BINARIES]:
                             # only add a note if the arch:all does not match the expected version
                             excuse.addhtml("Ignoring removal of %s as it is arch: all" % (pkg))
                         continue
                     # if the package is not produced by the new source package, then remove it from testing
                     if pkg not in packages_s_a:
-                        excuse.addhtml("Removed binary: %s %s" % (pkg, tpkg_data[VERSION]))
+                        excuse.addhtml("Removed binary: %s %s" % (pkg, tpkg_data.version))
                         # the removed binary is only interesting if this is a binary-only migration,
                         # as otherwise the updated source will already cause the binary packages
                         # to be updated
@@ -1349,7 +1348,7 @@ class Britney(object):
                 # if the package in testing has no binaries on this
                 # architecture, it can't be out-of-date
                 if not any(x for x in source_t[BINARIES]
-                           if x[2] == arch and all_binaries[x][ARCHITECTURE] != 'all'):
+                           if x[2] == arch and all_binaries[x].architecture != 'all'):
                     continue
 
                 # if the (t-)p-u package has produced any binaries on
@@ -1357,8 +1356,8 @@ class Britney(object):
                 # uploads to (t-)p-u which intentionally drop binary
                 # packages
                 if any(x for x in self.binaries[suite][arch][0].values() \
-                          if x[SOURCE] == src and x[SOURCEVER] == source_u[VERSION] and \
-                             x[ARCHITECTURE] != 'all'):
+                          if x.source == src and x.source_version == source_u[VERSION] and \
+                             x.architecture != 'all'):
                     continue
 
                 if suite == 'tpu':
@@ -1390,7 +1389,7 @@ class Britney(object):
 
                 # retrieve the binary package and its source version
                 binary_u = all_binaries[pkg_id]
-                pkgsv = binary_u[SOURCEVER]
+                pkgsv = binary_u.source_version
 
                 # if it wasn't built by the same source, it is out-of-date
                 # if there is at least one binary on this arch which is
@@ -1404,12 +1403,12 @@ class Britney(object):
                 else:
                     # if the binary is arch all, it doesn't count as
                     # up-to-date for this arch
-                    if binary_u[ARCHITECTURE] == arch:
+                    if binary_u.architecture == arch:
                         uptodatebins = True
 
                 # if the package is architecture-dependent or the current arch is `nobreakall'
                 # find unsatisfied dependencies for the binary package
-                if binary_u[ARCHITECTURE] != 'all' or arch in self.options.nobreakall_arches:
+                if binary_u.architecture != 'all' or arch in self.options.nobreakall_arches:
                     self.excuse_unsat_deps(pkg, src, arch, suite, excuse)
 
             # if there are out-of-date packages, warn about them in the excuse and set update_candidate
@@ -1692,7 +1691,7 @@ class Britney(object):
             nuninst[arch] = set()
             for pkg_name in binaries[arch][0]:
                 pkgdata = binaries[arch][0][pkg_name]
-                pkg_id = (pkg_name, pkgdata[VERSION], arch)
+                pkg_id = (pkg_name, pkgdata.version, arch)
                 r = inst_tester.is_installable(pkg_id)
                 if not r:
                     nuninst[arch].add(pkg_name)
@@ -1702,7 +1701,7 @@ class Britney(object):
             if not check_archall:
                 for pkg in nuninst[arch + "+all"]:
                     bpkg = binaries[arch][0][pkg]
-                    if bpkg[ARCHITECTURE] == 'all':
+                    if bpkg.architecture == 'all':
                         nuninst[arch].remove(pkg)
 
         # return the dictionary with the results
@@ -1816,7 +1815,7 @@ class Britney(object):
                         continue
 
                     # Do not include hijacked binaries
-                    if binaries_t[parch][0][binary][SOURCE] != source_name:
+                    if binaries_t[parch][0][binary].source != source_name:
                         continue
                     bins.append(pkg_id)
 
@@ -1826,7 +1825,7 @@ class Britney(object):
                     if allow_smooth_updates and suite == 'unstable' and \
                        binary not in self.binaries[suite][parch][0] and \
                        ('ALL' in self.options.smooth_updates or \
-                        binaries_t[parch][0][binary][SECTION] in self.options.smooth_updates):
+                        binaries_t[parch][0][binary].section in self.options.smooth_updates):
 
                         # if the package has reverse-dependencies which are
                         # built from other sources, it's a valid candidate for
@@ -1878,7 +1877,7 @@ class Britney(object):
                     # migration so will end up missing from testing
                     if migration_architecture != 'source' and \
                          suite != 'unstable' and \
-                         binaries_t[parch][0][binary][ARCHITECTURE] == 'all':
+                         binaries_t[parch][0][binary].architecture == 'all':
                         continue
                     else:
                         rms.add(pkg_id)
@@ -1886,7 +1885,7 @@ class Britney(object):
         # single binary removal; used for clearing up after smooth
         # updates but not supported as a manual hint
         elif source_name in binaries_t[migration_architecture][0]:
-            version = binaries_t[migration_architecture][0][source_name][VERSION]
+            version = binaries_t[migration_architecture][0][source_name].version
             rms.add((source_name, version, migration_architecture))
 
         # add the new binary packages (if we are not removing)
@@ -1897,7 +1896,7 @@ class Britney(object):
                 if migration_architecture not in ['source', parch]:
                     continue
 
-                if self.binaries[suite][parch][0][binary][SOURCE] != source_name:
+                if self.binaries[suite][parch][0][binary].source != source_name:
                     # This binary package has been hijacked by some other source.
                     # So don't add it as part of this update.
                     #
@@ -1911,7 +1910,7 @@ class Britney(object):
 
                 # Don't add the binary if it is old cruft that is no longer in testing
                 if (parch not in self.options.fucked_arches and
-                    source_data[VERSION] != self.binaries[suite][parch][0][binary][SOURCEVER] and
+                    source_data[VERSION] != self.binaries[suite][parch][0][binary].source_version and
                     binary not in binaries_t[parch][0]):
                     continue
 
@@ -1992,7 +1991,7 @@ class Britney(object):
                         affected.update(inst_tester.negative_dependencies_of(rm_pkg_id))
 
                     # remove the provided virtual packages
-                    for j, prov_version, _ in pkg_data[PROVIDES]:
+                    for j, prov_version, _ in pkg_data.provides:
                         key = (j, parch)
                         if key not in undo['virtual']:
                             undo['virtual'][key] = provides_t_a[j].copy()
@@ -2014,7 +2013,7 @@ class Britney(object):
         # updates but not supported as a manual hint
         elif item.package in packages_t[item.architecture][0]:
             binaries_t_a = packages_t[item.architecture][0]
-            version = binaries_t_a[item.package][VERSION]
+            version = binaries_t_a[item.package].version
             pkg_id = (item.package, version, item.architecture)
             undo['binaries'][(item.package, item.architecture)] = pkg_id
             affected.update(inst_tester.reverse_dependencies_of(pkg_id))
@@ -2040,7 +2039,7 @@ class Britney(object):
                 # all of its reverse dependencies as affected
                 if binary in binaries_t_a:
                     old_pkg_data = binaries_t_a[binary]
-                    old_version = old_pkg_data[VERSION]
+                    old_version = old_pkg_data.version
                     old_pkg_id = (binary, old_version, parch)
                     # save the old binary package
                     undo['binaries'][key] = old_pkg_id
@@ -2070,7 +2069,7 @@ class Britney(object):
                 binaries_t_a[binary] = new_pkg_data
                 inst_tester.add_testing_binary(updated_pkg_id)
                 # register new provided packages
-                for j, prov_version, _ in new_pkg_data[PROVIDES]:
+                for j, prov_version, _ in new_pkg_data.provides:
                     key = (j, parch)
                     if j not in provides_t_a:
                         undo['nvirtual'].append(key)
@@ -2459,7 +2458,7 @@ class Britney(object):
         # local copies for performance
         sources = self.sources['testing']
         binaries = self.binaries['testing']
-        used = set(binaries[arch][0][binary][SOURCE]
+        used = set(binaries[arch][0][binary].source
                      for arch in binaries
                      for binary in binaries[arch][0]
                   )
@@ -2721,7 +2720,7 @@ class Britney(object):
         all = defaultdict(set)
         for p in nuninst[arch]:
             pkg = self.binaries['testing'][arch][0][p]
-            all[(pkg[SOURCE], pkg[SOURCEVER])].add(p)
+            all[(pkg.source, pkg.source_version)].add(p)
 
         print('* %s' % arch)
 
diff --git a/britney_util.py b/britney_util.py
index c7d0265..d58b360 100644
--- a/britney_util.py
+++ b/britney_util.py
@@ -32,7 +32,7 @@ from migrationitem import MigrationItem, UnversionnedMigrationItem
 
 from consts import (VERSION, BINARIES, PROVIDES, DEPENDS, CONFLICTS,
                     ARCHITECTURE, SECTION,
-                    SOURCE, SOURCEVER, MAINTAINER, MULTIARCH,
+                    SOURCE, MAINTAINER, MULTIARCH,
                     ESSENTIAL)
 
 
@@ -132,7 +132,7 @@ def undo_changes(lundo, inst_tester, sources, binaries, all_binary_packages,
                     except KeyError:
                         # If this happens, pkg_id must be a cruft item that
                         # was *not* migrated.
-                        assert source_data[VERSION] != all_binary_packages[pkg_id][VERSION]
+                        assert source_data[VERSION] != all_binary_packages[pkg_id].version
                         assert not inst_tester.any_of_these_are_in_testing((pkg_id,))
                     inst_tester.remove_testing_binary(pkg_id)
 
@@ -143,17 +143,17 @@ def undo_changes(lundo, inst_tester, sources, binaries, all_binary_packages,
         for p in undo['binaries']:
             binary, arch = p
             if binary[0] == "-":
-                version = binaries["testing"][arch][0][binary][VERSION]
+                version = binaries["testing"][arch][0][binary].version
                 del binaries['testing'][arch][0][binary[1:]]
                 inst_tester.remove_testing_binary(binary, version, arch)
             else:
                 binaries_t_a = binaries['testing'][arch][0]
                 if p in binaries_t_a:
                     rmpkgdata = binaries_t_a[p]
-                    inst_tester.remove_testing_binary((binary, rmpkgdata[VERSION], arch))
+                    inst_tester.remove_testing_binary((binary, rmpkgdata.version, arch))
                 pkgdata = all_binary_packages[undo['binaries'][p]]
                 binaries_t_a[binary] = pkgdata
-                inst_tester.add_testing_binary((binary, pkgdata[VERSION], arch))
+                inst_tester.add_testing_binary((binary, pkgdata.version, arch))
 
     # STEP 4
     # undo all changes to virtual packages
@@ -266,7 +266,7 @@ def eval_uninst(architectures, nuninst):
 
 def write_heidi(filename, sources_t, packages_t,
                 VERSION=VERSION, SECTION=SECTION,
-                ARCHITECTURE=ARCHITECTURE, sorted=sorted):
+                sorted=sorted):
     """Write the output HeidiResult
 
     This method write the output for Heidi, which contains all the
@@ -289,11 +289,11 @@ def write_heidi(filename, sources_t, packages_t,
             binaries = packages_t[arch][0]
             for pkg_name in sorted(binaries):
                 pkg = binaries[pkg_name]
-                pkgv = pkg[VERSION]
-                pkgarch = pkg[ARCHITECTURE] or 'all'
-                pkgsec = pkg[SECTION] or 'faux'
-                if pkg[SOURCEVER] and pkgarch == 'all' and \
-                    pkg[SOURCEVER] != sources_t[pkg[SOURCE]][VERSION]:
+                pkgv = pkg.version
+                pkgarch = pkg.architecture or 'all'
+                pkgsec = pkg.section or 'faux'
+                if pkg.source_version and pkgarch == 'all' and \
+                    pkg.source_version != sources_t[pkg.source][VERSION]:
                     # when architectures are marked as "fucked", their binary
                     # versions may be lower than those of the associated
                     # source package in testing. the binary package list for
@@ -439,17 +439,17 @@ def write_controlfiles(sources, packages, suite, basedir):
                     if not bin_data[key]:
                         continue
                     if key == SOURCE:
-                        src = bin_data[SOURCE]
+                        src = bin_data.source
                         if sources_s[src][MAINTAINER]:
                             output += ("Maintainer: " + sources_s[src][MAINTAINER] + "\n")
 
-                        if bin_data[SOURCE] == pkg:
-                            if bin_data[SOURCEVER] != bin_data[VERSION]:
-                                source = src + " (" + bin_data[SOURCEVER] + ")"
+                        if src == pkg:
+                            if bin_data.source_version != bin_data.version:
+                                source = src + " (" + bin_data.source_version + ")"
                             else: continue
                         else:
-                            if bin_data[SOURCEVER] != bin_data[VERSION]:
-                                source = src + " (" + bin_data[SOURCEVER] + ")"
+                            if bin_data.source_version != bin_data.version:
+                                source = src + " (" + bin_data.source_version + ")"
                             else:
                                 source = src
                         output += (k + ": " + source + "\n")
@@ -483,9 +483,9 @@ def old_libraries(sources, packages, fucked_arches=frozenset()):
     for arch in testing:
         for pkg_name in testing[arch][0]:
             pkg = testing[arch][0][pkg_name]
-            if sources_t[pkg[SOURCE]][VERSION] != pkg[SOURCEVER] and \
+            if sources_t[pkg.source][VERSION] != pkg.source_version and \
                 (arch not in fucked_arches or pkg_name not in unstable[arch][0]):
-                migration = "-" + "/".join((pkg_name, arch, pkg[SOURCEVER]))
+                migration = "-" + "/".join((pkg_name, arch, pkg.source_version))
                 removals.append(MigrationItem(migration))
     return removals
 
@@ -562,10 +562,10 @@ def check_installability(inst_tester, binaries, arch, affected, check_archall, n
         if name not in packages_t_a:
             continue
         pkgdata = packages_t_a[name]
-        if version != pkgdata[VERSION]:
+        if version != pkgdata.version:
             # Not the version in testing right now, ignore
             continue
-        actual_arch = pkgdata[ARCHITECTURE]
+        actual_arch = pkgdata.architecture
         nuninst_arch = None
         # only check arch:all packages if requested
         if check_archall or actual_arch != 'all':
-- 
2.8.1

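The attribute accesses introduced above (`.version`, `.architecture`, `.source`, `.source_version`, `.section`, `.provides`) rely on binary package entries now being namedtuples instead of index-addressed lists. A small standalone sketch of the idea; the field list here is inferred from the accesses in the diff, not taken from the actual class definition, which is outside this excerpt:

```python
from collections import namedtuple

# Hypothetical reconstruction for illustration only: field names are
# inferred from the attribute accesses in the diff, not from the real
# definition in the branch.
BinPackage = namedtuple('BinPackage', [
    'version', 'section', 'source', 'source_version',
    'architecture', 'multi_arch', 'depends', 'conflicts',
    'provides', 'essential',
])

pkg = BinPackage(version='1.0-1', section='libs', source='foo',
                 source_version='1.0-1', architecture='amd64',
                 multi_arch='no', depends='libc6 (>= 2.19)',
                 conflicts=None, provides=[], essential=False)

# Readable attribute access replaces index lookups such as pkg[VERSION],
# and the tuple is immutable, so shared package data cannot be mutated
# by accident during a migration trial.
arch_all = pkg.architecture == 'all'
```

This is why constants like SOURCEVER and ARCHITECTURE can be dropped from the import lists further down: the named fields make the index constants redundant.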
From f8d5a584aeed2d051098cd44075a4ed19cabd089 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Thu, 7 Apr 2016 19:49:13 +0000
Subject: [PATCH 20/28] Reject some excuses with unsatisfiable depends

Ideally we would reject all items with known unsatisfiable
dependencies as they would not be installable in testing.  However,
there are a few known corner cases where we still want to migrate them
(notably when they are already broken in testing).

This commit is an attempt to weed out some of the "obviously" broken
items that will not successfully migrate.

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 10 ++++++++--
 1 file changed, 8 insertions(+), 2 deletions(-)

diff --git a/britney.py b/britney.py
index dcee419..44f71b5 100755
--- a/britney.py
+++ b/britney.py
@@ -947,7 +947,8 @@ class Britney(object):
         # analyze the dependency fields (if present)
         deps = binary_u.depends
         if not deps:
-            return
+            return True
+        is_all_ok = True
 
         # for every dependency block (formed as conjunction of disjunction)
         for block, block_txt in zip(parse_depends(deps, False), deps.split(',')):
@@ -972,6 +973,8 @@ class Britney(object):
             if not packages:
                 excuse.addhtml("%s/%s unsatisfiable Depends: %s" % (pkg, arch, block_txt.strip()))
                 excuse.addreason("depends")
+                if arch not in self.options.break_arches:
+                    is_all_ok = False
                 continue
 
             # for the solving packages, update the excuse to add the dependencies
@@ -983,6 +986,7 @@ class Britney(object):
                         excuse.add_dep(p, arch)
                 else:
                     excuse.add_break_dep(p, arch)
+        return is_all_ok
 
     # Package analysis methods
     # ------------------------
@@ -1409,7 +1413,9 @@ class Britney(object):
                 # if the package is architecture-dependent or the current arch is `nobreakall'
                 # find unsatisfied dependencies for the binary package
                 if binary_u.architecture != 'all' or arch in self.options.nobreakall_arches:
-                    self.excuse_unsat_deps(pkg, src, arch, suite, excuse)
+                    is_valid = self.excuse_unsat_deps(pkg, src, arch, suite, excuse)
+                    if not is_valid and not source_t:
+                        update_candidate = False
 
             # if there are out-of-date packages, warn about them in the excuse and set update_candidate
             # to False to block the update; if the architecture where the package is out-of-date is
-- 
2.8.1

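The behavioural change in this patch is that excuse_unsat_deps now reports back whether every dependency block could be satisfied, with unsatisfiable blocks only counting against the excuse on architectures that are not allowed to break. A minimal model of that return-value logic; `find_solvers` and the sample data are stand-ins, not Britney's real API:

```python
# Simplified model of the new return value of excuse_unsat_deps: each
# dependency block needs at least one solving package, and a block with
# no solver only invalidates the excuse on a non-break architecture.
def unsat_deps_ok(blocks, arch, break_arches, find_solvers):
    is_all_ok = True
    for block in blocks:
        if not find_solvers(block, arch):
            # unsatisfiable Depends; only reject the excuse when this
            # architecture is not in the break set
            if arch not in break_arches:
                is_all_ok = False
    return is_all_ok

# Stand-in solver table: libbar has no provider anywhere.
solvers = {'libfoo': ['libfoo'], 'libbar': []}
lookup = lambda block, arch: solvers.get(block, [])

ok_amd64 = unsat_deps_ok(['libfoo', 'libbar'], 'amd64', set(), lookup)
ok_break = unsat_deps_ok(['libbar'], 'armel', {'armel'}, lookup)
```

As the commit message notes, the caller only demotes the excuse when the package is not already present in testing, which covers the "already broken in testing" corner cases.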
From 0416daa70a1c0cdb890f6c335ee37a47042d8588 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Fri, 8 Apr 2016 18:02:19 +0000
Subject: [PATCH 21/28] britney.py: Optimise scheduling a bit

Avoid rescheduling the tail of the package worklist when it is not
necessary.

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 21 +++++++++++----------
 1 file changed, 11 insertions(+), 10 deletions(-)

diff --git a/britney.py b/britney.py
index 44f71b5..8644249 100755
--- a/britney.py
+++ b/britney.py
@@ -2178,8 +2178,8 @@ class Britney(object):
         final result is successful, otherwise (None, []).
         """
         group_info = {}
-        maybe_rescheduled = packages
-        changed = True
+        rescheduled_packages = packages
+        maybe_rescheduled_packages = []
 
         for y in sorted((y for y in packages), key=attrgetter('uvname')):
             updates, rms, _ = self._compute_groups(y.package, y.suite, y.architecture, y.is_removal)
@@ -2196,11 +2196,10 @@ class Britney(object):
         nuninst_last_accepted = nuninst_orig
 
         self.output_write("recur: [] %s %d/0\n" % (",".join(x.uvname for x in selected), len(packages)))
-        while changed:
-            changed = False
-            groups = {group_info[x] for x in maybe_rescheduled}
+        while rescheduled_packages:
+            groups = {group_info[x] for x in rescheduled_packages}
             worklist = self._inst_tester.solve_groups(groups)
-            maybe_rescheduled = []
+            rescheduled_packages = []
 
             worklist.reverse()
 
@@ -2211,7 +2210,6 @@ class Britney(object):
                 accepted, nuninst_after, comp_undo, failed_arch = self.try_migration(comp, nuninst_last_accepted, lundo)
                 if accepted:
                     selected.extend(comp)
-                    changed = True
                     if lundo is not None:
                         lundo.extend(comp_undo)
                     self.output_write("accepted: %s\n" % comp_name)
@@ -2223,6 +2221,8 @@ class Britney(object):
                     else:
                         self.output_write("  most: (%d) .. %s\n" % (len(selected), " ".join(x.uvname for x in selected[-20:])))
                     nuninst_last_accepted = nuninst_after
+                    rescheduled_packages.extend(maybe_rescheduled_packages)
+                    maybe_rescheduled_packages.clear()
                 else:
                     broken = sorted(b for b in nuninst_after[failed_arch]
                                     if b not in nuninst_last_accepted[failed_arch])
@@ -2230,7 +2230,8 @@ class Britney(object):
                     if any(item for item in comp if item.architecture != 'source'):
                         compare_nuninst = nuninst_last_accepted
                     # NB: try_migration already reverted this for us, so just print the results and move on
-                    self.output_write("skipped: %s (%d, %d)\n" % (comp_name, len(maybe_rescheduled), len(worklist)))
+                    self.output_write("skipped: %s (%d, %d, %d)\n" % (comp_name, len(rescheduled_packages),
+                                                                      len(maybe_rescheduled_packages), len(worklist)))
                     self.output_write("    got: %s\n" % (self.eval_nuninst(nuninst_after, compare_nuninst)))
                     self.output_write("    * %s: %s\n" % (failed_arch, ", ".join(broken)))
 
@@ -2238,7 +2239,7 @@ class Britney(object):
                         self.output_write("    - splitting the component into single items and retrying them\n")
                         worklist.extend([item] for item in comp)
                     else:
-                        maybe_rescheduled.append(comp[0])
+                        maybe_rescheduled_packages.append(comp[0])
 
         self.output_write(" finish: [%s]\n" % ",".join( x.uvname for x in selected ))
         self.output_write("endloop: %s\n" % (self.eval_nuninst(self.nuninst_orig)))
@@ -2247,7 +2248,7 @@ class Britney(object):
                                       newly_uninst(self.nuninst_orig, nuninst_last_accepted)))
         self.output_write("\n")
 
-        return (nuninst_last_accepted, maybe_rescheduled)
+        return (nuninst_last_accepted, maybe_rescheduled_packages)
 
 
     def do_all(self, hinttype=None, init=None, actions=None):
-- 
2.8.1

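The scheduling rework above splits the old single queue into two: items that just failed are parked in maybe_rescheduled_packages and only moved back into rescheduled_packages once some other item is accepted, so the tail is not retried on every pass. A compressed sketch of that loop, with simplified names standing in for the real migration machinery:

```python
# Sketch of the two-queue scheduling idea: retry parked failures only
# after an acceptance may have unblocked them.  try_one is a stand-in
# for the real try_migration call.
def schedule(packages, try_one):
    selected = []
    rescheduled = list(packages)
    maybe_rescheduled = []
    while rescheduled:
        worklist = rescheduled
        rescheduled = []
        for item in worklist:
            if try_one(item):
                selected.append(item)
                # an accepted item may have unblocked earlier failures,
                # so promote them for another attempt
                rescheduled.extend(maybe_rescheduled)
                maybe_rescheduled.clear()
            else:
                maybe_rescheduled.append(item)
    return selected, maybe_rescheduled

# Toy scenario: 'b' can only migrate after 'a' has been accepted.
accepted = set()
def try_one(item):
    ok = item != 'b' or 'a' in accepted
    if ok:
        accepted.add(item)
    return ok

selected, failed = schedule(['b', 'a'], try_one)
```

The loop terminates because rescheduled_packages is only refilled from maybe_rescheduled_packages after an acceptance, and each acceptance permanently shrinks the remaining work.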
From d07de2b1feaf42d59f8a5133b3c985fee55d2493 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Fri, 8 Apr 2016 20:35:02 +0000
Subject: [PATCH 22/28] Optimise checking of affected packages

For (non-hint) migrations, split the set of affected packages into two
parts: one for reverse dependencies and one for "negative dependencies"
(e.g. Conflicts).

If there are only regressions in the nuninst after checking the first
set, then there is no reason to continue with the second set (as
"negative dependencies" can only make it worse at that point).

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py      | 55 +++++++++++++++++++++++++++++++++++--------------------
 britney_util.py | 43 ++++++++++++++++++++++++++++++++++++++++++-
 2 files changed, 77 insertions(+), 21 deletions(-)

diff --git a/britney.py b/britney.py
index 8644249..ac4ec97 100755
--- a/britney.py
+++ b/britney.py
@@ -1945,7 +1945,8 @@ class Britney(object):
         """
         undo = {'binaries': {}, 'sources': {}, 'virtual': {}, 'nvirtual': []}
 
-        affected = set()
+        affected_pos = set()
+        affected_remain = set()
 
         # local copies for better performance
         sources = self.sources
@@ -1993,8 +1994,8 @@ class Britney(object):
                     if pkey not in eqv_set:
                         # all the reverse dependencies are affected by
                         # the change
-                        affected.update(inst_tester.reverse_dependencies_of(rm_pkg_id))
-                        affected.update(inst_tester.negative_dependencies_of(rm_pkg_id))
+                        affected_pos.update(inst_tester.reverse_dependencies_of(rm_pkg_id))
+                        affected_remain.update(inst_tester.negative_dependencies_of(rm_pkg_id))
 
                     # remove the provided virtual packages
                     for j, prov_version, _ in pkg_data.provides:
@@ -2022,7 +2023,7 @@ class Britney(object):
             version = binaries_t_a[item.package].version
             pkg_id = (item.package, version, item.architecture)
             undo['binaries'][(item.package, item.architecture)] = pkg_id
-            affected.update(inst_tester.reverse_dependencies_of(pkg_id))
+            affected_pos.update(inst_tester.reverse_dependencies_of(pkg_id))
             del binaries_t_a[item.package]
             inst_tester.remove_testing_binary(pkg_id)
 
@@ -2037,8 +2038,8 @@ class Britney(object):
                 equivalent_replacement = key in eqv_set
 
                 # obviously, added/modified packages are affected
-                if not equivalent_replacement and updated_pkg_id not in affected:
-                    affected.add(updated_pkg_id)
+                if not equivalent_replacement:
+                    affected_pos.add(updated_pkg_id)
                 # if the binary already exists in testing, it is currently
                 # built by another source package. we therefore remove the
                 # version built by the other source package, after marking
@@ -2051,8 +2052,8 @@ class Britney(object):
                     undo['binaries'][key] = old_pkg_id
                     if not equivalent_replacement:
                         # all the reverse conflicts
-                        affected.update(inst_tester.reverse_dependencies_of(old_pkg_id))
-                        affected.update(inst_tester.negative_dependencies_of(old_pkg_id))
+                        affected_pos.update(inst_tester.reverse_dependencies_of(old_pkg_id))
+                        affected_remain.update(inst_tester.negative_dependencies_of(old_pkg_id))
                     inst_tester.remove_testing_binary(old_pkg_id)
                 elif hint_undo:
                     # the binary isn't in testing, but it may have been at
@@ -2068,7 +2069,7 @@ class Britney(object):
                     for (tundo, tpkg) in hint_undo:
                         if key in tundo['binaries']:
                             tpkg_id = tundo['binaries'][key]
-                            affected.update(inst_tester.reverse_dependencies_of(tpkg_id))
+                            affected_pos.update(inst_tester.reverse_dependencies_of(tpkg_id))
 
                 # add/update the binary package from the source suite
                 new_pkg_data = packages_s[parch][0][binary]
@@ -2085,17 +2086,18 @@ class Britney(object):
                     provides_t_a[j].add((binary, prov_version))
                 if not equivalent_replacement:
                     # all the reverse dependencies are affected by the change
-                    affected.add(updated_pkg_id)
-                    affected.update(inst_tester.negative_dependencies_of(updated_pkg_id))
+                    affected_pos.add(updated_pkg_id)
+                    affected_remain.update(inst_tester.negative_dependencies_of(updated_pkg_id))
 
             # add/update the source package
             if item.architecture == 'source':
                 sources['testing'][item.package] = sources[item.suite][item.package]
 
         # Also include the transitive rdeps of the packages found so far
-        compute_reverse_tree(inst_tester, affected)
+        compute_reverse_tree(inst_tester, affected_pos)
+        compute_reverse_tree(inst_tester, affected_remain)
         # return the package name, the suite, the list of affected packages and the undo dictionary
-        return (affected, undo)
+        return (affected_pos, affected_remain, undo)
 
     def try_migration(self, actions, nuninst_now, lundo=None, automatic_revert=True):
         is_accepted = True
@@ -2111,7 +2113,7 @@ class Britney(object):
         if len(actions) == 1:
             item = actions[0]
             # apply the changes
-            affected, undo = self.doop_source(item, hint_undo=lundo)
+            affected_pos, affected_remain, undo = self.doop_source(item, hint_undo=lundo)
             undo_list = [(undo, item)]
             if item.architecture == 'source':
                 affected_architectures = set(self.options.architectures)
@@ -2120,7 +2122,8 @@ class Britney(object):
         else:
             undo_list = []
             removals = set()
-            affected = set()
+            affected_pos = set()
+            affected_remain = set()
             for item in actions:
                 _, rms, _ = self._compute_groups(item.package, item.suite,
                                                  item.architecture,
@@ -2133,12 +2136,23 @@ class Britney(object):
                 affected_architectures = set(self.options.architectures)
 
             for item in actions:
-                item_affected, undo = self.doop_source(item,
-                                                       hint_undo=lundo,
-                                                       removals=removals)
-                affected.update(item_affected)
+                item_affected_pos, item_affected_remain, undo = self.doop_source(item,
+                                                                                 hint_undo=lundo,
+                                                                                 removals=removals)
+                affected_pos.update(item_affected_pos)
+                affected_remain.update(item_affected_remain)
                 undo_list.append((undo, item))
 
+        # Optimise the test if we may revert directly.
+        # - The automatic-revert is needed since some callers (notably via hints) may
+        #   accept the outcome of this migration and expect nuninst to be updated.
+        #   (e.g. "force-hint" or "hint")
+        if automatic_revert:
+            affected_remain -= affected_pos
+        else:
+            affected_remain |= affected_pos
+            affected_pos = set()
+
         # Copy nuninst_comp - we have to deep clone affected
         # architectures.
 
@@ -2152,7 +2166,8 @@ class Britney(object):
         for arch in affected_architectures:
             check_archall = arch in nobreakall_arches
 
-            check_installability(self._inst_tester, packages_t, arch, affected, check_archall, nuninst_after)
+            check_installability(self._inst_tester, packages_t, arch, affected_pos, affected_remain,
+                                 check_archall, nuninst_after)
 
             # if the uninstallability counter is worse than before, break the loop
             if automatic_revert and len(nuninst_after[arch]) > len(nuninst_now[arch]):
diff --git a/britney_util.py b/britney_util.py
index d58b360..8876c1c 100644
--- a/britney_util.py
+++ b/britney_util.py
@@ -538,25 +538,66 @@ def test_installability(inst_tester, pkg_name, pkg_id, broken, nuninst_arch):
     If nuninst_arch is not None, then it is also updated in the same
     way as broken is.
     """
+    c = 0
     r = inst_tester.is_installable(pkg_id)
     if not r:
         # not installable
         if pkg_name not in broken:
+            # regression
             broken.add(pkg_name)
+            c = -1
         if nuninst_arch is not None and pkg_name not in nuninst_arch:
             nuninst_arch.add(pkg_name)
     else:
         if pkg_name in broken:
+            # Improvement
             broken.remove(pkg_name)
+            c = 1
         if nuninst_arch is not None and pkg_name in nuninst_arch:
             nuninst_arch.remove(pkg_name)
+    return c
 
 
-def check_installability(inst_tester, binaries, arch, affected, check_archall, nuninst):
+def check_installability(inst_tester, binaries, arch, updates, affected, check_archall, nuninst):
     broken = nuninst[arch + "+all"]
     packages_t_a = binaries[arch][0]
+    improvement = 0
 
     # broken packages (first round)
+    for pkg_id in (x for x in updates if x[2] == arch):
+        name, version, parch = pkg_id
+        if name not in packages_t_a:
+            continue
+        pkgdata = packages_t_a[name]
+        if version != pkgdata.version:
+            # Not the version in testing right now, ignore
+            continue
+        actual_arch = pkgdata.architecture
+        nuninst_arch = None
+        # only check arch:all packages if requested
+        if check_archall or actual_arch != 'all':
+            nuninst_arch = nuninst[parch]
+        elif actual_arch == 'all':
+            nuninst[parch].discard(name)
+        result = test_installability(inst_tester, name, pkg_id, broken, nuninst_arch)
+        if improvement > 0 or not result:
+            # A single improvement could in theory fix all of its rdeps,
+            # so stop updating "improvement" once we have seen one.
+            continue
+        if result > 0:
+            # Any improvement (even in arch:all packages) could fix any
+            # number of rdeps
+            improvement = 1
+            continue
+        if check_archall or actual_arch != 'all':
+            # We cannot count arch:all breakage (except on nobreakall arches)
+            # because the nuninst check does not consider it a regression.
+            improvement += result
+
+    if improvement < 0:
+        # The first round already proves a regression; skip the second round
+        return
+
     for pkg_id in (x for x in affected if x[2] == arch):
         name, version, parch = pkg_id
         if name not in packages_t_a:
-- 
2.8.1

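The two-round strategy above (test the directly updated packages first, and only fall through to the full set of affected packages if the first round did not already prove a regression) can be sketched in isolation. This is a simplified model with hypothetical names, not Britney's real data structures:

```python
def two_round_check(is_installable, updates, affected):
    """Simplified model of the early-exit strategy in check_installability:
    test the directly updated packages first, and bail out before the
    expensive second round if they already show a net regression."""
    improvement = 0
    for pkg in updates:
        ok = is_installable(pkg)
        if improvement > 0:
            # one improvement may fix any number of rdeps; stop counting
            continue
        improvement += 1 if ok else -1
    if improvement < 0:
        # the first round already proves a regression
        return False
    # second (expensive) round: everything that might be affected
    return all(is_installable(pkg) for pkg in affected - set(updates))
```

The pay-off is that hopeless migrations are rejected after testing only a handful of packages instead of every reverse dependency.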
From f59b471e4156a11f017d0dc37bb7053c2dbbdb21 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Sat, 30 Apr 2016 19:07:33 +0000
Subject: [PATCH 23/28] britney_util: Replace broken if with an assert
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

Replace an "if" whose condition was unconditionally false with an
assertion of the negated ("correct") condition.  The bug has been
present for 2½ years with no known adverse effects.

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney_util.py | 4 +---
 1 file changed, 1 insertion(+), 3 deletions(-)

diff --git a/britney_util.py b/britney_util.py
index 8876c1c..544e5ab 100644
--- a/britney_util.py
+++ b/britney_util.py
@@ -148,9 +148,7 @@ def undo_changes(lundo, inst_tester, sources, binaries, all_binary_packages,
                 inst_tester.remove_testing_binary(binary, version, arch)
             else:
                 binaries_t_a = binaries['testing'][arch][0]
-                if p in binaries_t_a:
-                    rmpkgdata = binaries_t_a[p]
-                    inst_tester.remove_testing_binary((binary, rmpkgdata.version, arch))
+                assert binary not in binaries_t_a
                 pkgdata = all_binary_packages[undo['binaries'][p]]
                 binaries_t_a[binary] = pkgdata
                 inst_tester.add_testing_binary((binary, pkgdata.version, arch))
-- 
2.8.1

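The pattern applied in this patch is a small but useful one: when a branch turns out to be guarded by an unconditionally false condition, replace the dead code with an assertion of the invariant it implied, so a future violation fails loudly instead of silently exercising dead code. A generic sketch with hypothetical names, not the britney_util variables:

```python
def re_add_binary(binaries_t_a, binary, pkgdata):
    # The dead branch was "if binary in binaries_t_a: remove it"; it never
    # fired, so assert the invariant instead of carrying dead code.
    assert binary not in binaries_t_a, "binary unexpectedly present before re-add"
    binaries_t_a[binary] = pkgdata
```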
From 0506048e98edd725eacb0cd9f065a5732dd21a45 Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Sat, 7 May 2016 08:56:41 +0000
Subject: [PATCH 24/28] Add present-and-installable constraints support

Solves first half of GH#5.

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.conf |  4 +++
 britney.py   | 91 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
 2 files changed, 95 insertions(+)

diff --git a/britney.conf b/britney.conf
index 8137fb0..9b9dd19 100644
--- a/britney.conf
+++ b/britney.conf
@@ -13,6 +13,10 @@ EXCUSES_YAML_OUTPUT = /srv/release.debian.org/britney/var/data-b2/output/excuses
 UPGRADE_OUTPUT    = /srv/release.debian.org/britney/var/data-b2/output/output.txt
 HEIDI_OUTPUT      = /srv/release.debian.org/britney/var/data-b2/output/HeidiResult
 
+# External policy/constraints/faux-packages information that
+# (presumably) rarely changes.  Examples include "constraints".
+STATIC_INPUT_DIR = /srv/release.debian.org/britney/input
+
 # Directory for input files that Britney will update herself
 # (e.g. aging information) or will need regular updates
 # (e.g. urgency information).
diff --git a/britney.py b/britney.py
index ac4ec97..1644f04 100755
--- a/britney.py
+++ b/britney.py
@@ -317,6 +317,17 @@ class Britney(object):
             # unstable
             self.binaries['testing'][arch] = self.read_binaries(self.options.testing, "testing", arch)
 
+        try:
+            constraints_file = os.path.join(self.options.static_input_dir, 'constraints')
+        except AttributeError:
+            self.log("The static_input_dir option is not set", type='I')
+            constraints_file = None
+        if constraints_file is not None and os.path.exists(constraints_file):
+            self.log("Loading constraints from %s" % constraints_file, type='I')
+            self._load_constraints(constraints_file)
+        elif constraints_file is not None:
+            self.log("Constraints file (%s) does not exist" % constraints_file, type='I')
+
         self.log("Compiling Installability tester", type="I")
         self._build_installability_tester(self.options.architectures)
 
@@ -476,6 +487,86 @@ class Britney(object):
         if self.options.verbose or type in ("E", "W"):
             print("%s: [%s] - %s" % (type, time.asctime(), msg))
 
+    def _load_constraints(self, constraints_file):
+        """Loads configurable constraints
+
+        The constraints file can contain extra rules that Britney should
+        attempt to satisfy.  An example would be "keep package X in testing
+        and ensure it is installable".
+
+        :param constraints_file: Path to the file containing the constraints
+        """
+        tag_file = apt_pkg.TagFile(constraints_file)
+        get_field = tag_file.section.get
+        step = tag_file.step
+        no = 0
+        faux_version = sys.intern('1')
+        faux_section = sys.intern('faux')
+
+        while step():
+            no += 1
+            pkg_name = get_field('Fake-Package-Name', None)
+            if pkg_name is None:
+                raise ValueError("Missing Fake-Package-Name field in paragraph %d (file %s)" % (no, constraints_file))
+            pkg_name = sys.intern(pkg_name)
+
+            def mandatory_field(x):
+                v = get_field(x, None)
+                if v is None:
+                    raise ValueError("Missing %s field for %s (file %s)" % (x, pkg_name, constraints_file))
+                return v
+
+            constraint = mandatory_field('Constraint')
+            if constraint not in {'present-and-installable'}:
+                raise ValueError("Unsupported constraint %s for %s (file %s)" % (constraint, pkg_name, constraints_file))
+
+            self.log(" - constraint %s" % pkg_name, type='I')
+
+            pkg_list = [x.strip() for x in mandatory_field('Package-List').split("\n") if x.strip() != '']
+            src_data = [faux_version,
+                        faux_section,
+                        [],
+                        None,
+                        True,
+                        ]
+            self.sources['testing'][pkg_name] = src_data
+            self.sources['unstable'][pkg_name] = src_data
+            for arch in self.options.architectures:
+                deps = []
+                for pkg_spec in pkg_list:
+                    s = pkg_spec.split(None, 1)
+                    if len(s) == 1:
+                        deps.append(s[0])
+                    else:
+                        pkg, arch_res = s
+                        if not (arch_res.startswith('[') and arch_res.endswith(']')):
+                            raise ValueError("Invalid arch-restriction on %s - should be [arch1 arch2] (for %s file %s)"
+                                             % (pkg, pkg_name, constraints_file))
+                        arch_res = arch_res[1:-1].split()
+                        if not arch_res:
+                            msg = "Empty arch-restriction for %s (for %s file %s)"
+                            raise ValueError(msg % (pkg, pkg_name, constraints_file))
+                        for a in arch_res:
+                            if a == arch:
+                                deps.append(pkg)
+                            elif ',' in a or '!' in a:
+                                msg = "Invalid arch-restriction for %s: Uses comma or negation (for %s file %s)"
+                                raise ValueError(msg % (pkg, pkg_name, constraints_file))
+                bin_data = BinaryPackage(faux_version,
+                                         faux_section,
+                                         pkg_name,
+                                         faux_version,
+                                         arch,
+                                         'no',
+                                         ', '.join(deps),
+                                         None,
+                                         [],
+                                         False,
+                                         )
+                src_data[BINARIES].append((pkg_name, faux_section, arch))
+                self.binaries['testing'][arch][0][pkg_name] = bin_data
+                self.binaries['unstable'][arch][0][pkg_name] = bin_data
+
     def _build_installability_tester(self, archs):
         """Create the installability tester"""
 
-- 
2.8.1

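For reference, a constraints file that the `_load_constraints` parser above would accept looks roughly like this (hypothetical package names; the stanza fields are the ones the code reads: Fake-Package-Name, Constraint and Package-List, the latter with optional [arch] restrictions, parsed via apt_pkg.TagFile):

```
Fake-Package-Name: installability-constraint-build-essential
Constraint: present-and-installable
Package-List:
 build-essential
 gcc-multilib [amd64 i386]
```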
From dc81c16bab221c5bea6fb979c78799050cf2921e Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Sat, 7 May 2016 10:44:32 +0000
Subject: [PATCH 25/28] Make "Keep-installable" constraints overrule nuninst
 counters

If there is a regression in "present-and-installable" constraints (on
non-break architectures), then discard the item even if the nuninst
counters have improved.

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py      | 35 ++++++++++++++++++++++++++++-------
 britney_util.py | 23 ++++++++++++++++++-----
 2 files changed, 46 insertions(+), 12 deletions(-)

diff --git a/britney.py b/britney.py
index 1644f04..0756646 100755
--- a/britney.py
+++ b/britney.py
@@ -324,9 +324,13 @@ class Britney(object):
             constraints_file = None
         if constraints_file is not None and os.path.exists(constraints_file):
             self.log("Loading constraints from %s" % constraints_file, type='I')
-            self._load_constraints(constraints_file)
-        elif constraints_file is not None:
-            self.log("Constraints file (%s) does not exist" % constraints_file, type='I')
+            self.constraints = self._load_constraints(constraints_file)
+        else:
+            if constraints_file is not None:
+                self.log("Constraints file (%s) does not exist" % constraints_file, type='I')
+            self.constraints = {
+                'keep-installable': [],
+            }
 
         self.log("Compiling Installability tester", type="I")
         self._build_installability_tester(self.options.architectures)
@@ -502,6 +506,10 @@ class Britney(object):
         no = 0
         faux_version = sys.intern('1')
         faux_section = sys.intern('faux')
+        keep_installable = []
+        constraints = {
+            'keep-installable': keep_installable
+        }
 
         while step():
             no += 1
@@ -531,6 +539,7 @@ class Britney(object):
                         ]
             self.sources['testing'][pkg_name] = src_data
             self.sources['unstable'][pkg_name] = src_data
+            keep_installable.append(pkg_name)
             for arch in self.options.architectures:
                 deps = []
                 for pkg_spec in pkg_list:
@@ -567,6 +576,8 @@ class Britney(object):
                 self.binaries['testing'][arch][0][pkg_name] = bin_data
                 self.binaries['unstable'][arch][0][pkg_name] = bin_data
 
+        return constraints
+
     def _build_installability_tester(self, archs):
         """Create the installability tester"""
 
@@ -2252,6 +2263,7 @@ class Britney(object):
         # removed by the item would still be counted.
 
         nuninst_after = clone_nuninst(nuninst_now, packages_t, affected_architectures)
+        must_be_installable = self.constraints['keep-installable']
 
         # check the affected packages on all the architectures
         for arch in affected_architectures:
@@ -2261,10 +2273,18 @@ class Britney(object):
                                  check_archall, nuninst_after)
 
             # if the uninstallability counter is worse than before, break the loop
-            if automatic_revert and len(nuninst_after[arch]) > len(nuninst_now[arch]):
+            if automatic_revert:
+                worse = False
+                if len(nuninst_after[arch]) > len(nuninst_now[arch]):
+                    worse = True
+                else:
+                    regression = nuninst_after[arch] - nuninst_now[arch]
+                    if not regression.isdisjoint(must_be_installable):
+                        # a "keep-installable" package regressed on this arch
+                        worse = True
                 # ... except for a few special cases
-                if (item.architecture != 'source' and arch not in new_arches) or \
-                   (arch not in break_arches):
+                if worse and ((item.architecture != 'source' and arch not in new_arches) or
+                   (arch not in break_arches)):
                     is_accepted = False
                     break
 
@@ -2440,7 +2460,8 @@ class Britney(object):
                 # do not allow any regressions on these architectures.
                 # This usually only happens with hints
                 break_arches = set()
-            better = is_nuninst_asgood_generous(self.options.architectures,
+            better = is_nuninst_asgood_generous(self.constraints,
+                                                self.options.architectures,
                                                 self.nuninst_orig,
                                                 nuninst_end,
                                                 break_arches)
diff --git a/britney_util.py b/britney_util.py
index 544e5ab..4732c61 100644
--- a/britney_util.py
+++ b/britney_util.py
@@ -488,17 +488,21 @@ def old_libraries(sources, packages, fucked_arches=frozenset()):
     return removals
 
 
-def is_nuninst_asgood_generous(architectures, old, new, break_arches=frozenset()):
-    """Compares the nuninst counters to see if they improved
+def is_nuninst_asgood_generous(constraints, architectures, old, new, break_arches=frozenset()):
+    """Compares the nuninst counters and constraints to see if they improved
 
-    Given a list of architecters, the previous and the current nuninst
+    Given a list of architectures, the previous and the current nuninst
     counters, this function determines if the current nuninst counter
     is better than the previous one.  Optionally it also accepts a set
     of "break_arches", the nuninst counter for any architecture listed
     in this set are completely ignored.
 
+    If the nuninst counters are equal or better, then the constraints
+    are checked for regressions (ignoring break_arches).
+
     Returns True if the new nuninst counter is better than the
-    previous.  Returns False otherwise.
+    previous and there are no constraint regressions (ignoring break_arches).
+    Returns False otherwise.
 
     """
     diff = 0
@@ -506,7 +510,16 @@ def is_nuninst_asgood_generous(architectures, old, new, break_arches=frozenset()
         if arch in break_arches:
             continue
         diff = diff + (len(new[arch]) - len(old[arch]))
-    return diff <= 0
+    if diff > 0:
+        return False
+    must_be_installable = constraints['keep-installable']
+    for arch in architectures:
+        if arch in break_arches:
+            continue
+        regression = new[arch] - old[arch]
+        if not regression.isdisjoint(must_be_installable):
+            return False
+    return True
 
 
 def clone_nuninst(nuninst, packages_s, architectures):
-- 
2.8.1

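Stripped of the diff context, the accept rule that `is_nuninst_asgood_generous` now implements can be stated compactly. A standalone sketch over plain sets, not the actual britney_util code:

```python
def asgood_generous(constraints, architectures, old, new, break_arches=frozenset()):
    """Accept iff the total uninstallability count did not grow AND no
    "keep-installable" package newly became uninstallable, ignoring
    break architectures in both checks."""
    diff = sum(len(new[a]) - len(old[a])
               for a in architectures if a not in break_arches)
    if diff > 0:
        return False
    must_be_installable = set(constraints['keep-installable'])
    for a in architectures:
        if a in break_arches:
            continue
        # packages uninstallable now that were installable before
        if (new[a] - old[a]) & must_be_installable:
            return False
    return True
```

Note the second check fires even when the counters are equal or better, which is exactly the point of the patch: a constraint regression vetoes an otherwise acceptable result.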
From 43f63633ab648e1a7fa93ff7544a87fb4a237bfa Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Sat, 7 May 2016 12:30:59 +0000
Subject: [PATCH 26/28] britney.py: Extract a _parse_provides method

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 39 +++++++++++++++++++++------------------
 1 file changed, 21 insertions(+), 18 deletions(-)

diff --git a/britney.py b/britney.py
index 0756646..2b57465 100755
--- a/britney.py
+++ b/britney.py
@@ -722,6 +722,26 @@ class Britney(object):
 
         return sources
 
+    def _parse_provides(self, pkg_id, provides_raw):
+        parts = apt_pkg.parse_depends(provides_raw, False)
+        nprov = []
+        for or_clause in parts:
+            if len(or_clause) != 1:
+                msg = "Ignoring invalid provides in %s: Alternatives [%s]" % (str(pkg_id), str(or_clause))
+                self.log(msg, type='W')
+                continue
+            for part in or_clause:
+                provided, provided_version, op = part
+                if op != '' and op != '=':
+                    msg = "Ignoring invalid provides in %s: %s (%s %s)" % (str(pkg_id), provided, op, provided_version)
+                    self.log(msg, type='W')
+                    continue
+                provided = sys.intern(provided)
+                provided_version = sys.intern(provided_version)
+                part = (provided, provided_version, sys.intern(op))
+                nprov.append(part)
+        return nprov
+
     def _read_packages_file(self, filename, arch, srcdist, packages=None, intern=sys.intern):
         self.log("Loading binary packages from %s" % filename)
 
@@ -795,24 +815,7 @@ class Britney(object):
 
             provides_raw = get_field('Provides')
             if provides_raw:
-                parts = apt_pkg.parse_depends(provides_raw, False)
-                nprov = []
-                for or_clause in parts:
-                    if len(or_clause) != 1:
-                        msg = "Ignoring invalid provides in %s: Alternatives [%s]" % (str(pkg_id), str(or_clause))
-                        self.log(msg, type='W')
-                        continue
-                    for part in or_clause:
-                        provided, provided_version, op = part
-                        if op != '' and op != '=':
-                            msg = "Ignoring invalid provides in %s: %s (%s %s)" % (str(pkg_id), provided, op, version)
-                            self.log(msg, type='W')
-                            continue
-                        provided = intern(provided)
-                        provided_version = intern(provided_version)
-                        part = (provided, provided_version, intern(op))
-                        nprov.append(part)
-                provides = nprov
+                provides = self._parse_provides(pkg_id, provides_raw)
             else:
                 provides = []
 
-- 
2.8.1

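The extracted `_parse_provides` accepts only simple Provides entries: exactly one alternative per clause, and an operator that is either empty or `=`. The same filtering, sketched over already-parsed clauses (apt_pkg.parse_depends yields lists of `(name, version, op)` tuples; the interning is dropped here for brevity):

```python
def filter_provides(parsed_clauses, log=print):
    """Keep only simple Provides entries: exactly one alternative per
    clause and an operator of '' (unversioned) or '=' (versioned)."""
    kept = []
    for or_clause in parsed_clauses:
        if len(or_clause) != 1:
            log("Ignoring provides with alternatives: %r" % (or_clause,))
            continue
        name, version, op = or_clause[0]
        if op not in ('', '='):
            log("Ignoring provides with operator %r: %s" % (op, name))
            continue
        kept.append((name, version, op))
    return kept
```

As a side effect, the extraction also fixes the old inline code's log message, which interpolated the unrelated `version` variable instead of `provided_version`.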
From 08fec8eae245cf9d547407623764282be10d9a2a Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Sat, 7 May 2016 12:36:49 +0000
Subject: [PATCH 27/28] Add support for loading faux packages

Part of GH#5.

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py | 72 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 72 insertions(+)

diff --git a/britney.py b/britney.py
index 2b57465..fa680e0 100755
--- a/britney.py
+++ b/britney.py
@@ -319,9 +319,17 @@ class Britney(object):
 
         try:
             constraints_file = os.path.join(self.options.static_input_dir, 'constraints')
+            faux_packages = os.path.join(self.options.static_input_dir, 'faux-packages')
         except AttributeError:
             self.log("The static_input_dir option is not set", type='I')
             constraints_file = None
+            faux_packages = None
+        if faux_packages is not None and os.path.exists(faux_packages):
+            self.log("Loading faux packages from %s" % faux_packages, type='I')
+            self._load_faux_packages(faux_packages)
+        else:
+            self.log("Faux packages file (%s) does not exist" % faux_packages, type='I')
+
         if constraints_file is not None and os.path.exists(constraints_file):
             self.log("Loading constraints from %s" % constraints_file, type='I')
             self.constraints = self._load_constraints(constraints_file)
@@ -491,6 +499,70 @@ class Britney(object):
         if self.options.verbose or type in ("E", "W"):
             print("%s: [%s] - %s" % (type, time.asctime(), msg))
 
+    def _load_faux_packages(self, faux_packages_file):
+        """Loads fake packages
+
+        In rare cases, it is useful to create a "fake" package that can be used to satisfy
+        dependencies.  This is usually needed for packages that are not shipped directly
+        on this mirror but are a prerequisite for using it (e.g. some vendors provide
+        non-distributable "setup" packages and contrib/non-free packages depend on these).
+
+        :param faux_packages_file: Path to the file containing the fake package definitions
+        """
+        tag_file = apt_pkg.TagFile(faux_packages_file)
+        get_field = tag_file.section.get
+        step = tag_file.step
+        no = 0
+
+        while step():
+            no += 1
+            pkg_name = get_field('Package', None)
+            if pkg_name is None:
+                raise ValueError("Missing Package field in paragraph %d (file %s)" % (no, faux_packages_file))
+            pkg_name = sys.intern(pkg_name)
+            version = sys.intern(get_field('Version', '1.0-1'))
+            provides_raw = get_field('Provides')
+            archs_raw = get_field('Architecture', None)
+            component = get_field('Component', 'main')
+            if archs_raw:
+                archs = archs_raw.split()
+            else:
+                archs = self.options.architectures
+            faux_section = 'faux'
+            if component != 'main':
+                faux_section = "%s/faux" % component
+            src_data = [version,
+                        sys.intern(faux_section),
+                        [],
+                        None,
+                        True,
+                        ]
+
+            self.sources['testing'][pkg_name] = src_data
+            self.sources['unstable'][pkg_name] = src_data
+
+            for arch in archs:
+                pkg_id = (pkg_name, version, arch)
+                if provides_raw:
+                    provides = self._parse_provides(pkg_id, provides_raw)
+                else:
+                    provides = []
+                bin_data = BinaryPackage(version,
+                                         faux_section,
+                                         pkg_name,
+                                         version,
+                                         arch,
+                                         get_field('Multi-Arch'),
+                                         None,
+                                         None,
+                                         provides,
+                                         False,
+                                         )
+
+                src_data[BINARIES].append(pkg_id)
+                self.binaries['testing'][arch][0][pkg_name] = bin_data
+                self.binaries['unstable'][arch][0][pkg_name] = bin_data
+
     def _load_constraints(self, constraints_file):
         """Loads configurable constraints
 
-- 
2.8.1

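Again for reference, a faux-packages stanza that `_load_faux_packages` above would accept (hypothetical values; per the code, only Package is required, Version defaults to 1.0-1, Component to main, and Architecture to all configured architectures):

```
Package: vendor-bootstrap
Version: 1:1.0
Architecture: amd64 i386
Component: contrib
Provides: vendor-setup (= 1.0)
```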
From dba99f0447bc028b1e5404f662bfdfb96115115e Mon Sep 17 00:00:00 2001
From: Niels Thykier <niels@thykier.net>
Date: Sun, 8 May 2016 12:12:39 +0000
Subject: [PATCH 28/28] britney.py: Support compressed Packages/Sources

Signed-off-by: Niels Thykier <niels@thykier.net>
---
 britney.py      |  4 +++-
 britney_util.py | 22 ++++++++++++++++++++++
 2 files changed, 25 insertions(+), 1 deletion(-)

diff --git a/britney.py b/britney.py
index fa680e0..d658894 100755
--- a/britney.py
+++ b/britney.py
@@ -200,7 +200,7 @@ from excuse import Excuse
 from migrationitem import MigrationItem
 from hints import HintParser
 from britney_util import (old_libraries_format, undo_changes,
-                          compute_reverse_tree,
+                          compute_reverse_tree, possibly_compressed,
                           read_nuninst, write_nuninst, write_heidi,
                           eval_uninst, newly_uninst, make_migrationitem,
                           write_excuses, write_heidi_delta, write_controlfiles,
@@ -787,6 +787,7 @@ class Britney(object):
             sources = {}
             for component in self.options.components:
                 filename = os.path.join(basedir, component, "source", "Sources")
+                filename = possibly_compressed(filename)
                 self._read_sources_file(filename, sources)
         else:
             filename = os.path.join(basedir, "Sources")
@@ -967,6 +968,7 @@ class Britney(object):
             for component in self.options.components:
                 filename = os.path.join(basedir,
                              component, "binary-%s" % arch, "Packages")
+                filename = possibly_compressed(filename)
                 self._read_packages_file(filename, arch,
                       self.sources[distribution], packages)
         else:
diff --git a/britney_util.py b/britney_util.py
index 4732c61..1b758f2 100644
--- a/britney_util.py
+++ b/britney_util.py
@@ -27,6 +27,7 @@ from itertools import filterfalse
 import os
 import time
 import yaml
+import errno
 
 from migrationitem import MigrationItem, UnversionnedMigrationItem
 
@@ -626,3 +627,24 @@ def check_installability(inst_tester, binaries, arch, updates, affected, check_a
             nuninst[parch].discard(name)
         test_installability(inst_tester, name, pkg_id, broken, nuninst_arch)
 
+
+def possibly_compressed(path, permitted_compressions=None):
+    """Find and select a (possibly compressed) variant of a path
+
+    If the given path exists, it is returned unchanged.
+
+    :param path: The base path.
+    :param permitted_compressions: An optional list of compression extensions
+      to look for.  Defaults to ["gz", "xz"].
+    :returns: The given path, possibly with one of the permitted extensions
+      appended.  Raises a FileNotFoundError if no variant exists.
+    """
+    if os.path.exists(path):
+        return path
+    if permitted_compressions is None:
+        permitted_compressions = ['gz', 'xz']
+    for ext in permitted_compressions:
+        cpath = "%s.%s" % (path, ext)
+        if os.path.exists(cpath):
+            return cpath
+    raise FileNotFoundError(errno.ENOENT, os.strerror(errno.ENOENT), path)
-- 
2.8.1

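The helper's behaviour is easiest to see in isolation. Below is a condensed re-statement of the same lookup logic with a usage example (the real callers pass Sources/Packages paths from the mirror layout):

```python
import errno
import os
import tempfile

def possibly_compressed(path, permitted_compressions=None):
    # Prefer the uncompressed file; otherwise fall back to known extensions.
    if os.path.exists(path):
        return path
    for ext in (permitted_compressions or ['gz', 'xz']):
        cpath = "%s.%s" % (path, ext)
        if os.path.exists(cpath):
            return cpath
    raise FileNotFoundError(errno.ENOENT, os.strerror(errno.ENOENT), path)

with tempfile.TemporaryDirectory() as tmp:
    base = os.path.join(tmp, "Packages")
    open(base + ".xz", "w").close()   # only the .xz variant exists
    print(possibly_compressed(base))  # prints the path ending in Packages.xz
```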
Attachment: signature.asc
Description: OpenPGP digital signature

