
Re: Non-security updates between stable releases?



On Tue, Jul 31, 2007 at 05:02:58AM -0400, Tim Hull wrote:
> In my own case, I figure I'll probably either be running Sid or Ubuntu
> Feisty.  I gave etch+rolling my own backports a try,
> but backporting each package was a throwback to the Debian Hamm (i.e.
> pre-apt) days

Backporting is only any good for one or two packages.  For anything more,
you'll want a mixed system, with a pin like this in /etc/apt/preferences:

Package: *
Pin: release a=stable
Pin-Priority: 500

Package: *
Pin: release a=testing
Pin-Priority: 200
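
For the pins to have anything to apply to, both suites also have to be
listed in /etc/apt/sources.list; a minimal sketch, with the mirror URL
only as an example:

deb http://ftp.debian.org/debian stable main
deb http://ftp.debian.org/debian testing main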

> - I was often having to manually resolve dependencies due to
> the fact that I didn't want to pull them *all* from stable or *all* from
> unstable - I wanted to pull the minimum necessary from unstable and the rest
> from stable when building.  Also, it seemed like I'd have to backport 50-odd
> packages to get the functionality I'm looking for on my system - and I'd
> still only have Gnome 2.14...

Unfortunately, apt won't resolve the dependencies on the first try, but at
least it will tell you what you're missing, and you'll be spared any actual
backporting.  You do have to manually OK every package you want to upgrade
from stable to testing, but after that the package will track testing until
the version in stable catches up.
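
In practice that means installing the package with an explicit target
release, e.g. (with foo as a placeholder):

apt-get install -t testing foo

or, equivalently, "apt-get install foo/testing".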


And the issue that remains in apt is that it only ever considers one
version of a package: the one with the highest pin that matches the
non-downgrading rules.  This goes wrong in a case like:

stable (pinned at 500):  foo=1.0 [Depends: bar>=1.0], bar=1.0
testing (pinned at 200): foo=1.2 [Depends: bar>=1.2], bar=1.2

If the user says "apt-get install foo=1.2", apt won't be smart enough to
figure out that it needs to upgrade bar as well.  The same goes for aptitude
and every other front-end, which makes pulling packages from testing or
experimental a pain.
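
The workaround is to name the dependency yourself, with the versions from
the example above:

apt-get install foo=1.2 bar=1.2

or, pulling both from the suite instead of pinning exact versions:

apt-get install -t testing foo bar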

-- 
1KB		// Microsoft corollary to Hanlon's razor:
		//	Never attribute to stupidity what can be
		//	adequately explained by malice.


