
Re: alsa-base breaks linux-sound-base



Camaleón <noelamac@gmail.com> writes:

> On Sun, 02 Sep 2012 19:31:02 +0200, lee wrote:
>
>> Camaleón <noelamac@gmail.com> writes:
>> 
>>>> Some packages currently installed may be from unstable or experimental
>>>> because I needed more recent versions of them --- IIRC, the mumble
>>>> ones are. I would want to keep those until testing catches up.
>>>
>>> Then this can be the culprit for all your mess, unless you had
>>> configured apt repository priorities properly.
>> 
>> What's wrong with it? Aptitude installs packages from testing by default
>> and installs packages from unstable or experimental when I tell it to,
>> which is what I want.
>
> Nothing wrong "per se", but when using such mixed repositories you have to 
> carefully watch what gets installed/updated, where it comes from, and 
> whether the update routine is actually doing what you wanted. In brief, 
> you have to be very cautious or your system can badly break.

I'm paying attention to that in any case. It's pretty easy: I just look
at what aptitude would do, and if I don't like it, I don't let it do
what it wants. I don't override dependencies, and I only install software
that isn't in either Debian or Debian-multimedia when it really cannot
be avoided.
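The kind of care described above can be made partly mechanical with apt
pinning. Here is a minimal sketch of an /etc/apt/preferences file for a
testing-based system that only dips into unstable or experimental on
request --- the exact priorities are illustrative assumptions, not a
recommendation:

```
# /etc/apt/preferences -- illustrative priorities, adjust to taste.

# Testing stays the default source for everything:
Package: *
Pin: release a=testing
Pin-Priority: 900

# Unstable is visible but never chosen automatically; a priority in the
# 100-499 range means apt will not upgrade to these versions on its own:
Package: *
Pin: release a=unstable
Pin-Priority: 300

# Experimental even lower, so it only wins when asked for explicitly:
Package: *
Pin: release a=experimental
Pin-Priority: 200
```

With something like this in place, `apt-cache policy` shows which
priority each repository got, and a plain upgrade keeps following
testing.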

If package maintainers set up dependencies in such a way that stuff
installs flawlessly and then something breaks, I'll have a
problem. There probably isn't any good way to prevent that, other than
building everything from scratch, which would involve taking care of
all dependencies myself. That would probably be worse.

And which is worse: having to wait until a bugfix finally makes it into
a Debian package in stable (or testing), using a Debian package from
unstable or, if that cannot be avoided, from experimental --- or
ignoring the package management altogether and getting the source and
compiling and installing whatever you need yourself?

>>>> ,----
>>>> | lee@yun:~$ LANG=C apt-cache policy
>>>
>>> (...)
>>>
>>> What the hell is all that bunch of repositories? :-O
>>>
>>> You need an urgent reorganization of your repos, and you should also
>>> reduce the number of them, as you have too many defined.
>> 
>> Why? 
>
> To keep your system from trying to downgrade a bunch of core 
> packages? ;-)

Aptitude doesn't try that. If I removed unstable and experimental, what
other choice would there be but to downgrade packages, leaving me with
some things not working, since the packages from unstable are installed
*because* there have been bugfixes?

Maybe I don't understand the problem, so let's take the mumble server
as an example: There was a regular update to it in testing, and that
version didn't work at all. There was a bug report about it and the bug
was fixed in the next version, and that version was available in
unstable, so I installed the version from unstable and it works
fine. What's the problem, and what's a better alternative?
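For the mumble case, the usual way to pull a single package from
unstable while everything else keeps tracking testing is the `-t`
switch (a sketch; the package name is taken from the thread, and the
exact output depends on the local setup):

```shell
# See which versions are available and where the candidate comes from:
apt-cache policy mumble-server

# Install just this one package (and whatever dependencies it strictly
# needs) from unstable; all other packages keep following testing:
aptitude -t unstable install mumble-server
```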

>> If I was to remove unstable and experimental, how would I install
>> packages from these when needed? 
>
> You can't, but then you have to know how to proceed under this scenario. 
> I mean, having a complete mix of stable/testing/unstable/experimental and 
> external repositories is not for beginners: you have to understand in 
> depth how this works.

Well, I haven't installed packages from stable. I have installed
mumble-server from unstable, and I tried NVIDIA drivers from unstable
and experimental and went back to the ones in testing. That's all I did,
besides having a current git version of emacs24 (because there have been
fixes to gnus which aren't available in Debian yet) in /usr/local and a
couple libraries (which are needed by some software I'm using), all of
which the package management doesn't know about at all.

So when the package management figures that it needs to install packages
X, Y and K when I install package Z from unstable (because I need the
more recent version of Z) and I let it do that, is there something wrong
with that? Of course, any of the packages X, Y, K and Z from unstable
could be worse than the same packages in testing. There isn't anything I
could do about that, though. The alternative is getting the upstream
sources of what's in package Z and making my own version, completely
ignoring any dependencies which might be important. Is that the better
choice? And if it is, then once I do that, I need to somehow keep track
of such software and either revert to the Debian versions once a more
recent version is available, or update from upstream, or just leave it
until it quits working for some reason. That doesn't seem like a good
choice to me.

Having some packages from unstable installed along with the ones from
testing is a situation that will "fix" itself over time when more recent
versions of these packages make it into testing, replacing the ones from
unstable. In the meantime, I have a working system without having to
worry about keeping track of self-installed software and dependency
problems that might arise from it. Just don't circumvent the package
management --- I tried that many years ago and found out it's a very bad
idea, so I don't do that.
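The way the package manager arbitrates between those repositories can be
sketched in a few lines. This is not apt's real implementation, just a
toy model of the documented pin-priority rules (highest priority wins,
ties go to the higher version, and no downgrades below priority 1000),
with naive dotted-integer version comparison as a simplifying
assumption:

```python
# Toy model of apt's candidate-version selection -- NOT apt's actual
# code, just a sketch of the documented pin-priority rules.

def candidate(versions, installed=None):
    """versions: list of (version_string, pin_priority) tuples.

    Simplified rules modeled here:
    - the version with the highest pin priority wins;
    - ties are broken in favor of the higher version;
    - a version lower than the installed one is never chosen unless its
      priority is >= 1000 (i.e. no silent downgrades).
    Version strings are compared as tuples of integers, which is an
    assumption -- real Debian version comparison is more involved.
    """
    def vkey(v):
        return tuple(int(x) for x in v.split("."))

    best = None
    for ver, prio in versions:
        if installed is not None and prio < 1000 and vkey(ver) < vkey(installed):
            continue  # would be a downgrade; skipped below priority 1000
        if best is None or (prio, vkey(ver)) > (best[1], vkey(best[0])):
            best = (ver, prio)
    return best[0] if best else installed
```

Under this model, a package installed from unstable simply stays put
until testing offers something at least as new --- which is exactly the
"fixes itself over time" behavior described above.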

>> And if I removed Debian multimedia, I would miss a lot of packages.
>
> Sure, such is life :-)
>
> Or you can also "cherry pick" some packages from d-m and then turn off 
> this repository from your sources.list.

And turn it back on every time I check for updates? Or skip the updates
and only turn it back on when dependency problems come up because the
cherry-picked packages have become too old?
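A third option besides toggling the repository on and off is to leave it
enabled but pinned low, so the cherry-picked packages stay resolvable
without being upgraded behind your back. A sketch --- the origin string
is an assumption and has to match whatever `apt-cache policy` actually
reports for that mirror:

```
# /etc/apt/preferences fragment -- keep deb-multimedia enabled but quiet.
# Priority 100 keeps its packages visible for dependency resolution,
# but apt will not upgrade to them automatically:
Package: *
Pin: origin "www.deb-multimedia.org"
Pin-Priority: 100
```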

>> Perhaps I don't need the security updates because there aren't any for
>> testing, but they don't seem to hurt anything.
>
> Security updates is one of the repos I would leave. But I'm afraid I have 
> a completely different point of view from yours about how to use a system; 
> I mean, I prefer stability over new features and you seem to look for the 
> opposite. Anyway, if that's your case, I would simply install "sid" and 
> problem solved :-)

Nope, reliability is much more important to me than new features. It's
not about new features, it's about having what you need. If I were
running stable, I wouldn't install exim4 from testing or unstable just
because it has five or ten new features I don't need or a fix for two
bugs that never occur in my application. When I need the features, I
start thinking about it. When the bugs do occur in my application and I
need the fix, I'm likely to install the version from testing/unstable
to get things working. When my X session randomly freezes and I suspect
it's due to the NVIDIA drivers, I look for more recent ones and try
them because the problem might be fixed already.

Besides, look at the NVIDIA drivers in stable: They are ancient. There
have been significant improvements, and I'm lucky that I'm running
testing which has these improved drivers, because it matters to me. For
a server, it wouldn't matter, and I'd just stay with the nouveau
drivers if they work fine for that.

Reliability is actually the reason why I'm running testing instead of
stable: When you run stable, you have to make relatively big leaps when
you upgrade from one stable release to the next. You don't need that
with testing as the leaps are relatively small and don't happen all at
once and thus are much easier to handle. It's trading one risk for
another.

>>>> While (unsuccessfully) trying to use more recent NVIDIA drivers
>>>> because with the ones from testing the X-session randomly froze, I
>>>> added the i386 architecture because that was recommended. I'm not so
>>>> sure if that was a good idea ... Fortunately, the freezing problem
>>>> seems to have been fixed :)
>>>
>>> At a high cost, I'd say...
>> 
>> Well, what do you do when your X-session randomly freezes? 
>
> Report it? What is for sure is that I'm not going to sacrifice my whole 
> system stability for a VGA problem.

There wasn't anything useful to report because the only thing I could do
was pressing the reset button. The freeze might not occur for two days
and might occur twice within an hour or only once a day. It wasn't
reproducible, and since I had to press reset, there wasn't any
information I could have provided. I have no way of telling what caused
the problem; it might have been something other than the NVIDIA
drivers.

I haven't sacrificed the stability of the whole system by trying out
more recent NVIDIA drivers. With the random freezes, the stability was
gone already. The package management makes sure you have the needed
packages in the needed versions installed when you do that --- if it
doesn't, it needs to be fixed. The only problem is downgrading from the
more recent drivers since the package management didn't handle that
well, which is something to improve upon.

>> You have two choices: 
>
> (...)
>
> Or maybe more, but this thread is not about this. I would suggest that 
> you open a new one if you want to explain these problems in detail or if 
> you're still concerned about your available options.
>
>> When using the package management means high cost, what else do you
>> suggest to use?
>
> Using the package manager is not the problem here. The problem is mixing 
> "all" Debian flavours and thinking this is going to magically work with no 
> additional tweaks ;-)

It's working fine, so I'm not worrying about it. If there are packages
in testing that don't work together with packages from unstable, I
expect the package management to tell me, and so far it does so
nicely. Do you have an example of a tweak that would be required?


-- 
Debian testing amd64

