
Re: What are some common problems when using Debian GNU / LINUX?



On Mon, 21 Jan 2013 14:24:10 -0500
Dennis Clarke <dclarke@blastwave.org> wrote:

> > [snip] 
> > 
> > > For a normal usage, testing is better, even if the project claims
> > > it is not for production environment. More recent kernels and
> > > drivers which means more supported hardware, and updated web
> > > browsers are some obvious interesting points here. They are
> > > simply the most obvious.
> 
> Merely a voice from out in the user land here, but felt I could chime
> in. Personally I want a system to work and do not need the latest
> version of everything or anything. The vendor shipped OS should "just
> work" and pulling in an update should never cause the system to
> become unbootable or unstable etc. As a philosophy this works well
> because it then allows me to build my own binaries into /usr/local if
> I choose.  I think, and this is a WAG ( Wild As* Guess ), there is a
> de facto undocumented standard in the Linux world which seems to say
> that stuff in /usr/local is just local to that given system and never
> touched by a package update.  This is in violation of the much older
> SVR4 and XPG4/POSIX world ways that state you must place software
> into /opt/vendor/packagename with conf data in /etc/opt/vendor and
> var data in /var/opt/vendorname. The Linux world seems to be a wild
> west where you can drop binaries into the /usr file tree.  Sure,
> under /usr/local, but still they are referenced BY DEFAULT in the PATH
> env var of users' .bashrc and .profile etc.  This is just insane as I
> see it but such is life. Expose users to uncontrolled changes?
> Bizarre.  
> 
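Debian does in fact ship that default: the stock /etc/profile puts
/usr/local/bin on every user's PATH, while nothing under /opt is
searched. A minimal check, with the stock Debian PATH value written out
literally so the sketch is self-contained (your /etc/profile may
differ):

```shell
#!/bin/sh
# Stock Debian default PATH for ordinary users (from /etc/profile),
# hard-coded here so the check doesn't depend on the calling shell.
PATH="/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games"

on_path() {
    # Wrap PATH in colons so every entry, including first and last,
    # matches the pattern ":dir:".
    case ":$PATH:" in
        *":$1:"*) echo "$1: on PATH by default" ;;
        *)        echo "$1: not on PATH" ;;
    esac
}

on_path /usr/local/bin    # local builds are picked up automatically
on_path /opt/vendor/bin   # SVR4-style /opt trees are not
```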
I think we're talking about different types of user here, pretty much
congruent with business and private. A business user needs solid
hardware and software and doesn't mind if it's a few years old, which
is just as well, as it gets replaced when the company accountant says so.
If the software works, then it works from day one and will never need
replacing or fixing other than for security reasons. That's what Debian
Stable is for, along with servers, which have much the same kind of
requirements and constraints.

Now look at the private computer user, who will often replace hardware
because the latest game won't work on anything more than six months
old. He (usually he) will buy a new PC every year or two, and will find
he needs the latest drivers to use the hardware in it, and the latest
applications to make use of the extra features of the new hardware. He
doesn't need Debian Stable, he needs the very latest, so Unstable, or
Testing if he's a coward. Before the freeze, Testing isn't far behind
Unstable; it's just (normally) free of the most broken of the new bits.
Even then, he may want to compile a kernel, as even Unstable lags a bit
there.

There's a bit of crossover: I want a simple computer, that doesn't need
to support the latest games, that doesn't need a graphics card costing
as much as the rest of the PC, that doesn't in fact cost much at all,
but at the same time I want the latest versions of a few pieces of
software, such as gEDA PCB. This type of serious software is still
under heavy development, and each version really does have new features
which are useful and not just cosmetic. So I run Unstable. I also use a
netbook and a laptop, and it's a pain when a file created by one
version of software doesn't work with a different version, so I run
Unstable on them as well. I have a server (running Stable) where all
the data lives, so the PCs are expendable and can be reinstalled at
minimum effort if required (twice in about eight years so far). I also
do not update all at the same time, so at most an update will break
only one of them. Unless it's a fundamental system problem, such as one
I once had with Perl, reverting to the previous version for a while
will sort things out.
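Reverting is usually painless with apt, as long as the older version is
still in the local package cache or another configured suite. A sketch
(the package name and version are placeholders, not the actual Perl
versions involved, and these commands need root):

```shell
# Downgrade to a specific older version; apt understands the
# 'package=version' syntax. Names here are placeholders.
apt-get install somepackage=1.2.3-1

# Or reinstall straight from the cache of previously downloaded .debs:
dpkg -i /var/cache/apt/archives/somepackage_1.2.3-1_amd64.deb

# Hold the package so the next 'apt-get upgrade' doesn't pull the
# broken version straight back in; release it later with
# 'apt-mark unhold somepackage'.
apt-mark hold somepackage
```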

> So I generally run RHEL 6 on anything important to me, like my
> personal workstations and some servers. I run Debian stable on some
> edge servers with custom builds of Apache, mod_ssl, libcurl and PHP
> etc etc.  Lastly I do venture into the "testing" space on one
> laptop.  Never further out than "Wheezy" ever and this is only
> because I expect things to work from one boot to another. 

Indeed, horses for courses. If it's booting you're most worried about,
beware of Grub2. That's given me more trouble than all other system
software combined. Tip: don't use a separate /boot partition; the
developers sometimes forget that a few people do that. About three
times so far.
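Whether a given install actually has /boot on its own partition is easy
to check from /proc/mounts; a small sketch, no Grub involvement needed:

```shell
#!/bin/sh
# Report whether a path is its own mount point, by scanning
# /proc/mounts (the second field of each line is a mount point).
is_mount() {
    awk -v p="$1" '$2 == p { found = 1 } END { exit !found }' /proc/mounts
}

check() {
    if is_mount "$1"; then
        echo "$1 is a separate mount"
    else
        echo "$1 is part of its parent filesystem"
    fi
}

check /boot
```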
> 
> Just a comment, setting up wireless networking with WPA2 auth seems
> to be a bit of a wild west still and it took a day of fussing to get
> it to work.  Sad but true. 
> 
Want to try a RADIUS server? There's a good example: on Lenny I had to
compile it myself, as licence issues stopped Debian from including SSL
support. That was fixed by the time Squeeze appeared, and FreeRADIUS
with SSL is now a trivial installation. Configuration is another
matter... You might want to try Network Manager, which is a bit
intrusive, but seems to handle WPA2 (Personal *and* RADIUS EAP-TLS) and
OpenVPN, and wireless, and 3G dongles, and nowadays appears to work. My
RADIUS troubles were actually due to the WAP, Cisco-badged, but of
course not really underneath. Most (maybe all) router firmware is utter
rubbish...
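For anyone fighting the same WPA2-Personal battle without Network
Manager, the wpasupplicant route on Debian is two steps: hash the
passphrase, then wire the result into the interface configuration. A
sketch only; the SSID, passphrase, and interface name below are all
placeholders:

```shell
# 1. Generate the hashed PSK (wpa_passphrase ships in the
#    wpasupplicant package). SSID and passphrase are placeholders.
wpa_passphrase MyNetwork 'my secret passphrase'

# 2. The network={ ... psk=... } block it prints can go into
#    /etc/wpa_supplicant/wpa_supplicant.conf, or equivalently into
#    /etc/network/interfaces as an ifupdown stanza:
#
#      auto wlan0
#      iface wlan0 inet dhcp
#          wpa-ssid MyNetwork
#          wpa-psk  <the 64-hex psk printed by wpa_passphrase>
#
# 3. Then bring the interface up: ifdown wlan0; ifup wlan0
```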

-- 
Joe

