Re: removing of sddm (debian 9 -kde5) to start in console mode then startx to start kde5
David Wright composed on 2017-10-19 17:00 (UTC-0500):
> On Wed 18 Oct 2017 at 19:22:53 (-0400), Felix Miata wrote:
>> Finding a wanted app to run from a classified tree list of 30 or 40 or 50 or
>> more applications is easier for most people than remembering the name and any
>> required startup options to type, both for those uncommonly used, and even for
>> the commonly used ones if there are more than a scant few such. 2 or 3 or 4
>> clicks to start one up is typically easier than typing 4, 5, 6, 7 or more
>> characters, or searching command history more than a few entries back.
> Do you need a DE to do that? What's the difference between that and
> the Debian menus that I occasionally use?
Don't I? One of the few positive paradigms to come out of Redmond was the button
at the lower left corner of the screen to open a tree-structured list of
applications and utilities available to run. I've yet to see a materially better
one than that derived from it for KDE 2 or 3, which added a search box.
>> It's a
>> nice bonus in some DEs that automatically remember and reopen apps, their
>> content states, and their window sizes and positions.
> That's more debatable. Some people like that, some like me prefer
> a particular setup whenever I start X, some use Place with Mouseclick,
> etc. But there appear to be separate packages to handle this, like
> lxsession and devilspie.
I need all the help I can get to pick up where I left off when interrupted and
forced to end the session before the WIP can be completed.
>> For some, the microscopic default text size and fractional default proportion of
>> screen area (80x25, using as little as 1/16 or less of total screen space) of
>> xterm windows impedes their use for anything.
> I've seen reports of fonts getting tinier as resolutions increase,
> and not just on linux. I don't know how hard or easy it is to provide
> sensible defaults for every application on every system, and in any
> case circumstances vary. People with poor sight want large characters,
> others want more characters on the screen. In the specific case of
> xterm, I want and have both, and different fonts too, just by setting
> different commandline options and Xresources.
It's not resolutions per se, but resolution increases have a considerable
tendency to carry higher pixel density, notwithstanding the sizable average
density differences between laptop displays and desktop displays. The problem
comes from:
1-Inconsistency among developers, some of whom size in pixels and some in
points. Points are tied to the reported resolution (a point is 1/72 inch,
converted to pixels using the display's density), so in principle their rendered
size is unaffected by density. Pixels are raw counts entirely independent of
density, so the higher the density, the smaller the container into which a fixed
number of pixels fits.
2-As you imply, some want a bigger screen to provide for more stuff to fit,
while others want the same stuff to simply be bigger, and yet others want a
mixture of both.
3-Overall, computer developers are a youthful bunch, so their eyesight is better
than average, while their collective wisdom has a lot of growing yet to do. Try
to imagine anyone getting into computer development whose eyesight is materially
poorer than average. That can be a lot of pain to suffer day-in, day-out working
on screens designed for use by people with better vision. There simply isn't
much of it happening.
4-Hardware manufacturers are caught in the middle. They don't stay in business
without making a profit. They know how to make higher quality product, but
they're stifled by returns from those who after getting the product home
determine their bigger screen makes things tinier and harder to use. So they
limit the selection of high quality product, producing mostly native resolutions
in physical sizes that keep densities near the arbitrary software standard of
96 DPI. 1366x768 in this day and age is hard to imagine, given that 1024x768
dates back over three decades, and 1080p more than two. Yet this is the
resolution of both a typical 32" TV screen and a typical 15" laptop.
Large PC screens ought to have an *average* density of at least 200 DPI by now,
and have the software automatically doing the computations necessary to allow
people to enjoy better quality and overall experience.
5-Much of development incorporates overwhelming perfectionism, a need to have
results look "just right", not necessarily with any wisdom of the perspective of
those with different eyes. Over-control is typical, limiting or even eliminating
the ability to accommodate differing capabilities and/or environments. Computers
are good at calculating, but developers routinely limit that utility. One of the
most blatant such constraints can be seen in a pair of Xorg bug reports:
    "Please add option to avoid forcing of 96dpi"
an offshoot from:
    "xserver forces 96 DPI on randr-1.2-capable drivers, overriding correct
    autodetection"
Xorg, like XFree86 before it, when broken hardware or firmware doesn't
interfere, automatically calculates a display's size and resolution, yielding a
density that applications could use to display objects at accurate physical
dimensions. That calculation is rarely used, as examination of Xorg's logs
shows: the accurate display density is reported, but in most cases it is then
"corrected" to a value fixed at 96 DPI, even when actual density is several
times that. Too many different methods have been concocted for those who need
and/or want accuracy to work around that disparity, instead of simply letting
the computer do the calculating it is good at.
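The arithmetic the server already performs is simple enough to sketch: density
along one axis is just pixels divided by physical length in inches. The panel
figures below are hypothetical examples, not taken from any real EDID:

```python
# Density the X server can compute from a display's reported physical size.
# The panel figures below are hypothetical examples, not real EDID data.
MM_PER_INCH = 25.4

def dpi(pixels: int, millimetres: int) -> float:
    """Dots per inch along one axis, from resolution and physical length."""
    return pixels / (millimetres / MM_PER_INCH)

# A HiDPI laptop panel: 2560 px across 288 mm of width.
print(round(dpi(2560, 288)))   # 226 -- yet clients are told 96
# A legacy desktop panel: 1280 px across 339 mm.
print(round(dpi(1280, 339)))   # 96 -- the only case the forced default fits
```

At well over twice 96 DPI, anything sized in raw pixels shrinks proportionally,
which is exactly the disparity those workarounds try to paper over.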
It's already been something like 5 years since web developers monkeywrenched CSS
in similar manner. Standards expunged the physical lengths (cm, in, mm, pt,
etc.) from conforming web browsers, keeping the nominal length names, but
morphing them into logical units rather than physical. The only way to guarantee
a physical size is to have total control of the hardware and of the software
settings, plus a display that does in fact have the same physical density as the
one the server applies, usually 96 DPI.
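That change can be made concrete with a little arithmetic. CSS now defines 1in
as exactly 96 CSS px, and maps CSS px to device pixels through the device pixel
ratio, so the real-world length of "1in" drifts with hardware density. A
minimal sketch, using hypothetical display figures:

```python
CSS_PX_PER_INCH = 96  # CSS defines 1in = 96px; "in" is now a logical unit

def physical_inches(css_inches: float, device_pixel_ratio: float,
                    hardware_dpi: float) -> float:
    """Real-world length of a CSS measurement on a particular display."""
    device_px = css_inches * CSS_PX_PER_INCH * device_pixel_ratio
    return device_px / hardware_dpi

# On the display CSS assumes (96 DPI, ratio 1), "1in" really is one inch.
print(physical_inches(1, 1, 96))              # 1.0
# On a hypothetical 226 DPI panel at a ratio of 2, it comes up short.
print(round(physical_inches(1, 2, 226), 3))   # 0.85
```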
Relatively recently we got an additional impediment to functional workarounds,
courtesy of Red Hat management through Gnome:
    "Recent change breaks HiDPI setup based on calculated or forced DPI"
Those who find it desirable or necessary to configure the server manually,
instead of accepting random circumstance, must now supply an additional,
otherwise unnecessary, manual parameter if they want their GTK applications to
obey those configurations.
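For GTK 3 specifically, one per-launch workaround is the GDK_SCALE and
GDK_DPI_SCALE environment variables; the values and the application name below
are only examples to adjust, not a recipe:

```python
import os
import subprocess

# GTK 3 reads GDK_SCALE (integer widget scaling) and GDK_DPI_SCALE
# (fractional font scaling applied on top) from the environment; setting
# them per launch is one way to restore sane sizing when GTK no longer
# honors the server's calculated or forced DPI. Values are examples only.
env = dict(os.environ, GDK_SCALE="2", GDK_DPI_SCALE="0.5")
# subprocess.run(["gedit"], env=env)   # "gedit" is just a placeholder app
print(env["GDK_SCALE"], env["GDK_DPI_SCALE"])
```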
This is nuts, and it most seriously, and circularly, affects those already
limited in ability: if you can't see the tiny stuff while trying to accommodate
(or personalize), how do you make the adjustments required to enable the
accommodation?
"Wisdom is supreme; therefore get wisdom. Whatever else you
get, get wisdom." Proverbs 4:7 (New Living Translation)
Team OS/2 ** Reg. Linux User #211409 ** a11y rocks!
Felix Miata *** http://fm.no-ip.com/