
Fwd: Bug#237877: X's falling back to 75 dpi



 Hi everyone,

On 22/4/2005 Cristopher Martin wrote :
> Recently I took it upon myself to adjust KDE's default fonts
> I picked as a default size 10 points,

 That is the problem. You specify the font size in points,
 and then you try to "solve" the problem by setting the dpi.

 Dpi represents (or should represent) the dots per inch of a display.
 For each display it is a fixed value.
 If you change it and then use it, it represents something different.
 Using one global variable to represent two different things is bad.
 If the original meaning of dpi is completely useless,
   then dpi represents only one thing, but it has a confusing name,
   which is also bad.
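
 As a rough illustration of that fixed value (the monitor figures
   below are made up, not taken from any real device), the real dpi
   of a display is just its horizontal resolution divided by the
   physical width of the visible area:

     # Real dpi is a property of the hardware: pixels across, divided
     # by the visible width in inches.  The 17" CRT figures below are
     # hypothetical, chosen only for illustration.

     def real_dpi(horizontal_pixels, visible_width_inches):
         return horizontal_pixels / visible_width_inches

     # e.g. a 17" CRT with about 12.8" of visible width, run at 1280x1024:
     print(real_dpi(1280, 12.8))   # -> 100.0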

 A 'point', when used as a measure of font size,
    as defined by the creators of PostScript,
    is exactly one 72nd of an inch.
 It is a very useful measure for output printed on paper.
 It is not useful for specifying output on a monitor,
   because it is an absolute measure
   (it is also a relative measure, in that it specifies
    the relative size of one font compared to another
    if both are specified in points,
    but that is not relevant here),
   and while output printed on paper
     is looked at from a fairly constant distance,
   this is not true for monitors,
     where the viewing distance depends on the monitor
     and on the kind of text viewed.
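
 To make that concrete, here is a small sketch (plain Python; the
   600 dpi printer is only an example) of how a point size becomes
   a number of dots only once some dpi value is plugged in:

     # One PostScript point is exactly 1/72 inch, so a point size maps
     # to dots only via a dpi value.

     def points_to_dots(points, dpi):
         return points * dpi / 72.0

     print(points_to_dots(10, 600))   # about 83 dots on a 600 dpi printer
     print(points_to_dots(10, 75))    # about 10 dots on a 75 dpi display
     print(points_to_dots(10, 96))    # about 13 dots at an assumed 96 dpi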

 So what measure should be used for specifying font size on a display?

 As rules of thumb,
   * viewing distance is proportional to the size of the display, and
   * different kinds of text are viewed with different applications,
     and the default font can be set per application.
 Due to the second rule of thumb, we need only consider one type of text;
   if that can be handled correctly, then so can all others
   (provided someone selects appropriate default fonts for them).
 Due to the first rule of thumb, the size in meters of a displayed font
   should increase proportionally with the size of the display,
   and as the physical dots per inch of displays is fairly constant,
   a reasonable approximation can be made
   by specifying the font size in pixels,
   where 'reasonable' means: less than 10% away from the ideal size.
 The BillyBigs pages your email refers to say that
   Windows uses an 8 pt font at 96 dpi,
   MacOS   uses a 13 pt font at 72 dpi:
     8 [point] * 1/72 [inch/point] * 96 [dot/inch] = 10.67 [dot]
    13 [point] * 1/72 [inch/point] * 72 [dot/inch] = 13    [dot]
   * fractional dots can't be displayed;
       Windows's Tahoma is probably rendered as 11 pixels.
   * the difference partly reflects the different uses these fonts have,
     Windows being mainly for use in an office,
     and MacOS more targeted at high-quality visual output,
     which is similar to having different per-application fonts.
   * the remaining difference reflects user preferences,
     which may depend on the real dpi of users' monitors;
     I would expect MacOS users to have a somewhat higher real dpi.
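
 The same arithmetic, written out as a sketch, with rounding added
   since fractional dots cannot be displayed:

     def points_to_pixels(points, dpi):
         return points * dpi / 72.0

     print(points_to_pixels(8, 96))         # about 10.67 (Windows: 8 pt at 96 dpi)
     print(round(points_to_pixels(8, 96)))  # 11, what likely gets rendered
     print(points_to_pixels(13, 72))        # 13.0 (MacOS: 13 pt at 72 dpi)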

 Thus specifying font sizes in pixels always makes it possible to specify
    a good approximation of the ideal font size.
 But maybe we can do better if the real dpi is known.
 I think that if a user gets a monitor of the same size s/he had previously,
   but with a higher real dpi,
   then s/he will want to use this for two things:
   better-looking fonts, and seeing more text on the display.
 Let's assume that both desires are equally strong,
   and that we can model this by assuming that
   they will increase their viewing distance
   by a factor that is the square root of the ratio
   of the new dpi to the old dpi.
 As far as I know, the range of real dpi's is between 75 and 100
   (thus the 'dot size' of a monitor is between 0.25 and 0.33 mm)
   (but I could imagine TFT screens having smaller dots),
   thus the square root of the ratio of the extreme dot sizes is 1.15,
   and if an average is used, the ideal value deviates less than 8% from it.
 Thus, if the default font size has been specified in pixels,
   and it has been set to the value that is best for the average user,
   then correcting for the real dpi would result in at most 1 pixel of difference.
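
 A small sketch of that model (the 13-pixel default and the
   geometric-mean baseline are my own choices for illustration):

     import math

     def ideal_pixels(dpi, base_pixels=13, base_dpi=math.sqrt(75 * 100)):
         # Viewing-distance compromise: the ideal pixel size grows with
         # the square root of the dpi ratio.
         return base_pixels * math.sqrt(dpi / base_dpi)

     print(ideal_pixels(75))    # about 12.1 pixels
     print(ideal_pixels(100))   # about 14.0 pixels
     # Both are within roughly 1 pixel of the 13-pixel default.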

 In practice, increasing a font size by 1 pixel can make it look ugly,
   often because the middle bar of an 'E' is no longer in the middle.
 While implementing an optimization for the real dpi could be good
   (it would require a beauty contest,
    and a 'beauty' field in XFontStruct),
   there are currently more important optimizations to be done,
   as you currently specify the font size in points,
   so the next thing to do would be to
     change that to a specification in pixels.
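
 To connect this to the bug itself: if the 10-point default was chosen
   with 96 dpi in mind (an assumption on my part), this is what happens
   when X falls back to 75 dpi, and why a pixel default would not care:

     def points_to_pixels(points, dpi):
         return points * dpi / 72.0

     print(points_to_pixels(10, 96))  # about 13.3 pixels, the intended look
     print(points_to_pixels(10, 75))  # about 10.4 pixels, X's 75 dpi fallback
     # A default of "13 pixels" gives the same glyph height either way,
     # independent of whatever dpi the server reports.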

 Trying to change the default dpi for a purpose that is different from
   making it more accurately represent
     the most common or average dpi of our users' displays
   amounts to pushing one specific (wrong) interpretation of dpi,
   which goes against the wishes of others,
   who use a different (wrong) interpretation of dpi.
 This explains why you got complaints.

 If K applications are not capable of specifying the default font in pixels,
   please fix that,
   instead of trying to change X in a random, unrelated way.

 The above implies that it is not possible to
   compute a font size in points, to use for printing,
   from a font size in pixels, to use for displaying.
 Therefore, each application that can print
    should provide a means that lets the user specify the print size separately.
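
 In other words, an application would keep two independent settings,
   something like this sketch (the names are hypothetical, not an
   existing KDE or X interface):

     from dataclasses import dataclass

     @dataclass
     class FontPreferences:
         screen_pixels: int = 13     # used when rendering to the display
         print_points: float = 10.0  # used when rendering to a printer

     prefs = FontPreferences()
     # The two values are set by the user separately; neither is
     # derived from the other.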

 Those are my thoughts on this subject.
 Please feel free to point out anything you think is wrong with them.

 have fun !

   Siward
   (home.wanadoo.nl/siward)

 ---------------------------------------------------------------------

 Lotsa people are illiterate;
   when they read 'guidelines', they think they have read 'law',
   when they read 'law', they think they have read 'justice',
   when they read 'justice', they think they have read 'fairness',
   and when they read 'fair', they think it's a picnic.


