On April 8, 2005 13:32, you wrote:
> I also wrote him and here's an excerpt (first his opinion, then mine):
> --------------------------------snip-------------------------------
> -->3. DPI applies well to printing, but not well to the screen. If I
> project my laptop display on a screen for a presentation, the theoretical
> DPI has clearly changed, but I do not want all of my fonts to suddenly
> change with it. DPI values for computer screens are simply convention and
> not meaningful.
>
> That's a totally different technical problem! So you want the second
> display (beamer) to copy the first one exactly (pixel-based). What does
> this problem, and X's inability to achieve it, have to do with the DPI
> problem? If the driver has a proper clone mode with the laptop's display
> as the primary device (and thus taking its DPI value), that would be your
> solution.

I think he meant that as an example. In his thread on debian-x from
December, he makes the point more generally, and more convincingly, by
pointing out that setting an exact DPI is generally pointless, given (for
instance) that people sit at different distances from their monitors: a
12 point font can be enough for one person but too small for another.

Linus has tossed around the idea of "angular" size on the LKML, to more
accurately describe how big something on a screen appears to the viewer.
That is an attempt to deal with the same basic problem, namely that
matching DPI to reality is usually pointless and can even be quite
counterproductive.

All this just underlines the fact that whether something on the screen
matches its size in reality doesn't matter at all to most people (and
doesn't matter for all but a few classes of specialized applications).
For most of us, it's just "cool". If it really mattered most of the
time, Mac OS and Windows would do things differently.

Cheers,
Christopher Martin
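
For concreteness, the "angular size" idea mentioned above comes down to
simple trigonometry: the angle an object subtends shrinks with viewing
distance, so the same 12-point glyph (nominally 12/72 inch tall) looks
very different on a laptop versus a projection screen. The sketch below
is only an illustration with assumed viewing distances, not code from
any of the threads mentioned:

```python
import math

def angular_size_deg(physical_size_in, distance_in):
    """Angle (in degrees) subtended by an object of the given physical
    height, viewed from the given distance (both in inches)."""
    return math.degrees(2 * math.atan(physical_size_in / (2 * distance_in)))

# A 12-point glyph is nominally 12/72 of an inch tall.
glyph_height = 12 / 72

# Viewing distances here are illustrative assumptions, not measurements.
for label, distance in [("laptop at 20 in", 20),
                        ("desktop at 30 in", 30),
                        ("projection viewed at 200 in", 200)]:
    print(f"{label}: {angular_size_deg(glyph_height, distance):.3f} deg")
```

The point the sketch makes is the one in the mail: identical physical
size (and hence identical "true" DPI) does not give identical apparent
size, so chasing an exact DPI value buys the viewer very little.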