glynn at gclements.plus.com
Sat Jun 28 09:24:39 PDT 2008
Nicolas Mailhot wrote:
> > Sure, there are cases where scalable fonts are desirable. I just wish
> > that developers would not try to force them onto the other 95% of
> > cases. The fonts used in a file manager or a text editor don't need to
> > be some exact physical size on the screen.
> The fonts used in a file manager or a text editor do need to stay the
> same size when you move from one workstation to the next with a screen
> with slightly different pixel size.
No they don't.
In all cases, I want most text to be in the smallest font which I can
comfortably read, in order to maximise the amount of information
visible at any one time.
On every monitor I have owned so far, the size has been dictated by
the pixel size, not the physical size. If I were to switch my desktop
to a lower resolution while maintaining the same physical size, I
wouldn't be able to read anything.
I can read courier-12 (7x13) fine; courier-10 (6x10) is too coarse.
That has been true of every monitor I have owned, regardless of its
physical size or resolution. The next generation may change that. If I
switched to a higher resolution, then I would probably need to use a
larger pixel size (i.e. similar physical size) to retain legibility.
However, I wouldn't need *exactly* the same physical size. In fact,
unless I switched to much higher resolution, a slightly (physically)
smaller bitmap font would still be more legible than either resampling
a bitmap font or switching to a scalable font in order to maintain the
exact same physical size.
> > And even in the cases where they are desirable (i.e. DTP), the thing
> > that really matters is relative consistency, not absolute size.
> You can't achieve consistency by relying on pixel sizes. Hardware is not
You can achieve consistency with the rest of the document. You don't
need consistency with the physical world. I worked for a magazine
publisher for nearly three years, and I never once saw anyone hold
physical objects up to the screen for size comparison.
[Not that text rendering mattered all that much; the small screen on
the original Macs meant that everything except for titles was normally
too small to read anyway.]
> > > Anyway, a system which supports scalable fonts can trivially support
> > > pixel based fonts,
> > It could, if the developers weren't so obsessed with physical sizes
> > that they go out of their way to prevent you from using pixel sizes
> > anywhere.
> They don't go out of their way.
Well; I suppose that part is technically true. They just take the
easiest route; the end result is the same.
> > > whereas the reverse just isn't true. So far, i haven't
> > > seen where any of your ideas can be used to improve scalable font
> > > rendering.
> > I'm not interested in scalable font rendering. I'm a computer
> > programmer, not a graphic artist, and
> This is pretty evident. Also it's pretty evident the "computer
> programmer class that can't imagine anything but pixels" is a user
> minority. It's over-represented when making decisions on software
> features though (which explains a lot)
It's not just programmers; it's anyone for whom the use of text
outweighs the use of graphics (and DTP is primarily "graphics"; at the
magazine, articles were written and edited on 80x32 text screens, with
the Macs used solely for layout).
> > my main use for a computer is to view and edit text.
> Mine too. That does not change the fact I disagree with you.
> > However, on the "improving" front, I have at least tried to remind
> > people that "resolution independence" involves more than just fonts.
> As you yourself noted, text is the main information form on a screen and
> just drives everything else.
You are conflating form and function. Writing is about representing
information "graphically" (as opposed to e.g. verbally). The question
is, which is more important: the information itself, or its
presentation?
If you're interested in the information itself, the font only matters
insofar as the text remains legible. It shouldn't be too small to see,
nor should it be rasterised too crudely, nor should it be blurred.
But choosing to rasterise a scalable font at exactly 11.7645321 pixels
because that's the result of dividing the user preference by the
physical resolution when there is a 12-pixel hand tuned bitmap
available is absolutely the wrong approach.
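The alternative is simple to sketch: convert the requested size to pixels, and prefer an exact hand-tuned bitmap whenever one lies within a small tolerance of the ideal size. This is only an illustration; the function name, the available sizes, and the one-pixel tolerance are hypothetical, not any particular toolkit's behaviour.

```python
def pick_pixel_size(point_size, dpi, bitmap_sizes, tolerance=1.0):
    """Convert a point size to pixels, snapping to a hand-tuned
    bitmap size when one is close enough.

    point_size   -- requested size in points (1/72 inch)
    dpi          -- monitor resolution in dots per inch
    bitmap_sizes -- pixel sizes for which hand-tuned bitmaps exist
    tolerance    -- maximum deviation (in pixels) allowed when snapping
    """
    ideal = point_size * dpi / 72.0            # e.g. 11.76... pixels
    # Prefer the nearest available bitmap if it is close enough.
    nearest = min(bitmap_sizes, key=lambda s: abs(s - ideal))
    if abs(nearest - ideal) <= tolerance:
        return nearest                         # crisp bitmap, e.g. 12
    return ideal                               # fall back to scaling

# A request that works out to ~11.77 pixels snaps to the 12-pixel
# bitmap instead of rasterising a scalable font at a fractional size.
```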
> > The second biggest reason for hard-coded DPI assumptions is that
> > people keep coming up with systems where physical sizes are the only
> > things that matter, essentially pretending that you have infinite
> > resolution. At which point, everything has to be substantially
> > over-sized, with even the thinnest lines being several pixels thick,
> > otherwise everything becomes a faint blur.
> The second biggest reason for hard-coded DPI assumptions is that some
> programmers keep insisting it can't work, intentionally botch
> implementations and then revert to dpi hacks (with self-satisfied I told
> you so).
Using physical dimensions alone cannot work until every monitor has
pixels so small that they can be ignored (i.e. the situation we have
with laser printers, where physical dimensions work just fine).
> > So you either oversize everything, or you end up having to hack
> > in a fixed nominal DPI so that the resulting pixel sizes end up as
> > integers.
> So you don't have any pixel assumptions in the UI descriptions, and let
> software libs compute dimensions, rounding them so they fall on the
> pixel grid (which is BTW how text rendering already works, and it's more
> complex than UI dimensionning).
Libraries don't have enough context. If you're drawing multiple
entities, it may not matter how you round the coordinates, but it may
matter that you do so consistently (i.e. don't introduce gaps by
rounding in opposite directions). But a library which only sees
individual elements rather than the overall picture may be unable to
do that.
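The missing context can be shown with a toy layout routine (the names are hypothetical). Rounding each element's scaled width independently lets adjacent elements disagree about their shared edge; rounding the accumulated positions instead guarantees each element ends exactly where the next begins:

```python
def layout_edges(widths, scale):
    """Place boxes side by side, rounding shared edges consistently.

    Rounding each scaled width on its own can leave one-pixel gaps
    or overlaps between neighbours; rounding the running position
    keeps every edge on the pixel grid with no gaps.
    """
    edges, pos = [0], 0.0
    for w in widths:
        pos += w * scale
        edges.append(round(pos))   # round positions, not widths
    return edges

# Independent rounding: round(1.4) + round(1.4) = 1 + 1 = 2 pixels,
# but the true total 2.8 rounds to 3 -- a one-pixel gap appears.
```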
> > No, I have had plenty of disagreements with people for all kinds of
> > reasons, and I don't normally imply quasi-religious motivations. I
> > describe that position as quasi-religious because I think that it's,
> > well, quasi-religious. And that's the only way that I can see this
> > obsession with physical sizes.
> It's not an obsession but a requirement. A requirement you go out of
> your way to frustrate. Don't complain if that puts you in contact with
> frustrated users.
Sometimes it's one requirement out of many, sometimes it isn't even a
requirement. But balancing conflicting requirements is hard, and
adopting absolutist positions makes everything much simpler.
Just to clarify, for anyone who may have been confused by your
mischaracterisations: my problem isn't with supporting physical sizes,
but with not supporting anything else, and failing to support legacy
applications from the time when 75 dpi was a perfectly safe assumption
(or programs or data designed for Windows, where 96 dpi is a perfectly
safe assumption).
> > We managed to survive with a hard-coded 75 dpi for years. Windows was
> > quite successful with a hard-coded choice of either 96 or 120 dpi (in
> > spite of a few programs overlooking the 120 dpi possibility).
> And windows has moved past it. As usual years late because the hardware
> forced its hand.
Yeah, but Microsoft appears able to do a half-decent job of it. They
*didn't* just abandon pixels entirely and operate solely in physical
units. AFAICT, a lot of effort went into balancing the two, and
ensuring that they didn't break a lot of existing applications.
> In other news VGA was good enough for many years. Dump your fancy
> screens and get a VGA-only one
Why? The higher resolution displays let me get more text on screen (so
long as the OS doesn't do anything stupid, like assume that I use such
a huge font at 640x400 because I like big text, when in fact it's
because anything smaller would have too few pixels to be legible).
> > But now, if the monitor says it's 121x124 dpi, apparently it's my
> > responsibility to somehow choose a 20.167 x 20.667 point font to avoid
> > it looking like crap.
> Nope, it's your responsibility to bug the authors of the apps you use so
> they honor a desktop-wide preferred font setting, and so this setting can
> optionally be expressed in pixels as you like it. (but don't even try to
> argue the pt unit should be removed unless you're ready to have people
> massively angry at you)
I'm not suggesting that the point unit should be removed. I am
suggesting that it shouldn't be the only choice, and also that it
should normally be treated as a rough guideline, rather than assuming
that the user is going to be calibrating measuring devices against the
screen.
If the user specifies 12 point, and the monitor's DPI means that
equates to 11.9 pixels, should you use:
a) a 12-pixel hand-tuned bitmap,
b) a blurred blob, arising from rasterising a scalable font with AA, or
c) a mess of jaggies, arising from rasterising a scalable font without AA
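For the record, the arithmetic behind a figure like 11.9 pixels is just the point-to-pixel conversion; the DPI value below is illustrative, chosen so a 12-point request lands on a fractional pixel size:

```python
# px = points * dpi / 72 (one point is 1/72 inch).  A 12-point
# request on a ~71.4 dpi screen yields a fractional pixel size,
# which is exactly the dilemma posed above.
def points_to_pixels(points, dpi):
    return points * dpi / 72.0

print(round(points_to_pixels(12, 71.4), 1))   # 11.9
print(points_to_pixels(12, 96))               # 16.0 (the Windows default)
```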
Glynn Clements <glynn at gclements.plus.com>