X Logical Font Description and HiDPI

Andrey ``Bass'' Shcheglov andrewbass at gmail.com
Wed Feb 3 11:07:27 UTC 2021


Hello,

*The problem*: the X server serves fonts at a fixed resolution of 100 dpi, rather than at the current window system resolution (`xdpyinfo | grep -F resolution`).

*A bit of theory*. There are legacy server-side fonts, which are sent to X clients over the network (via TCP or a UNIX socket) either by the X server itself or by one or more separate X Font Servers. Unlike the usual client-side fonts (Xft, GTK 2+, Qt 2+), the "server" backend (also called the core X font backend) does not support anti-aliasing, but it is network-transparent (that is, bitmaps, without any alpha channel, are sent over the network).

At the application level, server-side fonts are specified not as an Xft pattern (most often the familiar "DejaVu Sans Mono:size=12:antialias=true", see <https://keithp.com/~keithp/talks/xtc2001/paper/>), but as an XLFD <https://wiki.archlinux.org/index.php/X_Logical_Font_Description>. If we are talking about a local machine, the same font file can be registered in both subsystems at once and be available both to modern GTK- and Qt-based applications and to legacy ones (Xt, Athena, Motif, GTK 1.2, Qt 1.x).
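For example, the same application can often be pointed at either backend. The xterm invocations below are only an illustration: they assume xterm was built with FreeType support and that the DejaVu face is registered with both backends (the exact XLFD, foundry included, depends on what mkfontscale generated on a given system):

    # Client-side (Xft/fontconfig): anti-aliased, rendered by the client
    xterm -fa 'DejaVu Sans Mono:size=12:antialias=true'

    # Server-side (core X11): the same face requested via an XLFD,
    # rasterized by the X server and sent to the client as a bitmap
    xterm -fn '-misc-dejavu sans mono-medium-r-normal--*-120-*-*-m-*-iso10646-1'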

Historically, server fonts were bitmap, or raster, fonts (*.pcf), and a raster has a resolution of its own (not necessarily the same as the window system resolution). That is why XLFD has fields such as RESOLUTION_X and RESOLUTION_Y. For a raster font not to look ugly when rendered onto the screen while still having the requested glyph size, the raster resolution must be close to the screen resolution; raster fonts were therefore usually shipped at native resolutions of 75 dpi and 100 dpi (hence directories such as /usr/share/fonts/X11/75dpi and /usr/share/fonts/X11/100dpi). So the lines below represent the same 12 pt font

> -bitstream-charter-bold-r-normal--12-120-75-75-p-75-iso8859-1
> -bitstream-charter-bold-r-normal--17-120-100-100-p-107-iso8859-1

with a rasterized glyph size of

 * 12 px at 75 dpi, and
 * 17 px at 100 dpi, respectively.
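The arithmetic behind those two numbers is roughly pixel size = point size * RESOLUTION_Y / 72, give or take rounding. A quick sanity check, assuming `bc` is at hand:

    echo '12 * 75  / 72' | bc -l    # ~12.5 -> shipped as the 12 px bitmap
    echo '12 * 100 / 72' | bc -l    # ~16.7 -> shipped as the 17 px bitmap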

But, in addition to bitmap fonts, there are vector, or outline fonts (TrueType, OpenType, Adobe Type 1), which can be scaled by any factor and still look good when rendered onto the screen. Some X-server implementations (notably, XSun) also supported the Adobe Type 3 format, where glyphs were described using the Turing-complete PostScript language.

Of course, the concept of raster resolution does not apply to vector fonts, so I can request zeroes (`0`) or even asterisks (`*`) in the RESOLUTION_X and RESOLUTION_Y fields, and, in theory, my X server should give me exactly the font requested. This is directly stated in the _Arch Wiki_ article at the link above:

> Scalable fonts were designed to be resized. A scalable font name, as shown in the example below, has zeroes in the pixel and point size fields, the two resolution fields, and the average width field.
> 
> ...
> 
> To specify a scalable font at a particular size you only need to provide a value for the POINT_SIZE field, the other size related values can remain at zero. The POINT_SIZE value is in tenths of a point, so the entered value must be the desired point size multiplied by ten.

So, either of the following two queries should return a 12 pt `Courier New` font at the window system resolution:

> -monotype-courier new-medium-r-normal--*-120-*-*-m-*-iso10646-1
> -monotype-courier new-medium-r-normal--0-120-0-0-m-0-iso10646-1
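As a quick sanity check (only a sketch, and it assumes the face is registered with the core font system at all), `xlsfonts` should list the corresponding all-zeroes scalable entry:

    xlsfonts -fn '-monotype-courier new-medium-r-normal--0-0-0-0-m-0-iso10646-1'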

*Or so I thought*. The thing is, having migrated from 96…115 dpi monitors to a 162 dpi 4K monitor, I noticed that my carefully chosen vector fonts suddenly became too small.

And it turned out that unless you explicitly set the RESOLUTION_X and RESOLUTION_Y fields to 162 (and no one in their right mind would do so -- it would mean rewriting dozens of Xresources lines every time they change monitors), the X server defaults to rendering the font at 100 dpi instead of 162. The difference between 17 and 27 pixels (a factor of 1.62 = 162 / 100) is quite noticeable. Here's an example from a modern Debian 10 box: <https://habrastorage.org/webt/uq/m5/ej/uqm5ej9ys9ynb3vwanrqioayxns.png>.
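The difference is easy to reproduce side by side with `xfd` (again just a sketch; substitute whatever face `xlsfonts` actually reports on your machine):

    # Resolution fields left open: the server falls back to 100 dpi
    xfd -fn '-monotype-courier new-medium-r-normal--*-120-*-*-m-*-iso10646-1' &

    # The same 12 pt request with the real 162 dpi spelled out: ~27 px glyphs
    xfd -fn '-monotype-courier new-medium-r-normal--*-120-162-162-m-*-iso10646-1' &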

I thought this regression was a consequence of people gradually cutting obsolete subsystems out of X11, but on Debian Woody, released in 2002 with a 2.2 kernel, I saw exactly the same thing: <https://habrastorage.org/webt/j9/24/0r/j9240rroo5q9wasvpn8s9zhpjke.png>. The only difference is that Debian Woody renders the fonts more cleanly, apparently applying hinting on the server side before sending the bitmaps over the network.

So this is not a regression. The problem has always been there and equally affects all vector font types (TrueType, OpenType, Type 1).

*Now, the question*. Is there a way, without hard-coding the window system resolution into user settings for each individual resource, to get by with less effort than what the author of the "Sharing Xresources between systems" article recommends <https://jnrowe.github.io/articles/tips/Sharing_Xresources_between_systems.html>?
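The only half-measure I can think of (sketched below; `SCREEN_DPI` is just a made-up preprocessor symbol, and the awk expression assumes a single screen) is to compute the resolution once per session and let xrdb's C preprocessor substitute it into the XLFDs:

    # e.g. in ~/.xsessionrc, before the resources are loaded:
    dpi=$(xdpyinfo | awk -F'[ x]+' '/resolution:/ { print $3; exit }')
    xrdb -DSCREEN_DPI="${dpi}" -merge ~/.Xresources

with the resources themselves referring to the symbol instead of a number:

    XTerm*font: -monotype-courier new-medium-r-normal--*-120-SCREEN_DPI-SCREEN_DPI-m-*-iso10646-1

(If I remember correctly, xrdb also predefines X_RESOLUTION and Y_RESOLUTION, but those are in pixels per metre, which cpp cannot conveniently turn into dpi.) Still, this is scripting around the problem rather than solving it.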

Is it possible to solve the problem by changing the global configuration of the X server itself or the libraries it relies on (libfreetype, libxfont)?

Regards,
Andrey.
