xserver forces 96 DPI on randr-1.2-capable drivers, overriding correct autodetection
hramrach at centrum.cz
Sat Sep 24 02:12:45 PDT 2011
On 23 September 2011 20:59, James Cloos <cloos at jhcloos.com> wrote:
>>>>>> "DS" == Daniel Stone <daniel at fooishbar.org> writes:
> DS> (Real summary: That's your opinion. The development team have ours.
> Well, *some* of the development team. Some of us agree that forcing the
> dpi to 96 is a b0rked regression. It is just that those with the contrary
> opinion are more vocal and have more traction.
So to have this fixed I need to gain more traction than those who want
the X server broken :p
> It is easy enough to avoid, though, by always starting X with the -dpi
> option. One has to add -retro, too, for a reasonable startup experience,
> so adding -dpi is not horribly onerous.
I don't care about -retro. The change should have been better documented
so that people who try to test something and start the X server by hand
aren't surprised when it does nothing. Indeed, it would be easy for any
display manager's scripts to add some -noretro option and make -retro
the default for the benefit of X testing. The server could even render
text telling you to use -noretro in the middle of the root window, since
the X server insists on having a 'fixed' font to start.
However, breaking the dpi detection is not fixed by any option, and
the patch which adds an option to unbreak it was posted in the bug
about 1.5 years ago and was completely ignored by the X developers
commenting on the bug. They only conjured bogus reasons why the new
behaviour is "correct", all of which were refuted.
Clearly, if they wanted the X server broken this way they could have
used the -dpi option themselves and would not have needed to break the
X server for everybody. The -dpi option only allows what they want:
setting the X server to one fixed dpi. It cannot be used to restore the
autodetection, where the X server uses the DPI of the connected screen
and updates it whenever a different screen is connected.
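For illustration, the autodetection being discussed boils down to deriving DPI from the physical size the monitor reports (e.g. via EDID): dpi = pixels * 25.4 / size_mm. This is a hedged sketch with made-up example numbers, not code from the X server:

```shell
# Example: a 1920-pixel-wide panel reporting a 344 mm physical width.
# dpi = pixels * 25.4 / size_mm, rounded to the nearest integer.
width_px=1920
width_mm=344   # physical width as a monitor might report it in EDID
dpi=$(awk -v px="$width_px" -v mm="$width_mm" \
      'BEGIN { printf "%d", px * 25.4 / mm + 0.5 }')
echo "$dpi"    # noticeably different from the forced 96
```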
More information about the xorg-devel mailing list