How do we want to deal with 4k tiled displays?
Alexander E. Patrakov
patrakov at gmail.com
Wed Jan 22 23:18:32 PST 2014
Keith Packard wrote:
> "Alexander E. Patrakov" <patrakov at gmail.com> writes:
> > What's wrong with my proposal to report "mirrored screens" to clients
> > even though the outputs are not really mirrors? In this case, each
> > mirror can get one EDID, implementing option (1).
>
> We'd have to report both at the full resolution so that applications
> would maximize correctly; I don't see how this really helps here, other
> than potentially confusing applications. It leaves the EDID problem
> unsolved (you don't have an EDID which reflects the unified display),
> and it may well confuse applications into thinking that they can
> configure the two "monitors" separately.
As for "you don't have an EDID which reflects the unified display": that is
absolutely correct. But in my opinion such a unified EDID doesn't really exist
(and thus there is no problem to solve), so I'd like to see some arguments
showing that it is actually needed.
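To make option (1) concrete: clients already fetch per-output EDIDs through
the standard RANDR "EDID" output property, so under my proposal each fake
mirror would simply answer with the EDID of its own tile. A minimal sketch of
the client side, using libXrandr (error handling abbreviated):

    /* Sketch: list the EDID blobs of all RandR outputs.
       Build with: cc -o list-edid list-edid.c -lX11 -lXrandr */
    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy)
            return 1;
        XRRScreenResources *res =
            XRRGetScreenResources(dpy, DefaultRootWindow(dpy));
        Atom edid = XInternAtom(dpy, "EDID", True);
        if (!res || edid == None)
            return 1;
        for (int i = 0; i < res->noutput; i++) {
            Atom type;
            int format;
            unsigned long nitems, after;
            unsigned char *data = NULL;
            if (XRRGetOutputProperty(dpy, res->outputs[i], edid,
                                     0, 128, False, False,
                                     AnyPropertyType, &type, &format,
                                     &nitems, &after, &data) == Success
                && data) {
                printf("output 0x%lx: %lu-byte EDID\n",
                       (unsigned long) res->outputs[i], nitems);
                XFree(data);
            }
        }
        XRRFreeScreenResources(res);
        XCloseDisplay(dpy);
        return 0;
    }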
Now, about the objection concerning independent configuration of the
"mirrors". First of all, it looks valid and is testable. But, as a user, I
would expect (and tolerate) the existing screen configuration tools breaking
in such a tiled 4K setup - after all, this setup cannot be fully expressed in
the language they understand. I would also welcome attempts to limit such
breakage to the necessary minimum.
We could probably deal with that by mirroring, inside the X server, the
configuration changes done on one mirror over to the other, just as if another
client had made the matching change. After all, this (another tool
reconfiguring outputs) can happen anyway with today's hardware and software,
and configuration tools are already supposed to be able to deal with it. I
don't have two monitors here at work, but at least KDE's kcm_randr correctly
updates itself if I change the resolution using xrandr behind its back.
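For illustration, here is (roughly) the mechanism that such tools rely on: a
client selects RRScreenChangeNotify on the root window and refreshes its view
whenever the configuration changes behind its back. A minimal sketch with
libXrandr:

    /* Sketch: notice configuration changes made by other clients,
       e.g. xrandr run behind our back.
       Build with: cc -o rr-watch rr-watch.c -lX11 -lXrandr */
    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        int ev_base, err_base;
        if (!dpy || !XRRQueryExtension(dpy, &ev_base, &err_base))
            return 1;
        XRRSelectInput(dpy, DefaultRootWindow(dpy),
                       RRScreenChangeNotifyMask);
        for (;;) {
            XEvent ev;
            XNextEvent(dpy, &ev);
            if (ev.type == ev_base + RRScreenChangeNotify) {
                /* Refresh Xlib's cached screen geometry. */
                XRRUpdateConfiguration(&ev);
                XRRScreenChangeNotifyEvent *sce =
                    (XRRScreenChangeNotifyEvent *) &ev;
                printf("screen is now %dx%d\n",
                       sce->width, sce->height);
                /* A real tool would re-query the outputs here. */
            }
        }
    }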
Another class of clients that attempt to reconfigure screens is fullscreen
games, and here breakage is indeed not allowed at all. But we can try running
them with two mirrored 1920x1080 monitors right now, attempt to select a lower
resolution in the game (or just run a game that insists on a lower
resolution), and see what breaks and what cannot be fixed manually by running
xrandr to configure the second monitor identically to the first one (see the
sketch below). I will test this later today and report.
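For concreteness, the manual fix I have in mind is a single xrandr invocation
that forces the second output back into an exact mirror of the first; the
output names and the mode below are of course placeholders:

    xrandr --output HDMI-2 --mode 1280x720 --same-as HDMI-1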
Given the above, and the following words from Aaron Plattner (which,
unfortunately, are not testable, unlike the above), I think that further
discussion is needed.
Aaron Plattner wrote:
> If we present them as a single output to applications, they'll make
> the assumption that they can just assign a single crtc to that output and
> use the remaining crtcs for something else. I suspect that deleting crtcs
> or otherwise marking them as used as a side effect of setting a mode on a
> different crtc is going to explode a lot of existing applications.
OTOH, as you wrote earlier, "RandR is always allowed to say 'no' to any
particular configuration". So do I understand correctly that any client that
breaks is already broken, e.g., with Ivy Bridge? If so, then indeed, the
fake mirror that I proposed is somewhat pointless.
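To spell out what saying "no" looks like on the client side: XRRSetCrtcConfig
returns a status, and a client that does not check it is arguably broken
already today. A sketch, assuming crtc, mode and output have been chosen by
the caller earlier:

    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>

    /* Sketch: program one CRTC and cope with a refusal. The crtc,
       mode and output arguments are assumed to have been picked by
       the caller. */
    static int try_set_mode(Display *dpy, XRRScreenResources *res,
                            RRCrtc crtc, RRMode mode, RROutput output)
    {
        Status s = XRRSetCrtcConfig(dpy, res, crtc, CurrentTime,
                                    0, 0, mode, RR_Rotate_0,
                                    &output, 1);
        if (s != RRSetConfigSuccess) {
            /* The server said "no": re-query the resources and pick
               another configuration instead of assuming success. */
            fprintf(stderr, "mode set refused: %d\n", (int) s);
            return -1;
        }
        return 0;
    }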
Further in your mail, you wrote:
> Oh, I can imagine advertising the dual-wire setup as a separate output?
> Would that be helpful at all?
> ...
> We used to just have N outputs to 1 CRTC; it looks like we've got
> N outputs to M CRTCs...
That looks more like a question addressed to Aaron Plattner.
--
Alexander E. Patrakov