How do we want to deal with 4k tiled displays?
Aaron Plattner
aplattner at nvidia.com
Thu Jan 16 11:11:54 PST 2014
So, monitor manufacturers are starting to make high-resolution displays that
consist of one LCD panel that appears to the PC as two. The one I've got is a
Dell UP2414Q. It shows up to the PC as two DisplayPort 1.2 multistream devices
that have the same GUID but different EDIDs. There's an extension block in the
EDID that's supposed to indicate which side is the left tile and which is the
right, though I haven't tried to decode it yet.
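For reference, pulling the raw EDID blob out of RandR is the easy part; decoding
the tile descriptor in the extension block is what I haven't gotten to yet. An
untested sketch, using the standard "EDID" output property:

/* Sketch: fetch the raw EDID for every connected output.  The tile
 * descriptor, if present, lives in the extension blocks starting at
 * byte 128; decoding it is not shown here. */
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) return 1;

    XRRScreenResources *res = XRRGetScreenResources(dpy, DefaultRootWindow(dpy));
    Atom edid_atom = XInternAtom(dpy, "EDID", False);

    for (int i = 0; i < res->noutput; i++) {
        XRROutputInfo *info = XRRGetOutputInfo(dpy, res, res->outputs[i]);
        if (info->connection == RR_Connected) {
            Atom type;
            int format;
            unsigned long nitems, after;
            unsigned char *edid = NULL;

            if (XRRGetOutputProperty(dpy, res->outputs[i], edid_atom,
                                     0, 256, False, False, AnyPropertyType,
                                     &type, &format, &nitems, &after,
                                     &edid) == Success && nitems >= 128) {
                printf("%s: %lu bytes of EDID\n", info->name, nitems);
                /* Extension blocks begin at edid[128]; the left/right tile
                 * information would be parsed out of them here. */
            }
            if (edid) XFree(edid);
        }
        XRRFreeOutputInfo(info);
    }

    XRRFreeScreenResources(res);
    XCloseDisplay(dpy);
    return 0;
}

(compiles with cc -o edid edid.c -lX11 -lXrandr)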
The problem, obviously, is that applications (including some games) treat the
two tiles as if they were completely separate monitors. Windows maximize to
only half of the screen. My question is, how do we want to deal with these
monitors?
As I see it, we have four options:
1. Hide the presence of the second tile in the X server.
Somehow combine the two tiles into a single logical output at the RandR
protocol level. The X server would be responsible for setting up the right
configuration to drive the logical output using the correct physical
resources.
2. Hide the presence of the second tile in libXrandr.
This would allow interested applications to query the real state of the
hardware while also making it easier to do modesets on a per-monitor level
rather than per-output.
This could be exposed either as a new "simple" modeset API in libXrandr or
similar, or by modifying the existing interface and adding a new one to punch
through the façade and get at the real configuration, for clients that care.
3. Update every application that uses RandR 1.2.
Applications can detect the presence of these monitors and deal with them
themselves (a sketch of this approach follows the list below), but this might
have poor adoption because programmers are a lazy bunch in general.
4. Do nothing and hope the problem goes away.
Hopefully, the situation with current 4k monitors is temporary and we'll
start seeing single-tile 4k displays soon, fixing the problem "forever".
Until we get 8k tiled displays.
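To make option 3 concrete, a client-side workaround might look something like
the sketch below. It guesses at tiling purely from CRTC geometry (equal-height,
horizontally adjacent rectangles get merged into one logical monitor); a real
client would want to decode the EDID tile block instead of guessing:

/* Sketch of the client-side approach: collect the geometry of every
 * active CRTC and merge side-by-side, equal-height rectangles into one
 * logical monitor.  Geometry alone is only a heuristic. */
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

typedef struct { int x, y; unsigned int w, h; } Rect;

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) return 1;

    XRRScreenResources *res = XRRGetScreenResources(dpy, DefaultRootWindow(dpy));
    Rect rects[16];
    int n = 0;

    /* Collect the geometry of every active CRTC. */
    for (int i = 0; i < res->ncrtc && n < 16; i++) {
        XRRCrtcInfo *crtc = XRRGetCrtcInfo(dpy, res, res->crtcs[i]);
        if (crtc->mode != None) {
            rects[n].x = crtc->x;
            rects[n].y = crtc->y;
            rects[n].w = crtc->width;
            rects[n].h = crtc->height;
            n++;
        }
        XRRFreeCrtcInfo(crtc);
    }

    /* Merge horizontally adjacent rectangles of equal height. */
    for (int i = 0; i < n; i++) {
        for (int j = i + 1; j < n; j++) {
            if (rects[i].y == rects[j].y && rects[i].h == rects[j].h &&
                (rects[i].x + (int)rects[i].w == rects[j].x ||
                 rects[j].x + (int)rects[j].w == rects[i].x)) {
                rects[i].x = rects[i].x < rects[j].x ? rects[i].x : rects[j].x;
                rects[i].w += rects[j].w;
                rects[j] = rects[--n];   /* drop the absorbed rectangle */
                j = i;                   /* rescan against the merged one */
            }
        }
    }

    for (int i = 0; i < n; i++)
        printf("logical monitor %d: %ux%u+%d+%d\n",
               i, rects[i].w, rects[i].h, rects[i].x, rects[i].y);

    XRRFreeScreenResources(res);
    XCloseDisplay(dpy);
    return 0;
}

A window manager doing this would maximize into the merged rectangle instead of
the individual CRTC rectangles.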
If the real output devices are still exposed through the protocol, it might make
sense to add new properties describing their relative positions to make it
easier for clients to lay them out in the right order. This might be useful for
power-walls too.
The problem with the first two options is that driving these monitors consumes
two crtcs. If we present them as a single output to applications, they'll
assume they can just assign a single crtc to that output and use the
remaining crtcs for something else. I suspect that deleting crtcs or
otherwise marking them as used as a side effect of setting a mode on a different
crtc is going to explode a lot of existing applications.
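For context, this is the kind of planning loop I'd expect to break: clients walk
the output list and pencil in one crtc per output from the candidate list RandR
reports, assuming the remaining crtcs stay free. A sketch of that pattern:

/* Illustration of the one-output-one-CRTC assumption: plan a configuration
 * by claiming the first candidate CRTC for each connected output.  If an
 * output silently consumed a second CRTC, this plan would be wrong. */
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) return 1;

    XRRScreenResources *res = XRRGetScreenResources(dpy, DefaultRootWindow(dpy));
    int planned = 0;

    for (int i = 0; i < res->noutput; i++) {
        XRROutputInfo *info = XRRGetOutputInfo(dpy, res, res->outputs[i]);
        if (info->connection == RR_Connected && info->ncrtc > 0) {
            printf("would drive %s with crtc 0x%lx (one of %d candidates)\n",
                   info->name, (unsigned long)info->crtcs[0], info->ncrtc);
            planned++;
        }
        XRRFreeOutputInfo(info);
    }

    /* The client's accounting: one crtc consumed per connected output. */
    printf("%d of %d crtcs would be in use\n", planned, res->ncrtc);

    XRRFreeScreenResources(res);
    XCloseDisplay(dpy);
    return 0;
}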
~~
Regardless of what we do about the current crop of 4k monitors, one feature I
would like to add is a standardized OutputGroup property. Multiple outputs with
the same value of OutputGroup should be considered (both by clients and the
server) as a single logical monitor. This would affect the Xinerama information
presented by rrxinerama.c, and window managers that use RandR 1.2 directly would
be encouraged to consider output groups in their UI behavior.
The X server could configure OutputGroups automatically when setting up the
initial configuration based on the presence of tiled displays, and clients could
reconfigure the groups at runtime to get different behavior if desired.
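For concreteness, a client consuming the proposed property might look something
like this; OutputGroup doesn't exist yet, so the property name and the 32-bit
integer encoding below are just my assumption:

/* Hypothetical sketch: read a proposed "OutputGroup" property, assumed to
 * be a single 32-bit integer per output.  Outputs that report the same
 * value would be treated as one logical monitor. */
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/Xatom.h>
#include <X11/extensions/Xrandr.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) return 1;

    XRRScreenResources *res = XRRGetScreenResources(dpy, DefaultRootWindow(dpy));
    Atom group_atom = XInternAtom(dpy, "OutputGroup", False); /* proposed name */

    for (int i = 0; i < res->noutput; i++) {
        XRROutputInfo *info = XRRGetOutputInfo(dpy, res, res->outputs[i]);
        Atom type;
        int format;
        unsigned long nitems, after;
        unsigned char *data = NULL;

        if (XRRGetOutputProperty(dpy, res->outputs[i], group_atom,
                                 0, 1, False, False, XA_INTEGER,
                                 &type, &format, &nitems, &after,
                                 &data) == Success &&
            type == XA_INTEGER && format == 32 && nitems == 1) {
            /* Same value => same logical monitor; merge geometry accordingly. */
            printf("%s: OutputGroup %ld\n", info->name, *(long *)data);
        } else {
            printf("%s: no OutputGroup\n", info->name);
        }
        if (data) XFree(data);
        XRRFreeOutputInfo(info);
    }

    XRRFreeScreenResources(res);
    XCloseDisplay(dpy);
    return 0;
}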
Does this sound like a reasonable extension to RandR?
--
Aaron