[RFC PATCH v2] Add xdg-output protocol

Pekka Paalanen ppaalanen at gmail.com
Tue Jul 18 11:55:49 UTC 2017


On Fri, 14 Jul 2017 16:40:32 +0800
Jonas Ådahl <jadahl at gmail.com> wrote:

> This E-mail is quite long, but I tried to reply to some parts.

Hi Jonas, Olivier,

appreciated. :-)

TL;DR, I don't think there is anything in this email that would
actually amount to an objection to the xdg-output protocol proposal.
This discussion has taken off in an academic direction about everything
surrounding the protocol proposal.


> On Wed, Jul 12, 2017 at 12:07:29PM +0300, Pekka Paalanen wrote:
> > On Fri, 7 Jul 2017 04:21:57 -0400 (EDT)
> > Olivier Fourdan <ofourdan at redhat.com> wrote:
> >   
> > > Hi Pekka,
> > >   
> > > > it's very hard for me to wrap my head around this, so the below may
> > > > sound a bit harsh, sorry. I don't mean to rant, but I feel there is
> > > > something fundamental amiss. I am diving back into the high-level
> > > > design which is fairly separated from the xdg_output interface.    
> > > 
> > > No worries, but my goal with the protocol proposal (being an Xwayland
> > > specific proposal initially, or merely an additional wl_output event
> > > or now a more generic xdg_output protocol to be extended in the
> > > future with whatever people want for desktop output uses) is to make
> > > Xwayland work with the current existing design/implementations, not
> > > to advocate for or against an existing design or particular
> > > compositor implementation.  
> > 
> > Hi Olivier,
> > 
> > if the intention is not to evaluate whether the current implementations
> > are actually workable in the long run, then should the new interface be
> > named so that we can throw it away if it actually doesn't work? And not
> > design anything else directly on top of it, so that it remains
> > relatively easy to throw away, i.e. stop using?
> >   
> > > 
> > > WRT fractional scaling, my understanding of the design in mutter is
> > > based on this document:
> > > 
> > > https://mail.gnome.org/archives/gnome-shell-list/2017-June/msg00000.html
> > > 
> > > I cannot really comment on the design decisions, nor how those
> > > decisions were made, Jonas would probably be in a better position for
> > > this.  
> > 
> > That email talks about a new feature, separated coordinate spaces:
> > logical and physical. I believe that is the correct design.
> > 
> > But I have understood that you want to make Xwayland work with the
> > design before that one, which I cannot see how it could work. Do you
> > really need to make Xwayland work well with that old design?  
> 
> The benefit of the current/old design (in mutter) is that X11 clients
> don't particularly regress in functionality, compared to how it worked
> before. HiDPI aware clients work just as well, and non-HiDPI clients
> work just as badly.

Very well, it's your choice to support two paths there.

> Naturally, if we can make both HiDPI aware and HiDPI unaware X11 clients
> work well, that would be optimal, but I find it hard to believe we can
> make that happen without adaptations to HiDPI aware X11 clients (like
> the mentioned EWMH extensions or something).

Not being a user, I would have a hard time understanding that other
design decision - why enhance modern toolkits to run better via
Xwayland instead of porting apps to Wayland. I suppose the obvious
answer is apps you don't actually control but which still use GTK+ et
al. But how many of those are actually also HiDPI aware? And if you
implement that, do you not get into consistency problems with input and
output coordinate systems?

The two paragraphs you wrote above talk about two cases, which in my
mind are mutually exclusive.

> > 
> > I fully agree with section 1, Overall approach.
> > 
> > Section 5 mentions an hwdb-like database but keyed by EDID data. This is
> > something Keith Packard needs for his VR work to identify HMDs and I
> > discussed with him about it a bit. I think the scaling info would be a
> > good fit there.
> > 
> > Section 7 touches the game and fullscreen topic, which I believe is
> > important for deciding how RandR should advertise output resolutions.
> > 
> > Section 8 is exactly what we are talking about in this thread, right?  
> 
> Right. Let's split up Xwayland support into three distinguishable parts:
> 
> (A) Plain old Xwayland support with no HiDPI awareness or anything
> 
> (B) Fullscreen windows with buffer size matching the monitor framebuffer
> it is fullscreen on
> 
> (C) HiDPI aware X11 clients.
> 
> 
> For (A), this protocol solves the problem more or less completely
> without any further changes. No input transformation changes needed
> anywhere, not in Xwayland, not in any compositor. It is just a way to
> communicate the logical layout of the screen to Xwayland so that it can
> position its buffer-scale-1 windows in a grid that matches the global
> compositor coordinate space.

Right, this is starting to slowly sink in. ;-)
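
For reference, a minimal sketch of how Xwayland (or any client) could
consume the proposed events for case (A). The interface and event names
(zxdg_output_v1, logical_position, logical_size, done) are my
assumption of what wayland-scanner would generate from the RFC, and may
well change before the protocol settles:

/* Assumes the wayland-scanner generated client header for the
 * proposed xdg-output protocol; all names follow the RFC and may
 * change. */
#include <stdint.h>
#include <stdio.h>
#include <wayland-client.h>
#include "xdg-output-unstable-v1-client-protocol.h"

struct output_info {
	int32_t logical_x, logical_y;          /* position in the global logical space */
	int32_t logical_width, logical_height; /* size in logical units */
};

static void handle_logical_position(void *data,
				    struct zxdg_output_v1 *xdg_output,
				    int32_t x, int32_t y)
{
	struct output_info *info = data;
	info->logical_x = x;
	info->logical_y = y;
}

static void handle_logical_size(void *data,
				struct zxdg_output_v1 *xdg_output,
				int32_t width, int32_t height)
{
	struct output_info *info = data;
	info->logical_width = width;
	info->logical_height = height;
}

static void handle_done(void *data, struct zxdg_output_v1 *xdg_output)
{
	struct output_info *info = data;
	/* With buffer-scale-1 windows, the RandR geometry can simply
	 * mirror this rectangle, and the X11 global coordinate space
	 * then matches the compositor's logical layout one to one. */
	printf("output at %d,%d, size %dx%d (logical units)\n",
	       info->logical_x, info->logical_y,
	       info->logical_width, info->logical_height);
}

static const struct zxdg_output_v1_listener xdg_output_listener = {
	.logical_position = handle_logical_position,
	.logical_size = handle_logical_size,
	.done = handle_done,
};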


> > What resolution do you want RandR to advertise to X11 apps?
> > 
> > If it is the physical output resolution, the window size in logical
> > units will be wrong unless XWM can reliably detect the X11 application
> > is actually intending to use the physical resolution. How would one do
> > that?  
> 
> This is about part (B).
> 
> How? I'd say, probably only using "best effort" emulation strategies,
> like detecting fullscreen windows matching the physical resolution set
> and assuming anything else should be hidden, then using wp_viewporter or
> something to make it the logical size, which should result in no
> scaling being done by the compositor.

You imply that the heuristics to detect fullscreen windows would be in
Xwayland. Is that feasible?

So far Xwayland has been very ignorant of any window management
concepts. I'm not sure if that is intentional or just inherited from
Xorg, i.e. whether there is an actual reason to avoid window management
in the X server.

As much as I'd like to have Xwayland wl_surfaces handled as any
wl_surface, I'm not so sure about that anymore.
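
For illustration of the wp_viewporter idea above, a rough sketch of
what the Xwayland side might do for a detected fullscreen window: the
X11 client has drawn a buffer at the physical output resolution, and
the surface is presented at the output's logical size so that the
compositor ideally does no scaling and can hit direct scanout. The
wp_viewport requests are real; everything else is made up for the
example:

/* Assumes the wayland-scanner generated header for the viewporter
 * protocol (viewporter-client-protocol.h). */
#include <wayland-client.h>
#include "viewporter-client-protocol.h"

struct fullscreen_window {
	struct wl_surface *surface;
	struct wl_buffer *buffer;     /* pixels at the physical output resolution */
	struct wp_viewport *viewport; /* from wp_viewporter.get_viewport() */
};

static void present_fullscreen(struct fullscreen_window *win,
			       int32_t logical_width, int32_t logical_height)
{
	/* Attach the full-resolution buffer... */
	wl_surface_attach(win->surface, win->buffer, 0, 0);

	/* ...but declare the surface to occupy only the logical size.
	 * If logical size times the output scale equals the buffer
	 * size, no resampling should be needed. */
	wp_viewport_set_destination(win->viewport,
				    logical_width, logical_height);

	wl_surface_damage(win->surface, 0, 0,
			  logical_width, logical_height);
	wl_surface_commit(win->surface);
}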

> > If it is the logical output size, the window size in logical units will
> > be correct, but the X11 application cannot provide a buffer of the
> > physical output resolution, which means the Wayland compositor must
> > scale the image. If the buffer can hit direct scanout, that should be
> > no problem, but otherwise it could be a performance issue. (I agree to
> > prefer this option.)  
> 
> It'll be a regression quality-wise though, as something somewhere will
> have to scale up from the logical to the physical resolution. Long term,
> this is not acceptable, as it would mean that games and other
> applications where graphics quality is important, and which won't be
> ported, would regress.

It depends on the application. Applications using pixel units would be
unreadable if not scaled, but games could often be using relative
units, so they would scale automatically and effectively be "HiDPI
aware" even without intending it. And then there will likely be some
where the 3D view scales nicely but some UI elements are in pixel
units, perhaps lacking a scaling setting that would make them readable.

I think you are right, and we should aim for the X11 buffer matching
the physical output resolution. The other case, where scaling is
actually desired, could be solved via, say, RandR video-mode-induced
scaling. Doing it the other way around would not work.

> However, as a first step I think this is what we should do since it's
> trivial. This is what (A) does.

Right.

> > 
> > In any case, there is a single (default) "scaling factor" covering all
> > X11 clients in an Xwayland instance, because neither Xwayland nor XWM
> > have per-window knowledge of the scaling the X11 app used. Also, X11
> > apps are laid out in a global coordinate space spanning all outputs
> > (within a X11 SCREEN), and also input works in that global coordinate
> > space with trivial conversions to per-window coordinates. I assume
> > Xwayland converts per-wl_surface input coordinates to global input
> > coordinates in the X11 space before dispatching to the X11 windows,
> > which means that the Wayland server must send input events taking that
> > into account. How will input work in the proposals?
> > 
> > I do not like the idea of special-casing the Xwayland Wayland client in
> > the Wayland compositor outside of XWM, even less if it requires special
> > mangling for input as well. How much special casing do you do? Or maybe
> > I'm just biased and it would actually be fairly easy to implement in
> > Weston with a little bit of API... Quentin might have some opinions
> > here.  
> 
> Are you talking about (C) here? I don't really have a thought-out idea
> or plan for how it should work; EWMH is mentioned, but whether Xwayland
> should play a part or not, I'm not sure. I suspect it'll require changes
> to X11 clients to make it work, though.

It is about (C), yes, but in the generic case where you have both
low-dpi and hi-dpi X11 windows up. Think about a hi-dpi X11 main
window with a low-dpi menu window on it, be that a child window or an
override-redirect window. Can you make that work, both in window
positioning and in input coordinates?

I haven't thought it through, but I have a strong suspicion it cannot
work.

Yes, the mangling paragraph is about (C).

> > > Regarding this particular discussion about the need of xdg_output and
> > > logical size/position, I am completely open to suggestions: how would
> > > you see Xwayland working with both weston and mutter as they are now,
> > > and mutter in the future when it implements fractional scaling, without
> > > adding logical size in xdg_output (or even not adding xdg_output at
> > > all)?
> > 
> > I wouldn't say "without", I just do not understand how it all is
> > supposed to work in a mixed-dpi hardware setup. If we design an
> > interface that just cannot work in a mixed-dpi setup, we should be
> > able to throw it away later.
> > 
> > So I'm starting to think that the way forward is to design the new
> > interface such that it is easy to deprecate if it turns out to be a
> > dead end, or keep it if it turns out to be future-proof.
> 
> Well, this is an unstable wayland-protocols protocol, so we are
> completely free to abandon it, right? If we need xdg_output to do
> something else after having abandoned the purpose of it now, we're also
> free to reinvent it.

Yeah, that should help a lot. I was thinking of stable xdg_shell
protocols needing xdg_output for something, which is not possible if
xdg_output is still unstable.


> > Maybe the definition of "mixed-dpi support" is what we disagree here.
> > 
> > In my mind, Xwayland cannot provide full mixed-dpi support, because X11
> > is simply incapable of it with current toolkits. It can provide partial
> > mixed-dpi support however, where the Wayland compositor is
> > automatically scaling, assuming all X11 apps are configured to use the
> > same scale factor.
> > 
> > It is the same as with Wayland apps: HiDPI unaware apps get partial
> > mixed-dpi support, while HiDPI aware apps get the full mixed-dpi
> > support.
> > 
> > This would classify all HiDPI-aware X11 apps as being somewhere between
> > HiDPI unaware and aware Wayland apps: they cannot communicate their
> > scaling factor, but they can be configured as a group to always draw
> > with a given scaling factor.
> >   
> > > However, nothing would stop an x11 client or toolkit from querying
> > > the monitor's mode and physical size (using xrandr), computing the
> > > current DPI depending on the monitor it resides on, and adapting its
> > > rendering based on that. X11 has all the mechanisms in place for
> > > that, it's just that most clients won't do that (iirc, Firefox had
> > > something like that at some point, not sure it's still used though)
> > 
> > That's actually a case I didn't even think of. I've only been thinking
> > about apps that want to make their window cover an output, like games.
> > This is a very good topic to bring up.
> >   
> > > That makes a wide range of different x11 clients who behave
> > > differently:
> > > 
> > >  - Plain old x11 clients, who don't know anything about DPI
> > >  - gtk+/GNOME clients, who base their rendering on a specific
> > > xsettings Gdk/WindowScalingFactor and/or an envvar read by clutter,
> > >  - Some other app trying to compute DPI themselves (very few, I think
> > > of Firefox, LibreOffice maybe? not sure at all about those)
> > > 
> > > I don't see how Xwayland could change its behavior for each app,
> > > xrandr is not per client. I guess we could come up with a new window
> > > property that Xwayland could monitor and set the buffer scale
> > > accordingly, maybe? Anyhow, that's going off topic wrt the xdg_output
> > > protocol, I'm afraid.  
> > 
> > I agree, I don't see the worth in implementing per-X11-window scales.
> > 
> > Therefore we need to stick with one scale over all X11 clients. If
> > there are plain old clients in that set, then the only workable factor
> > for X11 is scale=1. Would you agree with that?
> >   
> > > The "benefit" I see in "lying" to the clients in xrandr by
> > > advertising a lower mode than actual is that it solves the problem in
> > > a consistent way for all these cases even those clients who try to
> > > compute the DPI themselves).  
> > 
> > How does it solve it for all the cases? Does it not help only with the
> > third case, while the first and second are unaffected? Not to belittle
> > the benefit, just wanting to be accurate.  
> 
> For the first case, this proposal makes it at least work at all, i.e.
> xterm becomes readable by default.

The only thing xterm needs to be readable is that the compositor scales
it up on HiDPI screens, which I believe already works in Weston today.
Hence I am again confused, since the only case where it would not work
is the old mutter behaviour, which to my understanding can never work:
xterm is not HiDPI aware and does not look at RandR info AFAIK.

Did I misunderstand what proposal you referred to?

> For the second case, in GNOME, we (at least now) simply set the Xsetting
> scaling factor to 1 via gnome-settings-daemon.

You must be talking about the new mutter design where the compositor is
scaling the X11 windows, right?

> For the third case, as Olivier mentions, it makes self-calculating
> clients think the resolution isn't that high, so they might avoid
> scaling up.

Right, those are the only ones actually looking at RandR.

You can see how having to consider both the old and the new mutter
designs is confusing me. And I haven't even thought about fractional
scaling yet.

Taking a step back, what was the point of this "chapter"? Was it to
violently agree on how to make all X11 clients use the same scaling
factor: 1?



> > > If randr advertises the output mode 3840×2160 as 2560×1620 with a
> > > fractional scale of 1.5, then the compositor would scale the buffer
> > > up by that fractional scaling factor, i.e. 1.5, which gives the
> > > expected 3840×2160.
> > > 
> > > I don't see how downscaling an already tiny x11 window would work.
> > > Take a basic x11 app, say xterm, with its default fixed font size. It
> > > looks "fine" on a low DPI monitor, but is completely unreadable on a
> > > hidpi screen (at least for my old eyes), so the goal would be to
> > > scale it up, not down.
> > > 
> > > The scale factor of 2 advertised in wl_output would be for Wayland
> > > native clients, so they can set their buffer scale to that scale 2,
> > > and then be downscaled by the compositor to achieve the expected
> > > fractional scale of 1.5 with best rendering on screen.  
> > 
> > Oh yes, I forgot that the X11 apps in this case are always drawing with
> > scale=1, so they are expected to be scaled up indeed. I must have been
> > thinking about all of X11 land being configured with scale=2.
> >   
> > > If the compositor was to scale Xwayland surfaces by 2 as well and  
> > 
> > I think you mean the X11 apps are configured to draw in scale=2, right?
> >   
> > > then downscale to achieve the expected fractional scale of 1.5, it
> > > would downscale by a factor of (2÷1.5) = 1.33, in which case it would
> > > simply advertise a logical size of 5120×2880 (i.e. [3840 × 2 ÷
> > > 1.5]×[2160 × 2 ÷ 1.5]). That's the whole idea behind the logical
> > > size: whatever the compositor does eventually, Xwayland gets it right
> > > as long as the compositor advertises the expected size once the
> > > scaling factor - whatever it is - is applied, if that makes any sense.
> > > 
> > > At least it's how I understood it from the description from:
> > > 
> > > https://mail.gnome.org/archives/gnome-shell-list/2017-June/msg00000.html
> > > 
> > > But I might be wrong in my understanding :)  
> > 
> > Yes, now we are onto something. This feels good, considering outputs in
> > isolation. I didn't actually go through the numbers yet to make sure the
> > example is right, though.  
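
Going through Olivier's numbers quickly now, they do seem to check out.
A small sketch of the arithmetic, with purely illustrative names:

/* Fractional-scaling example: physical mode 3840x2160, X11 side
 * drawing at scale 2, desired fractional output scale 1.5. */
#include <stdio.h>

int main(void)
{
	const double physical_w = 3840.0, physical_h = 2160.0;
	const double x11_scale = 2.0;     /* what the X11 apps draw at */
	const double output_scale = 1.5;  /* desired fractional scale */

	/* Size advertised to Xwayland, chosen so that after the
	 * compositor downscales by x11_scale / output_scale the result
	 * fills the physical mode exactly. */
	double logical_w = physical_w * x11_scale / output_scale; /* 5120 */
	double logical_h = physical_h * x11_scale / output_scale; /* 2880 */
	double downscale = x11_scale / output_scale;              /* 1.33 */

	printf("advertise %gx%g, downscale by %.2f -> %gx%g on screen\n",
	       logical_w, logical_h, downscale,
	       logical_w / downscale, logical_h / downscale);     /* 3840x2160 */
	return 0;
}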
> 
> Telling Xwayland the logical output is twice as large as it actually
> is, and then assuming surfaces are scale=2, would, I think, make HiDPI
> aware clients draw properly, but it'd also make things like xterm and
> gitk completely unusable, right?

Yes, I think so.

I have understood that in the old mutter design and in X11 in general,
non-HiDPI-aware clients on a HiDPI output are always unusable. So that
would not be a regression.

It would be a regression in Weston, because Weston is scaling up
low-dpi X11 windows. (It's also scaling up hi-dpi X11 windows, but that
is an existing bug.)


> > No, I believe Xwayland converts input coordinates from per-wl_surface
> > to global X11 coordinates, and then does everything in the global X11
> > coordinate system which is all that X11 apps ever know about.
> > 
> > ISTR there have been Xwayland patches going back and forth on that
> > design, so I'm not sure what the current state actually is. It
> > definitely is tricky code in Xwayland.  
> 
> IIRC it'll just take the x/y in surface-local coordinates and add the
> window position to get the "X" coordinate.

Right. What is "window position" if some windows are scale=1 and
some are scale=2?

If an X11 client positions one window relative to another, how do they
map to the global X11 coordinate system if the window scales differ?

Do we need to start scaling coordinates inside Xwayland? But all
coordinate spaces are visible to X11 clients, who do not expect any
scaling to be going on between them.
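
To make the concern concrete, a rough sketch of the conversion Jonas
describes above and of where a per-window scale would have to leak in.
This is hypothetical pseudo-structure, not actual Xwayland code:

#include <stdint.h>

struct xwl_window_sketch {
	int32_t x, y;   /* window position in the global X11 space */
	int32_t scale;  /* per-window scale, if we allowed such a thing */
};

/* Today, with a single scale, it is a plain translation:
 * global = window position + surface-local coordinate. */
static void surface_to_global_single(const struct xwl_window_sketch *win,
				     int32_t sx, int32_t sy,
				     int32_t *gx, int32_t *gy)
{
	*gx = win->x + sx;
	*gy = win->y + sy;
}

/* With mixed scales, the surface-local coordinates would have to be
 * scaled before adding the position. But X11 clients compute in these
 * coordinate spaces themselves and expect no scaling between them, so
 * e.g. a menu positioned relative to a parent with a different scale
 * ends up in the wrong place. */
static void surface_to_global_mixed(const struct xwl_window_sketch *win,
				    int32_t sx, int32_t sy,
				    int32_t *gx, int32_t *gy)
{
	*gx = win->x + sx * win->scale;
	*gy = win->y + sy * win->scale;
}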

This is my gripe with any proposal to start supporting X11 windows with
differing scales, and the rationale behind my "all X11 must have a
single scaling factor". Granted, it is all based on the premise that we
apply the same partial mixed-dpi support to X11 as we have for Wayland
clients: the Wayland compositor scales as needed. That premise, in
turn, exists because otherwise there are cases where the application
would be unreadable.


Thanks,
pq