internal screen concept
Jesse Barnes
jbarnes at virtuousgeek.org
Tue Mar 22 09:49:45 PDT 2011
On Tue, 22 Mar 2011 10:27:13 +1000
Dave Airlie <airlied at gmail.com> wrote:
> So I've been trying to work out how to add GPU offload support to the
> X server in some sort of useful fashion.
>
> Currently the prototype just creates two screens, one for each GPU, and
> does some DRI2 magic to make the front buffer shared.
>
> However, this leads to a lot of ugliness on the protocol end, since you
> have two protocol screens, and lots of things start trying to do bad
> things with them.
>
> But we really do need a DIX-level screen, as EXA and DRI2 rely on having
> one, and we need EXA/DRI2 for the offload driver to work.
>
> My first solution involved throwing the XFree86 DDX out and starting
> again, but that didn't seem like it would be acceptable.
>
> So I tried two methods quickly:
>
> a) Add the concept of an internal screen to ScreenRec and ScrnInfoRec, add
> a new value screenInfo.numProtocolScreens, and add lots of
> if (screenInfo.screens[i]->internalScreen) continue; checks in the places
> where the protocol meets the screen info struct and we want to skip
> internal screens.
> Also return numProtocolScreens for the number of roots etc., and add a
> call into the DIX to set a screen to internal state.
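>
> Something like this, roughly (untested sketch; the DIX setter name here is
> made up):
>
>     /* in ScreenRec (and mirrored in ScrnInfoRec): */
>     Bool internalScreen;    /* never exposed over the wire */
>
>     /* screenInfo grows a protocol-visible count alongside numScreens */
>     int numProtocolScreens;
>
>     /* DIX call for the DDX to flag a screen as internal (made-up name);
>      * assumes numProtocolScreens starts out equal to numScreens */
>     void SetScreenInternal(ScreenPtr pScreen)
>     {
>         if (!pScreen->internalScreen) {
>             pScreen->internalScreen = TRUE;
>             screenInfo.numProtocolScreens--;
>         }
>     }
>
>     /* and anywhere the protocol walks the screen list: */
>     for (i = 0; i < screenInfo.numScreens; i++) {
>         if (screenInfo.screens[i]->internalScreen)
>             continue;
>         /* ... existing per-screen protocol work ... */
>     }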
>
> b) Split screenInfo.screens into screenInfo.screens and
> screenInfo.internalScreens, add a new AddInternalScreen interface, and have
> the XFree86 DDX pick one or the other. This ran into the problem that the
> XFree86 DDX mostly assumes that the index into screenInfo.screens is equal
> to the index into xf86Screens, and that knowledge is baked into the code in
> lots of places. Also, lots of code gets called from ScreenInit, before
> pScrn->pScreen is set up, so we can't always just blindly dereference that
> value.
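>
> On the DIX side that would look roughly like this (sketch only, untested;
> numInternalScreens is a name I just made up):
>
>     typedef struct _ScreenInfo {
>         /* ... existing fields ... */
>         int        numScreens;
>         ScreenPtr  screens[MAXSCREENS];          /* protocol-visible */
>         int        numInternalScreens;
>         ScreenPtr  internalScreens[MAXSCREENS];  /* offload-only, no root */
>     } ScreenInfo;
>
>     /* parallel entry point to AddScreen() */
>     int AddInternalScreen(Bool (*pfnInit)(int /*index*/,
>                                           ScreenPtr /*pScreen*/,
>                                           int /*argc*/,
>                                           char ** /*argv*/),
>                           int argc, char **argv);
>
> with the XFree86 DDX calling one or the other from its screen init path,
> depending on whether the screen should be protocol-visible.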
>
> Now I could go and split xf86Screens as well, but that gets into rewriting
> territory where I'd need to add a new probe ABI, since the code works by
> calling DDX -> driver Probe -> DDX PCI code -> DDX add screen, and later on
> DIX add screen.
>
> Anyways before I waste another few days I thought I'd throw it out
> there to see if anyone else has an idea or clue.
Yuck... yeah tying this into the DDX looks ugly no matter what. Maybe
you should just punt and support it under Wayland only. :)
What does this look like from the app side? How does a given app end
up running on the offload GPU? Is there a GLX or EGL extension that
adds bits to the config for choosing? Or a separate X display?
--
Jesse Barnes, Intel Open Source Technology Center