Muxless Dual AMD + Xorg 1.14 = blank screen

Alexander Konotop alexander.konotop at gmail.com
Thu Sep 26 23:45:45 PDT 2013


Thank you very much!

I've enabled compositing and now I'm able to run games in windowed
mode. That gives almost twice the fps in Sauerbraten and Xonotic.
Fullscreen still gives a blank screen. Maybe Xfce's window manager
detects when a fullscreen application starts and disables compositing,
or something else happens.
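
In case it's xfwm4 getting in the way: the compositor state can be
checked, and fullscreen unredirection turned off, with xfconf-query,
roughly like this (assuming xfwm4; the unredirect_overlays property is
my guess at what kicks in for fullscreen windows and may not exist in
every build):

$ xfconf-query -c xfwm4 -p /general/use_compositing              # prints true/false
$ xfconf-query -c xfwm4 -p /general/unredirect_overlays -s false # keep fullscreen windows composited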

It's also interesting that glxgears now shows ~1100 fps with the
internal card and compositing enabled, but only ~970 fps with the
external card (DRI_PRIME=1).
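
For reference, those numbers come from runs roughly like this;
vblank_mode=0 (a Mesa environment variable) can be added to make sure
vsync isn't capping the result:

$ glxgears                              # integrated Sumo (HD 6480G)
$ DRI_PRIME=1 glxgears                  # discrete Caicos (HD 6470M)
$ vblank_mode=0 DRI_PRIME=1 glxgears    # rule out vsync capping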

On Thu, 26 Sep 2013 16:56:14 -0400
Alex Deucher <alexdeucher at gmail.com> wrote:

> On Thu, Sep 26, 2013 at 3:13 PM, Alexander Konotop
> <alexander.konotop at gmail.com> wrote:
> > Hello, guys :-)
> >
> > Could someone tell me whether this is a bug, whether it simply
> > won't work because it's still unimplemented, or whether I'm doing
> > something wrong:
> >
> > I have an AMD APU + external AMD card (muxless). Here's lspci -vv
> > output for them:
> >
> > 00:01.0 VGA compatible controller: Advanced Micro Devices, Inc.
> > [AMD/ATI] Sumo [Radeon HD 6480G] (prog-if 00 [VGA controller])
> >         Subsystem: Hewlett-Packard Company Device 168c
> >         Control: I/O+ Mem+ BusMaster+ SpecCycle- MemWINV- VGASnoop- ParErr- Stepping- SERR- FastB2B- DisINTx+
> >         Status: Cap+ 66MHz- UDF- FastB2B- ParErr- DEVSEL=fast >TAbort- <TAbort- <MAbort- >SERR- <PERR- INTx-
> >         Latency: 0, Cache Line Size: 64 bytes
> >         Interrupt: pin A routed to IRQ 48
> >         Region 0: Memory at b0000000 (32-bit, prefetchable) [size=256M]
> >         Region 1: I/O ports at 7000 [size=256]
> >         Region 2: Memory at d8100000 (32-bit, non-prefetchable) [size=256K]
> >         Expansion ROM at <unassigned> [disabled]
> >         Capabilities: <access denied>
> >         Kernel driver in use: radeon
> >
> > 01:00.0 VGA compatible controller: Advanced Micro Devices, Inc.
> > [AMD/ATI] Seymour [Radeon HD 6400M/7400M Series] (prog-if 00 [VGA controller])
> >         Subsystem: Hewlett-Packard Company Radeon HD 6470M
> >         Control: I/O+ Mem+ BusMaster+ SpecCycle- MemWINV- VGASnoop- ParErr- Stepping- SERR- FastB2B- DisINTx+
> >         Status: Cap+ 66MHz- UDF- FastB2B- ParErr- DEVSEL=fast >TAbort- <TAbort- <MAbort- >SERR- <PERR- INTx-
> >         Latency: 0, Cache Line Size: 64 bytes
> >         Interrupt: pin A routed to IRQ 50
> >         Region 0: Memory at c0000000 (64-bit, prefetchable) [size=256M]
> >         Region 2: Memory at d8000000 (64-bit, non-prefetchable) [size=128K]
> >         Region 4: I/O ports at 6000 [size=256]
> >         Expansion ROM at d8020000 [disabled] [size=128K]
> >         Capabilities: <access denied>
> >         Kernel driver in use: radeon
> >
> >
> > First I read this page:
> > http://xorg.freedesktop.org/wiki/RadeonFeature/
> > especially the phrase "X Server 1.14 is required to support
> > rendering and display from different cards" and the "MOSTLY" entry
> > in the "Hybrid Graphics/PowerXpress/Enduro" row for my cards, and
> > subscribed to this mailing list to watch for news on Dual Graphics
> > support or bugs.
> >
> > Second, I waited for xorg 1.14 to appear in my distro.
> >
> > Third, I tried to set up my two cards using this post:
> > http://forums.linuxmint.com/viewtopic.php?f=49&t=139413
> > So I'm using xrandr to switch the PRIME card, and at first it looks
> > like it's working:
> >
> > $ glxinfo | egrep "(OpenGL vendor|OpenGL renderer|OpenGL version)"
> > OpenGL vendor string: X.Org
> > OpenGL renderer string: Gallium 0.4 on AMD SUMO
> > OpenGL version string: 3.0 Mesa 9.1.6
> >
> > $ DRI_PRIME=1 glxinfo | egrep "(OpenGL vendor|OpenGL renderer|OpenGL version)"
> > OpenGL vendor string: X.Org
> > OpenGL renderer string: Gallium 0.4 on AMD CAICOS
> > OpenGL version string: 3.0 Mesa 9.1.6
> >
> > But if I try to run a fullscreen game with DRI_PRIME=1, it shows a
> > blank screen. I can type "/quit" in the game's console (blindly)
> > and it really quits, so it's not a hang. Sometimes the X server
> > restarts while trying to run a game this way. Syslog and
> > Xorg.0.log are empty.
> >
> > After that I tried running something non-fullscreen (glxgears).
> > glxgears shows twice the fps, but its window is just black.
> >
> > So it seems like xrandr asks the X server to render on the second
> > GPU but doesn't ask it to use the first GPU to output the video.
> >
> > So maybe I need to configure xorg.conf somehow? I've read
> > somewhere that there's no need to do that if I use xrandr to
> > switch, and no need to restart the X server on muxless systems.
> 
> On a muxless system, there are often no displays connected to one of
> the GPUs so it can only be used for rendering.  You can't force X to
> start on that GPU because there are no displays.  You don't need an
> xorg config.
> 
> > Btw, if I restart the X server my xrandr settings fall back to
> > using the APU's GPU as primary.
> >
> > To start the X server I'm using lightdm.
> 
> You need to be using a compositing window manager to start with.  See
> this page for more info:
> http://nouveau.freedesktop.org/wiki/Optimus/
> 
> You can probably resize the window to make the display appear.  I
> think there may be some damage tracking issues that need to be sorted
> out in the xserver, but no one has had a chance to look into them yet.
> 
> Alex
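
For reference, the xrandr render-offload setup from the forum post
quoted above boils down to roughly the following (a sketch; provider
indices and names are system-specific, so check --listproviders
first):

$ xrandr --listproviders                 # find the discrete GPU's provider index
$ xrandr --setprovideroffloadsink 1 0    # e.g. provider 1 (discrete) renders for provider 0 (APU)
$ DRI_PRIME=1 glxinfo | grep "OpenGL renderer"   # should now report the Caicos GPU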


