Muxless Dual AMD + Xorg 1.14 = blank screen

Alexander Konotop alexander.konotop at gmail.com
Thu Sep 26 12:13:59 PDT 2013


Hello, guys :-)

Could someone tell me whether this is a bug, whether it simply isn't
implemented yet, or whether I'm doing something wrong:

I have an AMD APU plus a discrete AMD card (muxless). Here's the lspci -vv output for
both of them:

00:01.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Sumo [Radeon HD 6480G] (prog-if 00 [VGA controller])
	Subsystem: Hewlett-Packard Company Device 168c
	Control: I/O+ Mem+ BusMaster+ SpecCycle- MemWINV- VGASnoop- ParErr- Stepping- SERR- FastB2B- DisINTx+
	Status: Cap+ 66MHz- UDF- FastB2B- ParErr- DEVSEL=fast >TAbort- <TAbort- <MAbort- >SERR- <PERR- INTx-
	Latency: 0, Cache Line Size: 64 bytes
	Interrupt: pin A routed to IRQ 48
	Region 0: Memory at b0000000 (32-bit, prefetchable) [size=256M]
	Region 1: I/O ports at 7000 [size=256]
	Region 2: Memory at d8100000 (32-bit, non-prefetchable) [size=256K]
	Expansion ROM at <unassigned> [disabled]
	Capabilities: <access denied>
	Kernel driver in use: radeon
01:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Seymour [Radeon HD 6400M/7400M Series] (prog-if 00 [VGA controller])
	Subsystem: Hewlett-Packard Company Radeon HD 6470M
	Control: I/O+ Mem+ BusMaster+ SpecCycle- MemWINV- VGASnoop- ParErr- Stepping- SERR- FastB2B- DisINTx+
	Status: Cap+ 66MHz- UDF- FastB2B- ParErr- DEVSEL=fast >TAbort- <TAbort- <MAbort- >SERR- <PERR- INTx-
	Latency: 0, Cache Line Size: 64 bytes
	Interrupt: pin A routed to IRQ 50
	Region 0: Memory at c0000000 (64-bit, prefetchable) [size=256M]
	Region 2: Memory at d8000000 (64-bit, non-prefetchable) [size=128K]
	Region 4: I/O ports at 6000 [size=256]
	Expansion ROM at d8020000 [disabled] [size=128K]
	Capabilities: <access denied>
	Kernel driver in use: radeon


First, I read this page:
http://xorg.freedesktop.org/wiki/RadeonFeature/
in particular the phrase "X Server 1.14 is required to support rendering and display from different cards"
and the "MOSTLY" entry in the "Hybrid Graphics/PowerXpress/Enduro" row for my cards,
and I subscribed to this mailing list to follow any news on Dual Graphics support or bugs.

Second, I waited for Xorg 1.14 to appear in my distro.

Third, I tried to set up my two cards following this post:
http://forums.linuxmint.com/viewtopic.php?f=49&t=139413
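
Roughly, the setup there boils down to this (just a sketch from memory; the provider
indices are my assumption for this machine and should be checked against the actual
output of the first command):

$ xrandr --listproviders
$ xrandr --setprovideroffloadsink 1 0

i.e. the discrete card (provider 1) becomes a render offload source for the APU
(provider 0), which keeps driving the display.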
So I'm using xrandr to switch to the PRIME card, and at first it looks like it's working:

$ glxinfo | egrep "(OpenGL vendor|OpenGL renderer|OpenGL version)" 
OpenGL vendor string: X.Org
OpenGL renderer string: Gallium 0.4 on AMD SUMO
OpenGL version string: 3.0 Mesa 9.1.6

$ DRI_PRIME=1 glxinfo | egrep "(OpenGL vendor|OpenGL renderer|OpenGL version)" 
OpenGL vendor string: X.Org
OpenGL renderer string: Gallium 0.4 on AMD CAICOS
OpenGL version string: 3.0 Mesa 9.1.6

But if I try to run a fullscreen game with DRI_PRIME=1, it shows a blank screen.
I can type "/quit" into the game's console (blindly) and it actually quits, so it's not a hang.
Sometimes the X server restarts while I'm trying to run a game this way. Syslog and Xorg.0.log are empty.

After that I tried something non-fullscreen (glxgears). glxgears reports about twice as many FPS, but its window is just black.
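
(For reference, that test was simply the following, run from the same session:)

$ DRI_PRIME=1 glxgears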

So it seems like xrandr tells the X server to render on the second GPU, but doesn't tell it to use the first GPU to display the result.
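
As far as I understand, the provider roles can be double-checked with:

$ xrandr --listproviders

My assumption of what a correct setup should report: the discrete card lists
"Source Offload" among its capabilities, the APU lists "Sink Offload", and each
shows the other as an associated provider.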

So maybe I need to configure something in xorg.conf? I've read somewhere that there's no need to do that if I use xrandr to switch,
and no need to restart the X server on muxless systems.
By the way, if I restart the X server, my xrandr settings fall back to using the APU's GPU as primary.
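
In case an explicit config turns out to be needed after all, this is the kind of
minimal xorg.conf I would try (untested, just a sketch; the BusIDs are taken from
the lspci output above):

Section "Device"
	Identifier "APU"
	Driver     "radeon"
	BusID      "PCI:0:1:0"
EndSection

Section "Device"
	Identifier "Discrete"
	Driver     "radeon"
	BusID      "PCI:1:0:0"
EndSection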

To start the X server I'm using lightdm.

Best regards
Alexander

