Not to revive a long-since-deceased horse, but I thought I'd mention for posterity that this problem seems to be fixed in X.Org 7.1 RC2, I believe due to Bug #6751.

I do still get a hard lock if I enable DRI and try to use both cards at once, but as far as I know that has always been a problem...
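For anyone finding this thread later: the dual-card setup discussed below is normally done by pinning each Device section in xorg.conf to one card with a BusID, and leaving DRI out for the combination that hard-locks. A minimal sketch follows; the BusID values and identifiers are made-up examples, not taken from either machine in this thread, and the real addresses come from lspci:

===snip

Section "Device"
    Identifier  "Radeon9700-AGP"
    Driver      "radeon"
    BusID       "PCI:1:0:0"     # example AGP slot address; verify with lspci
EndSection

Section "Device"
    Identifier  "Radeon7000-PCI"
    Driver      "radeon"
    BusID       "PCI:2:9:0"     # example PCI slot address; verify with lspci
EndSection

===snip

Leaving Load "dri" out of the Module section disables DRI for the whole server, which is the blunt way to avoid the lockup I mentioned when both cards run at once.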
Thanks,
Ricky

On 3/23/06, Ricky Rivera <ricky.rivera@gmail.com> wrote:
> For the record, I tried 2.6.16-mm1 because I saw that the radeon memory
> mapping patch had been added to the DRM git tree, which -mm includes.
> Unfortunately, it didn't make a noticeable change. (This is with
> xorg-server and the ati driver from head.)
>
> --Ricky
>
> On 3/21/06, Thorsten Becker <thorsten.becker@gmx.de> wrote:
>> On Thursday, 16 March 2006 at 16:00, Thorsten Becker wrote:
>>> Xorg 7.0.0, kernel 2.6.15-gentoo-r1
>>>
>>> Hello,
>>>
>>> To create a dual-seat system I bought a Radeon 7000 PCI card for my
>>> Dell Dimension 4550, which already has a Radeon 9700 installed. The
>>> problem is: I can only get one card to work at a time. Trying to start
>>> an X server on the "wrong" card locks up the system (it doesn't even
>>> write an Xorg.0.log file).
>>>
>>> What I have tried so far:
>>>
>>> BIOS settings: the somewhat crippled BIOS in the Dell system has only
>>> a few settings that could be relevant. "Primary Video Controller" is
>>> one of them; it can be set to AGP or Auto.
>>> If set to AGP, the AGP card works: the VGA console is shown on the
>>> monitor connected to that card, and I can start an X server for it.
>>> If set to Auto, the PCI card works, and I can start an X server for
>>> that card.
>>>
>>> But if it is set to AGP and I try to start an X server for the PCI
>>> card, the result is a complete system lockup, and the same happens the
>>> other way round.
>>>
>>> I tried to get some useful debugging output by starting X for the
>>> "wrong" card via ssh with the -verbose option. Such a log can be found
>>> here:
>>> http://www.tuxdesk.de/Xremote.log
>>>
>>> The xorg.conf can be found here:
>>> http://www.tuxdesk.de/xorg.conf.2120
>>>
>>> One time a short Xorg log was written. I put it here:
>>> http://www.tuxdesk.de/Xorg.0.log
>>
>> Now I managed to get a longer log:
>> http://www.tuxdesk.de/Xorg.1.log
>>
>> The last things I see:
>>
>> ===snip
>>
>> (II) Loading /usr/lib/xorg/modules/libvgahw.so
>> (II) Module vgahw: vendor="X.Org Foundation"
>>     compiled for 7.0.0, module version = 0.1.0
>>     ABI class: X.Org Video Driver, version 0.8
>> (II) RADEON(0): vgaHWGetIOBase: hwp->IOBase is 0x03b0, hwp->PIOOffset is 0x0000
>> (==) RADEON(0): RGB weight 565
>> (II) RADEON(0): Using 6 bits per RGB (8 bit DAC)
>> (II) Loading sub module "int10"
>> (II) LoadModule: "int10"
>> (II) Loading /usr/lib/xorg/modules/libint10.so
>> (II) Module int10: vendor="X.Org Foundation"
>>     compiled for 7.0.0, module version = 1.0.0
>>     ABI class: X.Org Video Driver, version 0.8
>> (II) RADEON(0): initializing int10
>> (**) RADEON(0): Option "NoINT10" "true"
>> (--) RADEON(0): Chipset: "ATI Radeon VE/7000 QY (AGP/PCI)" (ChipID = 0x5159)
>> (--) RADEON(0): Linear framebuffer at 0xe0000000
>> (--) RADEON(0): VideoRAM: 8192 kByte (64 bit SDR SDRAM)
>> (II) RADEON(0): PCI card detected
>> (**) RADEON(0): Forced into PCI mode
>> (II) RADEON(0): Color tiling enabled by default
>> (II) Loading sub module "ddc"
>> (II) LoadModule: "ddc"
>> (II) Loading /usr/lib/xorg/modules/libddc.so
>> (II) Module ddc: vendor="X.Org Foundation"
>>     compiled for 7.0.0, module version = 1.0.0
>>     ABI class: X.Org Video Driver, version 0.8
>> (II) Loading sub module "i2c"
>> (II) LoadModule: "i2c"
>> (II) Loading /usr/lib/xorg/modules/libi2c.so
>> (II) Module i2c: vendor="X.Org Foundation"
>>     compiled for 7.0.0, module version = 1.2.0
>>     ABI class: X.Org Video Driver, version 0.8
>> (II) RADEON(0): I2C bus "DDC" initialized.
>>
>> ===snip
>>
>> So I think something strange might be happening when the I2C bus is
>> initialized. If I understand correctly, I2C is only needed to
>> communicate with the monitor, so to rule it out I would like to disable
>> it. But setting the option "NoDDC" to "yes" did not help.
>>
>> Any ideas?
>>
>> Thorsten
>> _______________________________________________
>> xorg mailing list
>> xorg@lists.freedesktop.org
>> http://lists.freedesktop.org/mailman/listinfo/xorg