Two Radeon Cards, Dell Dimension, Linux and x.org

Dave Airlie airlied at gmail.com
Mon May 1 22:56:21 PDT 2006


> Not to revive a long since deceased horse, but I just thought I'd mention
> for posterity that this problem seems to be fixed in XOrg 7.1 RC2.  I
> believe due to Bug #6751.
>
> I do still get a hard-lock if I enable DRI and try to use both cards at
> once, but as far as I know this has always been a problem...

So it works until you run glxgears on both cards, or does it blow up before then?

Could you try rebooting, running modprobe radeon, and then starting X, to
see if it makes any difference?
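
Something along these lines (a rough sketch; exact commands depend on your
setup, and the display number is just an example):

```shell
# Reboot first so both cards start from a clean state, then load the
# DRM kernel module by hand before any X server touches the hardware:
modprobe radeon

# Start X for the problem card with extra logging so we can see how
# far it gets before the lockup:
startx -- :1 -verbose 6
```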

Dave.
>
> Thanks,
> Ricky
>
>
> On 3/23/06, Ricky Rivera <ricky.rivera at gmail.com> wrote:
> >
> > For the record, I tried 2.6.16-mm1 because I saw that the radeon memory
> > mapping patch had been added to the DRM git tree, which -mm includes.
> > Unfortunately, it didn't make a noticeable change.  (This is with
> > xorg-server and the ati driver from head.)
> >
> >
> > --Ricky
> >
> >
> >
> > On 3/21/06, Thorsten Becker <thorsten.becker at gmx.de> wrote:
> > > On Thursday, 16 March 2006 at 16:00, Thorsten Becker wrote:
> > > > Xorg 7.0.0, kernel 2.6.15-gentoo-r1
> > > >
> > > > Hello,
> > > >
> > > > To create a dual-seat system I bought a Radeon 7000 PCI card for my Dell
> > > > Dimension 4550, which has a Radeon 9700 already installed. The problem
> > > > is: I can only get one card to work at a time. Trying to start an X
> > > > server on the "wrong" card locks up the system (it doesn't even write
> > > > an Xorg.0.log logfile).
> > > >
> > > > What I tried so far:
> > > >
> > > > BIOS settings: The somewhat crippled BIOS in the Dell system only has a
> > > > few settings that could be relevant. "Primary Video Controller" is one
> > > > of them; it can be set to AGP or Auto.
> > > > If set to AGP, the AGP card works: the VGA console is shown on the
> > > > monitor connected to that card, and I can start a working X server for
> > > > it.
> > > > If set to Auto, the PCI card works, and I can start an X server for
> > > > that card.
> > > >
> > > > But if it is set to AGP and I try to start an X server for the PCI
> > > > card, the result is a complete system lockup, and the same happens the
> > > > other way round.
> > > >
> > > > I tried to get some useful debugging output by starting X for the
> > > > "wrong" card via ssh and capturing information via the -verbose
> > > > option. Such a log can be found here:
> > > > http://www.tuxdesk.de/Xremote.log
> > > >
> > > > The xorg.conf can be found here:
> > > > http://www.tuxdesk.de/xorg.conf.2120
> > > >
> > > > One time a short Xorg log was written. I put it here:
> > > > http://www.tuxdesk.de/Xorg.0.log
> > >
> > > Now I managed to get a longer log:
> > > http://www.tuxdesk.de/Xorg.1.log
> > >
> > > Last things I see:
> > > ===snip
> > >
(II) Loading /usr/lib/xorg/modules/libvgahw.so
> > > (II) Module vgahw: vendor="X.Org Foundation"
> > >         compiled for 7.0.0, module version = 0.1.0
> > >         ABI class: X.Org Video Driver, version 0.8
> > > (II) RADEON(0): vgaHWGetIOBase: hwp->IOBase is 0x03b0, hwp->PIOOffset is
> > > 0x0000
> > > (==) RADEON(0): RGB weight 565
> > > (II) RADEON(0): Using 6 bits per RGB (8 bit DAC)
> > > (II) Loading sub module "int10"
> > > (II) LoadModule: "int10"
> > > (II) Loading /usr/lib/xorg/modules/libint10.so
> > > (II) Module int10: vendor="X.Org Foundation"
> > >         compiled for 7.0.0, module version = 1.0.0
> > >         ABI class: X.Org Video Driver, version 0.8
> > > (II) RADEON(0): initializing int10
> > > (**) RADEON(0): Option "NoINT10" "true"
> > > (--) RADEON(0): Chipset: "ATI Radeon VE/7000 QY (AGP/PCI)" (ChipID = 0x5159)
> > > (--) RADEON(0): Linear framebuffer at 0xe0000000
> > > (--) RADEON(0): VideoRAM: 8192 kByte (64 bit SDR SDRAM)
> > > (II) RADEON(0): PCI card detected
> > > (**) RADEON(0): Forced into PCI mode
> > > (II) RADEON(0): Color tiling enabled by default
> > > (II) Loading sub module "ddc"
> > > (II) LoadModule: "ddc"
> > > (II) Loading /usr/lib/xorg/modules/libddc.so
> > > (II) Module ddc: vendor="X.Org Foundation"
> > >         compiled for 7.0.0, module version = 1.0.0
> > >         ABI class: X.Org Video Driver, version 0.8
> > > (II) Loading sub module "i2c"
> > > (II) LoadModule: "i2c"
> > > (II) Loading /usr/lib/xorg/modules/libi2c.so
> > > (II) Module i2c: vendor="X.Org Foundation"
> > >         compiled for 7.0.0, module version = 1.2.0
> > >         ABI class: X.Org Video Driver, version 0.8
> > > (II) RADEON(0): I2C bus "DDC" initialized.
> > >
> > > ===snip
> > >
> > > So I think there might be something strange happening when the I2C bus
> > > is initialized. If I understand correctly, I2C is only needed to
> > > communicate with the monitor, so to rule it out I would like to disable
> > > it. But setting the Option "NoDDC" to "yes" did not help.
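> > >
> > > To make that test explicit, I put the option in the Device section,
> > > roughly like this (the Identifier and BusID here are illustrative, not
> > > copied from my real config):
> > >
> > > ```
> > > Section "Device"
> > >     Identifier  "Radeon7000"
> > >     Driver      "radeon"
> > >     # PCI location of the Radeon 7000; check lspci for the real value
> > >     BusID       "PCI:2:9:0"
> > >     # Skip monitor probing over I2C/DDC
> > >     Option      "NoDDC" "yes"
> > >     # Skip int10 BIOS calls on the secondary card, as in the log above
> > >     Option      "NoINT10" "true"
> > > EndSection
> > > ```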
> > >
> > > Any ideas?
> > >
> > > Thorsten
> > > _______________________________________________
> > > xorg mailing list
> > > xorg at lists.freedesktop.org
> > > http://lists.freedesktop.org/mailman/listinfo/xorg
> > >
> >
> >
>
>
>


