DVI output not visible on LCD with DVI, VGA output works -- (MergedFB on Ubuntu Edgy, Powerbook G4 ATI radeon 9600 M10)

Alex Deucher alexdeucher at gmail.com
Wed Jan 3 06:18:48 PST 2007


On 1/3/07, Martin Langhoff <martin.langhoff at gmail.com> wrote:
> [I'm not on the list, please CC ;-) ]
>
> Hi! I have been playing with the external monitor output and I have 2
> LCDs behaving differently. One (at work) has no problem with the DVI
> monitor output directly to the DVI input. The other one, at home,
> accepts the signal only through the VGA adapter -- the direct DVI
> signal does not work, and the monitor claims it sees no signal at all.
>
> So I am after a hint or two on how to debug this one. I don't have any
> gear to read the DVI signal to understand whether it is correct or
> not.
>
> I have been working for a few days already on my MergedFB
> configuration on this aluminium PowerBook G4, 17" with a Radeon 9600
> M10 (RV350). Software is vanilla Ubuntu Edgy (plus a few things from
> edgy backports). I initially followed the advice laid out at
> http://ozlabs.org/~jk/docs/mergefb/ and had things going _perfectly
> fine_ using MergedFB with the monitor at work (LM950), using the DVI
> signal directly. It took 10 minutes and it was all go. Sweet.
>
> The LCD at home (a cheap 19", 1440x900), on the other hand, does _not_
> like the signal from DVI, and yet runs correctly with the DVI->VGA
> adaptor (it does show some ghosting, though).
>
>  - The DDC info is read correctly regardless of whether I start with
> DVI or VGA cable connected. And swapping the cables around doesn't
> help. Regardless of which connector is plugged in at X.org startup, I
> always get a signal via VGA, and never get anything via DVI.
>
>  - There has been some discussion about reduced screen blanking, and
> how that can help LCDs at high resolution over DVI. Setting
> ReducedBlanking to true didn't help, so I fetched & compiled the cvt
> utility and set up a Modeline. It was a bit of a challenge to get
> xorg.conf to acknowledge that the monitor definition wasn't just
> dangling there (as mergedfb does its own monitor setup, even if you
> never name it). Eventually, putting two Monitor entries in the Screen
> section that corresponds to screen 0 got xorg to report in X.log that
> it is using the right Modeline:
>
> # from cvt   1440 900 60 -r
> Modeline "1440x900"  88.75  1440 1488 1520 1600  900 903 909 926  +HSync
>  -Vsync
>
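> For reference, the fragment I ended up with looks roughly like this
> (identifiers and sync ranges here are placeholders rather than my
> exact config; the Screen section for screen 0 then points at this
> Monitor):
>
>   Section "Monitor"
>       Identifier   "ExternalLCD"
>       HorizSync    30-82
>       VertRefresh  50-75
>       Option       "ReducedBlanking" "true"
>       # from cvt 1440 900 60 -r
>       Modeline "1440x900"  88.75  1440 1488 1520 1600  900 903 909 926  +HSync -Vsync
>   EndSection
>
>   Section "Device"
>       Identifier  "Radeon 9600 M10"
>       Driver      "radeon"
>       Option      "MergedFB"     "true"
>       Option      "CRT2HSync"    "30-82"
>       Option      "CRT2VRefresh" "50-75"
>       Option      "MetaModes"    "1440x900-1440x900"
>   EndSection
>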
> And still no dice. I don't know how to check WTF is happening on the
> DVI port itself. I have the modified radeontool that dumps registers,
> but I'm wary of messing about with them blind. Maybe there's a magic
> one that says "go" ;-) ?
>
> I did play with the modes proposed in the xmode script above, and with
> the "bioskeys" -- both work correctly when I am using VGA output.
>
> To be frank, I am not sure where to place the blame -- though I
> suspect the monitor isn't top-of-the-line hardware and its DVI input
> handling may be subpar. On the other hand, it _is_ possible to drive
> it from this radeon -- it runs correctly under OSX (10.3), just
> autodetected on boot or when plugging the cable in.
>
> any hints for further debugging welcome!
>

You need the radeon driver from git to support LVDS and DVI
simultaneously (Option "MonitorLayout" "LVDS, TMDS").  Older versions
of the driver had hard-coded output-to-CRTC mappings; that's why the
radeontool hack on the page above was needed.  Also, IIRC, some Mac
cards reverse the DDC lines.
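With a driver that supports it, the Device section would look roughly
like this (the Identifier is only an example):

  Section "Device"
      Identifier  "Radeon 9600 M10"
      Driver      "radeon"
      Option      "MergedFB"      "true"
      Option      "MonitorLayout" "LVDS, TMDS"
  EndSection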

Alex

>
> martin
> PS: and happy new year!

