TV Out on radeon 7500
nikosapi
nikosapi at gmail.com
Tue Oct 16 19:57:33 PDT 2007
On October 14, 2007 11:14:37 you wrote:
> On 10/13/07, nikosapi <nikosapi at gmail.com> wrote:
> > On October 13, 2007 19:51:14 you wrote:
> > > On 10/13/07, nikosapi <nikosapi at gmail.com> wrote:
> > > > On October 13, 2007 18:53:45 you wrote:
> > > > > On 10/13/07, nikosapi <nikosapi at gmail.com> wrote:
> > > > > > Hello,
> > > > > >
> > > > > > I just got an old pci radeon 7500 card for its TV output
> > > > > > capabilities (I'm using two of the outputs on my nvidia card
> > > > > > already).
> > > > > >
> > > > > > I've upgraded to Ubuntu Gutsy and I've also gotten the latest
> > > > > > radeon drivers and xrandr from git. I got information on how to
> > > > > > do this from:
> > > > > > http://mailman.linux-thinkpad.org/pipermail/linux-thinkpad/2007-August/039878.html
> > > > > >
> > > > > > Using xrandr I am able to issue the following commands which
> > > > > > make the blue screen on my tv very jittery (I assume it's
> > > > > > because the card is feeding the TV a bad signal):
> > > > > >
> > > > > > xrandr --addmode S-video 800x600
> > > > > > xrandr --output S-video --mode 800x600
> > > > > >
> > > > > > Then if I try to set the output to ntsc (xrandr --output S-video
> > > > > > --set tv_standard ntsc) I get the following error:
> > > > > >
> > > > > > X Error of failed request: BadValue (integer parameter out of
> > > > > > range for operation)
> > > > > > Major opcode of failed request: 159 (RANDR)
> > > > > > Minor opcode of failed request: 13 ()
> > > > > > Value in failed request: 0x1b9
> > > > > > Serial number of failed request: 20
> > > > > > Current serial number in output stream: 21
> > > > > >
> > > > > > Although if I do: xrandr --output S-video --set tv_standard pal I
> > > > > > get no error message. Does this mean that it doesn't support
> > > > > > ntsc? The only other 'tv_standard' that I found that doesn't give
> > > > > > me an error is 'default' which produces an even more violent
> > > > > > jitter on the screen.
> > > > >
> > > > > According to your bios, your chip only supports PAL:
> > > > > (II) RADEON(1): Default TV standard: PAL
> > > > > (II) RADEON(1): TV standards supported by chip: PAL
> > > > >
> > > > > Right now I only add the standards that the bios lists. It's
> > > > > trivial to add them all, though. They should all work in theory,
> > > > > but YMMV.
> > > > >
> > > > > Alex
> > > >
> > > > Hello Alex,
> > > >
> > > > Is this a problem with the card itself? (Shouldn't the bios report ntsc
> > > > support if the box says the card has it?)
> > >
> > > It'll probably work fine.
> > >
> > > > If it isn't, is there a way to force the card into ntsc mode?
> > >
> > > You can edit the code to add any standards you want (radeon_output.c).
> > > I should probably just ignore the bios and add all the tv standards
> > > (or at least PAL and NTSC). Maybe this week at some point.
> > >
> > > Alex
> >
> > It worked! I found the code that checks to see if the ntsc mode is
> > supported in the card's bios and I commented some of it out. I attached a
> > patch just in case someone else needs it.
> >
> > Another thing I noticed was that without another monitor connected, I
> > couldn't just activate the tv output with xrandr. The fix for this was to
> > simply enable the CRT-0 output (before or after the tv output).
> > I have a little script which takes care of all of this for me:
>
> Are you saying TV-out doesn't work until you activate the VGA output?
> Can you post your xorg log? I'd like to take a look at your connector
> table. tv-out should not need to rely on any other output. This
> might be a bug.
>
> If it works reliably for your chip, you can try enabling the tv-out
> load detection, either via an output attribute at runtime (it's disabled
> by default), or by editing the source and enabling it (radeon_output.c
> ~line 1762). If it works, attaching the tv-out port and running
> --auto should automatically detect the TV and add the mode.
>
> Alex
Hey Alex,
Sorry for the late response, I've been busy :(
I checked out the latest code from git (2007-10-16 @ 21:30) and tried it out.
I now seem to be able to run the following xrandr commands and get output on
the TV:
export DISPLAY=:0.1
xrandr --addmode S-video 800x600
xrandr --output S-video --mode 800x600
xrandr --output S-video --set tv_standard ntsc
xrandr --output S-video --off
xrandr --output S-video --mode 800x600
# By default DVI-0 seems to be set to 1280x768, which makes the S-video output
# the same resolution. I assume this is because the driver mirrors VGA-0 or
# DVI-0 on the TV...
xrandr --output DVI-0 --mode 800x600
# btw, there is nothing but the TV plugged into the card
The output of xrandr --verbose before the last command:
http://pastebin.ca/739340
Here's my Xorg.0.log:
http://pastebin.ca/739346
Finally, I decided to try removing all of the ati card's configuration from my
xorg.conf and creating a new one just for the radeon card. Then I started a
second X server like so:
Xorg :1 -config /etc/X11/xorg.conf.radeon -novtswitch -sharevts
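In case it matters, xorg.conf.radeon is basically just a minimal single-card
config, something along these lines (the Identifiers and the BusID below are
only examples; the BusID has to match wherever the radeon card actually sits
on the bus, see lspci):

Section "Device"
    Identifier "radeon7500"
    Driver     "radeon"
    BusID      "PCI:1:0:0"
EndSection

Section "Screen"
    Identifier "tv-screen"
    Device     "radeon7500"
EndSection

Section "ServerLayout"
    Identifier "tv-only"
    Screen     "tv-screen"
EndSection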
Then I tried to get the TV output working, but when I ran:
xrandr --output S-video --mode 800x600
xrandr produced the following error:
xrandr: cannot find mode 800x600
The reason I tried that is because eventually this is how I would like it to
work: I don't want to have two connected desktops; it would be better if I
could have two separate X servers with separate input devices, etc.
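As for the "cannot find mode" error, my guess is that the second server just
doesn't know about an 800x600 mode at all yet (there's no monitor attached to
provide one), so it probably has to be created and attached first, something
like this (using the standard VESA 800x600@60 modeline):

xrandr --newmode 800x600 40.00 800 840 968 1056 600 601 605 628 +hsync +vsync
xrandr --addmode S-video 800x600
xrandr --output S-video --mode 800x600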
I looked back at radeon_output.c, but I'm really not sure how to enable load
detection; sorry, I'm not that good of a programmer ;) If there's a specific
way you would like me to test my card, let me know. I wouldn't be surprised if
I was doing something wrong.
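If the runtime attribute is what you meant, is it something like this? (I'm
guessing the attribute is exposed as load_detection; if that name is wrong,
that's probably where I'm getting stuck.)

xrandr --output S-video --set load_detection 1
xrandr --output S-video --auto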
Thank you so much for your help,
nick