Maximum/native/recommended S-Video TVOut resolutions for IGP9100 radeon?

Alex Deucher alexdeucher at gmail.com
Mon Jul 20 22:21:27 PDT 2009


On Mon, Jul 20, 2009 at 8:15 PM, Mark Knecht <markknecht at gmail.com> wrote:
> On Mon, Jul 20, 2009 at 5:08 PM, Mark Knecht <markknecht at gmail.com> wrote:
>> On Mon, Jul 20, 2009 at 4:11 PM, Alex Deucher <alexdeucher at gmail.com> wrote:
>>> On Mon, Jul 20, 2009 at 7:00 PM, Mark Knecht <markknecht at gmail.com> wrote:
>>>> Hi,
>>>>   I'm wondering how to determine which resolutions are supported by
>>>> xf86-video-ati with TVOut turned on. My machine is an older Asus
>>>> Pundit-R with an ATI IGP9100 chipset. I have the open-source driver
>>>> running, currently set to 800x600 (recommended on the Xorg site),
>>>> and I'm getting a pretty good picture compared to the older
>>>> closed-source ATI driver. It's a big improvement on noise, too: the
>>>> old ATI driver used a lot of CPU and made the fans run at high
>>>> speed, while so far your open-source driver doesn't seem to kick
>>>> the fans up at all, so thanks for that.
>>>>
>>>>   I've read on the web that S-Video is inherently limited to
>>>> 720x480. Is that true? Does the IGP9100 support that sort of
>>>> resolution? This machine is hooked up only to a TV and used only
>>>> for MythTV, so there is no consideration other than getting a good
>>>> picture a few hours a day.
>>>
>>> tv-out always outputs native tv timing (NTSC or PAL, etc.).  The
>>> tv-out block has a scaler that downscales the desktop image to the
>>> native tv timing.  In theory you can scale any mode to the
>>> appropriate timing, but at the moment the driver only supports
>>> 800x600.
>>>
>>> Alex
>>>
>>
>> Alex,
>>   Thanks for the response. Let me see if I understand it correctly.
>>
>> 1) My MythTV backend server is tied to an HDHomeRun digitizer, and
>> the program is recorded on the backend at whatever resolution it
>> comes in on the cable; let's assume it's 1080i just to make things
>> clear.
>>
>> 2) My MythTV frontend requests to play the program and it's sent over
>> my network, probably still at 1080i.
>>
>> 3) The frontend program locally converts the program from 1080i to
>> either a) the resolution MythTV is set to play at or b) the screen
>> resolution. Those two may or may not be the same, but I assume in my
>> case they are, and it's currently 800x600.
>>
>> 4) Whatever resolution comes out of MythTV, the radeon driver &
>> TVOut hardware in the radeon chip then convert it to whatever
>> X-video can actually handle, which is something like 720x480?
>>
>>   If that's fundamentally correct, then would I at least be burning
>> less power, or requiring less of the driver, if I set the X11 screen
>> resolution to the same thing as S-Video?
>>
>> Thanks much,
>> Mark
>>
>
> Or, alternatively, none of that matters because the open-source
> radeon driver must be set to 800x600 or it doesn't work with TVOut. I
> guess that's what you are saying.

Xv isn't directly related; it's scaled separately, based on the sizes
of the source and destination surfaces.  In your example, the
1920x1080 image is downscaled to 800x600 if you use Xv to scale it to
full screen on an 800x600 desktop.  The image you render to, and that
is displayed, is currently 800x600 for tv-out; as I said, other modes
could be downscaled as well, it's just not implemented yet.  The
actual timing that hits the tv is standard NTSC, PAL, etc.  If someone
implemented 1024x768 or 1280x1024 support for tv-out, the timing
coming out of the tv port would still be standard tv timing, otherwise
your tv wouldn't sync.  However, as far as the size of your desktop is
concerned, it would still appear to be the pre-scaled size.
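
To make the Xv part concrete, here is a minimal sketch (mine, not
taken from any player or from the driver) of what a player does when
it hands a decoded frame to Xv; the display, port, window, GC, and
frame are assumed to have been set up already.  XvPutImage takes
separate source and destination rectangles, so the 1920x1080 frame is
scaled to the 800x600 drawable no matter what timing the tv-out
encoder is producing:

    /* Hypothetical helper: push one decoded 1920x1080 frame to an Xv
     * port and let it be scaled down to a full-screen 800x600 window.
     * dpy, port, win, gc, and frame are assumed to have been set up
     * elsewhere (XvQueryAdaptors/XvGrabPort, XvCreateImage, etc.). */
    #include <X11/Xlib.h>
    #include <X11/extensions/Xvlib.h>

    void show_frame(Display *dpy, XvPortID port, Window win, GC gc,
                    XvImage *frame)
    {
        XvPutImage(dpy, port, win, gc, frame,
                   0, 0, 1920, 1080,  /* source rect: the full frame */
                   0, 0, 800, 600);   /* dest rect: the full desktop */
    }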
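
As for the original "what resolutions are supported" question, if your
server and driver are new enough for RandR 1.2 you can ask the running
server which modes the driver exposes (the xrandr command-line tool
prints the same list); a minimal sketch:

    /* Minimal sketch: list every mode the driver advertises via
     * RandR.  With tv-out on this driver you'd currently expect to
     * see 800x600 among them. */
    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy)
            return 1;
        XRRScreenResources *res =
            XRRGetScreenResources(dpy, DefaultRootWindow(dpy));
        for (int i = 0; i < res->nmode; i++)
            printf("%s: %ux%u\n", res->modes[i].name,
                   res->modes[i].width, res->modes[i].height);
        XRRFreeScreenResources(res);
        XCloseDisplay(dpy);
        return 0;
    }

Build it against libX11 and libXrandr, e.g. cc listmodes.c -lX11
-lXrandr.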

Alex

