Maximum/native/recommended S-Video TVOut resolutions for IGP9100 radeon?

Mark Knecht markknecht at gmail.com
Mon Jul 20 17:08:32 PDT 2009


On Mon, Jul 20, 2009 at 4:11 PM, Alex Deucher<alexdeucher at gmail.com> wrote:
> On Mon, Jul 20, 2009 at 7:00 PM, Mark Knecht<markknecht at gmail.com> wrote:
>> Hi,
>>   I'm wondering how to determine what resolutions are supported using
>> xf86-video-ati with TVOut turned on. My machine is an older Asus
>> Pundit-R with an ATI IGP9100 chipset. I have the Open Source driver
>> running and it's currently set to 800x600 (recommended on the Xorg
>> site) and I'm getting a pretty good picture compared to the older
>> closed-source ATI driver. It's also a big improvement on the noise: the old
>> ATI driver used a lot of CPU and made the fans run at high speed. So
>> far your Open Source driver doesn't seem to kick the fans up at all so
>> thanks for that.
>>
>>   I've read on the web that S-video was inherently limited to
>> 720x480. Is that true? Does the IGP9100 support that sort of
>> resolution? This machine is hooked only to a TV and only used for
>> MythTV, so there's nothing to consider other than getting a good
>> picture a few hours a day.
>
> tv-out always outputs native tv timing (NTSC or PAL, etc.).  The
> tv-out block has a scaler that downscales the desktop image to the
> native tv timing.  In theory you can scale any mode with the
> appropriate timing, but at the moment, the driver only supports
> 800x600.
>
> Alex
>

Alex,
   Thanks for the response. Let me see if I understand it correctly.

1) My MythTV backend server is tied to an HD Homerun digitizer and the
program is recorded on the backend at whatever resolution it comes in
on the cable, but let's assume it's 1080i just to make things clear.

2) My MythTV frontend requests to play the program and it's sent over
my network, probably still at 1080i.

3) The frontend program locally converts the program from 1080i to
either a) the resolution MythTV is set to play at or b) the screen
resolution. Those two may or may not be the same but I assume in my
case they are and it's currently 800x600.

4) Whatever resolution comes out of MythTV, the radeon driver & TVOut
hardware in the radeon chip then convert it to whatever S-Video can
actually handle, which is something like 720x480? (Rough sketch of the
whole chain below.)
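
   Just so I'm sure I'm picturing the numbers right, here's a rough
sketch of that chain as I understand it (Python, untested; the 1080i
source and the 720x480 NTSC figure are my assumptions, not anything
I've measured):

# Rough sketch of the resolution chain as I understand it.
# The 1080i source and the 720x480 NTSC figure are assumptions.
stages = [
    ("recorded program (assumed 1080i)", 1920, 1080),
    ("X desktop / MythTV playback",       800,  600),
    ("S-Video / NTSC timing (assumed)",   720,  480),
]
prev = None
for name, w, h in stages:
    note = ""
    if prev is not None:
        pw, ph = prev
        note = "  (%.2fx horizontal, %.2fx vertical vs. previous stage)" % (
            w / float(pw), h / float(ph))
    print("%-34s %4dx%-4d%s" % (name, w, h, note))
    prev = (w, h)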

   If that's fundamentally correct, would I at least be burning less
power, or asking less of the driver, if I set the X11 screen resolution
to the same thing as the S-Video output?
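
   For what it's worth, this is roughly how I was planning to check
which modes the driver actually advertises on the TV output and to
switch over to one, using xrandr from a little Python wrapper. I'm
assuming the radeon driver names the connector "S-video"; I haven't
verified that, so this is only a sketch:

#!/usr/bin/env python
# Sketch only: list the modes X advertises, and optionally switch the
# TV output.  "S-video" is my guess at the radeon connector name; check
# `xrandr --query` first and adjust if it's called something else.
import subprocess

TV_OUTPUT = "S-video"   # assumed connector name

def list_modes():
    # Dump the full xrandr report so the TV output's mode list can be
    # read off by eye.
    print(subprocess.check_output(["xrandr", "--query"]).decode())

def set_mode(mode="800x600"):
    # Ask X to switch the TV output to the given mode (800x600 being
    # the one the driver currently supports, per your reply).
    subprocess.check_call(["xrandr", "--output", TV_OUTPUT, "--mode", mode])

if __name__ == "__main__":
    list_modes()
    # set_mode("800x600")   # uncomment to actually switch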

Thanks much,
Mark

