RFC: Xv field order

Krzysztof Halasa khc at pm.waw.pl
Wed Jun 24 10:33:47 PDT 2009


Thomas Hilber <xorg at toh.cx> writes:

>> I wonder what is the difference between the on-air frame rate and
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>> your card's (fixed) one? 100 ppm would need one second of additional
   ^^^^^^^^^^^^^^^^^^^^^^^
>> (initial) buffering for ca. 3 hours of playback. I think the players
>> initially buffer more than 1 second, don't they?
>
> the problem is not the absolute accuracy of your graphics card.
>
> the problem is: there are always differences between the on-air frame
> rate and the card's fixed one if both are not synchronized.

I was already writing about the difference, not just about the card,
right? The broadcast is the most accurate and stable (at least here),
almost always within maybe 10 ppm of 50 Hz, perhaps even less than that.

> Even if
> it were possible to adjust your graphics card to exactly 50 Hz, it
> would not help, because the on-air frame rate always floats around
> a little.

Of course. That's why I wrote about buffering.

> Besides that, the graphics card of course never runs at exactly
> 50 Hz. Maybe somewhere in the range of 49.90 to 50.10 Hz, if you
> are very optimistic.

No. In fact, I have just verified it with a frequency meter; it's orders
of magnitude better. Aren't you using a miscalculated modeline (a pixel
clock other than 13.5 MHz, an invalid total number of pixels or lines,
or something like that)? A 0.2% difference is excessive.
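
For reference, a correctly calculated PAL modeline gives exactly 50 Hz
from the 13.5 MHz clock. A quick check (assuming the standard ITU-R
BT.601 totals of 864 pixels per line and 625 lines per frame):

    /* Field rate implied by a PAL modeline: pixel clock divided by
     * the total pixels per line and total lines per frame. */
    #include <stdio.h>

    int main(void)
    {
        double pixclk = 13.5e6;  /* ITU-R BT.601 pixel clock, Hz */
        int htotal = 864;        /* total pixels per line (PAL)  */
        int vtotal = 625;        /* total lines per frame (PAL)  */
        double frame = pixclk / (htotal * vtotal);

        printf("frame %.3f Hz, field %.3f Hz\n", frame, 2.0 * frame);
        return 0;
    }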

> The only solution to avoid this judder is to synchronize the graphics
> card to the base clock of your software player. This is what the
> vga-sync-fields patch does.
>
> No amount of buffering will help, because the problem arises between
> the software player's base clock and the graphics card, and not
> between the TV station and the software player.

That would be true if you used some player-internal time base. But
players can, and do, synchronize to the graphics card. IOW the card
becomes the time base, and everything else (the sound) is adjusted to it.
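
As a rough sketch of the idea (all the helper names below are made up,
not any real player API):

    /* Sketch: the card's vertical retrace becomes the master clock;
     * video is slaved to it and the audio is resampled to follow.
     * Every function and type here is hypothetical. */
    void play_loop(struct card *card, struct player *p)
    {
        for (;;) {
            wait_for_vblank(card);      /* block until the next field */
            show_queued_frame(p);       /* video slaved to the card   */
            double drift = audio_clock(p) - video_clock(p);
            resample_audio(p, drift);   /* pull the sound toward it   */
        }
    }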

Of course this doesn't make much sense if your card is 0.2% faster than
the broadcast, as for a 3-hour playback you'd need to buffer over 20
seconds (= waiting 20+ s when changing channels). But if the difference
is reasonable (100 ppm is already quite a lot for a pair of DPLLs
controlled by quartz oscillators), buffering probably makes sense.
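
The arithmetic behind those numbers, for reference:

    /* Initial buffer needed so a fast card never underruns during a
     * 3-hour broadcast, for the two rate offsets discussed here. */
    #include <stdio.h>

    int main(void)
    {
        double secs = 3.0 * 3600.0;                    /* 10800 s */
        printf("100 ppm: %4.1f s\n", secs * 100e-6);   /* ~1.1 s  */
        printf("0.2%%:    %4.1f s\n", secs * 2000e-6); /* ~21.6 s */
        return 0;
    }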

> Theoretically you could synchronize the software player's clock to
> the (fixed) graphics card clock for replay of recordings. Even
> that is not common practice under Linux, because the software
> player has no clue about the graphics card's frame rate.

That depends on the output driver. I think all the "sync" drivers for
mplayer and xine do it, don't they? (E.g. the matrox driver.)

> For live-TV this is not possible anyway.

It's perfectly possible if your card's frame rate is much better than
that +- 0.2%. With 0.2%, well - that's impossible.

> Your assumption of a
> 1-second buffer per 3 hours of playback is way too optimistic.

Fine, let's assume we have a system with a big frame rate difference
(the DPLL has a discrete set of frequencies it can produce, though it
can do 13.5 MHz). I'm not saying you should scrap your "VCO" code. I'm
just not interested in it, at least at this point, and I'm going to
implement (actually, have mostly done) field sync only, based on the
IRQs generated by the card, instead of having the X server peek at the
registers all the time, etc.
This is simple, fast, and reliable. Actually, it's the best one can do
(though that doesn't mean your "VCO" code can't make it even better).
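
Roughly what I mean on the client side (a sketch only; the device node
and the event format are made up, and the real interface is exactly the
open question below):

    /* Sketch: the kernel driver turns each vblank IRQ into a readable
     * event, so a client blocks here instead of the X server busy-
     * polling the CRTC status registers.  /dev/vsync0 and the event
     * format are hypothetical. */
    #include <stdio.h>
    #include <stdint.h>
    #include <fcntl.h>
    #include <unistd.h>

    int main(void)
    {
        int fd = open("/dev/vsync0", O_RDONLY);  /* hypothetical */
        uint32_t field;                          /* 0 = even, 1 = odd */

        if (fd < 0)
            return 1;
        while (read(fd, &field, sizeof(field)) == sizeof(field)) {
            /* flip the Xv buffer queued for this field here */
            printf("field %u\n", (unsigned)field);
        }
        return 0;
    }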

Now the question is the interface between the players and the X server,
not the internals of the driver, which BTW I have been using for almost
two years without issues.
-- 
Krzysztof Halasa


