New Video Decode and Presentation API

Andy Ritger aritger at nvidia.com
Wed Nov 19 15:31:27 PST 2008


Hello, Torgeir.

Sorry for the slow response.  Comments inline below:


On Tue, 18 Nov 2008, Torgeir Veimo wrote:

>
> On 15 Nov 2008, at 04:28, Andy Ritger wrote:
>
>> I'm pleased to announce a new video API for Unix and Unix-like platforms,
>> and a technology preview implementation of this API from NVIDIA.
>
>>     * Defines an API for post-processing of decoded video, including
>>       temporal and spatial deinterlacing, inverse telecine, and noise
>>       reduction.
>
> What about interlaced output and TV output. Is that still possible with this 
> API?
>
> Is field parity observed when outputting interlaced material? I think it's 
> equally important to have good support for baseline mpeg2 in addition to 
> other codecs, and this would imply that interlaced, field parity correct 
> mpeg2 output on standard s-video / rgb should be fully working.

If the application takes advantage of VDPAU's de-interlacing (the current
mplayer patches to use VDPAU do _not_ yet enable VDPAU's deinterlacing), 
then the end result of VDPAU's presentation queue is a progressive frame.

If the application doesn't enable de-interlacing, NVIDIA's VDPAU
implementation will currently copy the weaved frame to the "progressive"
surface, and whether the fields come out correctly will depend on whether
the window's offset from the start of the screen is odd or even.


>>     * Defines an API for timestamp-based presentation of final video
>>       frames.
>
> This is interesting. Can such timestamps be synchronised with HDMI audio in 
> some ways to guarantee judder free and audio resync free output? Ie, no need 
> to resample audio to compensate for clock drift?

You can query the current time, the time any previously presented frame
was first displayed, and specify the desired presentation time of each
frame as it is enqueued in the PresentationQueue.  This information is
hopefully sufficient to synchronize the audio stream with the presented
video frames.

My understanding of audio over HDMI is that the GPU normally does a
simple pass-through of the audio stream coming from the audio hardware in
the computer.  The VDPAU presentation queue's timestamps use a different
crystal than the audio hardware, so I believe there would always be the
potential for drift.


>>     * Defines an API for compositing sub-picture, on-screen display,
>>       and other UI elements.
>
> I assume this indicates that video can easily be used as textures for opengl 
> surfaces, and that opengl surfaces (with alpha transparency support) can 
> easily be superimposed over video output?

Yes.  The VDPAU presentation queue can be created with respect to any
X drawable.  So you should be able to have VDPAU deliver final frames
to an X pixmap,
and then use the GLX_EXT_texture_from_pixmap extension to texture from
that pixmap.  Note: there is not yet an API in place for synchronizing
between VDPAU and OpenGL, so this would be somewhat racy today.


>> These patches include changes against libavcodec, libavutil, ffmpeg,
>
>> and MPlayer itself; they may serve as an example of how to use VDPAU.
>
> Would it be possible to provide a standalone playback test program that 
> illustrates the api usage outside of mplayer?

The test program would need to decode the data to extract the bit stream
to pass into VDPAU, so it is a non-trivial amount of code.

I'm sure if someone wanted to write a standalone VDPAU test app, others
interested in using VDPAU would benefit.


>> If other hardware vendors are interested, they are welcome to also
>> provide implementations of VDPAU.  The VDPAU API was designed to allow
>> a vendor backend to be selected at run time.
>
> It would be helpful to have an open source "no output" backend to allow 
> compile & run test when supported hardware is not available. This would also 
> help accelerate support for any software backend if anyone should choose to 
> implement one.

You mean a pure software implementation of VDPAU?  Yes, that would
be interesting.  Someone could probably do that by wiring up the VDPAU
entry points to an existing software implementation of these codecs.
For now, the engineers at NVIDIA are going to focus on bugfixing our
GPU-based implementation of VDPAU.


>> VC-1 support in NVIDIA's VDPAU implementation currently requires GeForce
>> 9300 GS, GeForce 9200M GS, GeForce 9300M GS, or GeForce 9300M GS.
>
>
> So only mobile chipsets supports VC-1 output currently?
>
> It seems that the marketplace seems to be missing a 9500 GT based gfx card 
> with passive cooling, low form factor and hdmi enabled output...

The GeForce 9300 GS is a desktop GPU.  I expect there are passively
cooled configurations.

Thanks,
- Andy


> -- 
> Torgeir Veimo
> torgeir at pobox.com



More information about the xorg mailing list