Register XvMC video decoding acceleration

Christian König deathsimple at vodafone.de
Thu Jul 14 04:50:40 PDT 2011


On Thursday, 14.07.2011, at 13:23 +0200, Michel Dänzer wrote:
> > diff --git a/src/radeon_video.c b/src/radeon_video.c
> > index 58e3920..edd6d54 100644
> > --- a/src/radeon_video.c
> > +++ b/src/radeon_video.c
> > @@ -315,6 +315,16 @@ void RADEONInitVideo(ScreenPtr pScreen)
> >      if(num_adaptors)
> >         xf86XVScreenInit(pScreen, adaptors, num_adaptors);
> >  
> > +    if(texturedAdaptor) {
> > +       XF86MCAdaptorPtr xvmcAdaptor = RADEONCreateAdaptorXvMC(pScreen, texturedAdaptor->name);
> > +       if(xvmcAdaptor) {
> > +           if(!xf86XvMCScreenInit(pScreen, 1, &xvmcAdaptor))
> > +               xf86DrvMsg(pScrn->scrnIndex, X_ERROR, "[XvMC] Failed to initialize extension.\n");
> > +           else
> > +               xf86DrvMsg(pScrn->scrnIndex, X_INFO, "[XvMC] Extension initialized.\n");
> > +       }
> > +    }
> > +
> >      if(newAdaptors)
> >         free(newAdaptors);
> >  
> 
> Should this only be done under circumstances where the client side XvMC
> components can actually work? (KMS and >= R300 3D engine?)
Having a textured (non-overlay) Xv adaptor around is the only
prerequisite for the server side I have found so far.
We could add an additional check that it's at least an R300+ chipset and
that DRI is available, but KMS isn't really necessary at all.
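Such a check could look roughly like this (only a sketch; the exact
RADEONInfoRec field names, e.g. directRenderingEnabled, may need
adjusting depending on the UMS/KMS path):

    RADEONInfoPtr info = RADEONPTR(pScrn);

    /* Only register XvMC when the 3D engine is R300 or newer and
     * direct rendering is available; the body stays as in the patch. */
    if (texturedAdaptor &&
        info->ChipFamily >= CHIP_FAMILY_R300 &&
        info->directRenderingEnabled) {
        XF86MCAdaptorPtr xvmcAdaptor =
            RADEONCreateAdaptorXvMC(pScreen, texturedAdaptor->name);
        if (xvmcAdaptor) {
            if (!xf86XvMCScreenInit(pScreen, 1, &xvmcAdaptor))
                xf86DrvMsg(pScrn->scrnIndex, X_ERROR,
                           "[XvMC] Failed to initialize extension.\n");
            else
                xf86DrvMsg(pScrn->scrnIndex, X_INFO,
                           "[XvMC] Extension initialized.\n");
        }
    }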

> What will happen if this gets initialized but the client side can't
> work?
Nothing dangerous. The original idea behind XvMC was to have an
additional client-side library decode the video in hardware and then
render it to the display with Xv, but only a minority of the current
implementations still do it this way today.

Instead we just tell the client to use direct rendering (all callback
functions inside the XF86MCAdaptorRec are NULL). With this in place the
client application/wrapper library calls the CreateContext function
directly and we barely have to worry about anything on the server side.
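For illustration, the adaptor handed to xf86XvMCScreenInit basically only
carries the name and surface list, with every server-side hook left NULL
(a sketch, not the actual RADEONCreateAdaptorXvMC code; "dummy_adaptor"
is just an illustrative name and the fields follow xf86xvmc.h):

    static XF86MCAdaptorRec dummy_adaptor = {
        .name              = NULL,  /* set to the textured Xv adaptor name */
        .num_surfaces      = 0,     /* the real code advertises its surfaces */
        .surfaces          = NULL,
        .num_subpictures   = 0,
        .subpictures       = NULL,
        /* All callbacks NULL: the DDX tells libXvMC that the client has
         * to use direct rendering and call CreateContext itself. */
        .CreateContext     = NULL,
        .DestroyContext    = NULL,
        .CreateSurface     = NULL,
        .DestroySurface    = NULL,
        .CreateSubpicture  = NULL,
        .DestroySubpicture = NULL,
    };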

Then the client-side CreateContext function can still tell the
application that some combination of hardware, codec and parameters
isn't supported, resulting in a clean fallback to Xv (at least xine does
it this way).
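Roughly, a player only has to do something like the following to probe
for acceleration (a sketch with a made-up helper name, not xine's actual
code):

    #include <X11/Xlib.h>
    #include <X11/extensions/XvMClib.h>

    /* Try to create an accelerated context; if this fails the player
     * keeps decoding in software and displays through plain Xv. */
    static Bool try_xvmc(Display *dpy, XvPortID port, int surface_type_id,
                         int width, int height, XvMCContext *ctx)
    {
        int ev, err;

        if (!XvMCQueryExtension(dpy, &ev, &err))
            return False;

        /* The driver's client-side CreateContext can still reject an
         * unsupported hardware/codec/parameter combination here. */
        return XvMCCreateContext(dpy, port, surface_type_id,
                                 width, height, XVMC_DIRECT, ctx) == Success;
    }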

Regards,
Christian.
