Does Xorg's XRender implementation support "better-than-bilinear" interpolation?
sandmann at cs.au.dk
Tue Sep 11 20:09:19 PDT 2012
"Pierre-Loup A. Griffais" <pgriffais at nvidia.com> writes:
> On 09/11/2012 10:43 AM, Søren Sandmann wrote:
>> "Pierre-Loup A. Griffais"<pgriffais at nvidia.com> writes:
>>> I'm not very familiar with bicubic interpolation, but couldn't it be
>>> achieved using the 'convolution' filter with the adequate kernel?
>>> (possibly in several passes at different scales). AFAIK 'convolution'
>>> is always provided and often accelerated.
>> The convolution filter in Render only allows one phase of the filter to
>> be used, so the quality of bicubic interpolation implemented that way
>> would be terrible.
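(A rough sketch of why a single phase is not enough, using the Catmull-Rom cubic kernel as an example; the function names here are illustrative, not part of Render. The weights of a cubic filter depend on the fractional offset between destination and source samples, so a convolution filter with one fixed set of weights can only reproduce one such offset.)

```python
def cubic_weight(x, a=-0.5):
    """Catmull-Rom cubic interpolation kernel (a = -0.5)."""
    x = abs(x)
    if x < 1:
        return (a + 2) * x**3 - (a + 3) * x**2 + 1
    if x < 2:
        return a * x**3 - 5 * a * x**2 + 8 * a * x - 4 * a
    return 0.0

def phase_weights(frac):
    """Weights for the 4 source taps at fractional offset `frac` in [0, 1).

    Each destination pixel generally needs a different `frac`, i.e. a
    different "phase" of the filter; Render's convolution filter can
    express only one of these.
    """
    return [cubic_weight(frac - i) for i in (-1, 0, 1, 2)]

# Phase 0 lands exactly on a source sample; phase 0.5 needs
# completely different weights.
w0 = phase_weights(0.0)    # [0.0, 1.0, 0.0, 0.0]
w_half = phase_weights(0.5)
```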
>> As an aside, a bunch of code could be deleted if we removed the ability
>> for drivers to provide their own filters. As far as I know, no driver
>> has ever done this. I have an old branch here:
> One of the items on my (long) list of things to do is to expose
> anisotropic filtering to RENDER, so it'd be cool if that didn't get
> removed.
I am unlikely to do anything about this for the foreseeable future, and I
doubt anyone else cares.
Though in my opinion it would be better to add anisotropic filtering to
Render as a standard filter with a software fallback, rather than as an
NVIDIA-specific one.