[PATCH xserver] glx: Always enable EXT_texture_from_pixmap for DRI glx
Hans de Goede
hdegoede at redhat.com
Sat Jul 2 10:04:34 UTC 2016
Hi,
On 01-07-16 22:32, Adam Jackson wrote:
> On Fri, 2016-07-01 at 19:06 +0200, Hans de Goede wrote:
>> Prior to commit f95645c6f700 ("glx: Don't enable EXT_texture_from_pixmap
>> unconditionally") DRI glx would always advertise EXT_texture_from_pixmap.
>>
>> That commit moved the setting of the extension in the extension bits from
>> __glXInitExtensionEnableBits to its callers so that
>> __glXInitExtensionEnableBits can be used more generally, but at the same
>> time made the setting of EXT_texture_from_pixmap conditional on
>> __DRI_TEX_BUFFER being present.
>
> You say this like it's a bad thing. Read __glXDisp_BindTexImageEXT. If
> that extension gets called in an indirect context we're going to call
> context->textureFromPixmap->bindTexImage. If the texBuffer extension
> isn't present then the request will silently "succeed" without doing
> anything.
>
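For reference, the failure mode you describe boils down to something
like this (stand-in sketch with stub types, not the actual xserver
code):

    #include <stddef.h>

    #define Success 0

    struct dri_stub {
        const void *texBuffer; /* stays NULL if __DRI_TEX_BUFFER wasn't found */
    };

    /* roughly what the DRI backends' bindTexImage hook does today */
    static int
    bindTexImageStub(struct dri_stub *dri)
    {
        if (dri->texBuffer == NULL)
            return Success;    /* request "succeeds", nothing gets bound */

        /* ... the real hook would call texBuffer->setTexBuffer() here ... */
        return Success;
    }
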
>> This has resulted in an unintended behavior change which breaks e.g.
>> compositors running on llvmpipe. This commit makes the DRI glx code
>> advertise EXT_texture_from_pixmap unconditionally again, fixing this.
>
> I think what you mean to say here is "libGL does not enable
> EXT_texture_from_pixmap for direct drisw contexts like it should",
> because I'm pretty sure that...
>
>> @@ -881,8 +885,6 @@ initializeExtensions(__GLXscreen * screen)
>> for (i = 0; extensions[i]; i++) {
>> if (strcmp(extensions[i]->name, __DRI_TEX_BUFFER) == 0) {
>> dri->texBuffer = (const __DRItexBufferExtension *) extensions[i];
>> - __glXEnableExtension(screen->glx_enable_bits,
>> - "GLX_EXT_texture_from_pixmap");
>> LogMessage(X_INFO,
>> "AIGLX: GLX_EXT_texture_from_pixmap backed by buffer objects\n");
>> }
>
> ... while hardware DRI drivers do put the __DRI_TEX_BUFFER extension in
> the list at the right place...
>
>> diff --git a/glx/glxdriswrast.c b/glx/glxdriswrast.c
>> index be32527..8884bff 100644
>> --- a/glx/glxdriswrast.c
>> +++ b/glx/glxdriswrast.c
>> @@ -396,6 +396,7 @@ initializeExtensions(__GLXscreen * screen)
>> __glXEnableExtension(screen->glx_enable_bits, "GLX_EXT_framebuffer_sRGB");
>> __glXEnableExtension(screen->glx_enable_bits, "GLX_ARB_fbconfig_float");
>> __glXEnableExtension(screen->glx_enable_bits, "GLX_EXT_fbconfig_packed_float");
>> + __glXEnableExtension(screen->glx_enable_bits, "GLX_EXT_texture_from_pixmap");
>>
>> extensions = dri->core->getExtensions(dri->driScreen);
>>
>> @@ -407,8 +408,6 @@ initializeExtensions(__GLXscreen * screen)
>>
>> if (strcmp(extensions[i]->name, __DRI_TEX_BUFFER) == 0) {
>> dri->texBuffer = (const __DRItexBufferExtension *) extensions[i];
>> - __glXEnableExtension(screen->glx_enable_bits,
>> - "GLX_EXT_texture_from_pixmap\n");
>> }
>>
>> #ifdef __DRI2_FLUSH_CONTROL
>
> ... drisw drivers do not have it in the list, because drisw is not -
> what's the word - good.
>
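For completeness, "having it in the list" on the Mesa side would mean
the drisw driver exporting the texBuffer extension from its screen
extension array, along these lines (illustrative only: the function and
array names are hypothetical, and setTexBuffer2/releaseTexBuffer are
elided):

    #include <GL/gl.h>
    #include <GL/internal/dri_interface.h>

    /* hypothetical hook: would copy the pixmap contents into the texture */
    static void
    drisw_set_tex_buffer(__DRIcontext *ctx, GLint target, __DRIdrawable *draw)
    {
        (void) ctx; (void) target; (void) draw;
    }

    static const __DRItexBufferExtension driswTexBufferExtension = {
        .base         = { __DRI_TEX_BUFFER, __DRI_TEX_BUFFER_VERSION },
        .setTexBuffer = drisw_set_tex_buffer,
    };

    /* what the driver's getExtensions() would then hand back */
    static const __DRIextension *drisw_screen_extensions[] = {
        &driswTexBufferExtension.base,
        /* ... the extensions drisw already exports ... */
        NULL,
    };
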
> Now maybe we want to remain compatible with libGL and drivers with this
> particular bug. If that's what we want, then the "if (!texBuffer)
> return Success" bit of __glXDRIbindTexImage (both of them) should throw
> GLXUnsupportedPrivateRequest instead. But Mesa really should be fixed.
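If we go the compatibility route, the change you describe would look
roughly like this in both bindTexImage implementations (untested sketch,
assuming the usual __glXError() wrapper is usable there):

     if (!texBuffer)
-        return Success;
+        return __glXError(GLXUnsupportedPrivateRequest);
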
If you believe that this is better fixed in Mesa, then please do so.
The clear downside of that is that people who stay with an older Mesa
will see things break when we release the next xserver. I'm not sure
that is a good idea, but it is your call.
I noticed that moving from xserver-1.18.x to xserver master caused a
regression where gnome-shell would no longer work on e.g. a Cirrus Logic
card. So I git-bisected this, found the culprit commit, and while I was
at it came up with this fix.
Regards,
Hans