[PATCH xserver 12/13] glx: Compute the GLX extension string from __glXScreenInit
Emil Velikov
emil.l.velikov at gmail.com
Wed Mar 30 11:58:09 UTC 2016
On 23 March 2016 at 22:46, Adam Jackson <ajax at redhat.com> wrote:
> --- a/glx/glxscreens.c
> +++ b/glx/glxscreens.c
> @@ -383,6 +383,14 @@ __glXScreenInit(__GLXscreen * pGlxScreen, ScreenPtr pScreen)
> }
>
> dixSetPrivate(&pScreen->devPrivates, glxScreenPrivateKey, pGlxScreen);
> +
> + i = __glXGetExtensionString(pGlxScreen->glx_enable_bits, NULL);
> + if (i > 0) {
> + pGlxScreen->GLXextensions = xnfalloc(i);
> + (void) __glXGetExtensionString(pGlxScreen->glx_enable_bits,
> + pGlxScreen->GLXextensions);
> + }
> +
Would it be better to keep this hunk just after the NULL initialization of
pGlxScreen->GLXextensions?
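Roughly like so (untested sketch; assumes the NULL assignment still lives
earlier in __glXScreenInit()):

    /* Compute the extension string right where it is first cleared,
     * so the allocation and the initialization stay together. */
    pGlxScreen->GLXextensions = NULL;

    i = __glXGetExtensionString(pGlxScreen->glx_enable_bits, NULL);
    if (i > 0) {
        pGlxScreen->GLXextensions = xnfalloc(i);
        (void) __glXGetExtensionString(pGlxScreen->glx_enable_bits,
                                       pGlxScreen->GLXextensions);
    }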
> }
>
> void
> diff --git a/hw/xquartz/GL/indirect.c b/hw/xquartz/GL/indirect.c
> index 9eaeb94..2d88ef2 100644
> --- a/hw/xquartz/GL/indirect.c
> +++ b/hw/xquartz/GL/indirect.c
> @@ -542,20 +542,6 @@ __glXAquaScreenProbe(ScreenPtr pScreen)
> __glXInitExtensionEnableBits(screen->base.glx_enable_bits);
> __glXScreenInit(&screen->base, pScreen);
>
> - //__glXEnableExtension(screen->base.glx_enable_bits, "GLX_ARB_create_context");
> - //__glXEnableExtension(screen->base.glx_enable_bits, "GLX_ARB_create_context_profile");
> -
I'm not sure what the intent behind these was; one might as well move them
before the __glXScreenInit() call, just like the xwin backend does.
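That is, something along these lines (sketch only; whether these two
extensions should be enabled at all is an open question):

    __glXInitExtensionEnableBits(screen->base.glx_enable_bits);
    /* Enable before __glXScreenInit() so they end up in the computed
     * extension string, matching the xwin backend's ordering. */
    __glXEnableExtension(screen->base.glx_enable_bits,
                         "GLX_ARB_create_context");
    __glXEnableExtension(screen->base.glx_enable_bits,
                         "GLX_ARB_create_context_profile");
    __glXScreenInit(&screen->base, pScreen);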
-Emil