[PATCH xserver 01/13] glx: Remove default server glx extension string
Emil Velikov
emil.l.velikov at gmail.com
Wed Mar 30 11:38:26 UTC 2016
On 23 March 2016 at 22:46, Adam Jackson <ajax at redhat.com> wrote:
> --- a/hw/xquartz/GL/indirect.c
> +++ b/hw/xquartz/GL/indirect.c
> @@ -566,8 +566,6 @@ __glXAquaScreenProbe(ScreenPtr pScreen)
>          unsigned int buffer_size =
>              __glXGetExtensionString(screen->glx_enable_bits, NULL);
>          if (buffer_size > 0) {
> -            free(screen->base.GLXextensions);
> -
>              screen->base.GLXextensions = xnfalloc(buffer_size);
>              __glXGetExtensionString(screen->glx_enable_bits,
>                                      screen->base.GLXextensions);
> diff --git a/hw/xwin/glx/indirect.c b/hw/xwin/glx/indirect.c
> index e4be642..e515d18 100644
> --- a/hw/xwin/glx/indirect.c
> +++ b/hw/xwin/glx/indirect.c
> @@ -743,8 +743,6 @@ glxWinScreenProbe(ScreenPtr pScreen)
>          unsigned int buffer_size =
>              __glXGetExtensionString(screen->glx_enable_bits, NULL);
>          if (buffer_size > 0) {
> -            free(screen->base.GLXextensions);
> -
These two hunks each have a comment "(overrides that set by
__glXScreenInit())" just above them; that comment is free to go now as well.
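For context, roughly what the surrounding code ends up looking like after
this patch (a rough sketch from the quoted hunk and the comment wording
above, not the exact file contents; the outer block and comment placement
are approximate):

    /* (overrides that set by __glXScreenInit()) -- stale once the default
       string is no longer installed, so it can be dropped too */
    {
        unsigned int buffer_size =
            __glXGetExtensionString(screen->glx_enable_bits, NULL);
        if (buffer_size > 0) {
            /* nothing to free any more: no default extension string was set */
            screen->base.GLXextensions = xnfalloc(buffer_size);
            __glXGetExtensionString(screen->glx_enable_bits,
                                    screen->base.GLXextensions);
        }
    }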
-Emil