Failed to start xgl

David Reveman davidr at novell.com
Tue Jan 10 04:41:46 PST 2006


On Tue, 2006-01-10 at 12:33 +0100, Hanno Böck wrote:
> On Tuesday, 10 January 2006 03:47, David Reveman wrote:
> > You can't use LD_PRELOAD with Xgl as symbols in native libGL must not be
> > loaded when the server is loading internal glx and glcore modules. This
> > will mess up GL/GLX symbols completely. Try using LD_LIBRARY_PATH
> > instead.
> 
> I get the same error (also without any LD variable set); I only set it 
> because of Rich's note that a wrong libGL might cause the error.
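
For reference, launching it would look something like this (a rough
sketch; the library path and display number are only examples and
depend on your setup):

    # Avoid LD_PRELOAD: it forces native libGL symbols over the
    # server's internal glx/glcore symbols.
    #   LD_PRELOAD=/usr/lib/libGL.so.1 Xgl :1
    # LD_LIBRARY_PATH only changes where the linker searches, so the
    # internal modules keep their own symbols:
    LD_LIBRARY_PATH=/usr/lib Xgl :1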

The BadLength error can only happen when you're using indirect rendering,
and to get reasonable performance out of Xgl you want to use direct
rendering. So if you're getting that error, you might be using the wrong
libGL.
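
A quick way to check both (illustrative; assumes glxinfo is installed
and Xgl is running on display :1):

    $ ldd `which glxinfo` | grep libGL     # which libGL clients pick up
    $ DISPLAY=:1 glxinfo | grep "direct rendering"
    direct rendering: Yes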

Is this a 64-bit machine? glcore in the CVS version of Xgl is currently
broken on 64-bit machines and could be causing memory corruption. You
can try my tarball if this is the case.
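
A quick check (output is illustrative):

    $ uname -m
    x86_64    # 64-bit; i686 or similar means 32-bit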

If it's a 32-bit machine and you're not using LD_PRELOAD but are still
getting the glibc-detected memory corruption error, the problem might be
in Xgl. If you could send me a stack trace, that would help a lot.
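
A rough recipe for getting one, assuming your Xgl binary is at
/usr/X11R6/bin/Xgl and was built with debugging symbols (adjust the
path and display number for your setup):

    $ gdb /usr/X11R6/bin/Xgl
    (gdb) run :1
    ... reproduce the crash ...
    (gdb) bt full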

-David



