How to revert the GLX version from 1.4 back to 1.2
Ratin
ratin3 at gmail.com
Mon Aug 9 16:43:23 PDT 2010
On Mon, Aug 9, 2010 at 1:54 PM, Adam Jackson <ajax at nwnk.net> wrote:
> On Mon, 2010-08-09 at 12:42 -0700, Ratin wrote:
>> On Mon, Aug 9, 2010 at 7:53 AM, Adam Jackson <ajax at nwnk.net> wrote:
>> > On Mon, 2010-08-09 at 00:44 -0700, Ratin wrote:
>> >> Hi, it's been known to many that Xorg versions supporting GLX
>> >> version 1.4 introduced a memory leak.
>> >
>> > I don't see any bugs about that in bugzilla. Do you have a testcase or
>> > is this just tribal knowledge?
>> >
>> http://www.theregister.co.uk/2010/04/26/ubuntu_xserver_memory_leak_bug/
>> http://ubuntuforums.org/showthread.php?t=1478329
>> http://numberedhumanindustries.wordpress.com/2010/04/28/ubuntu-10-04-memory-leak-issue-not-redhats-nor-fedoras-fault/
>> http://us.generation-nt.com/bug-559408-xserver-xorg-core-x-server-memory-leak-help-168906011.html
>
> So it's a bug that's already been fixed, and that doesn't affect you
> since nvidia's glx support doesn't use that code at all.
>
> - ajax
>
Hi Adam, I do have a testcase: I decode/render video with NVidia's
VDPAU constantly, and over time the system becomes really sluggish; a
simple command like "ls" takes about 30 seconds to complete. I have
been trying to find out what causes this. I ran my application under
valgrind and it doesn't report any memory leak. Searching online, I
found people going through a similar experience, and it seemed to be
caused by GLX. It still happens after updating to xserver 1.6.4
(that's as far as I could go with Ubuntu's xorg-edgers repository). I
did compile xserver 1.8.9, but I am not able to see anything on the
screen - perhaps the nvidia kernel mode driver is not compatible with
it.
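
Since valgrind on the client can't see a leak that lives in the X
server's own process, one rough way to confirm it is to watch the
server's resident memory from the outside while the decode loop runs.
Something along these lines (a sketch only, assuming Linux /proc and a
server process named "Xorg"; adjust the pidof argument and the polling
interval to taste):

#!/usr/bin/env python
# Poll the X server's resident set size once a minute so any
# server-side growth shows up while the VDPAU client keeps decoding.
import subprocess
import time

def xorg_rss_kb():
    # pidof may need "X" instead of "Xorg" depending on the distro
    pid = subprocess.check_output(["pidof", "Xorg"]).split()[0].decode()
    with open("/proc/%s/status" % pid) as f:
        for line in f:
            if line.startswith("VmRSS:"):
                return int(line.split()[1])  # value reported in kB

while True:
    print(time.strftime("%H:%M:%S"), xorg_rss_kb(), "kB")
    time.sleep(60)

If the VmRSS figure keeps climbing while the desktop gets sluggish,
that would point at the server side rather than the client.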
Ratin