RANDR 1.2 vs libxinerama?

Matthias Hopf mhopf at suse.de
Fri Apr 13 10:04:22 PDT 2007


Hey all,

one of our KDE guys, Dirk Mueller, found an interesting inconsistency
between RANDR and libxinerama. Basically, with the intel driver only(!)
does RANDR return the correct information, while for other drivers such
as NVIDIA's it is libxinerama that returns the correct data.

Could it be that 1) NVIDIA's RANDR implementation is flaky ATM, and
2) libxinerama hasn't been adapted correctly to RANDR 1.2? The strange
thing is that the RANDR 1.1 emulation works fine.

I'll just post Dirk's findings here (test case in
https://bugzilla.novell.com/show_bug.cgi?id=264199):


Hi, 

It is my understanding that libxinerama is being deprecated in favor of
xrandr. Right now on intel hardware, libxinerama returns nonsense, namely
it lists deactivated displays as screens:

screen 0 [ 0/0 - 1024/768 ]
screen 1 [ 0/0 - 1400/1050 ]

while xrandr is correct: 

Screen 0: minimum 320 x 200, current 1280 x 1024, maximum 1400 x 1400


Now, on NVIDIA TwinView, libxinerama seems to be working all right, and
xrandr returns nonsense:

Screen 0: minimum 1280 x 480, current 3360 x 1050, maximum 3360 x 1050

This should actually be screens 0 and 1, each with half the horizontal
resolution.
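The disagreement can be checked mechanically: the bounding box of the
Xinerama screen list should match the current RANDR screen size. A minimal
sketch, using the intel geometries quoted above (in a real client the
rectangles would come from XineramaQueryScreens() and the RANDR screen
resources, not from hard-coded values):

```python
def bounding_box(screens):
    """screens: list of (x, y, width, height) rectangles.
    Returns the (width, height) of the smallest box covering them all,
    assuming all rectangles start at non-negative coordinates."""
    right = max(x + w for x, y, w, h in screens)
    bottom = max(y + h for x, y, w, h in screens)
    return (right, bottom)

# libxinerama's bogus list on intel: it includes the deactivated
# 1400x1050 display as a second screen.
xinerama_screens = [(0, 0, 1024, 768), (0, 0, 1400, 1050)]

# xrandr's (correct) current screen size on the same setup.
randr_current = (1280, 1024)

print(bounding_box(xinerama_screens))                   # (1400, 1050)
print(bounding_box(xinerama_screens) == randr_current)  # False
```

The mismatch (1400x1050 vs. 1280x1024) is exactly the inconsistency a
desktop environment trips over when it trusts both sources.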


This conflict has to be resolved, as it confuses KDE and possibly other desktop
environments trying to figure out the screen setup.



Thanks

Matthias

-- 
Matthias Hopf <mhopf at suse.de>      __        __   __
Maxfeldstr. 5 / 90409 Nuernberg   (_   | |  (_   |__          mat at mshopf.de
Phone +49-911-74053-715           __)  |_|  __)  |__  R & D   www.mshopf.de


