keithp at keithp.com
Sat Mar 17 12:17:17 PDT 2007
On Sat, 2007-03-17 at 12:08 +0100, Pavel Troller wrote:
> I was wrong when I said that 1600x1200 works with EDID. It's the other way
> around: it works WITHOUT it. Although everything seems to be OK, it tosses
> the mode because hsync is out of range.
Ok, this is mysterious; the values all look good to me. If you can debug
the server during startup, it would be very useful to see why
xf86ValidateModesSync is rejecting this mode. Something as simple as
some debugging printfs (using ErrorF) that dump out the monitor hsync
ranges and then the hsync values computed for each mode should make it
fairly clear whether the probed monitor data is getting stored in the
monitor record incorrectly (I probably haven't tested that path much; I
use EDID for my monitor sync ranges all of the time) or whether the hsync
value for the mode is computed incorrectly.
Note that if you *don't* specify sync ranges in the config file, then
all of the modes present in EDID are included; they are not validated
against the sync ranges in the EDID data. So removing those config-file
ranges should preserve the EDID modes for you to use.
> BTW, to get 1600x1200 at all, I have to specify the PreferredMode option in
> the monitor section. Without it, it offers 1152xsomething as the default, even
> with the virtual desktop set to 1600x1200, and equally with and without EDID.
> Why?
Because the monitor doesn't say that it wants 1600x1200 mode, it just
offers a range of possible modes. The server picks the mode which is
closest to 96dpi by default, to keep it from using modes with ultra-tiny
pixels. That's what PreferredMode is for: to override the default server
preference for 96dpi-ish modes. If you had a fixed-pixel monitor that
listed a preferred mode, the server would pick that independent of the
DPI heuristic.
> Yes I am, because it works and presents correct DPI values with the old driver.
> I always let the server compute DPI from the resolution and display size.
> In the log you can see the correct values in both cases, but kinfocenter
> shows totally different ones.
Can you run just a bare X server and make sure nothing is changing after
the server starts? The values in the log file are from very late in the
startup process and I don't think they should be changed after that
point, but if KDE were to run RandR 1.0 requests at startup (Gnome does,
much to my dismay), it's possible they would be trashed at that point.
> Exactly! I ran xrandr a lot of times now and it printed two kinds of output.
> One contained the fine modes and the other didn't. I didn't know that EDID is
> read every time the resolution is to be changed, or even only dumped. I
> thought it was read only once, at server startup time.
No, it wants to detect when you unplug one monitor and plug in another,
so every time an application requests the current mode list, it will
re-fetch the EDID data.
> A code snippet from this file:
> dev->DevName = "ddc2";
> dev->SlaveAddr = 0xA0;
> dev->ByteTimeout = 2200; /* VESA DDC spec 3 p. 43 (+10 %) */
> dev->StartTimeout = 550;
> dev->BitTimeout = 40;
> dev->ByteTimeout = 40;
> dev->AcknTimeout = 40;
> Which of the two dev->ByteTimeout assignments is correct?
The first one; we obviously missed the second one when increasing the
timeout to match the VESA spec. Thanks for catching the bug.
I'm going to commit something like:
dev->DevName = "ddc2";
dev->SlaveAddr = 0xA0;
dev->ByteTimeout = 2200; /* VESA DDC spec 3 p. 43 (+10 %) */
dev->StartTimeout = 550;
dev->BitTimeout = 40;
dev->AcknTimeout = 40;
dev->RiseFallTime = 40;
dev->HoldTime = 5;
dev->pI2CBus = pBus;
which fills in the remaining fields appropriately; it'd be nice to know
if this had any effect on your setup.
I'll reply to your second message as well.
keith.packard at intel.com