[Xorg-driver-geode] looking at wide mode support for GeodeLX (continued)
Bart Trojanowski
bart at jukie.net
Fri Feb 29 10:35:43 PST 2008
Hi,
[[ I got busy for a few weeks and I am now returning to wide mode. ]]
Earlier, Jordan presented a patch:
http://lists.x.org/archives/xorg-driver-geode/2008-February/000211.html
I tested it on Gutsy (xorg-core 1.3.0) and found that LXValidMode() was
skipping most of the modes because they were not in the Cimarron table.
I then proposed a simple patch that showed improvement on my setup:
http://lists.x.org/archives/xorg-driver-geode/2008-February/000218.html
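The crux, as I understand it: modes that come from DDC are flagged
M_T_DRIVER, so the old "pMode->type && pMode->type != M_T_USERDEF" test
sends them through the Cimarron table lookup, and anything missing from
the table gets rejected. A sketch of the distinction, using the M_T_*
flag values from xf86str.h (1.3/1.4 servers):

    /* mode type bits, from xf86str.h */
    #define M_T_BUILTIN 0x01   /* built-in mode */
    #define M_T_DEFAULT 0x10   /* (VESA) default mode */
    #define M_T_USERDEF 0x20   /* mode defined in xorg.conf */
    #define M_T_DRIVER  0x40   /* mode supplied by the driver (DDC/EDID) */

    /* old test: every non-zero type except USERDEF goes through the
     * Cimarron table -- including DDC modes (M_T_DRIVER), which then
     * fail the vg_get_display_mode_index() lookup: */
    if (pMode->type && pMode->type != M_T_USERDEF) {
        /* ... Cimarron table lookup ... */
    }

    /* new test: only builtin/default modes use the table; driver and
     * user-defined modes keep the timings they came with: */
    if ((pMode->type & M_T_BUILTIN) || (pMode->type & M_T_DEFAULT)) {
        /* ... Cimarron table lookup ... */
    }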
Today I set up the same hardware to run Hardy (xorg-core 1.4.0) and had
the same experience: only the standard modes are available without my
above change to Jordan's patch.
With the above change I can confirm that Gutsy (1.3.0) and Hardy (1.4.0)
work perfectly up to 1920x1200 on the ACER x243w.
Through testing on various monitors I should also mention that Gadi and
I have only had issues with ViewSonic. Recently, we tested on a VX2000,
a 24" wide screen, and a 22" wide screen. In all three cases we saw
modes set by the -amd driver that were not accepted by the display, even
though they came from DDC. For example: the VX2000 announces that it
supports 1400x1050; I can select it with xrandr, the driver sets it, and
X thinks it's in 1400x1050, but the LCD actually remains in its native
1600x1200. If the mode is forced through xorg.conf, the screen displays
"no signal". Again, so far only on the ViewSonic VX and ViewSonic
Optiquest series.
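For reference, "forcing the mode through xorg.conf" means roughly the
fragment below. This is only a sketch from my notes: the Modeline is
plain CVT output for 1400x1050 at 60 Hz and the Identifier names are
placeholders, so adjust both for your setup (the Screen section is also
abbreviated -- the usual Device reference is omitted):

    Section "Monitor"
        Identifier "VX2000"
        # CVT timing for 1400x1050 at ~60 Hz
        Modeline "1400x1050" 121.75 1400 1488 1632 1864 1050 1053 1057 1089 -hsync +vsync
    EndSection

    Section "Screen"
        Identifier "Screen0"
        Monitor    "VX2000"
        SubSection "Display"
            Modes "1400x1050" "1600x1200"
        EndSubSection
    EndSection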
I have one other observation to make about xorg-core 1.4.0 and GeodeLX.
When I run ddcprobe, which ran fine with 1.3.0, it reports:
# ddcprobe
mmap /dev/zero: Permission denied
VESA BIOS Extensions not detected.
I don't really get it. Is it my setup, or is the second line to be
taken seriously?
Anyway, I'll continue debugging as per Jordan's suggestions from:
* Jordan Crouse <jordan.crouse at amd.com> [080213 15:52]:
> Presumably then, this would be the mode that X would ask for if nothing
> else was specified. Why then 1280x1024, and then the DEFAULT one and
> not the one flagged with driver? There is strangeness here that needs to
> be debugged.
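To chase that, a first step is to log the type bits of every mode as X
validates it. A throw-away debugging sketch (not part of the attached
patch set) that could go at the top of LXValidMode(), using the
scrnIndex argument it already takes and the M_T_* flags from xf86str.h:

    /* log the name and type flags of each mode X asks us to validate */
    xf86DrvMsg(scrnIndex, X_INFO, "LXValidMode %s: type=0x%x%s%s%s%s\n",
               pMode->name, pMode->type,
               (pMode->type & M_T_BUILTIN) ? " BUILTIN" : "",
               (pMode->type & M_T_DEFAULT) ? " DEFAULT" : "",
               (pMode->type & M_T_USERDEF) ? " USERDEF" : "",
               (pMode->type & M_T_DRIVER)  ? " DRIVER"  : "");

That should show whether X really offers 1280x1024 as a DEFAULT mode
ahead of the DRIVER-flagged one.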
I am attaching the patch set as it looks right now... for anyone who
may want to test it.
-Bart
--
WebSig: http://www.jukie.net/~bart/sig/
-------------- next part --------------
From 87e2b6a1304f01e1f4324cdfee9d4f2cb321b6be Mon Sep 17 00:00:00 2001
From: Jordan Crouse <jordan.crouse at amd.com>
Date: Wed, 13 Feb 2008 11:10:21 -0500
Subject: [PATCH] fix-ddc.patch from Jordan
---
src/amd_gx_driver.c | 9 +++++----
src/amd_lx_driver.c | 21 +++++++++++++++------
2 files changed, 20 insertions(+), 10 deletions(-)
diff --git a/src/amd_gx_driver.c b/src/amd_gx_driver.c
index 177b111..ead492f 100644
--- a/src/amd_gx_driver.c
+++ b/src/amd_gx_driver.c
@@ -783,7 +783,9 @@ GXSetVideoMode(ScrnInfoPtr pScrni, DisplayModePtr pMode)
/* Only use the panel mode for built in modes */
- if ((pMode->type && pMode->type != M_T_USERDEF) && pGeode->Panel) {
+ if (((pMode->type & M_T_BUILTIN) || (pMode->type & M_T_DEFAULT))
+ && pGeode->Panel) {
+
GFX(set_fixed_timings(pGeode->PanelX, pGeode->PanelY,
pMode->CrtcHDisplay, pMode->CrtcVDisplay,
pScrni->bitsPerPixel));
@@ -1390,10 +1392,9 @@ GXValidMode(int scrnIndex, DisplayModePtr pMode, Bool Verbose, int flags)
GeodeRec *pGeode = GEODEPTR(pScrni);
int p, ret;
- /* Not sure if this is an X bug or not - but on my current build,
- * user defined modes pass a type of 0 */
+ /* Use the durango lookup for builtin or default modes only */
- if (pMode->type && pMode->type != M_T_USERDEF) {
+ if ((pMode->type & M_T_BUILTIN) || (pMode->type & M_T_DEFAULT)) {
if (pGeode->Panel) {
if (pMode->CrtcHDisplay > pGeode->PanelX ||
diff --git a/src/amd_lx_driver.c b/src/amd_lx_driver.c
index 9abbd5f..c779ac1 100644
--- a/src/amd_lx_driver.c
+++ b/src/amd_lx_driver.c
@@ -845,9 +845,10 @@ LXSetVideoMode(ScrnInfoPtr pScrni, DisplayModePtr pMode)
lx_disable_dac_power(pScrni, DF_CRT_DISABLE);
vg_set_compression_enable(0);
- if (!pMode->type || pMode->type == M_T_USERDEF)
- lx_set_custom_mode(pGeode, pMode, pScrni->bitsPerPixel);
- else {
+ /* If the mode is a default one, then set the mode with the Cimarron
+ * tables */
+
+ if ((pMode->type & M_T_BUILTIN) || (pMode->type & M_T_DEFAULT)) {
if (pMode->Flags & V_NHSYNC)
flags |= VG_MODEFLAG_NEG_HSYNC;
if (pMode->Flags & V_NVSYNC)
@@ -878,8 +879,14 @@ LXSetVideoMode(ScrnInfoPtr pScrni, DisplayModePtr pMode)
pScrni->bitsPerPixel, GeodeGetRefreshRate(pMode),
0);
}
- }
-
+ }
+ else {
+ /* For anything other than a default mode - use the passed in
+ * timings */
+
+ lx_set_custom_mode(pGeode, pMode, pScrni->bitsPerPixel);
+ }
+
if (pGeode->Output & OUTPUT_PANEL)
df_set_output_path((pGeode->Output & OUTPUT_CRT) ? DF_DISPLAY_CRT_FP : DF_DISPLAY_FP);
else
@@ -1386,7 +1393,9 @@ LXValidMode(int scrnIndex, DisplayModePtr pMode, Bool Verbose, int flags)
memset(&vgQueryMode, 0, sizeof(vgQueryMode));
- if (pMode->type && pMode->type != M_T_USERDEF) {
+ /* For builtin and default modes, try to look up the mode in Cimarron */
+
+ if ((pMode->type & M_T_BUILTIN) || (pMode->type & M_T_DEFAULT)) {
if (pGeode->Output & OUTPUT_PANEL) {
--
1.5.3.7.1150.g149d432
From ad44e2e9c10dea13b7bf8a4f5b73066c3f04eaf6 Mon Sep 17 00:00:00 2001
From: Bart Trojanowski <bart at jukie.net>
Date: Fri, 29 Feb 2008 13:30:16 -0500
Subject: [PATCH] don't trust Cimarron checks in LXValidMode() for non-panel modes
---
src/amd_lx_driver.c | 2 +-
1 files changed, 1 insertions(+), 1 deletions(-)
diff --git a/src/amd_lx_driver.c b/src/amd_lx_driver.c
index c779ac1..313b45d 100644
--- a/src/amd_lx_driver.c
+++ b/src/amd_lx_driver.c
@@ -1420,7 +1420,7 @@ LXValidMode(int scrnIndex, DisplayModePtr pMode, Bool Verbose, int flags)
ret = vg_get_display_mode_index(&vgQueryMode);
- if (ret < 0)
+ if ((pGeode->Output & OUTPUT_PANEL) && ret < 0)
return MODE_BAD;
}
--
1.5.3.7.1150.g149d432