Bug#573849: xserver-xorg-video-radeon: Be verbose about discarding modes
Alex Deucher
alexdeucher at gmail.com
Sun Mar 14 08:21:02 PDT 2010
On Sun, Mar 14, 2010 at 7:50 AM, Bas Wijnen <wijnen at debian.org> wrote:
> Package: xserver-xorg-video-radeon
> Version: 6.12.5-1
> Severity: wishlist
>
> Since fairly recently (a few months, I think), my Radeon refuses to set
> my monitor to 1600x1200. Looking at the server log, I found (in random
> order):
>
> (II) RADEON(0): #6: hsize: 1600 vsize 1200 refresh: 60 vid: 16553
> (II) RADEON(0): Modeline "1600x1200"x0.0 162.00 1600 1664 1856 2160 1200 1201 1204 1250 +hsync +vsync (75.0 kHz)
> (II) RADEON(0): Ranges: V min: 56 V max: 75 Hz, H min: 31 H max: 81 kHz, PixClock max 170 MHz
> (II) RADEON(0): Not using mode "1600x1200" (mode clock too high)
>
> This makes no sense. It should be able to go up to 170 MHz, but it
> refuses 162 MHz. In the source I found:
>
> /* clocks over 135 MHz have heat issues with DVI on RV100 */
> if ((radeon_output->MonType == MT_DFP) &&
>     (info->ChipFamily == CHIP_FAMILY_RV100) &&
>     (pMode->Clock > 135000))
>     return MODE_CLOCK_HIGH;
>
> This explains why it refuses the mode.
>
> Looking at the source should not be required for understanding the log
> file. Please make it more readable by adding a line about the heat
> issues. It may also be a good idea to allow overriding the check. I
> never had any trouble using 1600x1200 with this card and monitor.
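
(For illustration only: a sketch of what such a message could look like at the check quoted above. It is not the driver's current code, and pScrn stands in for whatever ScrnInfoPtr the function actually has in scope.)

    /* sketch: spell out the reason when rejecting the mode */
    if ((radeon_output->MonType == MT_DFP) &&
        (info->ChipFamily == CHIP_FAMILY_RV100) &&
        (pMode->Clock > 135000)) {
        xf86DrvMsg(pScrn->scrnIndex, X_INFO,
                   "Not using mode \"%s\": clocks over 135 MHz have heat "
                   "issues with DVI on RV100\n", pMode->name);
        return MODE_CLOCK_HIGH;
    }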
You can start the xserver with a higher verbosity level and it will
tell you why modes were rejected. You can also manually specify
modelines in your xorg.conf, or at run time using xrandr, to override
the driver's/xserver's selections.
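
For example (illustrative only; the output name DVI-0 and the mode name
are assumptions for this setup, while the timings are the ones from the
log quoted above), the rejected mode can be added back at run time:

  xrandr --newmode "1600x1200_60" 162.00 1600 1664 1856 2160 1200 1201 1204 1250 +hsync +vsync
  xrandr --addmode DVI-0 "1600x1200_60"
  xrandr --output DVI-0 --mode "1600x1200_60"

or made persistent with a Modeline in the Monitor section of xorg.conf
(assuming "Monitor0" is the Identifier your Screen section already
refers to):

  Section "Monitor"
      Identifier "Monitor0"
      Modeline   "1600x1200_60" 162.00 1600 1664 1856 2160 1200 1201 1204 1250 +hsync +vsync
      Option     "PreferredMode" "1600x1200_60"
  EndSection

Starting the server with e.g. "startx -- -logverbose 6" will log more
detail about mode validation.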
Alex