[Bug 16001] New: XVideo gamma curve is wrong at least for r300 chips
bugzilla-daemon at freedesktop.org
Sun May 18 10:29:34 PDT 2008
http://bugs.freedesktop.org/show_bug.cgi?id=16001
Summary: XVideo gamma curve is wrong at least for r300 chips
Product: xorg
Version: git
Platform: Other
OS/Version: All
Status: NEW
Keywords: patch
Severity: normal
Priority: medium
Component: Driver/Radeon
AssignedTo: xorg-driver-ati at lists.x.org
ReportedBy: ranma+freedesktop at tdiedrich.de
QAContact: xorg-team at lists.x.org
Created an attachment (id=16614)
--> (http://bugs.freedesktop.org/attachment.cgi?id=16614)
Fix for gamma curve on r300 series
See also:
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=481548
I have just checked out the git version of the ati driver from
git://anongit.freedesktop.org/git/xorg/driver/xf86-video-ati
and my patch still applies.
The problem also occurs with the git version of the driver from Debian's
experimental branch.
Quoting from my Debian bug report:
------ quote start ----------------------------------------------------
Hi,
I noticed very visible steps in dark scenes of the video overlay. After
investigating, I found this to be a problem with the XVideo overlay itself,
since software-converted YV12 playback looks fine by comparison.
Also, when using gamma values other than 1.0, I would get a pink
overlay.
After playing with the driver and graphics registers a bit,
I found the gamma tables in radeon_video.c to be wrong for my two cards:
The original table shows distinct 'steps' in dark areas of the overlay:
|static GAMMA_CURVE_R200 gamma_curve_r200[8] =
| {
| /* Gamma 1.0 */
| {0x00000040, 0x00000000,
| 0x00000040, 0x00000020,
| 0x00000080, 0x00000040,
| 0x00000100, 0x00000080,
| 0x00000100, 0x00000100,
This modified version looks fine:
|static GAMMA_CURVE_R200 gamma_curve_r200[8] =
| {
| /* Gamma 1.0 */
| {0x00000100, 0x00000000,
| 0x00000100, 0x00000020,
| 0x00000100, 0x00000040,
| 0x00000100, 0x00000080,
| 0x00000100, 0x00000100,
So the slope values for
RADEON_OV0_GAMMA_000_00F,
RADEON_OV0_GAMMA_010_01F and
RADEON_OV0_GAMMA_020_03F
need to be shifted left by 2, 2 and 1 bits respectively.
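For illustration only (a minimal standalone check, not part of the patch),
assuming the gamma-1.0 slope values quoted from the table above
(0x40, 0x40, 0x80):

#include <stdio.h>

int main(void)
{
    /* gamma-1.0 slopes for the 000_00F, 010_01F and 020_03F ranges */
    unsigned slope[3] = { 0x40, 0x40, 0x80 };
    int      shift[3] = { 2, 2, 1 };

    for (int i = 0; i < 3; i++)
        printf("0x%02x << %d = 0x%x\n", slope[i], shift[i], slope[i] << shift[i]);
    /* all three print 0x100, matching the modified table that looks
     * correct on my r300 cards */
    return 0;
}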
I also found that RADEONSetOverlayGamma() does not allow arbitrary gamma
values, but only 8 different ones, and only value 0 (gamma 1.0) works
properly, since
| /* Set gamma */
| RADEONWaitForIdleMMIO(pScrn);
| ov0_scale_cntl = INREG(RADEON_OV0_SCALE_CNTL) & ~RADEON_SCALER_GAMMA_SEL_MASK;
| OUTREG(RADEON_OV0_SCALE_CNTL, ov0_scale_cntl | (gamma << 0x00000005));
breaks my overlay (pink screen) for 'gamma' values other than '0'.
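To make the selection field explicit (a sketch only; the names below are mine,
not the driver's, and the 3-bit selector at bit 5 is inferred from the 8 table
entries and the '<< 0x00000005' above, the real mask lives in radeon_reg.h):

/* Sketch: how the gamma index 0..7 ends up in OV0_SCALE_CNTL.
 * Shift and field width are assumptions based on the snippet above. */
#define SCALER_GAMMA_SEL_SHIFT  5
#define SCALER_GAMMA_SEL_MASK   (0x7u << SCALER_GAMMA_SEL_SHIFT)

static unsigned int ov0_scale_cntl_with_gamma(unsigned int cntl, unsigned int gamma)
{
    cntl &= ~SCALER_GAMMA_SEL_MASK;          /* clear the old selection */
    return cntl | (gamma << SCALER_GAMMA_SEL_SHIFT);
}

On my r300 cards, any nonzero value written through this selector is exactly
what turns the overlay pink, which is why the patch below only programs it for
chips up to r200 and relies on the per-range OV0_GAMMA_* registers otherwise.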
The following patch fixes both issues for me on my desktop's
"02:00.1 Display controller: ATI Technologies Inc RV370 [Radeon X300SE]"
and on the
"01:00.0 VGA compatible controller: ATI Technologies Inc M22 [Radeon Mobility
M300]"
of my Thinkpad.
diff -Nru xserver-xorg-video-ati-6.8.0/src/radeon_video.c xserver-xorg-video-ati-6.8.0.fixed_gamma/src/radeon_video.c
--- xserver-xorg-video-ati-6.8.0/src/radeon_video.c 2008-02-19 02:10:46.000000000 +0100
+++ xserver-xorg-video-ati-6.8.0.fixed_gamma/src/radeon_video.c 2008-05-16 23:42:00.000000000 +0200
@@ -850,19 +850,42 @@
/* Set gamma */
RADEONWaitForIdleMMIO(pScrn);
ov0_scale_cntl = INREG(RADEON_OV0_SCALE_CNTL) & ~RADEON_SCALER_GAMMA_SEL_MASK;
- OUTREG(RADEON_OV0_SCALE_CNTL, ov0_scale_cntl | (gamma << 0x00000005));
+ if (info->ChipFamily <= CHIP_FAMILY_R200) {
+ /* this breaks my r300 (pink picture)
+ * for gamma != 1.0 (gamma != 0).
+ * I suspect this is really for much older Radeons (r100?) which
+ * didn't have the RADEON_OV0_GAMMA_* registers */
+ OUTREG(RADEON_OV0_SCALE_CNTL, ov0_scale_cntl | (gamma << 0x00000005));
+ } else {
+ OUTREG(RADEON_OV0_SCALE_CNTL, ov0_scale_cntl);
+ }
/* Load gamma curve adjustments */
if (info->ChipFamily >= CHIP_FAMILY_R200) {
- OUTREG(RADEON_OV0_GAMMA_000_00F,
- (gamma_curve_r200[gamma].GAMMA_0_F_OFFSET << 0x00000000) |
- (gamma_curve_r200[gamma].GAMMA_0_F_SLOPE << 0x00000010));
- OUTREG(RADEON_OV0_GAMMA_010_01F,
- (gamma_curve_r200[gamma].GAMMA_10_1F_OFFSET << 0x00000000) |
- (gamma_curve_r200[gamma].GAMMA_10_1F_SLOPE << 0x00000010));
- OUTREG(RADEON_OV0_GAMMA_020_03F,
- (gamma_curve_r200[gamma].GAMMA_20_3F_OFFSET << 0x00000000) |
- (gamma_curve_r200[gamma].GAMMA_20_3F_SLOPE << 0x00000010));
+ if (info->ChipFamily >= CHIP_FAMILY_R300) {
+ /* It looks like the slope values have to be shifted by
+ * additional 2bits/1bit to yield the expected result
+ * on my two r300 cards */
+ OUTREG(RADEON_OV0_GAMMA_000_00F,
+ (gamma_curve_r200[gamma].GAMMA_0_F_OFFSET << 0x00000000) |
+ (gamma_curve_r200[gamma].GAMMA_0_F_SLOPE << 0x00000012));
+ OUTREG(RADEON_OV0_GAMMA_010_01F,
+ (gamma_curve_r200[gamma].GAMMA_10_1F_OFFSET << 0x00000000) |
+ (gamma_curve_r200[gamma].GAMMA_10_1F_SLOPE << 0x00000012));
+ OUTREG(RADEON_OV0_GAMMA_020_03F,
+ (gamma_curve_r200[gamma].GAMMA_20_3F_OFFSET << 0x00000000) |
+ (gamma_curve_r200[gamma].GAMMA_20_3F_SLOPE << 0x00000011));
+ } else {
+ OUTREG(RADEON_OV0_GAMMA_000_00F,
+ (gamma_curve_r200[gamma].GAMMA_0_F_OFFSET << 0x00000000) |
+ (gamma_curve_r200[gamma].GAMMA_0_F_SLOPE << 0x00000010));
+ OUTREG(RADEON_OV0_GAMMA_010_01F,
+ (gamma_curve_r200[gamma].GAMMA_10_1F_OFFSET << 0x00000000) |
+ (gamma_curve_r200[gamma].GAMMA_10_1F_SLOPE << 0x00000010));
+ OUTREG(RADEON_OV0_GAMMA_020_03F,
+ (gamma_curve_r200[gamma].GAMMA_20_3F_OFFSET << 0x00000000) |
+ (gamma_curve_r200[gamma].GAMMA_20_3F_SLOPE << 0x00000010));
+ }
OUTREG(RADEON_OV0_GAMMA_040_07F,
(gamma_curve_r200[gamma].GAMMA_40_7F_OFFSET << 0x00000000) |
(gamma_curve_r200[gamma].GAMMA_40_7F_SLOPE << 0x00000010));
------ quote end ----------------------------------------------------