[Bug 16001] XVideo gamma curve is wrong at least for r300 chips
bugzilla-daemon at freedesktop.org
Wed May 21 06:28:59 PDT 2008
http://bugs.freedesktop.org/show_bug.cgi?id=16001
--- Comment #11 from Alex Deucher <agd5f at yahoo.com> 2008-05-21 06:28:58 PST ---
(In reply to comment #9)
> (In reply to comment #7)
> > I've gone ahead and pushed the first part of this since it makes a noticeable
> > improvement on r2xx/r3xx and the OV0_SCALE_CNTL setup is obviously wrong on
> > r2xx+.
> > The defaults for gamma 1.0 slope should be 0x100 according to the databooks, so
> > it appears that is correct too.
> > b7c80d0c86646105d2bce5d4d59ba6c45aa7cafc
>
> Ok. I still see 3 problems. One is purely cosmetic (no need for reading/writing
> ov0_scale_cntl at all for r200 and up chips in the setgamma function), but the
Good point.
> other two look real. First, it seems (for r100 chips) the gamma set in the
> ov0_scale_cntl reg gets overwritten in the RadeonDisplayVideo function (makes
> me wonder actually why those bogus gamma_sel bits on r200 and up have any
> effect at all since they just get set to 0 anyway?).
Probably not.
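The overwrite problem described above can be illustrated with a minimal sketch. This is not the actual driver code: the register index, the gamma_sel bit mask, and the helper names are all hypothetical stand-ins for the real radeon MMIO accessors and OV0_SCALE_CNTL layout. It only shows why a from-scratch write in a DisplayVideo-style path clobbers gamma_sel bits set elsewhere, while a read-modify-write preserves other fields.

```c
#include <assert.h>

/* Hypothetical simulation of the clobbering described in the report.
 * The register index and the gamma_sel field position/width are
 * illustrative, NOT the real radeon OV0_SCALE_CNTL definitions. */
#define OV0_SCALE_CNTL        0
#define SCALER_GAMMA_SEL_MASK (0x7u << 5)   /* hypothetical field */

static unsigned int regs[1];
static void outreg(int r, unsigned int v) { regs[r] = v; }
static unsigned int inreg(int r) { return regs[r]; }

/* setgamma-style write: read-modify-write keeps the other fields */
static void set_gamma_sel(unsigned int sel)
{
    unsigned int v = inreg(OV0_SCALE_CNTL) & ~SCALER_GAMMA_SEL_MASK;
    outreg(OV0_SCALE_CNTL, v | ((sel << 5) & SCALER_GAMMA_SEL_MASK));
}

/* DisplayVideo-style write as described in the report: the register
 * is rebuilt from scratch, so any previously set gamma_sel is lost */
static void display_video_write(unsigned int scale_bits)
{
    outreg(OV0_SCALE_CNTL, scale_bits);
}
```

After `display_video_write()` runs, the gamma_sel field reads back as 0 regardless of what `set_gamma_sel()` programmed, which would explain why those bits appear to have no effect on r200+.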
> Also, if the first 3 values for the gamma 1.0 curve were wrong, those for the
> other gamma curves look wrong to me too.
>
Definitely possible. I got the tables from ATI years ago.
--
Configure bugmail: http://bugs.freedesktop.org/userprefs.cgi?tab=email
More information about the xorg-driver-ati mailing list