[PATCH xserver] xfree86: fix gamma compute when palette_size > 256

Yu, Qiang Qiang.Yu at amd.com
Wed Nov 1 01:20:44 UTC 2017


Aaron, Michel, thanks for your comments, which gave me a clearer picture of
the notions involved in this patch. My understanding so far:
1. crtc->gamma_size should be a hardware setting that reflects the maximum
    gamma CLUT size the hardware supports.
2. palette_size can be bigger than crtc->gamma_size. When the screen's
    palette_size > crtc->gamma_size, it still makes sense for the hardware to
    interpolate; the result is not exactly accurate, but we can't do any better
    because palette_size exceeds the hardware gamma CLUT limit (see the sketch
    after this list).
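
A minimal sketch of that mapping (a hypothetical helper, not the actual
patch): each CLUT slot takes the palette entry that lines up with it, and the
colors that fall between CLUT slots are left to the hardware's interpolation.

#include <stdint.h>

/* Hypothetical sketch, not the actual patch: fill a gamma CLUT from a
 * larger palette by picking the palette entry that lines up with each
 * CLUT slot.  Assumes clut_size > 1 and palette_size >= clut_size. */
static void
downsample_palette(const uint16_t *palette, int palette_size,
                   uint16_t *clut, int clut_size)
{
    for (int i = 0; i < clut_size; i++) {
        /* Map slot i of [0, clut_size-1] onto [0, palette_size-1]. */
        int j = (int)((int64_t) i * (palette_size - 1) / (clut_size - 1));
        clut[i] = palette[j];
    }
}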

Given the above understanding, I think there are two solutions to the problem:
1. The DDX is responsible for setting crtc->gamma_size to the maximum gamma
    CLUT size the hardware supports. This patch is still needed for situations
    like the one Aaron described (11-bit color but a 10-bit gamma CLUT) and for
    the old amdgpu KMS driver that can only support a 256-entry gamma CLUT but
    needs 10-bit color. (A rough sketch of this option follows after the list.)
2. The xserver or DDX always sets crtc->gamma_size to the maximum color depth
    (either in xf86CrtcCreate or in DDX init) regardless of what the hardware
    supports, and lets the kernel driver shrink it if needed.
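
To make option 1 concrete, here is a rough sketch of the DDX side. The struct
and names are stand-ins (the real xf86CrtcRec lives in
hw/xfree86/modes/xf86Crtc.h), and HW_MAX_GAMMA_CLUT_SIZE is a made-up name for
whatever limit the hardware documentation gives:

#define HW_MAX_GAMMA_CLUT_SIZE 1024   /* e.g. a 10-bit gamma CLUT */

struct crtc {                         /* stand-in for xf86CrtcRec */
    int gamma_size;
};

/* At init, the DDX reports the hardware's real maximum gamma CLUT size
 * instead of the historical default of 256. */
static void
ddx_crtc_init(struct crtc *crtc)
{
    crtc->gamma_size = HW_MAX_GAMMA_CLUT_SIZE;
}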

I'd prefer the first solution. What's your opinion?

Regards,
Qiang


________________________________________
From: xorg-devel <xorg-devel-bounces at lists.x.org> on behalf of Aaron Plattner <aplattner at nvidia.com>
Sent: Wednesday, November 1, 2017 12:55:56 AM
To: Michel Dänzer; Yu, Qiang
Cc: xorg-devel at lists.x.org
Subject: Re: [PATCH xserver] xfree86: fix gamma compute when palette_size > 256

On 10/30/2017 03:28 AM, Michel Dänzer wrote:
> On 30/10/17 07:33 AM, Qiang Yu wrote:
>> palette_(red|green|blue)_size > crtc->gamma_size (=256)
>> may happen when the screen has more than 8 bits per RGB channel,
>> e.g. a 30-bit depth screen with 10 bits per component.
>
> Is palette_size > gamma_size really useful though?
>
> Seems to me like gamma_size should always be >= palette_size.
> Specifically, if the gamma ramp only has 256 entries, what's the benefit
> of having 1024 entries in the palette?
>
> What's the exact configuration in which you hit this?

It's the significant bits in the colormap that matter, and not the
number of colormap entries, right?

There are always 2^(bits per component) colormap entries, so 256 for 8 bits
per component and 1024 for 10. On our hardware, colormap entries have 11
significant bits of precision, so they have a [0,2047] range.
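
For example, widening a component into that 11-bit range might look like the
following; bit replication is just one plausible mapping, not necessarily what
the hardware does:

#include <stdint.h>

/* Illustrative only: widen an 8- or 10-bit component to the 11-bit
 * [0,2047] colormap range by bit replication, e.g. the 10-bit maximum
 * 0x3FF maps to the 11-bit maximum 0x7FF. */
static uint16_t
expand_to_11bit(uint16_t v, int bits)
{
    return (uint16_t)((v << (11 - bits)) | (v >> (2 * bits - 11)));
}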

On the other end of the display pipeline, the gamma ramp in the hardware
has 1024 entries. The hardware will interpolate between gamma ramp
entries if your color happens to fall between them. This can happen both
because the colormap entries have twice as much precision as the gamma
ramp, and because the "digital vibrance" and CSC matrix stuff is applied
between the colormap and the gamma ramp and can skew colors around a bit.
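
Roughly, that interpolation amounts to something like this (the fixed-point
details are illustrative, not the hardware's actual math):

#include <stdint.h>

/* Illustrative sketch: look up an 11-bit color value (0..2047) in a
 * 1024-entry gamma ramp, blending linearly between the two nearest
 * entries. */
static uint16_t
gamma_lookup(const uint16_t ramp[1024], unsigned color11)
{
    unsigned pos  = color11 * 1023;   /* ramp position, scaled by 2047 */
    unsigned idx  = pos / 2047;       /* lower neighbouring entry */
    unsigned frac = pos % 2047;       /* distance toward the next entry */

    if (idx >= 1023)                  /* top of the ramp, nothing above */
        return ramp[1023];

    int32_t lo = ramp[idx], hi = ramp[idx + 1];
    return (uint16_t)(lo + (hi - lo) * (int32_t) frac / 2047);
}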

This patch doesn't affect our driver since we don't use the xf86 RandR
layer, but I thought I would point out that palette_size is not directly
related to gamma_size, and the colormap precision is only loosely
related to gamma_size.

-- Aaron