[PATCH xserver] xfree86: fix gamma compute when palette_size > 256

Aaron Plattner aplattner at nvidia.com
Tue Oct 31 16:55:56 UTC 2017


On 10/30/2017 03:28 AM, Michel Dänzer wrote:
> On 30/10/17 07:33 AM, Qiang Yu wrote:
>> palette_(red|green|blue)_size > crtc->gamma_size (=256)
>> this may happen when the screen has more than 8 bits per RGB channel,
>> e.g. a 30-bit depth screen with 10 bits per RGB component.
> 
> Is palette_size > gamma_size really useful though?
> 
> Seems to me like gamma_size should always be >= palette_size.
> Specifically, if the gamma ramp only has 256 entries, what's the benefit
> of having 1024 entries in the palette?
> 
> What's the exact configuration in which you hit this?

It's the number of significant bits in the colormap entries that 
matters, not the number of colormap entries, right?

There are always 2^(bits per component) colormap entries, so 256 for 8 
bits per component and 1024 for 10. On our hardware, colormap entries 
have 11 significant bits of precision, so they have a [0,2047] range.
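
(For concreteness: X clients hand the server 16-bit colormap values, so
a hardware path with 11 significant bits would effectively keep only the
top 11 of them. The shift below is an illustrative assumption about that
mapping, not our driver code.)

  #include <stdint.h>

  /* Illustrative only: map a 16-bit X colormap value onto 11
   * significant bits by keeping the top bits. */
  static inline uint16_t to_hw11(uint16_t x16)
  {
      return x16 >> 5;   /* 0..65535 -> 0..2047 */
  }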

On the other end of the display pipeline, the gamma ramp in the hardware 
has 1024 entries. The hardware will interpolate between gamma ramp 
entries if your color happens to fall between them. This can happen both 
because the colormap entries have twice as much precision as the gamma 
ramp, and because the "digital vibrance" and CSC matrix stuff is applied 
between the colormap and the gamma ramp and can skew colors around a bit.
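
To make the interpolation concrete, here's a minimal sketch of that kind
of lookup; the 1-bit weighting is an illustrative assumption, not a
description of our actual hardware:

  #include <stdint.h>

  #define RAMP_SIZE 1024

  /* color11 is an 11-bit value in [0,2047]; ramp[] holds the 1024
   * gamma entries. The low bit of color11 places the color between
   * two adjacent entries. */
  static uint16_t apply_gamma(const uint16_t ramp[RAMP_SIZE],
                              uint16_t color11)
  {
      unsigned idx  = color11 >> 1;   /* 10-bit ramp index */
      unsigned frac = color11 & 1;    /* position between entries */

      if (idx >= RAMP_SIZE - 1)       /* clamp at the last entry */
          return ramp[RAMP_SIZE - 1];

      /* linear interpolation: frac == 0 hits ramp[idx] exactly,
       * frac == 1 lands halfway to ramp[idx + 1] */
      return (uint16_t)((ramp[idx] * (2 - frac)
                         + ramp[idx + 1] * frac) / 2);
  }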

This patch doesn't affect our driver since we don't use the xf86 RandR 
layer, but I thought I would point out that palette_size is not directly 
related to gamma_size, and the colormap precision is only loosely 
related to gamma_size.
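
For what it's worth, a gamma compute path that stops assuming the two
sizes match just has to rescale indices. A rough sketch of that idea
(lookup_gamma is a hypothetical helper, not the code in the patch):

  #include <stdint.h>

  /* Scale a palette index into the index space of a gamma ramp of a
   * different size, instead of assuming the sizes match. */
  static uint16_t lookup_gamma(const uint16_t *ramp, int gamma_size,
                               int palette_size, int palette_index)
  {
      if (palette_size < 2 || gamma_size < 1)
          return ramp[0];

      int ramp_index = (int)((int64_t)palette_index * (gamma_size - 1)
                             / (palette_size - 1));
      return ramp[ramp_index];
  }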

-- Aaron
