[Xorg-driver-geode] Update on the "Xorg-geode-screensaver" issue, and fixing the garbled desktop-to-screensaver transition
Cui, Hunk
Hunk.Cui at amd.com
Mon May 24 00:47:00 PDT 2010
Hi, all,
1). Update on the "Xorg-geode-screensaver" issue
Through communication with TSDB and xorg-devel, I now have a reasonable understanding of how the Gamma Correction RAM (PAR & PDR registers) works. All gamma-related calls, as currently implemented, eventually result in writes to the hardware CLUT (Color Lookup Table), which translates pixel values into actual electrical output values. "Gamma correction" is just one of the things that can be done by modifying the values in the table. Different hardware has different sized CLUTs.
Some have 8-bit CLUTs (in our Geode driver: the PDR register, 256 bytes of space), others may have 10 bits (1024 bytes), and others may have 16 bits. So the server stores the ramp as CARD16 values and lets the driver decide how many bits the hardware supports.
When a client app makes a "gamma correction" call, the server generates a new CLUT with 16 bits of precision per channel; the driver then takes this and truncates it to whatever precision the hardware can actually accept, because the hardware gamma ramp in our Geode driver is less precise than 16 bits. The original R, G, B values are 16 bits per channel. When the values are transferred to the driver layer, they are combined with "val = (*red << 8) | *green | (*blue >> 8);", because "val" is written into the Gamma Correction RAM register (in the hardware register, each entry is made up of corrections for R/G/B: within the DWORD, the red correction is in b[23:16], green in b[15:8] and blue in b[7:0]). Finally, our driver is the one allowed to truncate.
In the typical use case, the CLUT should pass each value through unaltered.
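As an illustrative sketch only (pack_palette_entry is a hypothetical helper, not actual driver code), the truncation from the server's 16-bit-per-channel ramp to the DWORD layout described above could be written as:

```c
#include <stdint.h>

/* Hypothetical helper: pack one 16-bit-per-channel palette entry into
 * the 8-bit-per-channel DWORD layout (red in b[23:16], green in b[15:8],
 * blue in b[7:0]).  Each channel is truncated to its high byte. */
static uint32_t pack_palette_entry(uint16_t red, uint16_t green, uint16_t blue)
{
    return ((uint32_t)(red >> 8) << 16) |
           ((uint32_t)(green >> 8) << 8) |
           (uint32_t)(blue >> 8);
}
```

With an identity ramp, an entry such as red = green = blue = 0xFFFF packs to 0x00FFFFFF, i.e. full intensity on all three channels.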
Regarding the difference between graphics data and video data: in the 33234 LX databook, p. 209, "Integrated Functions Block Diagram", there are two separate frame buffers, graphics and video. Assuming both are active, the two streams are mixed before being sent to the monitor. Before the mixing, one or the other (but not both) can be color-adjusted using the Gamma Correction RAM.
The screensaver belongs to the graphics data, so its data goes through the Gamma Correction RAM. Because the luminance generated by a physical device is generally not a linear function of the applied signal, when the Gamma Correction RAM is enabled for graphics use, each original color byte is used as an address into the Gamma Correction RAM, which produces a new data byte, i.e. a new color intensity.
Since the screensaver source code (the client app) is complex, it is hard to trace the main data transfer from client to X server. I found a simple gamma client app and modified it (in the attached file). Using the DDD tool, I debugged the gamma client program and the X server, and finally found that the gamma values are calculated in the "gamma_to_ramp" function, which produces the 256-entry (16 bits per entry) tables. These are written to the Geode driver's lx_crtc_gamma_set function. The fix is shown below:
<<test-gamma_client-App.rar>>
static void
lx_crtc_gamma_set(xf86CrtcPtr crtc, CARD16 * red, CARD16 * green,
                  CARD16 * blue, int size)
{
    unsigned int dcfg;
    int i;

    DebugP("lx_crtc_gamma_set /by Hunk\n");
    assert(size == 256);

    for (i = 0; i < 256; i++) {
--      unsigned int val = (*red << 8) | *green | (*blue >> 8);
++      unsigned int val = (*(red++) << 8) | *(green++) | (*(blue++) >> 8);

        df_set_video_palette_entry(i, val);  /* screensaver bug keypoint, by Hunk */
    }

    /* df_set_video_palette_entry automatically turns on
     * gamma for video - if this gets called, we assume that
     * RandR wants it set for graphics, so reverse cimarron
     */
    dcfg = READ_VID32(DF_DISPLAY_CONFIG);
    dcfg &= ~DF_DCFG_GV_PAL_BYP;
    WRITE_VID32(DF_DISPLAY_CONFIG, dcfg);
}
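The bug the diff fixes is that the red/green/blue pointers were never advanced, so all 256 palette entries received the value computed from entry 0 of the ramp. A standalone sketch of the difference (fill_buggy and fill_fixed are illustrative names, using only the red channel for brevity):

```c
#include <stdint.h>
#include <stddef.h>

/* Original behaviour: the pointer is never advanced, so every output
 * entry is built from ramp entry 0. */
static void fill_buggy(uint32_t *out, const uint16_t *red, size_t n)
{
    size_t i;
    for (i = 0; i < n; i++)
        out[i] = (uint32_t)*red << 8;        /* always reads red[0] */
}

/* Fixed behaviour: the pointer advances once per iteration, so output
 * entry i is built from ramp entry i. */
static void fill_fixed(uint32_t *out, const uint16_t *red, size_t n)
{
    size_t i;
    for (i = 0; i < n; i++)
        out[i] = (uint32_t)*(red++) << 8;    /* reads red[i] */
}
```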
The fix has now been tested on the Ubuntu desktop and the Fedora desktop (both display properly).
2). Fixing the garbled transition from desktop to screensaver
When the desktop triggers the screensaver, there is a transition during which the display becomes garbled while fading through gamma correction values (1.0 -> 0.01). Debugging the xscreensaver app, I found that the client calls "xf86GetGammaRamp -> RRCrtcGammaGet -> xf86RandR12CrtcGetGamma" to fetch the X server's default Gamma Correction RAM tables, which are read into the client. In the client app, the table values are multiplied by a coefficient (the coefficient runs from 1.0 down to 0.01).
The call sequence is "main -> main_loop -> blank_screen -> raise_windows -> fade_screens -> xf86_gamma_fade -> table values multiplied by the coefficient -> XF86VidModeSetGammaRamp".
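The per-step table arithmetic can be sketched as follows (scale_ramp is a hypothetical helper; the real xscreensaver code operates on the ramps it previously saved from the server):

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical helper: scale a saved gamma ramp by the current fade
 * coefficient before it is written back with XF86VidModeSetGammaRamp.
 * coeff runs from 1.0 (full brightness) down toward 0.01. */
static void scale_ramp(const uint16_t *saved, uint16_t *out,
                       size_t n, float coeff)
{
    size_t i;
    for (i = 0; i < n; i++)
        out[i] = (uint16_t)(saved[i] * coeff);
}
```

With coeff = 1.0 the ramp is written back unchanged, which is exactly what the fade.c change below forces.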
The fix (fade.c -> xf86_gamma_fade function):
    /* Iterate by steps of the animation... */
    for (i = (out_p ? steps : 0);
         (out_p ? i > 0 : i < steps);
         (out_p ? i-- : i++))
    {
        for (screen = 0; screen < nscreens; screen++)
        {
--          xf86_whack_gamma(dpy, screen, &info[screen], (((float)i) / ((float)steps)));
++          xf86_whack_gamma(dpy, screen, &info[screen], 1.0); /* changed by Hunk */
With these two changes, the transition from desktop to screensaver now displays correctly.
3). Everyone is welcome to test the code.
Please also try other gamma test apps (client programs) to reproduce every possible case based on graphics data.
-------------- next part --------------
A non-text attachment was scrubbed...
Name: test-gamma_client-App.rar
Type: application/octet-stream
Size: 47759 bytes
Desc: test-gamma_client-App.rar
URL: <http://lists.x.org/archives/xorg-driver-geode/attachments/20100524/7550fff1/attachment-0001.obj>