fullscreen support for games

Matthias Hopf mhopf at suse.de
Fri Oct 28 09:03:46 PDT 2005


On Oct 26, 05 22:36:44 +0200, Lionel Ulmer wrote:
> For example (always from the Wine perspective) modern 3D games often fare a
> lot better than 3-year-old 2D games, because the blitting from main screen
> to GFX card memory is killing us. If you add to this the colour conversion
> from 8bpp to 32bpp (which multiplies the bandwidth by 4), you have an idea
> of the problem.

I agree that blits for emulating direct framebuffer access can be quite
an issue. Have you thought of downloading the data in its original form
into a texture (i.e. a 16bit or even 8bit texture) and doing the
conversion by drawing a textured quad (including a table lookup via a 1D
texture for the 8bit case)?
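
Roughly like this (untested sketch for the 16bpp case; assumes a current
GL >= 1.2 context, an identity projection so the quad covers the whole
viewport, and power-of-two surface dimensions; 'surface', 'w', 'h' and
'tex' are just placeholders, not Wine code):

/* Untested sketch: hand the 16bpp (RGB565) surface to GL as-is and let
 * the driver expand it to 32bpp while drawing a fullscreen quad. */
#include <GL/gl.h>

static void blit_16bpp(const void *surface, int w, int h, GLuint tex)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    /* No CPU-side colour conversion - upload the 565 data directly.
     * For repeated frames, glTexSubImage2D into a preallocated texture
     * avoids the reallocation. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, w, h, 0,
                 GL_RGB, GL_UNSIGNED_SHORT_5_6_5, surface);

    glEnable(GL_TEXTURE_2D);
    glBegin(GL_QUADS);                  /* one fullscreen quad, t flipped */
        glTexCoord2f(0, 1); glVertex2f(-1, -1);
        glTexCoord2f(1, 1); glVertex2f( 1, -1);
        glTexCoord2f(1, 0); glVertex2f( 1,  1);
        glTexCoord2f(0, 0); glVertex2f(-1,  1);
    glEnd();
    glDisable(GL_TEXTURE_2D);
}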

> Investigations about using GL to optimize this are on my mind for years now
> but never got the motivation (for example, use the GFX card paletted texture
> - if supported - or some shaders for really modern ones to do the depth
> conversion).

Ok, I see you have ;)
Don't use the paletted texture extension! It has been deprecated and is
no longer supported on NVIDIA; I don't remember whether ATI ever
implemented it (and if they did, they will deprecate it soon as well).
Better to use 1D textures and dependent texture lookups. No, you don't
necessarily need pixel shaders for that.
For 16bit->32bit conversion you don't need to do anything at all; just
draw the quad.
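
To make the 8bpp path a bit more concrete, here is a rough, untested
sketch: keep the indices as a GL_LUMINANCE8 texture, put the palette into
a 256-entry 1D texture, and do the dependent lookup per fragment. I use a
two-instruction ARB_fragment_program here only because it keeps the example
short; as said above, fixed-function dependent-lookup paths would do as
well. All names and the entry point handling are placeholders:

/* Untested sketch: 8bit indices stay 8bit in a luminance texture on
 * unit 0, the palette lives in a 256-entry RGBA 1D texture on unit 1,
 * and a tiny ARB_fragment_program does the dependent lookup. */
#include <string.h>
#include <GL/gl.h>
#include <GL/glext.h>
#include <GL/glx.h>    /* glXGetProcAddress for the extension entry points */

static PFNGLGENPROGRAMSARBPROC   pglGenProgramsARB;
static PFNGLBINDPROGRAMARBPROC   pglBindProgramARB;
static PFNGLPROGRAMSTRINGARBPROC pglProgramStringARB;
static PFNGLACTIVETEXTUREARBPROC pglActiveTextureARB;

static const char pal_prog[] =
    "!!ARBfp1.0\n"
    "TEMP index;\n"
    "TEX index, fragment.texcoord[0], texture[0], 2D;\n" /* 8bit index    */
    "TEX result.color, index, texture[1], 1D;\n"         /* palette entry */
    "END\n";

#define GETPROC(type, name) ((type)glXGetProcAddress((const GLubyte *)(name)))

/* palette: 256 RGBA byte quadruples built from the game's palette */
static void setup_palette_path(GLuint *prog, GLuint pal_tex,
                               const unsigned char *palette)
{
    pglGenProgramsARB   = GETPROC(PFNGLGENPROGRAMSARBPROC,   "glGenProgramsARB");
    pglBindProgramARB   = GETPROC(PFNGLBINDPROGRAMARBPROC,   "glBindProgramARB");
    pglProgramStringARB = GETPROC(PFNGLPROGRAMSTRINGARBPROC, "glProgramStringARB");
    pglActiveTextureARB = GETPROC(PFNGLACTIVETEXTUREARBPROC, "glActiveTextureARB");

    pglGenProgramsARB(1, prog);
    pglBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, *prog);
    pglProgramStringARB(GL_FRAGMENT_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                        (GLsizei)strlen(pal_prog), pal_prog);

    /* Palette on unit 1; NEAREST + CLAMP_TO_EDGE so every index maps to
     * exactly one palette entry (including 255). */
    pglActiveTextureARB(GL_TEXTURE1_ARB);
    glBindTexture(GL_TEXTURE_1D, pal_tex);
    glTexImage1D(GL_TEXTURE_1D, 0, GL_RGBA, 256, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, palette);
    glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    pglActiveTextureARB(GL_TEXTURE0_ARB);

    /* Per frame: upload the 8bit surface as GL_LUMINANCE8 on unit 0,
     * glEnable(GL_FRAGMENT_PROGRAM_ARB), and draw the same quad as in
     * the 16bpp sketch above. */
}

On a palette change only the 256-entry 1D texture needs to be re-uploaded;
the 8bit frame data itself never gets touched on the CPU side.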

CU

Matthias

-- 
Matthias Hopf <mhopf at suse.de>       __        __   __
Maxfeldstr. 5 / 90409 Nuernberg    (_   | |  (_   |__         mat at mshopf.de
Phone +49-911-74053-715            __)  |_|  __)  |__  labs   www.mshopf.de


