[PATCH 0/6]

Ian Romanick ian.d.romanick at intel.com
Mon Apr 20 21:14:13 PDT 2009


This patch set hits 4 repositories.  Woo-hoo!

The old DRI2GetBuffers protocol is replaced with DRI2GetBuffersWithFormat.
The new request adds a per-buffer format value.  This is a magic value that
is opaque outside the driver.  The Intel driver simply sends the
bits-per-pixel.  Drivers that need other information to make the correct
allocation can send other data in the field.
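
As a rough illustration, the DRI-driver side of the new interface takes
interleaved (attachment, format) pairs plus the pair count.  The sketch
below is not the patch contents; the function name and the 16-bit depth
value are assumptions, and the loader entry point is shown only roughly:

  #include <GL/internal/dri_interface.h>

  /* Request a back buffer and a depth buffer, encoding the format as
   * bits-per-pixel the way the Intel driver does. */
  static __DRIbuffer *
  request_buffers(const __DRIdri2LoaderExtension *loader,
                  __DRIdrawable *drawable, void *loaderPrivate,
                  unsigned int color_bpp,
                  int *width, int *height, int *out_count)
  {
      unsigned int attachments[4];
      int i = 0;

      attachments[i++] = __DRI_BUFFER_BACK_LEFT;
      attachments[i++] = color_bpp;          /* e.g. 32 for ARGB8888 */
      attachments[i++] = __DRI_BUFFER_DEPTH;
      attachments[i++] = 16;                 /* a 16-bit depth buffer, say */

      return loader->getBuffersWithFormat(drawable, width, height,
                                          attachments, i / 2,
                                          out_count, loaderPrivate);
  }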

The new function also behaves somewhat differently.  DRI2GetBuffers would
create the requested set of buffers and destroy all previously existing
buffers.  DRI2GetBuffersWithFormat creates the requested set of buffers, but
it only destroys existing buffers on attachments in the requested set.
Further, it only allocates (or destroys) buffers if the size or format has
changed.  This allows a client to initially request { DRI2BufferBackLeft,
DRI2BufferDepth }, and later request { DRI2BufferBackLeft, DRI2BufferFrontLeft,
DRI2BufferDepth } without ill effect.
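
Reduced to a sketch, the server-side intent looks something like this; the
helper functions and the format field on the buffer record are stand-ins
for illustration, not the actual dri2.c code:

  /* For each requested (attachment, format) pair, reuse the existing buffer
   * when its size and format still match, otherwise replace it.  Buffers on
   * attachments that were not requested are left alone. */
  static void
  update_requested_buffers(DrawablePtr pDraw,
                           const unsigned int *attachments, int count)
  {
      int i;

      for (i = 0; i < count; i++) {
          unsigned int attachment = attachments[2 * i];
          unsigned int format     = attachments[2 * i + 1];
          DRI2BufferPtr old = lookup_buffer(pDraw, attachment);

          if (old != NULL && old->format == format
              && buffer_size_matches(old, pDraw))
              continue;                      /* nothing changed; keep it */

          if (old != NULL)
              destroy_buffer(pDraw, old);
          allocate_buffer(pDraw, attachment, format);
      }
  }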

Since the buffer allocation is done piecewise, it is difficult to implement
combined depth / stencil buffers.  Taking a cue from
GL_ARB_framebuffer_object, I added a new DRI2BufferDepthStencil attachment.
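
In request terms that is just one more pair in the earlier sketch, using the
driver-side counterpart of the new attachment; the 32 bpp packed 24-bit
depth / 8-bit stencil encoding is only an assumption:

      attachments[i++] = __DRI_BUFFER_DEPTH_STENCIL;
      attachments[i++] = 32;         /* e.g. 24-bit depth + 8-bit stencil */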

I have tested the following combinations with the listed result:

 * New libGL, old 3D driver, old X server, old 2D driver -> works just as
   before
 * New libGL, new 3D driver, old X server, old 2D driver -> works just as
   before
 * New libGL, new 3D driver, new X server, old 2D driver -> DRI2 fails to
   initialize, uses the software rasterizer
 * New libGL, new 3D driver, old X server, new 2D driver -> DRI2 fails to
   initialize, uses the software rasterizer
 * New libGL, new 3D driver, new X server, new 2D driver -> Works the way
   we really want it to!  Front-buffer rendering works, but the fake front-
   buffer is only allocated when it's needed.  JUST LIKE MAGIC!

The combination that is not tested is the old libGL / 3D driver with everything
else new.  This should work.  The idea is that if the 2D driver receives
format=0 in CreateBuffer, it should make the same allocation that CreateBuffers
would have made for the attachment.
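
A sketch of that fallback in the 2D driver's CreateBuffer hook; the helper
name and the bits-per-pixel choice are assumptions for illustration, not
the actual i830_dri.c logic:

  /* format == 0 means the client stack did not supply a format (old libGL /
   * 3D driver), so pick the value CreateBuffers would have used; here we
   * assume that is the drawable's own bits-per-pixel. */
  static unsigned int
  buffer_format_fallback(DrawablePtr drawable, unsigned int format)
  {
      return format != 0 ? format : (unsigned int) drawable->bitsPerPixel;
  }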

dri2proto changes:

Ian Romanick (1):
  Add protocol for DRI2GetBuffersWithFormat

 configure.ac  |    2 +-
 dri2proto.h   |    5 ++-
 dri2proto.txt |   67 +++++++++++++++++++++++++++++++++++++++++++++++++++++++-
 dri2tokens.h  |    1 +
 4 files changed, 70 insertions(+), 5 deletions(-)

xserver changes:

Ian Romanick (1):
  DRI2: Implement protocol for DRI2GetBuffersWithFormat

 configure.ac              |    2 +-
 glx/glxdri2.c             |   59 ++++++++++++--
 hw/xfree86/dri2/dri2.c    |  195 +++++++++++++++++++++++++++++++++------------
 hw/xfree86/dri2/dri2.h    |   22 +++++-
 hw/xfree86/dri2/dri2ext.c |   88 ++++++++++++++------
 5 files changed, 278 insertions(+), 88 deletions(-)

xf86-video-intel changes:

Ian Romanick (1):
  DRI2: If the SDK supports it, use the DRI2GetBuffersWithFormat
    interfaces

 src/i830_dri.c |  126 +++++++++++++++++++++++++++++++++++++++++++++++++++++++-
 1 files changed, 125 insertions(+), 1 deletions(-)

Mesa changes:

Ian Romanick (3):
  DRI2: Implement protocol for DRI2GetBuffersWithFormat
  DRI2: Implement interface for drivers to access
    DRI2GetBuffersWithFormat
  intel / DRI2: When available, use DRI2GetBuffersWithFormat

 include/GL/internal/dri_interface.h        |   28 +++++++-
 src/glx/x11/dri2.c                         |   70 ++++++++++++++++++
 src/glx/x11/dri2.h                         |   10 +++
 src/glx/x11/dri2_glx.c                     |  104 ++++++++++++++++++++++-----
 src/mesa/drivers/dri/intel/intel_buffers.c |   10 +++
 src/mesa/drivers/dri/intel/intel_context.c |  105 +++++++++++++++++++++++----
 6 files changed, 291 insertions(+), 36 deletions(-)


