Linux OpenGL ABI discussion

Adam Jackson ajax at nwnk.net
Thu Sep 29 10:54:00 PDT 2005


On Thursday 29 September 2005 04:35, Dave Airlie wrote:
> I have to agree with Christoph, the libGL should be a
> one-size-fits-all and capable of loading drivers from any vendor, I'm
> not sure what is so hard about this apart from the fact that neither
> vendor has seemed willing to help out infrastructure on the basis of
> some belief that they shouldn't have to (maybe because they don't on
> Windows) or maybe because they don't want to be seen to collaborate on
> things.... there is hardly any major secrets in the libGL interface
> that should stop it...

There is exactly one "secret": how to get from a GL entrypoint to the
driver's dispatch table as fast as possible while remaining
thread-correct.  However, this can be read right out of the compiled
object with any reasonable disassembler, so it's not much of a secret.
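
For the curious, the shape of that trick is roughly the following.  A
minimal sketch in the spirit of Mesa's glapi; the names here (struct
gl_dispatch, bind_dispatch) are made up for illustration, and a real
libGL open-codes the stub in per-architecture assembly rather than
calling pthread_getspecific():

#include <pthread.h>

struct gl_dispatch {
    void (*End)(void);
    /* ...one slot per GL entrypoint... */
};

static pthread_key_t dispatch_key;
static pthread_once_t dispatch_once = PTHREAD_ONCE_INIT;

static void make_key(void)
{
    pthread_key_create(&dispatch_key, NULL);
}

/* Each thread binds its own context, so each thread needs its own
 * dispatch pointer; the driver would install its table at
 * MakeCurrent time. */
void bind_dispatch(struct gl_dispatch *table)
{
    pthread_once(&dispatch_once, make_key);
    pthread_setspecific(dispatch_key, table);
}

/* Every public entrypoint is a thin stub like this; the whole game
 * is making the lookup-and-jump as cheap as possible. */
void glEnd(void)
{
    struct gl_dispatch *d = pthread_getspecific(dispatch_key);
    d->End();
}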

> As far as I know idr did a lot of work recently on libGL so we can
> expose GL extensions for vendors like ATI without them having to ship
> their own driver (I'm not sure if ATI contributed anything more than a
> list of things needed).. I think he mentioned this was a bit more
> difficult for glx.. but I'm sure it should be possible...

We already had this thread:

http://lists.freedesktop.org/archives/dri-egl/2005-July/000565.html

In particular, Andy's response about why they're uninterested in a common 
libGL is basically The Last Word on the subject.  It would require that 
nvidia expend time, effort, and money to get to the same level of 
functionality they already have.  This applies equally to any other IHV, and 
to ISVs like XiG and SciTech too for that matter.  You can have whatever 
opinion you like about that stance, but it's simply an economic reality.

It's also irrelevant.  libGL simply needs to provide ABI guarantees.  
Specifying driver compatibility is outside the scope of the LSB.

I would make the case that the soname version for a libGL that supports 
OpenGL 2.0 should still start with 1.  DSO version numbers are for ABI 
changes, and OpenGL 2.0 remains backwards-compatible with OpenGL 1.5 for 
the set of entrypoints they share.  It's not like 2.0 changes the 
prototype for glEnd() or anything.  So, 1.6.  Or 1.10 or whatever, if we 
really think that people want to do more GL 1.x versions.
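
Concretely: applications and loaders bind to the major number only, so
a 1.5 -> 1.6 bump is invisible to them.  A trivial sketch, nothing
libGL-specific about it:

#include <dlfcn.h>
#include <stdio.h>

int main(void)
{
    /* Binding to libGL.so.1 rather than libGL.so.1.5 is exactly what
     * makes a minor-version bump a non-event for applications. */
    void *gl = dlopen("libGL.so.1", RTLD_NOW | RTLD_GLOBAL);
    if (!gl) {
        fprintf(stderr, "dlopen: %s\n", dlerror());
        return 1;
    }
    printf("glGetString %s\n",
           dlsym(gl, "glGetString") ? "resolved" : "missing");
    dlclose(gl);
    return 0;
}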

I would also make the case that the LSB should in no case require an 
implementation to have features unavailable in open source.  In particular, 
requiring GL 2.0 would be broken.  Remember what the L stands for here.

The deeper issue here is whether it's actually useful to require some minimum 
level of functionality even when large swaths of it will be software.  If I 
don't have cube map support in hardware, do I really want to try it in 
software?  Is that a useful experience for developers or for users?

Perhaps what I would like is a new set of glGetString tokens that describe 
what version and extensions the hardware is actually capable of accelerating, 
rather than what the software supports.  Because in some sense, advertising 
GL 2.0 on a Riva is so inaccurate as to be worse than lying.
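
Something along these lines, say.  The token names and values below are
invented purely for illustration; nothing like them exists in the spec
today:

#include <string.h>
#include <GL/gl.h>

/* Hypothetical tokens; values picked out of thin air. */
#define GL_NATIVE_VERSION_XXX    0x10000
#define GL_NATIVE_EXTENSIONS_XXX 0x10001

static int hw_accelerates(const char *ext)
{
    /* Unlike GL_EXTENSIONS, which lists everything the software stack
     * can fall back to, this would list only what the hardware can do
     * at speed. */
    const char *hw = (const char *)glGetString(GL_NATIVE_EXTENSIONS_XXX);
    return hw && strstr(hw, ext) != NULL;
}

/* Usage: take the cube map path only when it will actually be fast.
 *
 *   if (hw_accelerates("GL_ARB_texture_cube_map"))
 *       use_cube_maps();
 */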

> This is as far as I know how MS's OpenGL ICD system works, there is
> one frontend and your driver can expose extra things via it...

It's not.  MS's MCD (Mini Client Driver) system worked something like 
our current one: a single GL dispatch layer, with the vendor providing a 
driver that gets loaded by the system.  In the ICD scheme, opengl32.dll 
(or whatever it is) is provided per-vendor.

- ajax

