[PATCH 3/3] glx/dri3: Request non-vsynced Present for swapinterval zero.
Mario Kleiner
mario.kleiner.de at gmail.com
Mon Dec 15 19:53:47 PST 2014
On 12/15/2014 06:46 AM, Keith Packard wrote:
> Mario Kleiner <mario.kleiner.de at gmail.com> writes:
>
>> Restores proper immediate tearing swap behaviour for
>> OpenGL bufferswap under DRI3/Present.
> Hrm. I'd love for this to be controlled by the GLX_EXT_swap_control_tear
> extension, but that one uses negative interval values to indicate
> tearing, instead of providing a new API, and we can't tell the
> difference between 0 and -0.
>
> Are you sure you don't want GLX_EXT_swap_control_tear and an interval of
> -1 instead of an interval of 0 with tearing?
>
Yes. GLX_EXT_swap_control_tear is a different use case. It is useful for
games which want to avoid tearing if possible, but don't want to get
into a "tremor" of frequently switching between, say, 60 fps and 30 fps
if they almost manage 60 fps, but not with reliable headroom. It would
also be nice to have support for that, but as an addition for
swapinterval -1, not as a replacement for swapinterval 0.
The 0 case is good for benchmarking.
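For reference, a minimal client-side sketch of how the two modes would be
selected (not part of this patch; it just uses the entry points defined by
GLX_EXT_swap_control and GLX_EXT_swap_control_tear):

/* Sketch: pick swap interval 0 (immediate, may tear), 1 (vsync) or
 * -1 (adaptive vsync, needs GLX_EXT_swap_control_tear). */
#include <GL/glx.h>
#include <GL/glxext.h>
#include <string.h>

static void set_swap_interval(Display *dpy, GLXDrawable drawable,
                              int screen, int interval)
{
    const char *ext = glXQueryExtensionsString(dpy, screen);
    PFNGLXSWAPINTERVALEXTPROC glXSwapIntervalEXT =
        (PFNGLXSWAPINTERVALEXTPROC)
        glXGetProcAddressARB((const GLubyte *) "glXSwapIntervalEXT");

    if (!glXSwapIntervalEXT || !strstr(ext, "GLX_EXT_swap_control"))
        return; /* No swap control available; keep the default. */

    /* Negative (adaptive) intervals need GLX_EXT_swap_control_tear. */
    if (interval < 0 && !strstr(ext, "GLX_EXT_swap_control_tear"))
        interval = -interval;

    glXSwapIntervalEXT(dpy, drawable, interval);
}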
In my specific case I always want vsync'ed swaps for actual visual
stimulation in neuroscience/medical settings, with no frame ever
skipped. The bonus use for me, apart from benchmarking how fast the
system can go, comes with multi-display setups, e.g., dual-display for
stereoscopic stimulation with one display per eye, or some CAVE-like
setup for VR with more than two displays. You want display updates and
scanout on all of them synchronized, so the scene stays coherent. One
simple way to visually test multi-display sync is to intentionally swap
all of them without vsync, e.g., timed to swap in the middle of the
scanout. If the tear-lines on all displays are at roughly the same
vertical position and stay there, that is a good visual indication that
things work. There are other ways to do it, but this is the one method
that seems to work cross-platform, without lots of mental context
switching depending on which os/gpu/server/driver combo with which
settings one uses, and it is much easier to grasp for scientists with no
graphics background. You can see at a glance if things are roughly
correct or not. A rough sketch of that tear-line test follows below.
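The sketch below is only illustrative, not the code I actually use: it
assumes swapinterval 0 is in effect (the case this patch restores) and
uses GLX_OML_sync_control to time the unsynchronized swap to roughly
mid-scanout; one would call it once per display/window.

/* Sketch: wait for a vblank, sleep half a refresh period, then do an
 * unsynchronized swap so the tear line lands near mid-screen. */
#include <GL/glx.h>
#include <GL/glxext.h>
#include <stdint.h>
#include <unistd.h>

static void tear_test_swap(Display *dpy, GLXDrawable win)
{
    PFNGLXWAITFORMSCOMLPROC glXWaitForMscOML = (PFNGLXWAITFORMSCOMLPROC)
        glXGetProcAddressARB((const GLubyte *) "glXWaitForMscOML");
    PFNGLXGETSYNCVALUESOMLPROC glXGetSyncValuesOML = (PFNGLXGETSYNCVALUESOMLPROC)
        glXGetProcAddressARB((const GLubyte *) "glXGetSyncValuesOML");
    PFNGLXGETMSCRATEOMLPROC glXGetMscRateOML = (PFNGLXGETMSCRATEOMLPROC)
        glXGetProcAddressARB((const GLubyte *) "glXGetMscRateOML");
    int64_t ust, msc, sbc;
    int32_t num = 60, den = 1;

    if (!glXWaitForMscOML || !glXGetSyncValuesOML || !glXGetMscRateOML)
        return;

    glXGetMscRateOML(dpy, win, &num, &den);          /* refresh rate   */
    glXGetSyncValuesOML(dpy, win, &ust, &msc, &sbc); /* current counts */
    /* Block until the start of the next refresh cycle... */
    glXWaitForMscOML(dpy, win, msc + 1, 0, 0, &ust, &msc, &sbc);
    /* ...then sleep half a refresh duration, so the swap happens while
     * scanout is roughly in the middle of the screen. */
    usleep((useconds_t)(500000LL * den / num));
    glXSwapBuffers(dpy, win); /* swapinterval 0: tears near mid-screen */
}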
-mario