<html>
<head>
<base href="https://bugs.freedesktop.org/">
</head>
<body>
<p>
<div>
<b><a class="bz_bug_link
bz_status_NEW "
title="NEW - Civilization VI - Artifacts in mouse cursor"
href="https://bugs.freedesktop.org/show_bug.cgi?id=108355#c11">Comment # 11</a>
on <a class="bz_bug_link
bz_status_NEW "
title="NEW - Civilization VI - Artifacts in mouse cursor"
href="https://bugs.freedesktop.org/show_bug.cgi?id=108355">bug 108355</a>
from <span class="vcard"><a class="email" href="mailto:freedesktop@psydk.org" title="Hadrien Nilsson <freedesktop@psydk.org>"> <span class="fn">Hadrien Nilsson</span></a>
</span></b>
<pre>Thank you for the patch, Michel.
I spent the last few days trying to understand the situation about color mouse
cursors and I think it is rather confusing.
I extracted the mouse cursor asset from Civ6 and I could reproduce the same
problem with my SDL test program. When I multiplied the R, G and B components
by the alpha, the blending was correct. So I think you are right about the
symptoms.
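The workaround I used in my SDL program can be sketched as follows. This is only an illustration of the conversion from straight to premultiplied alpha; the byte order R,G,B,A is an assumption, so the channel offsets would need adjusting for other surface formats:

```c
#include <stddef.h>
#include <stdint.h>

/* Convert straight-alpha RGBA32 pixels to premultiplied alpha in place:
 * scale each color channel by the pixel's alpha before handing the
 * image to the cursor API. Byte order R,G,B,A is assumed. */
static void premultiply_rgba(uint8_t *pixels, size_t npixels)
{
    for (size_t i = 0; i < npixels; ++i) {
        uint8_t *p = pixels + i * 4;
        uint8_t a = p[3];
        /* +127 rounds to nearest instead of truncating */
        p[0] = (uint8_t)((p[0] * a + 127) / 255);
        p[1] = (uint8_t)((p[1] * a + 127) / 255);
        p[2] = (uint8_t)((p[2] * a + 127) / 255);
    }
}
```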
I tried your patch and the blending was correct in both my SDL program and the
game.
Now the question is, what should be the correct behavior? How should the RGBA
data be provided, premultiplied or straight? I tried to find specifications,
alas:
- the SDL documentation gives no clue about it;
- neither does the XcursorImage* documentation;
- nor does wl_pointer_set_cursor.
I even looked at Windows. No clue there either, except for what I think is a
low-level function, DrvSetPointerShape, whose documentation finally specifies
the pixel format: the RGB data should indeed be alpha-premultiplied.
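To make the difference between the two conventions concrete, here is the "over" compositing step written out both ways, per channel with values in [0,1]. The comment about hardware cursor planes is my reading of the situation, not something any of the specifications state:

```c
/* "Over" compositing of one channel, both conventions. With straight
 * alpha the source channel must still be scaled by the source alpha at
 * blend time; with premultiplied alpha that multiplication is already
 * baked into the stored value. If the compositor/hardware implements
 * only the premultiplied form (src + dst * (1 - a)) and is fed
 * straight-alpha data, the source term comes out too bright, which
 * matches the artifacts seen here. */
static float blend_straight(float src, float dst, float src_a)
{
    return src * src_a + dst * (1.0f - src_a);
}

static float blend_premultiplied(float src_pm, float dst, float src_a)
{
    return src_pm + dst * (1.0f - src_a);
}
```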
As the documentation is not very clear, I had to test implementations. Here are
my results with different combinations:
- Xorg + amdgpu : premultiplied ok, straight wrong
- Xorg + intel : premultiplied ok, straight wrong
- Xorg + proprietary nVidia : premultiplied ok, straight ok
- Wayland + amdgpu : premultiplied ok, straight wrong
- Wayland + intel : premultiplied ok, straight wrong
- Windows + amdgpu : premultiplied ok, straight ok
- Windows + intel : premultiplied ok, straight ok
So it looks like Windows, with any graphics card, is able to detect the wrong
format (straight alpha) and fix it, as you did in your patch, and the
proprietary nVidia X11 driver has the same workaround. That could explain why
Aspyr only supports nVidia: the nVidia driver fixes the wrong data for them.
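One plausible way such a detection could work, purely my guess at the kind of test a driver-side workaround might use (I do not know what the Windows or nVidia implementations actually do): in valid premultiplied data, no color channel can exceed the pixel's alpha, so a violation proves the image is straight alpha. Byte order R,G,B,A is again assumed:

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Heuristic: a premultiplied channel is at most 255 * a / 255 = a, so
 * any channel greater than alpha means the image cannot be
 * premultiplied and was almost certainly provided with straight alpha.
 * Byte order R,G,B,A is assumed. */
static bool looks_like_straight_alpha(const uint8_t *pixels, size_t npixels)
{
    for (size_t i = 0; i < npixels; ++i) {
        const uint8_t *p = pixels + i * 4;
        if (p[0] > p[3] || p[1] > p[3] || p[2] > p[3])
            return true;
    }
    return false;
}
```

Note that the converse does not hold: an image that passes this check might still be straight-alpha data that merely happens to look premultiplied, so it is only a one-way heuristic.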
I'm very happy your patch fixes the Xorg + amdgpu combination, but isn't there
a larger problem here? Shouldn't such a workaround also be applied somewhere
more central in the graphics stack (SDL, Xcursor, Wayland?) so that every
combination could benefit from it?</pre>
</div>
</p>
<hr>
<span>You are receiving this mail because:</span>
<ul>
<li>You are the assignee for the bug.</li>
</ul>
</body>
</html>