[Bug 93885] radeon: allow the user to set a maximum HDMI pixel clock (in MHz) by a kernel parameter

bugzilla-daemon at freedesktop.org bugzilla-daemon at freedesktop.org
Wed Jan 27 08:34:01 PST 2016


https://bugs.freedesktop.org/show_bug.cgi?id=93885

--- Comment #7 from Elmar Stellnberger <estellnb at elstel.org> ---
  Likewise, for the G96M [GeForce 9600M GT] nobody would have believed that this
card could yield 3840x2160, be it at 23Hz, or 46Hz interlaced. I will provide the
logs tomorrow, when the computer with the XFX Radeon card is free for testing.
I just want to say that I still hope for a radeon tuning parameter similar to
hdmimhz. The fact that the card was sold as 4K-ready over HDMI should be a
strong indication that 3840x2160@30/24/23 is possible. If I remember
correctly, 3840x2160@30 was initially stated to be officially supported by ATI
for the XFX card (though that claim has since been withdrawn). I would even take
the risk of testing it should the card not work at these modes for some reason
(an old HDMI 1.4 incompatibility or the like).
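[Editorial note: for reference, an hdmimhz-style override is set like any other
kernel module parameter. The nouveau option referred to above exists; the
radeon.hdmimhz spelling below is the parameter this bug requests and is
hypothetical, not a merged option. A minimal sketch:]

```shell
# On the kernel command line (e.g. in GRUB_CMDLINE_LINUX in /etc/default/grub),
# raise nouveau's maximum accepted HDMI pixel clock to 297 MHz (4K@60 limit):
#   nouveau.hdmimhz=297

# Or persistently via a modprobe configuration file,
# e.g. /etc/modprobe.d/hdmimhz.conf:
options nouveau hdmimhz=297

# The radeon equivalent requested in this bug would presumably read
# (hypothetical parameter, not present in mainline radeon):
options radeon hdmimhz=297
```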

-- 
You are receiving this mail because:
You are the assignee for the bug.

