[Bug 93885] radeon: allow the user to set a maximum HDMI pixel clock (in MHz) by a kernel parameter

bugzilla-daemon at freedesktop.org
Wed Jan 27 08:56:33 PST 2016


https://bugs.freedesktop.org/show_bug.cgi?id=93885

--- Comment #9 from Alex Deucher <alexdeucher at gmail.com> ---
(In reply to Elmar Stellnberger from comment #7)
>   Likewise, for the G96M [GeForce 9600M GT] nobody would have believed
> that this card could yield 3840x2160, be it at 23 Hz or 46 Hz interlaced.
> I am going to provide the logs tomorrow, when the computer with the XFX
> Radeon card is free for testing. I just want to say that I still hope for
> a similar radeon tuning parameter like hdmimhz.

That hardware was not designed to support 4K over HDMI.  If you want to
hack the driver, you are welcome to (take a look at
radeon_dvi_mode_valid()), but it's not something I want to enable or
support out of the box.  If you break your card or monitor, you get to
keep the pieces.
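
For reference, a minimal, untested sketch of what such a hack could look
like.  The hdmimhz parameter name is borrowed from the nouveau option
mentioned in this bug and is purely illustrative; only the override check
itself is shown, wired into radeon_dvi_mode_valid() before the stock clock
checks (mode->clock is in kHz):

    /* Untested sketch only -- not a supported patch.  Parameter name
     * modeled on nouveau's hdmimhz; 0 keeps the stock clock limits. */
    static int radeon_hdmimhz = 0;
    module_param_named(hdmimhz, radeon_hdmimhz, int, 0444);
    MODULE_PARM_DESC(hdmimhz, "Max HDMI pixel clock in MHz (0 = default limits)");

    /* In radeon_dvi_mode_valid(), before the stock clock checks: */
    if (radeon_hdmimhz > 0) {
            if (mode->clock > radeon_hdmimhz * 1000)
                    return MODE_CLOCK_HIGH;
            return MODE_OK;  /* user explicitly raised the cap */
    }

Note that this bypasses the limits the hardware was validated against, so
anything above the stock cap is strictly at-your-own-risk experimentation.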
