<html>
<head>
<base href="https://bugs.freedesktop.org/" />
</head>
<body>
<p>
<div>
<b><a class="bz_bug_link
bz_status_RESOLVED bz_closed"
title="RESOLVED NOTABUG - radeon: allow the user to set a maximum HDMI pixel clock (in MHz) by a kernel parameter"
href="https://bugs.freedesktop.org/show_bug.cgi?id=93885#c7">Comment # 7</a>
on <a class="bz_bug_link
bz_status_RESOLVED bz_closed"
title="RESOLVED NOTABUG - radeon: allow the user to set a maximum HDMI pixel clock (in MHz) by a kernel parameter"
href="https://bugs.freedesktop.org/show_bug.cgi?id=93885">bug 93885</a>
from <span class="vcard"><a class="email" href="mailto:estellnb@elstel.org" title="Elmar Stellnberger <estellnb@elstel.org>"> <span class="fn">Elmar Stellnberger</span></a>
</span></b>
<pre> Likewise, for the G96M [GeForce 9600M GT] nobody would have believed that this
card could yield 3840x2160, be it at 23Hz, or at 46Hz interlaced. I will provide the
logs tomorrow, when the computer with the XFX radeon card is free for testing.
I just want to say that I still hope for a radeon tuning parameter similar to
hdmimhz. The fact that the card was sold as 4K-ready over HDMI should be a
strong indication that 3840x2160@30/24/23 is possible. If I remember
correctly, 3840x2160@30 was initially stated to be officially supported by ATI
for the XFX card (though that claim has since been withdrawn). I would even take
the risk of testing it should the card not work in such a mode for some reason
(an old HDMI1.4 incompatibility or the like).</pre>
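<p>For context on the modes discussed above: a mode's pixel clock is simply its total (active plus blanking) pixels per frame times the refresh rate. The sketch below uses the standard CTA-861 4K timings (an assumption about which timings the card would negotiate) to show that 3840x2160@30 needs 297 MHz, which fits under the 340 MHz TMDS ceiling of HDMI 1.4, while 3840x2160@60 does not:</p>

```python
def pixel_clock_mhz(htotal, vtotal, refresh_hz):
    """Pixel clock in MHz: total pixels per frame times refresh rate."""
    return htotal * vtotal * refresh_hz / 1e6

# CTA-861 timing for 3840x2160@30 (VIC 95): 4400 x 2250 total pixels.
clk_30 = pixel_clock_mhz(4400, 2250, 30)   # 297.0 MHz -- within HDMI 1.4
clk_60 = pixel_clock_mhz(4400, 2250, 60)   # 594.0 MHz -- needs HDMI 2.0
print(clk_30, clk_60)
```

A cap like the hdmimhz parameter the comment refers to works on exactly this number: the driver rejects any mode whose computed pixel clock exceeds the user-supplied limit in MHz.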
</div>
</p>
<hr>
<span>You are receiving this mail because:</span>
<ul>
<li>You are the assignee for the bug.</li>
</ul>
</body>
</html>