Making the discrete AMD GPU the default GPU in a Mux-less setup with an integrated Intel GPU in Mint 18 (Xenial)
Ryan Ross
lightknight.rr at gmail.com
Sat Jul 23 06:32:54 UTC 2016
Ok, I think I have the process down (for generating the driver, anyway).
I'm just building all of X in the process, albeit with glamor,
DRI/DRI2/DRI3, and static libs enabled.
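
For reference, the configure invocation is roughly the following
(reconstructed from memory, so the exact flags may differ slightly):

    ./autogen.sh --enable-glamor --enable-dri --enable-dri2 --enable-dri3 \
        --enable-static
    make -j$(nproc)
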
A few things:

- modesetting_drv.so
  (https://www.dropbox.com/s/vntvl2gh5k6qmgn/modesetting_drv.so?dl=0) is huge!
- Steam is complaining about missing drivers
  (https://www.dropbox.com/s/8bgc48u4xg8badw/steam.txt?dl=0)... perhaps the
  modesetting driver I generated is tied to the other drivers I built while
  making X?
- glxinfo output, if you're interested:
  https://www.dropbox.com/s/t0g5lw59a3dcpvv/glxinfo.txt?dl=0
- Latest Xorg.0.log: https://www.dropbox.com/s/n9of9nmzn9lxd1s/Xorg.0.log?dl=0
- Both the latest modified source and the configured build directory, in one
  package, in case it helps:
  https://www.dropbox.com/s/8d27fipcvglc140/xorg-server.tar.bz2?dl=0
I'm currently thinking that installing the generated X server (plus
everything that comes with it) might fix everything. The only things
holding me up are figuring out the prefixes (it wants to install to
/usr/local, while the files it replaces live under /usr), and working out
exactly what is in this build (should I choose to use checkinstall... it
seems to contain drivers, the X server, lots of things...) so I can put
that in the package manifest.
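
If I do go that route, the plan is roughly the following (untested; the
directory switches may need tweaking to match where the distro packages
actually put things):

    ./autogen.sh --prefix=/usr --sysconfdir=/etc --localstatedir=/var
    make -j$(nproc)
    sudo checkinstall --pkgname=xorg-server-custom make install
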
And one other... unique thing: pressing the 'Print Screen' key while using
this driver in this fashion does not take a screenshot; it just brings you
back to the window manager.
And it's 2:30 AM...again...sleep beckons.
Good night,
Ryan
On 07/22/2016 04:54 AM, Yu, Qiang wrote:
> I just copied out hw/xfree86/drivers/modesetting and built it against the
> system xserver-xorg-dev like a normal DDX (i.e. like xserver-xorg-video-ati).
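>
> Roughly like this (a simplified sketch rather than my exact build; it
> assumes xserver-xorg-dev, libdrm-dev and libgbm-dev are installed, and the
> GLAMOR defines still have to be set by hand):
>
>     cp -r xorg-server/hw/xfree86/drivers/modesetting ./modesetting-ddx
>     cd modesetting-ddx
>     gcc -shared -fPIC -DGLAMOR -DGLAMOR_HAS_GBM -o modesetting_drv.so *.c \
>         $(pkg-config --cflags xorg-server libdrm gbm)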
>
> But I don't think this makes any difference compared with building the
> whole patched X server to generate modesetting_drv.so. I think the
> RenderXXX options are the place to look to find out why your
> modesetting_drv.so doesn't work.
>
> Regards,
> Qiang
>
> ________________________________________
> From: Ryan Ross <lightknight.rr at gmail.com>
> Sent: Friday, July 22, 2016 4:31:05 PM
> To: Yu, Qiang
> Cc: xorg-driver-ati at lists.x.org
> Subject: Re: Making the discrete AMD GPU the default GPU in a Mux-less setup with an integrated Intel GPU in Mint 18 (Xenial)
>
> Ok, now you need to teach me how you did that magic trick. Mind you, the
> fan is running non-stop, but I don't care.
>
> Output:
>
> ryan@ryan-Satellite-P55t-B ~ $ glxinfo
> name of display: :0
> display: :0 screen: 0
> direct rendering: Yes
> server glx vendor string: SGI
> server glx version string: 1.4
> server glx extensions:
> GLX_ARB_create_context, GLX_ARB_create_context_profile,
> GLX_ARB_create_context_robustness, GLX_ARB_fbconfig_float,
> GLX_ARB_framebuffer_sRGB, GLX_ARB_multisample,
> GLX_EXT_create_context_es2_profile, GLX_EXT_create_context_es_profile,
> GLX_EXT_fbconfig_packed_float, GLX_EXT_framebuffer_sRGB,
> GLX_EXT_import_context, GLX_EXT_texture_from_pixmap,
> GLX_EXT_visual_info,
> GLX_EXT_visual_rating, GLX_INTEL_swap_event, GLX_MESA_copy_sub_buffer,
> GLX_OML_swap_method, GLX_SGIS_multisample, GLX_SGIX_fbconfig,
> GLX_SGIX_pbuffer, GLX_SGIX_visual_select_group, GLX_SGI_swap_control
> client glx vendor string: Mesa Project and SGI
> client glx version string: 1.4
> client glx extensions:
> GLX_ARB_create_context, GLX_ARB_create_context_profile,
> GLX_ARB_create_context_robustness, GLX_ARB_fbconfig_float,
> GLX_ARB_framebuffer_sRGB, GLX_ARB_get_proc_address,
> GLX_ARB_multisample,
> GLX_EXT_buffer_age, GLX_EXT_create_context_es2_profile,
> GLX_EXT_create_context_es_profile, GLX_EXT_fbconfig_packed_float,
> GLX_EXT_framebuffer_sRGB, GLX_EXT_import_context,
> GLX_EXT_texture_from_pixmap, GLX_EXT_visual_info,
> GLX_EXT_visual_rating,
> GLX_INTEL_swap_event, GLX_MESA_copy_sub_buffer,
> GLX_MESA_multithread_makecurrent, GLX_MESA_query_renderer,
> GLX_MESA_swap_control, GLX_OML_swap_method, GLX_OML_sync_control,
> GLX_SGIS_multisample, GLX_SGIX_fbconfig, GLX_SGIX_pbuffer,
> GLX_SGIX_visual_select_group, GLX_SGI_make_current_read,
> GLX_SGI_swap_control, GLX_SGI_video_sync
> GLX version: 1.4
> GLX extensions:
> GLX_ARB_create_context, GLX_ARB_create_context_profile,
> GLX_ARB_create_context_robustness, GLX_ARB_fbconfig_float,
> GLX_ARB_framebuffer_sRGB, GLX_ARB_get_proc_address,
> GLX_ARB_multisample,
> GLX_EXT_buffer_age, GLX_EXT_create_context_es2_profile,
> GLX_EXT_create_context_es_profile, GLX_EXT_fbconfig_packed_float,
> GLX_EXT_framebuffer_sRGB, GLX_EXT_import_context,
> GLX_EXT_texture_from_pixmap, GLX_EXT_visual_info,
> GLX_EXT_visual_rating,
> GLX_INTEL_swap_event, GLX_MESA_copy_sub_buffer,
> GLX_MESA_multithread_makecurrent, GLX_MESA_query_renderer,
> GLX_MESA_swap_control, GLX_OML_swap_method, GLX_OML_sync_control,
> GLX_SGIS_multisample, GLX_SGIX_fbconfig, GLX_SGIX_pbuffer,
> GLX_SGIX_visual_select_group, GLX_SGI_make_current_read,
> GLX_SGI_swap_control, GLX_SGI_video_sync
> Extended renderer info (GLX_MESA_query_renderer):
> Vendor: X.Org (0x1002)
> Device: AMD CAPE VERDE (DRM 2.43.0 / 4.4.0-31-generic, LLVM 3.8.1)
> (0x6823)
> Version: 12.1.0
> Accelerated: yes
> Video memory: 2048MB
> Unified memory: no
> Preferred profile: core (0x1)
> Max core profile version: 4.1
> Max compat profile version: 3.0
> Max GLES1 profile version: 1.1
> Max GLES[23] profile version: 3.0
> OpenGL vendor string: X.Org
> OpenGL renderer string: Gallium 0.4 on AMD CAPE VERDE (DRM 2.43.0 /
> 4.4.0-31-generic, LLVM 3.8.1)
> OpenGL core profile version string: 4.1 (Core Profile) Mesa 12.1.0-devel
> OpenGL core profile shading language version string: 4.10
> OpenGL core profile context flags: (none)
> OpenGL core profile profile mask: core profile
> OpenGL core profile extensions:
>
> That does indeed work.
>
> Regards,
> Ryan
>
>
> On 07/22/2016 02:52 AM, Yu, Qiang wrote:
>> Yes, I tested on Ubuntu 16.04 and it's OK. But I haven't tried to build
>> the whole Xserver to get the modesetting_drv.so, instead I build it separately.
>>
>> Attach my modesetting DDX and you can copy it to
>> /usr/lib/xorg/extra-modules/drivers/
>> for a try.
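>>
>> Something along these lines should do it (adjust if Mint puts the X
>> modules elsewhere):
>>
>>     sudo mkdir -p /usr/lib/xorg/extra-modules/drivers/
>>     sudo cp modesetting_drv.so /usr/lib/xorg/extra-modules/drivers/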
>>
>> Regards,
>> Qiang
>> ________________________________________
>> From: Ryan Ross <lightknight.rr at gmail.com>
>> Sent: Friday, July 22, 2016 2:25:46 PM
>> To: Yu, Qiang
>> Cc: xorg-driver-ati at lists.x.org
>> Subject: Re: Making the discrete AMD GPU the default GPU in a Mux-less setup with an integrated Intel GPU in Mint 18 (Xenial)
>>
>> Ah, and your configuration works? I'll have to look more closely into the
>> differences between Ubuntu and Mint.
>>
>> As promised, here are the files:
>> The log file (again):
>> https://www.dropbox.com/s/cqklcqjn18aer3m/Xorg.0.log?dl=0
>> And the build-source files:
>> https://www.dropbox.com/s/itt8nbh7rs9lmz6/xorg-build.tar.bz2?dl=0
>>
>> Thanks,
>> Ryan
>>
>> On 07/22/2016 02:15 AM, Yu, Qiang wrote:
>>> 1.18.3 is OK. And I use Ubuntu 16.04.
>>>
>>> The RenderPath and RenderDriver options are necessary for modesetting
>>> to select a render GPU different from the display GPU.
>>>
>>> Regards,
>>> Qiang
>>> ________________________________________
>>> From: Ryan Ross <lightknight.rr at gmail.com>
>>> Sent: Friday, July 22, 2016 1:43:01 PM
>>> To: Yu, Qiang; xorg-driver-ati at lists.x.org
>>> Subject: Re: Making the discrete AMD GPU the default GPU in a Mux-less setup with an integrated Intel GPU in Mint 18 (Xenial)
>>>
>>> Greetings Qiang,
>>>
>>> What version of X are you using? I've built my own from what Mint is
>>> currently offering, which is apparently 1.18.3, but it seems to be behind
>>> whatever you are using. I'll have the full source (with the modifications)
>>> available for you as soon as Dropbox finishes syncing everything to my
>>> laptop (I went with a clean install, so it may be a few hours...). Until
>>> then, I've attached the Xorg.0.log, if you're interested.
>>>
>>> For whatever reason it did not care for the "RenderPath" and
>>> "RenderDriver" options.
>>>
>>> Ah, and one line needed to be added to 'driver.c' to get it to compile:
>>> "#include <gbm.h>". Aside from that, once all the dependencies were
>>> satisfied, it compiled fine.
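>>>
>>> (In case it helps anyone following along: the header comes from Mesa's
>>> libgbm, so I made sure the dev package was installed first; this assumes
>>> Mint uses the same package name as Ubuntu.)
>>>
>>>     sudo apt-get install libgbm-dev
>>>     pkg-config --cflags gbm    # sanity check that gbm.h can be found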
>>>
>>> Many Thanks,
>>> Ryan
>>>
>>>
>>>
>>> On 07/21/2016 05:02 AM, Yu, Qiang wrote:
>>>> Hi Ryan,
>>>>
>>>> That's interesting. I also want to do this, but in a different way: a
>>>> single modesetting DDX that displays on the iGPU and renders on the dGPU.
>>>>
>>>> Attached is the prototype patch; the xorg.conf should be:
>>>>
>>>> Section "ServerFlags"
>>>>     Option "AutoAddGPU" "off"
>>>> EndSection
>>>>
>>>> Section "Device"
>>>>     Identifier "Intel"
>>>>     Driver "modesetting"
>>>>     Option "RenderPath" "radeon"
>>>>     Option "RenderDriver" "radeonsi"
>>>> EndSection
>>>>
>>>> Regards,
>>>> Qiang
>>>>
>>>> ________________________________________
>>>> From: xorg-driver-ati <xorg-driver-ati-bounces at lists.x.org> on behalf of Ryan Ross <lightknight.rr at gmail.com>
>>>> Sent: Thursday, July 21, 2016 11:49:32 AM
>>>> To: xorg-driver-ati at lists.x.org
>>>> Subject: Making the discrete AMD GPU the default GPU in a Mux-less setup with an integrated Intel GPU in Mint 18 (Xenial)
>>>>
>>>> Greetings,
>>>>
>>>> I have a Toshiba Satellite P55T-B5340 that I am attempting to persuade
>>>> to boot X using the discrete GPU, and only the discrete GPU. It's been a
>>>> lot of fun, because the machine is mux-less (there is no hardware switch,
>>>> and vga_switcheroo seems only to crash it), and the GPU I want to use has
>>>> no dedicated outputs (so it somehow has to steal them from the integrated
>>>> GPU, using magic, as far as I can tell...). I've attempted to adapt the
>>>> Nvidia approach, which sadly only appears to work for glxinfo; attempts
>>>> to use "modesetting", screens, and AllowEmptyInitialConfiguration appear
>>>> to have no (positive) effect on X, as it really wants this GPU to be
>>>> hooked up to some form of output (or it will unload the module). On the
>>>> upside, the discrete GPU is quite happy with the 'radeon' driver.
>>>>
>>>> Additional Information:
>>>>
>>>> The iGPU is an Intel HD Graphics 4600 (the CPU is an i7-4710HQ).
>>>> The dGPU is an AMD/ATI Venus Pro (Radeon HD 8850M / R9 M265X).
>>>>
>>>> Output from xrandr --listproviders is as follows:
>>>> Providers: number : 3
>>>> Provider 0: id: 0x6d cap: 0x9, Source Output, Sink Offload crtcs: 4
>>>> outputs: 4 associated providers: 2 name:Intel
>>>> Provider 1: id: 0x45 cap: 0x6, Sink Output, Source Offload crtcs: 6
>>>> outputs: 0 associated providers: 2 name:VERDE @ pci:0000:01:00.0
>>>> Provider 2: id: 0x45 cap: 0x6, Sink Output, Source Offload crtcs: 6
>>>> outputs: 0 associated providers: 2 name:VERDE @ pci:0000:01:00.0
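>>>>
>>>> (For comparison, with the capabilities above the usual per-application
>>>> PRIME offload route would be roughly the following; what I'm after,
>>>> though, is making the VERDE the default for everything, not per-app
>>>> offload.)
>>>>
>>>>     xrandr --setprovideroffloadsink "VERDE @ pci:0000:01:00.0" Intel
>>>>     DRI_PRIME=1 glxinfo | grep "OpenGL renderer"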
>>>>
>>>> Pages I've looked at, solutions I've tried:
>>>>
>>>> https://wiki.archlinux.org/index.php/PRIME
>>>> https://www.mankier.com/5/xorg.conf#Outputclass_Section
>>>> http://us.download.nvidia.com/XFree86/Linux-x86/364.15/README/randr14.html
>>>>
>>>> and a search through both Google and this mailing list's archives for
>>>> 'discrete gpu', which turned up some interesting things, but nothing
>>>> applicable.
>>>>
>>>> Does anyone have some magic glitter to spare, or perhaps some working
>>>> knowledge of how I might get this beast into the proper configuration?
>>>>
>>>> Many Thanks,
>>>> Ryan
>>>>
>>>>
>>>> _______________________________________________
>>>> xorg-driver-ati mailing list
>>>> xorg-driver-ati at lists.x.org
>>>> https://lists.x.org/mailman/listinfo/xorg-driver-ati