Don't know how to switch primary GPU

Hello everyone,

I'm totally new to Garuda Linux.
My PC has an uncommon setup of two GPUs: an old GTX 780 and a newer RX 5700 XT. The AMD GPU is primarily used for gaming and such, and the GTX 780 only for OBS Studio to render the stream (just for fun, no big deal), because the RX 5700 XT is too weak to game at 4K and render the stream at the same time.
My problem now is: how can I switch the primary GPU under Garuda?
Both are installed, but by default the NVIDIA GPU is used for everything.
I clicked a bit through Optimus Manager, but can't really find a way to switch it in the GUI. I also tried running Valheim; it works, but the performance is very poor, which might be down to the wrong GPU being used.
Do you have any idea how I can get the AMD GPU working as the primary?

inxi -zaF

System:    Kernel: 5.14.12-zen1-1-zen x86_64 bits: 64 compiler: gcc v: 11.1.0  
parameters: BOOT_IMAGE=/@/boot/vmlinuz-linux-zen  
root=UUID=3a22bd02-3058-4c8c-98aa-4ed76cc588a4 rw rootflags=subvol=@ quiet splash
rd.udev.log_priority=3 vt.global_cursor_default=0 systemd.unified_cgroup_hierarchy=1
resume=UUID=12518e0a-eec7-4823-95e2-d7c2124d91bc loglevel=3
Desktop: KDE Plasma 5.22.5 tk: Qt 5.15.2 info: latte-dock wm: kwin_x11 vt: 1 dm: SDDM  
Distro: Garuda Linux base: Arch Linux  
Machine:   Type: Desktop Mobo: ASRock model: Z590 Extreme serial: <filter>  
UEFI: American Megatrends LLC. v: P1.90 date: 07/22/2021  
CPU:       Info: 10-Core model: Intel Core i9-10900F bits: 64 type: MT MCP arch: Comet Lake family: 6  
model-id: A5 (165) stepping: 5 microcode: EC cache: L2: 20 MiB  
flags: avx avx2 lm nx pae sse sse2 sse3 sse4_1 sse4_2 ssse3 vmx bogomips: 111997  
Speed: 4788 MHz min/max: 800/5200 MHz Core speeds (MHz): 1: 4788 2: 4800 3: 4800 4: 4800  
5: 4679 6: 4800 7: 4794 8: 4804 9: 4801 10: 4801 11: 4798 12: 4805 13: 4800 14: 4800 15: 4708  
16: 4797 17: 4736 18: 4747 19: 4808 20: 4799  
Vulnerabilities: Type: itlb_multihit status: KVM: VMX disabled  
Type: l1tf status: Not affected  
Type: mds status: Not affected  
Type: meltdown status: Not affected  
Type: spec_store_bypass mitigation: Speculative Store Bypass disabled via prctl and seccomp  
Type: spectre_v1 mitigation: usercopy/swapgs barriers and __user pointer sanitization  
Type: spectre_v2 mitigation: Enhanced IBRS, IBPB: conditional, RSB filling  
Type: srbds status: Not affected  
Type: tsx_async_abort status: Not affected  
Graphics:  Device-1: AMD Navi 10 [Radeon RX 5600 OEM/5600 XT / 5700/5700 XT] vendor: Sapphire Limited  
driver: amdgpu v: kernel bus-ID: 03:00.0 chip-ID: 1002:731f class-ID: 0300  
Device-2: NVIDIA GK110 [GeForce GTX 780] driver: nvidia v: 470.74 alternate: nouveau  
bus-ID: 04:00.0 chip-ID: 10de:1004 class-ID: 0300  
Device-3: Elgato Systems Cam Link 4K type: USB  
driver: hid-generic,snd-usb-audio,usbhid,uvcvideo bus-ID: 2-4:3 chip-ID: 0fd9:0066  
class-ID: 0102 serial: <filter>  
Display: x11 server: X.Org 1.20.13 compositor: kwin_x11 driver: loaded: modesetting,nvidia  
display-ID: :0 screens: 1  
Screen-1: 0 s-res: 7680x2160 s-dpi: 96 s-size: 2030x571mm (79.9x22.5") s-diag: 2109mm (83")  
Monitor-1: DP-1 res: 3840x2160 hz: 60 dpi: 161 size: 607x345mm (23.9x13.6")  
diag: 698mm (27.5")  
Monitor-2: DVI-I-1-1 res: 1920x1080 hz: 60 dpi: 102 size: 477x268mm (18.8x10.6")  
diag: 547mm (21.5")  
Monitor-3: DP-1-0 res: 1920x1080 hz: 60 dpi: 102 size: 478x269mm (18.8x10.6")  
diag: 548mm (21.6")  
OpenGL: renderer: AMD Radeon RX 5700 XT (NAVI10 DRM 3.42.0 5.14.12-zen1-1-zen LLVM 12.0.1)  
v: 4.6 Mesa 21.2.3 direct render: Yes

mhwd -li

Installed PCI configs:
--------------------------------------------------------------------------------
NAME               VERSION          FREEDRIVER           TYPE
--------------------------------------------------------------------------------
video-linux            2021.08.29                true            PCI
video-optimus-manager            2021.08.29               false            PCI

Warning: No installed USB configs!

inxi -Fxxxza | grep loaded

Display: x11 server: X.Org 1.20.13 compositor: kwin_x11 driver: loaded: modesetting,nvidia

Which one is your monitor connected to? That's your primary GPU.
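To check what X currently considers the primary GPU (assuming an X11 session, which the inxi dump above shows), something like this should tell you:

```shell
# Provider 0 in this list is the GPU X treats as primary.
xrandr --listproviders

# Which GPU does GL rendering by default (glxinfo is in mesa-utils):
glxinfo | grep "OpenGL renderer"
```

If the renderer line names the GTX 780, your games are landing on the NVIDIA card.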


Normally the two FHD monitors are connected to the GTX 780 and the 4K monitor is connected to the 5700 XT.
That was my starting setup: the 4K monitor was connected to the Radeon while the others were connected to the 780. With this setup, the GeForce was the primary rendering GPU.
I reinstalled all the AMD drivers yesterday, and now the RX 5700 XT is the primary GPU, but I can't use the 780 for monitor output anymore; all three monitors have to be connected to the AMD GPU to be recognized by the OS.

I also run multi-GPU, and have since maybe the mid 1990s? Think 3 ISA cards, then 2 ISA and 1 PCI, then 2 PCI and 1 AGP... you get the picture ;).

Multi-GPU has become super broken on Linux as a whole. I've searched high and low for why the kernel "seemingly" started enumerating things backwards (or defaulting to the last card found) and why virtually every DE has broken XScreen enumeration.

I looked over your dump and can't really tell how you have things set up. Here are a few lame suggestions/questions until you can toss in some more info.

#1 Have you tried powering it off and on again? JOKING!
#1 (for real): Have you tried, or can you, swapping the PCIe slots? As I said, things seem to have gone backwards in the last two years. Once upon a time my primary PCIe slot was the default; now it's a second-class citizen, i.e. my older GPU always becomes the default. As long as your board divides the lanes up anyway (meaning your 16x slot becomes 8x as soon as you add a second GPU), you might as well swap their slots and see if the system behaves as you expect.

#2 How are you configuring things? Are you writing an xorg.conf, or are you setting things up with arandr/xrandr? I've figured out how to do multi-GPU with xrandr (Wayland-like), but the overhead is INSANE and the syntax for the newly added "providers" is vague. A tad off topic, but if you're using an xrandr setup you might also be gimping your performance in general. Your dump says you're on X rather than Wayland, so you're best off going the xorg.conf route.
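For your specific machine, a minimal split-GPU xorg.conf sketch might look like the following (BusIDs converted from the inxi dump above: AMD at 03:00.0, NVIDIA at 04:00.0; the identifiers and the layout are assumptions you'd adapt):

```
Section "Device"
    Identifier "AMD"
    Driver     "amdgpu"
    BusID      "PCI:3:0:0"    # 03:00.0 in the inxi dump
EndSection

Section "Device"
    Identifier "NVIDIA"
    Driver     "nvidia"
    BusID      "PCI:4:0:0"    # 04:00.0 in the inxi dump
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "AMD"          # screen 0 = primary
EndSection

Section "Screen"
    Identifier "Screen1"
    Device     "NVIDIA"
EndSection

Section "ServerLayout"
    Identifier "Layout0"
    Screen 0 "Screen0"
    Screen 1 "Screen1" RightOf "Screen0"
EndSection
```

One gotcha: xorg.conf BusID values are decimal, so a card lspci shows at bus 0a would be "PCI:10:0:0".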

#3 How are you launching things? (Semi-contingent on #2.) The xrandr style doesn't really let you define which GPU is primary or who does what, which adds tons of overhead (it kind of makes an SLI setup with mismatched hardware, and just makes a mess). If you have an xorg.conf, primary can be defined by simply ensuring all the GPU > screens you want to be primary are screen 0. Then you launch everything you want on a secondary GPU with DISPLAY=:0.1 (or :0.2, :0.3, etc.) in front of the program name.
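To make that concrete, a sketch of launching onto separate X screens when your xorg.conf puts the GPU you want primary on screen 0 (obs and steam here are just example programs):

```shell
# Everything defaults to the primary X screen, :0.0.
# Point anything you want on the second GPU's screen at :0.1.
DISPLAY=:0.1 obs &       # OBS renders on X screen 1 (the secondary GPU)
DISPLAY=:0.0 steam &     # games stay on X screen 0 (the primary GPU)
```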

A lot of tools/applications will also have a --display option (though while the vast majority still have it, in many it no longer works). When I log in, I have a script that sets everything up for me and runs all my usual applications on the correct GPU/screen/workspace, etc. You need to get a tad brutalist about defining who does what, and where, with multi-GPU.

I don't know for sure (as no one ever answers the question), but I'm guessing this is all poor transitional decision-making in the hope that Wayland isn't a steaming pile... However, speaking to one of the Wayland devs made me really think it's going to be a small nightmare of wasted GPU resources. From what was explained to me, every GPU other than the main one will be useless apart from its physical outputs, meaning you can't offload processing, only pipe outputs to other places... what a waste. I hope I misunderstood, because that seems terrible compared to a nice split-GPU xorg.conf.

In a perfect world I'd buy a GPU that did what I needed, but not even the $5000+ Radeon W6800 can run all my screens. And why the hell would anyone pay $5k+ when what used to be two or three $200-300 GPUs can run more screens, and technically better, due to resource segregation! I really hope whatever this trend is of breaking something that's worked for over 20 years ends!
rm -rf --nuclear Wayland Pipewire :wink:

As an addendum: you mentioned using the Optimus tool, but this isn't an Optimus setup. The NVIDIA driver panel will make an xorg.conf for the NVIDIA GPU, but you will have to write in the AMD bit yourself (well, unless there is some AMD equivalent I've never seen). My last Radeon GPU was an R9 270X, which sucked and never really had decent support, as AMD basically killed its driver support right as I bought the damned thing.