Intel integrated GPU used instead of NVIDIA GTX 1650 for any game

I've recently installed Garuda KDE Dr460nized Gamer on my laptop, dual booting with Windows on the side.

When I'm playing any Steam game, it runs on the Intel iGPU instead of my dedicated GPU, which makes it perform much worse than it should. I've tried Team Fortress 2 (which runs on OpenGL) and Borderlands 2 (unsure which API it uses).

So far, I've tried to:

  • auto-install the proprietary drivers using the Garuda Settings Manager
  • manually uninstall video-nvidia-prime-renderer-offload and install video-nvidia-dkms instead, using the Garuda Settings Manager
  • check the UEFI for a setting that might have disabled the NVIDIA GPU (there was none, but the UEFI did detect and list the GPU alongside the Intel iGPU)
  • edit my /etc/mkinitcpio.conf file to replace "nouveau" with "nvidia" (relevant contents of the file below)
  • I've also considered following this guide, but I figured it'd be wiser not to blindly paste text into what are probably system files
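For reference, here's how one can double-check which kernel driver actually binds each GPU and which renderer answers GL queries. The exact commands are assumptions about a standard Arch/Garuda setup (`lspci` is in pciutils, `glxinfo` in mesa-utils); they're shown commented because they need the real hardware:

```shell
# Run these on the affected machine:
#   lspci -k | grep -EA3 'VGA|3D'      # which kernel driver binds each GPU
#   glxinfo | grep "OpenGL renderer"   # default GL renderer (Intel, here)
# Per the inxi output below, the dGPU's driver line should read:
expected="Kernel driver in use: nvidia"
echo "$expected"
# A "nouveau" or missing driver line would point at a driver problem instead.
```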


Kernel: 5.19.6-zen1-1-zen arch: x86_64 bits: 64 compiler: gcc v: 12.2.0
parameters: BOOT_IMAGE=/@/boot/vmlinuz-linux-zen
root=UUID=c5ea7d35-48e9-4936-abf3-cc1ba8dcc8cf rw [email protected]
quiet quiet splash rd.udev.log_priority=3 vt.global_cursor_default=0
Desktop: KDE Plasma v: 5.25.4 tk: Qt v: 5.15.5 info: latte-dock
wm: kwin_x11 vt: 1 dm: SDDM Distro: Garuda Linux base: Arch Linux
Type: Laptop System: ASUSTeK product: ASUS TUF Gaming F15 FX506LH_TUF566LH
v: 1.0 serial: <superuser required>
Mobo: ASUSTeK model: FX506LH v: 1.0 serial: <superuser required>
UEFI: American Megatrends v: FX506LH.310 date: 11/26/2021
ID-1: BAT1 charge: 36.0 Wh (85.3%) condition: 42.2/48.1 Wh (87.9%)
volts: 12.8 min: 11.7 model: ASUS A32-K55 type: Li-ion serial: N/A
status: charging
Info: model: Intel Core i5-10300H bits: 64 type: MT MCP arch: Comet Lake
gen: core 10 built: 2020 process: Intel 14nm family: 6 model-id: 0xA5 (165)
stepping: 2 microcode: 0xF0
Topology: cpus: 1x cores: 4 tpc: 2 threads: 8 smt: enabled cache:
L1: 256 KiB desc: d-4x32 KiB; i-4x32 KiB L2: 1024 KiB desc: 4x256 KiB
L3: 8 MiB desc: 1x8 MiB
Speed (MHz): avg: 2737 high: 4400 min/max: 800/4500 scaling:
driver: intel_pstate governor: performance cores: 1: 2500 2: 4400 3: 2500
4: 2500 5: 2500 6: 2500 7: 2500 8: 2500 bogomips: 39999
Flags: avx avx2 ht lm nx pae sse sse2 sse3 sse4_1 sse4_2 ssse3 vmx
Type: itlb_multihit status: KVM: VMX disabled
Type: l1tf status: Not affected
Type: mds status: Not affected
Type: meltdown status: Not affected
Type: mmio_stale_data mitigation: Clear CPU buffers; SMT vulnerable
Type: retbleed mitigation: Enhanced IBRS
Type: spec_store_bypass mitigation: Speculative Store Bypass disabled via
Type: spectre_v1 mitigation: usercopy/swapgs barriers and __user pointer
Type: spectre_v2 mitigation: Enhanced IBRS, IBPB: conditional, RSB
filling, PBRSB-eIBRS: SW sequence
Type: srbds mitigation: Microcode
Type: tsx_async_abort status: Not affected
Device-1: Intel CometLake-H GT2 [UHD Graphics] vendor: ASUSTeK driver: i915
v: kernel arch: Gen-9.5 process: Intel 14nm built: 2016-20 ports:
active: eDP-1 empty: none bus-ID: 00:02.0 chip-ID: 8086:9bc4
class-ID: 0300
Device-2: NVIDIA TU117M vendor: ASUSTeK driver: nvidia v: 515.65.01
alternate: nouveau,nvidia_drm non-free: 515.xx+ status: current (as of
2022-07) arch: Turing code: TUxxx process: TSMC 12nm built: 2018-22 pcie:
gen: 1 speed: 2.5 GT/s lanes: 16 link-max: gen: 3 speed: 8 GT/s
bus-ID: 01:00.0 chip-ID: 10de:1f99 class-ID: 0300
Device-3: Sonix USB2.0 HD UVC WebCam type: USB driver: uvcvideo
bus-ID: 1-7:2 chip-ID: 322e:202c class-ID: 0e02
Display: x11 server: X.Org v: 21.1.4 with: Xwayland v: 22.1.3
compositor: kwin_x11 driver: X: loaded: modesetting,nvidia
unloaded: nouveau alternate: fbdev,intel,nv,vesa gpu: i915 display-ID: :0
screens: 1
Screen-1: 0 s-res: 1920x1080 s-dpi: 96 s-size: 508x285mm (20.00x11.22")
s-diag: 582mm (22.93")
Monitor-1: eDP-1 model: Najing CEC Panda 0x004d built: 2019
res: 1920x1080 hz: 144 dpi: 142 gamma: 1.2 size: 344x194mm (13.54x7.64")
diag: 395mm (15.5") ratio: 16:9 modes: 1920x1080
OpenGL: renderer: Mesa Intel UHD Graphics (CML GT2) v: 4.6 Mesa 22.1.7
direct render: Yes
Device-1: Intel Comet Lake PCH cAVS vendor: ASUSTeK driver: snd_hda_intel
v: kernel alternate: snd_soc_skl,snd_sof_pci_intel_cnl bus-ID: 00:1f.3
chip-ID: 8086:06c8 class-ID: 0403
Device-2: NVIDIA vendor: ASUSTeK driver: snd_hda_intel v: kernel pcie:
gen: 1 speed: 2.5 GT/s lanes: 16 link-max: gen: 3 speed: 8 GT/s
bus-ID: 01:00.1 chip-ID: 10de:10fa class-ID: 0403
Sound Server-1: ALSA v: k5.19.6-zen1-1-zen running: yes
Sound Server-2: PulseAudio v: 16.1 running: no
Sound Server-3: PipeWire v: 0.3.56 running: yes
Device-1: MEDIATEK MT7921 802.11ax PCI Express Wireless Network Adapter
vendor: AzureWave driver: mt7921e v: kernel pcie: gen: 2 speed: 5 GT/s
lanes: 1 bus-ID: 03:00.0 chip-ID: 14c3:7961 class-ID: 0280
IF: wlp3s0 state: up mac: <filter>
Device-2: Realtek RTL8111/8168/8411 PCI Express Gigabit Ethernet
vendor: ASUSTeK driver: r8169 v: kernel pcie: gen: 1 speed: 2.5 GT/s
lanes: 1 port: 3000 bus-ID: 04:00.0 chip-ID: 10ec:8168 class-ID: 0200
IF: enp4s0 state: down mac: <filter>
Device-1: IMC Networks Wireless_Device type: USB driver: btusb v: 0.8
bus-ID: 1-14:3 chip-ID: 13d3:3563 class-ID: e001 serial: <filter>
Report: bt-adapter ID: hci0 rfk-id: 0 state: up address: <filter>
Local Storage: total: 476.94 GiB used: 124.53 GiB (26.1%)
SMART Message: Unable to run smartctl. Root privileges required.
ID-1: /dev/nvme0n1 maj-min: 259:0 vendor: Micron
model: 2210 MTFDHBA512QFD size: 476.94 GiB block-size: physical: 512 B
logical: 512 B speed: 31.6 Gb/s lanes: 4 type: SSD serial: <filter>
rev: P6MA001 temp: 37.9 C scheme: GPT
ID-1: / raw-size: 394.3 GiB size: 394.3 GiB (100.00%) used: 124.5 GiB
(31.6%) fs: btrfs dev: /dev/nvme0n1p6 maj-min: 259:6
ID-2: /boot/efi raw-size: 260 MiB size: 256 MiB (98.46%) used: 25.9 MiB
(10.1%) fs: vfat dev: /dev/nvme0n1p1 maj-min: 259:1
ID-3: /home raw-size: 394.3 GiB size: 394.3 GiB (100.00%) used: 124.5 GiB
(31.6%) fs: btrfs dev: /dev/nvme0n1p6 maj-min: 259:6
ID-4: /var/log raw-size: 394.3 GiB size: 394.3 GiB (100.00%) used: 124.5
GiB (31.6%) fs: btrfs dev: /dev/nvme0n1p6 maj-min: 259:6
ID-5: /var/tmp raw-size: 394.3 GiB size: 394.3 GiB (100.00%) used: 124.5
GiB (31.6%) fs: btrfs dev: /dev/nvme0n1p6 maj-min: 259:6
Kernel: swappiness: 133 (default 60) cache-pressure: 100 (default)
ID-1: swap-1 type: zram size: 7.6 GiB used: 1.74 GiB (22.9%)
priority: 100 dev: /dev/zram0
System Temperatures: cpu: 62.0 C pch: 56.0 C mobo: N/A
Fan Speeds (RPM): cpu: 0
Processes: 291 Uptime: 48m wakeups: 1 Memory: 7.6 GiB used: 4.2 GiB (55.3%)
Init: systemd v: 251 default: graphical tool: systemctl Compilers:
gcc: 12.2.0 Packages: pacman: 1957 lib: 554 Shell: fish v: 3.5.1
default: Bash v: 5.1.16 running-in: konsole inxi: 3.3.20
Garuda (2.6.6-1):
System install date:     2022-08-30
Last full system update: 2022-09-03
Is partially upgraded:   No
Relevant software:       NetworkManager
Windows dual boot:       Probably (Run as root to verify)
Snapshots:               Snapper
Failed units:

I'm pretty sure the "NVIDIA TU117M" is my GTX 1650; TU117 is the chip codename, so inxi seems to report the chip rather than the retail name.

First section (not the full file) of /etc/mkinitcpio.conf:

# vim:set ft=sh
# The following modules are loaded before any boot hooks are
# run.  Advanced users may wish to specify all system modules
# in this array.  For instance:
#     MODULES=(crc32c-intel nvidia nvidia_modeset nvidia_uvm nvidia_drm)
MODULES=(crc32c-intel nvidia nvidia_modeset nvidia_uvm nvidia_drm)
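One step that's easy to miss after editing this file: the initramfs has to be regenerated, or the new MODULES list never takes effect on the next boot. A minimal sketch, assuming the standard Arch mkinitcpio workflow (the real-system commands are commented since they need root and the actual file):

```shell
# After saving /etc/mkinitcpio.conf, rebuild the initramfs and reboot:
#   sudo mkinitcpio -P    # -P regenerates images for all kernel presets
# Sanity-check the saved line; on the real system this would be:
#   grep '^MODULES=' /etc/mkinitcpio.conf
sample='MODULES=(crc32c-intel nvidia nvidia_modeset nvidia_uvm nvidia_drm)'
printf '%s\n' "$sample" | grep -q 'nvidia_drm' && echo "nvidia_drm listed"
```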


Sat Sep  3 10:44:32 2022
| NVIDIA-SMI 515.65.01    Driver Version: 515.65.01    CUDA Version: 11.7     |
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|   0  NVIDIA GeForce ...  Off  | 00000000:01:00.0 Off |                  N/A |
| N/A   54C    P8     1W /  N/A |      6MiB /  4096MiB |      0%      Default |
|                               |                      |                  N/A |

| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|    0   N/A  N/A       672      G   /usr/lib/Xorg                       4MiB |

The above output doesn't change when I run nvidia-smi while a game is running.


glxheads: exercise multiple GLX connections (any key = exit)
glxheads xdisplayname ...
glxheads :0 mars:0 venus:1
Name: :0
Display:     0x56539ddf81b0
Window:      0x6e00002
Context:     0x56539de1b490
GL_VERSION:  4.6 (Compatibility Profile) Mesa 22.1.7
GL_VENDOR:   Intel
GL_RENDERER: Mesa Intel(R) UHD Graphics (CML GT2)
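The renderer string above confirms that, by default, GL contexts land on the Intel iGPU. On an Optimus laptop with the proprietary driver that is normal: offload has to be requested per process. A way to verify it, assuming Garuda ships the `prime-run` wrapper from Arch's nvidia-prime package (the command is commented since it needs the NVIDIA driver loaded):

```shell
# On the affected machine:
#   prime-run glxinfo | grep "OpenGL renderer"
# If offload works, the renderer names the NVIDIA GPU instead of:
default_renderer="Mesa Intel(R) UHD Graphics (CML GT2)"
echo "$default_renderer"
```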

Being new to Garuda and Linux in general, I'm unsure what other useful info I can show you. I've also tried running games from the Heroic Games Launcher, but they crash instantly, which is probably a completely different issue and likely unrelated to Garuda.

Hi there, welcome to the forum!
Check if this helps.


Hey, thanks, that fixed it in an instant. I really just needed to add prime-run %command% to the Steam game's launch options. It just works.
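For anyone landing here later: prime-run is a small wrapper script, and my understanding (a sketch; check /usr/bin/prime-run for the exact contents) is that it just exports NVIDIA's PRIME render-offload variables before exec'ing the command:

```shell
# Roughly what prime-run does: export the render-offload variables so that
# only the launched process uses the NVIDIA GPU (for both GLX and Vulkan).
__NV_PRIME_RENDER_OFFLOAD=1
__GLX_VENDOR_LIBRARY_NAME=nvidia
__VK_LAYER_NV_optimus=NVIDIA_only
export __NV_PRIME_RENDER_OFFLOAD __GLX_VENDOR_LIBRARY_NAME __VK_LAYER_NV_optimus
# exec "$@"   # e.g. the game binary, or %command% when set via Steam
echo "offload variables exported"
```

In Steam this goes under Properties → Launch Options as `prime-run %command%`.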


Glad it helped.
And thanks for the complete and well formatted first post.
Unfortunately this is quite unusual :blush:


This topic was automatically closed 2 days after the last reply. New replies are no longer allowed.