Unable to enable Wayland on GNOME

Hi guys, I'm on Garuda GNOME on a laptop with both an AMD GPU and an NVIDIA GPU. I would like to use Wayland, partly because I want to run Android apps on my PC with Waydroid, and partly because I want to try Wayland again after a long time away from it.

Before writing here I already searched your forums and followed all the guides I found. In detail, I added the kernel parameter nvidia-drm.modeset=1, enabled the "Enable GDM Wayland" flag, and added the 3-second delay by editing the GDM service.
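For reference, the 3-second delay mentioned above is the commonly circulated workaround of delaying GDM startup so the NVIDIA driver has time to finish initializing. A sketch of such a drop-in, created with `systemctl edit gdm.service` (the exact delay value is whatever works for your setup):

```
[Service]
ExecStartPre=/usr/bin/sleep 3
```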

No solution worked; neofetch still reports "Mutter (X11)". Even when I try to start Waydroid, it too reports that I am not on Wayland.
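A quick way to double-check which session type is actually running (assuming a systemd-based graphical session, where the login manager sets `XDG_SESSION_TYPE`):

```shell
# Print the session type of the current graphical session:
# "wayland" or "x11". Falls back to "unknown" if the variable
# is not set (e.g. when run from a bare TTY).
echo "${XDG_SESSION_TYPE:-unknown}"
```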

How can I fix this? (If it helps to know: I currently use optimus-manager to switch to the integrated GPU only, because with NVIDIA PRIME my dGPU stays powered even when unused and draws several watts. I also use a second external monitor connected over HDMI.)
(I also tried Weston, but apps become difficult to use when they switch to portrait mode.)

This is my garuda-inxi:

 ╭─daniele@daniele in ~ took 645ms
 ╰─λ garuda-inxi
System:
  Kernel: 6.4.1-zen2-1-zen arch: x86_64 bits: 64 compiler: gcc v: 13.1.1
    parameters: BOOT_IMAGE=/@/boot/vmlinuz-linux-zen
    root=UUID=1af7acc8-474a-4dde-82cf-0219dd70ce0e rw rootflags=subvol=@
    quiet splash rd.udev.log_priority=3 vt.global_cursor_default=0 loglevel=3
    nvidia-drm.modeset=1 amd_iommu=off ibt=off
  Desktop: GNOME v: 44.2 tk: GTK v: 3.24.38 wm: gnome-shell dm: GDM v: 44.1
    Distro: Garuda Linux base: Arch Linux
Machine:
  Type: Laptop System: LENOVO product: 82B1 v: Lenovo Legion 5 15ARH05H
    serial: <superuser required> Chassis: type: 10 v: Lenovo Legion 5 15ARH05H
    serial: <superuser required>
  Mobo: LENOVO model: LNVNB161216 v: NO DPK serial: <superuser required>
    UEFI: LENOVO v: FSCN24WW date: 04/14/2022
Battery:
  ID-1: BAT0 charge: 71.8 Wh (100.0%) condition: 71.8/80.0 Wh (89.8%)
    volts: 17.3 min: 15.4 model: Celxpert L19C4PC1 type: Li-poly
    serial: <filter> status: full cycles: 150
CPU:
  Info: model: AMD Ryzen 7 4800H with Radeon Graphics bits: 64 type: MT MCP
    arch: Zen 2 gen: 3 level: v3 note: check built: 2020-22
    process: TSMC n7 (7nm) family: 0x17 (23) model-id: 0x60 (96) stepping: 1
    microcode: 0x8600104
  Topology: cpus: 1x cores: 8 tpc: 2 threads: 16 smt: enabled cache:
    L1: 512 KiB desc: d-8x32 KiB; i-8x32 KiB L2: 4 MiB desc: 8x512 KiB L3: 8 MiB
    desc: 2x4 MiB
  Speed (MHz): avg: 1434 high: 1617 min/max: 1400/2900 boost: enabled
    scaling: driver: acpi-cpufreq governor: schedutil cores: 1: 1397 2: 1400
    3: 1397 4: 1411 5: 1617 6: 1400 7: 1397 8: 1400 9: 1567 10: 1397 11: 1433
    12: 1531 13: 1400 14: 1400 15: 1400 16: 1400 bogomips: 92635
  Flags: avx avx2 ht lm nx pae sse sse2 sse3 sse4_1 sse4_2 sse4a ssse3 svm
  Vulnerabilities: <filter>
Graphics:
  Device-1: NVIDIA TU106M [GeForce RTX 2060 Mobile] vendor: Lenovo
    driver: nvidia v: 535.54.03 alternate: nouveau,nvidia_drm non-free: 530.xx+
    status: current (as of 2023-05) arch: Turing code: TUxxx
    process: TSMC 12nm FF built: 2018-22 pcie: gen: 1 speed: 2.5 GT/s lanes: 8
    link-max: gen: 3 speed: 8 GT/s lanes: 16 ports: active: none off: HDMI-A-1
    empty: DP-1,eDP-2 bus-ID: 01:00.0 chip-ID: 10de:1f15 class-ID: 0300
  Device-2: AMD Renoir vendor: Lenovo driver: amdgpu v: kernel arch: GCN-5
    code: Vega process: GF 14nm built: 2017-20 pcie: gen: 4 speed: 16 GT/s
    lanes: 16 ports: active: eDP-1 empty: none bus-ID: 06:00.0
    chip-ID: 1002:1636 class-ID: 0300 temp: 52.0 C
  Device-3: IMC Networks Integrated Camera driver: uvcvideo type: USB
    rev: 2.0 speed: 480 Mb/s lanes: 1 mode: 2.0 bus-ID: 3-3:2 chip-ID: 13d3:56ff
    class-ID: 0e02
  Display: x11 server: X.Org v: 21.1.8 with: Xwayland v: 23.1.2
    compositor: gnome-shell driver: X: loaded: modesetting,nvidia dri: radeonsi
    gpu: amdgpu,nvidia,nvidia-nvswitch display-ID: :1 screens: 1
  Screen-1: 0 s-res: 3840x1080 s-dpi: 96 s-size: 1016x286mm (40.00x11.26")
    s-diag: 1055mm (41.55")
  Monitor-1: HDMI-A-1 mapped: HDMI-1-0 note: disabled pos: right
    model: Acer R240Y serial: <filter> built: 2019 res: 1920x1080 hz: 75 dpi: 93
    gamma: 1.2 size: 527x296mm (20.75x11.65") diag: 604mm (23.8") ratio: 16:9
    modes: max: 1920x1080 min: 640x480
  Monitor-2: eDP-1 pos: primary,left model: AU Optronics 0xd1ed built: 2019
    res: 1920x1080 hz: 120 dpi: 142 gamma: 1.2 size: 344x193mm (13.54x7.6")
    diag: 394mm (15.5") ratio: 16:9 modes: 1920x1080
  API: OpenGL v: 4.6 Mesa 23.1.3 renderer: AMD Radeon Graphics (renoir LLVM
    15.0.7 DRM 3.52 6.4.1-zen2-1-zen) direct-render: Yes
Audio:
  Device-1: NVIDIA TU106 High Definition Audio vendor: Lenovo
    driver: snd_hda_intel v: kernel pcie: gen: 1 speed: 2.5 GT/s lanes: 8
    link-max: gen: 3 speed: 8 GT/s lanes: 16 bus-ID: 01:00.1
    chip-ID: 10de:10f9 class-ID: 0403
  Device-2: AMD ACP/ACP3X/ACP6x Audio Coprocessor vendor: Lenovo driver: N/A
    alternate: snd_pci_acp3x, snd_rn_pci_acp3x, snd_pci_acp5x, snd_pci_acp6x,
    snd_acp_pci, snd_rpl_pci_acp6x, snd_pci_ps, snd_sof_amd_renoir,
    snd_sof_amd_rembrandt pcie: gen: 4 speed: 16 GT/s lanes: 16
    bus-ID: 06:00.5 chip-ID: 1022:15e2 class-ID: 0480
  Device-3: AMD Family 17h/19h HD Audio vendor: Lenovo driver: snd_hda_intel
    v: kernel pcie: gen: 4 speed: 16 GT/s lanes: 16 bus-ID: 06:00.6
    chip-ID: 1022:15e3 class-ID: 0403
  API: ALSA v: k6.4.1-zen2-1-zen status: kernel-api tools: N/A
  Server-1: PipeWire v: 0.3.72 status: active with: 1: pipewire-pulse
    status: active 2: wireplumber status: active 3: pipewire-alsa type: plugin
    4: pw-jack type: plugin tools: pactl,pw-cat,pw-cli,wpctl
Network:
  Device-1: Realtek RTL8111/8168/8411 PCI Express Gigabit Ethernet
    vendor: Lenovo driver: r8169 v: kernel pcie: gen: 1 speed: 2.5 GT/s lanes: 1
    port: 2000 bus-ID: 03:00.0 chip-ID: 10ec:8168 class-ID: 0200
  IF: eno1 state: up speed: 1000 Mbps duplex: full mac: <filter>
  Device-2: Intel Wi-Fi 6 AX200 driver: iwlwifi v: kernel pcie: gen: 2
    speed: 5 GT/s lanes: 1 bus-ID: 04:00.0 chip-ID: 8086:2723 class-ID: 0280
  IF: wlp4s0 state: up mac: <filter>
Bluetooth:
  Device-1: Intel AX200 Bluetooth driver: btusb v: 0.8 type: USB rev: 2.0
    speed: 12 Mb/s lanes: 1 mode: 1.1 bus-ID: 5-3:7 chip-ID: 8087:0029
    class-ID: e001
  Report: bt-adapter ID: hci0 rfk-id: 4 state: up address: <filter>
Drives:
  Local Storage: total: 2.29 TiB used: 1.3 TiB (56.7%)
  SMART Message: Required tool smartctl not installed. Check --recommends
  ID-1: /dev/nvme0n1 maj-min: 259:1 vendor: Kingston model: SFYRD2000G
    size: 1.82 TiB block-size: physical: 512 B logical: 512 B speed: 63.2 Gb/s
    lanes: 4 tech: SSD serial: <filter> fw-rev: EIFK31.6 temp: 36.9 C
    scheme: GPT
  ID-2: /dev/nvme1n1 maj-min: 259:0 vendor: Lenovo
    model: UMIS RPITJ512VME2OWD size: 476.94 GiB block-size: physical: 512 B
    logical: 512 B speed: 31.6 Gb/s lanes: 4 tech: SSD serial: <filter>
    fw-rev: 1.4C1908 temp: 47.9 C scheme: GPT
Partition:
  ID-1: / raw-size: 966.8 GiB size: 966.8 GiB (100.00%)
    used: 112.53 GiB (11.6%) fs: btrfs dev: /dev/nvme0n1p5 maj-min: 259:6
  ID-2: /boot/efi raw-size: 100 MiB size: 96 MiB (96.00%)
    used: 29.3 MiB (30.6%) fs: vfat dev: /dev/nvme0n1p1 maj-min: 259:2
  ID-3: /home raw-size: 966.8 GiB size: 966.8 GiB (100.00%)
    used: 112.53 GiB (11.6%) fs: btrfs dev: /dev/nvme0n1p5 maj-min: 259:6
  ID-4: /var/log raw-size: 966.8 GiB size: 966.8 GiB (100.00%)
    used: 112.53 GiB (11.6%) fs: btrfs dev: /dev/nvme0n1p5 maj-min: 259:6
  ID-5: /var/tmp raw-size: 966.8 GiB size: 966.8 GiB (100.00%)
    used: 112.53 GiB (11.6%) fs: btrfs dev: /dev/nvme0n1p5 maj-min: 259:6
Swap:
  Kernel: swappiness: 133 (default 60) cache-pressure: 100 (default)
  ID-1: swap-1 type: zram size: 30.72 GiB used: 0 KiB (0.0%) priority: 100
    dev: /dev/zram0
Sensors:
  System Temperatures: cpu: 72.5 C mobo: N/A gpu: amdgpu temp: 52.0 C
  Fan Speeds (RPM): N/A
Info:
  Processes: 416 Uptime: 21m wakeups: 1 Memory: available: 30.73 GiB
  used: 4.66 GiB (15.2%) Init: systemd v: 253 default: graphical
  tool: systemctl Compilers: gcc: 13.1.1 alt: 11/12 clang: 15.0.7 Packages:
  pm: pacman pkgs: 1520 libs: 430 tools: pamac,paru Shell: fish v: 3.6.1
  default: Bash v: 5.1.16 running-in: gnome-terminal inxi: 3.3.27
Garuda (2.6.16-1):
  System install date:     2023-06-18
  Last full system update: 2023-07-06 ↻
  Is partially upgraded:   No
  Relevant software:       snapper NetworkManager dracut nvidia-dkms
  Windows dual boot:       Probably (Run as root to verify)
  Failed units:            


You also have to actually switch the session type on the GDM login screen. After you enter your username, a gear icon should appear that lets you choose this setting.

This is obviously just a random picture from the internet, but it illustrates the idea well enough:


Yes, I already knew that and I tried, but only "GNOME" and "GNOME Classic" appear in the gear menu.

Check whether you have these NVIDIA-related lines in /usr/lib/udev/rules.d/61-gdm.rules. If you do, comment them out:

# disable Wayland on Hi1710 chipsets
ATTR{vendor}=="0x19e5", ATTR{device}=="0x1711", RUN+="/usr/lib/gdm-runtime-config set daemon WaylandEnable false"
# disable Wayland when using the proprietary nvidia driver
DRIVER=="nvidia", RUN+="/usr/lib/gdm-runtime-config set daemon WaylandEnable false"
# disable Wayland if modesetting is disabled
IMPORT{cmdline}="nomodeset", RUN+="/usr/lib/gdm-runtime-config set daemon WaylandEnable false"

Yes, there are some rules about this, but they are a little different:

# disable Wayland on Hi1710 chipsets
ATTR{vendor}=="0x19e5", ATTR{device}=="0x1711", GOTO="gdm_disable_wayland"

The rule "disable Wayland when using the proprietary nvidia driver" is not present.

# disable wayland if modesetting is disabled
KERNEL!="card[0-9]*", GOTO="gdm_nomodeset_end"
KERNEL=="card[0-9]-*", GOTO="gdm_nomodeset_end"
SUBSYSTEM!="drm", GOTO="gdm_nomodeset_end"
# but keep it enabled for simple framebuffer drivers
DRIVERS=="simple-framebuffer", GOTO="gdm_nomodeset_end"
IMPORT{parent}="GDM_MACHINE_HAS_VIRTUAL_GPU"
ENV{GDM_MACHINE_HAS_VIRTUAL_GPU}!="1", RUN+="/usr/bin/touch /run/udev/gdm-machine-has-hardware-gpu"
IMPORT{cmdline}="nomodeset", GOTO="gdm_disable_wayland"
LABEL="gdm_nomodeset_end"

At the end of the file there is:

LABEL="gdm_disable_wayland"
RUN+="/usr/lib/gdm-runtime-config set daemon WaylandEnable false"
GOTO="gdm_end"

Is it possible that some of these rules prevent my video card from going to sleep, making it consume more power? If so, I could get rid of optimus-manager.

If you want to use Wayland, I would comment out any line that references disabling Wayland. A lot of those rules are set up as a safeguard for Nvidia users, because Nvidia is less compatible with Wayland in general, but if you want to try it then you'll have to take those rules down.
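A hedged sketch of one way to do this without editing the packaged file (which a package update would overwrite): udev gives a file in /etc/udev/rules.d precedence over a file of the same name in /usr/lib/udev/rules.d, so linking 61-gdm.rules to /dev/null masks the packaged rules entirely. This is a general udev mechanism, not Garuda-specific advice; copying the file and commenting out only the Wayland-disabling lines is the more surgical option.

```shell
# Mask the packaged GDM udev rules: a same-named file in /etc
# overrides the one in /usr/lib, and /dev/null means "no rules".
sudo ln -s /dev/null /etc/udev/rules.d/61-gdm.rules

# Alternatively, keep the other rules working and comment out only
# the Wayland-disabling lines in a local copy:
#   sudo cp /usr/lib/udev/rules.d/61-gdm.rules /etc/udev/rules.d/
#   (then edit the copy)
```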

Getting rid of Optimus manager is our community's general recommendation anyway; here is the guidance in the Garuda wiki for laptops with two GPUs:


I know this discussion may be off topic for this question, but I have already followed that guide and my GPU continues to stay active: with a battery meter I measure a discharge of 20-27 W, whereas with optimus-manager disabling the GPU I get 7-8 W at idle. I would also like to get rid of optimus-manager, but none of the solutions I tried disabled the GPU, and if I disable it, the second monitor is not detected :cry:

Should I open a new issue for this?

(Regarding Wayland: as soon as I'm back at my PC I will try disabling those rules and update this thread.)

I did, and now I am on Wayland, thank you!
But to do that I had to uninstall optimus-manager and install prime-render-offload, and currently when running nvidia-smi I get:

 ╭─daniele@daniele in ~ took 65ms
 ╰─λ nvidia-smi
Thu Jul  6 14:46:02 2023       
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 535.54.03              Driver Version: 535.54.03    CUDA Version: 12.2     |
|-----------------------------------------+----------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |         Memory-Usage | GPU-Util  Compute M. |
|                                         |                      |               MIG M. |
|=========================================+======================+======================|
|   0  NVIDIA GeForce RTX 2060        Off | 00000000:01:00.0 Off |                  N/A |
| N/A   51C    P8               4W /  80W |      3MiB /  6144MiB |      0%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+
                                                                                         
+---------------------------------------------------------------------------------------+
| Processes:                                                                            |
|  GPU   GI   CI        PID   Type   Process name                            GPU Memory |
|        ID   ID                                                             Usage      |
|=======================================================================================|
|    0   N/A  N/A      2119      G   /usr/bin/gnome-shell                          1MiB |
+---------------------------------------------------------------------------------------+

Of course, I disconnected the monitor before running the command, because I know HDMI forces the GPU to stay on to drive the second screen.

Even though gnome-shell uses only 1 MiB of video memory, that is enough to keep the GPU awake and drawing 18 W at idle on battery :cry:

Any suggestions?

Hopefully a knowledgeable Nvidia user will chime in because I actually have no idea.

There is this suggestion in the PRIME article on the ArchWiki to install switcheroo-control, but to be honest I don't know whether that is good advice or not:

Gnome integration

For GNOME integration, install switcheroo-control and enable switcheroo-control.service.

GNOME will respect the PrefersNonDefaultGPU property in the desktop entry. Alternatively, you can launch applications with the dedicated GPU by right-clicking the icon and choosing Launch using Discrete Graphics Card.
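For illustration, the desktop-entry property mentioned above looks like this; the Name and Exec values here are placeholders, and PrefersNonDefaultGPU is the only key relevant to GPU selection:

```
[Desktop Entry]
Type=Application
Name=My Game
Exec=mygame
# Ask GNOME to launch this application on the non-default (discrete) GPU
PrefersNonDefaultGPU=true
```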


@DanielusG, you might want to check this out (just added to the guide):

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.