Something on the System is taking TOO MUCH SPACE

Inxi:

perl: warning: Setting locale failed.
perl: warning: Please check that your locale settings:
        LANGUAGE = "",
        LC_ALL = (unset),
        LC_ADDRESS = "pt_BR.UTF-8",
        LC_NAME = "pt_BR.UTF-8",
        LC_MONETARY = "pt_BR.UTF-8",                                                                                                   
        LC_PAPER = "pt_BR.UTF-8",                                                                                                      
        LC_IDENTIFICATION = "pt_BR.UTF-8",                                                                                             
        LC_TELEPHONE = "pt_BR.UTF-8",                                                                                                  
        LC_MEASUREMENT = "pt_BR.UTF-8",                                                                                                
        LC_TIME = "pt_BR.UTF-8",                                                                                                       
        LC_NUMERIC = "pt_BR.UTF-8",                                                                                                    
        LANG = "en_US.UTF-8"                                                                                                           
    are supported and installed on your system.                                                                                        
perl: warning: Falling back to the standard locale ("C").                                                                              
System:                                                                                                                                
  Kernel: 5.15.94-1-lts arch: x86_64 bits: 64 compiler: gcc v: 12.2.1                                                                  
    parameters: BOOT_IMAGE=/@/boot/vmlinuz-linux-lts                                                                                   
    root=UUID=33da784c-6400-4145-99e3-93bc3c7463b2 rw rootflags=subvol=@
    quiet                                                                                                                              
    cryptdevice=UUID=5b16040e-5d39-4cd0-80b7-423ffe50eaff:luks-5b16040e-5d39-4cd0-80b7-423ffe50eaff                                    
    root=/dev/mapper/luks-5b16040e-5d39-4cd0-80b7-423ffe50eaff splash                                                                  
    rd.udev.log_priority=3 vt.global_cursor_default=0 loglevel=3 ibt=off                                                               
  Desktop: LXQt v: 1.2.1 tk: Qt v: 5.15.8 info: cairo-dock, lxqt-panel                                                                 
    wm: kwin_x11 vt: 1 dm: SDDM Distro: Garuda Linux base: Arch Linux                                                                  
Machine:                                                                                                                               
  Type: Desktop Mobo: INTEL model: HM65DESK serial: <superuser required>                                                               
    UEFI: American Megatrends v: 4.6.5 date: 02/23/2019                                                                                
CPU:                                                                                                                                   
  Info: model: Intel Core i7-2620M bits: 64 type: MT MCP arch: Sandy Bridge                                                            
    gen: core 2 level: v2 built: 2010-12 process: Intel 32nm family: 6                                                                 
    model-id: 0x2A (42) stepping: 7 microcode: 0x2F                                                                                    
  Topology: cpus: 1x cores: 2 tpc: 2 threads: 4 smt: enabled cache:
    L1: 128 KiB desc: d-2x32 KiB; i-2x32 KiB L2: 512 KiB desc: 2x256 KiB
    L3: 4 MiB desc: 1x4 MiB
  Speed (MHz): avg: 1219 high: 1642 min/max: 800/3400 scaling:
    driver: intel_cpufreq governor: performance cores: 1: 1020 2: 1642 3: 812
    4: 1403 bogomips: 21551
  Flags: avx ht lm nx pae sse sse2 sse3 sse4_1 sse4_2 ssse3
  Vulnerabilities: <filter>
Graphics:
  Device-1: NVIDIA GM107 [GeForce GTX 750] driver: nvidia v: 525.89.02
    alternate: nouveau,nvidia_drm non-free: 525.xx+
    status: current (as of 2023-02) arch: Maxwell code: GMxxx
    process: TSMC 28nm built: 2014-19 pcie: gen: 2 speed: 5 GT/s lanes: 8
    link-max: lanes: 16 bus-ID: 01:00.0 chip-ID: 10de:1381 class-ID: 0300
  Display: x11 server: X.Org v: 21.1.7 with: Xwayland v: 22.1.8
    compositor: kwin_x11 driver: X: loaded: nvidia unloaded: modesetting
    alternate: fbdev,nouveau,nv,vesa gpu: nvidia display-ID: :0 screens: 1
  Screen-1: 0 s-res: 1360x768 s-dpi: 90 s-size: 384x300mm (15.12x11.81")
    s-diag: 487mm (19.18")
  Monitor-1: HDMI-0 res: 1360x768 hz: 60 dpi: 49
    size: 708x398mm (27.87x15.67") diag: 812mm (31.98") modes: N/A
  API: OpenGL v: 4.6.0 NVIDIA 525.89.02 renderer: NVIDIA GeForce GTX
    750/PCIe/SSE2 direct-render: Yes
Audio:
  Device-1: Intel 6 Series/C200 Series Family High Definition Audio
    driver: snd_hda_intel v: kernel bus-ID: 00:1b.0 chip-ID: 8086:1c20
    class-ID: 0403
  Device-2: NVIDIA GM107 High Definition Audio [GeForce 940MX]
    driver: snd_hda_intel v: kernel pcie: gen: 2 speed: 5 GT/s lanes: 8
    link-max: lanes: 16 bus-ID: 01:00.1 chip-ID: 10de:0fbc class-ID: 0403
  Sound API: ALSA v: k5.15.94-1-lts running: yes
  Sound Server-1: PulseAudio v: 16.1 running: no
  Sound Server-2: PipeWire v: 0.3.65 running: yes
Network:
  Device-1: Realtek RTL8111/8168/8411 PCI Express Gigabit Ethernet
    driver: r8169 v: kernel pcie: gen: 1 speed: 2.5 GT/s lanes: 1 port: d000
    bus-ID: 03:00.0 chip-ID: 10ec:8168 class-ID: 0200
  IF: enp3s0 state: down mac: <filter>
  Device-2: Ralink MT7601U Wireless Adapter type: USB driver: mt7601u
    bus-ID: 2-1.5:3 chip-ID: 148f:7601 class-ID: 0000 serial: <filter>
  IF: wlp0s29u1u5 state: up mac: <filter>
Drives:
  Local Storage: total: 2.8 TiB used: 875.95 GiB (30.6%)
  SMART Message: Unable to run smartctl. Root privileges required.
  ID-1: /dev/sda maj-min: 8:0 model: SATA SSD size: 55.9 GiB block-size:
    physical: 512 B logical: 512 B speed: 3.0 Gb/s type: SSD serial: <filter>
    rev: Sb10 scheme: GPT
  ID-2: /dev/sdb maj-min: 8:16 vendor: Seagate model: ST3000NM0053
    size: 2.73 TiB block-size: physical: 512 B logical: 512 B speed: 3.0 Gb/s
    type: HDD rpm: 7200 serial: <filter> rev: G00A scheme: GPT
  ID-3: /dev/sdc maj-min: 8:32 type: USB vendor: Generic model: Flash Disk
    size: 14.51 GiB block-size: physical: 512 B logical: 512 B type: SSD
    serial: <filter> rev: 8.07 scheme: MBR
  SMART Message: Unknown USB bridge. Flash drive/Unsupported enclosure?
Partition:
  ID-1: / raw-size: 558.79 GiB size: 558.79 GiB (100.00%)
    used: 262.12 GiB (46.9%) fs: btrfs dev: /dev/dm-0 maj-min: 254:0
    mapped: luks-5b16040e-5d39-4cd0-80b7-423ffe50eaff
  ID-2: /boot/efi raw-size: 5.59 GiB size: 5.58 GiB (99.80%)
    used: 242.2 MiB (4.2%) fs: vfat dev: /dev/sda2 maj-min: 8:2
  ID-3: /home raw-size: 558.79 GiB size: 558.79 GiB (100.00%)
    used: 262.12 GiB (46.9%) fs: btrfs dev: /dev/dm-0 maj-min: 254:0
    mapped: luks-5b16040e-5d39-4cd0-80b7-423ffe50eaff
  ID-4: /var/log raw-size: 558.79 GiB size: 558.79 GiB (100.00%)
    used: 262.12 GiB (46.9%) fs: btrfs dev: /dev/dm-0 maj-min: 254:0
    mapped: luks-5b16040e-5d39-4cd0-80b7-423ffe50eaff
  ID-5: /var/tmp raw-size: 558.79 GiB size: 558.79 GiB (100.00%)
    used: 262.12 GiB (46.9%) fs: btrfs dev: /dev/dm-0 maj-min: 254:0
    mapped: luks-5b16040e-5d39-4cd0-80b7-423ffe50eaff
Swap:
  Kernel: swappiness: 133 (default 60) cache-pressure: 100 (default)
  ID-1: swap-1 type: zram size: 3.79 GiB used: 0 KiB (0.0%) priority: 100
    dev: /dev/zram0
Sensors:
  System Temperatures: cpu: 42.0 C mobo: N/A gpu: nvidia temp: 36 C
  Fan Speeds (RPM): N/A gpu: nvidia fan: 33%
Info:
  Processes: 240 Uptime: 9m wakeups: 0 Memory: 3.79 GiB used: 1.44 GiB (38.0%)
  Init: systemd v: 253 default: graphical tool: systemctl Compilers:
  gcc: 12.2.1 alt: 11 clang: 15.0.7 Packages: 1812 pm: pacman pkgs: 1768
  libs: 540 tools: octopi,pamac,paru pm: flatpak pkgs: 44 Shell: Bash
  v: 5.1.16 running-in: qterminal inxi: 3.3.25
Garuda (2.6.14-1):
head: cannot open '/var/log/pacman.log' for reading: No such file or directory
  System install date:     
  Last full system update: 2023-02-19
  Is partially upgraded:   No
  Relevant software:       snapper NetworkManager mkinitcpio nvidia-dkms
  Windows dual boot:       No/Undetected
  Failed units:            shadow.service 

Alright, I'm doing something super wrong; my installation is taking up 319 GB of space. HELP.

I simply have no idea what could be making my system so fat. I've used baobab to see what it was, but it can only account for 53 GB of the 319. According to it, the heaviest items are a 5 GB lib under cuda, 12 GB of flatpak and 8 GB under /usr/share, but that still explains nothing.

In case you're wondering, Garuda has a partition of its own and my home is about 60 GB - and somehow I still feel that's way too big; it must have a lot of cache or something o.o

TLDR: I'm not sure how to properly analyze my disk usage and I really don't want a 320 GB installation. Any help is greatly appreciated.

Please post sudo btrfs subvolume list / and df -h

sudo btrfs subvolume list /

ID 256 gen 331754 top level 5 path restore_backup_@_110925332
ID 257 gen 335113 top level 5 path @home
ID 258 gen 332046 top level 5 path @root
ID 259 gen 277931 top level 5 path @srv
ID 260 gen 335080 top level 5 path @cache
ID 261 gen 335110 top level 5 path @log
ID 262 gen 335075 top level 5 path @tmp
ID 263 gen 335071 top level 3620 path .snapshots
ID 500 gen 278011 top level 5 path @_backup_20221412121412479
ID 1334 gen 130215 top level 263 path .snapshots/696/snapshot
ID 1628 gen 130215 top level 263 path .snapshots/887/snapshot
ID 1976 gen 130300 top level 263 path .snapshots/1102/snapshot
ID 2279 gen 155475 top level 263 path .snapshots/1255/snapshot
ID 2406 gen 166094 top level 263 path .snapshots/1322/snapshot
ID 2550 gen 181686 top level 263 path .snapshots/1390/snapshot
ID 2836 gen 207322 top level 263 path .snapshots/1598/snapshot
ID 2947 gen 217446 top level 263 path .snapshots/1662/snapshot
ID 2999 gen 221525 top level 263 path .snapshots/1706/snapshot
ID 3050 gen 232266 top level 263 path .snapshots/1749/snapshot
ID 3093 gen 232266 top level 263 path .snapshots/1772/snapshot
ID 3110 gen 232319 top level 263 path .snapshots/1775/snapshot
ID 3176 gen 238447 top level 263 path .snapshots/1829/snapshot
ID 3250 gen 243928 top level 263 path .snapshots/1877/snapshot
ID 3314 gen 249626 top level 263 path .snapshots/1912/snapshot
ID 3399 gen 257693 top level 263 path .snapshots/1952/snapshot
ID 3422 gen 260046 top level 263 path .snapshots/1974/snapshot
ID 3460 gen 263622 top level 263 path .snapshots/2006/snapshot
ID 3520 gen 270040 top level 263 path .snapshots/2049/snapshot
ID 3532 gen 278011 top level 5 path @_backup_20222112142513663
ID 3583 gen 274744 top level 263 path .snapshots/2102/snapshot
ID 3620 gen 335112 top level 5 path @
ID 3678 gen 283558 top level 263 path .snapshots/2189/snapshot
ID 3749 gen 290429 top level 263 path .snapshots/2257/snapshot
ID 3761 gen 291686 top level 263 path .snapshots/2265/snapshot
ID 3800 gen 295170 top level 263 path .snapshots/2277/snapshot
ID 3855 gen 301361 top level 263 path .snapshots/2292/snapshot
ID 3905 gen 306134 top level 263 path .snapshots/2312/snapshot
ID 3906 gen 306159 top level 263 path .snapshots/2313/snapshot
ID 3907 gen 306178 top level 263 path .snapshots/2314/snapshot
ID 3908 gen 306269 top level 263 path .snapshots/2315/snapshot
ID 3909 gen 306270 top level 263 path .snapshots/2316/snapshot
ID 3932 gen 308437 top level 263 path .snapshots/2329/snapshot
ID 3984 gen 313314 top level 263 path .snapshots/2349/snapshot
ID 3985 gen 313333 top level 263 path .snapshots/2350/snapshot
ID 3987 gen 313436 top level 263 path .snapshots/2352/snapshot
ID 3988 gen 313437 top level 263 path .snapshots/2353/snapshot
ID 3997 gen 314180 top level 263 path .snapshots/2362/snapshot
ID 4021 gen 316395 top level 263 path .snapshots/2376/snapshot
ID 4049 gen 319081 top level 263 path .snapshots/2395/snapshot
ID 4134 gen 326386 top level 263 path .snapshots/2435/snapshot
ID 4135 gen 326413 top level 263 path .snapshots/2436/snapshot
ID 4144 gen 327252 top level 263 path .snapshots/2443/snapshot
ID 4154 gen 328073 top level 263 path .snapshots/2444/snapshot
ID 4168 gen 329217 top level 263 path .snapshots/2458/snapshot
ID 4171 gen 329550 top level 263 path .snapshots/2461/snapshot
ID 4186 gen 330791 top level 263 path .snapshots/2476/snapshot
ID 4195 gen 331552 top level 263 path .snapshots/2482/snapshot
ID 4197 gen 331732 top level 263 path .snapshots/2484/snapshot
ID 4198 gen 331755 top level 263 path .snapshots/2485/snapshot
ID 4203 gen 332049 top level 263 path .snapshots/2490/snapshot
ID 4204 gen 332050 top level 263 path .snapshots/2491/snapshot
ID 4205 gen 332084 top level 263 path .snapshots/2492/snapshot
ID 4220 gen 333552 top level 263 path .snapshots/2506/snapshot

df -h

Filesystem      Size  Used Avail Use% Mounted on
dev             1.9G     0  1.9G   0% /dev
run             1.9G  9.5M  1.9G   1% /run
/dev/dm-0       559G  263G  295G  48% /
tmpfs           1.9G   19M  1.9G   1% /dev/shm
tmpfs           1.9G   16M  1.9G   1% /tmp
/dev/sda2       5.6G  243M  5.4G   5% /boot/efi
/dev/dm-0       559G  263G  295G  48% /srv
/dev/dm-0       559G  263G  295G  48% /var/log
/dev/dm-0       559G  263G  295G  48% /var/tmp
/dev/dm-0       559G  263G  295G  48% /var/cache
/dev/dm-0       559G  263G  295G  48% /home
/dev/dm-0       559G  263G  295G  48% /root
/dev/dm-1       916G  450G  420G  52% /mnt/Archieves
tmpfs           389M   80M  310M  21% /run/user/1000

  1. You do not have a separate home partition. Your disk space between home and root is shared.
  2. Did you disable Garuda System Maintenance? It should be in your tray.
  1. I know, I never said that. Maybe it was the way I phrased it? All I meant is that my home is only responsible for 60 GB of those 300+, so it can't take all the blame.

  2. Nope. Still there, still working, afaik. Something wrong with it?
    Edit: I did turn off the old-snapshot notifications, since it kept notifying me and I could never see any unwanted snapshots in the Snapper tool. Is this what I did wrong? xD

sudo snapper-tools find-old
sudo snapper get-config
systemctl status snapper-cleanup.timer

sudo snapper-tools find-old

Old snapshots:
2329    ter. jan. 24 21:33:14 2023      Increasing tmpfs
Old restore subvolumes:
restore_backup_@_110925332
@_backup_20221412121412479
@_backup_20222112142513663

sudo snapper get-config

Failed to set locale.
Key                    | Value
-----------------------+------
ALLOW_GROUPS           |      
ALLOW_USERS            |      
BACKGROUND_COMPARISON  | yes  
EMPTY_PRE_POST_CLEANUP | yes  
EMPTY_PRE_POST_MIN_AGE | 1800 
FREE_LIMIT             | 0.2  
FSTYPE                 | btrfs
NUMBER_CLEANUP         | yes  
NUMBER_LIMIT           | 13   
NUMBER_LIMIT_IMPORTANT | 5    
NUMBER_MIN_AGE         | 1800 
QGROUP                 |      
SPACE_LIMIT            | 0.5  
SUBVOLUME              | /    
SYNC_ACL               | no   
TIMELINE_CLEANUP       | yes  
TIMELINE_CREATE        | yes  
TIMELINE_LIMIT_DAILY   | 7    
TIMELINE_LIMIT_HOURLY  | 0    
TIMELINE_LIMIT_MONTHLY | 10   
TIMELINE_LIMIT_WEEKLY  | 20   
TIMELINE_LIMIT_YEARLY  | 0    
TIMELINE_MIN_AGE       | 1800 

systemctl status snapper-cleanup.timer

 snapper-cleanup.timer - Daily Cleanup of Snapper Snapshots
     Loaded: loaded (/usr/lib/systemd/system/snapper-cleanup.timer; enabled; preset: disabled)
     Active: active (waiting) since Wed 2023-02-22 09:13:22 -03; 53min ago
      Until: Wed 2023-02-22 09:13:22 -03; 53min ago
    Trigger: Thu 2023-02-23 09:22:33 -03; 23h left
   Triggers:  snapper-cleanup.service
       Docs: man:snapper(8)
             man:snapper-configs(5)

Feb 22 09:13:22 Aorigaruda systemd[1]: Started Daily Cleanup of Snapper Snapshots.

Yeah, well then no surprises there.

systemctl status snapper-cleanup
sudo snapper-tools delete-old

systemctl status snapper-cleanup

 snapper-cleanup.service - Daily Cleanup of Snapper Snapshots
     Loaded: loaded (/usr/lib/systemd/system/snapper-cleanup.service; static)
     Active: inactive (dead) since Wed 2023-02-22 09:22:38 -03; 1h 13min ago
   Duration: 5.260s
TriggeredBy:  snapper-cleanup.timer
       Docs: man:snapper(8)
             man:snapper-configs(5)
    Process: 4997 ExecStart=/usr/lib/snapper/systemd-helper --cleanup (code=exited, status=0/SUCCESS)
   Main PID: 4997 (code=exited, status=0/SUCCESS)
        CPU: 1.513s

Feb 22 09:22:33 Aorigaruda systemd[1]: Started Daily Cleanup of Snapper Snapshots.
Feb 22 09:22:33 Aorigaruda systemd-helper[4997]: running cleanup for 'root'.
Feb 22 09:22:35 Aorigaruda systemd-helper[4997]: running number cleanup for 'root'.
Feb 22 09:22:35 Aorigaruda systemd-helper[4997]: running timeline cleanup for 'root'.
Feb 22 09:22:35 Aorigaruda systemd-helper[4997]: running empty-pre-post cleanup for 'root'.
Feb 22 09:22:38 Aorigaruda systemd[1]: snapper-cleanup.service: Deactivated successfully.
Feb 22 09:22:38 Aorigaruda systemd[1]: snapper-cleanup.service: Consumed 1.513s CPU time.

Alright, the "delete-old" thing didn't run for long and it didn't seem to do much? My df -h didn't really change: what was 263G up there is now 237G, but everything else is the same o.o
Or am I looking at this the wrong way?

sudo snapper set-config NUMBER_LIMIT=10
sudo snapper cleanup number
sudo btrfs subvolume list /

Ahm... this doesn't look right, does it?

sudo snapper set-config NUMBER_LIMIT=10

Failed to set locale.

sudo snapper cleanup number

Failed to set locale.

sudo btrfs subvolume list /

ID 257 gen 335250 top level 5 path @home
ID 258 gen 332046 top level 5 path @root
ID 259 gen 277931 top level 5 path @srv
ID 260 gen 335080 top level 5 path @cache
ID 261 gen 335250 top level 5 path @log
ID 262 gen 335075 top level 5 path @tmp
ID 263 gen 335250 top level 3620 path .snapshots
ID 1334 gen 130215 top level 263 path .snapshots/696/snapshot
ID 1628 gen 130215 top level 263 path .snapshots/887/snapshot
ID 1976 gen 130300 top level 263 path .snapshots/1102/snapshot
ID 2279 gen 155475 top level 263 path .snapshots/1255/snapshot
ID 2406 gen 166094 top level 263 path .snapshots/1322/snapshot
ID 2550 gen 181686 top level 263 path .snapshots/1390/snapshot
ID 2836 gen 207322 top level 263 path .snapshots/1598/snapshot
ID 2947 gen 217446 top level 263 path .snapshots/1662/snapshot
ID 2999 gen 221525 top level 263 path .snapshots/1706/snapshot
ID 3050 gen 232266 top level 263 path .snapshots/1749/snapshot
ID 3093 gen 232266 top level 263 path .snapshots/1772/snapshot
ID 3110 gen 232319 top level 263 path .snapshots/1775/snapshot
ID 3176 gen 238447 top level 263 path .snapshots/1829/snapshot
ID 3250 gen 243928 top level 263 path .snapshots/1877/snapshot
ID 3314 gen 249626 top level 263 path .snapshots/1912/snapshot
ID 3399 gen 257693 top level 263 path .snapshots/1952/snapshot
ID 3422 gen 260046 top level 263 path .snapshots/1974/snapshot
ID 3460 gen 263622 top level 263 path .snapshots/2006/snapshot
ID 3520 gen 270040 top level 263 path .snapshots/2049/snapshot
ID 3583 gen 274744 top level 263 path .snapshots/2102/snapshot
ID 3620 gen 335250 top level 5 path @
ID 3678 gen 283558 top level 263 path .snapshots/2189/snapshot
ID 3749 gen 290429 top level 263 path .snapshots/2257/snapshot
ID 3761 gen 291686 top level 263 path .snapshots/2265/snapshot
ID 3800 gen 295170 top level 263 path .snapshots/2277/snapshot
ID 3855 gen 301361 top level 263 path .snapshots/2292/snapshot
ID 3907 gen 306178 top level 263 path .snapshots/2314/snapshot
ID 3984 gen 313314 top level 263 path .snapshots/2349/snapshot
ID 3985 gen 313333 top level 263 path .snapshots/2350/snapshot
ID 3987 gen 313436 top level 263 path .snapshots/2352/snapshot
ID 3988 gen 313437 top level 263 path .snapshots/2353/snapshot
ID 3997 gen 314180 top level 263 path .snapshots/2362/snapshot
ID 4021 gen 316395 top level 263 path .snapshots/2376/snapshot
ID 4049 gen 319081 top level 263 path .snapshots/2395/snapshot
ID 4134 gen 326386 top level 263 path .snapshots/2435/snapshot
ID 4135 gen 326413 top level 263 path .snapshots/2436/snapshot
ID 4144 gen 327252 top level 263 path .snapshots/2443/snapshot
ID 4154 gen 328073 top level 263 path .snapshots/2444/snapshot
ID 4168 gen 329217 top level 263 path .snapshots/2458/snapshot
ID 4171 gen 329550 top level 263 path .snapshots/2461/snapshot
ID 4186 gen 330791 top level 263 path .snapshots/2476/snapshot
ID 4195 gen 331552 top level 263 path .snapshots/2482/snapshot
ID 4197 gen 331732 top level 263 path .snapshots/2484/snapshot
ID 4198 gen 331755 top level 263 path .snapshots/2485/snapshot
ID 4203 gen 332049 top level 263 path .snapshots/2490/snapshot
ID 4204 gen 332050 top level 263 path .snapshots/2491/snapshot
ID 4205 gen 332084 top level 263 path .snapshots/2492/snapshot
ID 4220 gen 333552 top level 263 path .snapshots/2506/snapshot
ID 4235 gen 335135 top level 263 path .snapshots/2507/snapshot
ID 4236 gen 335228 top level 263 path .snapshots/2508/snapshot

Add or uncomment pt_BR.UTF-8 in /etc/locale.gen, run sudo locale-gen, and then run all the commands again for me, please.
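
In case it helps, a minimal sketch of those steps (assuming the stock /etc/locale.gen layout, where the line is commented out with a leading #):

sudo sed -i 's/^#pt_BR.UTF-8 UTF-8/pt_BR.UTF-8 UTF-8/' /etc/locale.gen
sudo locale-gen
locale -a | grep -i pt_br    # should now list pt_BR.utf8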


I uncommented it and the result is the same :confused:

Did you run sudo locale-gen? What's the output?
Well, regardless, the commands still did their job.

Please run sudo snapper list, let's see why those snapshots you have are not getting removed.


Friend, I suggest identifying the 'heaviest' folders and files the old-school way. You could use find to locate fat files with something like:

find / -size +100M -print

but I can't recommend that wild-goose chase. Instead, narrow it down to the heavy hitters like this:

du -sm /* | sort -r -n -k1 | head -10

Then keep drilling down into the heaviest folders, until you locate an identifiable candidate for deletion. Just remember to keep the * at the end of the path on the 'du', and tighten the 'head' number to suit your tastes...
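
A drill-down step might look like this - the /usr path is only an illustration; substitute whichever directory topped the previous list:

sudo du -sm /usr/* 2>/dev/null | sort -r -n -k1 | head -10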

For example, I didn't realize that my /home/USERNAME/Downloads folder was such a fat hog...

Hope this helps.

F.


sudo locale-gen

Generating locales...
Generation complete.

sudo snapper list

Failed to set locale.
    # | Type   | Pre # | Date                     | User | Cleanup  | Description                                                              | Userdata
------+--------+-------+--------------------------+------+----------+--------------------------------------------------------------------------+---------
   0  | single |       |                          | root |          | current                                                                  |         
 696  | single |       | Sun May  1 00:00:07 2022 | root | timeline | timeline                                                                 |         
 887  | single |       | Wed Jun  1 12:00:28 2022 | root | timeline | timeline                                                                 |         
1102  | single |       | Fri Jul  1 11:00:02 2022 | root | timeline | timeline                                                                 |         
1255  | single |       | Mon Aug  1 00:00:20 2022 | root | timeline | timeline                                                                 |         
1390  | single |       | Fri Sep  2 21:00:04 2022 | root | timeline | timeline                                                                 |         
1598  | single |       | Sat Oct  1 14:00:31 2022 | root | timeline | timeline                                                                 |         
1662  | single |       | Mon Oct 10 00:00:12 2022 | root | timeline | timeline                                                                 |         
1706  | single |       | Mon Oct 17 00:00:31 2022 | root | timeline | timeline                                                                 |         
1749  | single |       | Mon Oct 24 00:00:06 2022 | root | timeline | timeline                                                                 |         
1772  | single |       | Mon Oct 31 00:00:21 2022 | root | timeline | timeline                                                                 |         
1775  | single |       | Tue Nov  1 10:00:13 2022 | root | timeline | timeline                                                                 |         
1829  | single |       | Mon Nov  7 14:00:32 2022 | root | timeline | timeline                                                                 |         
1877  | single |       | Mon Nov 14 10:00:07 2022 | root | timeline | timeline                                                                 |         
1912  | single |       | Mon Nov 21 00:00:12 2022 | root | timeline | timeline                                                                 |         
1952  | single |       | Mon Nov 28 18:00:06 2022 | root | timeline | timeline                                                                 |         
1974  | single |       | Thu Dec  1 00:00:17 2022 | root | timeline | timeline                                                                 |         
2006  | single |       | Mon Dec  5 01:00:01 2022 | root | timeline | timeline                                                                 |         
2049  | single |       | Tue Dec 13 16:00:06 2022 | root | timeline | timeline                                                                 |         
2102  | single |       | Mon Dec 19 11:00:14 2022 | root | timeline | timeline                                                                 |         
2189  | single |       | Mon Dec 26 00:00:07 2022 | root | timeline | timeline                                                                 |         
2257  | single |       | Sun Jan  1 00:00:13 2023 | root | timeline | timeline                                                                 |         
2265  | single |       | Mon Jan  2 00:00:19 2023 | root | timeline | timeline                                                                 |         
2277  | single |       | Mon Jan  9 11:00:18 2023 | root | timeline | timeline                                                                 |         
2292  | single |       | Mon Jan 16 00:00:08 2023 | root | timeline | timeline                                                                 |         
2314  | single |       | Mon Jan 23 12:00:06 2023 | root | timeline | timeline                                                                 |         
2349  | pre    |       | Sun Jan 29 12:32:55 2023 | root | number   | pacman -Su                                                               |         
2350  | post   |  2349 | Sun Jan 29 12:43:18 2023 | root | number   | ant-dracula-kde-theme-git ant-dracula-kvantum-theme-git ant-dracula-them |         
2352  | pre    |       | Sun Jan 29 13:14:17 2023 | root | number   | pacman -Rns electron19                                                   |         
2353  | post   |  2352 | Sun Jan 29 13:14:29 2023 | root | number   | electron19                                                               |         
2362  | single |       | Mon Jan 30 00:00:01 2023 | root | timeline | timeline                                                                 |         
2376  | single |       | Wed Feb  1 00:00:04 2023 | root | timeline | timeline                                                                 |         
2395  | single |       | Mon Feb  6 00:00:01 2023 | root | timeline | timeline                                                                 |         
2435  | pre    |       | Sun Feb 12 14:05:17 2023 | root | number   | pacman -Su                                                               |         
2436  | post   |  2435 | Sun Feb 12 14:21:10 2023 | root | number   | alsa-card-profiles ananicy-rules ant-dracula-kde-theme-git ant-dracula-k |         
2443  | single |       | Mon Feb 13 00:00:00 2023 | root | timeline | timeline                                                                 |         
2444  | single |       | Tue Feb 14 10:00:39 2023 | root | timeline | timeline                                                                 |         
2458  | single |       | Thu Feb 16 21:00:07 2023 | root | timeline | timeline                                                                 |         
2461  | single |       | Fri Feb 17 00:00:02 2023 | root | timeline | timeline                                                                 |         
2476  | single |       | Sat Feb 18 00:00:04 2023 | root | timeline | timeline                                                                 |         
2482  | single |       | Sun Feb 19 00:00:03 2023 | root | timeline | timeline                                                                 |         
2484  | pre    |       | Sun Feb 19 19:40:35 2023 | root | number   | pacman -Su                                                               |         
2485  | post   |  2484 | Sun Feb 19 19:53:21 2023 | root | number   | acl ananicy-rules ant-dracula-kde-theme-git ant-dracula-kvantum-theme-gi |         
2490  | pre    |       | Sun Feb 19 23:39:10 2023 | root | number   | pacman -Rns python-appdirs                                               |         
2491  | post   |  2490 | Sun Feb 19 23:39:19 2023 | root | number   | python-appdirs                                                           |         
2492  | single |       | Mon Feb 20 00:00:03 2023 | root | timeline | timeline                                                                 |         
2506  | single |       | Tue Feb 21 00:00:13 2023 | root | timeline | timeline                                                                 |         
2507  | single |       | Wed Feb 22 10:00:08 2023 | root | timeline | timeline                                                                 |         
2508  | single |       | Wed Feb 22 11:00:09 2023 | root | timeline | timeline                                                                 |         
2509  | single |       | Wed Feb 22 12:00:08 2023 | root | timeline | timeline                                                                 |         
2510  | single |       | Wed Feb 22 13:00:08 2023 | root | timeline | timeline                                                                 |         
2511  | single |       | Wed Feb 22 14:00:06 2023 | root | timeline | timeline                                                                 |         
2512  | single |       | Wed Feb 22 15:00:08 2023 | root | timeline | timeline                                                                 |         

Maybe the fault is in my Snapper Tools config? I mean, I DID set a tolerance of some 50 snapshots at any one time, as you can see above in my sudo snapper get-config. But in any case I feel like your commands should have overruled it, right?

Thanks for the suggestion. I think TNE is on to something here, but this is certainly a good trick to have set aside in any case : 3


Yeah, okay. You're keeping multiple months' worth of snapshots that include every single package on your system, including huge ones like CUDA and co.

If I were you, I'd use the snapper-tools gui to disable timeline snapshots entirely and then delete all the snapshots in your snapshot list manually.
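
If it helps, a whole run of them can be removed with snapper's range syntax. A sketch only - the IDs here are just the first and last from the list above, so double-check them against your own sudo snapper list output before running it:

sudo snapper delete 696-2512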

Snapshots don't inherently take much space on their own; they only become expensive when few files are still shared between them and the live system, which is very likely after months of usage.
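
If you want to see how much of that space is actually exclusive to the snapshots rather than shared, btrfs can report it per snapshot - a rough example, assuming the snapshots live under /.snapshots as in your subvolume list (it needs root and can take a while):

sudo btrfs filesystem du -s /.snapshots/*/snapshot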

In general, you could also switch your timeline snapshot cleanup policy to daily-only if you want to keep snapshots around for a week or so, assuming you don't trust the normal pacman interaction snapshots alone.
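
For reference, a hedged sketch of that via snapper's config keys - the values are only illustrative, matching the "about a week of dailies" idea, not a prescription:

sudo snapper set-config TIMELINE_CREATE=no
# or, to keep roughly a week of daily snapshots only:
sudo snapper set-config TIMELINE_LIMIT_HOURLY=0 TIMELINE_LIMIT_DAILY=7 TIMELINE_LIMIT_WEEKLY=0 TIMELINE_LIMIT_MONTHLY=0 TIMELINE_LIMIT_YEARLY=0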

The other issue, of course, was that you had GSM's snapshot handling disabled. GSM protects you from forgetting about outdated backup volumes (left over after restoring snapshots) and forgotten manual snapshots, which can take multiple gigabytes of space and only grow larger the more time passes since they were created. You should probably turn that back on, but I made sure to delete them for you nevertheless.
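
Purely for illustration (snapper-tools delete-old already handled it here, and this is an assumption about the equivalent manual steps, not necessarily what the tool does internally), cleaning up such a leftover backup subvolume by hand would look roughly like mounting the top-level btrfs root and deleting it - /mnt/btrfs-root is just an arbitrary mountpoint:

sudo mkdir -p /mnt/btrfs-root
sudo mount -o subvolid=5 /dev/mapper/luks-5b16040e-5d39-4cd0-80b7-423ffe50eaff /mnt/btrfs-root
sudo btrfs subvolume delete /mnt/btrfs-root/@_backup_20221412121412479
sudo umount /mnt/btrfs-root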


df -h

Filesystem      Size  Used Avail Use% Mounted on
dev             1.9G     0  1.9G   0% /dev
run             1.9G  9.5M  1.9G   1% /run
/dev/dm-0       559G  161G  391G  30% /
tmpfs           1.9G   32M  1.9G   2% /dev/shm
tmpfs           1.9G  224M  1.7G  12% /tmp
/dev/sda2       5.6G  243M  5.4G   5% /boot/efi
/dev/dm-0       559G  161G  391G  30% /var/cache
/dev/dm-0       559G  161G  391G  30% /var/log
/dev/dm-0       559G  161G  391G  30% /srv
/dev/dm-0       559G  161G  391G  30% /var/tmp
/dev/dm-0       559G  161G  391G  30% /home
/dev/dm-0       559G  161G  391G  30% /root
/dev/dm-1       916G  450G  420G  52% /mnt/Archieves
tmpfs           389M   80M  310M  21% /run/user/1000

Patience is everything :relieved:
Thanks a lot, TNE, sorry for making your life so difficult with this xDD

And yeah, it's on now, at least for my sudo user, so at least once a week this will pop up in my face and I won't forget those oldies : 3
