System Hang Prevention is killing my encoding!

My script and encoding process are using the memory. All of it. I put the laptop in Turbo mode and let it run all night. Memory usage is pretty stable as it processes the video, so if it starts fine, it should continue fine. Unless "system hang prevention" has a say in it.

And if one of the plugins has a bug and it really freezes, then I'd want to know.

In "Garuda Assistant", "Settings" tab, I see a "Nohang enabled" checkbox.
I don't really know but I'd put my bet there.
GitHub - hakavlad/nohang: A sophisticated low memory handler for Linux

Sounds like you have a memory leak in your script if it slowly uses up your memory.

Disabling nohang lets it get about twice as far before being killed again.

Memory usage is pretty high, but that's on script start. It remains stable.

I'm wondering whether zram can cause issues by letting VapourSynth think it can use a lot more space for cache. I'm playing around with cache settings. But anyway, memory usage doesn't climb; it just gets killed at some point. Perhaps on spikes; it goes up and down in spikes.
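A rough sketch of why zram can mislead here (all numbers illustrative, and this is not a VapourSynth API): zram swap is stored compressed *in* RAM, so `MemTotal + SwapTotal` overstates how much data actually fits before the system starts thrashing.

```python
# Sketch: zram swap lives in RAM (compressed), so MemTotal + SwapTotal
# overstates the real budget. Sample numbers below are illustrative.

def apparent_vs_safe_mb(meminfo: str) -> tuple:
    """Return (apparent, conservative) memory budgets in MiB from /proc/meminfo text."""
    vals = {}
    for line in meminfo.splitlines():
        key, _, rest = line.partition(":")
        if rest.strip():
            vals[key] = int(rest.split()[0]) // 1024  # kB -> MiB
    apparent = vals["MemTotal"] + vals["SwapTotal"]
    # Assume ~2:1 zram compression: SwapTotal MiB of zram swap can occupy
    # up to SwapTotal/2 MiB of real RAM at worst.
    conservative = vals["MemTotal"] - vals["SwapTotal"] // 2
    return apparent, conservative

sample = "MemTotal:       16384000 kB\nSwapTotal:       8192000 kB\n"
print(apparent_vs_safe_mb(sample))  # (24000, 12000)
```

So a cache sized against the "apparent" total could be roughly double what the machine can really sustain.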

Perhaps if you disable zram temporarily it will give you enough cushion to finish your operation. However, if, as suggested, you have a memory leak, it may still exhaust your RAM.

Posting your script might help.

BTW, please do not post pictures of textual outputs.


He needs the zram to increase the processing resolution. How does zram impact a heavy load?

@Hanuman I would bet on those spikes too.
Perhaps rather than disabling it altogether, try to see if you can configure it for your purpose.

Customizing corrective actions: if the name or control group of the victim matches a certain regex pattern, you can run any command instead of sending the SIGTERM signal
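For what it's worth, a sketch of what that could look like in nohang's config. I'm not sure of the exact option names in the version Garuda ships, so treat every line below as an assumption and check the comments in the shipped /etc/nohang/nohang.conf for the real syntax:

```
# ILLUSTRATIVE ONLY: option names and syntax may differ between nohang
# versions; see /etc/nohang/nohang.conf for the authoritative format.
# The idea from the docs: if the victim's name matches a regex, run a
# command instead of sending SIGTERM (here: pause the encoder instead
# of killing it, so you can decide what to do).
$SOFT_ACTION_RE_NAME ^(vspipe|ffmpeg)$ /// kill -STOP $PID
```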


Unless I am mistaken, zram is only half of your system RAM.

Now I'm playing with the thread count of both the VPY script and FFmpeg, and with max_cache_size.

What's strange is that I was running the same script on Windows on 5K footage with x265 encoding. That was hungry. Now it's only 2.7K with x264 encoding.

It starts at about 2.6 fps and then steadily goes down to ~0.8 fps when it crashes. Now I see that "swp" in htop was in the red. If I reduce the thread count, swp has some room and it should avoid the crashes, but at lower performance. On Windows I ran 5K footage with x265 at about 0.5 fps.
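The thread-count observation can be made concrete with a toy model (all numbers are illustrative guesses, not measurements from VapourSynth or FFmpeg): each filter thread holds its own working frames, so peak memory grows roughly linearly with thread count.

```python
# Toy model: peak memory ~= base footprint + per-thread working set.
# All numbers below are illustrative guesses, not measurements.

def max_threads(budget_mb: int, base_mb: int, per_thread_mb: int) -> int:
    """Largest thread count whose estimated peak fits in budget_mb (at least 1)."""
    if per_thread_mb <= 0:
        raise ValueError("per_thread_mb must be positive")
    return max(1, (budget_mb - base_mb) // per_thread_mb)

# e.g. ~12 GiB usable, ~2 GiB base footprint, ~1.2 GiB per thread:
print(max_threads(12000, 2000, 1200))  # 8
```

Under those made-up numbers, 10 threads would already overshoot the budget, which matches the pattern of spiking into swap and getting killed.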

This is my VapourSynth script, and the command I run it with:

vspipe -c y4m Test.vpy - | ffmpeg -i pipe: -crf 14 -preset slower -color_range pc -threads 2 Video1.mp4

import vapoursynth as vs
import xClean as xc

core = vs.core
#core.max_cache_size = 3000  # in MiB
core.num_threads = 10

source = "/home/hanuman/VideoCreation/FringeMinorityBoostMeditation/Source/GH010373.MP4"
clip = core.lsmas.LibavSMASHSource(source=source)
clip = xc.xClean(clip, sharp=7.7, m1=.60, m2=2.85, h=2.8, outbits=10, gpuid=1, gpucuda=0, downchroma=False)
clip.set_output()  # required for vspipe to read the result
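Another angle, instead of disabling nohang: give the pipeline its own memory ceiling so the kernel throttles and reclaims from it before the whole system is in trouble. A sketch assuming systemd (MemoryMax and MemorySwapMax are systemd resource-control properties; the 12G/4G limits are guesses you would tune to your machine):

```
systemd-run --user --scope -p MemoryMax=12G -p MemorySwapMax=4G \
  sh -c 'vspipe -c y4m Test.vpy - | ffmpeg -i pipe: -crf 14 -preset slower -color_range pc -threads 2 Video1.mp4'
```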

Do you think disabling zram would help?


It would seem you will need to lower the requirements of the operation you are performing. Perhaps using fewer threads would allow the operation to complete, although it would take much longer.

OK, got it running stable at 1.1 fps... System Monitor reports 20-50% CPU usage. I'll let this script run (4.5 h), then see if I can do better. Disable zram, perhaps.

Investigating how to disable zram before the next video (this one will actually take 9 h). Just a yes/no: do I just need to run this to disable it, and then reboot to re-enable?

swapoff /dev/zram0
rmmod zram

I would think that would do it, but I'm not positive as I've never had cause to terminate it before.

You also might want to stop the zram.service.
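Putting the pieces together, the full stop sequence might look like this; a sketch, assuming the zram-generator setup with its templated unit (the unit name could differ on your install, so check first):

```
systemctl list-units '*zram*'                       # find the actual unit name
sudo systemctl stop systemd-zram-setup@zram0.service
sudo swapoff /dev/zram0                             # frees the compressed pages
sudo rmmod zram                                     # optional; the module can reload on demand
# re-enable with: sudo systemctl start systemd-zram-setup@zram0.service (or reboot)
```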

What are you encoding that takes 9 hrs :rofl:

Denoising raw clips... they're too noisy due to lack of lighting, and it does a very good job. Normally it would take 4.5 h per 9-minute clip segment in 5K... but instead, it's now taking 9 h per 9-minute segment in 2.7K at 60 fps. Shouldn't be that slow.

You do know there are easier ways of doing this. What output file format are you trying to get?

Lossless output would be way too huge, and x265 requires too much processing for an intermediary file. Doing x264 now.

What is the size of the source file, and its format?

It seems that no matter the format, it creates 4 GB files in 9-minute segments, whether for 30 fps 5K or 60 fps 2.7K. GoPro h265 has bugs causing decoding errors, so now I've switched back to h264.

OK, so they are not raw files, they are GoPro files?
What output size/quality are you after?
You say h264/h265 to be able to view on, say, YouTube?
Edit: why do you think it creates a 4 GB file of only 9 minutes?

It will be for YouTube. Still tweaking the way I configure it. I want above 1K... but 2.7K is only available at 60 fps, not 30 fps, and that's not suitable indoors in low light. Otherwise it's 1K or 4K at 30 fps. It seems to encode at a standard bitrate no matter the settings, with an option to increase the bitrate even further. Fine, I just need to re-encode them afterwards. Then, as I'm doing the montage, there seem to be 1 or 2 frames of offset when switching videos, compared to the separate audio track...

Then it has the wrong VUI tag, saying it's TV range when it's PC range; the video time doesn't account for the time zone; h265 encodes with errors...

Kind of getting off-topic.