Decoder failure: Out of memory
Forum rules
Help us help you:
- Are you using the latest stable version of SABnzbd? See the Downloads page.
- Tell us what system you run SABnzbd on.
- Adhere to the forum rules.
- Do you experience problems during downloading?
Check your connection in the Status and Interface settings window.
Use Test Server in Config > Servers.
We will probably ask you to do a test using only basic settings.
- Do you experience problems during repair or unpacking?
Enable +Debug logging in the Status and Interface settings window and share the relevant parts of the log here using [ code ] sections.
Re: Decoder failure: Out of memory
Of course, sometimes people repost stuff months later.
If this is causing it, I'll have to write a rotation script so anything older than, say, a year gets moved to a subdir or something. I'll wait and see first, though.
Re: Decoder failure: Out of memory
You can disable this behavior by going to Config > Specials and unchecking backup_for_duplicates.
But then it will only use History, so don't empty it if you want it to keep working half a year later.
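For reference, this appears to map to a single switch in sabnzbd.ini (the exact section is my assumption; stop SABnzbd and back up the file before editing it by hand):

[code]
# sabnzbd.ini -- assumed location; the [misc] section holds the Specials
[misc]
backup_for_duplicates = 0
[/code]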
Re: Decoder failure: Out of memory
As mentioned, I'd prefer it to keep checking for duplicates; I'll take a shorter list to check against over none at all (so if this turns out to be the problem, I'll write a script for it).
For troubleshooting purposes, does unchecking that setting prevent the backup dir from being parsed at all, or does it only stop comparing queue items against that dir (while still loading everything)?
Re: Decoder failure: Out of memory
The file-based duplicate check only looks for whether the filename exists, which should not be very heavy. At first I thought it might compute an MD5, but it doesn't.
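For anyone curious about the cost difference, here is a minimal sketch of the two approaches (illustrative only, not SABnzbd's actual code; backup_dir and the function names are made up):

[code]
import hashlib
import os

def is_duplicate_by_name(backup_dir, nzb_name):
    # Cheap check: one stat() call to see whether the filename exists.
    return os.path.exists(os.path.join(backup_dir, nzb_name))

def md5_of_file(path, chunk_size=64 * 1024):
    # The hypothetical expensive variant: read and hash the whole file.
    digest = hashlib.md5()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()
[/code]

A name-only lookup costs a single filesystem call per job no matter how big the backup folder grows, which is why it shouldn't be the memory hog here.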
Re: Decoder failure: Out of memory
The nzb backup dir was probably coincidental: sometimes it works for a few jobs, sometimes it locks up straight away again... I noticed a different pattern, though:
They're all stuck at 88-90%, despite different sizes. Is there a specific process that kicks in at that moment?
These are the queue options that are enabled right now:
- Only Get Articles for Top of Queue (just enabled this one, same result)
- Check before download
- Abort jobs that cannot be completed
- Detect Duplicate Downloads: Pause
- Allow proper releases
- Download all par2 files
- Enable SFV-based checks
- Post-Process Only Verified Jobs
- Enable recursive unpacking
- Ignore any folders inside archives
- Ignore Samples
- Use tags from indexer
- Cleanup List: srr, nzb
- History Retention: Keep all jobs
Hardware-wise, I've ruled out memory issues with memtest86 and did an HDD surface scan as well.
Re: Decoder failure: Out of memory
It gets even more puzzling... At least two of the 89% downloads in that screenshot are in the download dir: one fully downloaded and not in need of repair, the other just missing an .sfv file that took a split second to restore. When I resume these downloads in SAB, the restored one jumps to 95%, and then everything I start (including the restored one) eventually produces the 'out of memory' error again.
I did notice a lot of page faults (1.9M!) in Process Explorer, even though I restarted SAB a few times this morning:
I've sampled a few other running processes, and they only display a small fraction of these. As mentioned, memtest86 ran a full pass without errors, and so far anything that gets stuck in SAB I can download without any issues with NZBget on the very same machine.
Anything else I can test / try?
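One way to separate steady memory growth from page-fault noise would be a small watcher like this (a sketch assuming the psutil package is installed; the process-name hint is a guess, adjust it to match your install):

[code]
import time

import psutil

def watch_sabnzbd(interval=30, name_hint="sabnzbd"):
    # Every `interval` seconds, print the resident memory (working set)
    # of any process whose name contains the hint.
    while True:
        for proc in psutil.process_iter(["name", "memory_info"]):
            name = (proc.info["name"] or "").lower()
            if name_hint in name:
                rss_mb = proc.info["memory_info"].rss / (1024 * 1024)
                print(f"{time.strftime('%H:%M:%S')} pid={proc.pid} "
                      f"rss={rss_mb:.1f} MB")
        time.sleep(interval)

watch_sabnzbd()
[/code]

If the resident size climbs steadily toward the moment a job hits ~90%, that would point at a leak rather than a one-off oversized allocation.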
Re: Decoder failure: Out of memory
There's no special process at 90%; it could be something specific in the files that triggers the segfaults. Although then I'd expect a crash, not for it to keep running.
It's a bit hard to diagnose when there are many jobs. If we can reproduce it with just 1 job in the queue, it will be easier to dig through the logs.
Is there any improvement if you disable Check before download?
Re: Decoder failure: Out of memory
I have been experiencing the same issue that MiG first reported. This began recently - within the past month for me. I'm running macOS High Sierra on a Mac Mini with 8 GB RAM. Software version 2.3.9. A memory test said everything was ok.
The decoder failure issue does not happen with every download. Some downloads complete successfully, and others get the decoder failure. I haven't noticed a pattern yet. On 2019-04-17 MiG posted a log capture. The error on mine is almost identical - the only obvious difference I see on mine is that the extension "decoder.pyo" is "decoder.pyc" on mine. The issue will happen with only one download in the queue. Usually it will get to 90-95% complete and then present the "Decoder failure: out of memory" error.
Re: Decoder failure: Out of memory
Interesting
Re: Decoder failure: Out of memory
@arhyarion What do you have set for Article cache?
Re: Decoder failure: Out of memory
I can say with 100% certainty that this is a bug. I even changed servers: I ordered a new server from Hetzner and switched from Debian to Ubuntu 18 in order to change everything. I changed it because some admin told someone 'it's your memory, it's bad', even though I tested all of my memory in a 7-day test mode without a single error. Anyway, I changed hardware and ordered a new server with fresh hardware. I can't believe that SABnzbd needs more than 32 GB or 64 GB of RAM? I even switched from an i7 to a pure Xeon.
And I can tell you with 100% certainty that 'Decoder failure: Out of memory' is a bug. It started not long ago, somewhere in 2019 or at the end of 2018. Maybe it's tied to certain NZB files, which would mean an incompatibility with some NZB creator, or these are bad NZB files that work fine in other NZB downloaders, or it's just a straight bug in SABnzbd's par2/repair process or in fetching extra blocks; either way, it looks like something in the par2 part of SABnzbd. In my case, I never use the Check before download option.
But I would ask you to please fix this bug. It's very nasty because it stops the whole download process.
Re: Decoder failure: Out of memory
The only thing I can imagine is that somebody somehow created NZB files that have huge article sizes listed inside the NZB.
Do you have an NZB for which this happened?
Otherwise there is just nothing I can do. I use the lowest-level C call possible (malloc) to ask the system for X bytes of memory, and if it doesn't give them to us, we have a problem. That code hasn't changed in years.
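To test the oversized-article theory on a specific NZB, a quick inspection script could look like this (a sketch, not SABnzbd code; it assumes the standard NZB XML format, where each <segment> element carries a bytes attribute):

[code]
import xml.etree.ElementTree as ET

NZB_NS = "{http://www.newzbin.com/DTD/2003/nzb}"

def largest_segments(nzb_path, top=5):
    # Collect the declared byte size of every article segment in the NZB.
    tree = ET.parse(nzb_path)
    sizes = sorted((int(seg.get("bytes", 0))
                    for seg in tree.iter(NZB_NS + "segment")),
                   reverse=True)
    return sizes[:top]

# Usenet articles are normally well under 1 MB, so wildly larger values
# here would back up the oversized-article theory.
print(largest_segments("suspect.nzb"))
[/code]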
Re: Decoder failure: Out of memory
What's crazy is that when I tried that same NZB on the old server, it downloaded successfully, and I also tried the same NZB with alt.bin on Windows and it downloaded OK. On the new server it errored four times and downloaded OK twice. It seems that when the queue is empty it downloads fine, and when there are a few things in the queue it can get the error.
Re: Decoder failure: Out of memory
I was asked to remove the NZB; I can send it by PM if needed.
Stupid MediaFire banned the file; here it is on Mega.
Re: Decoder failure: Out of memory
I downloaded that NZB with SABnzbd ... and no problem at all: a nice .mp4 in the resulting directory, and no problems in the GUI or the log.