
Speed Problem when downloading a big NZB

Posted: January 13th, 2011, 9:07 am
by marc_al
Hello,

I have a strange problem when downloading big NZB files (35-40 GB of content, posted as 100 MB RAR files).
At first the speed is normal (the same as for an NZB whose content is 4-5 GB). As soon as I have downloaded 10-12 GB, SABnzbd becomes really slow.
Every 20-25 MB SABnzbd stops downloading: the reported speed still looks normal, but no articles are actually downloaded. After 2-3 minutes it starts again, downloads another 20-25 MB, then stops, and so on (sometimes it stops with 5 MB of a file remaining, and after a few minutes it downloads those last few megabytes to complete the file and then continues).
The further I get into the NZB, the slower it becomes.
After 20-25 GB, nothing happens for hours. The only way out is a kill -9 (but then I lose the queue).

If I split the NZB so that each part covers 9-10 GB and add the resulting 4-5 NZB files to the queue, there is no problem (it is still a little slower than splitting into 5 GB parts, but not by much, so I can live with it). The problem really depends on the size of a single queue item, not on the total.
I have no idea what is wrong. I have seen behaviour like this in two other cases: a program that rewrote a whole file to disk from the beginning every time something was added, so that after a while each save took so long it appeared stuck; and a program that read records one at a time from a database table with no index and a lot of records.
Restarting SABnzbd doesn't solve the problem: it is immediately slow again. Only removing the NZB from the queue helps. Pausing SABnzbd takes a few minutes before SABnzbd is responsive again, and as soon as I resume, the problem is back.
Unfortunately I know nothing about Python.

Things I have tried so far:
In November I completely reinstalled my NAS (I had to, in order to use ext4). Nothing changed.
I have cleaned the queue as explained for another slowdown problem (even though I use no backup server).
I have also disabled the backup server (before importing the big NZB).


The server running SABnzbd is a Synology DS210j. It is a very small server (128 MB RAM and an 800 MHz processor), but there is only a problem with one big NZB; 15 NZBs of 5-6 GB each download without any problem.

Can you please tell me what I can do?
Is there a tool for splitting an NZB in SABnzbd, or an auto-split feature that could help me? (See the splitting sketch below.)
I have not set a cache size.
Not using the UI changes nothing. For example, it says it will be done in 2 hours, and after 5-6 hours it is still not finished.
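
For reference, a rough sketch of what an external splitter could look like (just an illustration, not part of SABnzbd: it assumes a standard NZB 1.1 file, splits on whole file entries, and the chunk size and output file names are arbitrary; RAR sets that straddle a split point still have to be post-processed together by hand):

import sys
import xml.etree.ElementTree as ET

NZB_NS = "http://www.newzbin.com/DTD/2003/nzb"
ET.register_namespace("", NZB_NS)

def split_nzb(path, chunk_gb=8):
    """Write <name>.partNN.nzb files of roughly chunk_gb each."""
    limit = chunk_gb * 1024 ** 3
    root = ET.parse(path).getroot()
    chunks, current, size = [], [], 0
    for f in root.findall("{%s}file" % NZB_NS):
        # Estimate the file's size from the advertised article sizes.
        fsize = sum(int(s.get("bytes", "0"))
                    for s in f.findall(".//{%s}segment" % NZB_NS))
        if current and size + fsize > limit:
            chunks.append(current)
            current, size = [], 0
        current.append(f)
        size += fsize
    if current:
        chunks.append(current)
    base = path[:-4] if path.lower().endswith(".nzb") else path
    for i, chunk in enumerate(chunks, 1):
        out_root = ET.Element("{%s}nzb" % NZB_NS)
        out_root.extend(chunk)
        out = "%s.part%02d.nzb" % (base, i)
        ET.ElementTree(out_root).write(out, encoding="utf-8", xml_declaration=True)
        print("wrote %s (%d files)" % (out, len(chunk)))

if __name__ == "__main__":
    split_nzb(sys.argv[1])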

Thank you
Marc

Re: Speed Problem when downloading a big NZB

Posted: January 13th, 2011, 10:47 am
by shypike
SABnzbd keeps all meta-data about an active NZB in memory.
As the download progresses it needs to keep more in memory.
With 128M this will become a problem after some time.
What I find surprising is that the problem becomes gradually worse.
I had expected a sudden decrease of speed as soon as the big NZB starts to download.
We do very little testing with memory constraints.
Generally we consider 128M to be too little for normal usage.
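As a back-of-the-envelope illustration of the scale involved (all numbers below are assumptions for illustration, not measurements of SABnzbd internals):

job_gb = 40                 # size of the NZB content
article_kb = 500            # assumed average article payload
per_article_bytes = 1500    # assumed Python object/string overhead per article
articles = job_gb * 1024 * 1024 // article_kb
print("articles in the job: %d" % articles)
print("metadata alone: ~%d MB" % (articles * per_article_bytes // (1024 * 1024)))

On those assumptions the per-article bookkeeping alone is already close to the 128M of RAM, before the article cache, the web interface and the rest of the system are counted.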

I'll look into this, but it may take a while.

Re: Speed Problem when downloading a big NZB

Posted: January 14th, 2011, 12:47 am
by marc_al
Hello,

I don't know if it is a memory problem or a processor problem, because after a while the NAS becomes really unresponsive. Accessing a file on the NAS can also take a few minutes once SABnzbd has downloaded more than 13-14 GB.
I will try to see what the real problem is (RAM or processor speed) with the tool from Synology.
Is there a reason why you need to keep all the metadata in memory?
I was thinking that you did something like this: read the NZB file completely and create one temp file with everything needed for each file in the NZB (so if the NZB contains 500 files you have 500 temp files).
Then process the temp files one by one: read the content of the temp file and download the articles that make up the file (across multiple servers, depending on the servers). When a file is completely downloaded, decompress it and remove its temp file.
I don't understand why you need more memory as the download progresses, since there is less and less left to download. Shouldn't it be enough to remember that File1 is downloaded, File2 is downloaded, File3 is downloading, File4 is next, and so on?
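
A hypothetical sketch of that flow (this is just the idea as described above, not how SABnzbd actually works; fetch_article and assemble are placeholder functions):

import json, os

def spool(nzb_entries, workdir):
    # Write one small metadata file per file in the NZB and keep only the paths.
    paths = []
    for i, entry in enumerate(nzb_entries):   # entry: {"name": ..., "segments": [...]}
        p = os.path.join(workdir, "file_%05d.json" % i)
        with open(p, "w") as fh:
            json.dump(entry, fh)
        paths.append(p)
    return paths

def run(paths, fetch_article, assemble):
    # Process the spool files one by one, so only the metadata of the file
    # currently being downloaded is held in memory.
    for p in paths:
        with open(p) as fh:
            entry = json.load(fh)
        parts = [fetch_article(seg) for seg in entry["segments"]]
        assemble(entry["name"], parts)   # decode/write the finished file
        os.remove(p)                     # done with this file's metadata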


Thank you
Marc

Re: Speed Problem when downloading a big NZB

Posted: January 14th, 2011, 3:17 am
by shypike
Swapping will slow your system down.
It slows down SABnzbd and the file server software.
Swapping is probably the explanation.
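
A quick way to check whether the box is actually swapping (plain Linux /proc/meminfo, nothing SABnzbd-specific):

def meminfo():
    info = {}
    with open("/proc/meminfo") as fh:
        for line in fh:
            key, rest = line.split(":", 1)
            info[key] = int(rest.strip().split()[0])   # values are in kB
    return info

m = meminfo()
print("MemFree:   %d kB" % m["MemFree"])
print("Swap used: %d kB" % (m["SwapTotal"] - m["SwapFree"]))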

The source code is available here: http://bazaar.launchpad.net/~sabnzbd/sa ... .6.x/files
So, go ahead.

Re: Speed Problem when downloading a big NZB

Posted: January 14th, 2011, 9:24 am
by marc_al
Hello,

You are right, swapping is the problem.
The processor is at 45-50% and memory usage is at 70% max.
The swap file is full.
The top command gives:

PID  PPID USER     STAT   VSZ %MEM %CPU COMMAND
3090     1 root        S     479m   389.9  0.0 /var/packages/sab2/target/utils/bin/python /var/packages/sab2/target/...

EDIT: Removed the question about the editor. I didn't see the homepage; I had gone directly to the forum address.

Thank you
Marc

Re: Speed Problem when downloading a big NZB

Posted: January 14th, 2011, 2:31 pm
by shypike
The current design is not optimized for memory.
It can certainly be improved.
But it's one of a hundred other things that could be done better or added.

Re: Speed Problem when downloading a big NZB

Posted: January 25th, 2011, 1:26 pm
by qdejv
I hit the same bug: I can't download anything bigger than ~28 GB in a single NZB. The cache limit is set to 768 MB on an otherwise idle machine with 1 GB of RAM.
If the bug cannot be easily fixed, could some workaround be made to post-process multiple NZBs as one?

Re: Speed Problem when downloading a big NZB

Posted: January 25th, 2011, 4:24 pm
by shypike
qdejv wrote: I hit the same bug: I can't download anything bigger than ~28 GB in a single NZB. The cache limit is set to 768 MB on an otherwise idle machine with 1 GB of RAM.
If the bug cannot be easily fixed, could some workaround be made to post-process multiple NZBs as one?
On which OS are you running SABnzbd?

Re: Speed Problem when downloading a big NZB

Posted: January 26th, 2011, 10:22 am
by qdejv
XP SP3

Re: Speed Problem when downloading a big NZB

Posted: January 26th, 2011, 11:24 am
by shypike
Weird, never seen this, not even with 45G downloads.
I'll try it on a more constrained system, but it may take a while.

Re: Speed Problem when downloading a big NZB

Posted: March 15th, 2011, 10:49 am
by simoncm
Hi All

I have the same problem on a Synology 410j (128MB RAM) :\

What appears to happen is that all (or at least most) of the articles that make up one RAR part are downloaded, and then everything 'stops' while a cache file labelled "SABnzbd_nzo__6wQdt" is updated (I'm not sure what goes into this file). During this time SABnzbd does not download any data and is unresponsive on the web interface etc., but there does not appear to be excessive CPU or memory usage.

After a few minutes (the delay gets longer the further through the NZB file you get) SABnzbd resumes, promptly creates the new RAR part in the "download" folder, deletes the cached articles, and resumes downloading new articles.

Each time it's the same file, "SABnzbd_nzo__6wQdt", that is being updated.
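
If that cache file is rewritten in full after every finished RAR part and keeps growing as the job progresses, the total amount of rewriting grows quadratically, which would match the delays getting longer. A purely illustrative calculation (all numbers made up):

parts = 400            # e.g. a 40 GB job posted as 100 MB RAR parts
mb_per_part = 0.1      # assumed growth of the state file per finished part
rewrites = [i * mb_per_part for i in range(1, parts + 1)]
print("last rewrite: %.0f MB" % rewrites[-1])
print("total rewritten over the job: %.1f GB" % (sum(rewrites) / 1024.0))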

It is solved by breaking up the NZB file into 5-8 GB chunks.

In getting to this point I've tried all the usual restarts, clearing of the cache, etc., and it changes nothing.

Any help appreciated :)

Re: Speed Problem when downloading a big NZB

Posted: March 15th, 2011, 10:55 am
by shypike
128M is very little and SABnzbd uses a relatively large amount of memory,
especially for very large downloads.
We need to work on this, but haven't started yet.

Re: Speed Problem when downloading a big NZB

Posted: March 15th, 2011, 11:16 am
by simoncm
Hi shypike

thanks for your quick response.

I understand that 128MB is quite a small amount of memory. I just wanted to let you guys know what I'd found when trying to deal with this problem to see if you could provide any further insights.

Thanks

Re: Speed Problem when downloading a big NZB

Posted: September 11th, 2011, 7:36 pm
by deiniol39
Just wondering if there has been any update on the memory issue. This is one of the main reasons I bought a Synology box: you can run SAB, Sick Beard and CouchPotato on it. But this download speed problem is a pain.

Re: Speed Problem when downloading a big NZB

Posted: September 12th, 2011, 5:31 am
by shypike
And you have the same limited amount of memory?
We're indeed still looking for memory leaks and for ways to reduce memory usage.
But 128M will never be a realistic target.
Especially not with three Python programs running.