High Memory Usage / Potential Memory Leaks Investigation Thread

sander
Release Testers
Posts: 9062
Joined: January 22nd, 2008, 2:22 pm

Re: High Memory Usage / Potential Memory Leaks Investigation Thread

Post by sander »

In SABnzbd, I've created a thread that prints out the memory usage as seen by heapy once a minute. The strange thing: during a download, the memory usage reported by heapy only goes from 8.5 MB to 12.8 MB. FYI: top -bn1 reports "248m 42m" as VIRT and RES memory usage.
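
For reference, a minimal sketch of what such a dump thread can look like (my reconstruction, not the exact code used here; the file name and the 60-second interval are assumptions, but hpy() and heap() are the real guppy/heapy API):

Code: Select all

import threading
import time

from guppy import hpy  # heapy is part of the guppy package


def dumpthread():
    """Append a heapy summary of all live objects to a file, once a minute."""
    h = hpy()
    while True:
        f = open('/tmp/sabnzbd.dumpheap.output.txt', 'a')
        print >>f, "dumpthread active!"
        print >>f, h.heap()  # partition of live objects by type
        print >>f, "... and dumpthread is sleeping again"
        f.close()
        time.sleep(60)

t = threading.Thread(target=dumpthread)
t.setDaemon(True)  # don't keep SAB alive on shutdown
t.start()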

I also tried a "print dir()" in that separate thread to see all variables, but only the thread's local variables are visible, not the variables used in other parts of SAB.

So not very useful results so far.

Are the SAB devs still interested in this case, or am I the only one ... ?

Code: Select all

dumpthread active!
Partition of a set of 122643 objects. Total size = 8592560 bytes.
 Index  Count   %     Size   % Cumulative  % Kind (class / dict of class)
     0  58991  48  3375316  39   3375316  39 str
     1  29165  24  1146728  13   4522044  53 tuple
     2    320   0   645248   8   5167292  60 dict of module
     3   8362   7   568616   7   5735908  67 types.CodeType
     4    838   1   528112   6   6264020  73 dict (no owner)
     5   8151   7   456456   5   6720476  78 function
     6    593   0   326408   4   7046884  82 dict of class
     7    604   0   270064   3   7316948  85 type
     8    604   0   236320   3   7553268  88 dict of type
     9   1897   2   202272   2   7755540  90 unicode
<352 more rows. Type e.g. '_.more' to view.>
Succeeded, after x attempts: 0
... and dumpthread is sleeping again

and

Code: Select all

dumpthread active!
Partition of a set of 149395 objects. Total size = 12281196 bytes.
 Index  Count   %     Size   % Cumulative  % Kind (class / dict of class)
     0  71747  48  5273024  43   5273024  43 str
     1  29199  20  1150964   9   6423988  52 tuple
     2   2345   2  1113800   9   7537788  61 dict of sabnzbd.nzbstuff.Article
     3    945   1   658440   5   8196228  67 dict (no owner)
     4    326   0   651824   5   8848052  72 dict of module
     5   8385   6   570180   5   9418232  77 types.CodeType
     6   8173   5   457688   4   9875920  80 function
     7    602   0   327632   3  10203552  83 dict of class
     8    611   0   273200   2  10476752  85 type
     9    611   0   238424   2  10715176  87 dict of type
<363 more rows. Type e.g. '_.more' to view.>
Succeeded, after x attempts: 0
... and dumpthread is sleeping again

shypike
Administrator
Posts: 19774
Joined: January 18th, 2008, 12:49 pm

Re: High Memory Usage / Potential Memory Leaks Investigation Thread

Post by shypike »

The "guardian" runs every 30 seconds (although not exactly reproducible).
See sabnzbd/__init__.py, function check_all_tasks (near the bottom of the file).

Or you can just use the scheduler.
Open sabnzbd/scheduler.py and add this call near the end of the init() function (look for the string cfg.RSS_RATE.callback(schedule_guard)):
       __SCHED.add_daytime_task(action, 'heapy', range(1, 8), None, (h, m), kronos.method.sequential, None, None)
where action = the name of the parameterless function to be called, h = hour, m = minute.

The first method will likely give the best results, because it's the main thread.
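
Putting that together, a sketch of what the scheduler variant could look like (the heapy_dump function, its file name, and the 12:00 run time are my assumptions; the add_daytime_task arguments follow the line quoted above):

Code: Select all

from guppy import hpy


def heapy_dump():
    """Parameterless action: append a heapy summary to a log file."""
    f = open('/tmp/sabnzbd.heapy.txt', 'a')
    print >>f, hpy().heap()
    f.close()

# Near the end of init() in sabnzbd/scheduler.py:
# run heapy_dump on every day of the week (1-7) at 12:00.
__SCHED.add_daytime_task(heapy_dump, 'heapy', range(1, 8), None, (12, 0),
                         kronos.method.sequential, None, None)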
shypike
Administrator
Posts: 19774
Joined: January 18th, 2008, 12:49 pm

Re: High Memory Usage / Potential Memory Leaks Investigation Thread

Post by shypike »

sander wrote: Are the SAB devs still interested in this case, or am I the only one ... ?
I'm interested and it's great that you are giving it so much attention.
However, in the coming three weeks I have 0 time for this.
sander
Release Testers
Posts: 9062
Joined: January 22nd, 2008, 2:22 pm

Re: High Memory Usage / Potential Memory Leaks Investigation Thread

Post by sander »

shypike wrote: The "guardian" runs every 30 seconds (although not exactly reproducible).
See sabnzbd/__init__.py, function check_all_tasks (near the bottom of the file).

I did this:

Code: Select all

def check_all_tasks():
    """ Check every task and restart safe ones, else restart program
        Return True when everything is under control
    """
    if __SHUTTING_DOWN__ or not __INITIALIZED__:
        return True

    # Debug: dump whatever names dir() can see on each guardian pass
    f = open('/tmp/sabnzbd.dumpheap.output2.txt', 'a')
    print >>f,"check_all_tasks is active!"
    print >>f,dir()
    print >>f,"... and check_all_tasks is done again"
    f.close()


    # Non-restartable threads, require program restart
However, dir() again only returns the local names (as also stated in http://docs.python.org/library/functions.html):

Code: Select all

check_all_tasks is active!
['f']
... and check_all_tasks is done again
check_all_tasks is active!
['f']
... and check_all_tasks is done again



So only the file descriptor f is known and reported. :-(
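
A sketch of what does work instead: point dir() at a module object, or let the garbage collector enumerate everything (the bare 'sabnzbd' module is an example target; the Article class is the one from the heapy output above):

Code: Select all

import gc
import sys

from sabnzbd.nzbstuff import Article

f = open('/tmp/sabnzbd.dumpheap.output2.txt', 'a')

# dir(module) lists that module's globals, unlike a bare dir():
print >>f, dir(sys.modules['sabnzbd'])

# Count live instances of a suspect class via the garbage collector.
# gc only tracks container objects, but class instances qualify.
articles = [o for o in gc.get_objects() if isinstance(o, Article)]
print >>f, "live Article objects:", len(articles)
f.close()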
shypike
Administrator
Posts: 19774
Joined: January 18th, 2008, 12:49 pm

Re: High Memory Usage / Potential Memory Leaks Investigation Thread

Post by shypike »

I thought you were using heapy to get info.
dir() is indeed not going to tell you anything.
Alas, memory profiling with Python sucks, big time.

I have looked at this before and I cannot find any true memory leaks.
It looks as though a) the garbage collector behaves differently across platforms and Python versions,
and b) on some platforms/versions Python refuses to return memory to the OS.
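
Point b) is easy to demonstrate on Linux (a standalone sketch, not SABnzbd code; CPython 2.x tends to keep freed small-object arenas around for reuse instead of handing them back):

Code: Select all

import gc


def rss_kb():
    """Current resident set size in kB, read from /proc (Linux only)."""
    for line in open('/proc/self/status'):
        if line.startswith('VmRSS:'):
            return int(line.split()[1])

print "before:       ", rss_kb()
blob = [str(i) * 10 for i in xrange(2000000)]  # a few hundred MB of small strings
print "allocated:    ", rss_kb()
del blob
gc.collect()
print "after del+gc: ", rss_kb()  # often stays far above "before"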

After my holidays I'll pick this up again.
sander
Release Testers
Posts: 9062
Joined: January 22nd, 2008, 2:22 pm

Re: High Memory Usage / Potential Memory Leaks Investigation Thread

Post by sander »

shypike wrote: I thought you were using heapy to get info.

As reported an hour or so ago, heapy only shows 8-12 MB of memory usage. Strange.

Maybe it's Python itself (not SAB) that's using all that memory ...
sander
Release Testers
Posts: 9062
Joined: January 22nd, 2008, 2:22 pm

Re: High Memory Usage / Potential Memory Leaks Investigation Thread

Post by sander »

OK, I have found a way to limit SAB's/Python's memory usage: open a terminal and type

Code: Select all

ulimit -d 500111 -m 500111 -v 500111
and then, in that terminal, start SABnzbd.py.

FYI: setting the limits too low (like 100111, i.e. roughly 100 MB) will cause problems with SABnzbd.


Memory usage then stays limited to "422m 138m" (VIRT/RES). See the included screendump: on the right, ulimit is at work. ;-)

Code: Select all

sander@athlon64:~$ ulimit -d 500111 -m 500111 -v 5001111
sander@athlon64:~$ ulimit -a
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) 500111
scheduling priority             (-e) 20
file size               (blocks, -f) unlimited
pending signals                 (-i) 16382
max locked memory       (kbytes, -l) 64
max memory size         (kbytes, -m) 500111
open files                      (-n) 1024
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) unlimited
virtual memory          (kbytes, -v) 5001111
file locks                      (-x) unlimited
sander@athlon64:~$
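
If you'd rather not depend on a wrapper shell, the same limits can be set from inside Python before SABnzbd starts its threads (a sketch using the stdlib resource module; the 500111 kB figure is just the value from above):

Code: Select all

import resource

LIMIT = 500111 * 1024  # ulimit counts kB, setrlimit counts bytes

# Rough equivalents of ulimit -d (data segment) and -v (virtual memory).
# RLIMIT_RSS would match -m, but modern Linux kernels ignore it.
for rlim in (resource.RLIMIT_DATA, resource.RLIMIT_AS):
    resource.setrlimit(rlim, (LIMIT, LIMIT))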
loopdemack
Jr. Member
Posts: 54
Joined: April 26th, 2008, 2:22 pm

Re: High Memory Usage / Potential Memory Leaks Investigation Thread

Post by loopdemack »

@Sander, tell me one thing: is 500111 the lowest value that gave you stability, or did you try lower values like 400111, 300111, or even 200111?
sander
Release Testers
Posts: 9062
Joined: January 22nd, 2008, 2:22 pm

Re: High Memory Usage / Potential Memory Leaks Investigation Thread

Post by sander »

loopdemack wrote: @Sander, tell me one thing: is 500111 the lowest value that gave you stability, or did you try lower values like 400111, 300111, or even 200111?
I did indeed try lower values (I believe 100111 and 200111), which resulted in strange (startup) errors. You should try it yourself, and also check which parameter (-d, -m and/or -v) is actually needed...

Remember: it seems that once you've set a limit, you can't raise it again in that shell. So close that terminal, open another one, and set the new value there.
lennardw
Newbie
Posts: 23
Joined: July 10th, 2010, 4:32 am

Re: High Memory Usage / Potential Memory Leaks Investigation Thread

Post by lennardw »

I have the same issue.
I found that this only occurs if the rar files I'm downloading (not unpacking or par-checking) are larger than roughly 128 MB. Smaller rar files give no issues whatsoever.

I have 2 GB of RAM available in a virtual machine.

My findings:
Downloading files smaller than roughly 128 MB:
downloading with 20 threads
no cache limit
post-processing going on...

Top memory usage seen: around 450-500 MB (and dropping at times)

Rar files larger than roughly 128 MB:
downloading with 6 threads
no other post-processing going on

Memory usage suddenly rises very quickly, and par-checking and/or unrarring becomes impossible due to the lack of available memory.
(It rises even faster if unrarring starts right after the download.)
Memory is hardly ever returned to the OS.

This is reproducible on my system. I hope this helps.
I'm using Python 2.6 in a virtual machine running CentOS 5.5.
elmer91
Newbie
Posts: 12
Joined: December 6th, 2009, 8:28 am

Re: High Memory Usage / Potential Memory Leaks Investigation Thread

Post by elmer91 »

Maybe this link can explain the SAB/Python memory behavior:

http://pushingtheweb.com/2010/06/python-and-tcmalloc/
elmer91
Newbie
Posts: 12
Joined: December 6th, 2009, 8:28 am

Re: High Memory Usage / Potential Memory Leaks Investigation Thread

Post by elmer91 »

I have done the following on my system (a Synology DS710+ NAS):

1) Compiled and installed tcmalloc myself
2) Modified my SAB startup script to force Python to use tcmalloc:
  export TCMALLOC_SKIP_SBRK=1
  export LD_PRELOAD="/opt/lib/libtcmalloc.so"
  python ... SABnzbd.py

I will report the results of my testing later.
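
A quick way to check that the preload actually took effect, from inside the running Python process (a sketch; Linux only):

Code: Select all

# Run inside the Python process started by the script above.
maps = open('/proc/self/maps').read()
if 'libtcmalloc' in maps:
    print "tcmalloc is loaded"
else:
    print "tcmalloc is NOT loaded - check LD_PRELOAD"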
loopdemack
Jr. Member
Posts: 54
Joined: April 26th, 2008, 2:22 pm

Re: High Memory Usage / Potential Memory Leaks Investigation Thread

Post by loopdemack »

Great, I'm eager to hear the results.
elmer91
Newbie
Posts: 12
Joined: December 6th, 2009, 8:28 am

Re: High Memory Usage / Potential Memory Leaks Investigation Thread

Post by elmer91 »

Still under testing, but it seems the reported behavior remains the same!

Test conditions: SAB 0.5.4 running on Linux 2.6.32 (Python 2.5.5 + yenc)
Cache: 128M, QuickCheck on

One 4 GB NZB (91 rar files of 50 MB each) downloaded several times (2 bad articles in this NZB).
Only one job in the queue at a time.

First download (RAM usage VIRT/RES as reported by htop; measurements taken at different download stages):
rar50: 138M/94M
rar65: 141M/98M
rar76: 142M/99M
rar86: 144M/99M
verify/repair/unrar: 144M/101M
completed: 144M/101M

Second download without restarting SAB (same NZB moved from the backup folder to the watched folder):
rar00: 153M/108M
rar30: 157M/114M
rar35: 157M/106M (first resident memory decrease seen)
rar40: 157M/109M
rar50: 157M/108M
rar57: 157M/114M
rar74: 157M/114M

Still downloading, but it seems the behavior is the same:
a few MB are lost on every download.
When downloading huge amounts of data without restarting SAB, memory usage climbs high.

As my new system has 1 GB of memory, I don't have to restart SAB too often.
But on my previous system with only 128 MB of memory (cache set to 0), I had to restart SAB every day.

I will continue downloading the same NZB several more times...
(this 4 GB NZB takes 1 hour to download on my DSL line)
elmer91
Newbie
Posts: 12
Joined: December 6th, 2009, 8:28 am

Re: High Memory Usage / Potential Memory Leaks Investigation Thread

Post by elmer91 »

After a few runs, the memory (VIRT/RES) used by SAB once downloading is complete:

run #1: 144M/101M
run #2: 160M/117M
run #3: 162M/119M
run #4: 181M/124M

There is definitely memory that is not released back to the system!