Multiple Instances SABNZBD - Unpack Separately
Posted: October 4th, 2021, 4:13 pm
This is kind of an odd request, but here's what I'm trying to accomplish:
1) I have a QNAP running Docker with 3 instances of Sab that Sonarr/Radarr send to in a round-robin (setting the priorities to be equal accomplished this, and it works very well)
2) My QNAP has just a baby of a processor: an AMD Ryzen V1500B @ 2.20GHz
3) It bottlenecks whenever it has to unpack anything, and I have an M1 Mac Mini sitting right here ready to help
Is there a way I can set up my M1 Mac Mini to simply unpack the files once it notices a new folder needs to be worked on? Right now each instance is set up to kick off its own post-processing job, but that won't work because I don't want the unpacking to run on that instance. Ideally I'd make a 4th instance of Sab running on my M1 Mac Mini go to work whenever a job completes, following behind and unpacking everything.
I believe Sonarr doesn't care how long it takes, it simply checks every minute to see if new files can be grabbed.
I can write in PowerShell decently, and less so in Bash, Batch, and Python.
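Since Python is on the list, here is a rough sketch of that "follow behind and unpack" watcher. Everything in it is an assumption: the share path, the one-folder-per-job layout, and that `unrar` is on the Mac Mini's PATH (e.g. via Homebrew). It just polls the completed-downloads folder and extracts any top-level .rar it finds, rather than hooking into Sab itself.

```python
# Minimal unpack-watcher sketch. Assumptions: completed jobs land in
# WATCH_DIR as one folder per job, and `unrar` is installed on the Mac Mini.
import subprocess
import time
from pathlib import Path

WATCH_DIR = Path("/Volumes/downloads/complete")  # hypothetical mounted share
POLL_SECONDS = 60

def find_rar_jobs(root: Path) -> list[Path]:
    """Return job folders that still contain a top-level .rar to unpack."""
    return [d for d in root.iterdir()
            if d.is_dir() and any(d.glob("*.rar"))]

def unpack(job: Path) -> None:
    """Extract every top-level .rar in a job folder, in place."""
    for rar in job.glob("*.rar"):
        # -o+ overwrites existing files; drop it if you prefer to skip them
        subprocess.run(["unrar", "x", "-o+", str(rar), str(job)], check=True)

def watch() -> None:
    """Poll forever; you'd run this as e.g. a launchd job on the Mac."""
    while True:
        for job in find_rar_jobs(WATCH_DIR):
            unpack(job)
        time.sleep(POLL_SECONDS)
```

One caveat with any polling approach: you'd want to make sure Sab has fully finished writing a job before touching it (for example, only unpack folders whose modification time is a minute or two old), otherwise the watcher can race a download still in flight.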
The odd thing is that I can't get the CPUs used fully on my QNAP: it stays around 30% usage and doesn't go any higher (up from 18% before). I believe it's because Container Station on the QNAP is at 34% CPU and 84% memory usage, but the QNAP's overall memory usage is only 6/32GB, so I could use more RAM if needed. My docker compose files tell each container it can use 6GB of RAM, but Sab won't actually use the full 6.
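For reference, this is roughly what such a limit looks like in a compose file (service and image names here are placeholders, not my actual setup). The key point is that `mem_limit` and `cpus` are ceilings, not reservations, so Sab will only ever use what the unpack/par2 work actually demands:

```yaml
# Hypothetical compose fragment showing resource ceilings
services:
  sabnzbd1:
    image: linuxserver/sabnzbd
    mem_limit: 6g   # cap, not a guarantee of usage
    cpus: "4.0"     # at most 4 cores' worth of CPU time
```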