Version 2.3.9 [03c10dc]
Linux Mint 19.3 64-bit Cinnamon
I got the idea from here: use an API call to clear the history of a specific type of download.
I am running:
curl "http://localhost:8085/sabnzbd/api?mode= ... pikey=<api key>
This appears to delete SOME of the time. It is run after every download from yyy, via a post-processing shell script.
Sometimes it leaves the queue untouched, but if I run the same command from the shell, it works.
The script is running according to the log:
2020-04-04 09:01:00,937::INFO::[newsunpack:169] Running external script /home/<USERNAME>/nzb/post/post-tv.sh <...>
2020-04-04 09:01:01,378::DEBUG::[interface:481] API-call from 127.0.0.1 [curl/7.58.0] {'category': u'yyy', 'apikey': u'<HASH>a12f5e3', 'mode': u'history', 'value': u'all', 'name': u'delete'}
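For completeness, the full call is presumably something like the one below; it is reconstructed from the parameters shown in the API-call log line above, with the API key left as a placeholder.

curl "http://localhost:8085/sabnzbd/api?mode=history&name=delete&value=all&category=yyy&apikey=<api key>"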
api call not deleting all
Re: api call not deleting all
It seems to happen when a SINGLE NZB is in the finished queue. The call deletes it when run from the CLI, but not from the script.
Re: api call not deleting all
Sorry for the slow response!
The history-delete only deletes finished jobs. When a job is still running a post-processing script, it's not really finished yet so it's not deleted.
You can also run this script as a pre-queue script so it deletes them as soon as a new one comes in.
But: if your goal is just to have an empty history, you can just set History Retention in Config > Switches to not keep any jobs.
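If you do go the pre-queue route, a minimal sketch could look like the one below. It reuses the same curl call you already have (API key and category are placeholders) and assumes the pre-queue convention that the first line the script prints is the accept flag.

#!/bin/sh
# Hypothetical pre-queue sketch: clear finished "yyy" jobs from the history
# before the new job is accepted. <api key> and the category are placeholders.
curl -s "http://localhost:8085/sabnzbd/api?mode=history&name=delete&value=all&category=yyy&apikey=<api key>" > /dev/null

# Assumed convention: print "1" as the first output line to accept the incoming job.
echo "1"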
Re: api call not deleting all
I didn't get a notification that it had been answered, so I am late as well.
Hmm, so could I run it after a delay of, say, 10 seconds? Something like:
sleep 10s && <api call> &
rest of script
or does it need more?
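Spelled out a bit more, what I have in mind is roughly this (same placeholder curl call as before):

#!/bin/sh
# Rough idea: fire the delete call in the background after a short delay,
# then carry on with the rest of the post-processing work.
( sleep 10 && curl -s "http://localhost:8085/sabnzbd/api?mode=history&name=delete&value=all&category=yyy&apikey=<api key>" ) &

# rest of script goes here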
Re: api call not deleting all
No, because while the script is running the item still won't be considered Finished and won't be removed.
There's really no way to do this with the post-processing script.
Maybe you can give a bit more idea about your use case? You only want to remove a specific set of downloads from your history, but keep the rest? Are these jobs added automatically by an external program or manually?
Re: api call not deleting all
They are added as RSS entries. I only want to remove completed ones from this RSS feed (e.g. abc), and in the post-processing script I was using curl with the abc category selected. Most of the downloads are handled automatically via Radarr/Sonarr and they "clean up" after themselves. This RSS feed was generating a lot of small downloads and the results were getting cluttered.
I think I have figured it out.
I created a small script called sl.sh that runs the curl command, and added a crontab entry that runs it every 10 minutes (rough sketch below).
Overkill maybe, but at least it does what I want and hardly uses any processing power.
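In case it helps anyone else, the setup is roughly this (API key, category, and script path are placeholders):

#!/bin/sh
# sl.sh - clear finished "yyy" jobs from the SABnzbd history
curl -s "http://localhost:8085/sabnzbd/api?mode=history&name=delete&value=all&category=yyy&apikey=<api key>" > /dev/null

and the crontab entry (added via crontab -e):

*/10 * * * * /home/<USERNAME>/sl.sh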
Re: api call not deleting all
That's indeed also a way to do it!