Help, I am downloading too fast!
Posted: April 7th, 2013, 5:44 am
by knatsch
Sounds weird I know...
I am on a better connection than in the past, and I can now easily pull 20 MB/s from Usenet.
I have several RSS feeds set up in sabnzbd that get pulled every 15 minutes.
The problem now is that sometimes files have apparently not propagated to my news server yet, which causes sabnzbd to fail the NZB.
If I manually try 5-10 minutes later, the NZB gets downloaded without any errors.
I could increase the polling interval, but this would not resolve the issue as the upload could be just a few seconds old at the time when the RSS gets queried.
Is there any way of forcing sabnzbd to retry these a few minutes later automatically?
Anyone else with that problem in here?
Re: Help, I am downloading too fast!
Posted: April 7th, 2013, 6:24 am
by shypike
There's no way to handle this right now.
Re: Help, I am downloading too fast!
Posted: April 7th, 2013, 7:49 am
by knatsch
Thanks for your quick reply! Too bad there is no way to handle this properly at the moment.
Is anyone else having this kind of problem and living with a workaround? I am stuck at the moment, so any idea would be welcome as well.
Re: Help, I am downloading too fast!
Posted: April 20th, 2013, 3:24 am
by knatsch
Just had an idea:
Is it possible to import from RSS as paused and unpause each NZB after a period automatically, e.g. 10 minutes?
Re: Help, I am downloading too fast!
Posted: April 20th, 2013, 3:49 am
by AutomaticCoding
knatsch wrote:Just had an idea:
Is it possible to import from RSS as paused and unpause each NZB after a period automatically, e.g. 10 minutes?
Your best bet is probably to throw together a small program that:-
A. On launch, connects to the RSS feed & downloads it
B. Serves up the RSS feed it fetched in A
C. {MARK/LOOP/FUNCTION/WHATEVERYOUWANTTOCALLIT}
D. Sleeps X minutes (the polling time you currently have in sabnzbd)
E. Downloads the RSS feed again and keeps it in memory, but continues to serve up the old RSS feed
F. Sleeps however long it takes for the upload to propagate to your usenet server
G. Updates the feed it's serving with the feed it has in memory
H. Goes back to the mark/continue;/call function();/whatever else
I'm pretty sure that if you're running sabnzbd on some distro of Linux you could throw this together in a bash script; hell, you could probably do it all in one line using netcat, xargs and sleep (maybe curl too).
Re: Help, I am downloading too fast!
Posted: April 20th, 2013, 5:03 am
by AutomaticCoding
Threw this together for you:-
Code:
#Config
url="http://tycho.usno.navy.mil/cgi-bin/timer.pl"

# Fetch the feed (-i includes the HTTP headers, so nc can serve it
# back as a complete HTTP response) and start the server script below
rssFeed=$(curl -si "$url")
/tmp/two.sh "$rssFeed" &
httpServerPID=$!
echo "Server up with pid $httpServerPID"

while true; do
    sleep 10m
    # Fetch the new feed, then restart the server with it
    rssFeed=$(curl -si "$url")
    kill "$httpServerPID"
    /tmp/two.sh "$rssFeed" &
    httpServerPID=$!
    echo "Server up with pid $httpServerPID"
done
Code:
# two.sh: serve whatever was passed in as $1 on port 7601, forever
while true; do
    echo -e "$1" | nc -l 7601
done
Tuh-duh. Oh, and if you close the application it won't stop listening on the port, so have fun.
EDIT:- I just want to note, it's not perfect. Oh, and it may or may not require an extra \r\n at the end of the rssFeed variable, now that I think about it.
Re: Help, I am downloading too fast!
Posted: April 20th, 2013, 7:02 am
by jcfp
knatsch wrote:Is it possible to import from RSS as paused and unpause each NZB after a period automatically, e.g. 10 minutes?
The first part is trivial, through rss or category setup. The second part is a lot more work; it probably involves running a script from cron that checks and unpauses queue jobs via sab's api. There have been scripts posted to the forums in the past that may make api use from the command line a bit easier. Possible, but not very easy.
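As a rough sketch of that cron approach (hedged: the host, port and API key below are placeholders, the JSON is parsed with a crude grep instead of a real parser, and it naively resumes every job it finds rather than only paused jobs older than some age):

```shell
#!/bin/sh
# Sketch only: resume queued jobs through SABnzbd's API, meant to run
# from cron every few minutes. HOST and APIKEY are placeholders.
HOST="http://localhost:8080"
APIKEY="changeme"

# Build the API call that resumes a single job by its nzo id
resume_url() {
    printf '%s/api?mode=queue&name=resume&value=%s&apikey=%s' "$HOST" "$1" "$APIKEY"
}

# Pull nzo ids out of the JSON queue listing (crude grep parsing)
# and hit the resume endpoint for each one.
curl -s "$HOST/api?mode=queue&output=json&apikey=$APIKEY" \
    | grep -o '"nzo_id" *: *"[^"]*"' \
    | grep -o 'SABnzbd_nzo_[^"]*' \
    | while read -r nzo; do
        curl -s "$(resume_url "$nzo")" > /dev/null
    done
```

Filtering on the job's paused status (or its age) would need a proper JSON tool such as jq, which is why this is only a starting point.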
Another approach might be the (ab)use of a pre-queue script. Assign stuff from the rss feed to a certain category, but don't bother with pausing; instead have the pre-queue script delay itself for that category. Not sure if this misbehaves when the delay is very long, or if you were to shut down the program while a new job is getting delayed, but it is pretty simple to set up.
Code:
#!/bin/sh
# Pre-queue script: delay jobs in one category so the articles
# have time to propagate to the news server.
target="category"
delay="300" # seconds
# SABnzbd passes the job's category as the third argument
[ "$3" = "$target" ] && sleep "$delay"
echo "1" # accept all jobs
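If you want to see that behaviour before wiring it into sabnzbd, here is a self-contained way to exercise the script from a shell (the path, job name and the shortened 2-second delay are just for the demo; sabnzbd passes the category as the third argument):

```shell
#!/bin/sh
# Demo: recreate the pre-queue script above (with a 2-second delay
# instead of 300 so it's quick to try) and call it the way SABnzbd would.
cat > /tmp/prequeue.sh <<'EOF'
#!/bin/sh
target="category"
delay="2" # seconds (300 in the real script)
[ "$3" = "$target" ] && sleep "$delay"
echo "1" # accept all jobs
EOF
chmod +x /tmp/prequeue.sh

/tmp/prequeue.sh "Some.Job.nzb" "" "tv"        # other category: prints "1" immediately
/tmp/prequeue.sh "Some.Job.nzb" "" "category"  # target category: sleeps 2s, then prints "1"
```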