Sounds weird, I know...
I am on a better connection than in the past and can now easily grab at 20 MB/s from Usenet.
I have several RSS feeds set up in sabnzbd that get pulled every 15 minutes.
The problem now is that apparently sometimes files have not yet propagated to my news server, which causes SABnzbd to fail the NZB.
If I manually try 5-10 minutes later, the NZB gets downloaded without any errors.
I could increase the polling interval, but this would not resolve the issue as the upload could be just a few seconds old at the time when the RSS gets queried.
Is there any way of forcing sabnzbd to retry these a few minutes later automatically?
Anyone else with that problem in here?
Help, I am downloading too fast!
Re: Help, I am downloading too fast!
There's no way to handle this right now.
Re: Help, I am downloading too fast!
Thanks for your quick reply! Too bad there is no way to handle this properly at the moment.
Is anyone else having this kind of problem and living with a workaround? I am stuck at the moment, so any idea would be fine as well.
Re: Help, I am downloading too fast!
Just had an idea:
Is it possible to import from RSS as paused and unpause each NZB after a period automatically, e.g. 10 minutes?
Newbie
Posts: 27
Joined: January 2nd, 2013, 5:20 am
Re: Help, I am downloading too fast!
knatsch wrote:Is it possible to import from RSS as paused and unpause each NZB after a period automatically, e.g. 10 minutes?

Your probable best bet is to throw together software that:
A. On launch, connects to RSS feed & downloads it
B. Issues RSS feed from A
C. {MARK/LOOP/FUNCTION/WHATEVERYOUWANTTOCALLIT}
D. Sleep X minutes (The polling time you currently have in sabnzbd)
E. Download the RSS feed, keep in memory but continue to serve up the old RSS feed
F. Sleep Y minutes, where Y is however long it usually takes a fresh post to propagate to your usenet server.
G. Update the feed you're serving with the feed you have in memory
H. Go to the mark/continue;/call function();/whatever else
I'm pretty sure that if you're running sabnzbd on some distro of linux you could throw this together in a bash script; hell, you could probably do it all in one line using netcat, xargs and sleep (maybe curl too).
Re: Help, I am downloading too fast!
Threw this together for you:-
Tuh-duh. Oh, and if you close the application it won't stop listening on the port, so have fun.
Code:
#!/bin/sh
# Config
url="http://tycho.usno.navy.mil/cgi-bin/timer.pl"

# Grab the feed and start serving it
rssFeed=$(curl -si "$url")
/tmp/two.sh "$rssFeed" &
httpServerPID=$!
echo "Server up with pid $httpServerPID"

# Every 10 minutes: fetch a fresh copy, then restart the server with it
while true; do
    sleep 10m
    rssFeed=$(curl -si "$url")
    kill $httpServerPID
    /tmp/two.sh "$rssFeed" &
    httpServerPID=$!
    echo "Server up with pid $httpServerPID"
done
Code:
#!/bin/sh
# /tmp/two.sh: serve the feed passed as $1 to whoever connects on port 7601
# (some netcat builds want "nc -lp 7601" instead of "nc -l 7601")
while true; do
    echo -e "$1" | nc -l 7601
done
EDIT:- I just want to note, it's not perfect. Oh, and it may or may not require an extra \r\n at the end of the rssFeed variable, now that I think about it.
Re: Help, I am downloading too fast!
knatsch wrote:Is it possible to import from RSS as paused and unpause each NZB after a period automatically, e.g. 10 minutes?

The first part is trivial, through rss or category setup. The second part is a lot more work; it probably involves running a script from cron that checks and unpauses queue jobs via sab's api. There have been scripts posted to the forums in the past that may make api use from the command line a bit easier. Possible, but not very easy.
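For anyone wanting to try the cron route, here's a minimal sketch of what the unpause half might look like. The host, port, API key and the canned queue response are all my own placeholders, not values from this thread, and the grep/sed parsing only works on the one-slot-per-line sample below; against a real response you'd want a proper JSON parser.

```shell
#!/bin/sh
# Sketch: find paused jobs in SABnzbd's queue and resume each one
# through the API. Run from cron every few minutes.
APIKEY="yourapikey"                        # placeholder
SAB="http://localhost:8080/sabnzbd/api"    # placeholder

# Live call would be:
#   queue=$(curl -s "$SAB?mode=queue&output=json&apikey=$APIKEY")
# Canned response in the same general shape, so the sketch runs standalone:
queue='{"queue":{"slots":[
{"nzo_id":"SABnzbd_nzo_1","status":"Paused"},
{"nzo_id":"SABnzbd_nzo_2","status":"Downloading"}]}}'

printf '%s\n' "$queue" \
  | grep '"status":"Paused"' \
  | sed 's/.*"nzo_id":"\([^"]*\)".*/\1/' \
  | while read -r id; do
      # Per-job resume; uncomment for the real thing:
      # curl -s "$SAB?mode=queue&name=resume&value=$id&apikey=$APIKEY" >/dev/null
      echo "would resume $id"
    done
```

Jobs that the RSS feed imported as paused would then get picked up and resumed on the next cron pass, which gives you roughly the "unpause after N minutes" behaviour asked about above.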
Another approach might be the (ab)use of a pre-queue script. Assign stuff from the rss feed to a certain category but don't bother with pausing; instead have the pre-queue script delay itself for that category. Not sure if this misbehaves when the delay is very long, or if you shut down the program while a new job is being delayed, but it is pretty simple to set up.
Code:
#!/bin/sh
# Pre-queue script: sleep before accepting jobs in the target category
target="category"
delay="300" # seconds
[ "$3" = "$target" ] && sleep "$delay"
echo "1" # accept all jobs