Help, I am downloading too fast!

Get help with all aspects of SABnzbd
Forum rules
Help us help you:
  • Are you using the latest stable version of SABnzbd? See the Downloads page.
  • Tell us what system you run SABnzbd on.
  • Adhere to the forum rules.
  • Do you experience problems during downloading?
    Check your connection in the Status and Interface settings window.
    Use Test Server in Config > Servers.
    We will probably ask you to do a test using only basic settings.
  • Do you experience problems during repair or unpacking?
    Enable +Debug logging in the Status and Interface settings window and share the relevant parts of the log here using [ code ] sections.
knatsch
Newbie
Posts: 17
Joined: April 15th, 2009, 7:23 am

Help, I am downloading too fast!

Post by knatsch »

Sounds weird I know...

I am on a better connection than in the past; I can now easily grab 20 MB/s from Usenet.
I have several RSS feeds set up in SABnzbd that get polled every 15 minutes.
The problem now is that sometimes the files apparently have not propagated to my news server yet, which causes SABnzbd to fail the NZB.
If I retry manually 5-10 minutes later, the NZB downloads without any errors.
I could increase the polling interval, but that would not resolve the issue, as the upload could be just a few seconds old at the moment the RSS feed is queried.
Is there any way of forcing SABnzbd to retry these a few minutes later automatically?
Anyone else with that problem in here?
shypike
Administrator
Posts: 19774
Joined: January 18th, 2008, 12:49 pm

Re: Help, I am downloading too fast!

Post by shypike »

There's no way to handle this right now.
knatsch
Newbie
Posts: 17
Joined: April 15th, 2009, 7:23 am

Re: Help, I am downloading too fast!

Post by knatsch »

Thanks for your quick reply! Too bad there is no way to handle this properly at the moment.

Is anyone else having this kind of problem and living with a workaround? I am stuck at the moment, so any idea would be welcome as well :)
knatsch
Newbie
Posts: 17
Joined: April 15th, 2009, 7:23 am

Re: Help, I am downloading too fast!

Post by knatsch »

Just had an idea:

Is it possible to import from RSS as paused and unpause each NZB after a period automatically, e.g. 10 minutes?
AutomaticCoding
Newbie
Posts: 27
Joined: January 2nd, 2013, 5:20 am

Re: Help, I am downloading too fast!

Post by AutomaticCoding »

knatsch wrote:Just had an idea:

Is it possible to import from RSS as paused and unpause each NZB after a period automatically, e.g. 10 minutes?
Your best bet is probably to throw together a small piece of software that:
A. On launch, connects to the RSS feed and downloads it
B. Serves up the feed it downloaded in A
C. {MARK/LOOP/FUNCTION/WHATEVER YOU WANT TO CALL IT}
D. Sleeps X minutes (the polling time you currently have in SABnzbd)
E. Downloads the RSS feed again and keeps it in memory, but continues to serve up the old feed
F. Sleeps X minutes more, i.e. however long it takes for new posts to reach your Usenet server
G. Updates the feed it is serving with the feed held in memory
H. Goes back to the mark/continue;/call function();/whatever else

I'm pretty sure that if you're running SABnzbd on some Linux distro you could throw this together in a bash script; hell, you could probably do it all in one line using netcat, xargs and sleep (maybe curl too).
AutomaticCoding
Newbie
Posts: 27
Joined: January 2nd, 2013, 5:20 am

Re: Help, I am downloading too fast!

Post by AutomaticCoding »

Threw this together for you:

Code: Select all

#!/bin/bash
# Config: replace with your indexer's RSS feed URL
url="http://tycho.usno.navy.mil/cgi-bin/timer.pl"

# Fetch the feed (with -i so the HTTP headers are kept and can be replayed
# verbatim) and start the little netcat server from the second script below.
rssFeed=$(curl -si "$url")
/tmp/two.sh "$rssFeed" &
httpServerPID=$!
echo "Server up with pid $httpServerPID"

while true; do
	sleep 10m
	# Refresh the feed and restart the server with the new copy.
	rssFeed=$(curl -si "$url")
	kill $httpServerPID
	/tmp/two.sh "$rssFeed" &
	httpServerPID=$!
	echo "Server up with pid $httpServerPID"
done

Code: Select all

#!/bin/bash
# /tmp/two.sh: answer every connection on port 7601 with the feed passed in as $1.
while true; do
	echo -e "$1" | nc -l 7601
done
Ta-da. Oh, and if you close the application it won't stop listening on the port, so have fun.

EDIT: I just want to note that it's not perfect. Oh, and it may or may not require an extra \r\n at the end of the rssFeed variable, now that I think about it.
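
To actually use it, you would point the RSS feed URL configured in SABnzbd at http://localhost:7601/ (the port hard-coded in the second script) instead of at the indexer, so SABnzbd only ever sees the copy of the feed served by this little proxy.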
jcfp
Release Testers
Posts: 1004
Joined: February 7th, 2008, 12:45 pm

Re: Help, I am downloading too fast!

Post by jcfp »

knatsch wrote:Is it possible to import from RSS as paused and unpause each NZB after a period automatically, e.g. 10 minutes?
The first part is trivial, through RSS or category setup. The second part is a lot more work; it probably involves running a script from cron that checks and unpauses queue jobs via SABnzbd's API. There have been scripts posted to the forums in the past that may make API use from the command line a bit easier; something along the lines of the sketch below. Possible, but not very easy.
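
As a rough illustration, a cron-driven sketch could look something like the following. The host, port, API key and the use of jq are assumptions for the example, and the queue/resume API calls should be double-checked against your SABnzbd version.

Code: Select all

#!/bin/sh
# Hypothetical cron job: resume every paused job in the SABnzbd queue.
# host/port/apikey are placeholders; requires curl and jq.

sab="http://localhost:8080/sabnzbd/api"
apikey="YOUR_API_KEY"

# Fetch the queue as JSON and pick out the ids of paused jobs.
paused=$(curl -s "$sab?mode=queue&output=json&apikey=$apikey" \
	| jq -r '.queue.slots[] | select(.status == "Paused") | .nzo_id')

# Resume each paused job individually via the API.
for nzo_id in $paused; do
	curl -s "$sab?mode=queue&name=resume&value=$nzo_id&apikey=$apikey" > /dev/null
done

Run it from cron every few minutes (e.g. */10 * * * *). As written it resumes everything that is paused, including jobs you paused by hand, so in practice you would want to limit it to a specific category or keep track of when each job was added.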

Another approach might be the (ab)use of a pre-queue script. Assign stuff from the RSS feed to a certain category but don't bother with pausing; instead, have the pre-queue script delay itself for that category. Not sure if this misbehaves when the delay is very long, or if you were to shut down the program while a new job is being delayed, but it is pretty simple to set up.

Code: Select all

#!/bin/sh

# Delay jobs in the target category so the articles have time
# to reach the news server before SABnzbd starts downloading.
target="category" # the category your RSS feed assigns
delay="300"       # delay in seconds

# SABnzbd passes the job's category as the third argument.
[ "$3" = "$target" ] && sleep $delay

echo "1" # accept all jobs