Search found 42 matches
- March 29th, 2011, 10:50 am
- Forum: Feature Requests
- Topic: On Config/API change event, update URLs of jobs to be tried again
- Replies: 2
- Views: 1820
Re: On Config/API change event, update URLs of jobs to be tried again
>> "The next 0.6.0 Beta will not send NZB to history because of indexer limits." That would make everything easier, thank you. >> thin. Maybe. But imagine that, eventually, the need for an API change happens. Errors from the providers are actually making it more frequent. There's a great chance the users wil...
- March 29th, 2011, 6:48 am
- Forum: Feature Requests
- Topic: On Config/API change event, update URLs of jobs to be tried again
- Replies: 2
- Views: 1820
On Config/API change event, update URLs of jobs to be tried again
Feature: When changing the provider API key in the config, parse/update the URLs of failed jobs in the history. (Probably in the queue too?) Why? - You have failed jobs in your history that you waited a couple of hours / a day to retry (daily limit, etc). You know they are valid anyway. - Meanwhile you were as...
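A minimal sketch of the URL rewrite this request describes, assuming the key travels in an `apikey` query parameter (the parameter name and the URLs are illustrative only, not SABnzbd's actual internals):

```python
# Hypothetical sketch: rewrite the API-key query parameter of a stored
# fetch URL after the user changes the key in the config.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def update_api_key(url, new_key, param="apikey"):
    """Return url with its API-key query parameter replaced by new_key."""
    parts = urlsplit(url)
    query = dict(parse_qsl(parts.query))
    if param in query:
        query[param] = new_key
    return urlunsplit(parts._replace(query=urlencode(query)))

old = "https://indexer.example/api?t=get&id=123&apikey=OLDKEY"
print(update_api_key(old, "NEWKEY"))
# https://indexer.example/api?t=get&id=123&apikey=NEWKEY
```

Applying this to every failed entry in the history (and optionally the queue) before a retry is the whole feature.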
- March 28th, 2011, 9:04 pm
- Forum: General Help
- Topic: Need help understanding origin of "*partXY.rar.1" files.
- Replies: 5
- Views: 3541
Re: Need help understanding origin of "*partXY.rar.1" files.
No, they look ok so far (could not check them all; I looked at like 50 of them and didn't find anything unusual). Is this situation possible: - sabnzbd had the "Detect Duplicate" option set to ignore. - Jobs got orphaned in the download folder after failed / poorly done upgrades, cleared queu...
- March 28th, 2011, 2:19 pm
- Forum: General Help
- Topic: Need help understanding origin of "*partXY.rar.1" files.
- Replies: 5
- Views: 3541
Re: Need help understanding origin of "*partXY.rar.1" files.
Hi Shypike, thank you so much for helping. I'm lost here. I found dozens of directories with this situation: like 50 .rar files, plus files with the same exact name but named .rar.1... I hadn't considered par2 as the source of the .1 (so as to avoid overwriting the original file). I will try to test ...
- March 28th, 2011, 9:56 am
- Forum: General Help
- Topic: Need help understanding origin of "*partXY.rar.1" files.
- Replies: 5
- Views: 3541
Need help understanding origin of "*partXY.rar.1" files.
I have a lot of directories on my system that I can't tell whether they were processed or not. They accumulate to more than 200 every week or so. Checking each of them manually takes a ridiculous amount of time. I wrote a script to go into each directory, find par, rar, sfv, srr, srs, etc. files and try to do somethi...
- March 27th, 2011, 12:05 pm
- Forum: General Help
- Topic: Confused about the definition of Orphaned Jobs
- Replies: 2
- Views: 6767
Re: Confused about the definition of Orphaned Jobs
Got it, thank you!
- March 27th, 2011, 8:36 am
- Forum: General Help
- Topic: Confused about the definition of Orphaned Jobs
- Replies: 2
- Views: 6767
Confused about the definition of Orphaned Jobs
What characteristics define a job as "orphaned" so that it goes to the job list at "Status/Queue repair"? I have more of these than I can count and check one-by-one. I mean, did they fail to download, were they duplicates of other stuff higher in the queue so not downloaded, downlo...
- March 24th, 2011, 3:27 pm
- Forum: Beta Releases
- Topic: Weird issue: Post-processing accumulates many simultaneous jobs. Kills cpu/ram.
- Replies: 11
- Views: 13106
Re: Weird issue: Post-processing accumulates many simultaneous jobs. Kills cpu/ram.
Ah ok, I was thinking large (server, power desktop) and not the other way around (NAS)... Now I see, you are absolutely right. Running sabnzbd on a Nintendo DS, a coffee machine or something must create such results. I was about to ask you if you own a Usenet Host LOL :) You know what, I think unrar...
- March 24th, 2011, 2:59 pm
- Forum: Beta Releases
- Topic: Weird issue: Post-processing accumulates many simultaneous jobs. Kills cpu/ram.
- Replies: 11
- Views: 13106
Re: Weird issue: Post-processing accumulates many simultaneous jobs. Kills cpu/ram.
I've never seen anything like this with an average of 5TB/day downloaded and processed. I can't imagine what kind of operation you're running there :) Out of curiosity, what size of download can take 12 hours on unrar + par2 r processes? Honestly, I am pretty sure that if I establish something like Max...
- March 24th, 2011, 2:46 pm
- Forum: Beta Releases
- Topic: Unusable nzb/Failed URL is actually misleading for a "Wait x" common API msg
- Replies: 3
- Views: 3914
Re: Unusable nzb/Failed URL is actually misleading for a "Wait x" common API msg
Yeah, nzbmatrix and nzb.su too. We know it's not a 404, 500, etc. error because there's content with "API" and "Error" in the string. But that's it. Both providers are absolute defaults for Sick Beard users. All we can know is the date/time when the download attempt failed. I think it could be achieved ...
- March 24th, 2011, 10:31 am
- Forum: Beta Releases
- Topic: Weird issue: Post-processing accumulates many simultaneous jobs. Kills cpu/ram.
- Replies: 11
- Views: 13106
Re: Weird issue: Post-processing accumulates many simultaneous jobs. Kills cpu/ram.
It's a good idea. I've also been thinking about an option to set "max par2 processing time" / "max unrar processing time" settings in the config. We know our systems. We know what a reasonable time is and what is not. And, obviously, the average lumberjack joe would have the opt...
- March 24th, 2011, 10:25 am
- Forum: Beta Releases
- Topic: Unusable nzb/Failed URL is actually misleading for a "Wait x" common API msg
- Replies: 3
- Views: 3914
Unusable nzb/Failed URL is actually misleading for a "Wait x" common API msg
sabnzbd+ fills the history with a hundred "URL Fetching failed: Unusable NZB file" entries. Visiting the URL you see it is actually one of your search providers telling you your "API daily/hourly limit is reached, try again later". - Deleting these items from history would confuse the nzb s...
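One way the distinction could be drawn is to inspect the fetched body before declaring the NZB unusable. A rough, hypothetical sketch (the patterns are guesses at typical indexer wording, not SABnzbd's actual logic):

```python
# Hypothetical sketch: classify a fetched "nzb" response body as a real
# NZB document, a provider rate-limit message, or some other error.
import re

def classify_fetch(body):
    """Return 'nzb', 'api_limit', or 'error' for a fetched response body."""
    text = body.strip()
    if text.startswith("<?xml") and "<nzb" in text:
        return "nzb"
    if re.search(r"\b(limit|wait|try again)\b", text, re.IGNORECASE):
        return "api_limit"
    return "error"
```

An `api_limit` result could then trigger the "retry later" behavior the post asks for, instead of a permanently failed history entry.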
- March 14th, 2011, 2:05 am
- Forum: Beta Releases
- Topic: Weird issue: Post-processing accumulates many simultaneous jobs. Kills cpu/ram.
- Replies: 11
- Views: 13106
Re: Weird issue: Post-processing accumulates many simultaneous jobs. Kills cpu/ram.
There is a setting in sabnzbd+ to stop downloading while files are being post-processed... maybe that is an option? Yes, it has always been unset, so sabnzbd+ could download during post-processing if it wanted to. Checking it seems unproductive (idle bandwidth, cpu, etc). All the server is supposed t...
- March 14th, 2011, 1:40 am
- Forum: Beta Releases
- Topic: Weird issue: Post-processing accumulates many simultaneous jobs. Kills cpu/ram.
- Replies: 11
- Views: 13106
Re: Weird issue: Post-processing accumulates many simultaneous jobs. Kills cpu/ram.
Ok, so I Googled a little and found out that Sabnzbd actually does try to fetch more blocks when a repair fails. This is news to me. I don't know why it never happened/worked here. I have another post on these same forums asking if there wasn't a way to automatically download more par2 files when needed... I'...
- March 14th, 2011, 1:12 am
- Forum: Beta Releases
- Topic: Weird issue: Post-processing accumulates many simultaneous jobs. Kills cpu/ram.
- Replies: 11
- Views: 13106
Weird issue: Post-processing accumulates many simultaneous jobs. Kills cpu/ram.
Hi, I'm not sure if what I am about to describe here is actually a problem, new normal behavior in sabnzbd for the latest Beta, or if it has always been like this, so excuse me. I have never seen this happen. 1) I only noticed because I received e-mail from the system to root saying CPU/RAM was dying. Wen...