Advanced Duplicate Processing
I like the improvement made to detect duplicate episodes in series, but I have some ideas on how to take the duplicate detection further.
1. If "Pause on duplicates" is selected, resume the duplicate if the primary fails. With more and more failures due to missing articles these days, being a little smarter here would be huge.
2. If all duplicates fail for same NZB name (this is more complex), would it be possible to use articles from multiple dupe nzbs to reconstruct the destination file?
The first one seems pretty straightforward, but the second would be complex and powerful. There are more and more reposts these days due to articles being taken down by Usenet providers. It would be awesome to be able to reconstruct the file from multiple reposts of the same NZB.
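Just to make idea #2 a bit more concrete, here's a very rough Python sketch of what merging the segment lists of two NZBs for the same post might look like. It's purely illustrative: the layout follows the standard NZB XML format, and matching files by subject is an assumption that often won't hold across reposts.
[code]
import xml.etree.ElementTree as ET

# Standard NZB namespace; some NZBs omit it, in which case the lookups below
# would need adjusting.
NS = {"nzb": "http://www.newzbin.com/DTD/2003/nzb"}

def segments_by_file(nzb_path):
    """Return {file subject: {segment number: article id}} for one NZB."""
    result = {}
    root = ET.parse(nzb_path).getroot()
    for f in root.findall("nzb:file", NS):
        segs = {}
        for s in f.findall(".//nzb:segment", NS):
            segs[int(s.get("number"))] = s.text
        result[f.get("subject", "")] = segs
    return result

def merge_nzbs(primary, repost):
    """Fill gaps in the primary NZB's segment map with article ids from a repost."""
    merged = segments_by_file(primary)
    for subject, segs in segments_by_file(repost).items():
        target = merged.setdefault(subject, {})
        for number, article_id in segs.items():
            # Only borrow an article id for segments the primary is missing.
            target.setdefault(number, article_id)
    return merged
[/code]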
Thank you!
Re: Advanced Duplicate Processing
I do the second one manually sometimes. If it's broken across multiple uploads/cross-posts, then I'll copy them into a temp directory, keeping the largest RAR from each set. Oftentimes, by doing that you can get enough to repair.
Re: Advanced Duplicate Processing
zymurgist wrote: 2. If all duplicates fail for same NZB name (this is more complex), would it be possible to use articles from multiple dupe nzbs to reconstruct the destination file?
#2 is not worth the effort.
If the NZBs refer to different uploads, it's unlikely that the combined parts will verify.
Maybe it works sometimes, but there's no certainty at all.
You could end up wasting even more bandwidth.
#1 is interesting, but I think you'll be better off with a front-end like SickBeard, which does this already.
Re: Advanced Duplicate Processing
ALbino wrote: I do the second one manually sometimes. If it's broken across multiple uploads/cross-posts, then I'll copy them into a temp directory, keeping the largest RAR from each set. Oftentimes, by doing that you can get enough to repair.
I was hoping it was doable in some way. Manual is fine since it's not something that happens often. Care to provide a step-by-step how-to? Thanks!
Re: Advanced Duplicate Processing
shypike wrote: #1 is interesting, but I think you'll be better off with a front-end like SickBeard, which does this already.
Thanks. I'll have to take a look at SickBeard. I'm not familiar with it.
Re: Advanced Duplicate Processing
zymurgist wrote: Care to provide a step-by-step how-to?
Well, there's not much of a step-by-step, but essentially you just 1) download them both individually, 2) put one of them in a temp folder, then 3) copy the second one into the temp folder as well, and either it will copy successfully because the file was missing altogether, or it will prompt you to replace the existing file. If the one you're copying is larger, say yes, and if it's smaller, say no. Eventually you end up with the best possible RARs and PARs from each separate posting. This "trick" has really only been useful a handful of times, but it does work on occasion if the uploader is using the same RAR and PAR sets.
A similarly good way to go about it is to assume a bad NZB from the indexer and just download the specific broken RARs/PARs from Binsearch/NZBClub or whatever.
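For anyone who'd rather script that copy/keep-the-largest step instead of clicking through the replace prompts, here's a minimal Python sketch. The folder names are just placeholders; after merging you'd run par2 repair and unrar on the output directory as usual.
[code]
import shutil
from pathlib import Path

def merge_sets(dir_a, dir_b, out_dir):
    """Combine two incomplete RAR/PAR sets, keeping the larger copy of each file."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for src_dir in (Path(dir_a), Path(dir_b)):
        for src in src_dir.iterdir():
            if not src.is_file():
                continue
            dest = out / src.name
            # Copy when the file is missing, or replace only if this copy is larger.
            if not dest.exists() or src.stat().st_size > dest.stat().st_size:
                shutil.copy2(src, dest)

# Example (placeholder paths): merge two downloads of the same posting.
merge_sets("post_a", "post_b", "merged")
[/code]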