RAR > PAR > RAR first

Want something added? Ask for it here.
Phasma
Newbie
Posts: 5
Joined: April 26th, 2008, 4:39 pm

Re: RAR > PAR > RAR first

Post by Phasma »

I'd like to add an idea that might be an easier in-between step to implement before on-the-fly par2 verification. I'm not sure if it will work, but here we go. NZBs contain the exact size a file is supposed to be, right? What if SAB were to check the size of the downloaded files against the NZB? If the sizes of the RARs match, it could start extracting without PAR verification. It's not a guarantee that the files contain no errors, but checking the sizes provides *some* indication and takes just a few seconds at most.

If the sizes don't match, SAB would first try PAR verification, and the same if extraction fails.

An option to turn this behaviour on and off would be nice of course. :)
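The size-check step could be sketched roughly like this (a hypothetical helper, not SABnzbd code; note that the byte counts in an NZB describe the encoded segments, so a real implementation would likely need a tolerance rather than an exact match):

```python
import os

def sizes_match(expected_sizes, download_dir, tolerance=0.0):
    """Compare each file's on-disk size against the size expected from the NZB.

    expected_sizes: dict mapping filename -> expected byte count.
    Returns True only if every file exists and matches within `tolerance`
    (a fraction, e.g. 0.05 for 5%), in which case extraction could be
    attempted straight away, skipping the full par2 verify pass.
    """
    for name, expected in expected_sizes.items():
        path = os.path.join(download_dir, name)
        if not os.path.isfile(path):
            return False
        actual = os.path.getsize(path)
        if abs(actual - expected) > expected * tolerance:
            return False
    return True
```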
auskento
Moderator
Posts: 77
Joined: January 21st, 2008, 8:45 pm
Location: Melbourne, AUS

Re: RAR > PAR > RAR first

Post by auskento »

Just build a faster server, that's what I did :)

Mind you, I only have an 8 Mbit connection, which I limit to about 550 KB/s. I get a 350MB TV episode in about 12 mins.

On my old server (AMD XP 2000), this would on average take 70 seconds to verify and 70 seconds to extract.

On my new server (dual Xeon 1.8), this has gone down to about 7 seconds to verify and 6 seconds to extract.
The controller on this motherboard is also significantly faster, so I don't have any throughput problems now while streaming content to a PC or multiple TVIX-style units, even if there is still disk activity going on in the background :)
Phasma
Newbie
Posts: 5
Joined: April 26th, 2008, 4:39 pm

Re: RAR > PAR > RAR first

Post by Phasma »

We're not talking about 350MB SD TV eps. Think more like 14GB full-HD movies. Those take quite a bit more time. :)
neilt0
Full Member
Posts: 120
Joined: January 22nd, 2008, 4:16 am

Re: RAR > PAR > RAR first

Post by neilt0 »

Phasma wrote: We're not talking about 350MB SD TV eps. Think more like 14GB full-HD movies. Those take quite a bit more time. :)
And you've never had to repair any of these? Are you nuts?

I have a GigaNews account and regularly download 50GB posts. I'd be crazy not to par check before unpacking.
methanoid
Jr. Member
Posts: 66
Joined: March 7th, 2008, 6:33 am

Re: RAR > PAR > RAR first

Post by methanoid »

shypike wrote: It could, but we also need some attention for "nicing" the par2 process, because you will still have problems when a repair is needed.
CPU usage can be regulated: just set SABnzbd's CPU priority lower than that of MediaCenter.

Actually, the most worrying part is that it is (on Windows at least) not possible to prioritize disk access.
On Windows it's perfectly possible for a low-prio CPU process to saturate the disk channel completely.
(Run a large xcopy in the background and then try to start another application and you'll know what I mean.)
Do you mean setting SABnzbd's CPU priority using Task Manager, or is there some way of doing it automatically? Like using SAB the way Folding@Home and SETI do, to soak up any spare CPU cycles and disk activity!?  ;D  ;D
shypike
Administrator
Posts: 19774
Joined: January 18th, 2008, 12:49 pm

Re: RAR > PAR > RAR first

Post by shypike »

I'm sure there are utilities to set a process's priority.
Maybe some day we'll build it into SABnzbd itself.
Still no fix for the disk usage priority, though...
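For what it's worth, a process can lower its own CPU priority from Python's standard library; a minimal sketch assuming a Unix-like system (`os.nice` does not exist on native Windows, where the Win32 priority-class API would be needed instead, and as noted above neither approach helps with disk priority):

```python
import os

def lower_own_priority(increment=10):
    """Raise this process's niceness, i.e. lower its CPU priority.

    Works on Unix-like systems only; os.nice() is unavailable on Windows,
    where SetPriorityClass from the Win32 API would be used instead.
    Returns the new niceness value (0 normal .. 19 lowest), or None
    if the platform doesn't support it.
    """
    if not hasattr(os, "nice"):
        return None  # e.g. native Windows Python
    return os.nice(increment)
```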
SABUser
Newbie
Posts: 24
Joined: April 11th, 2008, 11:04 am

Re: RAR > PAR > RAR first

Post by SABUser »

Phasma wrote: I'd like to add an idea that might be an easier in-between step to implement before on-the-fly par2 verification. I'm not sure if it will work, ...
This is a very interesting idea. How viable is it?
auskento wrote: Just build a faster server, thats what I did :)

Mind you, I only have an 8mbit connection, which i limit to about 550k. ...
You hit the nail on the head. If you have a rented server, you get slower (but not slow) hardware and much faster internet access. Upgrading to the kind of hardware you're talking about would cost several thousand euro per year extra; just for SABnzbd, that's not viable.
tdian
Newbie
Posts: 1
Joined: May 12th, 2008, 4:45 pm

Re: RAR > PAR > RAR first

Post by tdian »

PAR2 on-the-fly verification is actually pretty easy to implement (I just finished implementing it in the old SAB client).
Basically the only thing you have to do is extract the MD5 hashes from a par2 file and then verify the files once you assemble them.

Per-block verification (i.e. figuring out exactly how many blocks are missing) is a bit harder to do, and probably not worth implementing in SAB (if blocks are missing, repairing is going to take some time anyway).
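The approach described above boils down to a streaming hash compare. A minimal sketch, assuming the expected MD5 has already been pulled from the par2 File Description packet (the packet parsing itself is omitted here):

```python
import hashlib

def verify_file_md5(path, expected_md5_hex, chunk_size=1 << 20):
    """Stream a file through MD5 and compare against the hash that a par2
    File Description packet declares for it.

    Returns True on a match, meaning the file could go straight to unrar
    without running a separate full par2 verify pass.
    """
    md5 = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            md5.update(chunk)
    return md5.hexdigest() == expected_md5_hex.lower()
```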
neilt0
Full Member
Posts: 120
Joined: January 22nd, 2008, 4:16 am

Re: RAR > PAR > RAR first

Post by neilt0 »

TDIAN RETURNS!  ;D  ;D  ;D

Great to see you here, and thank you so much for creating my favourite application!
SABUser
Newbie
Posts: 24
Joined: April 11th, 2008, 11:04 am

Re: RAR > PAR > RAR first

Post by SABUser »

I would like to restart the debate on the original idea.

Looking at my history, my last 98 par2 checks all passed. Most of these checks were for data sets of 350MB or more. So whilst I understand that many users always want to par2 check, for me the majority of the time it simply is not required; a direct unrar would succeed.

On-the-fly parity checks would be great, but that's still a lot of CPU time that, for me, is not needed.
shypike
Administrator
Posts: 19774
Joined: January 18th, 2008, 12:49 pm

Re: RAR > PAR > RAR first

Post by shypike »

The on-the-fly check has been implemented for 0.5.0.
Just wait and see how that behaves first.
SABUser
Newbie
Posts: 24
Joined: April 11th, 2008, 11:04 am

Re: RAR > PAR > RAR first

Post by SABUser »

I will indeed wait and see, but speaking theoretically it will still do calculations that I don't need it to do, regardless of how well on-the-fly works.

FYI, I have downloaded 9 more 350MB NZBs since I posted, and all 9 needed no repair. From what I can see, 1 set of files out of the last 114 sets has needed repair. That's approximately 70GB of data that didn't need checking and 0.4GB that did.
shypike
Administrator
Posts: 19774
Joined: January 18th, 2008, 12:49 pm

Re: RAR > PAR > RAR first

Post by shypike »

The way SABnzbd is designed now, it's not possible to retry a defective download afterwards.
So when you do have an incomplete download, SABnzbd will not be able to help you repair the job.
You will definitely encounter jobs with missing articles.
But, we'll discuss the idea of optional par2 in the team.
ziddey
Newbie
Posts: 24
Joined: March 14th, 2008, 2:20 pm

Re: RAR > PAR > RAR first

Post by ziddey »

shypike wrote: I do have plans for on-the-fly par2 verification.
Ideally one should do the par2 verification when articles are assembled
into a file. In combination with an enabled (large) memory cache, this would be
the most efficient way.
But it's a lot of work and not high on the list right now.
Woo! This is exactly the suggestion I was going to make. If the user is willing to devote enough RAM, assemble the articles into RAR files in memory, perform the parity check immediately, and then write to disk. Or, if a file is bad, write it to disk anyway and know to fetch par files as necessary.

As for RAR before PAR: if there were a way to force-mark a file as "good", it could save a lot of time. Say unrarring gets through the first 13 RARs and fails on the 14th. If we could tell par2 the first 13 are good and to work from there, it'd be less taxing, no? From that point we would just check everything further as needed, since if one RAR is damaged, it's very plausible that others are too. How about even unrarring as you download, as the files become available?



As for the par check not taking much time: in my setup, I have a single dedicated 500GB SATA-300 drive for SABnzbd downloads. My download speed is such that if a file takes me 20 minutes to download, I then have to spend 13 minutes par checking. After that, when everything has checked out, the unrar process begins, which takes another 4 minutes. With some of these techniques, I'm sure I'd be looking at maybe 25 minutes tops in this scenario instead of 37. I know the unrar would be faster if the temporary files were on a different drive, and I know I'm approaching the limits of hard-drive grunt, but there's still room to cut a lot of time. It could be 33% faster, eh?
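Using the numbers from the post above (purely illustrative), the saving from hiding the verification inside the download works out roughly like this:

```python
# Stage timings in minutes, taken from the scenario described above.
download, par_check, unrar = 20.0, 13.0, 4.0

# Today the three stages run back to back.
sequential = download + par_check + unrar   # 37 minutes

# With on-the-fly verification, the par check overlaps the download,
# leaving only the unrar to run after the last article arrives.
overlapped = download + unrar               # 24 minutes

saved = sequential - overlapped             # 13 minutes
fraction_saved = saved / sequential         # ~0.35, in the ballpark of "33% faster"
```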


Happy holidays!!
shypike
Administrator
Posts: 19774
Joined: January 18th, 2008, 12:49 pm

Re: RAR > PAR > RAR first

Post by shypike »

The on-the-fly par2 check has been implemented for release 0.5.0.
It works if you use the sources from Subversion.
An official Beta is still 1-2 months away.