RAR > PAR > RAR first
Parity files are great. I understand what they are used for and what SABnzbd uses them for.
But there is a percentage of users with very reliable news servers who don't need to use par2 most of the time.
I am suggesting that a feature be added to allow SABnzbd to be configured to attempt an unrar first. If the unrar fails, continue in the manner SABnzbd usually does, essentially forgetting that the unrar attempt ever happened. If the unrar succeeds, bypass par2 altogether.
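A minimal sketch of that try-unrar-first flow. The callables `try_extract` and `verify_and_repair` are hypothetical stand-ins for SABnzbd's real unrar and par2 steps, not its actual API:

```python
def extract_with_fallback(try_extract, verify_and_repair):
    """Try a direct unrar first; fall back to the usual par2
    verify/repair path only if extraction fails.

    Both arguments are illustrative callables returning True on
    success -- not SABnzbd's real internals.
    """
    if try_extract():
        return "extracted"  # clean download: the par2 pass is skipped
    if verify_and_repair() and try_extract():
        return "repaired and extracted"
    return "failed"
```

On a reliable server the first branch is taken almost every time, so the par2 read of the whole download never happens.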
This will likely not be of use to everyone, but by way of an example...
Of the last 30 downloads I have made (all approximately 400 MB), only 1 has needed a par2 repair.
This means my machine ran 29 unnecessary par2 checks over approximately 11.2 GB of data. That is a lot of wasted CPU cycles and a tonne of disk thrashing. More importantly, skipping them would make downloads finish faster.
Thoughts?
Last edited by SABUser on April 11th, 2008, 11:15 am, edited 1 time in total.
-
- Full Member
- Posts: 211
- Joined: January 22nd, 2008, 1:38 pm
Re: RAR > PAR > RAR first
This could also be solved by letting par2 check each file right after it downloads instead of waiting until all the RARs are downloaded.
Maybe in a future version of SAB
Re: RAR > PAR > RAR first
Possibly; however, I can download at almost 100 Mbit, so it would definitely be a race... especially if you consider that by the time download 1 is done and in the unrar phase, download 2 is almost done as well... all these processes fighting for CPU and disk time.
The way I see it, par2 checking should be used to check the validity of data. If the vast majority of the time you are safe to assume that the data is valid, then why bother?
Re: RAR > PAR > RAR first
Having a "reliable server" is not enough.
Many articles get lost because there is a problem between the uploader and his/her server.
There are lots of posts that are missing articles on ALL servers in the world.
Re: RAR > PAR > RAR first
Whilst I agree that in theory you are correct, the theory and the real world don't match. I can say with absolute certainty that the VAST majority of my downloads (which are many) pass par2 without the need for repair. This is based on real-world, reproducible data.
If you would like me to keep some notes and post the specifics, I will if it will help prove my case, but please don't disregard this based on theory alone... there must be many people like me that haven't really paid too much attention to their stats.
Re: RAR > PAR > RAR first
Polite bump for opinions
-
- Release Testers
- Posts: 114
- Joined: January 25th, 2008, 1:10 pm
Re: RAR > PAR > RAR first
PAR verification doesn't take that much time, does it?
Re: RAR > PAR > RAR first
No, it's quite efficient, but it does need to read all the data from disk, which itself adds to the bottleneck. When you combine this with all the other things fighting for disk and CPU time, such as unrar and normal article downloading and decoding, it all adds up.
-
- Release Testers
- Posts: 114
- Joined: January 25th, 2008, 1:10 pm
Re: RAR > PAR > RAR first
Well, yeah, it's a nice feature, but definitely not high priority for me, and probably not for a bunch of other people.
Not everyone has giganews
Re: RAR > PAR > RAR first
Yes, I totally agree it should be a low-priority feature. All I am looking for is an indication that it will be added to a "to do" list, i.e. tracked as wanted and perhaps added some day. Nothing more.
Re: RAR > PAR > RAR first
I do have plans for on-the-fly par2 verification.
Ideally one should do the par2 verification when articles are assembled
into a file. In combination with an enabled (large) memory cache, this would be
the most efficient way.
But it's a lot of work and not high on the list right now.
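The verify-while-assembling idea can be illustrated with a rough sketch, assuming per-file MD5 checksums like those par2 records for each source file (the function and its arguments are hypothetical, not SABnzbd code):

```python
import hashlib

def assemble_file(article_chunks, expected_md5=None):
    """Assemble decoded article chunks into file data while hashing
    them on the fly, so the checksum is known the instant assembly
    finishes -- no second pass over the data on disk.

    Illustrative only: par2 stores an MD5 per source file, which is
    what expected_md5 stands in for here.
    """
    digest = hashlib.md5()
    data = bytearray()
    for chunk in article_chunks:
        data.extend(chunk)   # assembly (in a real client: write to cache/disk)
        digest.update(chunk)  # verification piggybacks on the same pass
    ok = expected_md5 is None or digest.hexdigest() == expected_md5
    return bytes(data), digest.hexdigest(), ok
```

With a large memory cache the chunks never need to be re-read, which is why this is the most efficient place to verify.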
Re: RAR > PAR > RAR first
Shypike, would this help with spikes in CPU usage? I'm considering an HTPC/download box and would not want SABnzbd+ grabbing processor time when I was watching a video or, more importantly, recording something.
Re: RAR > PAR > RAR first
It could, but we also need to give some attention to "nicing" the par2 process, because you will still have problems when a repair is needed.
CPU usage can be regulated, just set SABnzbd's CPU priority lower than that of MediaCenter.
Actually, the most worrying part is that it is (on Windows at least) not possible to prioritize disk access.
On Windows it's perfectly possible for a low-priority CPU process to saturate the disk channel completely.
(Run a large xcopy in the background and then try to start another application and you'll know what I mean.)
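On POSIX systems the CPU side of this can be sketched with the standard `os.nice` call; Windows would need `SetPriorityClass` instead, and disk I/O priority remains a separate, harder problem as described above:

```python
import os

def lower_cpu_priority(increment=10):
    """Raise this process's niceness (i.e. lower its CPU priority)
    on POSIX systems. Note this regulates CPU scheduling only; it
    does nothing for disk I/O priority, which is the harder part
    of the problem on Windows."""
    if os.name == "posix":
        return os.nice(increment)  # returns the resulting niceness
    return None  # on Windows: SetPriorityClass, e.g. via ctypes
```

A media-center setup would call something like this in the download/repair process so playback and recording keep scheduling priority.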
Re: RAR > PAR > RAR first
Could the proposed per-RAR par2 checking be a feature that can be disabled in favor of par2 checking after the download completes, rather than a replacement for the status quo?
I favor lower-CPU, lower-RAM par2 checking either way.
Re: RAR > PAR > RAR first
Don't worry.
a) the feature is still far away and b) of course we'll examine the impact first.