
Too many files fail to unpack (missing blocks)

Posted: December 9th, 2009, 6:00 pm
by akuzura
I switched from Altbinz to SABnzbd because of its awesome Newzbin features.

Recently, though, a lot of the files I try to download are missing huge chunks of blocks when I check them with QuickPar, and SABnzbd can't seem to repair them. While downloading, I get errors all the time.

Code:

2009-12-09 11:31:43,174 WARNING [assembler] <Article: [email protected], bytes=398673, partnum=19, art_id=None> missing
2009-12-09 11:33:41,980 WARNING [decoder] <Article: [email protected], bytes=398591, partnum=1, art_id=None> => missing from all servers, discarding
2009-12-09 11:38:41,352 WARNING [decoder] <Article: [email protected], bytes=398587, partnum=104, art_id=None> => missing from all servers, discarding
2009-12-09 11:40:02,296 WARNING [assembler] <Article: [email protected], bytes=398587, partnum=104, art_id=None> missing
2009-12-09 11:46:14,137 WARNING [assembler] <Article: [email protected], bytes=398591, partnum=1, art_id=None> missing
2009-12-09 11:51:24,138 WARNING [decoder] <Article: [email protected], bytes=398564, partnum=119, art_id=None> => missing from all servers, discarding
2009-12-09 11:51:24,479 WARNING [decoder] <Article: [email protected], bytes=398592, partnum=129, art_id=None> => missing from all servers, discarding
2009-12-09 11:53:02,105 WARNING [assembler] <Article: [email protected], bytes=398564, partnum=119, art_id=None> missing
2009-12-09 11:53:02,217 WARNING [assembler] <Article: [email protected], bytes=398592, partnum=129, art_id=None> missing
I don't know what this means. I initially thought it meant the post was too old for my Usenet server (I use Bintube, which I think is an Astraweb reseller, meaning it has 300+ days of binary retention, and this post is only 70 days old).

The download speed should be around 1300 kbps, but sometimes after climbing to that speed it drops toward 30 kbps and stays there for a while before rising again.

This problem is not limited to big files. I'm downloading a 200 MB file right now and getting the same errors (though I'm not sure these errors are related to my broken-files problem).

So I found a file that had more than half of its blocks missing and tried running the same NZB through Altbinz, and it downloaded the file without needing to repair. (I downloaded the NZB from a different location: one came from a Newzbin report, and the second, working one came from searching the same terms in Bintube's Usenet search. The titles and ages match, so I know it's the same post. I don't think this matters, but it might for all I know.)

Can anyone help me? I know I'm being vague here, but I don't really understand the problem; it's a mystery to me.

So basically:
The files I try to download are missing chunks of blocks and are irreparable
I get weird "missing from all servers, discarding" errors
Altbinz could download the same NZB without errors


Version: 0.4.12
OS: Windows 7
Install-type: Windows Installer
Skin (if applicable): Default
Firewall Software: None
Are you using IPV6? no
Is the issue reproducible? Yes and no. It happens a lot, but I can't predict when it will happen.

Re: Too many files fail to unpack (missing blocks)

Posted: December 9th, 2009, 7:59 pm
by Stanknuggets
I feel you, brother. That's another one of my problems, but I think only with older NZBs.

Re: Too many files fail to unpack (missing blocks)

Posted: December 9th, 2009, 8:42 pm
by akuzura
^^^
But it's only with SABnzbd; it works fine with Altbinz, and that's the weird part.

Re: Too many files fail to unpack (missing blocks)

Posted: December 10th, 2009, 2:55 am
by shypike
There's only one way to find out: please email the suspect NZB to bugs at sabnzbd.org.

One thing I have seen in the past is that different programs handle incorrect NZBs differently.
For example, sometimes you get an NZB with duplicate articles claiming to describe the
same file part. SABnzbd always uses the first one of the pair; if another program
picks the second one, that might be the better choice (or not).
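The duplicate-segment situation described above can be sketched in a few lines of Python. This is an illustrative reading of the NZB format (the `dedupe_segments` helper is hypothetical, not SABnzbd's actual code): each `<file>` element holds `<segment>` entries keyed by part number, and when two segments claim the same part number, "first one wins" means later duplicates are simply ignored.

```python
import xml.etree.ElementTree as ET

# NZB files use the Newzbin XML namespace.
NZB_NS = "{http://www.newzbin.com/DTD/2003/nzb}"

def dedupe_segments(nzb_xml):
    """Return {file subject: {part number: message-id}}, keeping the
    FIRST segment seen for each part number, as described above."""
    root = ET.fromstring(nzb_xml)
    files = {}
    for f in root.iter(NZB_NS + "file"):
        parts = {}
        for seg in f.iter(NZB_NS + "segment"):
            num = int(seg.get("number"))
            # Duplicate part number: setdefault keeps the first entry,
            # so later duplicates are silently dropped.
            parts.setdefault(num, seg.text.strip())
        files[f.get("subject", "")] = parts
    return files
```

If the first duplicate happens to reference a corrupt or missing article while the second is good, a "first one wins" policy would explain why one client repairs cleanly and another does not.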

Re: Too many files fail to unpack (missing blocks)

Posted: December 10th, 2009, 6:32 am
by akuzura
It may be the Usenet server being poor.
I'm thinking of changing servers to see.

Anyone else have the same problem and also using astraweb/bintube?

Re: Too many files fail to unpack (missing blocks)

Posted: December 10th, 2009, 10:43 am
by akuzura
UPDATE
Changed to newshosting.com and the problems went away.
I'm blaming Astraweb.

Re: Too many files fail to unpack (missing blocks)

Posted: December 11th, 2009, 7:45 pm
by pobox
If you use Astraweb you can avoid problems by configuring server redundancy, such as adding the SSL or EU server as a backup to the non-SSL or US server.
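The redundancy idea is simple: a backup server is only asked for an article after the primary has failed to provide it. A minimal sketch of that fallback loop, with hypothetical names (`ArticleMissing`, `fetch_with_fallback` are illustrative, not SABnzbd's real API):

```python
class ArticleMissing(Exception):
    """Raised when a server answers '430 No such article'."""

def fetch_with_fallback(message_id, servers):
    """Try each configured server in priority order. A backup server
    only sees requests for articles the servers before it could not
    provide, so it costs nothing while the primary is healthy."""
    for fetch in servers:
        try:
            return fetch(message_id)
        except ArticleMissing:
            continue
    # Exhausted every server: the article really is gone.
    return None  # "missing from all servers, discarding"
```

This also explains the log messages in the first post: SABnzbd only reports "missing from all servers" after every configured server has been tried.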

Re: Too many files fail to unpack (missing blocks)

Posted: December 12th, 2009, 12:59 pm
by Stanknuggets
pobox wrote: If you use Astraweb you can avoid problems by configuring server redundancy, such as adding the SSL or EU server as a backup to the non-SSL or US server.
That's how I have it set up.

Does the "Fail on yEnc CRC Errors" option need to be checked or unchecked? I have it checked.

My main and backup are the SSL servers.

Re: Too many files fail to unpack (missing blocks)

Posted: December 12th, 2009, 4:09 pm
by pobox
Unchecked for two reasons:

1. Almost all of the damaged yEnc articles being passed around Usenet were damaged at the time of posting by known bugs, so downloading them again from the backup server won't accomplish anything.

2. The use of SSL ports is reported to successfully bypass connection problems caused by proxy servers somewhere between the news server and the client.

When people report "CRC errors", it's sometimes difficult to know which type they're talking about. yEnc CRC errors happen when the CRC32 value calculated after decoding an article doesn't match the one provided in the yEnc trailer. The other kind of CRC error occurs when WinRAR is unpacking RAR files.
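The yEnc check described above is just a CRC32 comparison: the poster puts a `crc32=` value in the `=yend` trailer line, and the client recomputes the checksum over the decoded bytes. A minimal sketch (the `yenc_crc_ok` helper is illustrative, not SABnzbd's actual function):

```python
import zlib

def yenc_crc_ok(decoded_bytes, trailer_crc_hex):
    """Compare the CRC32 of a decoded article body against the
    crc32= hex value from the yEnc =yend trailer line."""
    # Mask to 32 bits for a consistent unsigned value.
    calculated = zlib.crc32(decoded_bytes) & 0xFFFFFFFF
    return calculated == int(trailer_crc_hex, 16)
```

If the article was already corrupt when it was posted (point 1 above), this check fails identically on every server, which is why re-downloading from a backup doesn't help.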

Having a 2nd Astraweb server as a backup is said to fix "430 No such article" errors.

Re: Too many files fail to unpack (missing blocks)

Posted: January 3rd, 2010, 8:43 am
by indigo6ix
I was getting errors on pretty much everything I downloaded lately with Astra.

I changed to Giganews a few days ago and haven't had a problem since.

I know absolutely LOADS of people who are complaining about Astra with all kinds of errors. :/

Sad times... Astra used to be so good, but now they've really let the side down.

Astraweb detailed testing

Posted: January 3rd, 2010, 6:48 pm
by pobox
Since posting to this thread I bought an Astraweb block account for testing, and discovered that if a mid-retention article is missing from one Astraweb server, it is missing from all of them, and it won't magically reappear later.

Re: Too many files fail to unpack (missing blocks)

Posted: January 6th, 2010, 12:25 pm
by Stanknuggets
This may be my case only, but it looks like I had two sticks of bad RAM. Memtest reported errors, so I removed the culprit sticks, and the last four items I snatched have completed successfully.