Too many files fail to unpack (missing blocks)

akuzura
Newbie
Posts: 6
Joined: December 9th, 2009, 9:44 am

Too many files fail to unpack (missing blocks)

Post by akuzura »

I went from Altbinz to SABnzbd because of its awesome Newzbin features.

Recently, though, a lot of the files I download are missing huge chunks of blocks when I check them with QuickPar, and SABnzbd can't repair them. I also get errors all the time while downloading.

Code:

2009-12-09 11:31:43,174 WARNING [assembler] <Article: [email protected], bytes=398673, partnum=19, art_id=None> missing
2009-12-09 11:33:41,980 WARNING [decoder] <Article: [email protected], bytes=398591, partnum=1, art_id=None> => missing from all servers, discarding
2009-12-09 11:38:41,352 WARNING [decoder] <Article: [email protected], bytes=398587, partnum=104, art_id=None> => missing from all servers, discarding
2009-12-09 11:40:02,296 WARNING [assembler] <Article: [email protected], bytes=398587, partnum=104, art_id=None> missing
2009-12-09 11:46:14,137 WARNING [assembler] <Article: [email protected], bytes=398591, partnum=1, art_id=None> missing
2009-12-09 11:51:24,138 WARNING [decoder] <Article: [email protected], bytes=398564, partnum=119, art_id=None> => missing from all servers, discarding
2009-12-09 11:51:24,479 WARNING [decoder] <Article: [email protected], bytes=398592, partnum=129, art_id=None> => missing from all servers, discarding
2009-12-09 11:53:02,105 WARNING [assembler] <Article: [email protected], bytes=398564, partnum=119, art_id=None> missing
2009-12-09 11:53:02,217 WARNING [assembler] <Article: [email protected], bytes=398592, partnum=129, art_id=None> missing
I don't know what this means. I initially thought the post was too old for my Usenet server to handle, but I use Bintube, which I think is an Astraweb reseller, so it should have 300+ days of binary retention, and this post is only 70 days old.

My download speed should be around 1300 kbps, but sometimes, after climbing to that speed, it stops and slides down toward 30 kbps, stays there for a while, and then rises again.

This problem is not limited to big files. I'm downloading a 200 MB file right now and I get the same errors (though I'm not sure these errors are related to my broken-files problem).

So I found a file that had more than half of its blocks missing and ran the same NZB through Altbinz, which downloaded the file without needing any repair. (I downloaded the NZB from a different location: one copy came from a Newzbin report, and the second, working one came from searching the same terms on Bintube's Usenet search. The titles and the age match, so I know it's the same post. I don't think this matters, but it might for all I know.)

Can anyone help me? I know I'm being very vague here, but I don't really understand the problem myself; it's a mystery to me.

So basically:
  • The files I try to download are missing chunks of blocks and are irreparable.
  • I get weird "missing from all servers, discarding" errors.
  • Altbinz could download the same NZB without errors.


Version: 0.4.12
OS: Windows 7
Install-type: Windows Installer
Skin (if applicable): Default
Firewall Software: None
Are you using IPv6? No.
Is the issue reproducible? Yes and no. It happens a lot, but I can't predict when it will happen.
Last edited by akuzura on December 9th, 2009, 6:03 pm, edited 1 time in total.
Stanknuggets
Newbie
Posts: 18
Joined: December 5th, 2009, 11:03 am

Re: Too many files fail to unpack (missing blocks)

Post by Stanknuggets »

I feel you, brother. That is another one of my problems, but I think only with older NZBs.
akuzura
Newbie
Posts: 6
Joined: December 9th, 2009, 9:44 am

Re: Too many files fail to unpack (missing blocks)

Post by akuzura »

^^^
But it's only with SABnzbd; it works fine with Altbinz. That's the weird part.
shypike
Administrator
Posts: 19774
Joined: January 18th, 2008, 12:49 pm

Re: Too many files fail to unpack (missing blocks)

Post by shypike »

There's only one way to find out: please email the suspect NZB to bugs at sabnzbd.org.

One thing I have seen in the past is that different programs handle incorrect NZBs differently.
For example, sometimes you get an NZB with duplicate articles that claim to describe the same file part.
SABnzbd always uses the first one of the pair; if another program picks the second one, that might turn out to be the better choice (or not).
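
For illustration, here is a minimal Python sketch of that first-one-wins rule. It is not SABnzbd's actual code; it only assumes the standard NZB XML format (namespace http://www.newzbin.com/DTD/2003/nzb), and the function name is made up for the example.

Code:

import xml.etree.ElementTree as ET

NZB_NS = "{http://www.newzbin.com/DTD/2003/nzb}"

def first_segment_wins(nzb_path):
    """Map each file's subject to {segment number: message-id}, keeping only
    the first segment seen for each number and ignoring later duplicates."""
    files = {}
    for file_elem in ET.parse(nzb_path).getroot().iter(NZB_NS + "file"):
        segments = files.setdefault(file_elem.get("subject", ""), {})
        for seg in file_elem.iter(NZB_NS + "segment"):
            number = int(seg.get("number", "0"))
            if number in segments:
                continue  # duplicate article for the same part: keep the first one seen
            segments[number] = (seg.text or "").strip()
    return files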
akuzura
Newbie
Posts: 6
Joined: December 9th, 2009, 9:44 am

Re: Too many files fail to unpack (missing blocks)

Post by akuzura »

It may be that the Usenet server is poor.
I'm thinking of changing servers to see if that helps.

Anyone else have the same problem while using Astraweb/Bintube?
akuzura
Newbie
Posts: 6
Joined: December 9th, 2009, 9:44 am

Re: Too many files fail to unpack (missing blocks)

Post by akuzura »

UPDATE
I changed to newshosting.com and the problems went away.
I'm blaming Astraweb.
pobox
Full Member
Posts: 104
Joined: May 3rd, 2008, 6:11 pm

Re: Too many files fail to unpack (missing blocks)

Post by pobox »

If you use Astraweb you can avoid problems by configuring server redundancy, such as adding the SSL or EU server as a backup to the non-SSL or US server.
Stanknuggets
Newbie
Posts: 18
Joined: December 5th, 2009, 11:03 am

Re: Too many files fail to unpack (missing blocks)

Post by Stanknuggets »

pobox wrote: If you use Astraweb you can avoid problems by configuring server redundancy, such as adding the SSL or EU server as a backup to the non-SSL or US server.
That's how I have it set up.

Does the "Fail on yEnc CRC Errors" option need to be checked or unchecked? I have it checked.

My main and backup are the SSL servers.
pobox
Full Member
Posts: 104
Joined: May 3rd, 2008, 6:11 pm

Re: Too many files fail to unpack (missing blocks)

Post by pobox »

Unchecked for two reasons:

1. Almost all of the damaged yEnc articles being passed around Usenet were damaged at the time of posting by known bugs, so downloading them again from the backup server won't accomplish anything.

2. The use of SSL ports is reported to successfully bypass connection problems caused by proxy servers somewhere between the news server and the client.

When people report "CRC errors", it's sometimes difficult to know which type they're talking about. yEnc CRC errors happen when the CRC32 value calculated after decoding an article doesn't match the one supplied with it; the other kind of CRC error appears when WinRAR is unpacking RAR files.
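
As a rough illustration of that yEnc check (this is not SABnzbd's actual decoder; the function name and trailer string below are made up), the CRC32 of the decoded bytes is compared against the crc32=/pcrc32= value from the =yend trailer:

Code:

import re
import zlib

def yenc_crc_ok(decoded_bytes, yend_trailer):
    """Compare the CRC32 of the decoded article data with the value in the
    =yend line, e.g. '=yend size=398591 part=1 pcrc32=1a2b3c4d'."""
    match = re.search(r'p?crc32=([0-9a-fA-F]{8})', yend_trailer)
    if not match:
        return True  # no CRC supplied, nothing to verify against
    expected = int(match.group(1), 16)
    return (zlib.crc32(decoded_bytes) & 0xFFFFFFFF) == expected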

Having a 2nd Astraweb server as a backup is said to fix "430 No such article" errors.
indigo6ix
Newbie
Posts: 1
Joined: January 3rd, 2010, 8:40 am

Re: Too many files fail to unpack (missing blocks)

Post by indigo6ix »

I used to get errors on pretty much everything I downloaded lately with Astra.

I changed to Giganews a few days ago and have not had a problem since.

I know absolutely LOADS of people who are complaining about Astra with all kinds of errors. :/

Sad times... Astra used to be so good, but now they have really let the side down.
pobox
Full Member
Posts: 104
Joined: May 3rd, 2008, 6:11 pm

Astraweb detailed testing

Post by pobox »

Since posting to this thread I bought an Astraweb block for testing, and discovered that if any mid-retention article is missing from one Astraweb server, it is missing from all of them, and it won't magically re-appear later.
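
Here is a rough sketch of that kind of test (the host names, credentials, and message-ID below are placeholders, not real Astraweb values), using Python's nntplib module to ask each server for the same article with an NNTP STAT command:

Code:

import nntplib  # standard library in older Python versions; removed in 3.13

# Placeholder servers and article ID; substitute real values to run the test.
SERVERS = ["news.example-primary.com", "news.example-backup.com"]
MESSAGE_ID = "<[email protected]>"

for host in SERVERS:
    try:
        with nntplib.NNTP(host, user="username", password="password") as server:
            resp, number, msg_id = server.stat(MESSAGE_ID)
            print(host, "still carries the article:", resp)
    except nntplib.NNTPTemporaryError as err:
        # A missing article typically comes back as "430 No such article".
        print(host, "is missing the article:", err)

If the article is gone from every server in the group, the client can only fall back on PAR2 repair blocks.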
Stanknuggets
Newbie
Posts: 18
Joined: December 5th, 2009, 11:03 am

Re: Too many files fail to unpack (missing blocks)

Post by Stanknuggets »

This may be my case only, but it looks like I had two sticks of bad RAM. Memtest reported errors, so I removed the culprit sticks, and the last four items I snatched have completed successfully.