Script to automate fixing failed downloads

Come up with a useful post-processing script? Share it here!
pancus
Newbie
Posts: 15
Joined: March 20th, 2008, 11:46 pm

Script to automate fixing failed downloads

Post by pancus »

I hate downloading something only to have it finish with 10 or so missing blocks. I've come to learn that if you download the same thing that was posted earlier or, strangely enough, the exact same thing from a different indexer, you can combine the two (or more) into one working download.

I wrote a script to automate this. In short, when you download the same thing again, the folders end up named thing.1, thing.2, and so on, and the script uses those extra copies to repair your download.
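To make the convention concrete, here is a tiny sketch (folder names are made up, not from the script): the duplicate job gets a numeric suffix, and during repair the script symlinks each extra copy into the first folder as x1, x2, and so on.

```shell
# Hypothetical layout: the same NZB downloaded twice into the incomplete dir.
mkdir -p incomplete/Some.Release incomplete/Some.Release.1

# The script exposes the second copy inside the first folder via a relative
# symlink, so par2 can see the recovery files of both copies at once.
ln -s ../Some.Release.1 incomplete/Some.Release/x1

ls -l incomplete/Some.Release
```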

Here is an example run in "normal" mode. "Normal" mode rescans the folders and tries the repair again; once in a blue moon SABnzbd says something failed to repair when in fact it's repairable.

Code: Select all

attempting to repair usenet incomplete downloads... normal mode.

repairing folder './Human.Toilet.2021.iNTERNAL.720p.BluRay.x264-PooP'
You need 5 more recovery blocks to be able to repair.

done.
Here's an example of something I fixed today (name changed). It failed needing the 5 blocks above; I downloaded the exact same thing from a different indexer (same post date etc.), which failed with ~100 blocks missing, but together they had enough to repair.

Code: Select all

attempting to repair usenet incomplete downloads... 
append 'hc' for hardcore mode, 'normal' for normal mode...

repairing folder './Human.Toilet.2021.iNTERNAL.720p.BluRay.x264-PooP'
- found './Human.Toilet.2021.iNTERNAL.720p.BluRay.x264-PooP.1'
Repair complete.
Here is the script if anyone wants to play with it. It's written in bash, so run it with "bash whateveryounamedit.sh" or chmod it and run it directly. It's possible there are some bugs, but it works for me.

Code: Select all

#!/bin/bash

# to use:
#  download same release exactly as the failed one.
#  make sure the nzb is named _exactly_ the same.
#  if it fails too you can keep running this and downloading other nzbs until it repairs.
#  do not stop this once it starts running, it may leave folders in an unwanted state.

# how it works:
#  looks for directories similar to /name /name.1 /name.2
#  gets into /name, symlinks the others inside and uses all pars/rars for repair.
#  does _not_ unrar on success, do that inside sab so it names/cleans up properly.

# hardcore mode tries the par2 inside the other folders in case they are different.
# normal mode only scans folders without a .1 since sab sometimes fails for no reason.

# sabnzbd incomplete folder to operate on.
INCOMPLETE="/250g/usenet/incomplete"
# nonroot user to drop permissions to when run as root, preferably the same user sabnzbd runs under.
NONROOT="poop"

[ ! -d "${INCOMPLETE}" ] && echo "'${INCOMPLETE}' not found." && exit 1
[ "$(whoami)" == 'root' ] && echo 'dropping root...' && su "${NONROOT}" -c "$0 $1" && exit

[ "$1" == 'hc' ] && HARDCORE="1"
[ "$1" == 'normal' ] && NORMAL="1"

echo -n 'attempting to repair usenet incomplete downloads... '
[ -z "${HARDCORE}${NORMAL}" ] && echo '' && echo "append 'hc' for hardcore mode, 'normal' for normal mode..."
[ -n "${HARDCORE}" ] && echo 'HARDCORE MODE!'
[ -n "${NORMAL}" ] && echo 'normal mode.'
echo 'starting in 3 sec...'
sleep 3

cd "${INCOMPLETE}" || exit
find . -maxdepth 1 -type d -print0 | sort -z | while read -d '' -r DIR; do 
  [ "${DIR}" == '.' ] && continue
  if [ -d "${INCOMPLETE}/${DIR}.1" ] && [ -z "${NORMAL}" ]; then
    echo ''
    echo "repairing folder '${DIR}'"
    cd "${INCOMPLETE}/${DIR}" || exit
    MAINPAR="$(find . -maxdepth 1 -type f -name "*.par2" | grep -vE '\.vol[0-9]{1,3}\+[0-9]{1,3}\.par2')"
    [ ! -f "${MAINPAR}" ] && MAINPAR="$(find . -maxdepth 1 -type f -name "*.par2" | sort | head -n1)"
    [ ! -f "${MAINPAR}" ] && echo 'setting MAINPAR failed...' && continue
    MD5SUM="$(md5sum "${MAINPAR}" | cut -c1-32)"
    for j in $(seq 1 9); do
      if [ -d "../${DIR}.${j}" ]; then
        echo "- found '${DIR}.${j}'"
        ln -s "../${DIR}.${j}" "x${j}"
        if [ $(find "x${j}/" -maxdepth 1 -type f -name "*.par2" | wc -l) -ge 1 ]; then
          # rename potentially dupe par2 files in /x# to please par2repair
          # (util-linux rename syntax: rename FROM TO FILES...)
          rename ".par2" ".x${j}.par2" "x${j}/"*.par2
          [ -n "${HARDCORE}" ] && mv "x${j}/"*".x${j}.par2" .
        fi
      fi
    done
    if [ -n "${HARDCORE}" ]; then
      # try each diff par2
      for j in $(seq 1 9); do
        if [ -L "x${j}" ]; then
          echo "-- trying .${j} par2"
          ALTPAR="$(find . -maxdepth 1 -type f  -name "*.x${j}.par2" | grep -vE '\.vol[0-9]{1,3}\+[0-9]{1,3}\.x[1-9]\.par2')"
          [ ! -f "${ALTPAR}" ] && ALTPAR="$(find . -maxdepth 1 -type f -name "*.x${j}.par2" | sort | head -n1)"
          if [ -f "${ALTPAR}" ]; then
            MD5ALT="$(md5sum "${ALTPAR}" | cut -c1-32)"
            if echo "${MD5SUM}" | grep -q "${MD5ALT}" ; then
              echo '   par2 is a dupe, skipping.'
            else
              MD5SUM="${MD5SUM} ${MD5ALT}"
              par2repair "${ALTPAR}" ./*.vol*.par2 {,x[1-9]/}*.rar 2>&1 | grep -E 'more recovery blocks|Repair complete'
            fi
          else
            echo '   setting ALTPAR failed.'
          fi
        fi
      done
    else
      par2repair "${MAINPAR}" {,x[1-9]/}*.{par2,rar} 2>&1 \
        | grep -E 'more recovery blocks|Repair complete'
    fi
    # done. move things back and remove symlinks.
    for j in $(seq 1 9); do
      if [ -L "x${j}" ]; then
        if [ $(find . -maxdepth 2 -type f -name "*.x${j}.par2" | wc -l) -ge 1 ]; then
          [ -n "${HARDCORE}" ] && mv ./*".x${j}.par2" "x${j}/"
          rename ".x${j}.par2" ".par2" "x${j}/"*.par2
        fi
        rm "x${j}"
      fi
    done
  elif [ ! -d "${INCOMPLETE}/${DIR}.1" ] && [ $(find "${INCOMPLETE}/${DIR}" -maxdepth 1 -type f -name "*.par2" | wc -l) -ge 1 ] && [ -n "${NORMAL}" ]; then
    echo ''
    echo "repairing folder '${DIR}'"
    cd "${INCOMPLETE}/${DIR}" || exit
    par2repair ./*.par2 ./*.rar 2>&1 | grep -E 'more recovery blocks|Repair complete'
  fi
done

echo ''
echo 'done.'
Also, after it successfully repairs, just "Retry" the first download again; SABnzbd will detect it as done, unrar, clean up, etc.
sander
Release Testers
Posts: 9061
Joined: January 22nd, 2008, 2:22 pm

Re: Script to automate fixing failed downloads

Post by sander »

pancus wrote: I've come to learn that if you download the same thing that was posted earlier
So, that is the same as clicking "Retry", right?
Did you also try config/switches/#propagation_delay ?
pancus wrote: or, strangely enough, the exact same thing from a different indexer you can combine the two (or more) into one working download.
Can you check: do those two different NZBs have the same article references?
If not, do they have the same rar and par2 files?
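For anyone who wants to check this themselves, a rough sketch (assuming bash and GNU tools; the NZB fragments below are made-up stand-ins): an NZB lists every article as a <segment> element whose text is the Message-ID, so intersecting those IDs shows whether two NZBs reference the same articles.

```shell
# Extract the article Message-IDs (the text of each <segment> element)
# from an NZB file, sorted and de-duplicated.
nzb_ids() { grep -o '<segment[^>]*>[^<]*' "$1" | sed 's/.*>//' | sort -u; }

# Tiny stand-in NZB fragments (real NZBs have more surrounding XML):
printf '<segment bytes="5" number="1">part1@example</segment>\n<segment bytes="5" number="2">part2@example</segment>\n' > a.nzb
printf '<segment bytes="5" number="1">part1@example</segment>\n<segment bytes="5" number="2">other@example</segment>\n' > b.nzb

# Count shared Message-IDs; 0 would mean the two NZBs reference entirely
# different articles, so only the par2/rar content itself can match.
comm -12 <(nzb_ids a.nzb) <(nzb_ids b.nzb) | wc -l   # prints 1 here
```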
pancus
Newbie
Posts: 15
Joined: March 20th, 2008, 11:46 pm

Re: Script to automate fixing failed downloads

Post by pancus »

sander wrote: July 24th, 2021, 2:19 am So, that is the same as clicking "Retry", right?
Did you also try config/switches/#propagation_delay ?
No. You know how you see the same thing posted multiple times but months apart? Like that.
sander wrote: July 24th, 2021, 2:19 am Can you check: do those two different NZBs have the same article references?
If not, do they have the same rar and par2 files?
I'm pretty sure they are different. I've noticed that some indexers give the exact same NZBs for the same thing while others are different; I have no idea why. And IIRC the file names are the same.
sander
Release Testers
Posts: 9061
Joined: January 22nd, 2008, 2:22 pm

Re: Script to automate fixing failed downloads

Post by sander »

Can you share two such NZBs with me? I would like to do some analysis on them.
Puzzled
Full Member
Posts: 160
Joined: September 2nd, 2017, 3:02 am

Re: Script to automate fixing failed downloads

Post by Puzzled »

sander wrote: July 24th, 2021, 2:19 am So, that is the same as clicking "Retry", right?
No, he's using duplicate postings of the same content. If one has a bad part1.rar and the other a bad part2.rar, and they're uncompressed, then par2 may be able to use good data from the other posting to recover the content. I've tried it myself a few times, but it usually didn't work because the content in the NZBs didn't match each other's par2 blocks. Automating it is pretty clever, though.