17 May 2008

This will be my homage to Usenet.
Background:
NZB files are like torrent files in that they describe where to find the files you are looking for, usually on a newsgroup such as alt.binaries.something.
A Usenet client connects to a server the user pays for (e.g. giganews.com) and requests the files listed in the .nzb file. Downloads are fast (800 KB/s in my case), which means a typical 800 MB documentary can be had in around 20 minutes! What you end up with is a pile of .rar, .r01 through .rNN archive volumes, as well as .par2 files.
Par2 files are parity files that allow a damaged or missing .rNN volume to be repaired or regenerated, depending on how complete your download is.
Now, I download a lot... so I like everything to be extracted and dumped into its own folder.
My simple scripts to automate this:
Repair all:
for i in *.par2; do par2repair "$i"; done
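The cleanup scripts below assume the archives have already been unpacked, but the extraction step itself isn't shown. Assuming the unrar command-line tool is installed (my assumption, not part of the original post), a minimal sketch would be:

```shell
# Extract every archive in the current directory.
# unrar follows the .r01, .r02 ... volumes on its own,
# so only the leading .rar file needs to be named.
shopt -s nullglob        # no .rar files -> the loop body simply never runs
for i in *.rar; do
    unrar x "$i"
done
```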

Remove all:

rm *.PAR2
rm *.par2
rm *.nzb
rm *.r*
rm *sample*

Move all:

for i in *.avi; do mkdir ${i/.avi*}; mv ${i/.avi*}* ${i/.avi*}/; done
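That one-liner is fine for the dotted filenames typical of Usenet posts, but it trips over names containing spaces. A more defensive sketch of the same idea, with everything quoted (the demo filenames are hypothetical, not from the original post):

```shell
# demo setup: a hypothetical finished download, in a scratch directory
cd "$(mktemp -d)"
touch My.Documentary.avi My.Documentary.nfo

# for each video, make a folder named after it and move the video
# plus its companion files (.nfo, .srt, ...) into that folder
for i in *.avi; do
    d="${i%.avi}"                         # folder name = filename minus extension
    mkdir -p "$d"
    for f in "$d"*; do                    # everything sharing the prefix...
        [ "$f" = "$d" ] || mv "$f" "$d"/  # ...except the folder itself
    done
done
```

The inner loop skips the freshly created folder itself, which the one-liner's bare `mv` would otherwise complain about.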