Best type of HDD for 20GB+ newsgroup file parity checking

I often obtain 1080p video MKVs to load onto the dedicated decoder on my flatscreen in the other room, but my Maxtor DiamondMax 10 500GB drive takes an absolute age to parity check a 19GB film with QuickPar, then ages again to unpack onto the same drive with WinRAR. I have a separate 74GB Raptor for Win7 which is fine at the moment; it's the storage drive that's hopeless at this task. It can't keep up with the queue of files my 50Mb internet downloads while it's parity checking another file.

I'm considering 3x Samsung SpinPoint F3 500GB (SATA-II, 16MB cache) and using my Maxtor to unpack onto.

Am I better off spending money on three or four newer 500GB drives and striping them for speed, or would changing to an SSD be worth it?
 
Hi,

I do the same as you. QuickPar can't really be sped up, unfortunately: it only uses a single core, so the higher the clock speed the better.

It amazes me that no one has made a multi-core-ready PAR repair utility, given the number of newsgroup subscribers.
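The verify stage is embarrassingly parallel across the files in a set, which is why the single-core limitation grates. As a rough sketch (not how QuickPar is actually implemented; PAR2 proper hashes per-slice, and `verify_parts`/`expected` here are made-up names), hashing the rar parts concurrently looks something like this in Python:

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor


def md5_of(path, bufsize=1 << 20):
    """Hash one file in 1MB chunks so memory stays flat."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        while chunk := f.read(bufsize):
            h.update(chunk)
    return path, h.hexdigest()


def verify_parts(paths, expected, workers=4):
    """Check each part's MD5 against its expected digest in parallel.

    `expected` maps path -> hex digest (as a PAR2 set would record).
    Returns the parts that failed and would need repair blocks.
    """
    damaged = []
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for path, digest in pool.map(md5_of, paths):
            if expected.get(path) != digest:
                damaged.append(path)
    return damaged
```

In practice a fast striped array can feed several hashing threads at once, which is exactly the case where one core becomes the bottleneck.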

On my PC I have 3 x 1TB Samsung SpinPoint F3s and they are unbelievably quick in RAID0, plus they give you an absurd amount of space. I can rip two DVDs/Blu-rays onto that striped drive at the same time, whilst repairing a PAR set and unzipping, all without even remotely choking the drive.

I would far sooner have 2.72TB of space for £150 than 64GB for around the same price.

Go RAID0 if you have the space!
 
It amazes me that no one has made a multi-core-ready PAR repair utility, given the number of newsgroup subscribers.

How much would you pay for it? :)

I have a Usenet client in development that parity checks as it goes, downloads the required repair volumes, and repairs, all across multiple threads.
 
There are a few that do. Newsleecher has its 'repair and extract' feature too.

It would be nice if they made a 64-bit version of it. I don't think there's an x64 Delphi compiler yet, although maybe that's changed in the last year.
 
I currently use Newsbin Pro (x64) and it's absolutely rock solid, a far cry from the constant crashes using Grabit.

I'm pretty sure it still uses the same basic PAR utility as QuickPar, though. I'm almost certain it's still single-threaded, as some archives take AGES to repair even on my fast quad with fast drives!
 
On my file server I've got 8x Samsung F1s in hardware RAID 5; extracting on the same disk goes at about 21MB/s and parchecking at about 70MB/s on a 2GHz Athlon64 X2.
My RAID0 SSDs both extract and QuickPar at around 160MB/s, on a 3GHz C2Q.

I don't know how my file server compares to your setup, but I don't feel it's too slow (SABnzbd does all the unrarring and par checking for me anyway). Twenty minutes to parcheck and unrar a 20GB file is nothing compared to the download time.
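Those numbers actually add up to roughly the 20 minutes quoted, and the arithmetic shows the same-disk extract is the slow step, not the parcheck. A quick back-of-the-envelope (assuming 1GB = 1024MB and sustained rates):

```python
def minutes(size_gb, rate_mb_s):
    """Time in minutes to stream size_gb gigabytes at rate_mb_s MB/s."""
    return size_gb * 1024 / rate_mb_s / 60


parcheck = minutes(20, 70)   # read the whole set once: ~4.9 min
extract = minutes(20, 21)    # same-disk extract: ~16.3 min
total = parcheck + extract   # ~21 min, in line with the figure quoted
```

On the 160MB/s SSD stripe the same 20GB parcheck drops to a little over two minutes, which is why the drive matters more than people expect.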
 
MultiPar is effectively a clone of the QuickPar GUI, but with support for multiple cores/threads and also command-line usage.

PAR2+TBB is another multicore/multithreaded command-line version. I believe SABnzbd+ uses this, as AbsenceJam pointed out. If you just import NZBs then I'd highly recommend it.

As an alternative you could verify by SFV. That should be quicker, since CRC32 is cheaper than PAR2 hashing, but if you actually need to repair you'd still be faced with the same dilemma.
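For what it's worth, an SFV check really is trivial: the .sfv file is just "filename crc32" pairs, with ';' comment lines. A minimal checker (my own sketch; `check_sfv` is not a real tool's API) looks like:

```python
import os
import zlib


def crc32_of(path, bufsize=1 << 20):
    """CRC32 of a file, computed in 1MB chunks, as 8 hex digits."""
    crc = 0
    with open(path, "rb") as f:
        while chunk := f.read(bufsize):
            crc = zlib.crc32(chunk, crc)
    return f"{crc & 0xFFFFFFFF:08x}"


def check_sfv(sfv_path):
    """Yield (filename, ok) for each entry in a .sfv file.

    SFV lines are 'name crc' pairs; lines starting with ';' are comments.
    """
    base = os.path.dirname(sfv_path)
    with open(sfv_path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith(";"):
                continue
            name, expected = line.rsplit(None, 1)
            yield name, crc32_of(os.path.join(base, name)) == expected.lower()
```

It still has to read every byte off the disk, so on a slow drive the I/O cost is the same; you only save the CPU side.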
 