Raid Controllers

Most mobos just come with 2 RAID ports. You are after RAID 0+1 (10). Choose wisely, as you will need a controller capable of both (with 4 connectors, obviously).
 
I have the Areca ARC1110 on an Asus P5WDG2 WS Pro motherboard, which works very well, with full driver support in Vista (32- and 64-bit). Very pleased with it. It is not cheap, at just over £200, but I was looking to the long term.

It is a true hardware RAID controller.

I bought this because Asus would not release a signed Vista 64bit driver for the Marvell controller on my mobo. :)
 
PhillyDee said:
Most mobos just come with 2 RAID ports. You are after RAID 0+1 (10). Choose wisely, as you will need a controller capable of both (with 4 connectors, obviously).

RAID 0+1 is not RAID 10.

RAID 0+1 is NOT to be confused with RAID 10. In RAID 0+1, a single drive failure will cause the whole array to become, in essence, a RAID 0 array.
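To illustrate the difference (a rough sketch, not from the thread; the 4-drive layouts below are my own example), RAID 10 stripes across mirrored pairs, while RAID 0+1 mirrors two whole stripe sets. Both survive any single drive failure, but RAID 10 tolerates more double failures:

```python
from itertools import combinations

def raid10_survives(failed, pairs=((0, 1), (2, 3))):
    # RAID 10 (stripe of mirrors): OK as long as no mirror pair loses both drives
    return all(any(d not in failed for d in pair) for pair in pairs)

def raid01_survives(failed, stripes=((0, 1), (2, 3))):
    # RAID 0+1 (mirror of stripes): OK only while one whole stripe set is intact
    return any(all(d not in failed for d in stripe) for stripe in stripes)

# Both layouts survive any single failure (0+1 is then effectively a RAID 0)
print(raid10_survives({0}), raid01_survives({0}))  # True True

# Count which two-drive failures each layout survives (4-drive array)
two_failures = list(combinations(range(4), 2))
print(sum(raid10_survives(set(f)) for f in two_failures))  # 4 of 6
print(sum(raid01_survives(set(f)) for f in two_failures))  # 2 of 6
```

The point above falls out of the counts: after one failure, RAID 0+1 has only a single working stripe set left, so any second failure in that set kills the array.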
 
Do you have the correct slot to add it to your mobo? PCI is normally 33 MHz, which limits you to 133 MB/s shared with other devices on the bus, so you won't even see that. If you have a 66 MHz PCI slot, like on an Asus server mobo, you're fine. The card also supports the newer PCI-X.
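For reference, the peak figures above come from clock rate times bus width (a back-of-the-envelope sketch; real shared-bus throughput is lower):

```python
def bus_mb_per_s(clock_mhz, width_bits):
    # Peak bus throughput: clock (MHz) * width in bytes = MB/s
    return clock_mhz * width_bits // 8

print(bus_mb_per_s(33, 32))   # standard 33 MHz/32-bit PCI: 132 (the ~133 MB/s above)
print(bus_mb_per_s(66, 64))   # 66 MHz/64-bit PCI slot: 528
print(bus_mb_per_s(133, 64))  # PCI-X at 133 MHz: 1064
```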
 
pitchfork said:
I'll be having a 1GB 2900 XT in the PCI Express x16 slot. I've heard that using the x16 slot disables PCI-X?
I am using the Nvidia 7900GTO (PCI-E) in an x16 PCI-E slot with the Areca ARC1110 (PCI-X) in a PCI-X slot, and neither is disabled or reduced in throughput.

If I have two PCI-E slots filled, the link goes from x16 (for a single card) to x8 for each of the two occupied slots (this is on the 975X); even then, PCI-X is not affected. :)
 
If true, there ain't much difference using the GPU in an x8 slot, if any. AGP 8x was never even fully saturated, and it actually had more bandwidth one way than PCI-E, but not both ways.
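Rough numbers behind that (a sketch using peak figures I'm assuming: PCI-E 1.x at 250 MB/s per lane per direction, AGP 8x at ~2133 MB/s over a shared half-duplex bus):

```python
agp8x_one_way = 2133   # MB/s: 32-bit bus at 533 MT/s, one direction at a time
pcie_per_lane = 250    # MB/s per lane per direction (PCI-E 1.x)

x8_each_way = 8 * pcie_per_lane    # 2000 MB/s each direction, simultaneously
x16_each_way = 16 * pcie_per_lane  # 4000 MB/s each direction

print(agp8x_one_way > x8_each_way)      # True: AGP 8x edges out x8 one way...
print(agp8x_one_way > 2 * x8_each_way)  # False: ...but PCI-E runs both ways at once
```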
 