stinka said:
did mine at 16k, which was reccomended for games
The person who recommended this was incorrect.
Basically, stripe size can have a fairly dramatic effect on the sequential read rate.
Any file smaller than the stripe size is not split between the hard drives; it is simply placed on one of the two drives in the array.
This means smaller stripes are best suited to Windows and general applications, as these use thousands of small files (.dlls and the like).
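The placement rule can be sketched as a toy model. This is purely illustrative (a hypothetical helper, not how any real controller exposes it): RAID 0 maps each stripe-sized chunk of the logical address space to alternating drives, so a file smaller than one stripe never crosses a drive boundary.

```python
# Toy model of RAID 0 striping: logical byte offset -> (drive, chunk on that drive).
# Hypothetical helper for illustration only; real controllers do this in firmware.
def locate(offset_bytes, stripe_kb=16, drives=2):
    stripe_bytes = stripe_kb * 1024
    chunk = offset_bytes // stripe_bytes        # which stripe chunk overall
    return chunk % drives, chunk // drives      # (drive index, chunk index on drive)

# An 8 KB file starting at offset 0 fits inside one 16 KB stripe,
# so its first and last bytes land on the same drive:
print(locate(0))             # -> (0, 0)
print(locate(8 * 1024 - 1))  # -> (0, 0)
print(locate(16 * 1024))     # next stripe starts on the other drive -> (1, 0)
```

With a 128K stripe the same pattern holds, just with far fewer boundaries, which is the whole trade-off the rest of this post is about.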
Games often use very large cache files, e.g. the .gcf files in Half-Life. A large stripe is beneficial here; imagine the following scenario with a 1GB file:
1,000,000 KB ~= 1GB
Think very simplistically; of course this isn't exactly what happens.
If your stripe is 16K, this file will be split into 62,500 parts - 31,250 on each drive.
If your stripe is 128K, your file will only be split into 7,812 parts - about 3,906 on each drive.
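The arithmetic above is easy to double-check (plain division, nothing RAID-specific):

```python
# Quick check of the chunk counts from the post, using 1 GB ~= 1,000,000 KB.
file_kb = 1_000_000
for stripe_kb in (16, 128):
    parts = file_kb // stripe_kb
    print(f"{stripe_kb}K stripe: {parts} parts, {parts // 2} per drive")
# 16K stripe: 62500 parts, 31250 per drive
# 128K stripe: 7812 parts, 3906 per drive
```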
The more parts the file is in, the more CPU power is required to read and reassemble them. You will notice higher CPU usage with the smaller stripe than with the larger one.
Now imagine the small-stripe scenario with some fragmentation: there are far more pieces to seek out and reassemble.
For this reason it is well advised to defrag often on small-stripe arrays.
That said, small-stripe arrays will often give the highest scores in programs like HDTach.
I recall my 74GB Raptor RAID 0 array scored about 125MB/s sustained on a 16K stripe but only 107MB/s on a 128K stripe.
The reason is that with a smaller stripe, more of the data is actually split across both drives, whereas with a large stripe the smaller files sit on a single drive and don't benefit from the array.
In summary: small stripe = nippy Windows, large stripe = better gaming performance.
64K is a good go-between, but I always go for 32K as this sits about right between Windows and games (game files are only loaded once, e.g. maps and caches, so there is no point over-optimising for them).