I have some Perl scripts that work with about 10,000 individual files on my hard drive. During processing they read these files extensively, create and delete a large number of temporary files, and end up creating and writing about 4,000 output files. The next time I run them the whole process repeats: the 4,000 output files are deleted and everything starts again.
An SSD could speed these scripts up tremendously compared to a hard drive, but is this exactly the type of workload an SSD is *not* suited to, given the high churn rate of thousands and thousands of files? I'm running Windows 7. Does the automatic TRIM function help?