Other than WinRAR (which accesses a lot of memory), the real-world difference between DDR3-1333 and DDR3-2400 is about 1%.
http://www.anandtech.com/show/6372/...-to-ddr32400-on-ivy-bridge-igp-with-gskill/11
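On paper the gap should be far bigger: DDR3 moves 8 bytes per channel per transfer, so peak bandwidth works out to roughly 1333 MT/s x 8 B ≈ 10.7 GB/s for DDR3-1333 versus 2400 MT/s x 8 B ≈ 19.2 GB/s for DDR3-2400. Nearly double the bandwidth on paper, yet about 1% in practice.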
Why do 1333 and 2400 differ by only around 1% in most situations? Simple: it's the cache, and the larger the cache, the higher the chance of a cache hit. The hierarchy runs like this (a small demonstration follows the list):
CPU registers (fastest)
Level 1 cache
Level 2 cache
Level 3 cache
DDR
HDD / SSD (slowest)
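To make that concrete, here's a minimal sketch of the effect (my own illustration, assuming a POSIX system for clock_gettime; compile with something like gcc -O2). It does the same number of memory reads twice: once inside a 32 KB working set that fits in L1 cache, and once inside a 256 MB working set that forces most reads out to DDR. Same RAM, same number of accesses, yet the second run is typically many times slower, and that slow case is the only one where RAM speed even matters:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>

/* Time `accesses` strided reads inside a working set of `set_bytes`. */
static double walk(size_t set_bytes, size_t accesses)
{
    size_t n = set_bytes / sizeof(int);
    int *buf = malloc(set_bytes);
    if (!buf) { perror("malloc"); exit(1); }
    memset(buf, 1, set_bytes);            /* touch the pages up front */
    volatile int sink = 0;

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (size_t i = 0, idx = 0; i < accesses; i++) {
        sink += buf[idx];
        idx = (idx + 16) % n;             /* 16 ints = 64 B, one cache line */
    }
    clock_gettime(CLOCK_MONOTONIC, &t1);

    free(buf);
    (void)sink;
    return (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
}

int main(void)
{
    size_t accesses = 200u * 1000 * 1000;
    /* 32 KB fits in L1; 256 MB is far bigger than any L3, so it hits DDR. */
    printf("32 KB working set:  %.2f s\n", walk(32u * 1024, accesses));
    printf("256 MB working set: %.2f s\n", walk(256u * 1024 * 1024, accesses));
    return 0;
}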
When you first switch on your computer, everything lives on the HDD/SSD only. As the computer runs, frequently accessed data is moved up the storage hierarchy. The hierarchy above lets many GB/TB of programs and data sit on the HDD/SSD, yet thanks to the cache levels there is minimal latency when the CPU executes. The CPU's cache controller and the memory management of modern operating systems make this possible.
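The same trick happens at the bottom of the hierarchy too. A rough illustration (again assuming a POSIX system; "bigfile.bin" is just a hypothetical placeholder for any large file): read the same file twice and time each pass. The first pass comes from the HDD/SSD, but the OS keeps the data in RAM (the page cache), so the second pass is served from DDR and is usually dramatically faster:

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Read a whole file, discarding the data; return elapsed seconds. */
static double read_all(const char *path)
{
    char buf[1 << 16];
    FILE *f = fopen(path, "rb");
    if (!f) { perror(path); exit(1); }

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    while (fread(buf, 1, sizeof buf, f) == sizeof buf)
        ;                                 /* we only care about the timing */
    clock_gettime(CLOCK_MONOTONIC, &t1);

    fclose(f);
    return (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
}

int main(void)
{
    /* Hypothetical file name; any file of a few hundred MB works. */
    printf("first read (HDD/SSD):     %.2f s\n", read_all("bigfile.bin"));
    printf("second read (page cache): %.2f s\n", read_all("bigfile.bin"));
    return 0;
}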
Early PCs used an 8086 CPU that had no cache. Back then memory speed really mattered; computers were even advertised with '0 wait state memory', meaning the CPU did not wait additional clock cycles while memory was accessed. Back then, fitting the wrong memory speed would cripple a computer; today you struggle to even notice faster RAM, and it's all down to the caching system.