Why do we write RAM speeds in MHz?

RAM has only recently reached the multiple-GHz range, unlike CPUs, so it's a throwback to when RAM was much slower.
Understandable, but CPUs were called 1GHz the second they hit that speed. RAM hit "1000MHz" quite some time ago. Why didn't it follow CPUs and just go straight to 1GHz? Strange one.
 
Not that strange IMO :)

RAM speeds are often not nice round values, e.g. 2133MHz, 2667MHz, 2933MHz. Representing these in GHz would mean either less accuracy (2.1GHz vs 2133MHz) or less readability from using more characters (2.133GHz vs 2133MHz). Generally you use the highest unit that still shows the exact value easily, and with RAM that means MHz is still king given the values in common usage.

CPUs are far more commonly found at round 100MHz multiples, so GHz is more easily used.
 
2666MHz, for example, is its effective rate, not its true clock speed. As it's DDR, the actual clock is 1333MHz, so advertising in GHz wouldn't really have worked in the public's mind until very recently. But a lot of enthusiasts will now quote RAM speed in GHz when asked.
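The doubling described above can be sketched in a couple of lines (a hypothetical helper, not from any real library):

```python
def effective_speed(true_clock_mhz: float) -> float:
    # DDR ("double data rate") transfers data on both the rising and
    # falling edge of the clock, so the advertised "speed" is twice
    # the true clock frequency.
    return true_clock_mhz * 2

print(effective_speed(1333))  # DDR4-2666: real clock 1333MHz, marketed as 2666
```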
 
RAM has always been like that - you used to get PC2-6400 etc., using the effective speed for 800MHz RAM, so it seems to have stuck as a convention to use the long version.

Gets a bit silly when you have like PC4-36800 though LOL.
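If I have the naming scheme right, the number in those PCn-xxxx module labels is just the effective transfer rate multiplied by the 8-byte (64-bit) bus width, giving peak MB/s. A quick sketch (hypothetical helper names):

```python
def module_rating(effective_mts: int, ddr_generation: int) -> str:
    # Module label = "PC" + DDR generation + peak bandwidth in MB/s,
    # where bandwidth = effective transfer rate (MT/s) x 8-byte bus.
    return f"PC{ddr_generation}-{effective_mts * 8}"

print(module_rating(800, 2))   # PC2-6400, the example above
print(module_rating(4600, 4))  # PC4-36800, the silly-looking one
```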

I think capacities moved to GBs at about the same time RAM speeds started to make sense in GHz, so they probably stuck with MHz to avoid confusion.
 
Aren't RAM transfer rates calculated in bytes per second? So you wouldn't say GHz when it hits 1000 like you would with a CPU, which measures in cycles per second.

200MHz clock rate × 2 (DDR) × 8 bytes (64-bit bus) = 3,200MB/s bandwidth - taken from another site, but you get the idea.
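That quoted formula works out as follows (a minimal sketch; the default factors assume DDR and a standard 64-bit bus):

```python
def bandwidth_mb_s(clock_mhz: float, transfers_per_cycle: int = 2,
                   bus_bytes: int = 8) -> float:
    # Peak bandwidth = true clock x transfers per cycle (2 for DDR)
    # x bus width in bytes (8 bytes = 64 bits).
    return clock_mhz * transfers_per_cycle * bus_bytes

print(bandwidth_mb_s(200))  # 3200.0 MB/s, matching the example above
```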
 