Why don't we measure RAM in GHz?

True. But as enthusiasts, we don't normally entertain the marketing man's nonsense, yet if I said my RAM was running at 2.4GHz you lot would think I'm strange.
 
Never really looked into how it works, but isn't it redundant to have the CPU and RAM at different clock speeds? Based on the made-up assumption that memory can only be read by the CPU, and that this can only happen x times per second, there'd never be a reason to clock RAM higher than the CPU. So all we'd have to do is match the CPU and RAM clocks and we could stop discussing it? Of course I may be totally wrong; I'm largely making this up on the spot.
 
Because exact RAM speeds matter, or at least they did a few years ago.

For example, on my first-gen i5 750 the rated RAM speed was 1333MHz.
If you said 1.3GHz it would look like it was incompatible.

As of 2009 the new CPUs are VERY easy to overclock; before that (unless you had the £800+ unlocked version) the multiplier was locked, meaning you had to up the FSB to increase core speed. This cocked up ALL RAM (and other, e.g. PCI-E) speeds.

I would say that now exact RAM speeds are far less relevant, because whatever you set them to will not change when you overclock, as you simply up the multiplier.
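
To make the multiplier-versus-FSB point concrete, here's a rough Python sketch of the arithmetic; the base clocks and multipliers are illustrative figures (a first-gen chip actually uses a ~133.33MHz base clock and board-specific memory ratios), not exact BIOS settings:

```python
# Sketch: how base-clock (FSB/BCLK) overclocking drags RAM speed along,
# versus multiplier-only overclocking. All numbers are illustrative.

def core_clock_mhz(base_clock_mhz, cpu_multiplier):
    return base_clock_mhz * cpu_multiplier

def ram_speed_mts(base_clock_mhz, memory_multiplier):
    # DDR data rate (MT/s) derived from the same base clock
    return base_clock_mhz * memory_multiplier

# Stock-ish first-gen settings: 133MHz base clock, 20x CPU multiplier,
# 10x memory ratio (real boards use ~133.33MHz, giving 1333MT/s).
print(core_clock_mhz(133, 20), ram_speed_mts(133, 10))   # 2660 1330

# Old-style overclock: raise the base clock to 160MHz.
# The core speed goes up, but the RAM gets dragged up with it.
print(core_clock_mhz(160, 20), ram_speed_mts(160, 10))   # 3200 1600

# Unlocked-multiplier overclock: base clock stays at 100MHz, only the
# CPU multiplier changes, so the RAM speed is untouched.
print(core_clock_mhz(100, 46), ram_speed_mts(100, 32))   # 4600 3200
```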
 
True. But as enthusiasts, we don't normally entertain the marketing man's nonsense,

We do if we quote storage in bytes using 1000 instead of 1024. ;)

At least Hz is true decimal, so 3400MHz is 3.4GHz. I am ambivalent, often quoting RAM in GHz or MHz.

But as the man said, 'it's not what it's rated at, it's what it's running at'.

Anyway, the measure of RAM should be its size in bytes, not its frequency in Hz.
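
For anyone following the unit pedantry, a quick Python sketch of the two conventions being mixed together here: Hz prefixes are plain decimal, while bytes get counted in either powers of 1000 or powers of 1024 depending on who's doing the counting:

```python
# Hz prefixes are plain decimal, so quoting RAM speed in GHz loses nothing.
mhz = 3400
print(mhz / 1000, "GHz")        # 3.4 GHz

# Bytes are where the conventions split: vendors count in powers of
# 1000 (GB), most operating systems count in powers of 1024 (GiB).
gigabyte = 1000**3
gibibyte = 1024**3
print(gibibyte / gigabyte)      # 1.073741824 -> ~7% gap per "gigabyte"
```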
 
Because exact RAM speeds matter, or at least they did a few years ago.

For example, on my first-gen i5 750 the rated RAM speed was 1333MHz.
If you said 1.3GHz it would look like it was incompatible.

As of 2009 the new CPUs are VERY easy to overclock; before that (unless you had the £800+ unlocked version) the multiplier was locked, meaning you had to up the FSB to increase core speed. This cocked up ALL RAM (and other, e.g. PCI-E) speeds.

I would say that now exact RAM speeds are far less relevant, because whatever you set them to will not change when you overclock, as you simply up the multiplier.

1333MHz isn't 1.3GHz though...
 
We do if we quote storage in bytes using 1000 instead of 1024. ;)

At least Hz is true decimal, so 3400MHz is 3.4GHz. I am ambivalent, often quoting RAM in GHz or MHz.

But as the man said, 'it's not what it's rated at, it's what it's running at'.

Anyway, the measure of RAM should be its size in bytes, not its frequency in Hz.

That's not marketing nonsense. ;) It's the difference between gigabytes and gibibytes.

Storage is sold in gigabytes, so a 256GB SSD really does hold 256GB (256 × 10^9 bytes). Operating systems, however, count in gibibytes (GiB, 2^30 bytes) but label the figure as gigabytes (GB), which is why the drive shows up as roughly 238GB.
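
A minimal sketch of that conversion, just the arithmetic rather than querying any real drive:

```python
# Why a "256GB" SSD shows up as roughly 238GB in the operating system:
# the vendor counts decimal gigabytes, the OS counts binary gibibytes
# but still labels the figure "GB".
advertised_gb = 256
size_bytes = advertised_gb * 1000**3      # 256,000,000,000 bytes
size_gib = size_bytes / 1024**3           # ~238.4 GiB
print(f"{advertised_gb}GB = {size_gib:.1f}GiB (shown as ~{size_gib:.0f}GB)")
```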
 
Don't we...?
How many people are talking about "4K monitors", when it's more like 2160p? :D

It's not more like 2160p, it IS 2160p. This 4K nonsense really does need to stop. People seem to get upset when it's pointed out as well. 3840 isn't 4000, so it's not 4K.

4K is a cinema standard that uses a slightly wider aspect ratio than 16:9. 2K was a thing in cinemas too, but we never saw anyone calling Full HD or 1080p "2K" at the time.

The K thing needs to stop because of how vague and inaccurate it is. The number of resolutions that people are now calling #K is just foolish.

Now, people refer to 1920x1080, 2048x1080, 2560x1440, 2560x1600 as "2K".

2880x1800, 3200x1800, 3440x1440 "3K".

Also, somehow dual 2560x1440 displays side by side are being called "5K", despite having only half the actual pixel count of what 5K would even be.

So for the most part, when people are talking about resolutions using "K" you really don't know what particular one they're talking about.
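
For reference, a quick Python sketch of the pixel counts behind those labels; the DCI cinema figures (4096x2160 and 2048x1080) are the standard ones, the rest are the resolutions mentioned above:

```python
# Pixel counts behind the "#K" labels people throw around.
resolutions = {
    "DCI 4K (cinema)":    (4096, 2160),
    "UHD, sold as '4K'":  (3840, 2160),
    "DCI 2K (cinema)":    (2048, 1080),
    "Full HD / 1080p":    (1920, 1080),
    "1440p":              (2560, 1440),
    "5K":                 (5120, 2880),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w}x{h} = {w * h:,} pixels")

# Two 2560x1440 panels side by side versus actual 5K:
dual_1440p = 2 * 2560 * 1440
true_5k = 5120 * 2880
print(dual_1440p / true_5k)   # 0.5 -> exactly half the pixels of real 5K
```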
 