See, now these RAM speed results are confusing.
There are benchmarks showing that the difference between 3000MHz and 3600MHz isn't that huge.
Here, though, the 1% lows show a huge difference.
Here, for example, it doesn't show a big difference.
That's because the 3rd link doesn't show the 1% lows, just averages, which hide the differences.
You didn't put the 2nd link in the post, so I can't see it.
When textures are being served from GPU memory, faster RAM won't help, but when they are first loaded from disk, RAM performance can help. Some games don't use the GPU at all for certain textures or data, and those games are more likely to see larger differences, such as Lightning Returns. (Games with these characteristics usually don't get benchmarked by reviewers.)
Some games are coded in a mess, like FF15, where all GPU textures are duplicated into RAM, although I never tested FF15's performance to see if RAM latency has any impact there.
Also, I don't think memory bandwidth helps much with games; it's more about latency. So what would help is if people stopped doing 3000 vs 3600 tests and instead did CL12 vs CL14 vs CL16 at the same speed, command rate timings, etc. I know 3600 CL14 has lower true latency than 3200 CL14, but mixing speeds adds more confusion and unneeded variables, especially when the CAS latency is higher at the higher speeds.
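The 3600 CL14 vs 3200 CL14 point comes down to true latency in nanoseconds: CL is counted in memory clock cycles, and the memory clock is half the DDR transfer rate. A quick sketch (illustrative numbers, not from any benchmark above):

```python
# True CAS latency in ns: CL cycles over the memory clock,
# which is half the DDR transfer rate (MT/s).
# latency_ns = CL / (MT/s / 2 / 1000) = 2000 * CL / MT/s
def cas_latency_ns(transfer_rate_mts: float, cl: int) -> float:
    return 2000 * cl / transfer_rate_mts

for mts, cl in [(3000, 16), (3200, 14), (3600, 14), (3600, 16)]:
    print(f"DDR4-{mts} CL{cl}: {cas_latency_ns(mts, cl):.2f} ns")
```

So 3600 CL14 (about 7.78 ns) really is lower latency than 3200 CL14 (8.75 ns), which is why comparing different speeds at the same CL mixes two variables at once.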