Whatever memory you choose on day one won't make an iota of difference given how everyone with ADL CPUs is going to be GPU bound out the wazoo.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
It's technically not a performance thing. It's to help games that don't work because their DRM trips up on the hybrid cores. I can see people using it for performance tweaking though, and I imagine game benchmarks will test both options.
Alder Lake is very complicated when it comes to memory speed. On one half of the chip you have the Skylake clusters with a low-latency ring bus, and on the other the Atom cores in a high-latency mesh. I think Alder Lake will be a performance balancing act within a narrow margin when it comes to memory tuning.
I don't think the design will suddenly see double-digit performance gains from DDR5-7500 or dropping down to CAS 20 1T.
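To put rough numbers on it, here's a back-of-the-envelope first-word latency comparison (the timings below are purely illustrative, not specific kits):

```python
# Approximate CAS-to-first-word latency: CL cycles at the memory clock,
# where the memory clock is half the DDR data rate.
# latency_ns = CL / (data_rate / 2) * 1000 = 2000 * CL / data_rate

def first_word_latency_ns(cl: int, data_rate_mts: int) -> float:
    """Rough first-word latency in nanoseconds for a given CL and data rate."""
    return 2000 * cl / data_rate_mts

configs = [
    ("DDR4-4000 CL16", 16, 4000),   # tight DDR4 timings
    ("DDR5-6000 CL36", 36, 6000),   # typical early DDR5
    ("DDR5-7500 CL36", 36, 7500),   # illustrative high-clock DDR5
]
for name, cl, rate in configs:
    print(f"{name}: ~{first_word_latency_ns(cl, rate):.1f} ns")
```

The headline frequency jump mostly claws back the latency that the looser DDR5 timings give away, which is why I'd expect tuning gains to stay within that narrow margin.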
I literally just showed you scaling and you still came up with this?
Unless you play WD Legion, which is 22% faster on DDR5, making me wonder if Cyberpunk would also show such gains in CPU-limited scenarios, as they both have RT. Not sure if that is an anomaly, as the same game didn't show any gains when tested by eteknix.
Why do you and a few others I can name keep spouting this stuff about RAM performance in PC gaming?
If you are using reconstruction/upscaling techniques like DLSS/FSR at 4K, then the 1080p data is actually very relevant even at 4K.
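Rough sketch of why, using the commonly quoted DLSS per-axis scale factors (treat them as approximate):

```python
# Internal render resolution when outputting "4K" through DLSS.
# Scale factors are the commonly quoted per-axis ratios, so treat as approximate.

DLSS_MODES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

target_w, target_h = 3840, 2160  # 4K output
for mode, scale in DLSS_MODES.items():
    w, h = round(target_w * scale), round(target_h * scale)
    print(f"{mode}: internal render ~{w}x{h}")
```

So "4K with DLSS Performance" puts the GPU load closer to a 1920x1080 render, which pushes the bottleneck back toward the CPU and memory.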
Running a 12900K with a 3090 at the lowest settings at 1080p is just trying to trick people, when in reality anyone who actually plays PC games buys this faster RAM and then ends up on the forums asking what happened.
Tell it like it is: you're benching for e-peen and that is fine, but at least tell people the truth, because I know you know the difference. If I hurt your feelings, too bad, suck it up buttercup.
Can DDR4-4000 kits be run on Alder Lake DDR4 motherboards at Gear 1 (1:1) then? Or only on some Z690 boards?
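Just to be clear on what I mean by Gear 1: my understanding is the memory controller runs 1:1 with the memory clock (which is half the data rate). Quick sketch of the difference:

```python
# What Gear 1 vs Gear 2 means for the memory controller (IMC) clock.
# The data rate is in MT/s; the actual memory clock is half of that (DDR),
# and the gear ratio sets how fast the IMC runs relative to that clock.

def imc_clock_mhz(data_rate_mts: int, gear: int) -> float:
    """Memory controller clock for a given DDR data rate and gear ratio."""
    memory_clock = data_rate_mts / 2   # e.g. DDR4-4000 -> 2000 MHz MCLK
    return memory_clock / gear         # Gear 1 = 1:1, Gear 2 = 1:2

for rate in (3200, 3600, 4000, 4400):
    print(f"DDR4-{rate}: Gear 1 IMC = {imc_clock_mhz(rate, 1):.0f} MHz, "
          f"Gear 2 IMC = {imc_clock_mhz(rate, 2):.0f} MHz")
```

So DDR4-4000 at Gear 1 would need the IMC to hold 2000 MHz, which is why I'm asking whether the DDR4 boards can actually do it.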
Thanks. Is DDR4 running at 4400MHz considered to be B-die? Or is it necessary to buy more expensive kits?
OK. Here is a DDR4-4400 kit:
https://www.overclockers.co.uk/patr...qwdBQJZziTmv6neEwEm8-1636222630-0-gaNycGzNCKU
Voltage = 1.45V, if that is important.
Something new for you AMD and Intel fans to fight about:
https://www.pugetsystems.com/labs/a...0-Series-2242/#IntelQuickSyncPerformanceBoost
Have fun ..
Now you know why there were so many work applications in reviewers' benchmarks that you would not normally see those reviewers run; my guess is it was part of the Intel benchmark guidance required for the reviews. It also explains why gaming is only seeing normal increases in benchmarks when not GPU bound, why some applications are just broken under the new ADL chip design (coming out slightly slower or faster), why some give totally fake results like 8GHz-9GHz overclocks, and, worse, why some used the iGPU's Intel Quick Sync as part of a CPU benchmark, where it should have been disabled to show the true CPU result, not a CPU+iGPU result.
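If anyone wants to see the difference for themselves, something along these lines (assuming an ffmpeg build with QSV support and a test clip called input.mp4, both of which are my assumptions) runs a CPU-only encode next to a Quick Sync one:

```python
# Sketch: the same encode done two ways with ffmpeg, to show why a timed
# "CPU" benchmark means little once Quick Sync (the iGPU encoder) is in play.
import subprocess

# CPU-only result: software x264 encode, no iGPU involvement.
subprocess.run(["ffmpeg", "-y", "-i", "input.mp4",
                "-c:v", "libx264", "-preset", "medium", "cpu_out.mp4"])

# CPU+iGPU result: h264_qsv offloads the encode to Quick Sync on the iGPU,
# so the timing no longer reflects the CPU cores alone.
subprocess.run(["ffmpeg", "-y", "-i", "input.mp4",
                "-c:v", "h264_qsv", "qsv_out.mp4"])
```

Time the two and you'll see why a "CPU" number with QSV enabled stops meaning much.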
Anyway, be nice when you fight about this one, and as always: buy what fits your needs, not what someone else is telling you, and check the reviews for the workloads you do. If it's gaming only, well, we all know the reality on that one now: it's a little uplift if you already have a decent CPU, or could even be slightly lower performance depending on the game. So buy what fits your use case.
Let the battle begin ..
Haha, no need for HEDT if your 12900K already competes with TR.