
AMD Zen 2 (Ryzen 3000) - *** NO COMPETITOR HINTING ***

I know what it's for, but it's an unrealistic scenario that will affect <1% of people, so basing an opinion on it just isn't worthwhile. It's like testing performance on Windows XP or on SteamOS. It's not really going to make much difference now, is it?

It depends how often you upgrade. If you replace your CPU every 1 or 2 years, then 1440p and 4K benchmarks are important for showing real-world performance in TODAY'S games.

However, I'm planning for approximately a 5-year upgrade cycle for my CPU. In 5 years' time, we'll be using graphics cards that are way faster than anything that's available today. So to get an idea of how a CPU may perform in 4 or 5 years, I want to remove any GPU bottlenecks, because I won't have a GPU bottleneck in 4 or 5 years. To do that, I'll be looking at low-resolution benchmarks such as 1080p.
 
But the point is, the games you will be playing in 5 years' time won't be the same games that the 720p benchmarks are done on today.

The game engines will be totally different and tuned for the hardware of the time.
 

There's also the point that trying to mimic scenario A by using scenario B won't yield reliable and comparable results.
 
Well, World of Warcraft runs on a modded Warcraft 3 engine that's from 2000. Your point?? I still play it, and so do many millions of people. And with Classic coming out in August, even more will come back.
This is actually an urban myth: it started development in the WC3 engine (there are screenshots of it knocking around), but they quickly found it was completely unsuitable for the purpose. WoW uses its own specifically developed engine.
 

It's also recently gotten a pretty decent DX12 upgrade that improved CPU performance as well.
 
I need reviews to hurry up, as I'm currently considering an i7-8700K with an ITX board, which is going to be around £100 cheaper than a 3700X and an X570.
 
But the point is, the games you will be playing in 5 years' time won't be the same games that the 720p benchmarks are done on today.

The game engines will be totally different and tuned for the hardware of the time.

It will give you a better indication of which CPU is faster.

4K in a very GPU-bottlenecked game could show two CPUs to within margin of error. Yet at 1080p there could be a 20% difference. I'd get the CPU that's 20% faster at 1080p, because it's 20% faster. The other CPU would never be faster in future games.
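The bottleneck argument above can be sketched as a toy model: delivered frame rate is capped by whichever of the CPU or GPU is slower for the scene. All figures below are made up purely for illustration, not real benchmark results.

```python
# Toy frame-rate model for the benchmarking argument above.
# Assumption (hypothetical numbers): CPU B prepares frames 20% faster than CPU A.

def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """The frame rate you actually see is limited by the slower component."""
    return min(cpu_fps, gpu_fps)

cpu_a, cpu_b = 100.0, 120.0  # frames/sec each CPU can prepare

# At 4K the GPU is the bottleneck, so both CPUs look identical...
print(delivered_fps(cpu_a, gpu_fps=60.0))   # 60.0
print(delivered_fps(cpu_b, gpu_fps=60.0))   # 60.0

# ...while at 1080p the GPU is fast enough that the 20% CPU gap shows.
print(delivered_fps(cpu_a, gpu_fps=300.0))  # 100.0
print(delivered_fps(cpu_b, gpu_fps=300.0))  # 120.0
```

On this model, a future, much faster GPU behaves like the 1080p case: the hidden CPU difference reappears, which is the whole rationale for low-resolution testing.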
 
Well I'm sold!

[image: gIE83pt.jpg]



Edit: Wendell approved as well!

[image: P1zLi9w.jpg]

lol :D
 
It will give you a better indication of which CPU is faster.

4K in a very GPU-bottlenecked game could show two CPUs to within margin of error. Yet at 1080p there could be a 20% difference. I'd get the CPU that's 20% faster at 1080p, because it's 20% faster.

Faster at playing today's engines, yes; faster in 5 years, it means nothing at all.
 
Might be worth considering an X470 motherboard, to be honest.

I have a local computer shop that has an MSI B450I Gaming Plus AC and will flash it to the newest BIOS for me, saving me £100, but I don't know how much better the X570 will be for performance until I see reviews, and then how well it will stack up against the i7-8700K. I mainly use my PC to game, browse the web, and do the odd extract from a zip file lol.
 
It will give you a better indication of which CPU is faster.

1440p in a very GPU-bottlenecked game could show two CPUs to within margin of error. Yet at 1080p there could be a 20% difference. I'd get the CPU that's 20% faster at 1080p, because it's 20% faster.
Yes, but only in that specific game engine; otherwise it would be 20% quicker across the board. It's no wonder that Intel are quicker on some game engines, as that's what developers were targeting as their user base, but that's rapidly changing.

We already see Zen 1 doing better in modern games compared to how it fared when it first came out against the equivalent Intel processors.
 

Spot on!

The 7600K smacked the 1600 around in gaming when it launched: faster clocks and better IPC.
Two years later, the 1600 is significantly better, despite Ryzen 1 still having lower clocks and IPC.

You can still see the same engines from back then in use today, and they still show Ryzen doing poorly, namely the Far Cry games. They just don't run well on Ryzen.

 
Anyone know if the Gigabyte X370 Gaming K5 will run the 12-core part at stock?

Just updated to the BIOS with Zen 2 support and got a reminder of how annoying Zen 1 (1700) is with Hynix RAM; I can't for the life of me get anything better than 2667MHz now :/ (was at 2933 before).
 
Help me. I'm getting excited by these early numbers turning up online. If you had an 8700K OC'd at 5GHz, would you upgrade to AMD, or would you wait and see what happens now the two big rivals are at loggerheads?

I literally only use this PC for gaming: 1080p @ 240Hz.
 