
AMD Zen 2 (Refresh) 3900XT/3800XT/3600XT

I presume if the top end goes up the sustained clocks will come up too? I've never seen anyone complain their chip clocked too high ;)

I'm interested to know because if the sustained in-game clocks are only 200MHz higher it's going to make little difference, <4%.

To be honest, Steve Burke is one of those people who likes to test games at lower settings, which flatters lower thread count CPUs. The guy likes to pretend that he's really clever, but he clearly does not understand that using lower settings in games also puts less load on the CPU, especially in multithreading; games are designed that way to put less stress on lower end CPUs when you use lower IQ settings. So the 4-core CPUs look better than they actually are.

Having said all of that, despite Steve's completely bloated charts padded out with highly overclocked Intel CPUs, if you actually look closely at them, Zen 2 gaming performance is actually very good.
This is the weakest Ryzen 3000 chart he has: the 3900X there at 4.3GHz is just under the 5.1GHz 9600K, or about 17% slower than the 5.2GHz 10900K.

But look at the difference between the stock 3900X and the 4.3GHz 3900X: it's 2%.
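Those percentages are easy to sanity check. A quick back-of-the-envelope script; the ~4,200MHz sustained baseline here is my assumption for illustration, not a measured figure from the review:

```python
# Best-case gain from a clock bump is just the clock ratio; real game gains
# are usually smaller (the stock-vs-4.3GHz 3900X example showed only ~2%).

def clock_uplift_pct(base_mhz: float, bump_mhz: float) -> float:
    """Upper-bound % performance gain from raising the sustained clock by bump_mhz."""
    return 100.0 * bump_mhz / base_mhz

# e.g. a chip sustaining ~4,200MHz in games (assumed), with a +200MHz XT bump:
print(round(clock_uplift_pct(4200, 200), 1))  # 4.8, and that's the ceiling
```

Even that ~4.8% is an upper bound, since games rarely scale 1:1 with clock speed.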

https://www.gamersnexus.net/hwreviews/3587-intel-core-i9-10900k-cpu-review-benchmarks



(benchmark chart image: qk042ki.png)
 
I think these XT versions are especially great for first-gen Ryzen owners on B350/X370 boards: since the Ryzen 4000 series won't support those boards, they now have an option to upgrade to the faster Ryzen 3000 models.

I am in this boat, currently having a B350 with a Ryzen 5 1600, but how would a 3600XT compare to a 4600? I assume the 4600 would be a lower price (£180) and slightly (much?) better performance, but would need a new motherboard. If I sold the old stuff the difference wouldn't be that much more in the end, £50 perhaps.
 
I am in this boat, currently having a B350 with a Ryzen 5 1600, but how would a 3600XT compare to a 4600? I assume the 4600 would be a lower price (£180) and slightly (much?) better performance, but would need a new motherboard. If I sold the old stuff the difference wouldn't be that much more in the end, £50 perhaps.

The 3600 launched at £180, the 3600X at £220.

My guess is AMD will put the prices up a little, £200 for the 4600 and £240 for the 4600X.
 
1usmus didn't confirm anything. He just saw the Chinese tweet that said FCLK 2,000 and retweeted it.

This seems too good from AMD: every "improvement" that Zen 3 might have promised, we are getting now.

1: Clock speed boosts
2: Vastly improved FCLK of... well, we can now say 3,800MHz at least

The only thing not known yet is the chiplet design: whether the 6- and 8-core parts will take a 3300X-style approach. If they do this, I don't really see what Zen 3 has to offer besides maybe guaranteeing an FCLK of 2,000 and that IPC uplift.
 
I am in this boat, currently having a B350 with a Ryzen 5 1600, but how would a 3600XT compare to a 4600? I assume the 4600 would be a lower price (£180) and slightly (much?) better performance, but would need a new motherboard. If I sold the old stuff the difference wouldn't be that much more in the end, £50 perhaps.
We still don't know the actual specs of the 3600XT, but if it's just a 200-300MHz boost then we are probably talking about a 2-3% speed increase.
If however there are more improvements under the hood, then we could easily see a 5%+ speed increase, but we need more details before we can make good predictions.
The Ryzen 4000 series is a next-gen part on an enhanced 7nm process node, and leaks suggest it's around 15% faster than the Ryzen 3000 series.
 
I don't expect to see any architectural improvements. They will just be taking the current design and improving it slightly.

So maybe higher clocks and a faster FCLK.

There will be no IPC improvements or redesign of the chiplets, etc.

All these major changes will come with 4000 series.
 
I'm interested to know because if the sustained in-game clocks are only 200MHz higher it's going to make little difference, <4%.

To be honest, Steve Burke is one of those people who likes to test games at lower settings, which flatters lower thread count CPUs. The guy likes to pretend that he's really clever, but he clearly does not understand that using lower settings in games also puts less load on the CPU, especially in multithreading; games are designed that way to put less stress on lower end CPUs when you use lower IQ settings. So the 4-core CPUs look better than they actually are.

Having said all of that, despite Steve's completely bloated charts padded out with highly overclocked Intel CPUs, if you actually look closely at them, Zen 2 gaming performance is actually very good.
This is the weakest Ryzen 3000 chart he has: the 3900X there at 4.3GHz is just under the 5.1GHz 9600K, or about 17% slower than the 5.2GHz 10900K.

But look at the difference between the stock 3900X and the 4.3GHz 3900X: it's 2%.

https://www.gamersnexus.net/hwreviews/3587-intel-core-i9-10900k-cpu-review-benchmarks



(benchmark chart image: qk042ki.png)

That's true, they should always use ultra settings when testing CPUs. That chart is misleading. In that game the 3900X with SMT disabled and tuned memory was almost on top. It is a different game, F1 2018 vs F1 2019, but it should work the same way.


 
That's true, they should always use ultra settings when testing CPUs. That chart is misleading. In that game the 3900X with SMT disabled and tuned memory was almost on top. It is a different game, F1 2018 vs F1 2019, but it should work the same way.


No, using higher settings will not make a difference to the higher core count CPUs like the 9900K and the 3900X you highlighted, but the lower core count CPUs like the 7600K and 7700K would fall further back.
The effect of SMT is something different entirely; it's a problem in its own right in some games. It's why you sometimes see the 9700K ahead of the 9900K.
 
I'm interested to know because if the sustained in-game clocks are only 200MHz higher it's going to make little difference, <4%.

To be honest, Steve Burke is one of those people who likes to test games at lower settings, which flatters lower thread count CPUs. The guy likes to pretend that he's really clever, but he clearly does not understand that using lower settings in games also puts less load on the CPU, especially in multithreading; games are designed that way to put less stress on lower end CPUs when you use lower IQ settings. So the 4-core CPUs look better than they actually are.

Having said all of that, despite Steve's completely bloated charts padded out with highly overclocked Intel CPUs, if you actually look closely at them, Zen 2 gaming performance is actually very good.
This is the weakest Ryzen 3000 chart he has: the 3900X there at 4.3GHz is just under the 5.1GHz 9600K, or about 17% slower than the 5.2GHz 10900K.

But look at the difference between the stock 3900X and the 4.3GHz 3900X: it's 2%.

https://www.gamersnexus.net/hwreviews/3587-intel-core-i9-10900k-cpu-review-benchmarks


It would be interesting to go back to some of the older CPUs that were tested at low resolutions to see how they scaled, and test them again to see if the prediction was accurate. I'm betting they didn't scale as assumed. The software evolves to take advantage of the newer hardware, and I can see the pre-new-console generation aging quickly as the newer games arrive.
 
It would be interesting to go back to some of the older CPUs that were tested at low resolutions to see how they scaled, and test them again to see if the prediction was accurate. I'm betting they didn't scale as assumed. The software evolves to take advantage of the newer hardware, and I can see the pre-new-console generation aging quickly as the newer games arrive.

Back when Zen 2 first came out I made plenty of mention that the 7700K, and to some extent the 9600K too, had stutter in games, with even some owners of these CPUs posting videos about it on YouTube.

If you're running a very high end GPU like the 2080 Ti you're applying so much GPU muscle that very high image quality settings don't faze it. What you are doing is applying more assets in the scene: longer draw distances, more NPCs, more complex physics. All this adds to the draw calls, and modern games are very multithreaded in that. Lower core count CPUs end up tapped out on all threads, and that causes the stutter.

By lowering those settings you're taking that stuff away, and with it those draw calls are reduced, leaving the lower core count CPUs with more headroom.
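That headroom argument can be sketched with a toy model. All the numbers here (cost per draw call, call counts) are invented purely for illustration, not measurements from any review:

```python
# Toy model: per-frame CPU time grows with draw calls, and the draw-call work
# is split across however many threads the CPU has. Figures are made up.

COST_PER_CALL_US = 2.0  # assumed CPU cost per draw call, in microseconds

def frame_time_ms(draw_calls: int, threads: int) -> float:
    """CPU frame time if draw-call work parallelised perfectly across threads."""
    return draw_calls * COST_PER_CALL_US / threads / 1000.0

for settings, calls in [("low", 4000), ("ultra", 12000)]:
    t4 = frame_time_ms(calls, 4)
    t12 = frame_time_ms(calls, 12)
    print(f"{settings}: 4 threads {t4:.1f}ms, 12 threads {t12:.1f}ms")
```

With these made-up figures, going from low to ultra triples the 4-thread chip's CPU frame time while the 12-thread chip stays well inside a 60fps (16.7ms) budget either way, which is the stutter mechanism in miniature.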

Reviewers like Steve Burke never made mention of any of this stuttering; he doesn't get any. If you look at his gaming slides there is no consistency. Unlike someone like Hardware Unboxed, who run everything on the highest settings at 1080p, Gamers Nexus slides are a complete mishmash, like he's tailoring each game for the CPUs he puts on his slides.

I think they are accurate for the way he reviews those games, but like he does with his conflated power consumption testing he likes those slides to look a certain way. Nothing about his CPU reviews is apples to apples; there are a lot of cherries...

Don't look to Steve Burke to find answers to that question :)
 
Back when Zen 2 first came out I made plenty of mention that the 7700K, and to some extent the 9600K too, had stutter in games, with even some owners of these CPUs posting videos about it on YouTube.

If you're running a very high end GPU like the 2080 Ti you're applying so much GPU muscle that very high image quality settings don't faze it. What you are doing is applying more assets in the scene: longer draw distances, more NPCs, more complex physics. All this adds to the draw calls, and modern games are very multithreaded in that. Lower core count CPUs end up tapped out on all threads, and that causes the stutter.

By lowering those settings you're taking that stuff away, and with it those draw calls are reduced, leaving the lower core count CPUs with more headroom.

Reviewers like Steve Burke never made mention of any of this stuttering; he doesn't get any. If you look at his gaming slides there is no consistency. Unlike someone like Hardware Unboxed, who run everything on the highest settings at 1080p, Gamers Nexus slides are a complete mishmash, like he's tailoring each game for the CPUs he puts on his slides.

I think they are accurate for the way he reviews those games, but like he does with his conflated power consumption testing he likes those slides to look a certain way. Nothing about his CPU reviews is apples to apples; there are a lot of cherries...

Don't look to Steve Burke to find answers to that question :)

Yes, I'd like to see more real world testing: background stuff, multi-monitor, etc. It's not as controlled or scientific, but run it for long enough and discard the outliers and it will give you a pretty good idea of what an average user could expect.
 
Back when Zen 2 first came out I made plenty of mention that the 7700K, and to some extent the 9600K too, had stutter in games, with even some owners of these CPUs posting videos about it on YouTube.

If you're running a very high end GPU like the 2080 Ti you're applying so much GPU muscle that very high image quality settings don't faze it. What you are doing is applying more assets in the scene: longer draw distances, more NPCs, more complex physics. All this adds to the draw calls, and modern games are very multithreaded in that. Lower core count CPUs end up tapped out on all threads, and that causes the stutter.

By lowering those settings you're taking that stuff away, and with it those draw calls are reduced, leaving the lower core count CPUs with more headroom.

Reviewers like Steve Burke never made mention of any of this stuttering; he doesn't get any. If you look at his gaming slides there is no consistency. Unlike someone like Hardware Unboxed, who run everything on the highest settings at 1080p, Gamers Nexus slides are a complete mishmash, like he's tailoring each game for the CPUs he puts on his slides.

I think they are accurate for the way he reviews those games, but like he does with his conflated power consumption testing he likes those slides to look a certain way. Nothing about his CPU reviews is apples to apples; there are a lot of cherries...

Don't look to Steve Burke to find answers to that question :)
He also likes to draw out his answers. Why take 2 minutes to sum up your review when you can take 10?
 
Here is a comparison of the new XT models vs the regular Ryzen 3000 CPUs.
(comparison image: 343fdzah73.jpg)

Wow. If they can do those base clocks at the same TDP, then they must be highly binned.

I'll believe it when I see it.

It would also be a mistake not to set the 3600XT base clock to the same as the 3800XT's.
 