• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

AMD Zen 2 (Ryzen 3000) - *** NO COMPETITOR HINTING ***

That Robert guy from AMD repeatedly said in an interview after Computex that AMD are now the market leaders in single-thread performance, multi-thread performance, power consumption and price. Again, bold claims coming from AMD, so make of that what you will.

Didn't think they stated what the GPU was in the demo.

You are right, they did state that the 3800X has removed the CPU bottleneck, moving the bottleneck to the GPU, which is a bold claim.

And they claimed during the presentation that they are either level with or better than the competition. There was no clear-cut benchmark, but it's looking like the competition is starting to heat up.

This is the point, isn't it? "The 3800X removed the CPU bottleneck." The performance on screen was higher than the 9900K, but only by a couple of frames. If they really "removed the CPU bottleneck" and the GPU is now the limit, then given the marginal difference the result isn't indicative of the CPU's performance compared with the 9900K.
 
Did it? The FPS was margin-of-error stuff, the resolution was unknown, and it was running on an RX 5700, which could possibly bottleneck the performance.
So by that logic, if the RX 5700 was holding the system back, drop in a Radeon VII, RTX 2080 or 2080 Ti and the 3800X will move past the 9900K.

Definite, hard numbers would be nice of course, but matching performance within margin of error is a good demonstration, surely?
 
Will? Not necessarily. If the GPU is the bottleneck, then either CPU could actually be the faster one where the GPU is not the bottleneck.
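A quick way to see that argument: the FPS on screen is roughly capped by whichever component is slower, i.e. observed FPS is about min(CPU-limited FPS, GPU-limited FPS). Here is a minimal Python sketch, with every number invented purely for illustration (the real CPU and GPU limits in the demo are unknown):

Code:
# Toy model of the bottleneck argument above.
# Observed FPS is roughly capped by whichever part is slower:
#   observed_fps ~= min(cpu_limited_fps, gpu_limited_fps)
# Every number here is hypothetical, chosen only to illustrate the logic.

def observed_fps(cpu_limited_fps: float, gpu_limited_fps: float) -> float:
    """FPS the system actually shows when one component is the limit."""
    return min(cpu_limited_fps, gpu_limited_fps)

cpu_a = 160.0  # hypothetical: CPU A could push 160 fps with an unlimited GPU
cpu_b = 190.0  # hypothetical: CPU B could push 190 fps with an unlimited GPU
gpu   = 120.0  # hypothetical: this GPU tops out at 120 fps at these settings

print(observed_fps(cpu_a, gpu))  # 120.0
print(observed_fps(cpu_b, gpu))  # 120.0, identical on screen, so a GPU-bound
                                 # demo cannot separate the two CPUs

Two quite different CPUs produce the same number once the GPU is the limit, which is why a GPU-bound demo tells you little about the CPU either way.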


Wow, humbug is such an AMD fanboi....
 
I repeat what I said previously...
I'd like to see 3800X + Navi versus 9900K + 2080Ti.
I think the value proposition is in showing that an all-AMD system is close enough to the best you can get. The 3800X accentuates the performance of Navi, whilst the 9900K is holding back the 2080 Ti. The performance differential would be at its lowest.
 
3800X and 9900K at 1080p, or even, yes, 720p on a 2080 Ti for me; that's when we get a look at true performance.

I do think the 3800X will be similar in performance to the 9900K in that test.
 


https://adoredtv.com/radeon-rx-5700-xt-slides-leak-performance-and-specs-revealed/
 
The reason reviewers use low resolutions is to get at the true performance of the CPU in games; if the GPU is the bottleneck then it's a GPU review, not a CPU review.

The only thing I don't agree with there is when they also turn all the settings as low as they go. Doing that turns off a lot of the work handled by the CPU, which in modern titles is very heavily threaded, so it isn't representative of the CPU's performance either. Lower-threaded CPUs then don't get stressed as they would with all the IQ settings on, so a 4-core looks as good as an 8-core, when in fact with the game fully utilising the CPU all four cores get loaded up and it's much slower.

1080p or 720p with ultra settings is the way to go; a 2080 Ti has more than enough grunt to make the CPU the bottleneck at those resolutions with ultra settings.
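To put numbers on both points (resolution and settings), here is a minimal Python sketch. Every figure is invented for illustration: frame time is taken as the larger of the CPU and GPU frame times, lowering the resolution mainly shrinks the GPU side, and lowering the quality settings also strips out threaded CPU work, which is what hides the gap between core counts.

Code:
# Toy frame-time model for the review-methodology argument above.
# Assumptions, all invented for illustration (not measured data):
#   - CPU frame time = serial work + threaded work / cores (Amdahl-style)
#   - GPU frame time scales with pixel count (resolution)
#   - the slower of the two sets the frame rate

def cpu_ms(serial_ms: float, threaded_ms: float, cores: int) -> float:
    return serial_ms + threaded_ms / cores

def gpu_ms(ms_per_mpixel: float, megapixels: float) -> float:
    return ms_per_mpixel * megapixels

def fps(cpu_t: float, gpu_t: float) -> float:
    return 1000.0 / max(cpu_t, gpu_t)  # the slower component sets the pace

RES = {"4K": 8.29, "1440p": 3.69, "1080p": 2.07, "720p": 0.92}  # megapixels

for settings, threaded_ms in (("ultra, heavy CPU work", 40.0),
                              ("low, CPU work stripped", 8.0)):
    for name, mpix in RES.items():
        gpu_t = gpu_ms(3.0, mpix)  # hypothetical 3 ms of GPU time per megapixel
        four = fps(cpu_ms(4.0, threaded_ms, 4), gpu_t)
        eight = fps(cpu_ms(4.0, threaded_ms, 8), gpu_t)
        print(f"{settings:24s} {name:6s} 4-core {four:6.1f} fps | 8-core {eight:6.1f} fps")

With these made-up numbers, both chips sit on the GPU limit at 4K; at 1080p and 720p with the heavy settings the 8-core pulls well clear; with the settings stripped down the gap shrinks sharply and at 1080p vanishes entirely, which is exactly the distortion described above.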
 
You say that, but AMD had the last generation of consoles too, with a Jaguar part as the CPU and custom graphics in both. Those old consoles are basically running tablet-class CPUs competing with Intel's Atom. The optimization work game developers have done has just been about getting games running on those flimsy cores, with really nothing in common with PC games.
 