
Poll: Ryzen 7950X3D, 7900X3D, 7800X3D

Will you be purchasing the 7800X3D on the 6th?


191 voters; poll closed.
Soldato
Joined
11 Sep 2009
Posts
6,172
Location
Limbo
ugh

Yeah, I mean 13900K > 13900KF due to the iGPU being present. Thought that was obvious.

So, I had a quick look at the HUB review, and the main issue I have with it is that they are running everything stock. No one should be doing this on the 13900xx. It basically made the whole video irrelevant: zero tuning involved. If you want a CPU you can slap in and call it the best, then don't even bother building; go with a prebuilt PC where all the work has been done for you.

I do believe 13900K CPUs have a range at which they are binned, going by the Igor's Lab CPU tray review he did. But it's more of a lottery; the KS is essentially buying a higher probability, but we don't have enough reviewed chips to see what its average bin is.

There isn't enough information at present on the KS, and no real overclocking reviews, so I can't say for sure what performance is like.

I didn't post any benchmarks... I find it hard to rely on one review as I don't know who to trust. If you read lots of reviews and also forum posts you get a better picture, but it's time-consuming. Like I said, there don't seem to be any decent reviews of the KS out yet.
 
Caporegime
Joined
17 Mar 2012
Posts
48,137
Location
ARC-L1, Stanton System
Depends what kind of workloads he does, and whether he minds the power draw. I do agree the 7950X3D will probably be slightly slower than the 7950X in multi-threaded workloads due to the cached CCD boosting lower.

Depends on the workload. If you're doing a lot of tile rendering, the 13900K or the 7950X will be better; if you're doing a lot of video editing, you should really be using the GPU, especially if it has AV1 encode/decode, which they all do now.... If you're packing and unpacking archives, the X3D CPUs blow everything clean out of the water.
 
Caporegime
Joined
17 Mar 2012
Posts
48,137
Location
ARC-L1, Stanton System
ugh

Yeah, I mean 13900K > 13900KF due to the iGPU being present. Thought that was obvious.

So, I had a quick look at the HUB review, and the main issue I have with it is that they are running everything stock. No one should be doing this on the 13900xx. It basically made the whole video irrelevant: zero tuning involved. If you want a CPU you can slap in and call it the best, then don't even bother building; go with a prebuilt PC where all the work has been done for you.

I do believe 13900K CPUs have a range at which they are binned, going by the Igor's Lab CPU tray review he did. But it's more of a lottery; the KS is essentially buying a higher probability, but we don't have enough reviewed chips to see what its average bin is.

There isn't enough information at present on the KS, and no real overclocking reviews, so I can't say for sure what performance is like.

I didn't post any benchmarks... I find it hard to rely on one review as I don't know who to trust. If you read lots of reviews and also forum posts you get a better picture, but it's time-consuming. Like I said, there don't seem to be any decent reviews of the KS out yet.

That's true for any and every CPU; I'm running mine with 20% less power at 5 to 10% higher performance vs stock. That argument only works if you ignore the fact that every CPU can be tuned.
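
A quick back-of-envelope check of those numbers (a sketch only; the 20% and 5-10% figures are the claim above, nothing measured here):

```python
# Perf-per-watt implied by "20% less power at 5 to 10% higher performance".
# Inputs are the claim from the post above, not measurements.
for perf_gain in (0.05, 0.10):
    perf_per_watt = (1.0 + perf_gain) / (1.0 - 0.20)
    print(f"+{perf_gain:.0%} perf at -20% power -> "
          f"{perf_per_watt:.2f}x perf/watt vs stock")
# -> roughly 1.31x to 1.38x efficiency, which is the scale of what a
#    stock-only review leaves on the table if the claim holds.
```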
 
Soldato
Joined
11 Sep 2009
Posts
6,172
Location
Limbo
Depends on the workload. If you're doing a lot of tile rendering, the 13900K or the 7950X will be better; if you're doing a lot of video editing, you should really be using the GPU, especially if it has AV1 encode/decode, which they all do now.... If you're packing and unpacking archives, the X3D CPUs blow everything clean out of the water.
I agree, but like uscool said, the X3D will be slower than the X part; it seems very obvious right now. The more threads an app needs, the better the X part will obviously do.

Anyway, I'm just waiting for the damn chips to be out so I can actually compare them; tired of speculation. I trust nothing that comes directly from AMD or Intel marketing themselves. It's pure marketing, and morals seem to be out of the window now; lying or manipulation seems to be the norm, and you can see it in the GPU sector too. I also suspect a huge amount of astroturfing going on in computer forums.
 
Caporegime
Joined
17 Mar 2012
Posts
48,137
Location
ARC-L1, Stanton System
I agree, but like uscool said, the X3D will be slower than the X part; it seems very obvious right now. The more threads an app needs, the better the X part will obviously do.

Anyway, I'm just waiting for the damn chips to be out so I can actually compare them; tired of speculation. I trust nothing that comes directly from AMD or Intel marketing themselves. It's pure marketing, and morals seem to be out of the window now; lying or manipulation seems to be the norm, and you can see it in the GPU sector too. I also suspect a huge amount of astroturfing going on in computer forums.

Edit: NVM my mistake...
 
Soldato
Joined
11 Sep 2009
Posts
6,172
Location
Limbo
You posted a slide that disagrees with the rest of the internet by quite some margin. Is it the whole of the internet that is lies and manipulation, with Computer Base De standing alone as the truth-sayers?
Please link me, maybe I've been smoking too much weed. I'm pretty sure I haven't posted a slide today :D
 
Soldato
Joined
18 Feb 2015
Posts
6,489
Computer Base De are the ultimate cherry pickers, always have been.

Pick any other reviewer and they show vastly different results; Computer Base De are the same as UserBenchmark.
There is no cherry picking involved, only actual competent testing. That you see it that way simply betrays your AMD bias, but I'm not interested in cheerleading for corpos, only in what the products they sell actually give me (and at what cost).

That's like 9 games; I wouldn't call that overall. Is that both at stock?
I've seen that site before. Is that at 720p? Also, any 7700X results? Because I ain't interested in the 7950X, and I know on some games the dual CCD gives issues.
720p is the only way to properly test CPU performance because it's the easiest way to minimise a GPU bottleneck. I'm not going to pull up 7700X results, but rest assured they're only marginally better in the best-case scenario. Mind you, in most games the differences between Intel & AMD are minor, but outlier cases are where the difference is most stark, and imo those are the more important scenarios to consider; otherwise you could flip a coin and be equally well off with either.
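
To illustrate the logic (a toy model, not anyone's benchmark data): per-frame time is roughly the slower of the CPU and GPU, and only the GPU side scales with resolution.

```python
# Toy model: frame_time = max(cpu_ms, gpu_ms), with GPU cost scaled
# naively by pixel count. All numbers are invented for illustration.

def fps(cpu_ms: float, gpu_ms_at_4k: float, pixel_scale: float) -> float:
    """FPS when the frame waits on whichever of CPU or GPU is slower."""
    return 1000.0 / max(cpu_ms, gpu_ms_at_4k * pixel_scale)

cpu_fast, cpu_slow = 6.0, 9.0   # hypothetical per-frame CPU cost, ms
gpu_4k = 16.0                   # hypothetical per-frame GPU cost at 4K, ms

for label, scale in [("4K", 1.0), ("1440p", 0.44), ("720p", 0.11)]:
    print(f"{label:>6}: fast CPU {fps(cpu_fast, gpu_4k, scale):4.0f} fps, "
          f"slow CPU {fps(cpu_slow, gpu_4k, scale):4.0f} fps")
# At 4K both CPUs show ~62 fps (GPU-bound); at 720p the real gap appears.
```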

Wouldn't call that a spanking; 54-game average. Can't imagine the 7700X being much worse.
I don't use Hardware Unboxed/TechSpot numbers, because they're incapable of testing CPUs properly: bad settings, bad scenes, canned benchmarks, etc. If you look at real testing you can see how quickly Zen 4 bottlenecks the GPU and falls apart (see below, and keep in mind that while driving, and in this area, it's actually less punishing on the CPU; if it had a city-centre test it would be an even bigger slaughter). It's simply not good enough even for current GPUs, let alone future ones (assuming you want to keep your CPU for more than 2 years). I mean, just look at the 1% lows; we're talking about a >60% difference! If that's not a spanking then I don't know what is.

[image: d3L7Msv.jpg]
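
For reference, "1% lows" here are computed the usual way: average the slowest 1% of frames from a frame-time log. A minimal sketch with invented numbers:

```python
import statistics

# Invented log: 990 smooth frames at 8 ms plus 10 hitches at 25 ms.
frame_times_ms = [8.0] * 990 + [25.0] * 10

avg_fps = 1000.0 / statistics.mean(frame_times_ms)
worst_1pct = sorted(frame_times_ms, reverse=True)[: len(frame_times_ms) // 100]
low_1pct_fps = 1000.0 / statistics.mean(worst_1pct)

print(f"average: {avg_fps:.0f} fps, 1% lows: {low_1pct_fps:.0f} fps")
# -> average: 122 fps, 1% lows: 40 fps. Two CPUs with similar averages
#    can still be >60% apart on this metric, hence the figure quoted above.
```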

 
Soldato
Joined
18 Feb 2015
Posts
6,489
The great thing about CPU discussions is that it quickly separates those who are able to think and understand what a testing methodology is from those who can't. Saves a lot of headaches. :)

Great news @uscool, you can save SO MUCH MONEY! Look, all you need is an R3 3100; it has 90% of the performance of the 13900K. Plus it makes so much sense, because people play at 4K and not 720p.

[image: relative-performance-games-38410-2160.png]
 
Soldato
Joined
28 Oct 2009
Posts
5,383
Location
Earth
The 5800X3D is enough for my use case and is a drop-in upgrade; I don't need 13th gen or 7000X3D, nor to spend much more.

Bet I won't notice the difference unless I have my eyes glued to the FPS counter.
 
Soldato
Joined
28 Oct 2009
Posts
5,383
Location
Earth
There is no cherry picking involved, only actual competent testing. That you see it that way simply betrays your AMD bias, but I'm not interested in cheerleading for corpos, only in what the products they sell actually give me (and at what cost).


720p is the only way to properly test CPU performance because it's the easiest way to minimise a GPU bottleneck. I'm not going to pull up 7700X results, but rest assured they're only marginally better in the best-case scenario. Mind you, in most games the differences between Intel & AMD are minor, but outlier cases are where the difference is most stark, and imo those are the more important scenarios to consider; otherwise you could flip a coin and be equally well off with either.


I don't use Hardware Unboxed/TechSpot numbers, because they're incapable of testing CPUs properly: bad settings, bad scenes, canned benchmarks, etc. If you look at real testing you can see how quickly Zen 4 bottlenecks the GPU and falls apart (see below, and keep in mind that while driving, and in this area, it's actually less punishing on the CPU; if it had a city-centre test it would be an even bigger slaughter). It's simply not good enough even for current GPUs, let alone future ones (assuming you want to keep your CPU for more than 2 years). I mean, just look at the 1% lows; we're talking about a >60% difference! If that's not a spanking then I don't know what is.

[image: d3L7Msv.jpg]


AMD is also stock with Hardware Unboxed and can use PBO / Curve Optimizer, but I guess Intel out of the box is just not good. Intel should really work on improving out-of-the-box performance to show themselves in a better light?
 
Caporegime
Joined
17 Mar 2012
Posts
48,137
Location
ARC-L1, Stanton System
There is no cherry picking involved, only actual competent testing. That you see it that way simply betrays your AMD bias, but I'm not interested in cheerleading for corpos, only in what the products they sell actually give me (and at what cost).


720p is the only way to properly test CPU performance because it's the easiest way to minimise a GPU bottleneck. I'm not going to pull up 7700X results, but rest assured they're only marginally better in the best-case scenario. Mind you, in most games the differences between Intel & AMD are minor, but outlier cases are where the difference is most stark, and imo those are the more important scenarios to consider; otherwise you could flip a coin and be equally well off with either.


I don't use Hardware Unboxed/TechSpot numbers, because they're incapable of testing CPUs properly: bad settings, bad scenes, canned benchmarks, etc. If you look at real testing you can see how quickly Zen 4 bottlenecks the GPU and falls apart (see below, and keep in mind that while driving, and in this area, it's actually less punishing on the CPU; if it had a city-centre test it would be an even bigger slaughter). It's simply not good enough even for current GPUs, let alone future ones (assuming you want to keep your CPU for more than 2 years). I mean, just look at the 1% lows; we're talking about a >60% difference! If that's not a spanking then I don't know what is.

[image: d3L7Msv.jpg]


I have a few questions.

Why is the CPU utilisation on the 7700X much lower than on the 13700K, despite the latter having twice as many cores? It's like it's using two thirds less of the AMD CPU vs the Intel one.

Why does one of the cores on the 7700X appear to be locked to 4.4GHz? That's over 1GHz lower than it should be.

Why is this the only review that shows this much of a performance disparity between these CPUs? It's about 3X higher than the rest of the internet.
 
Associate
Joined
28 Sep 2018
Posts
2,291
AMD is also stock with Hardware Unboxed and can use PBO / Curve Optimizer, but I guess Intel out of the box is just not good. Intel should really work on improving out-of-the-box performance to show themselves in a better light?

Intel has more scaling and tuning options. There's more to tune but more to gain as well. As long as you have the knowledge and a repeatable process, you can squeeze the platform.

Example: my buddy today set up an OCTVB profile on his 13900K, where the clocks scale based on temps. So he's gained 400MHz of frequency over stock while running cooler. That's with a 360mm AIO btw, not some exotic cooling. And that's before the RAM tuning benefits, which vary game to game.

[image: image.png]
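
For anyone unfamiliar, OCTVB (overclocked Thermal Velocity Boost) ties extra boost bins to temperature thresholds. A rough sketch of the idea; the thresholds and bin values below are hypothetical, not the buddy's actual profile:

```python
BASE_BOOST_MHZ = 5800  # hypothetical stock boost, for illustration only

# (temperature ceiling in °C, offset in 100 MHz boost bins) - made-up values
TVB_TABLE = [
    (60, +4),    # under 60 °C: allow +400 MHz over stock
    (75, +2),    # 60-75 °C: +200 MHz
    (90, 0),     # 75-90 °C: stock boost
    (999, -2),   # over 90 °C: back off 200 MHz
]

def boost_for_temp(temp_c: float) -> int:
    """Allowed boost clock for the current package temperature."""
    for ceiling, bins in TVB_TABLE:
        if temp_c < ceiling:
            return BASE_BOOST_MHZ + bins * 100
    return BASE_BOOST_MHZ

for t in (55, 70, 85, 95):
    print(f"{t} °C -> {boost_for_temp(t)} MHz")
# A game that keeps the chip cool lands in the top bin, which is how you
# can end up "400 MHz over stock while running cooler".
```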
 
Caporegime
Joined
17 Mar 2012
Posts
48,137
Location
ARC-L1, Stanton System
Intel has more scaling and tuning options. There's more to tune but more to gain as well. As long as you have the knowledge and a repeatable process, you can squeeze the platform.

Example: my buddy today set up an OCTVB profile on his 13900K, where the clocks scale based on temps. So he's gained 400MHz of frequency over stock while running cooler. That's with a 360mm AIO btw, not some exotic cooling. And that's before the RAM tuning benefits, which vary game to game.

[image: image.png]

No.... I did that with my 5800X.
 
Soldato
Joined
18 Feb 2015
Posts
6,489
I have a few questions.

Why is the CPU utilisation on the 7700X much lower than on the 13700K, despite the latter having twice as many cores? It's like it's using two thirds less of the AMD CPU vs the Intel one.
Because the architecture of Zen is less well suited to gaming than Intel's (and if we're being totally honest, it's clear we're getting server-first designs rather than something specifically made for the desktop market; even the V-Cache stuff is only accidentally a gaming CPU, because they made it for database workloads and only later brought it to desktop), and that's besides the minor advantage Intel CPUs enjoy in CP2077 (though tbf, when AMD was asked about it they said everything's running as intended, so..). To a certain extent this can be mitigated by V-Cache: if you look at the 5800X3D vs ADL, it actually made up a lot of ground compared to the 5800X and was more competitive.

[image: 3k2GhMb.jpg]
Why does one of the cores on the 7700X appear to be locked to 4.4GHz? That's over 1GHz lower than it should be.
It isn't, that's normal load balancing. Watch the video, you can see the usual fluctuations in clocks.
Why is this the only review that shows this much of a performance disparity between these CPUs? It's about 3X higher than the rest of the internet.
It's not the only review, and as I've explained before, the results are dependent on the testing methodology. Most reviews test either the canned benchmark or a scene that isn't CPU-demanding. Any time you see the CPUs actually get stressed, you can see a very clear separation between them.
Here, for example (https://youtu.be/U1CuB-YGa9c?t=343), you can see how quickly the FPS goes from 70 to 100+ depending on the NPCs nearby, while at all times the GPU remains underutilised. And it works the same with AMD GPUs: even though they lack the RT grunt, they also quickly run into a CPU wall (https://youtu.be/c_QUlUZNAH4?t=489), because it can really get that demanding.
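
One crude way to see the same thing in your own logs (a sketch; the per-second samples below are invented): a CPU wall shows up as moments where FPS drops while GPU utilisation sits well below 100%.

```python
# Flag CPU-limited stretches from (fps, gpu_util %) samples, e.g. from a
# RivaTuner-style overlay log. The numbers below are made up.
samples = [
    (102, 97), (100, 98),           # GPU pegged: GPU-bound
    (74, 62), (70, 58), (78, 65),   # NPC-heavy area: FPS drops, GPU idles
    (101, 96),
]

cpu_bound = [(fps, util) for fps, util in samples if util < 90]
print(f"{len(cpu_bound)}/{len(samples)} samples look CPU-limited:")
for fps, util in cpu_bound:
    print(f"  {fps} fps at only {util}% GPU utilisation")
```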
 
Caporegime
Joined
17 Mar 2012
Posts
48,137
Location
ARC-L1, Stanton System
Because the architecture of Zen is less well suited to gaming than Intel's (and if we're being totally honest, it's clear we're getting server-first designs rather than something specifically made for the desktop market; even the V-Cache stuff is only accidentally a gaming CPU, because they made it for database workloads and only later brought it to desktop), and that's besides the minor advantage Intel CPUs enjoy in CP2077 (though tbf, when AMD was asked about it they said everything's running as intended, so..). To a certain extent this can be mitigated by V-Cache: if you look at the 5800X3D vs ADL, it actually made up a lot of ground compared to the 5800X and was more competitive.

[image: 3k2GhMb.jpg]

It isn't, that's normal load balancing. Watch the video, you can see the usual fluctuations in clocks.

It's not the only review, and as I've explained before, the results are dependent on the testing methodology. Most reviews test either the canned benchmark or a scene that isn't CPU-demanding. Any time you see the CPUs actually get stressed, you can see a very clear separation between them.
Here, for example (https://youtu.be/U1CuB-YGa9c?t=343), you can see how quickly the FPS goes from 70 to 100+ depending on the NPCs nearby, while at all times the GPU remains underutilised. And it works the same with AMD GPUs: even though they lack the RT grunt, they also quickly run into a CPU wall (https://youtu.be/c_QUlUZNAH4?t=489), because it can really get that demanding.

Yeah, thought so... You REALLY do believe in these charts from Computer Base De. :)

Other than stating the obvious that is true for all CPUs (more load = less performance, yeah, no ####), you also make a lot of blanket statements, ones that anyone can make. Yours here being "because AMD is not as good as Intel"; well, that sort of answer is frankly expected, but I was hoping for more.

You haven't really answered the fundamental question. If I employ 100 people to run tests for me and 95 of the 100 all agree with each other, with the remaining few way off the rest, my thinking would be that those few got something wrong, and I wouldn't use their results. It seems your superstition is that everyone else is wrong and only they are doing it right, and it very much looks like you take this position because it agrees with an idea you already have in your head; to you it just proves it. It's the first thing you said to me here, which doesn't help the way this looks: an apparent confirmation bias, Intel better than AMD. It looks a bit nuts. BTW, I just do not care which, if any, you prefer. Look at me, I'm apparently an enigma: for the last decade all of the GPUs that I have bought and kept are Nvidia. Some people just can't get their head around that, weirdly :D
Is that why you bandy pretty much exclusively Computer Base De around? THAT is what Computer Base De understand there is a market for: it's a form of "look at me, I'm different, click me". It's not anything so romantic as them being the underrated and wholly unmatched genius that sets them apart from everyone else. It's simply contrived; it's fake.

I looked at multiple RDNA3 results and from them concluded it is what it is, which is fundamentally, notably less than AMD said it would be; as a result, I don't trust AMD's slides. That is rational.

BTW, they are all first and foremost datacentre CPUs. Or maybe your thinking is that this is why AMD's CPUs are so much better in the datacentre: because Intel design CPUs for your games and AMD don't? No..... you really don't matter to Intel, in the same way you don't matter to AMD; no one is doing anything special here just for you. They are just CPUs, and they haven't actually changed that much in decades. That's fine, because it's the games that are designed for the CPUs, not the CPUs designed for the games.

Edited: it's nice to have more time to reply; formulated to be less unintentionally provocative.
 