Poll: Ryzen 7950X3D, 7900X3D, 7800X3D

Will you be purchasing the 7800X3D on the 6th?


  • Total voters: 191
  • Poll closed.
4K + DLSS Q > 4K native in plenty of games.
Right, but you need to be able to push well over 100 FPS, and the game needs RT that is very heavy on the CPU. Currently only Spider-Man and Cyberpunk fall into this category, which makes it very niche.
 
Last edited:
Right, but you need to be able to push well over 100 FPS, and the game needs RT that is very heavy on the CPU. Currently only Spider-Man and Cyberpunk fall into this category, which makes it very niche.


Yeah true. Also I think very heavy multiplayer games like Warzone 2 and some Battlefield titles can possibly benefit from more than 8 cores, but only when the number of players in the session is very large.

They are very niche cases, and even with only 8 cores those cases are still easily playable with a good experience. It's just that it can be better with more than 8 cores; it's by no means bad, and 8 cores is actually still plenty for just about any game, even the niche cases, unless you are beyond sensitive.
 
Last edited:
Yeah true. Also I think very heavy multiplayer games like Warzone 2 and some Battlefield titles can possibly benefit from more than 8 cores, but only when the number of players in the session is very large.

They are very niche cases, and even with only 8 cores those cases are still easily playable with a good experience. It's just that it can be better with more than 8 cores, but with only 8 cores it's by no means bad, or even anything less than good.
I've tested Warzone extensively. It's a myth that it benefits from more than 8 cores. Can it use more than 8? Yes, but does it really benefit? No; it works best with a render worker count of 6 or 7 (cores). That's using the fastest GPU you can get for Warzone, the 7900 XTX. The 4090 is slower than the XTX in Warzone, so it's not a good comparison, but I tested the 4090 anyway.

x2 CCD, SMT Off, Render 6: 434 / 281
x2 CCD, SMT Off, Render 7: 424 / 271
x2 CCD, SMT On,  Render 6: 443 / 273
x2 CCD, SMT On,  Render 7: 431 / 280
x1 CCD, SMT Off, Render 6: 416 / 267
x1 CCD, SMT Off, Render 7: 392 / 251
x1 CCD, SMT On,  Render 6: 400 / 265
x1 CCD, SMT On,  Render 7: 409 / 262


I tested MW2 using 15/16 CPU threads too, but the scores were lower so I didn't include them. In hindsight I should have kept them, but at the time I was purely testing for myself to find the two fastest configs for CPU render worker threads.

8 cores is still the sweet spot for games in general and will be for some time yet, as the consoles have 8 cores. Consoles are king in game development and most games are built primarily with them in mind.
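
If anyone wants to rank these numbers themselves, here's a throwaway Python sketch of the data above. I'm assuming the first figure is the average FPS and the second the 1% low; check the screenshots if in doubt.

```python
# Rank the MW2 configs posted above. Assumption: first figure is
# average FPS, second is the 1% low, as per the benchmark screenshots.
results = {
    ("x2 CCD", "SMT Off", 6): (434, 281),
    ("x2 CCD", "SMT Off", 7): (424, 271),
    ("x2 CCD", "SMT On", 6): (443, 273),
    ("x2 CCD", "SMT On", 7): (431, 280),
    ("x1 CCD", "SMT Off", 6): (416, 267),
    ("x1 CCD", "SMT Off", 7): (392, 251),
    ("x1 CCD", "SMT On", 6): (400, 265),
    ("x1 CCD", "SMT On", 7): (409, 262),
}

# Sort by average FPS, with the 1% low as a tiebreaker, best on top.
for (ccd, smt, render), (avg, low) in sorted(
        results.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{ccd} {smt} Render {render}: avg {avg}, 1% low {low}")
```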
 
Last edited:
No surprise it's from ASRock. I'm actually surprised it doesn't have both DDR4 and DDR5 banks; they have a history of doing crazy stuff like that.

I think the memory controller is part of the CPU, so even in that situation it wasn’t designed to work with DDR4 from what I recall.
 
I've tested Warzone extensively. It's a myth that it benefits from more than 8 cores. Can it use more than 8? Yes, but does it really benefit? No; it works best with a render worker count of 6 or 7 (cores). That's using the fastest GPU you can get for Warzone, the 7900 XTX. The 4090 is slower than the XTX in Warzone, so it's not a good comparison, but I tested the 4090 anyway.

Here's some 7950X3D data I gathered running MW2. This applies to Warzone 2 as well.
MW2 Benchmark 4K Basic Preset
x2 CCD = 16 cores enabled
x1 CCD = 8 cores enabled
Render = the RenderWorkerThread count used for MW2 (6 = 6 CPU worker threads, etc.)
7950X3D + PBO + Curve Optimiser
4090 + Rebar + 180 Core +1699 Mem
View attachment 2602991

x2 CCD, SMT Off, Render 6: 434 / 281
x2 CCD, SMT Off, Render 7: 424 / 271
x2 CCD, SMT On,  Render 6: 443 / 273
x2 CCD, SMT On,  Render 7: 431 / 280
x1 CCD, SMT Off, Render 6: 416 / 267
x1 CCD, SMT Off, Render 7: 392 / 251
x1 CCD, SMT On,  Render 6: 400 / 265
x1 CCD, SMT On,  Render 7: 409 / 262


I tested MW2 using 15/16 CPU threads too, but the scores were lower so I didn't include them. In hindsight I should have kept them, but at the time I was purely testing for myself to find the two fastest configs for CPU render worker threads.

8 cores is still the sweet spot for games in general and will be for some time yet, as the consoles have 8 cores. Consoles are king in game development and most games are built primarily with them in mind.
The testing is flawed because you are using cores from a different CCD. You are not specifically testing the effect of core count here, but of CCD latency.
 
Totally legitimate, as you say. The number of PCMR types who were banging on about PS5 checkerboarding not being true 4K, yet run fake 4K themselves when they use DLSS, is amusing.
Yes, but when console peasants were doing this it was bad.
When PCMR "I spent more therefore I am worth it!" types do this it is the best thing ever.

Keep thinking that someone is driving this narrative. Suddenly everyone wants fake generated frames too.

I think the memory controller is part of the CPU, so even in that situation it wasn’t designed to work with DDR4 from what I recall.
Since AMD are using chiplets and the memory controller is on a separate die, it would have been relatively easy for AMD to offer Zen 4 CCDs with a DDR4 I/O controller.

How that would have performed vs Zen 4 with DDR5 is unknown. And since they would have had to segment AM5 into AM5-DDR5 and AM5-DDR4 boards, longer term it could have been a bad thing. By segmenting the boards they probably wouldn't have gained the "halo" effect Intel have, where people see the 13900K series with DDR5 at the top of the charts but buy a 13600K with DDR4.
 
The testing is flawed because you are using cores from a different CCD. You are not specifically testing the effect of core count here, but of CCD latency.
Wrong. I'm not specifying any specific cores to be used; that's all done automatically. All I am doing is enabling/disabling a CCD (8 cores vs 16), toggling SMT (on/off), or adjusting the CPU worker thread count (6/7, the two fastest configs) using the MW2 config file.

The fastest overall configuration, with a CPU worker thread count of 6, is both CCDs enabled with SMT off. The 1% lows go up slightly with the same config but SMT on; however, the average FPS is slightly lower.

In case it's not obvious, I'm referring to the CPU FPS data from the screenshots above.
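
If you'd rather script the worker-count change than edit by hand, something like this does it. Heads up: the file and key names vary between COD releases (the ones below match MW2019's adv_options.ini as far as I recall), so treat both as assumptions and check your own Documents\Call of Duty folder first.

```python
# Sketch: set the render worker count in a COD config file.
# NOTE: the file path and key name are assumptions (MW2019-era naming);
# verify against your own install before running.
import re
from pathlib import Path

CONFIG = (Path.home() / "Documents" / "Call of Duty Modern Warfare"
          / "players" / "adv_options.ini")
KEY = "RendererWorkerCount"   # assumed key name - check your file
COUNT = 6                     # 6 or 7 were the fastest in my testing

text = CONFIG.read_text()
# Replace just the value, keeping any trailing comment on the line.
new_text, hits = re.subn(rf"^({KEY}\s*=\s*)\S+", rf"\g<1>{COUNT}",
                         text, flags=re.M)
if hits == 0:
    raise SystemExit(f"{KEY} not found in {CONFIG}")
CONFIG.write_text(new_text)
print(f"Set {KEY} = {COUNT}")
```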
 
Last edited:
You don't need to do anything yourself (other than installing chipset drivers, updating Game Bar and not disabling Game Bar/Game Mode); it's all handled automatically.

Most games will use the CCD with the extra cache, as it's faster, but there are a few exceptions where certain games prefer higher clock frequency over cache. In those cases the game will use the second CCD for better performance.

If a game requests the use of more than 8 cores (see Spider-Man as an example), it'll use both CCDs.

Whatever you heard is wrong, most likely parroted by someone who has never used the processor.
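
If you're curious which CCD suits a particular game, you can also pin a running game to one CCD yourself for an A/B run. A rough Python sketch (needs psutil and admin rights; assumes the V-Cache CCD is CCD0 and enumerates as logical CPUs 0-15 with SMT on, which is worth verifying on your own system), with a hypothetical process name:

```python
# Rough A/B helper: pin a running game to the cache CCD only.
# Assumptions: psutil installed, run as admin, and the V-Cache CCD
# (CCD0 on a 7950X3D) shows up as logical CPUs 0-15 with SMT on.
import psutil

GAME_EXE = "cod.exe"          # hypothetical name; use your game's exe
CACHE_CCD = list(range(16))   # logical CPUs 0-15 = CCD0 with SMT on

for proc in psutil.process_iter(["name"]):
    if (proc.info["name"] or "").lower() == GAME_EXE:
        proc.cpu_affinity(CACHE_CCD)  # restrict scheduling to CCD0
        print(f"Pinned PID {proc.pid} to CPUs {CACHE_CCD}")
```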
I think certain people will defend AMD no matter what. Spider-Man and Cyberpunk, that's literally it. Even the list AMD handed out to influencers just to get it to work was ridiculous.

Nobody wants to admit that it's better to turn them off, because in the majority of games it's going to cause more harm than good. And people can slate certain YouTubers, but the truth is, in amongst all the character bull **** they do to make this subject a bit more exciting for views, there is actual truth.
 
Yes, but when console peasants were doing this it was bad.
When PCMR "I spent more therefore I am worth it!" types do this it is the best thing ever.

Keep thinking that someone is driving this narrative. Suddenly everyone wants fake generated frames too.


Since AMD are using chiplets and the memory controller is on a separate die, it would have been relatively easy for AMD to offer Zen 4 CCDs with a DDR4 I/O controller.

How that would have performed vs Zen 4 with DDR5 is unknown. And since they would have had to segment AM5 into AM5-DDR5 and AM5-DDR4 boards, longer term it could have been a bad thing. By segmenting the boards they probably wouldn't have gained the "halo" effect Intel have, where people see the 13900K series with DDR5 at the top of the charts but buy a 13600K with DDR4.

Yep, nobody likes to admit they are spending 4x as much on one component and at times getting the same experience as, or worse than, a £450 console :D
 
I think certain people will defend AMD no matter what. Spider-Man and Cyberpunk, that's literally it. Even the list AMD handed out to influencers just to get it to work was ridiculous.

Nobody wants to admit that it's better to turn them off, because in the majority of games it's going to cause more harm than good. And people can slate certain YouTubers, but the truth is, in amongst all the character bull **** they do to make this subject a bit more exciting for views, there is actual truth.
You are mistaken. The list was purely for troubleshooting if you got lower-than-expected performance, i.e. if something was not right and you had to check things were installed correctly, e.g. chipset drivers. Why TPU posted that I'm not sure...

The actual list (once you leave out all the BS) was simply: install chipset drivers, update Game Bar (via the Windows Store) and don't disable Game Bar/Game Mode. That's it. That's all there is to it.
 
Last edited:
Wrong. I'm not specifying any specific cores to be used; that's all done automatically. All I am doing is enabling/disabling a CCD (8 cores vs 16), toggling SMT (on/off), or adjusting the CPU worker thread count (6/7, the two fastest configs) using the MW2 config file.

The fastest overall configuration, with a CPU worker thread count of 6, is both CCDs enabled with SMT off. The 1% lows go up slightly with the same config but SMT on; however, the average FPS is slightly lower.

In case it's not obvious, I'm referring to the CPU FPS data from the screenshots above.
How is it wrong? If the game can use more than 8 cores, it won't properly show on the 7950X, because the penalty of the CCD latency might be bigger than the gains from the extra cores. That's not the case with Intel CPUs, which is why it's easier to test there whether a game requires more than 8 cores.
 
How is it wrong? If the game can use more than 8 cores, it won't properly show on the 7950X, because the penalty of the CCD latency might be bigger than the gains from the extra cores. That's not the case with Intel CPUs, which is why it's easier to test there whether a game requires more than 8 cores.
The game can use as many cores as you allow it to. By default it uses one below your max physical core count, so 15 on a 16-core CPU, but it performs best (highest average FPS / best 1% lows) with a render worker count of 6 or 7. It's the same for Intel too; it's best to set it to use half or fewer of your physical CPU cores. Do some research on YT/online if you doubt what I am saying, but it's well known amongst COD bros. I can put up some numbers of it running on 15 CPU worker threads; it still runs fine, just not as fast as limiting it to 6/7. It was the same for my 5800X3D (only 1 CCD ;) ) and my 5950X/7950X.
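
To spell out the arithmetic, a quick sketch (assuming psutil is installed):

```python
# The rule of thumb above in code: COD defaults to one below your
# physical core count; half (or fewer) tends to benchmark faster.
import psutil

physical = psutil.cpu_count(logical=False)  # e.g. 16 on a 7950X/7950X3D
game_default = physical - 1                 # what the game picks, e.g. 15
suggested = max(physical // 2, 1)           # "half or fewer", e.g. 8
print(f"game default: {game_default}, try: {suggested} (6/7 fastest for me)")
```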
 
Last edited:
You are mistaken. The list was purely for troubleshooting if you got lower-than-expected performance, i.e. if something was not right and you had to check things were installed correctly, e.g. chipset drivers. Why TPU posted that I'm not sure...

The actual list (once you leave out all the BS) was simply: install chipset drivers, update Game Bar (via the Windows Store) and don't disable Game Bar/Game Mode. That's it. That's all there is to it.
That's not what this guy says, and I know you rate him highly. :D

 
Last edited by a moderator:
That's not what this guy says, and I know you rate him highly. :D

You are going to have to summarise it for me or pay me to watch his spiel, bud. :cry:

EDIT - I'll take a guess: 'no details on settings or games, no FPS numbers, just take my word for it, Intel good, AMD bad'.

Was I close? :p
 
Last edited:
The game can use as many cores as you allow it to. By default it uses one below your max physical core count, so 15 on a 16-core CPU, but it performs best (highest average FPS / best 1% lows) with a render worker count of 6 or 7. It's the same for Intel too; it's best to set it to use half or fewer of your physical CPU cores. Do some research on YT/online if you doubt what I am saying, but it's well known amongst COD bros. I can put up some numbers of it running on 15 CPU worker threads; it still runs fine, just not as fast as limiting it to 6/7. It was the same for my 5800X3D (only 1 CCD ;) ) and my 5950X/7950X.
Render count has nothing to do with how many cores the game is using. I know for a fact that with a render count of 6 it uses all 16 cores on my 12900K.
 
Render count has nothing to do with how many cores the game is using. I know for a fact that with a render count of 6 it uses all 16 cores on my 12900K.
Fair enough, I can't speak from an Intel perspective, but that goes against everything else online.

From an AMD perspective, which I can speak to, the game will spread the workload out over however many cores you set the render count to. This can be verified using Task Manager and selecting the logical cores view.
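
If you'd rather log it than eyeball Task Manager, a quick psutil sketch does the same check:

```python
# Alternative to Task Manager's logical-cores view: sample per-core
# load for ten seconds while the game runs and print the busy cores.
import psutil

for _ in range(10):
    per_core = psutil.cpu_percent(interval=1, percpu=True)  # one 1s sample
    busy = [i for i, pct in enumerate(per_core) if pct > 25.0]
    print(f"logical cores above 25% load: {busy}")
```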
 