
Poll: Ryzen 7950X3D, 7900X3D, 7800X3D

Will you be purchasing the 7800X3D on the 6th?


  • Total voters
    191
  • Poll closed.
The Arctic LFII is probably the best cooler going. I've fitted several to my kids' PCs. You often find the mountings aren't quite right (if the cooler was bought before the revised mounting came out, as with the 7000 series), but if you contact Arctic they will send you any mounting parts you don't have free of charge.
I bought mine earlier in the year so it should fit ok
 
Any reason I shouldn't install a 7950X3D in a Gigabyte B650 Aorus Elite AX?


I reckon it's going to need some significant cooling: the 7000 series has needed extra cooling so far, and the cache being stacked won't help.

I'm hopeful for all these next gen air coolers coming out this year. I really don't want to buy Noctua's overpriced dinosaur.

Can't see much info at the moment other than a few like the Deepcool Assassin 4. Supposedly coming out in April.
 
So disappointed in AMD.

I tend to upgrade every year (3950X to 5950X, etc.) and fully intended on going to the 7950X. My boys or my home server get the hand-me-down parts.

With the temps, power consumption and the absolutely ridiculous limitations and temperamental DDR5 speeds, I skipped this time, hoping the 7950X3D would be my next jump.

I'll hold out hope for next gen, though I may end up going back to Intel, as the decision isn't as simple any more.

If someone was looking at upgrading from a much older CPU, I'd probably direct them to a used 5xxx-series chip, or a good retail deal on one, as I can see new motherboard chipsets being needed anyway if you really want to run DDR5 at high speeds and large capacities next gen.
 

This is why the 170 W TDP was utterly idiotic when 105 W gets you 97% of the performance.
 
Do you think the performance difference between my 5800X3D and a 7800X3D will be so great that it will be worth the upgrade?

I have 32 GB CL14 memory and a 7900 XTX.
 
It's such a minefield knowing a genuine review from one that isn't.

The same reliable reviewers get posted in a massive thread every single time there's a hardware launch.

If a review appears before the official date, there are major questions about its quality: everyone with a history, and a future, in reviewing sticks to the manufacturer's NDA. Break it and you get listed as untrustworthy and never get review parts, answers to questions, or the pre-launch support that keeps you fully informed about the product and any last-minute updates. In some cases, no one but the approved reviewers even has the correct drivers at that point.
 


It's less than that: in my own testing, using the 105 W TDP power profile in the BIOS resulted in a 15% performance loss in Cinebench R23 multithread.

Maybe it's just a 3% loss in gaming?

The gaming performance difference is negligible; it's clear the 170 W TDP profile was used to push multithread performance as hard as possible. That was clearly a target for AMD, and it's also unusual for them. I believe it's because of Intel: AMD knew Intel was going to put up big numbers, and Zen 4 doesn't have the core count to compete, so it had to push power.

It's possible AMD introduces new 16-core CCDs with Zen 5 and then goes back to a 105 W or 120 W TDP. A high TDP isn't needed to push high single-core clocks for gaming, so if Zen 5 offers up to 32 cores, AMD won't need to push power.
 
4K? Probably not. 1080p? Yeah.

I don't think that many PC gamers play at 4K; most are at 1440p. I guess it's something we can now consider if you have a 4090.

Even playing Fortnite on my XTX only gets me around 150 fps with medium settings and Nanite turned on.
 

I've tested my 4090 Strix with a 7600, a 7700 and a 13900K. In modern games at ultra settings (4K), especially those with RT, I saw noticeable gains with each CPU, the 13900K being fastest, followed by the 7700, then the 7600.

There are many active posters in this thread who have not tested a 4090 (or similarly powerful GPU) on any of these CPUs, but who will see fit to judge and reach the incorrect conclusion that they all perform the same. Just be wary and do your own research.

I'm looking forward to testing a 7950X3D/7800X3D to see how they compete; they should be the fastest by far and also the most efficient.
 
I don't think anyone other than very competitive FPS players should be buying a 4090 for anything less than 4K. We are at a stage now where there are cheaper options better suited to lower resolutions. Monitors are more affordable these days, so if you have 4090 money…
 

Yeah, you're closer than my initial claim.

According to this, it's 90% of the performance at 105 W in R23, and 94% in Blender. It's still faster than the 13900K in Blender, now at half the power. So in Blender at least, it's 94% of the performance at 60% of the power consumption, 105 W Eco vs stock.

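The eco-mode trade-off above is easy to sanity-check. A minimal sketch, using the Blender figures quoted in the post (94% of stock performance at 60% of stock power; these are the thread's approximate numbers, not fresh measurements):

```python
def perf_per_watt_gain(perf_frac: float, power_frac: float) -> float:
    """Relative efficiency of a power-limited profile versus stock.

    perf_frac:  fraction of stock performance retained (0.94 = 94%)
    power_frac: fraction of stock power drawn (0.60 = 60%)
    """
    return perf_frac / power_frac

# Blender figures quoted above: 105 W Eco vs stock.
gain = perf_per_watt_gain(0.94, 0.60)
print(f"Eco mode delivers about {gain:.2f}x the performance per watt")
```

At roughly a 1.57x efficiency gain for a 6% performance cost, it's easy to see why many consider the 170 W stock limit hard to justify.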
 
I think we are going to see “more efficient” not “fastest by far”.

I honestly think you are doing this on purpose so that you can say “see I bought AMD and look what happened”

I’m onto you Dave ;)
 

The power-limit situation and the whole DDR5 mess really hack me off, as I want to upgrade, especially since I have a 4090.

This gen feels like the worst one in a while, though I suppose I shouldn't expect anything less with a socket change.
 