AMD Zen 5 rumours

Regardless of what it is, wait for X3D.

AMD has proved twice now that there's no reason to get the non-X3D processors; they have no real advantage in real-life applications.
There are many reasons why the non-3D chips make sense. I myself wanted an efficient and cool chip, primarily for C++ development and a bit of gaming on the side. The 7900 (non-X) is the best chip for the money for that workload (AMD anyway).
 
Where are these areas that figuratively destroy an 8-core CPU?
If you were truly playing at 4K max RT you would probably be looking at fps in the 20s on a 4090. More like 13 fps on a 4080. :p

With a lot of upscaling and lower settings I can see big differences in CPU performance between a 5800X3D and a 7800X3D, which are both 8 cores. If there's a decent uplift in per-core performance in Zen 5 then 8 cores will be fine for gaming.

I'm between 80-100 FPS in Cyberpunk @ 4K, RT max, obviously with DLSS enabled. That's with a 4090 Strix, a 7950X3D and a 13900K. It runs better on my 13900K, though that's one of the very few games Intel wins at; in the rest AMD is superior.

As for areas, Tom's Diner is a good example: try walking around there with an 8-core CPU @ 4K, RT on, and it cripples it.
 
Define 'cripples'. On my lowly 4080, even DLSS Ultra Performance at 2160p RT Overdrive results in 90%+ GPU usage and not even 80 fps :p
 

Big spikes below 60 FPS, making it choppy and, to me, unplayable.

Good news is CPUs with more than 8 cores are cheap, so no reason to cripple yourself with one :)
 
Guys, thank you very much for your suggestions, everything has been ordered and will be delivered within a week.
Expect me to post asking for help on how to assemble everything, it's been 20 years since last time!
 
Did you deliberately cut the legend off so no one could see you quoting the 720p results? As you probably know, I was quoting 4K results, which are almost certainly more valid to the majority of users here trying to decide what to pair with their 4090 and whether to wait for the X3D variant.

I don't know of anywhere that properly tests CPU gaming performance across a variety of gaming workload types. GN does okay by including Stellaris benchmarks, but there are plenty of other titles that really hit the CPU.

Ideally a CPU gaming benchmark suite would include a few popular AAA titles, some kind of grand strategy, some popular ARPG, some MMO, a turn-based game, a simulation title like Flight Sim or ACC, some kind of city builder, and a few open-world titles sprinkled here and there.

This 'CPU does not matter at 4K' guff is only really valid for the latest AAA titles, which is fine, but parroting it everywhere is not valid unless you qualify your statement with 'for the latest AAA games', which you and many others don't bother to.
 
Did you have a look at the TechPowerUp review? While not a large sample, it did seem to include a fair variety of game types and ages. Considering I was of course quoting an average, the average user would need to play quite a few of the types of games that would qualify your statement in order to impact the average score, unless of course there happened to be one or two titles with a massive difference that could throw the average off.

To qualify my statement, I never once said the CPU did not matter at 4K; please feel free to quote me if I'm wrong. I did say that the difference between the 7950X3D and 7950X was not high enough to really warrant waiting around for the 3D version, and that I considered the 4K results taken from the discussed review to be more relevant to the majority of users than the 720p results in this instance.
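On the outlier point, here's a quick sketch of why one or two titles with a massive difference can throw an arithmetic average off, and why some reviewers prefer a geometric mean. The fps numbers below are made up for illustration, not from any review:

```python
import statistics

# Hypothetical average-fps results for one CPU across six games.
# "balanced" has no outliers; "skewed" is identical except one title
# where the CPU happens to be massively faster.
balanced = [120, 118, 125, 122, 119, 121]
skewed   = [120, 118, 125, 122, 119, 240]

print(round(statistics.mean(balanced), 1))          # 120.8
print(round(statistics.mean(skewed), 1))            # 140.7 -- one game moves the whole average
print(round(statistics.geometric_mean(skewed), 1))  # 135.4 -- the geomean dampens the outlier somewhat
```

One outlier shifts the arithmetic mean by almost 20 fps here, while the geometric mean is pulled a bit less; either way, a single title can dominate a small sample.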
 

They only recently stopped testing Civ 6 FPS. I can't think of a more pointless benchmark for a 4X turn-based game.

The other factor you miss is that GPU upgrades are far easier than CPU upgrades, even when it is just a swap, so a 4090 gamer might be buying a CPU to last for 2, maybe even 3, GPUs. By going with the fastest at 720p/1080p you gain longevity, which will help prevent annoying hiccups when they grab a 5090. A wider range of tests would make discerning the fastest a lot easier.

The final factor is power consumption: the X3D parts use a lot less energy than the standard parts, so you can have a far quieter system.
 
I get what you're saying and I don't even disagree with the point you're making; I just don't think it's something the majority of users will be affected by. The only thing you said that I'm not sure I agree with is GPUs being easier to upgrade than CPUs; looking at the longevity of the current sockets, both GPU and CPU seem as easy as pull out the old and plug in the new.
 
The only thing you said that I'm not sure I agree with is GPUs being easier to upgrade than CPUs

'Far easier' may be overstating it but I do think it is easier to unplug a few cables, pull out the GPU, plug a new one in and connect the cables again than it is to unmount the HSF, pull out the CPU, clean the HSF, plug in the new CPU, apply the thermal paste and mount the HSF again.
 
I get that, and my fault for not being clearer: I was referring more to the fact that there's no need to upgrade the motherboard and RAM.
 
Doubling down paying off:

[screenshot attachment]
 