Poll: Ryzen 7950X3D, 7900X3D, 7800X3D

Will you be purchasing the 7800X3D on the 6th?


  • Total voters: 191 (poll closed)
Why at 1080p?
We need to see it on a 40 series at 4K and 1440p vs the 13th gen Intel chips.

Then if it smashes 13th gen you can buy it and be happy.
Because at higher resolutions you're typically GPU-bound and the CPU has less impact. At low resolutions games become CPU-bound, which more clearly demonstrates which CPU offers a performance benefit.
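The argument above can be sketched with a toy bottleneck model (all FPS numbers below are made up for illustration, not real benchmark data): a game's effective frame rate is roughly the minimum of what the CPU and the GPU can each sustain, so CPU differences only show up when the GPU isn't the limiter.

```python
# Toy bottleneck model: effective FPS is roughly the minimum of what the
# CPU and GPU can each deliver. All numbers are hypothetical.

# Frames per second each CPU could feed to the renderer (made up).
cpu_fps = {"CPU A": 250, "CPU B": 180}

# FPS one GPU could render at each resolution (made up).
gpu_fps = {"1080p": 400, "1440p": 220, "4K": 110}

for res, gpu in gpu_fps.items():
    a = min(cpu_fps["CPU A"], gpu)
    b = min(cpu_fps["CPU B"], gpu)
    gap = (a - b) / b * 100
    print(f"{res}: CPU A {a} fps, CPU B {b} fps, gap {gap:.0f}%")
```

With these numbers the gap between the two CPUs is large at 1080p, shrinks at 1440p, and disappears entirely at 4K where both are capped by the GPU, which is exactly why reviewers drop the resolution to isolate the CPU.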
 
It is looking like the outright winner, even compared to the 13900KS, and there's not much price difference between the two either. Performance, efficiency and price.
 

Because they are testing the CPU's performance, not the GPU's. The sort of test you're describing wouldn't distinguish this chip from a previous-gen mid-range CPU, in which case what is the point?
 
Wouldn't be surprised to see a specific edge case of 50% better at "something".

"Up to" is, as always, an abomination, because technically correct isn't satisfying when you find it's not reflective of normal use.

I've already seen 40% in one game, so 50% isn't out of the question. The 7950X3D shows a 40% gain over the 7950X when paired with an RTX 4090 at 1080p in Horizon Zero Dawn. Maybe 50% at 720p?
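For anyone unsure how a headline "40% gain" figure is derived, it's just the relative difference between the two average frame rates. A quick worked example (the FPS values here are hypothetical, not the actual Horizon Zero Dawn results):

```python
# How a "40% faster" headline number is computed.
# Both FPS values are hypothetical, chosen only to illustrate the arithmetic.
baseline_fps = 150   # e.g. the non-X3D chip
x3d_fps = 210        # e.g. the X3D chip

gain = (x3d_fps - baseline_fps) / baseline_fps * 100
print(f"{gain:.0f}% faster")  # prints "40% faster" for these numbers
```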
 
Make sure you have a strong GPU, otherwise the new X3D chips don't show major gains. A second reviewer has released his data: tested with an RTX 3090, the 7950X3D showed almost no improvement. You need something like a 4090 to see those gains.
 
It’s how it performs at the settings used by most people that matters. Looking for the best possible example is just marketing, as 99.9% of the time it’s not representative. That ends in forums full of “why is my CPU not 50% faster?” posts.
 

It's the current trend to do benchmarks at unrealistically low graphics settings and claim gaming benefits or "crowns", as if the CPU were that important these days.

At least some reviewers have the decency to admit it's a farce to compare gaming performance when ten CPUs are averaging 300+ fps, and the moment the resolution goes up everyone is trapped by what the GPU can do.
 
It’s just removing as much as possible so the test shows the CPU performance difference. That’s what people are interested in: how much faster the CPU is, not the GPU.
 