CPU for 4K Gaming Misunderstanding

I've been saying it for ages: once you have a 4090-class GPU, the CPU matters a lot these days, even in some games with maxed-out settings at 4K. I didn't want to change platforms yet, so over a year and a half ago I went from a 5950X to a 5800X3D, because even at 4K the 5950X was holding the card back in some games. The averages went up by a decent amount, as did the minimums, and the GPU stayed pegged at 99% far more often. So don't cheap out on the CPU, even for 4K gaming.

For example, in The Division 2 maxed out in the White House area, my 4090 on the 5950X struggled to pass 90% GPU usage. The moment I popped in the 5800X3D, without any changes to the BIOS or Windows, the GPU stayed pegged at 99% without issue and I got a nice average FPS increase.
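
If you want to check this on your own system, here's a minimal sketch that polls GPU utilization while a game is running. It assumes an Nvidia card with nvidia-smi on the PATH; the sample count and the 95% threshold are just assumptions I picked, not anything official.

# Minimal sketch: poll GPU utilization to spot a CPU bottleneck.
# Assumes an Nvidia GPU with nvidia-smi available on the PATH.
import subprocess
import time

SAMPLES = 60           # one minute at 1-second intervals (arbitrary)
BOTTLENECK_UTIL = 95   # below this, the GPU is likely waiting on the CPU

readings = []
for _ in range(SAMPLES):
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    readings.append(int(out.stdout.strip().splitlines()[0]))
    time.sleep(1)

avg = sum(readings) / len(readings)
print(f"Average GPU utilization: {avg:.1f}%")
if avg < BOTTLENECK_UTIL:
    print("GPU rarely pegged -> likely CPU-bound in this scene.")
else:
    print("GPU pegged -> likely GPU-bound in this scene.")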

Here's Far Cry 6: the 5800X3D walks all over the 5950X, again with no BIOS or Windows changes; I just popped in the new CPU and booted the game up. Massive performance increase, and this game was running fully maxed out at 4K. If I were to go for a newer 7000-series chip or the latest Intel chips, the jump would be even bigger, because the GPU in this game is still being held back by the CPU.

This is only going to get worse once the 5000-series cards are out and a 4090 effectively becomes a higher-end 5070/5080. GPUs are now so fast that 4K is no longer a difficult resolution to run, and it takes a lot of CPU power to keep these cards fed properly.

[Screenshots: The Division 2 and Far Cry 6 benchmark results, 5950X vs 5800X3D]

I saw a pretty massive boost in frame rates with a 3080. I'd say from Nvidia's midrange onward, buy the highest-performance CPU you can. With a 4090 build, a 7800X3D should probably be the target, or even a 7950X3D.
 
I have a 4K 120 Hz display. At one point I had both a 3600X and a 5800X3D with a 4070 Ti. There was no difference at all in average FPS between the two chips except in one game. However, the 1% lows were considerably better with the 5800X3D.
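
For anyone unfamiliar with the metric: 1% lows are usually computed as the average FPS over the slowest 1% of frames. Here's a minimal sketch, assuming you have frame times in milliseconds exported from a capture tool such as PresentMon or CapFrameX; the file name and one-column layout are just illustrative.

# Minimal sketch: average FPS and 1% lows from frame times in milliseconds.
# "frametimes.csv" is a hypothetical one-column export from a capture tool.
import csv

frame_ms = []
with open("frametimes.csv", newline="") as f:
    for row in csv.reader(f):
        try:
            frame_ms.append(float(row[0]))
        except (ValueError, IndexError):
            continue  # skip a header row or blank lines

avg_fps = 1000.0 * len(frame_ms) / sum(frame_ms)

# 1% lows: average FPS over the slowest 1% of frames.
slowest = sorted(frame_ms, reverse=True)
worst = slowest[: max(1, len(slowest) // 100)]
low_1pct_fps = 1000.0 * len(worst) / sum(worst)

print(f"Average FPS: {avg_fps:.1f}")
print(f"1% low FPS:  {low_1pct_fps:.1f}")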
 
There are currently a lot of changes going on in the gaming industry as a whole regarding the transition to more cores and higher memory bandwidth, something that’s needed and unavoidable, so things aren’t quite as clear-cut as they once were.

All of this comes down to a mixture of factors, so you can’t really put it down to any single thing.

I think everyone can see that the excessively large L3 caches, for example, are so uniquely disproportionate that you can’t expect them to be the explanation all by themselves.
 
I think you’re being a bit hard on the 7800X3D. It’s a bonkers-fast CPU for around 75 watts, and probably the greatest desktop CPU of all time. After testing one for a few hours I could hardly believe AMD pulled this off for the money.

I think people get a bit carried away by the huge difference in pure CPU power consumption. For example, in Hogwarts Legacy at 1440p Ultra with a 4090, at roughly the same performance, whole-system power consumption averages 380-390 W with the 7800X3D versus 450-460 W with a 14700K (let's not talk about the 14900KS).
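
To put that roughly 70 W gap in perspective, here's some back-of-the-envelope arithmetic; the hours played and electricity price are assumptions, not measurements.

# Back-of-the-envelope: yearly cost of a ~70 W whole-system power gap.
# Hours per day and price per kWh are assumed values, not measured ones.
watts_gap = 70          # ~455 W (14700K) minus ~385 W (7800X3D)
hours_per_day = 2       # assumed daily gaming time
price_per_kwh = 0.30    # assumed electricity price per kWh

kwh_per_year = watts_gap / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.0f} kWh/year -> about {cost_per_year:.2f} per year")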
 

If you’re serious about PC gaming, the 7800X3D is technologically light-years ahead of anything else on the market. It’s the single most gaming-focused piece of silicon in history.

GPU power use is simply out of control. Intel have painted themselves into a corner; their CPU strategy and silicon are just wrong.
 