NVIDIA 4000 Series

Someone's going to have to explain this to me. I'm on a 12700KF as well, which will do 5GHz on all P-cores, though I'm running it stock at 4K 120Hz, and I honestly see no bottlenecking at all on my 4090, yet people are upgrading this CPU and 12900Ks. Why? What am I missing? I get the upgrade path since you don't have to change your motherboard, but why do people think 12th gen is holding back the 4090? I see no gains from overclocking and no bottlenecks. I understand it at lower resolutions like 1440p 16:9, but at 4K as well?
What scores are you getting in Port Royal? My recent run was circa 26,000. My 12700KF's temps don't go over 80°C with a 240mm AIO, whereas I see 13th gen running hotter with more power.

I've manually set the voltage curve to cap out at 1000mV, which gives me a straight-line boost to 2850MHz. Well happy with that considering the card now saves 80W, and temps, although good anyway, are 5-10°C lower.
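
For anyone wondering how I checked the saving: below is a rough Python sketch, not my actual Afterburner curve, that polls board power, core clock and temperature via pynvml (assumes the nvidia-ml-py package is installed and the 4090 is GPU index 0) so you can watch the undervolt under load.

Code:
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumption: card is device 0
try:
    for _ in range(10):  # ten one-second samples
        watts = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000  # NVML reports mW
        mhz = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        print(f"{watts:6.1f} W  {mhz} MHz  {temp} °C")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()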

I spent a bit more time tonight getting my RAM stable from 3600MHz to 4000MHz, which has seen a bump in AIDA64 scores anyway.
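
AIDA64 is the proper tool for measuring this, but purely to illustrate the kind of copy-bandwidth test it runs, here's a crude numpy stand-in; the buffer size and iteration count are arbitrary picks, not anything AIDA64 actually uses.

Code:
import time
import numpy as np

src = np.ones(32 * 1024 * 1024)  # ~256 MB of float64
dst = np.empty_like(src)

iters = 20
t0 = time.perf_counter()
for _ in range(iters):
    np.copyto(dst, src)  # one read of src + one write of dst per pass
elapsed = time.perf_counter() - t0

gb_moved = 2 * src.nbytes * iters / 1e9  # read + write
print(f"~{gb_moved / elapsed:.1f} GB/s effective copy bandwidth")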

Had a great flight in VR earlier in MSFS, very smooth.
 
Yeah, that's really interesting and all, but the point is whether a bottleneck exists. You liked his post, so do you therefore agree that "Every cpu bottle necked is utter nonsense at 4k, it’s not unless your real low end cpus"?

People have been complaining of large bottlenecks on high-end 10th-gen CPUs. If you pit a 5800X3D against a 13900K at 4K with a 4090, the 5800X3D is going to get lower FPS in some, maybe many, games.

Where? Which people? Show me people with recent CPUs “struggling”.
 
It's important to that strange breed who go online to claim the difference matters to everyone, wrapped in a more complex narrative. It's as if their reality is a stupidly high-refresh-rate gaming setup for that lucrative FPS gaming contract where every frame counts, when in reality they're flying planes around in a simulator and an extra frame or two might make the clouds look a bit better.
If there really was that much of a difference at 4K going from a previous-gen CPU to a current one, I'd be more inclined to blame Nvidia's poor implementation of GPU hardware scheduling than the CPU. But until I see this GPU failing to hit its limit in modern titles with V-sync off, it really doesn't matter, and I'm yet to see it.

No one is buying the 4090 for anything other than 4K and up (OK, maybe some people might :D). I certainly didn't upgrade from a 3080 to this to play at any lower resolution.
 
Well, I bought a 4090 for 3440x1440 and it's still not enough :D Cyberpunk 2077 dips below 60fps without DLSS (and gaming on a fast screen like an OLED reveals that DLSS Quality in motion is still worse than native).
 


You aren't describing a bottleneck, just the 'measurable' difference between CPUs. A bottleneck is where your GPU isn't reaching high-90s% usage. Both results show the GPU at 99%, so there's no bottleneck. If the GPU were at much lower usage, that would be a bottleneck.
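If you'd rather log it than eyeball an overlay, here's a minimal sketch of that check using pynvml (assumes the nvidia-ml-py package and GPU index 0): sample utilisation while the game runs, and a sustained average well below ~95% points at a CPU limit rather than the GPU.

Code:
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumption: card is device 0
samples = []
try:
    for _ in range(30):  # sample once a second while the game runs
        samples.append(pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu)
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()

avg = sum(samples) / len(samples)
print(f"avg GPU usage: {avg:.0f}% -> "
      f"{'GPU-bound' if avg >= 95 else 'likely CPU-bound'}")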

Your RAM looks like it could bottleneck your CPU, though. Not sure if the bench picks up native timings, but if not, your RAM looks slow. You want 3600MHz+ for AM4.

The difference between the 5800X3D and the 13900K is normal architectural variation. If the 5800X3D is lower, that isn't a bottleneck, just differences in the efficiency of the architectures. At 4K, results across many CPUs are single-figure differences, near as makes no difference to the gaming experience. It's the 1% lows in FPS that are usually most telling, as maximum FPS figures are so close that the gap usually sits within measurement error.
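
For anyone who wants to pull 1% lows out of their own captures: given a frame-time log (CapFrameX, FrameView etc. export these), one common definition is the FPS implied by the 99th-percentile frame time. A quick sketch with made-up numbers:

Code:
import statistics

def fps_stats(frame_times_ms):
    avg_fps = 1000 / statistics.fmean(frame_times_ms)
    # 99th-percentile frame time -> "1% low" FPS (one common definition;
    # tools differ slightly in how they compute this)
    worst = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99)]
    return avg_fps, 1000 / worst

# Hypothetical capture: mostly ~8.3 ms (~120 fps) with a few 20 ms spikes
times = [8.3] * 990 + [20.0] * 10
avg, low1 = fps_stats(times)
print(f"avg: {avg:.0f} fps, 1% low: {low1:.0f} fps")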

If you have a 4090 at 4K and you're seeing 95%+ GPU usage, I wouldn't bother changing the CPU: lots of money for little difference in actual gaming. 2560x1440 is ~45% of the pixels of 4K, and 3440x1440 is ~60% (about 1.3M more pixels than 2560x1440).

Both of those resolutions will leave a 4090 CPU-bottlenecked, as they're still a long way off 4K. Waste of a 4090.
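
The pixel counts are easy to sanity-check, quick sums below:

Code:
# pixel counts relative to 4K (3840x2160)
resolutions = {
    "2560x1440": 2560 * 1440,
    "3440x1440": 3440 * 1440,
    "3840x2160": 3840 * 2160,
}
four_k = resolutions["3840x2160"]
for name, px in resolutions.items():
    print(f"{name}: {px:>9,} px ({px / four_k:.0%} of 4K)")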
 