
RTX 4090 CPU Bottleneck & Stuttering

Associate
Joined
9 Oct 2022
Posts
25
Location
London
Hi All

My main PC is an i7 12700k with 32 GB of 5600 MHz DDR5 RAM and an MSI Gaming X Trio RTX 4090. It runs any game I throw at it buttery smooth at 4K with high FPS, without any stutter and with only minor FPS drops, which are unnoticeable thanks to Nvidia G-Sync.

My bedroom PC is an i7 10700k with 32 GB of 3200 MHz DDR4 RAM and an MSI Ventus RTX 4090. It runs the same 4K games anywhere from 20-30 FPS slower than my main PC. This is after enabling MSI's Game Boost CPU overclock feature in the motherboard BIOS of the bedroom PC. Before enabling Game Boost I was getting 30-40 FPS less on the bedroom PC than on the main PC.

What's most frustrating is that sometimes, when the FPS drops, the bedroom PC will intermittently stutter for a split second in all the games I tested: Witcher 3 next-gen, Cyberpunk 2077 and Watch Dogs Legion. My main PC does not stutter at all. The stuttering on the bedroom PC is intermittent and doesn't happen all the time, but it's frustrating when it does. The intermittent stutter and frame drops are very noticeable despite having Nvidia G-Sync enabled.

I'm putting this down to a CPU bottleneck, with the 10700k not being able to keep up with the mighty RTX 4090, but I'm surprised if that's the case. Everyone is always saying that CPU bottlenecks don't occur much at 4K resolution, only at 1440p or lower. It's the reason most tech reviewers will only benchmark games at 1080p and 1440p, not 4K.
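The "bottlenecks fade at 4K" intuition can be sketched with a toy frame-time model (illustrative made-up numbers, not benchmarks): a frame is ready when both the CPU and GPU work for it is done, so the slower of the two sets the frame rate, and raising resolution inflates only the GPU term.

```python
# Toy frame-time model: the slower of CPU and GPU frame time limits FPS.
# All millisecond figures below are hypothetical, for illustration only.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_slow, cpu_fast = 8.0, 5.0  # ms of CPU work per frame (made up)

# At 1440p the GPU finishes quickly, so the CPU difference shows:
gpu_1440p = 4.0
print(fps(cpu_slow, gpu_1440p), fps(cpu_fast, gpu_1440p))  # 125.0 200.0

# At 4K the GPU term dominates and both CPUs land on the same FPS:
gpu_4k = 10.0
print(fps(cpu_slow, gpu_4k), fps(cpu_fast, gpu_4k))  # 100.0 100.0
```

The model also hints at why stutter still shows up: CPU frame times spike intermittently, and whenever a spike exceeds the GPU frame time it becomes visible even at 4K.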

The bedroom PC is a clean install of Windows 11, and the system passes 3DMark Time Spy stress tests without any issues.

I wanted to check with you all whether I'm right that this is definitely due to a CPU bottleneck. I really don't want to shell out on a new CPU and motherboard, but I might need to, as gaming on the 10700k system with the RTX 4090 is unplayable and takes away the enjoyment of games.

Please let me know your thoughts. Any help or advice you can give me would be greatly appreciated.
 
Are both systems running the game from an NVMe drive rather than a SATA SSD?
Thanks for the reply. Yes, both are running games from NVMe SSDs, but the bedroom PC only has a Gen3 NVMe whereas the main PC has a Gen4. Although I don't think it matters: even though all of the games are installed on the Gen4 NVMe on the main PC, I just realised that Cyberpunk 2077 is actually running off the SATA SSD, and it still runs better than on the bedroom PC's Gen3 NVMe SSD.
 
Yes it's a very clear and obvious case of CPU bottlenecking. In truth even your 12700K isn't really strong enough to tame a 4090, but we can only wait for better. Presumably the 7700X3D is arriving by March and that should be the new king.
 
Ignoring overclocking, that 10700k is anything up to 40% slower than the 12700k, a chip that is already borderline for CPU bottlenecking a 4090.

Basically, the 10700k isn't strong enough for the 4090 without some adjustments to allow for its reduced performance. You can reduce stuttering by limiting framerates, lowering some settings, etc.
 
Yes it's a very clear and obvious case of CPU bottlenecking. In truth even your 12700K isn't really strong enough to tame a 4090, but we can only wait for better. Presumably the 7700X3D is arriving by March and that should be the new king.
Thanks for the reply. Yeah, I know what you mean. The new Raptor Lake chips are insane; an i5 13600k is more powerful than an i9 12900k in gaming workloads.

However, I'm fine with the i7 12700k's performance. I can buy a new Z690 or Z790 motherboard; I'm not sure if I'll go for a 12700k or a 13600k yet.

Is it worth investing in a DDR5-capable motherboard, or should I get DDR4? Since it's my bedroom PC, I'd prefer to save some costs if I can re-use my 32 GB of 3200 MHz DDR4 RAM.

On the other hand, if the DDR4 RAM will be a bottleneck, then since I'm shelling out for a new motherboard and CPU anyway, I'd rather go for DDR5.
 
Ignoring overclocking, that 10700k is anything up to 40% slower than the 12700k, a chip that is already borderline for CPU bottlenecking a 4090.

Basically, the 10700k isn't strong enough for the 4090 without some adjustments to allow for its reduced performance. You can reduce stuttering by limiting framerates, lowering some settings, etc.
Thanks for the reply.

Looks like a new motherboard and CPU it is then.
 
Ignoring overclocking, that 10700k is anything up to 40% slower than the 12700k, a chip that is already borderline for CPU bottlenecking a 4090.

Basically, the 10700k isn't strong enough for the 4090 without some adjustments to allow for its reduced performance. You can reduce stuttering by limiting framerates, lowering some settings, etc.
Yeah, at 1080p or 1440p maybe, but at 4K I've seen a video where it was more like a 3-10% difference even compared to the 13700k; no way it would be 40% lower. Maybe in apps or Blender. When I can be bothered to install my 4090 I'll be using my 6700K even, and it'll be OK; it loses maybe 20% max at 4K in games even compared to a 13900k. There are loads of videos comparing the 4090 with everything from a 13900k right down to a 6700k: every CPU bottlenecks the 4090, but at 4K gaming there's not a lot of difference. I'm happy to use my 6700K at 4K with my 4090 until I get a new CPU and motherboard.

 
Thanks for the reply. Yeah, I know what you mean. The new Raptor Lake chips are insane; an i5 13600k is more powerful than an i9 12900k in gaming workloads.

However, I'm fine with the i7 12700k's performance. I can buy a new Z690 or Z790 motherboard; I'm not sure if I'll go for a 12700k or a 13600k yet.

Is it worth investing in a DDR5-capable motherboard, or should I get DDR4? Since it's my bedroom PC, I'd prefer to save some costs if I can re-use my 32 GB of 3200 MHz DDR4 RAM.

On the other hand, if the DDR4 RAM will be a bottleneck, then since I'm shelling out for a new motherboard and CPU anyway, I'd rather go for DDR5.
At 4K the 12700k is just fine; there's hardly any difference with the newer CPUs in games. Maybe in Blender and Photoshop it helps.
 
Yeah, at 1080p or 1440p maybe, but at 4K I've seen a video where it was more like a 3-10% difference even compared to the 13700k; no way it would be 40% lower. Maybe in apps or Blender. When I can be bothered to install my 4090 I'll be using my 6700K even, and it'll be OK; it loses maybe 20% max at 4K in games even compared to a 13900k. There are loads of videos comparing the 4090 with everything from a 13900k right down to a 6700k: every CPU bottlenecks the 4090, but at 4K gaming there's not a lot of difference. I'm happy to use my 6700K at 4K with my 4090 until I get a new CPU and motherboard.


That video makes no sense. Are those benchmarks legit or faked? My 10700k is at least 30-40% slower when paired with an RTX 4090 compared to my 12700k.

Although, none of the benchmarks in that video use ray tracing. All the games I'm playing have ray tracing and DLSS enabled.
 
That video makes no sense. Are those benchmarks legit or faked? My 10700k is at least 30-40% slower when paired with an RTX 4090 compared to my 12700k.

Although, none of the benchmarks in that video use ray tracing. All the games I'm playing have ray tracing and DLSS enabled.
Well, there's your answer :P You can't really compare it to what you're running when the test is nothing like your setup.
 
That video makes no sense. Are those benchmarks legit or faked? My 10700k is at least 30-40% slower when paired with an RTX 4090 compared to my 12700k.

Although, none of the benchmarks in that video use ray tracing. All the games I'm playing have ray tracing and DLSS enabled.
Plenty of other videos out there show the same thing: at 4K there isn't a massive difference for gaming.
 
Yeah, at 1080p or 1440p maybe, but at 4K I've seen a video where it was more like a 3-10% difference even compared to the 13700k; no way it would be 40% lower. Maybe in apps or Blender. When I can be bothered to install my 4090 I'll be using my 6700K even, and it'll be OK; it loses maybe 20% max at 4K in games even compared to a 13900k. There are loads of videos comparing the 4090 with everything from a 13900k right down to a 6700k: every CPU bottlenecks the 4090, but at 4K gaming there's not a lot of difference. I'm happy to use my 6700K at 4K with my 4090 until I get a new CPU and motherboard.


Those are fake benchmarks.

I think we should actually have a rule about it; any video claiming to benchmark hardware must:

1) Show on camera that they have the hardware they are claiming to benchmark

or

2) Show on camera the face of the person running the channel

Failure to meet at least one of those should mean the video can't be posted, because 99% of all videos on YouTube that don't meet rule 1 or rule 2 are fake.
 

This guy has a slightly better processor than me (10850k vs 10700k) and an RTX 4090.

His performance in Watch Dogs Legion is only slightly better than mine.

The only other major difference is that he is playing with DLSS set to Balanced while I am playing with it set to Quality.

It's definitely a CPU bottleneck; I'm sure of it.
 
There isn't a CPU currently out that can fully keep up with a 4090 below 4K, and even at 4K it's marginal. Still, at 4K plenty can be done to reduce the stutter. A 10700K is considerably slower than a 12th-gen chip and will show its age, especially in the 1% lows and frame-time pacing, which would explain the stuttering. For the highest-performance graphics cards you really don't want to pair them with anything below 12th gen, or AMD's equivalent.

Cap the framerate to a few fps below your refresh rate: 117fps, for example, if your G-Sync display runs at 120Hz. I prefer doing so via RTSS, as different games implement frame rate capping differently and some introduce noticeable latency. RTSS adds at most a single frame of latency.
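The cap arithmetic above can be sketched as a one-liner (the "refresh minus a small margin" figure is a common rule of thumb, not an official spec; the function name is made up):

```python
# Rule-of-thumb frame cap for a VRR/G-Sync display: a few fps below
# the refresh rate so the framerate stays inside the VRR window.
def frame_cap(refresh_hz: int, margin: int = 3) -> int:
    return refresh_hz - margin

print(frame_cap(120))  # 117
print(frame_cap(144))  # 141
```

Whatever margin you pick, the point is simply to keep frame delivery from bumping into the top of the refresh window, where G-Sync hands off to V-Sync or tearing.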
 

This guy has a slightly better processor than me (10850k vs 10700k) and an RTX 4090.

His performance in Watch Dogs Legion is only slightly better than mine.

The only other major difference is that he is playing with DLSS set to Balanced while I am playing with it set to Quality.

It's definitely a CPU bottleneck; I'm sure of it.
Of course it's a CPU bottleneck; even the latest and greatest 13900K is a bottleneck for a 4090 at 4K.
 