
NVIDIA 4000 Series

Reddit user buys 4090 to replace 3090, complains that the 4090 doesn't run any faster. Turns out he has a Ryzen 3700x CPU :cry:


This is kinda interesting, because for years the wisdom was that the graphics card was the main determinant of performance and having an older processor wasn't really important. All those folks gaming on Sandy/Ivy Bridge for the best part of a decade were advised that they weren't going to see a lot of difference in gaming by going to whatever-lake, and to instead spend the cash with Nvidia/AMD on modernising the graphics card...

Now we're at "don't buy the top end GPU because without also having the most modern CPU you'll see no benefit".
 
Right, stupid question time! Why is it that 1440p is CPU bound but 4K isn't?
At lower resolutions the GPU renders frames really fast, and the CPU has to keep up with that increased pace; if it can't, that's what people mean by a CPU bottleneck. The more powerful the CPU, the more easily it can keep that info flowing. Higher resolutions make the GPU render frames more slowly. At 4K it is usually slow enough that there is little difference on the CPU side: that's a GPU bottleneck. The GPU is so taxed that it is not producing frames fast enough for the CPU to matter, so less powerful CPUs will work about as well as the stronger ones. This is just the general rule, though. The game itself influences how much you see the effect, but the basics remain the same with slight variances in how big the effect is.
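To put rough numbers on that, here's a toy sketch of the idea (the per-frame millisecond figures are made up purely for illustration): the slower of the CPU and GPU stages sets the framerate, and only the GPU's share grows with resolution.

```python
# Toy bottleneck model: whichever stage takes longer per frame sets the pace.
# All figures are invented for illustration only.

def fps(cpu_ms, gpu_ms):
    # Simplification: assume no pipelining tricks, the slower stage dominates.
    return 1000 / max(cpu_ms, gpu_ms)

cpu_ms = 6.0  # assumed CPU time per frame; roughly constant across resolutions
for res, gpu_ms in [("1080p", 3.0), ("1440p", 5.0), ("4K", 11.0)]:
    limiter = "CPU" if cpu_ms >= gpu_ms else "GPU"
    print(f"{res}: ~{fps(cpu_ms, gpu_ms):.0f} fps ({limiter}-bound)")
```

With those made-up numbers, 1080p and 1440p both cap out at the same ~167fps set by the CPU, while at 4K the GPU becomes the limit and a faster CPU wouldn't change much.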
 
I decided to stupidly install GeForce Experience just to see what settings it suggests....
and the recommended settings for all my games are crap. Example.... yeah, 'optimal' isn't even 3440x1440, are you trolling Nvidia? It maxes out at 144fps easy.

How can a non-native res be optimal?

Wants me to play Cyberpunk without RT :cry: why buy a 4090 if you're going to play at 3080 settings?
 
Reddit user buys 4090 to replace 3090, complains that the 4090 doesn't run any faster. Turns out he has a Ryzen 3700x CPU :cry:

If he was playing at 4K then an RTX 4090 would still give a boost in many games that are very GPU dependent. Others, of course, not so much.

Did you see what games he was playing before the thread was deleted?
 
I decided to stupidly install GeForce Experience just to see what settings it suggests....
and the recommended settings for all my games are crap. Example.... yeah, 'optimal' isn't even 3440x1440, are you trolling Nvidia? It maxes out at 144fps easy.

How can a non-native res be optimal?

Wants me to play Cyberpunk without RT :cry: why buy a 4090 if you're going to play at 3080 settings?
GFE is poo lol. Bin it off and learn from this :D
 
If he was playing at 4K then an RTX 4090 would still give a boost in many games that are very GPU dependent. Others, of course, not so much.

Did you see what games he was playing before the thread was deleted?

He said he was gaming at 4K and was currently playing Spider-Man and Cyberpunk. He said the minimum framerate in Cyberpunk was particularly bad, as it would drop as low as 24fps for him, and the average framerate was the same as on the 3090.
 
Depends what your future plans are. My 4090 arrives today, and while it may be an upgrade from my 2080 Ti, as I game at either 4K/60Hz or 3440x1440/165Hz I am also realistic that a 3900X (with RAM I really need to tinker with) won't get the best out of it.

But I've had that 2080 Ti for three years and I expect the 4090 to last at least as long. The only question is whether I move to a 5800X3D this side of Christmas or wait until the middle of next year for something different....
 
He said he was gaming at 4K and was currently playing Spider-Man and Cyberpunk. He said the minimum framerate in Cyberpunk was particularly bad, as it would drop as low as 24fps for him, and the average framerate was the same as on the 3090.
It will be interesting to see a site do a full benchmark suite across many games with an RTX 4090 and a weaker CPU... I imagine consistency and performance increases would vary a hell of a lot depending on the game engine.
 
Didn't Nvidia claim in the region of 2-4x faster, game dependent? It also has the DLSS 3 advantage, which people will see as a plus point as the 30 series has been locked out of that (and which obviously plays a big part in their performance claims).
It doesn't look much over half the speed of a 4090 despite coming in at 3/4 the cost.
 
It's not just the CPU that "holds back" the 4090... the monitor and human vision do too.
I've tested with my 165Hz 4K monitor and above 120Hz I don't see much of a difference.
I bought my 4090 for 4K/RTX/120-ish fps... and because I'm weak when it comes to toys :)
 
Does this build look ok to you guys?

2TB Samsung 980 PRO NVMe PCIe 4.0, 7000MB/s Read, 5000MB/s Write, 1000K IOPS
Fractal Design Define 7 XL
ASUS ROG STRIX Z790-F GAMING WIFI
Intel Core i9 13900K, Raptor Lake, 24 Cores, 32 Threads, 3.0GHz Base, 5.8GHz Turbo
Noctua NH-D15 Chromax Black - Quiet Performance Air Cooler
32GB (2x16GB) Corsair Vengeance RGB DDR5 5200MHz CAS36 [Black]
24GB ASUS GeForce RTX 4090 TUF GAMING OC, 16384 Cores, 2565MHz Boost, GDDR6X - GeForce RTX VR Ready
Corsair RM1000x, Modular, Silent, 80PLUS GOLD
 
I bought my 4090 for 4K/RTX/120-ish fps... and because I'm weak when it comes to toys :)
Ditto, I want 4K120 Ultra in the games I play on my TV. Sadly my desktop monitor is only 60Hz because I am still hanging on for the new 27-32" OLEDs coming in 2023. However, all I play on my desktop are games like Total War: Warhammer 3 or anything else 'requiring' mouse and keyboard, so it's not much of an impact. For everything else I am using a PS5 pad and my TV.
 
At lower resolutions the GPU renders frames really fast, and the CPU has to keep up with that increased pace; if it can't, that's what people mean by a CPU bottleneck. The more powerful the CPU, the more easily it can keep that info flowing. Higher resolutions make the GPU render frames more slowly. At 4K it is usually slow enough that there is little difference on the CPU side: that's a GPU bottleneck. The GPU is so taxed that it is not producing frames fast enough for the CPU to matter, so less powerful CPUs will work about as well as the stronger ones. This is just the general rule, though. The game itself influences how much you see the effect, but the basics remain the same with slight variances in how big the effect is.
What a great eli5, thanks.
 
This is kinda interesting, because for years the wisdom was that the graphics card was the main determinant of performance and having an older processor wasn't really important. All those folks gaming on Sandy/Ivy Bridge for the best part of a decade were advised that they weren't going to see a lot of difference in gaming by going to whatever-lake, and to instead spend the cash with Nvidia/AMD on modernising the graphics card...

Now we're at "don't buy the top end GPU because without also having the most modern CPU you'll see no benefit".
That's something I started to notice over the past few years with the 3080, let alone the 4090. Until the 3080, I found a 10-year-old Sandy/Ivy Bridge had little impact in Division 2: older GPUs sat at 90% usage while the CPU was at 33%, clearly showing the game was GPU limited with lots of spare CPU power. When I upgraded to the 3080 the GPU usage dropped down to 44% while the CPU went up to 70%+. That's the first time in 10 years that it felt like the CPU was the bottleneck and needed upgrading. So I am not surprised at all that the 4080 or 4090 are showing even more limitations with older CPUs.

The ratio of CPU to GPU load is always different from game to game, but it does feel like after 10 years we are at the point where it's at last worthwhile to upgrade away from a high-end Sandy/Ivy Bridge. As long as the Raptor Lake reviews don't show anything unexpectedly bad, I guess I should upgrade.
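If anyone wants to do the same spot check, a rough way to grab those usage numbers while a game is running is sketched below. It assumes an NVIDIA card with nvidia-smi on the PATH and the third-party psutil package installed; bear in mind an averaged CPU percentage can hide one maxed-out thread, so treat it as a hint rather than proof.

```python
# Rough CPU-vs-GPU utilisation snapshot while a game is running.
# Assumes nvidia-smi is available and psutil is installed (pip install psutil).
import subprocess
import psutil

def utilisation_sample(interval=1.0):
    cpu = psutil.cpu_percent(interval=interval)  # averaged across all cores
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    gpu = float(out.stdout.strip().splitlines()[0])  # first GPU only
    return cpu, gpu

if __name__ == "__main__":
    cpu, gpu = utilisation_sample()
    print(f"CPU {cpu:.0f}% / GPU {gpu:.0f}%")
    if gpu > 90:
        print("Looks GPU-limited")
    else:
        print("GPU has headroom - possibly CPU-limited (check per-core usage too)")
```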
 
He said he was gaming at 4K and was currently playing Spider-Man and Cyberpunk. He said the minimum framerate in Cyberpunk was particularly bad, as it would drop as low as 24fps for him, and the average framerate was the same as on the 3090.

I'm using a 10700K, and I saw a huge improvement in Cyberpunk at 4K moving from a 3090 to a 4090.

I only managed to get everything installed late last night (had to swap over the power supply etc.), but from the limited testing I've done so far the only game where I wasn't seeing much difference was Flight Sim (although I may now be able to turn up the settings and get the same sort of frame rate as I was getting before; I don't know yet).
 
Reddit user buys 4090 to replace 3090, complains that the 4090 doesn't run any faster. Turns out he has a Ryzen 3700x CPU :cry:

What is wrong with people!!!! I'm going to be ******** on some people on these forums but I don't care. Why are people buying a 4090 and then skimping on the rest of their system? If you cannot afford to upgrade the rest of your system to match the 4090, then you cannot afford a 4090!
People trying their luck with potentially underpowered PSUs and then patting themselves on the back because they saved £150. You just bought a £1600-£2000 GPU (at launch, for that matter!!!) and now all of a sudden you want to save money!!!! If you were serious about saving money you wouldn't have bought the GPU in the first place.

It is the same as people buying high performance cars but then they can't afford tyres. So they put rubbish tyres on the car or, worse, run them until they're bald and the tread is gone.
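On the PSU point, a back-of-the-envelope headroom check is easy enough to do before cheaping out. The wattages and the transient multiplier below are assumptions for illustration only, not official figures for any card; check your actual parts and the PSU vendor's guidance.

```python
# Rough PSU headroom estimate. All wattages and the transient factor are
# illustrative assumptions, not manufacturer specs.

def psu_headroom(psu_watts, parts, gpu_transient_factor=1.6):
    steady = sum(parts.values())
    # Crude allowance for short GPU power spikes on top of everything else.
    peak = steady - parts["gpu"] + parts["gpu"] * gpu_transient_factor
    return steady, peak, psu_watts - peak

build = {"gpu": 450, "cpu": 250, "rest": 100}  # assumed figures for the example
steady, peak, margin = psu_headroom(1000, build)
print(f"Steady ~{steady} W, spike ~{peak:.0f} W, margin vs 1000 W PSU: {margin:.0f} W")
```

With those guesses the build ends up slightly in the red against worst-case spikes even on a 1000W unit, which is exactly why saving £150 on the PSU is a false economy.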
 
Yes, the EU energy regs are dumb. E.g. they limited vacuum cleaner power, so you just had to spend longer vacuuming, hence the total energy used is the same.
Just wastes your time.

And yet some of the best vacuum cleaners in tests at actually sucking up dirt were the likes of Miele with 900W motors, and the ones with 2700W motors were useless and took longer to vacuum your floor????

A bigger engine doesn't always mean faster or more efficient. Just look at cars for an example: you get old US 7.1L V8 engines putting out 200bhp and then 1.3L triple-turbo engines putting out the same bhp.
 
What is wrong with people!!!! I'm going to be ******** on some people on these forums but I don't care. Why are people buying a 4090 and then skimping on the rest of their system? If you cannot afford to upgrade the rest of your system to match the 4090, then you cannot afford a 4090!
People trying their luck with potentially underpowered PSUs and then patting themselves on the back because they saved £150. You just bought a £1600-£2000 GPU (at launch, for that matter!!!) and now all of a sudden you want to save money!!!! If you were serious about saving money you wouldn't have bought the GPU in the first place.

It is the same as people buying high performance cars but then they can't afford tyres. So they put rubbish tyres on the car or, worse, run them until they're bald and the tread is gone.

If I end up with one I won’t be upgrading my 5950X for it.

For the sake of a few % performance I don’t think it’s worth it.

Sure, it would be nicer to run one with a 7950X but I’m really not that bothered to go to the trouble of upgrading CPU, mobo and RAM for AM5.
 