
NVIDIA 5000 SERIES

I'm CPU bottlenecked in quite a few games I play. Turns out hitting 240fps at 4K requires a monster CPU, who knew?!
Even the 9800X3D cannot do it in every multiplayer game I play. I will happily turn down graphical settings and use DLSS Performance in a fast-paced game to get close to 240fps on the GPU side, but there is zero scaling in games on the CPU side. Hence my decision, which seems slightly odd to many, to upgrade from a 7800X3D to a 9800X3D.
Yeah, with my 7950X3D/4090 setup, I've found that CPU bottlenecking at 4K is almost as common as GPU bottlenecking. Even in a game that is mostly GPU limited, there'll often be some parts that are CPU limited.

I remember playing the Resident Evil 4 remake on a 5800X3D/4090; there were certain parts where the framerate just dropped drastically. Lowering the resolution didn't help at all, but those drops were solved almost completely when I upgraded to the 7950X3D. Yet when you go on Reddit, most people there said that upgrading from a 5800X3D was pointless for 4K gamers.
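To put some rough numbers on that 240fps target (just illustrative arithmetic on my part, nothing measured): the CPU has a fixed time budget per frame that no resolution or DLSS setting can shrink, which is exactly why the GPU side scales and the CPU side doesn't.

```python
# Illustrative arithmetic: the per-frame time budget at various fps targets.
# Lowering resolution or the DLSS mode shrinks GPU work only; the CPU's share
# of each frame (game logic, draw-call submission) still has to fit this budget.
def frame_budget_ms(target_fps: float) -> float:
    """Milliseconds available to produce one frame at the given framerate."""
    return 1000.0 / target_fps

for fps in (60, 120, 144, 240):
    print(f"{fps:>3} fps target -> {frame_budget_ms(fps):.2f} ms per frame")

# e.g. a CPU that needs ~6 ms of main/render-thread work per frame tops out
# around 1000 / 6 ≈ 166 fps, no matter how fast the GPU is.
```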
 
They really need to create something to counter CPU bottlenecking. I swear I saw somewhere that AMD's CPUs and GPUs do something together to stop this from happening, or am I thinking of sharing RAM with the GPU? Dunno.
 

However, multi frame generation doesn't work as well in other titles. I saw relatively spiky frame-times in Alan Wake 2, for example, while my colleagues report that Hogwarts Legacy was far from a great experience. Similarly, Dragon Age: The Veilguard worked well during gameplay, but frame generation didn't work well in cutscenes. Clearly there's still some work to do here, and it may be that some games just don't suit the technology - or won't ever be updated to support it. We'll take a closer look at DLSS 4 soon to get a better idea of the lay of the land.
Interesting review. Seems like 3X/4X MFG is mostly pointless right now. Hogwarts and AW2 are two of the games that would benefit the most from further frame-gen performance.

DF also showed that at 4K, the 5090 has a sizeable advantage, often hitting 30% higher FPS. But the 4090 keeps up much better with the 5090 at 1440p, which is the base resolution for 4K DLSS Quality. So when using 4K DLSS Quality, will the gap be closer to the differences at 1440p or at 4K?

This is important because I feel like, even with a 5090, most people are going to use DLSS quality instead of native 4K for better performance.
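For context on that 1440p comparison (these are the commonly cited DLSS render-scale factors, not numbers from DF's review): DLSS Quality renders at roughly two thirds of the output resolution per axis, so 4K Quality is internally about 2560x1440.

```python
# Commonly cited per-axis DLSS render-scale factors (assumed, not from the review).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Resolution the GPU actually renders before DLSS upscales to the output size."""
    scale = DLSS_SCALE[mode]
    return round(out_w * scale), round(out_h * scale)

for mode in DLSS_SCALE:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"4K DLSS {mode:<17} -> renders at about {w}x{h}")

# 4K Quality lands at ~2560x1440, which is why 1440p scaling is the relevant
# comparison for the 4090 vs 5090 question above.
```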
 
I think the CPU has become a critical part of the modern gaming PC. That may sound obvious, but several years ago you could get away with a 12400F-class CPU and pair it with a big GPU. That's no longer plausible with the latest AAA games and UE5. I don't think the demand for CPU power will slow down anytime soon either. I just hope the 10800X3D Zen 6 CPU will be on AM5.
 
You're probably thinking of Smart Access Memory (SAM). It's not really there to counter CPU bottlenecks; it lets the CPU address the GPU's entire VRAM at once instead of a small 256MB window, so the two parts of your system work together more efficiently.
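If anyone wants to check whether it's actually active on an NVIDIA card (SAM is AMD's branding of the same Resizable BAR feature), a rough sketch along these lines should work; it assumes nvidia-smi is on your PATH and that the driver reports the usual "FB Memory Usage" and "BAR1 Memory Usage" sections:

```python
# Rough sketch: infer whether Resizable BAR is active by comparing the BAR1
# aperture (how much VRAM the CPU can map at once) to total VRAM.
# Assumes nvidia-smi is installed; section/field names may differ by driver version.
import re
import subprocess

def total_mib(section_label: str, report: str) -> int:
    """Grab the first 'Total : N MiB' value after the given section heading."""
    after = report.split(section_label, 1)[1]
    return int(re.search(r"Total\s*:\s*(\d+)\s*MiB", after).group(1))

report = subprocess.run(["nvidia-smi", "-q", "-d", "MEMORY"],
                        capture_output=True, text=True, check=True).stdout
vram = total_mib("FB Memory Usage", report)
bar1 = total_mib("BAR1 Memory Usage", report)

print(f"VRAM: {vram} MiB, BAR1 aperture: {bar1} MiB")
if bar1 >= vram:
    print("BAR1 covers all of VRAM -> Resizable BAR looks enabled.")
else:
    print("Small BAR1 window (often 256 MiB) -> Resizable BAR likely disabled.")
```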
 
ReBAR?

I think ReBAR just makes memory communication more efficient when utilized properly. As far as I know, it doesn't help to boost CPU-bottlenecked games.

Frame generation actually helps to "fix" CPU-bottlenecked games. It doesn't lower CPU requirements, but it keeps the framerate high even when there are big dips due to inadequate CPU power. For a game like Hogwarts, it is pretty useful to keep the framerate at 100 when it would otherwise drop to 50, even on a 9800X3D.
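A toy model of why that works (the numbers are just mine to illustrate): frame generation multiplies the presented framerate, but the simulation and input sampling still run at the CPU-limited internal rate, so latency doesn't improve.

```python
# Toy model with made-up numbers: frame gen multiplies presented fps, but the
# game still simulates (and samples input) at the CPU-limited internal rate.
def with_frame_gen(internal_fps: float, multiplier: int) -> tuple[float, float]:
    presented_fps = internal_fps * multiplier    # what the monitor displays
    sim_frametime_ms = 1000.0 / internal_fps     # latency tracks this, not the display rate
    return presented_fps, sim_frametime_ms

for cpu_dip in (50, 70, 100):          # e.g. a Hogwarts-style CPU-limited dip
    for mult in (2, 3, 4):             # FG 2x, MFG 3x, MFG 4x
        shown, sim_ms = with_frame_gen(cpu_dip, mult)
        print(f"{cpu_dip:>3} fps internal x{mult} -> {shown:.0f} fps shown, "
              f"~{sim_ms:.0f} ms per simulated frame")
```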
 
Yeah I remember during the PS4 days, people didn't upgrade their CPU for 5 years or more. It was pointless because games were still being developed first for those consoles with awfully weak CPUs. Also, Intel was just coasting and AMD was still trying to catch up in single core performance.
 


Not disagreeing with them, but Hogwarts Legacy is a bad example.

They’re so averse to optimising their game beyond a fkn single thread that they’d rather just lazily slap MFG onto a game that never gets updated anymore.

If anyone ever tries to tell you DLSS isn’t a crutch for devs and optimisation, just point them to that game as an example.
 
The stupidity of MFG is that if you can use it effectively, you don't really need it, as you're already at a stable framerate (you basically need a 240Hz monitor, which at 4K seems ridiculous unless it's an ancient game). And if you need DLSS just to get to that framerate in the first place, the addition of fake frames is only going to make it an artifact-ridden slopfest.
 
TBF, the high CPU requirements in Hogwarts are mostly due to RT, and specifically the max setting. If I turn off RT, the game runs nice and smooth at over 120fps, even without frame gen. Even RT is mostly usable at the setting right below max. But, for some reason, max RT runs like a stuttery mess. I can't imagine that even the improvement from a 7950X3D to a 9800X3D would save it.
 
Hogwarts is a bugged game: it has crap frame pacing and traversal stutters, and they will never fix it because they're working on a remaster instead, lol. RT is broken in it as well.
 
Yeah I remember during the PS4 days, people didn't upgrade their CPU for 5 years or more. It was pointless because games were still being developed first for those consoles with awfully weak CPUs. Also, Intel was just coasting and AMD was still trying to catch up in single core performance.
Haha, yes. Good old Jaguar cores. Now many of the latest AAA games are 30fps on the consoles, with possibly a performance mode that only hits 40-50fps. Consequently, you need a monster CPU on PC to even hit 80+fps.
 
Hogwarts is a bugged game: it has crap frame pacing and traversal stutters, and they will never fix it because they're working on a remaster instead, lol. RT is broken in it as well.

One of the biggest disappointments with the game for me: it would have been the perfect showcase for a proper ray-tracing renderer. That, and seeing how low quality a lot of the trees are if you look at them close up.
 
We can't really evaluate what the (high-end) average gamer should or shouldn't buy until the 5080 reviews, tbh. Even if the difference is fairly big, so too is the price.

We only have half the equation at the moment.
I’m expecting the 5090 to be 40-60% faster than a 5080, depending on the game, for 100% more money. Not a good deal in terms of price to performance, but then the 5080 looks more like a last-gen card.
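Putting that guess into numbers (the uplift range is the assumption above, and the MSRPs are the commonly quoted $999/$1,999, not anything benchmarked):

```python
# Back-of-the-envelope price-to-performance using the assumptions above,
# with commonly quoted MSRPs of $999 (5080) and $1,999 (5090).
prices = {"5080": 999, "5090": 1999}

for uplift in (0.40, 0.60):                          # 5090 assumed 40-60% faster
    perf_per_dollar_5080 = 1.0 / prices["5080"]
    perf_per_dollar_5090 = (1.0 + uplift) / prices["5090"]
    ratio = perf_per_dollar_5090 / perf_per_dollar_5080
    print(f"At +{uplift:.0%} performance, the 5090 offers about "
          f"{ratio:.0%} of the 5080's performance per dollar.")
```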
 
I've not watched too many reviews yet but it's looking like what we were expecting from Nvidia's slides and leaks.

Feels like, as with so many in recent history, it's a good card with a bad price. If this had released at the same price as the 4090 then it might not have been amazing, but it would probably still have been decent. As it is, it doesn't seem to be worse value than the 4090, just not the uplift you would expect from a new generation.
As an upgrade from a 4090 it probably doesn't make much sense, but if you're upgrading from something much older then it probably makes as much sense as a 4090 did/does.
I would also normally say there's no such thing as a bad GPU, only bad pricing. But even if the 5090 was priced the same as the 4090... the heat and power draw... yeah, I can see why folks are making Fermi comparisons.

Anyhow, the outlook is exactly what I expected, though surprisingly even ray-tracing and AI didn't get more than a 30% average boost.

Now I look forward to the hilarious trainwreck of the 5080 reviews. As always, I'll be checking back here after to see who changed their minds about buying one :D
 