NVIDIA 4000 Series

CPU limited at 4K with only a 17% performance bump. Does that seem off to anyone else?

Yes, particularly since they're using an i9-12900K. It's rather unusual for a game to be CPU limited on such a powerful CPU.

It's obviously more disinformation from Nvidia, but I'm mildly curious as to the details. My guess is that they found (or created) specific settings that place a wildly unrealistic load on the CPU in that game with DLSS 2 performance mode and not with DLSS 3. Either that or they're just showing fake numbers to go with their fake frames. Lying would be easier, but slightly riskier than being misleading without quite lying outright.
 
Seems very suss to be CPU limited at 4K.

DLSS 2 giving only 17% when it's normally like a 60-100% improvement or something.
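For what it's worth, a CPU bottleneck really can squash the DLSS uplift that much. Here's a minimal sketch with made-up frame times (the 12 ms CPU cost and 14/7 ms GPU costs below are purely illustrative, not measured from the game):

```python
# Hypothetical frame times: once the CPU cost per frame dominates, halving the
# GPU work (which is all an upscaler does) barely moves the final frame rate.

def fps(cpu_ms, gpu_ms):
    # Each frame waits for whichever of the CPU or GPU takes longer.
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 12.0          # assumed CPU cost per frame (game logic, draw calls)
native_gpu_ms = 14.0   # assumed GPU cost rendering native 4K
dlss2_gpu_ms = 7.0     # assumed GPU cost rendering a lower internal resolution

native = fps(cpu_ms, native_gpu_ms)   # ~71 fps, GPU-bound
dlss2 = fps(cpu_ms, dlss2_gpu_ms)     # ~83 fps, now capped by the CPU
print(f"native {native:.0f} fps, DLSS 2 {dlss2:.0f} fps, "
      f"uplift {dlss2 / native - 1:.0%}")   # ~17% despite the GPU doing half the work
```

Whether frame times like those actually reflect Spiderman on a 12900K is, of course, exactly what's in question.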

I'm on a 5800X, playing at 4K max settings with RT, and Spiderman is the only DLSS game where it makes virtually no difference versus native for me (3080 Suprim X). My CPU is getting utilised miles above any other game; not sure if it's a glitch or what. You'd think Cyberpunk would be similar if it wasn't a glitch, but I get a great uplift there.
 
I don't buy the whole CPU limited thing either; something is extremely off about all this, and the embargo on framerates until just before release is wtf, yet only one set of fps figures has been released, in the form of a small video from nVidia.

I don't trust this one bit. There is no way a 12900K is somehow CPU limited, let alone any of the other 12th gen chips.

For some reason the press hype about this doesn't sit right, like something fishy is afoot. There just doesn't seem to be any genuine excitement from the press about this, and all we are seeing is the mass press repeating the exact same script Nvidia have given them.

I have a bad feeling about this...

Maybe it's just Spiderman and other titles that have something engineered into the code for such a time as this... The long game, so to speak?
 
DLSS 3 is stupid anyway. Just decrease some settings to keep the frames high and use DLSS 2 => problem solved, without the latency increase and the artifacts that come with DLSS 3.
Why use DLSS 3? To brag that you play at "ultra settings"? If you don't notice the artifacts because of the high frame rates, you won't notice you're playing at high settings instead of ultra either.
I heard they are already working on DLSS 4, and the next frame generation will be so advanced you won't even have to start the game to use it. Just think about what you want to play and Nvidia does the rest.
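On the latency point above: generated frames don't sample input, and interpolation needs the next rendered frame before it can display anything, so displayed fps and responsiveness come apart. A crude, illustrative model follows (the numbers are assumptions, not measurements of DLSS 3 itself):

```python
# Simplistic frame-generation model: one interpolated frame between each rendered pair.
rendered_fps = 60.0
frame_ms = 1000.0 / rendered_fps     # ~16.7 ms between rendered frames

displayed_fps = rendered_fps * 2     # interpolated frames double what the monitor shows

# The interpolated frame needs the *next* rendered frame to exist, so presentation is
# held back by roughly one rendered-frame interval (ignoring the generation cost itself).
added_latency_ms = frame_ms

print(f"displayed ~{displayed_fps:.0f} fps, input still sampled at {rendered_fps:.0f} fps")
print(f"extra latency on the order of {added_latency_ms:.0f} ms versus plain upscaling")
```

The lower the rendered frame rate, the bigger that extra interval gets, which fits the intuition later in the thread that frame generation would be at its worst around 60 fps.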
 
I don't buy the whole CPU limited thing either; something is extremely off about all this, and the embargo on framerates until just before release is wtf, yet only one set of fps figures has been released, in the form of a small video from nVidia.

I don't trust this one bit. There is no way a 12900K is somehow CPU limited, let alone any of the other 12th gen chips.

For some reason the press hype about this doesn't sit right, like something fishy is afoot. There just doesn't seem to be any genuine excitement from the press about this, and all we are seeing is the mass press repeating the exact same script Nvidia have given them.

I have a bad feeling about this...

Maybe it's just Spiderman and other titles that have something engineered into the code for such a time as this... The long game, so to speak?
Watch the video I posted above; this particular game hits the system hard. He is seeing a noticeable difference going from PCIe 3.0 to 4.0 and says memory bandwidth has a noticeable impact as well. With RT on at 4K it seems to be at its worst.
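A quick back-of-the-envelope on why the PCIe generation could plausibly show up in a streaming-heavy game. The link speeds below are the standard theoretical maxima for an x16 slot; the 0.5 GB asset burst is a made-up figure, not something measured in Spiderman:

```python
# Time to stream a hypothetical burst of assets over each link, versus a 60 fps budget.
pcie3_gb_s = 15.75           # PCIe 3.0 x16, theoretical
pcie4_gb_s = 31.5            # PCIe 4.0 x16, theoretical
burst_gb = 0.5               # assumed burst of textures/geometry while traversing quickly
frame_budget_ms = 1000 / 60  # ~16.7 ms

for name, bw in (("PCIe 3.0 x16", pcie3_gb_s), ("PCIe 4.0 x16", pcie4_gb_s)):
    transfer_ms = burst_gb / bw * 1000
    print(f"{name}: {transfer_ms:.1f} ms for the burst, "
          f"~{transfer_ms / frame_budget_ms:.1f}x a 60 fps frame budget")
```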
 
DLSS 3 is stupid anyway. Just decrease some settings to keep the frames high and use DLSS 2 => problem solved, without the latency increase and the artifacts that come with DLSS 3.
Why use DLSS 3? To brag that you play at "ultra settings"? If you don't notice the artifacts because of the high frame rates, you won't notice you're playing at high settings instead of ultra either.
I heard they are already working on DLSS 4, and the next frame generation will be so advanced you won't even have to start the game to use it. Just think about what you want to play and Nvidia does the rest.

Odd, I know. "Just turn down a setting or two" was the standard argument in other threads; it certainly isn't worth breaking the wallet for.
 
DLSS 3 is stupid anyway. Just decrease some settings to keep the frames high and use DLSS 2 => problem solved, without the latency increase and the artifacts that come with DLSS 3.
Why use DLSS 3? To brag that you play at "ultra settings"? If you don't notice the artifacts because of the high frame rates, you won't notice you're playing at high settings instead of ultra either.
I heard they are already working on DLSS 4, and the next frame generation will be so advanced you won't even have to start the game to use it. Just think about what you want to play and Nvidia does the rest.
This is exactly how I see it; I can imagine DLSS 3 being utterly horrendous at 60fps.
It seems to need really high frame rates for the artifacts not to be noticeable.
I love DLSS 2 on the quality setting; to my eye it's better than native with some crap AA solution on top. I'm frightened they'll move away from that.
 
The native resolution numbers look pretty good compared to a 3090 Ti, if that is run at 4K max settings: a 3090 Ti seems to do between 60-70 fps, whereas we're seeing 100-125 fps here...
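For a rough sense of what those quoted ranges imply (pairing the low ends and the high ends of each range; both sets of numbers are as quoted above, not measured here):

```python
# Rough uplift implied by the quoted figures.
old_low, old_high = 60, 70      # 3090 Ti at 4K max, per the post above
new_low, new_high = 100, 125    # figures shown in the video

print(f"native-res uplift roughly {new_low / old_low - 1:.0%} "
      f"to {new_high / old_high - 1:.0%}")   # ~67% to ~79%
```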

 
Watch the video I posted above; this particular game hits the system hard. He is seeing a noticeable difference going from PCIe 3.0 to 4.0 and says memory bandwidth has a noticeable impact as well. With RT on at 4K it seems to be at its worst.
Given it's a PS4 game, it looks incredible, but given it used to run on Jaguar cores and the equivalent of something like a 7850, it must be optimised like absolute dogcrap outside of the original hardware config.
 
The way I see it, DLSS 3 is more of a marketing tool. I don't know how other people will see it, but it's a way to say "hey, we can get 4x the performance compared to the competition", and while that's technically not correct as it's not apples to apples, they are right. And doesn't it also seem that nVidia would then be able to lean on this while producing cheaper and cheaper products, charging more and more, and slapping DLSS 3 on them to get you the performance?
 
Watch the video I posted above; this particular game hits the system hard. He is seeing a noticeable difference going from PCIe 3.0 to 4.0 and says memory bandwidth has a noticeable impact as well. With RT on at 4K it seems to be at its worst.
Surely this just highlights poor optimisation in the game's development then, since barely any other game exhibits the same issue. This game and the new Spiderman are the only PC games using the Insomniac engine too, so I would chalk it up as a one-off TBH. This satisfies me now. Plus, I didn't find Spiderman all that gripping anyway lol.

To me, the benchmark game for today is Cyberpunk 2077 in terms of how hard it hits the whole system, especially with RTX/DLSS enabled. It is the only game where I have seen healthy CPU usage when the GPU is at 99%. Some games barely even touch the CPU; Days Gone, for example, sits in the single digits for CPU utilisation when the GPU is maxing out at 98/99%. As fluid as melted butter.

With that in mind, those % variances for the 4090 appear healthy for Cyberpunk against a 3090 Ti, which I know is a bit above my 3080 Ti, so I have some idea of a comparison.

Edit* I just remembered, actually: here's how Spiderman "utilised" my 12700KF lol. I did wonder why it was spiking so high regularly... not even Night City's busiest areas during the day hit the CPU above 35% for comparison, and RED Engine has a LOT more going on compared to the comparatively empty city in Spiderman.

[Attached screenshots: 20220813_014357.jpg, Screenshot 2022-08-13 015717.jpg]
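For anyone wanting to reproduce that kind of side-by-side CPU/GPU utilisation log rather than eyeballing overlays, a minimal sketch is below. It assumes the third-party psutil and nvidia-ml-py (pynvml) packages are installed and an NVIDIA GPU is present; run it in a terminal alongside the game.

```python
import psutil   # pip install psutil
import pynvml   # pip install nvidia-ml-py

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system

try:
    while True:
        cpu = psutil.cpu_percent(interval=1.0)                  # averaged over the last second
        util = pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu    # GPU busy percentage
        print(f"CPU {cpu:5.1f}%   GPU {util:3d}%")
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```

Note that the overall CPU figure averages across all cores, so a game hammering only a handful of threads can still read deceptively low.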
 
The way I see it, DLSS 3 is more of a marketing tool. I don't know how other people will see it, but it's a way to say "hey, we can get 4x the performance compared to the competition", and while that's technically not correct as it's not apples to apples, they are right. And doesn't it also seem that nVidia would then be able to lean on this while producing cheaper and cheaper products, charging more and more, and slapping DLSS 3 on them to get you the performance?
Agree. Reviewers will need two charts from now on: fps and ffps (fake frames per second).

Honestly, I've played all three games shown (CP, SM and P2) at 120 fps. They all felt silky smooth. Sure, I wasn't playing with MAX-RT(tm), but who really cares. Not really sure that sticking an extra generated frame in between the existing ones - particularly with an added latency cost - is going to sway me towards the 4000 series (which I can't afford anyway... :))
 
xx70 is just a name. There's no defining feature of an xx70 other than being more expensive/faster than the xx60 and cheaper/slower than the xx80. I agree that having two quite distinct cards called a 4080 is needlessly confusing, but that doesn't make one of them an "xx70" or a "straight rip-off". As for price/performance, it looks like both 4080s are much better value than 30xx cards at similar prices (based on what Nvidia's claiming), and they'll almost certainly offer better bang for buck than the 4090 in real-world usage.
Ahhhhahahahaha, how much is Nvidia paying you?
 