
NVIDIA 4000 Series

It is clear this is a 4K card, and it does pretty well at 4K. However, at 1080p and 1440p it doesn't look appealing at all for the price. If you are an esports gamer you might be more interested in a 6900 XTXH or a 3090 Ti.




Transient spikes over the long term are very concerning. This suggests that you need a top-end PSU just to protect yourself from oddities like shutdowns, etc.
I wonder if Nvidia will publish a recommended PSU list like they used to (both AMD and Nvidia used to do this, but I haven't seen it lately). IMHO, nothing short of a Platinum-rated 1000 W PSU would do.
Emphasis on the Platinum. But I am sure some will believe a 450-500 W Bronze unit is good enough for a 4090 showing these kinds of transient spikes. :rolleyes:

The problem is this looks game/application dependent, and you won't know what will trigger this kind of behaviour unless it's exposed like in the example above.
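As a rough illustration of why transient spikes force you to oversize the PSU, here is a back-of-the-envelope sizing sketch. The spike multiplier, platform draw, and safety margin are illustrative assumptions, not measured figures:

```python
# Rough PSU headroom estimate for a GPU that shows transient power
# spikes. All numbers are illustrative assumptions, not measurements.

def recommended_psu_watts(gpu_tdp_w, spike_multiplier, rest_of_system_w, margin=1.2):
    """Size the PSU around the worst-case transient, plus a safety margin."""
    transient_peak_w = gpu_tdp_w * spike_multiplier  # brief spike above rated TDP
    return (transient_peak_w + rest_of_system_w) * margin

# e.g. a 450 W GPU briefly spiking to ~2x TDP alongside a ~200 W platform
print(recommended_psu_watts(450, 2.0, 200))  # ~1320 W
```

Which is exactly why a 450-500 W unit sized to the GPU's rated TDP alone leaves no headroom at all for these excursions.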
 
Is a 4090 overkill for 3440x1440? No 4K OLED monitors (at sensible sizes) exist yet, and I don't know how much more demanding that resolution is than non-widescreen 1440p.
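For what it's worth, the raw pixel-count gap between those resolutions is easy to work out (pixel throughput only; actual GPU load also depends on the game and settings):

```python
# Compare how demanding ultrawide 1440p is versus standard 1440p and 4K,
# purely in terms of pixels rendered per frame.

resolutions = {
    "2560x1440 (16:9 1440p)": 2560 * 1440,
    "3440x1440 (ultrawide)": 3440 * 1440,
    "3840x2160 (4K)":        3840 * 2160,
}

base = resolutions["2560x1440 (16:9 1440p)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / base:.2f}x the pixels of 16:9 1440p")
```

So ultrawide 1440p is only about 34% more pixels than 16:9 1440p, while 4K is 2.25x, which puts 3440x1440 much closer to 1440p than to 4K in raw demand.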
 
haha with people buying 100 GBP cables.. pricing's out of the window, no argument to be had there :D
embrace the new


Maybe I was hallucinating, but I could consistently see that 1% lows for the 4090 were higher than last gen's averages... and you don't need to upgrade anything other than the GPU if you game at 4K... perhaps a 5800X3D could be a good companion purchase.
Probably got more to do with the CPU it was using than the card, tbh.
 
Ars Technica talks more in-depth about testing FG, and as expected it's a mess unless you're already at high fps, plus the added latency is brutal. Nvidia is trying to serve an undercooked feature to pre-empt the flood of Ampere GPUs from crashing sales of the 4000 series (below the x90).


Nvidia gets to make a great first impression with DLSS 3 FG in part because it's operating on a powerhouse GPU like the RTX 4090. On a 4K panel, the lowest base frame rate I can work with has come in the 40s, in part because DLSS 3 FG always has some form of DLSS super resolution enabled. You cannot apply DLSS FG to raw pixel counts, which would help me drive frame rates a lot lower. Thankfully, the Nvidia Control Panel includes a handy toggle to trick my 4K panel into accepting 8K content, which meant I could push DLSS 3 FG to its limits. At such an overkill resolution, with DLSS set to "balanced," I could get my Microsoft Flight Simulator testing environment to run at 28 fps before enabling FG, which then boosted performance to 41 fps. In this scenario, FG was still surprisingly stable on the visual front, so long as I kept my mid-flight camera steady. Moving the camera quickly, on the other hand, revealed severe, YouTube-like macroblocking errors as the DLSS system tried and failed to stitch together accurate 8K images.
An arguably bigger issue in this stress test is that FG's latency hit scales with the base latency and can balloon well beyond it: up to roughly 150 ms of button-click latency in my extreme FG tests, compared to 110 ms of latency in the same 8K test with FG disabled. That's 36 percent more latency with FG enabled. An additional 8K test of an Unreal Engine 5 demo, provided by Nvidia, revealed an even more extreme jump in DLSS 3 FG latency: up to 390 ms with FG on, compared to 115 ms of latency in the same 8K sequence with FG turned off.
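The percentages quoted above check out against the raw figures; the relative hit is just (with FG minus without FG) over without FG:

```python
# Verify the article's latency deltas from its quoted measurements.

def latency_increase_pct(with_fg_ms, without_fg_ms):
    """Percentage increase in button-click latency when FG is enabled."""
    return (with_fg_ms - without_fg_ms) / without_fg_ms * 100

print(latency_increase_pct(150, 110))  # Flight Simulator 8K test: ~36% more
print(latency_increase_pct(390, 115))  # UE5 demo 8K test: ~239% more
```

The UE5 case more than tripling total latency is what makes the jump look so much worse than a fixed per-frame cost.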

(At one point, I tricked a preview DLSS 3 build of Codemasters' F1 22 to activate FG inside its VR mode. The result was a splotchy, stomach-turning mess whenever I looked to the left or right while driving. As a result, I do not see a future for DLSS 3 FG in VR.)
Will the base architecture that powers Nvidia's latest "OFA" system adequately scale down to a hypothetical RTX 4070 or RTX 4060 to ensure minimal macroblocking artifacts and latency hits for anyone who wants to apply DLSS 3 FG to 1440p games or even 1080p fare? The 4090's FG system exhibits some clear limits when it's pushed too far, and more affordable RTX 4000-series GPUs will certainly have fewer tensor cores to work with, despite them all being one generation newer than those in the RTX 3000-series GPUs. One telling spec count: The $899 RTX 4080 12GB model has less than half of the 4090's pool of tensor cores.

My educated guess at this point is that the RTX 4090 is so overpowered that we should temper our DLSS 3 FG expectations for other GPUs. Weaker cards' FG systems may have lower image interpolation quality, worse latency results, or both. If I had to bet about future, weaker Nvidia GPUs, I would assume their FG latency will be the thing to suffer because Nvidia reps have already made clear that they have not tuned DLSS 3 FG for competitive first-person shooters.
During testing, we were also given a counterintuitive suggestion for Nvidia systems: reverse the steps normally used to get Nvidia's G-Sync technology working on compatible screens and monitors. G-Sync users are usually told to enable V-Sync on the Nvidia Control Panel level, but doing this increases DLSS 3 FG's latency an additional 60–100 ms. Nvidia says it is working on a streamlined solution to let G-Sync and DLSS 3 FG play nice together.
 
If Jensen gets his way you're probably better off with a console, as he thinks graphics cards should have an average selling price equivalent to consoles, so he thinks $500 for a mid-range GPU is fair. If he were giving away a CPU, motherboard, RAM, storage, case, and a PSU he'd have a point, but I don't think he has plans to bundle all those with graphics cards.

That would not be too bad; the problem is his mid-range GPUs are not $500, they are $900.
 
We all know there will be a 4090 Ti soon enough, say for another £300 premium (£2k for 4K, yay! :rolleyes:), which will make all the 4090 AIB cards look like Colostomy Editions.
 
Is a 4090 overkill for 3440x1440? No 4K OLED monitors (at sensible sizes) exist yet, and I don't know how much more demanding that resolution is than non-widescreen 1440p.

Of course it's overkill.

This card is for 4K/120 fps+ gamers and 4K triple-screen gamers.

Everyone else buying one is probably wasting money when they should be upgrading a different component of their rig, if coming from a 3080 or better.
 