only allowing FE reviews is so weird
Because it will show that the AIB cards are overpriced lemons, but the FEs will all be sold out by then.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
only allowing FE reviews is so weird
Probably got more to do with the CPU it was using rather than the card, tbh.

Haha, with people buying £100 cables, pricing's out of the window; no argument to be had there.
embrace the new
Maybe I was hallucinating, but I could consistently see the 4090's 1% lows coming in higher than last gen's averages. And you don't need to upgrade anything other than the GPU if you game at 4K; perhaps a 5800X3D could be a good companion purchase.
PC gaming is becoming the ultimate **** take; a single component is basically the price of a half-decent PC now.
Nvidia gets to make a great first impression with DLSS 3 FG in part because it's operating on a powerhouse GPU like the RTX 4090. On a 4K panel, the lowest base frame rate I can work with has come in the 40s, in part because DLSS 3 FG always has some form of DLSS super resolution enabled. You cannot apply DLSS FG to raw pixel counts, which would help me drive frame rates a lot lower. Thankfully, the Nvidia Control Panel includes a handy toggle to trick my 4K panel into accepting 8K content, which meant I could push DLSS 3 FG to its limits. At such an overkill resolution, with DLSS set to "balanced," I could get my Microsoft Flight Simulator testing environment to run at 28 fps before enabling FG, which then boosted performance to 41 fps. In this scenario, FG was still surprisingly stable on the visual front, so long as I kept my mid-flight camera steady. Moving the camera quickly, on the other hand, revealed severe, YouTube-like macroblocking errors as the DLSS system tried and failed to stitch together accurate 8K images.
An arguably bigger issue in this stress test is that FG's latency hit multiplies the base latency rather than adding a fixed cost: up to roughly 150 ms of button-click latency in my extreme FG tests, compared to 110 ms of latency in the same 8K test with FG disabled. That's 36 percent more latency with FG enabled. An additional 8K test of an Unreal Engine 5 demo, provided by Nvidia, revealed an even more extreme jump in DLSS 3 FG latency: up to 390 ms with FG on, compared to 115 ms of latency in the same 8K sequence with FG turned off.
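For reference, the relative latency hits in the two tests above work out like this (a quick sketch using only the millisecond figures quoted; the test labels are mine):

```python
# Button-click latency figures quoted above, in milliseconds: (FG off, FG on).
tests = {
    "Microsoft Flight Simulator, 8K": (110, 150),
    "Unreal Engine 5 demo, 8K": (115, 390),
}

for name, (fg_off, fg_on) in tests.items():
    increase = (fg_on - fg_off) / fg_off * 100
    print(f"{name}: {fg_off} ms -> {fg_on} ms (+{increase:.0f}%)")
```

The added cost is roughly 40 ms in one test and 275 ms in the other, which is why a fixed-overhead mental model of FG's latency does not hold up.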
(At one point, I tricked a preview DLSS 3 build of Codemasters' F1 22 into activating FG inside its VR mode. The result was a splotchy, stomach-turning mess whenever I looked to the left or right while driving. As a result, I do not see a future for DLSS 3 FG in VR.)
Will the base architecture that powers Nvidia's latest "OFA" system adequately scale down to a hypothetical RTX 4070 or RTX 4060 to ensure minimal macroblocking artifacts and latency hits for anyone who wants to apply DLSS 3 FG to 1440p games or even 1080p fare? The 4090's FG system exhibits some clear limits when it's pushed too far, and more affordable RTX 4000-series GPUs will certainly have fewer tensor cores to work with, despite them all being one generation newer than those in the RTX 3000-series GPUs. One telling spec count: The $899 RTX 4080 12GB model has less than half of the 4090's pool of tensor cores.
My educated guess at this point is that the RTX 4090 is so overpowered that we should temper our DLSS 3 FG expectations for other GPUs. Weaker cards' FG systems may have lower image interpolation quality, worse latency results, or both. If I had to bet on future, weaker Nvidia GPUs, I would assume their FG latency will be the thing to suffer, because Nvidia reps have already made clear that they have not tuned DLSS 3 FG for competitive first-person shooters.
During testing, we were also given a counterintuitive suggestion for Nvidia systems: reverse the step normally used to get Nvidia's G-Sync technology working on compatible monitors. G-Sync users are usually told to enable V-Sync at the Nvidia Control Panel level, but doing this increases DLSS 3 FG's latency by an additional 60–100 ms. Nvidia says it is working on a streamlined solution to let G-Sync and DLSS 3 FG play nicely together.
Interesting. HU said he did all his testing on a Corsair 850W PSU and had no issues. Video starts at 30:48.
If Jensen gets his way, you're probably better off with a console, as he thinks graphics cards should have an average selling price equivalent to consoles; he thinks $500 for a mid-range GPU is fair. If he were giving away a CPU, motherboard, RAM, storage, case, and a PSU he'd have a point, but I don't think he has plans to bundle all those with graphics cards.
We will see just how bad the cost of living crisis is tomorrow at 2:00:30pm
I hope they don't sell more than 40, I really do. This needs to stop.
Are people just incapable of reading through this thread?

Guys, what time does the RTX 4090 FE go on sale on the Nvidia website in the UK tomorrow?
4090 overkill for 3440x1440? No 4K OLED monitors (at sensible sizes) exist yet, and I don't know how much more demanding that res is than non-widescreen 1440p.
Needs to, but won't. The pioneers of the 1 grand "gaming video card" are edging closer to the 2 grand mark with each new release.
Superflower 650W Leadex Gold with a 3080, 5950X, one each of NVMe, SATA SSD, and mechanical drive, nine fans, and a 360mm AIO on an X570 Tomahawk. Zero issues, even in CP2077. And that's a 10GB 3080!

But but but but..... people said you would need to be upgrading to new PSUs too!!!!! Sound familiar, @MissChief, on your system using a 650W PSU?