Didn't see that at all through the nerd rage, sorry
Haha no worries
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
> At this time in 14 days, the DPD man will be driving away, having taken a photo of my front door without even attempting to deliver my new 5090, and I'll be punching the walls and screaming blue murder (happens every time!)

Do what I do: redirect it to the local post office. I've done this for years and it's never been a problem. The hassle of picking it up is far outweighed by not having to deal with lazy drivers.
DLSS upscaling is really good, especially when running Quality and, to a lesser extent, Balanced. You render the game at a lower resolution and DLSS then upscales it to a higher res. I'm not technical enough, but my assumption (which could be totally wrong) is that DLSS can cause the artifacting, and frame generation then makes it worse because it uses that image as the base for the frames it creates.
We'll have to see how it is; we have limited knowledge. To me, Cyberpunk generally looks good from what we've seen in the videos, but in PC Centric's live chat he said he wasn't very impressed with Black Myth: Wukong and that the artifacting on the hair was bad.
Yeah, I understand how DLSS and frame gen work separately. I just wasn't sure whether frame gen generates frames from the native image or the DLSS image when both are turned on. I assume it generates frames from the DLSS image, which means any artifacting/errors there would get even worse with FG, on top of the artifacting FG causes by itself anyway.
Frame gen uses AI to add frames in between two normally rendered frames: the AI tries to predict what happens in between and drops in extra frames. The time gap between the two normally rendered frames, which is where your latency and normal FPS come from, will actually increase slightly because inserting the extra frames has a slight cost. The AI frames are pretty bad and have a lot of artifacts, but if your normal frame rate is high enough, that will mask the artifacts to some extent.
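A rough way to picture the interpolation timing described above (a sketch, not NVIDIA's actual pipeline): rendered frames arrive at the base frame rate, and each generated frame is displayed halfway between a pair of rendered frames, which means the newest rendered frame has to be held back until the in-between frame is ready.

```python
# Hypothetical sketch of 2x frame interpolation timing. Rendered frames arrive
# every base_ms; a generated frame is shown halfway between each pair. Holding
# back the newest rendered frame to interpolate is where the latency cost lives.

base_fps = 60
base_ms = 1000 / base_fps           # ~16.7 ms between rendered frames

display = []
for i in range(3):                  # build a short display timeline
    t = i * base_ms
    display.append((round(t, 1), "rendered"))
    display.append((round(t + base_ms / 2, 1), "generated"))

print(display)
```

The display stream runs at twice the base FPS, but your inputs still only affect the rendered frames, which is why the game "feels" like the base frame rate.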
That in itself doesn't say anything unless you also state resolution. DLSS Quality at 1440p is just 960p internally, which has plenty of issues in modern games that DLSS can't fix (TAA smearing details on effects like foliage, textures, etc., plus heavy noise and missing detail in RT). Anything below Quality and you quickly end up below 720p, which is rather tragic IMHO. 4k is a different story.
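For reference, the internal resolutions behind those numbers fall out of DLSS's commonly documented per-axis render-scale factors (roughly 66.7% for Quality, 58% Balanced, 50% Performance, 33.3% Ultra Performance); a quick sketch:

```python
# Commonly documented DLSS per-axis render scales; this just shows what
# resolution each mode actually renders at before upscaling.
scales = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_res(width, height, mode):
    s = scales[mode]
    return round(width * s), round(height * s)

print(internal_res(2560, 1440, "Quality"))      # 1440p Quality renders at ~960p
print(internal_res(2560, 1440, "Performance"))  # ~720p
print(internal_res(3840, 2160, "Quality"))      # 4k Quality renders at ~1440p
```

This is why Quality at 4k holds up so much better: the upscaler starts from ~1440p instead of ~960p.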
This also depends on resolution: if DLSS is upscaling from 960p (or lower!), already semi-smearing and with lower detail, and then you add FG on top, it just amplifies those issues and the resulting quality is... quite meh, latency aside. At 4k it's a different story, where the latency hit is the biggest problem.
It's from the DLSS image, as FG is applied after DLSS has already done its job, to the final image. Then again, Nvidia said they will give all DLSS-enabled cards a new algorithm that should much improve the quality; we'll see how that works out.
I was thinking more about the DLSS package in general being able to be upgraded, for the main functions it had at the time, all the way back to the 2xxx series. The games coverage, ease of upgrading, and constant improvement represent (to me) that fine wine.
One thing to take into account, though, is that the 5090 will be running DLSS 4.0, which improves FPS by 15% or so even for 2x FG, vs the 4090 running the older DLSS 3.0.
The biggest problem with fake frames is the very concept. The whole point of turning up resolution, effects, level of detail, shadow quality, lighting quality, and so on, is so that each rendered and displayed frame looks better.
Why would I then insert garbage frames in between them? How much of that extra fidelity is kept in each artificial frame? By definition it's lower quality than the rendered frame.
And since when did latency become OK to compromise on? Around the time I bought my first 144Hz FreeSync monitor, 11-ish years ago, the gaming community wouldn't shut up about latency. 'Set the frame limiter to your max refresh minus 3 frames for the best latency', 'it's all about frametime', and frame pacing was in there as well (SLI was still around).
Maybe I'm getting old, but I don't get it anymore. It seems more important to play at 'Max' settings with entirely inappropriate real-time lighting, at fake resolutions with fake frames...
(I'm not a bitter old man, I swear )
Thanks for confirming. I assumed it was that way but wasn't 100% sure.
Who cares! 200fps is 200fps. Bigger bar better. WOO!
60 FPS is 16.7ms per frame, 144Hz is 6.9ms, and 240Hz is 4.2ms. FG at 240 with a 60fps base will be around 20ms, so almost a 5x increase in latency compared to running native 240Hz.
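The arithmetic behind those numbers is just frame interval = 1000/fps; the ~20ms figure for FG on a 60fps base is an estimate, and real input latency also includes engine and display overhead, which this sketch ignores:

```python
# Frame-interval component of latency only; the FG figure is an assumed
# estimate (60 fps base + frame generation overhead), not a measurement.

def frametime_ms(fps):
    return 1000 / fps

for fps in (60, 144, 240):
    print(f"{fps} fps -> {frametime_ms(fps):.1f} ms per frame")

fg_latency_ms = 20                      # assumed: 60 fps base with FG to 240
ratio = fg_latency_ms / frametime_ms(240)
print(f"FG 240 vs native 240: ~{ratio:.1f}x the latency")
```

So even though both show 240 frames per second, the generated-frame version responds like something far closer to its 60fps base.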
> Would matter to me if I played FPS games online or something. For single-player games, many of which I play with a controller, it's not so much of an issue.

Play a first-person shooter on your AW3225QF with 50ms latency on M&K and tell me, with a straight face, that it feels 'not that bad'. Go on.
I come from 8-bit cartridge console games where everything was crap. 50ms latency really is not that bad imo.
Probably helped that TVs back in the 50s also had a ton of delay & latency too.
I'm interested in a comparison of latency at true 200fps vs frame generated 200fps. Most reviewers fail to highlight the fact that there will be a pretty big difference. While you may see 200 frames, the game will be similar in feel to the base frame rate. A true 200fps will feel much more responsive.