AI (frame generators) DLSS4

So we have more CUDA cores than ever, faster RAM (GDDR7) and a wider 512-bit bus, but it would appear these chips/cards can't generate enough extra frames over the previous gen, so we have DLSS 4 to boost performance.

Reviewers seemed skeptical of these features (added latency), but is there really an issue?

Single-player games would benefit, wouldn't they, whilst online games might not.

Can anyone add anything to this?
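
To put a rough number on the "faster RAM, wider bus" part: peak memory bandwidth is just bus width times per-pin data rate. A minimal sketch follows; the 28 Gbps GDDR7 speed and the 384-bit / 21 Gbps previous-gen comparison are illustrative assumptions, not quoted specs.

```python
# Back-of-the-envelope peak memory bandwidth: (bus width in bits / 8) * Gbps per pin.
# The per-pin speeds and the previous-gen config below are illustrative assumptions.

def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return bus_width_bits / 8 * gbps_per_pin

prev_gen = bandwidth_gb_s(384, 21.0)  # e.g. 384-bit GDDR6X at 21 Gbps -> ~1008 GB/s
new_gen = bandwidth_gb_s(512, 28.0)   # e.g. 512-bit GDDR7 at 28 Gbps  -> ~1792 GB/s

print(f"previous gen: {prev_gen:.0f} GB/s")
print(f"new gen:      {new_gen:.0f} GB/s ({new_gen / prev_gen:.2f}x)")
```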
I mean, you say single-player games.

I'd rather say that if you don't care how well you play, then you won't care about the latency.

Even in single-player games I like to play well, so the added latency is a no-go for me, even on a controller.
 
This whole argument is pretty pointless. Who in their right mind plays competitive online games, where latency really matters, at 4K and Ultra settings, even with a 4090? This new tech is designed to make games look pretty and playable.

This is how things will move forward, and it will have no impact whatsoever on people who need low latency: switch it off, lower your settings and run at 480 fps.
 
That's not what we've seen the gaming industry evolving into so far - it's all about cost-cutting on development, with some bling added, but the image gets more and more blurry and laggy. Optimisation is treated as a "who cares" thing, it seems, as even basic UE guidelines are not being followed in so many cases.

Also, MFG in how it works is no different from FG. And did FG really help that much so far? As has been said above, a game has to be playable already to use FG; FG doesn't change that. It's designed for very high refresh rate monitors, to turn 100 FPS into 400 FPS for example, not to help the average Joe push 30 FPS to 120 FPS, as latency will be horrible then. Upscaling people have accepted, but FG still seems to be mostly meh - it has its uses, but it's not a panacea for low performance. MFG won't change that at all in its current form. Some form of reprojection or a similar algorithm that doesn't increase latency might, but that's not here and won't be for years to come, it seems.
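
To put the "100 FPS into 400 FPS versus 30 FPS into 120 FPS" point into rough numbers, here's a toy sketch. The latency model (roughly one extra rendered frame held back so the interpolator has a "next" frame to work with) is a simplifying assumption, not a measured figure for any specific implementation.

```python
# Toy model: frame generation multiplies *displayed* frames, but responsiveness
# still tracks the *rendered* frame time. The "one extra held frame" latency
# term is a simplifying assumption, not a measurement of DLSS FG/MFG.

def fg_sketch(base_fps: float, multiplier: int) -> str:
    base_frame_ms = 1000.0 / base_fps
    displayed_fps = base_fps * multiplier
    added_latency_ms = base_frame_ms  # roughly one held-back rendered frame
    return (f"{base_fps:>5.0f} FPS base x{multiplier} -> {displayed_fps:>4.0f} FPS shown, "
            f"~{added_latency_ms:.0f} ms extra latency")

print(fg_sketch(100, 4))  # high base: ~10 ms extra - hard to notice
print(fg_sketch(30, 4))   # low base:  ~33 ms extra - very noticeable
```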
 
People talking about 8K gaming already is insane... but with the frame-gen stuff it's probably possible at a decent FPS. Is there much of a point if it's not native, though?
 
A rising tide lifts all boats: people were grumbling about 4K not that long ago, but it's led to greater adoption of 1440p. That's the thing about resolutions, they are dragged kicking and screaming into the future by the far high end.
If vendors start focusing on 8K, the bottom end creeps up. It's a tale as old as time.

I remember way back when I registered here, people were complaining bitterly about 1080p displays, saying that nobody would ever need to go beyond 720p. And quite a few were adamant about staying on 4:3 aspect ratio too. It's a funny old business.
 
Fair point, yeah. Hoping that if that's the case, then 4K actually becomes the norm everywhere.
 
This is all wonderful, but the key takeaway is that this isn't DLSS 4 and MFG in the form presented by NVIDIA - it's some future tech with reprojection that should make things better. Ergo, again, nothing in the 50 series that seems enticing to me - maybe the 60 series or later will be better.


IMO the monitor space is more interesting than fake frames. The tech around CRT emulation, and even Pulsar, makes images more fluid and cleaner, reduces blur, creates zero artifacts and has zero impact on latency.
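
On the blur point: on a sample-and-hold panel, perceived motion blur scales with how long each frame stays lit, and strobing/BFI (broadly what CRT-style emulation and Pulsar-type backlight pulsing do) shortens that without touching input latency. A rough sketch, with the 1 ms pulse width as an illustrative assumption:

```python
# Rough sketch: motion smear on a display ~ persistence (how long a frame is lit)
# multiplied by on-screen motion speed. Strobing shortens persistence; the 1 ms
# pulse width below is an illustrative assumption.

def smear_px(persistence_ms: float, speed_px_per_s: float) -> float:
    """Approximate motion smear in pixels for an object moving at the given speed."""
    return persistence_ms / 1000.0 * speed_px_per_s

speed = 1000.0  # object crossing the screen at 1000 px/s

print(f"sample-and-hold 120 Hz: ~{smear_px(1000.0 / 120, speed):.1f} px of smear")
print(f"strobed, 1 ms pulse:    ~{smear_px(1.0, speed):.1f} px of smear")
```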
 
That's not exactly accurate. 1440p has somewhat higher adoption (not by THAT much, though, and 4K is still minuscule) because prices dropped and those monitors became more mainstream. Sure, initial production and R&D were funded by "whales", but pricing is the main driver for the average Joe, it seems. Tech has to mature for it to become cheaper, and mass production drops pricing even further.
As for the old aspect-ratio complaints: 4:3 still has some advantages over 16:9. Personally, I don't touch laptops that don't have 16:10 for work; 16:9 can just leave me alone. :) A big PC monitor is another matter (hence ultrawide in my case).
 
I think what's made 4K relevant (for me at least) is the rise of 4K 120 Hz TVs becoming affordable. If I'm sat in front of a monitor, I don't really need/want it to be more than 1440p, but if I'm in the living room...
Same same. I don't need a 4K TV (55" a few metres away from the sofa - I couldn't tell the difference between 1080p and 4K!), but they got so affordable (even OLED) that I got one for features other than resolution (HDR etc.).
 
None of the games I play support FG or DLSS.
 
(Affordable) OLED has been a real game changer: HDR, high-refresh panels. I've rewatched a ton of my old favourite films as well and it's a massive upgrade. Just need affordable GPU tech that's up to the task!
 
The AI-generated frames don't respond to mouse and keyboard input, so I don't class them as real FPS. They also introduce smearing, blur and artifacts, which isn't something I want from a high-end GPU that should be delivering a top-quality image and performance experience.
 
That's very true. Still, it's not the resolution, it's the panel quality: HDR, no blur, high FPS etc. Especially if one goes for streaming, as most streaming services have rather bad 4K quality (if one can even call it 4K); since the pandemic they've cut costs on that, and in most cases, even with my face right by the monitor, I had trouble seeing any real difference between "4K" and proper 1080p.
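
To illustrate why streamed "4K" can look barely better than proper 1080p, compare bits per pixel, i.e. bitrate divided by pixels per second. The bitrates below are ballpark assumptions for the sake of the comparison, not figures from any particular service.

```python
# Bits per pixel = bitrate / (width * height * fps). The bitrates below are
# ballpark assumptions for illustration, not figures from any specific service.

def bits_per_pixel(bitrate_mbps: float, width: int, height: int, fps: float) -> float:
    return bitrate_mbps * 1_000_000 / (width * height * fps)

stream_4k = bits_per_pixel(16, 3840, 2160, 24)     # typical streamed "4K"
bluray_1080p = bits_per_pixel(30, 1920, 1080, 24)  # high-bitrate 1080p disc

print(f'streamed "4K":     ~{stream_4k:.2f} bits/pixel')
print(f"1080p Blu-ray-ish: ~{bluray_1080p:.2f} bits/pixel")
```

Far fewer bits per pixel means heavier compression, which is where the softness comes from despite the higher nominal resolution.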
 
For activities where what's displayed is a product of user input (like an FPS), I think that in the long term frame gen is ultimately a stop-gap technology, trying to bridge the gap until raw RT compute can do without it.

That said, I think the potential spin-off is that non-interactive content (watching videos) will benefit from the likes of frame gen to display higher frame rates.
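
As a toy illustration of why generated in-between frames are easier to judge for video than for input-driven gameplay, and where ghosting artifacts come from: a naive "generated" frame made by averaging two frames of a moving square just shows the square faintly in both places, whereas proper interpolation needs motion information. Everything below is a made-up minimal example, not how any shipping frame generator works.

```python
import numpy as np

# Minimal made-up example: a square moves from x=10 to x=30 between two "real"
# frames. A naive averaged in-between frame ghosts the square in both places;
# motion-aware interpolation would instead place one square at x=20.

def frame_with_square(x: int, size: int = 8, height: int = 32, width: int = 64) -> np.ndarray:
    frame = np.zeros((height, width), dtype=np.float32)
    frame[12:12 + size, x:x + size] = 1.0
    return frame

frame_a = frame_with_square(x=10)        # first real frame
frame_b = frame_with_square(x=30)        # next real frame
naive_mid = 0.5 * (frame_a + frame_b)    # averaged "generated" frame: ghosting
ideal_mid = frame_with_square(x=20)      # what motion-aware interpolation aims for

print("naive blend peak brightness:", naive_mid.max())         # 0.5 -> two faint squares
print("lit pixels, naive blend:", int((naive_mid > 0).sum()))  # twice as many as it should be
print("lit pixels, ideal frame:", int((ideal_mid > 0).sum()))
```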
 