
New DLSS 3 - Only Software Locked?

I'm not sure about that. £2K is a significant outlay for a casual gamer. What I would struggle with is finding a reason to use it. My 3090 barely struggles, though I could use the extra resources. But in what scenario would you want to boost graphical detail to a level where your latency increases so much? It just doesn't seem worth it. Unless, like suggested above, they iron out the issues?
I guess a fair chunk of buyers are in the must-have category; it doesn't matter what its shortcomings are, plus half wouldn't have a clue about latency or DLSS 3 issues. To sum up, it's bragging rights: some just want the best. Then you have the full-time gamers, and I can see a case for the full-time gamers wanting the best FPS. TBH, as with most GPUs at launch, buyers tend to be beta testers; there are always bugs to iron out, as with pretty much most PC tech these days. :cry:
I'm happy with my little old 3060 Ti. I'm only a very part-time gamer these days and TBH I wouldn't notice this stuff. I've got a family and a job, so that limits my time. :cry:
 
In terms of the massive latency spikes to 100ms+, yes. In terms of avoiding a 10-15ms increase, no.

dJIVIji.png


But yes, DF = shills :rolleyes:

Those options there, I would be going with the DLSS 2 ones, as latency is low and the FPS is within my 4K 120Hz screen limit. 155fps would mean tearing, so that's a nope, and the input latency looks awful with vsync. Hopefully they can sort that out in time, as in theory the tech is neat. Or someone can sell me a 240Hz 4K OLED, but that might have other issues when it comes to bandwidth :p
 
I guess a fair chunk of buyers are in the must-have category; it doesn't matter what its shortcomings are, plus half wouldn't have a clue about latency or DLSS 3 issues. To sum up, it's bragging rights: some just want the best. Then you have the full-time gamers, and I can see a case for the full-time gamers wanting the best FPS. TBH, as with most GPUs at launch, buyers tend to be beta testers; there are always bugs to iron out, as with pretty much most PC tech these days. :cry:
I'm happy with my little old 3060 Ti. I'm only a very part-time gamer these days and TBH I wouldn't notice this stuff. I've got a family and a job, so that limits my time. :cry:

In fairness, the 3060 Ti is probably the best price/performance of the lot, so you haven't done badly :)
 
Alex is like Jensen's bodyguard. You can't say anything bad about Nvidia without him complaining. :D

Well he's not wrong is he? :p

You could say HUB are nitpicking, making it look like it's always that bad and that DLSS 3/FG will be like that image of theirs throughout every moment of gameplay; same with their YT one:

J11sR5Q.png


The reality is that is exactly what they have done: they picked specific scenes, whereas you won't even notice the issues/fake frames most of the time (Tim even went on to say in his video that this is an unrealistic situation and that it can be genuinely hard to notice the issues in normal gameplay). That F1 22 shot came from them switching between different camera angles, i.e. completely throwing off the tech due to the nature of how it works: it relies on the previous and the next frame to generate the middle, fake one, so it's no wonder you get frames like that (Tim even said this himself). Alex even showed that in his footage, so it's not a case of him hiding/ignoring it.
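
To illustrate why a camera cut trips it up, here's a toy sketch (my own simplification, not how DLSS 3 is actually implemented - the real thing uses game motion vectors and Ada's optical flow accelerator rather than a plain blend):

```python
import numpy as np

def naive_middle_frame(prev_frame: np.ndarray, next_frame: np.ndarray) -> np.ndarray:
    """Fake a 'middle' frame by averaging the surrounding real frames."""
    blended = (prev_frame.astype(np.float32) + next_frame.astype(np.float32)) / 2.0
    return blended.astype(np.uint8)

# Same shot: the two inputs are nearly identical, so the blend looks plausible.
# Across a camera cut (like the F1 22 replay angles): the inputs share nothing,
# so the generated frame is a smeared mix of two unrelated images - exactly the
# kind of frame that looks terrible when you freeze on it for a thumbnail.
```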

Ultimately, we know why they have gone for that thumbnail though i.e. more clicks/views.

As both Alex and Tim have stated though, to get the best experience from FG, you want to be pushing 100+ fps.

Obviously there is still a lot of work to be done on FG/dlss 3 as all reviewers have said, mostly around the latency issues with vsync turned on and HUD elements.

Those options there, I would be going with the DLSS 2 ones, as latency is low and the FPS is within my 4K 120Hz screen limit. 155fps would mean tearing, so that's a nope, and the input latency looks awful with vsync. Hopefully they can sort that out in time, as in theory the tech is neat. Or someone can sell me a 240Hz 4K OLED, but that might have other issues when it comes to bandwidth :p

Yup, apparently it has been put forward to Nvidia, but they don't think they can fix it as it is just the way vsync works. I think Alex's recommendation is quite good: basically use higher graphical settings and/or DLSS Quality, or even native res, to keep the FPS below your screen's refresh rate.

Or do your last option :D
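
For anyone wondering what "below your screen refresh rate" works out to in practice, a minimal sketch (the exact headroom figure is my own rule-of-thumb assumption, not something from Alex's video):

```python
def fps_target_below_refresh(refresh_hz: int, headroom_fps: int = 3) -> int:
    """Frame rate to aim for so the frame-generated output stays under the refresh rate."""
    return refresh_hz - headroom_fps

# e.g. on a 4K 120Hz screen, tune settings / DLSS quality / native res so the
# FG output lands around this number rather than overshooting the refresh rate:
print(fps_target_below_refresh(120))  # 117
print(fps_target_below_refresh(144))  # 141
```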
 
dJIVIji.png


The DF video compared DLSS 2 vs DLSS 3 latencies with v-sync off at 120Hz, which of course shows DLSS 3 at 155fps with a 35ms latency. However, they did not show what the latency of a true 155fps would be as a fair comparison. I bet the latency would be around 15ms, which is much better than the 35ms frame generation achieves. I suspect DLSS 3 won't feel as good as true 155fps, but I haven't seen any reviews mention this.
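
Rough numbers behind that guess (back-of-envelope only; real end-to-end latency depends on the whole input/render/display chain, and the queue depth here is my assumption):

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds per frame at a given frame rate."""
    return 1000.0 / fps

per_frame = frame_time_ms(155)   # ~6.5 ms per frame at a true 155fps
print(round(per_frame, 1))

# Assuming roughly 2-3 frames in flight between input and display, true 155fps
# would land somewhere around 13-20 ms - in line with the ~15 ms guess above and
# well under the 35 ms DF measured with frame generation at the same on-screen fps.
print(round(2 * per_frame, 1), round(3 * per_frame, 1))
```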
 
dJIVIji.png


The DF video compared DLSS 2 vs DLSS 3 latencies with v-sync off at 120Hz, which of course shows DLSS 3 at 155fps with a 35ms latency. However, they did not show what the latency of a true 155fps would be as a fair comparison. I bet the latency would be around 15ms, which is much better than the 35ms frame generation achieves. I suspect DLSS 3 won't feel as good as true 155fps, but I haven't seen any reviews mention this.

I massively notice anything approaching 35ms or above - I had one of the first 4K monitors, where the minimum button-to-pixel latency was around 50-60ms at the very best, and it was horrific. I don't think DLSS 3 in most cases offers anything better than a 60FPS/Hz experience regardless of the frame rate numbers, and in some cases it can be worse than that (some might not notice it so much, or be fooled for a while, but I think in the longer run more will notice it if they have a low-latency experience to compare it against).
 
Ultimately, we know why they have gone for that thumbnail though i.e. more clicks/views.
I don't even notice the issues until he points out exactly where the viewer should be looking.

I bet he had to go through in slow motion multiple times to even notice the difference himself

Obviously, anything that loses image clarity is bad; even the slightest blurring of the image and the smallest detail loss should be avoided in an ideal situation.

I'm sure people won't be complaining about it if the alternative is running at a substandard FPS, though.


I think benchmark comparisons of cards between AMD and Nvidia should be done without any trickery, though.
 
I don't even notice the issues until he points out exactly where the viewer should be looking.

I bet he had to go through in slow motion multiple times to even notice the difference himself

Exactly, and here's the funny thing - his exact words when talking about those awful issues:

Obviously this is an unrealistic situation, because in a game you're not stopping to view each frame; the goal of DLSS 3 is to hide these frames between the real rendered frames and fake its way up to a higher level of smoothness. Provided the frame rate is high enough and these dodgy generated frames are shown only for a short enough time, you'll probably not notice what is going on. And that is true: it can genuinely be hard to spot all of these issues that appear so glaring when you view the individual generated frames. The big problem with FG is when you are using it at a low frame rate.

:cry:

Some more end user/consumer videos and thoughts are starting to come through now (such as bang4buck), and overall opinion seems to be pretty good. Obviously it's not perfect, but as long as the FPS is over 100, it does exactly what it set out to achieve from higher FPS: a smoother/more fluid gaming experience (which both Tim from HUB and Alex from DF have stated). Ultimately this is for people with high refresh rate displays, though, so it will be a wasted feature on those with 60-100Hz displays.
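
The "shown only for a short enough time" point is really just frame-time arithmetic; a quick sketch of how long each generated frame actually sits on screen at different output rates:

```python
def generated_frame_on_screen_ms(output_fps: float) -> float:
    """How long any single (real or generated) frame is displayed at a given output rate."""
    return 1000.0 / output_fps

for fps in (60, 100, 120, 160):
    print(fps, "fps ->", round(generated_frame_on_screen_ms(fps), 1), "ms per frame")

# At 120-160 fps output a dodgy generated frame is only visible for ~6-8 ms, which is
# why it's hard to spot in motion; at 60 fps output it lingers for ~17 ms, which is
# why FG from a low base frame rate looks (and feels) so much worse.
```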
 
Probably more appropriate to place this here too, as I'm sure someone who knows more about modding games, or finding their way around back-end files, will be able to get this working; it appears that the frame generation option could potentially be turned on in Spider-Man:


If people can mod FSR games to use DLSS, I'm sure there is a way to do the same with FG...
 
The "DLSS 3 only for 4xxx cards" line always smacked of Nvidia trying to cover up the fact that it does work on older cards. It seems to be a major selling point for the 4xxx that they are pushing, and if it works remotely decently on 3xxx or 2xxx they are going to get so much stick for it.
Same as "G-Sync won't work without their hardware module" - that wasn't quite true. If there is a will, people will find a way.
 
The "DLSS 3 only for 4xxx cards" line always smacked of Nvidia trying to cover up the fact that it does work on older cards. It seems to be a major selling point for the 4xxx that they are pushing, and if it works remotely decently on 3xxx or 2xxx they are going to get so much stick for it.
Same as "G-Sync won't work without their hardware module" - that wasn't quite true. If there is a will, people will find a way.

TBF, with regards to the G-Sync hardware module, IIRC their GPUs at the time didn't have the hardware capable of using adaptive sync, aka G-Sync Compatible. That, and the hardware module was ahead of the game for a long time in terms of quality: FreeSync when it first launched had poor quality control, with either low ranges and/or flickering at certain FPS ranges, and the G-Sync module also provided variable overdrive, which was a pretty big pro over FreeSync.
 
TBF, with regards to the G-Sync hardware module, IIRC their GPUs at the time didn't have the hardware capable of using adaptive sync, aka G-Sync Compatible. That, and the hardware module was ahead of the game for a long time in terms of quality: FreeSync when it first launched had poor quality control, with either low ranges and/or flickering at certain FPS ranges, and the G-Sync module also provided variable overdrive, which was a pretty big pro over FreeSync.

Time will tell. In the end it wasn't needed. We don't know how well the 3xxx would cope with DLSS 3; it might be quite acceptable for some, then again it might not run well. Until some bright spark unlocks it we won't know, but never say never.
 
Time will tell. In the end it wasn't needed. We don't know how well the 3xxx would cope with DLSS 3; it might be quite acceptable for some.

Well, it was needed if you didn't want to wait 2+ years to get a good VRR experience :p

But agreed on DLSS 3/FG: I think it would work fine; however, latency might suffer badly...
 
dJIVIji.png


The DF video compared DLSS 2 vs DLSS 3 latencies with v-sync off at 120Hz, which of course shows DLSS 3 at 155fps with a 35ms latency. However, they did not show what the latency of a true 155fps would be as a fair comparison. I bet the latency would be around 15ms, which is much better than the 35ms frame generation achieves. I suspect DLSS 3 won't feel as good as true 155fps, but I haven't seen any reviews mention this.
The Hardware Unboxed video mentions this: DLSS Quality + Frame Generation gave 112fps, but with nearly identical latency to native at 42fps. 18min 35s into their video.
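
Which is the key takeaway from that HUB number: the motion looks like 112fps, but responsiveness doesn't improve to match. Back-of-envelope (frame time only, not full end-to-end latency):

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds per frame at a given frame rate."""
    return 1000.0 / fps

print(round(frame_time_ms(112), 1))  # ~8.9 ms between displayed frames - what your eyes see
print(round(frame_time_ms(42), 1))   # ~23.8 ms between native frames - the ballpark the measured latency stayed in

# The displayed cadence more than doubles, but per HUB's measurement the game still
# responds roughly like the 42fps native case - smoother, not snappier.
```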
 
dJIVIji.png


The DF video compared DLSS 2 vs DLSS 3 latencies with v-sync off at 120Hz, which of course shows DLSS 3 at 155fps with a 35ms latency. However, they did not show what the latency of a true 155fps would be as a fair comparison. I bet the latency would be around 15ms, which is much better than the 35ms frame generation achieves. I suspect DLSS 3 won't feel as good as true 155fps, but I haven't seen any reviews mention this.
The funny thing is that in reality those latency numbers are even worse for FG, because without it you can further reduce latency to a dramatic degree with SpecialK, which puts even Reflex to shame. So for people who are sensitive to latency, and who'd take that extra step, you are at the very least suffering double the latency with DLSS FG.

I would never be playing games without Reflex or a frame rate limiter configured to minimize latency - so the comparison is not a valid one, in my opinion.
Since games will queue up to three frames in advance, by default, it's easy to make it look like DLSS3's latency is negligible in that scenario.

I'll reuse some images from a post of mine a month or so ago.
I was using Gone Home in this example, but the game is irrelevant.
PresentMon latency stats are at the top-center of the images.
Now, latency is a complex thing and I'm only looking at PresentMon stats here, not end-to-end latency - so this doesn't tell the whole story.
But whatever the game's latency is, you're going to have a couple of frames of latency added on top of that by DLSS 3 (minimum 17ms at 120Hz).

If we take the starting point of 40ms latency, use Reflex to bring that down to 8ms, and then add two frames for interpolation (17ms) - sure, we end up with lower latency than we started with, at ~25ms.

But my starting point was using Latent Sync for <1ms latency, and now this will be adding at least 17ms.
That's like downgrading my current TV's latency to that of one 10 years old.

Even if you were to start with just NVIDIA's Low Latency mode enabled, DLSS3 would be doubling the latency.
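
A minimal sketch of that budget, using the round numbers from this post (the 40ms / 8ms / <1ms figures are just the illustrative values above, not measurements of any particular game):

```python
REFRESH_HZ = 120
FRAME_MS = 1000.0 / REFRESH_HZ   # ~8.3 ms per refresh at 120Hz

def with_frame_generation(base_latency_ms: float, held_frames: int = 2) -> float:
    """Add the frames DLSS 3 has to hold back before it can interpolate between them."""
    return base_latency_ms + held_frames * FRAME_MS   # ~16.7 ms of added delay at 120Hz

print(round(with_frame_generation(40.0), 1))  # default ~3-frame queue start point -> ~56.7 ms
print(round(with_frame_generation(8.0), 1))   # Reflex baseline                    -> ~24.7 ms (the ~25 ms above)
print(round(with_frame_generation(1.0), 1))   # Latent Sync baseline (<1 ms)       -> ~17.7 ms, many times worse
```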
 