
The RT Related Games, Benchmarks, Software, Etc Thread.

Status
Not open for further replies.

NVIDIA DLSS 3 “Frame Generation” Lock Reportedly Bypassed, RTX 2070 Gets Double The Frames In Cyberpunk 2077



While there's definitely a performance gain on older cards with DLSS 3 frame generation enabled, it should be pointed out that it wasn't without its fair share of issues. The user experienced instability and frame drops, so running DLSS 3 frame generation won't get you the most optimised gaming experience at the moment, since it is designed with GeForce RTX 40 graphics cards in mind.

Guess us ampere owners are good now :p

But in all seriousness, it will be interesting to see how much of a hit there is to latency. He said he is also going to test on a 3080, so it will be interesting to see if the fps drops and crashes go away compared to the 2070 experience.
 

was able to bypass a software lock by adding a config file to remove the VRAM overhead in Cyberpunk

@TNA is buckled right now!:p
 
His comments on why changing that particular vram overhead flag allows it to be enabled:

Think speed, latency and counts. Matching the requirements grants access. This is all I can share.

Of course, this guy could be talking complete **** and trolling though :p
 
As per usual, a fantastic in depth look from DF :cool:


So essentially what we thought with fg/dlss 3, it comes down to:

- type of game you play
- fps you are getting, ideally you want to be over 100 fps when using fg
- latency with vsync on in the NVCP is huge if you're hitting your screen refresh rate (110ms), but with vsync off, or when you're not hitting your monitor's refresh rate with vsync on, latency is hardly increased at all, only 10 or so ms higher than DLSS 2
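The vsync behaviour in that last point can be sketched as a toy model. The numbers (110ms when capped at the refresh rate with vsync on, roughly 10ms on top of DLSS 2 otherwise) come straight from the figures quoted above; the function itself is just an illustration, not how the driver actually works.

```python
# Toy model of the DF latency figures quoted above for DLSS 3 frame generation.
# 110ms when vsync-capped, otherwise base latency plus ~10ms, are the numbers
# from the post; this is an illustration only.

def fg_latency_ms(base_latency_ms, vsync_on, at_refresh_cap):
    """Estimate total input latency with frame generation enabled."""
    if vsync_on and at_refresh_cap:
        # Hitting the refresh cap with vsync on backs up the frame queue.
        return 110.0
    # Otherwise FG only adds roughly 10 ms over DLSS 2.
    return base_latency_ms + 10.0

print(fg_latency_ms(34.0, vsync_on=True, at_refresh_cap=True))   # 110.0
print(fg_latency_ms(34.0, vsync_on=True, at_refresh_cap=False))  # 44.0
```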

And this boys and girls is why DLSS + FG is so good:

- higher graphical settings can be used
- higher fps can be achieved

 
34 > 49ms latency though? Eek

Yup not ideal but as Alex said, you are gaining so much more in return for that hit:

- far better visuals
- higher fps which will look/play smoother

That is using dlss quality preset too, so you could quite easily bring the latency down by using balanced or performance mode or even just reduce some settings.
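On dropping from quality to balanced or performance: each DLSS preset renders internally at a lower resolution before upscaling, which is where the extra fps (and lower latency) comes from. The scale factors below are the commonly cited ones (Quality ~66.7%, Balanced 58%, Performance 50%), not figures from this thread, so treat them as assumptions.

```python
# Internal render resolution per DLSS preset, using commonly cited scale
# factors (assumed, not taken from this thread).
DLSS_SCALE = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.50}

def render_resolution(width, height, preset):
    """Return the approximate internal render resolution for a preset."""
    s = DLSS_SCALE[preset]
    return round(width * s), round(height * s)

print(render_resolution(3440, 1440, "quality"))      # (2293, 960)
print(render_resolution(3440, 1440, "performance"))  # (1720, 720)
```

So at 3440x1440, performance mode is pushing half as many pixels per axis as native, which is why it claws latency back.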
 

Visuals might be better but I can't see how more latency would play smoother. A better comparison would have been the same latency with increased visuals/frame rate imo
 

Higher fps will give smoother/more fluid motion/animation; a good site for comparing different frame rates here:
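The smoothness side is just frame-time arithmetic: each doubling of fps halves the time each frame sits on screen, which is what reads as more fluid motion. A minimal sketch:

```python
# Frame time shrinks as fps rises, which is what reads as smoother motion.
def frame_time_ms(fps):
    """Time each frame is displayed, in milliseconds."""
    return 1000.0 / fps

for fps in (30, 60, 120):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 120 fps -> 8.3 ms
```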


In terms of latency feel, obviously nothing can be done there if you are above 40/50ms; you will feel that extra latency if/when using m+kb in an FPS shooter, so going by TFTCentral's chart:

  • Class 1) Less than 16ms / 1 frame lag – should be fine for gamers, even at high levels
  • Class 2) A lag of 16 – 32ms / One to two frames – moderate lag but should be fine for many gamers
  • Class 3) A lag of more than 32ms / more than 2 frames – Some noticeable lag in daily usage, not suitable for high end gaming
Ideally you will want to be <30ms (and for those sensitive to latency, <15ms), which is doable by using DLSS balanced or performance, or lowering settings.
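TFTCentral's three buckets above translate directly into a small lookup, which makes it easy to sanity-check the 34ms and 49ms figures mentioned earlier. A minimal sketch:

```python
def latency_class(lag_ms):
    """Bucket input lag per the TFTCentral classes quoted above."""
    if lag_ms < 16:
        return 1  # fine for gamers, even at high levels
    if lag_ms <= 32:
        return 2  # moderate lag, fine for many gamers
    return 3      # noticeable lag, not suitable for high-end gaming

print(latency_class(34))  # 3
print(latency_class(49))  # 3
```

Both the DLSS 2 and FG figures from the comparison above land in class 3, which is why dropping a preset to get under 30ms matters for latency-sensitive players.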
 

Based on what I've heard from a couple of people, the extra image and motion smoothness more than makes up for the latency. Unless it's a competitive FPS game you won't notice the latency, but you will notice how much smoother the game runs.
 

NVIDIA GeForce RTX 4090 FE in the DXR, DLSS, Reflex and Frame Generation test – When the CPU gets stuck in the bottleneck with a rattle




Today my focus is on the NVIDIA gimmicks. DLSS 2.3 and DLSS 3.0 together with NVIDIA Reflex and Boost are allowed to show what they can do. RTX on is the focus today without exception.
 
There was me thinking it was going to be another "dlss/rtx sucks" rant, especially after the comment he made a while back about DLSS adding input lag, but the conclusion:

There is nothing more to say against DXR on and when DLSS 3.0 comes into play, the whole thing is almost a no-brainer. Cyberpunk on UHD with maximum ray tracing and DLSS 3.0 quality makes 111 FPS possible. Yes, the latency is a bit higher for this, but still far from becoming a problem in terms of input latency.

Can see certain individuals getting triggered already :p :cry: ;)

I can’t see any difference here. DLSS 2.3 and DLSS 3.0 can produce a better image than native thanks to NVIDIA’s own AA (Anti Aliasing)
 


Sadly looks like this workaround for cp 2077 and FG may be patched :(

That's what I'm currently trying to check. It did seem odd at first but this just seems to be CDPR's method to identify the video card. Also, all of the results I've shared come from a WIP version of Cyberpunk so it's likely they will change the video card checking method before the update goes live.
 

Bets on how many will use their screen captures of the "fake frames" to say "DLSS 3/FG sucks and this is why" even though they have said "it is very hard to see these in normal gameplay and the gameplay itself is mostly good" :cry: ;) :D

Would take their input latency results with a pinch of salt as they didn't mention whether vsync was on or off (unless I missed it?), which, as proven by DF, can massively increase input lag.

Defo seems you will want to keep fps above 100 with FG for the best experience anyway, which is what I try to aim for on my 3440x1440 display.
 
tl;dr for DLSS FG
- Worse image quality
- Worse latency
- Less sync/frame-cap compatibility
- When it's bad it also gets even worse the lower the fps/res
- On the plus side you do get a visually smoother image at >80 fps (more or less, depending on how you feel about the other degraded aspects)
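The trade-off in that tl;dr can be sketched with the rough rule of thumb from this thread: FG inserts one generated frame per rendered frame, so presented fps roughly doubles while input latency still tracks the rendered rate. The `feels_ok` helper and its ~80 fps threshold (from the tl;dr above) are illustrative, not a real metric.

```python
# Rough rule of thumb from the thread: FG roughly doubles presented fps,
# while latency still tracks the rendered rate. The 80 fps threshold for a
# good experience is the figure from the tl;dr above; helper names are
# illustrative only.

def fg_presented_fps(rendered_fps):
    """One generated frame per rendered frame -> roughly double the fps."""
    return rendered_fps * 2

def feels_ok(rendered_fps, threshold=80):
    """Does the presented fps clear the ~80 fps bar quoted above?"""
    return fg_presented_fps(rendered_fps) > threshold

print(fg_presented_fps(55))  # 110
print(feels_ok(55))          # True
print(feels_ok(30))          # False
```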

 


I'll test it myself. You can't demonstrate how something "feels" through a YouTube video; it's completely subjective to the individual user.

For example, I've seen a lot of positive videos made on Nvidia Reflex over the last couple of years because it reduces latency. Wanna know how much I care about Reflex? Zero, because I've never been able to notice any difference with it on vs off. Maybe I'm too old to "feel" the latency anymore.
 