
FidelityFX Super Resolution in 2021

Some people in here would have you believe you can take a 1080p image, put DLSS 2.0 on it, and get better quality than native 4K :D:cry:

What Nvidia is really good at, and I have given them credit for this in the past, is marketing.
 
Remember that people buying expensive GPUs tend to also have expensive monitors, which tend to have higher refresh rates, which DLSS can help make use of.

I really hope DLSS-like tech is adopted by both Intel and AMD. Ideally FSR will go through the same evolution as Vulkan and eventually be the framework that includes solutions from all vendors. There are already rumours that FSR is nothing more than the framework for an ML-capable RDNA3, which does make sense to me.

Today, DLSS > FSR at the high end, while FSR > DLSS at the lower end, for obvious reasons. G-Sync > Adaptive Sync, as I can't find a decent Adaptive Sync panel that doesn't have issues with flicker. And PhysX > no PhysX - I was just thinking the other day how good Mafia II looked when I got a 980 Ti.

I don't really look into the future much, because it changes all the time and corporations promise pie in the sky all the time. I look at what we have here and now. Currently, quality-wise, DLSS 2.0 (even with artefacts and other issues) wins, but it can be used by only a tiny minority of the market. The majority sits on GPUs that can't use it. Same with RT - a tiny minority can use it, but for most it's pie in the sky and currently meaningless.

In regards to Adaptive Sync - I love NVIDIA's G-SYNC Compatible program. It's still Adaptive Sync, but they test panels for issues like flickering etc. and then post the models that passed on their website. I bought a monitor with that logo and never had any issues using it with an AMD card or an NVIDIA card - it works great with both vendors. I wish AMD had a similar certification, but this is good enough already. Still, it doesn't make it a G-SYNC monitor; it's still Adaptive Sync, just well tested.

PhysX was in the past restricted to NVIDIA hardware only, but thankfully the market forced them to unlock it for CPU use, and here we are - most games use it on the CPU just fine these days. We don't even talk about it; it's just there and works. Almost a standard (though it's not the only physics engine that games use).
 
And that's fair. You might also not be the target of this tech and simply have no need to use it.

I have a high-resolution VR HMD and an RTX 3080, and even a 3090 or 6900 XT cannot drive the HMD at full FPS. I use it for sim games and would love to see FSR become something that can increase FPS without having to rely on reprojection. Right now reprojection gives quite a bit of ghosting or corruption; I would consider it a win if FSR meant I could avoid that.
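
For anyone unfamiliar: reprojection re-uses the last fully rendered frame, warped to the latest head pose, whenever a new frame isn't ready in time. A toy sketch of just the rotational part in Python (my simplification - the 100-degree FOV and the edge-clamp fill are assumptions, and real runtimes also warp against depth and positional movement, which is where the ghosting/corruption creeps in):

```python
import numpy as np

def reproject_rotation(last_frame: np.ndarray, yaw_delta_deg: float,
                       horizontal_fov_deg: float = 100.0) -> np.ndarray:
    """Shift the previous frame sideways to match a change in head yaw.
    Small-angle planar approximation; real runtimes warp using depth
    and the full pose."""
    h, w = last_frame.shape[:2]
    shift_px = int(round(yaw_delta_deg / horizontal_fov_deg * w))
    out = np.roll(last_frame, -shift_px, axis=1)
    # The strip "revealed" by the turn has no rendered data; clamping the
    # edge pixels (as here) or leaving it black shows up as smearing.
    if shift_px > 0:
        out[:, w - shift_px:] = out[:, [w - shift_px - 1]]
    elif shift_px < 0:
        out[:, :-shift_px] = out[:, [-shift_px]]
    return out

# e.g. the head turned 2 degrees since the last fully rendered frame
eye = np.zeros((1600, 1440, 3), dtype=np.uint8)  # one 1440x1600 eye buffer
synthetic = reproject_rotation(eye, yaw_delta_deg=2.0)
```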
 
Do you have No Man's Sky? If so, what are your thoughts on that with DLSS in VR? I've still got an old Rift that I need to dust off during the cooler days.
 
Maybe it's just down to play style? I take a slow, stealthy approach whenever possible in games. Perhaps I just don't move the mouse quickly enough to notice ghosting. I've not played competitive FPS titles since the original Unreal Tournament (Freeserve and a 56k modem, with £600+ phone bills).

It has nothing to do with play style or moving the mouse too fast. You could be standing perfectly still watching the world go by and still see some animated items leaving ghost trails - a fly hovering around a waste bin leaving a weird ghosting effect, car tail lights at night, etc. These are minor issues, but they do happen. So when people say DLSS gives better-than-native IQ, is perfect, and that FSR needs to live up to that or it will be a failure, I just think they are either delusional, misguided, or an Nvidia shill. Quite frankly, they are holding FSR to higher standards than even DLSS 2.0 manages.

So ultimately these will be two different approaches to the same issue, with pluses and minuses to both, but ultimately a compromise people can choose to take or not: DLSS or FSR with the associated IQ issues but a larger performance boost, or native res with lower settings but no artifacting issues.

The problem with DLSS is its proprietary nature and poor adoption rate. This is where FSR can make a big impact: if it works on the majority of GPUs and consoles, is easier to implement, and has a significantly wider user base, those facts alone will make it far more attractive to cross-platform developers.
 
The problem is, if FSR doesn't replace/disable a poorly implemented TAA, then you will unfortunately still get the awful motion issues associated with TAA - e.g. Days Gone (look at the back of the bike/wheel). From my experience this is worse (a lot worse/more noticeable in motion) than Cyberpunk's DLSS trailing/ghosting:

[Four Days Gone screenshots showing the TAA ghost trail behind the bike: TxWy0Rn.jpg, 611HEgE.jpg, f8yJDcE.jpg, WD83ykB.jpg]

And for RDR 2 it's even worse - not just in motion but also in static shots, as it is incredibly blurry. It does do a wonderful job of removing all the shimmering and aliasing issues in RDR 2, though.

Recently installed NMS to check out DLSS, and it works great - again because of TAA's downfalls.
 
Well, let's hope it does a better job than TAA and is widely adopted. Days Gone has TAA because it is a console port; TAA is used as a crutch, and I cringe when people say Days Gone is well optimised. TAA is the spawn of the devil and needs to die :)
 
Just uploaded a Cyberpunk comparison of DLSS Quality and native here, which shows the issues with "native/TAA" quite well:

https://imgsli.com/NTY5MTk

Deliver Us the Moon has a decent TAA implementation; DLSS Quality vs native/TAA:

https://imgsli.com/NTY5MjE

Like 99.9% of games today, sadly :( I despise TAA so much. Days Gone is a fantastic port though, tbf.
 
I clearly stated exactly where it does a better job.

I've had a great time using DLSS Quality at 1440p, which uses a source of only 960p. Artifacting has not been an issue for me.
I know you did, and that's what I responded to. Hey, if it works for you then that's good - as always, I think the more choice we have the better.
 
How can it be better than native w/AA? It literally doesn't make any sense to me at all. You are literally removing stuff to get more frames. It's like comparing lossy audio to lossless - how can lossy ever be better than lossless? Surely it can only ever equal it in a best-case scenario.

Yes, it's a good option to improve frame rates, but saying it's the same quality-wise, or better, is just absurd.

In theory the AI algorithm can infer information which isn't there and potentially reproduce a better version of the scene - games use things like mipmaps to render textures onto surfaces, which reduce the texture resolution at a distance and don't always produce the sharpest results, etc. (sketch below).

In reality I've not seen an implementation which is there yet - one that produces a consistently same-or-better-than-native experience over an entire gaming session.
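
To picture the mipmap point: each mip level is the previous one at half resolution, and the GPU picks a level based on how many texels a single screen pixel covers, so distant surfaces really are sampled from lower-resolution data. A minimal sketch (box-filtered mips on a square power-of-two texture; real engines use better filters and anisotropic sampling):

```python
import numpy as np

def build_mip_chain(texture: np.ndarray) -> list[np.ndarray]:
    """Build a mip chain by repeated 2x2 box filtering (averaging).
    Assumes a square, power-of-two texture for simplicity."""
    chain = [texture]
    while chain[-1].shape[0] > 1:
        t = chain[-1]
        # Average each 2x2 block of texels into one texel.
        chain.append(t.reshape(t.shape[0] // 2, 2, t.shape[1] // 2, 2, -1)
                      .mean(axis=(1, 3)))
    return chain

def mip_level(texels_per_pixel: float, num_levels: int) -> int:
    """The GPU picks roughly lod = log2(texels covered by one pixel):
    the further away the surface, the bigger the footprint."""
    lod = max(0.0, np.log2(max(texels_per_pixel, 1e-6)))
    return min(int(lod), num_levels - 1)

tex = np.random.rand(1024, 1024, 3)   # a 1024x1024 RGB texture
chain = build_mip_chain(tex)          # 11 levels: 1024, 512, ..., 1
print(mip_level(8.0, len(chain)))     # distant surface -> level 3 (128x128)
```

That discarded detail is exactly what an ML upscaler can plausibly reconstruct - and also why the reconstruction can only ever be a guess.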
 
I shall remind people that the first implementation of TAA (TXAA) was done by NVIDIA, who advertised it heavily as a superior AA technique. Though they did not really invent it - it had been used in cinematography for a long time beforehand, and if done well it can produce great results. But doing it well is costly in performance, hence compromises had to be made.
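
To make the ghosting mechanics concrete: the core of TAA is just an exponential blend of each new frame into an accumulated history buffer - a toy 1-D sketch of that blend is below (my simplification: real implementations also reproject the history with motion vectors and clamp it against the current frame's neighbourhood, and the trails people describe appear when that rejection fails):

```python
import numpy as np

def taa_accumulate(current: np.ndarray, history: np.ndarray,
                   alpha: float = 0.1) -> np.ndarray:
    """One TAA step: exponential blend of the new frame into the history
    buffer (i.e. a moving average over the last several frames)."""
    return alpha * current + (1.0 - alpha) * history

# Toy demo on a 1-D "screen": a bright object visible for one frame...
history = np.zeros(10)
obj = np.zeros(10)
obj[4] = 1.0
history = taa_accumulate(obj, history)

# ...then the object moves away, but its contribution only decays slowly.
for _ in range(10):
    history = taa_accumulate(np.zeros(10), history)
print(history[4])   # ~0.035 - still faintly visible ten frames later
```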
 
AMD announced 2 days ago that they are supporting the RX 470 and RX 480. What I'm keen to know is, with it being open source, whether someone will get older cards supported, as they need it more. E.g. my HD 7950 would really benefit from rendering at 540p then upscaling to 1080p.
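
For reference, that 540p-to-1080p example is exactly FSR's Performance mode: AMD's published FSR 1.0 per-axis scale factors (1.3x Ultra Quality, 1.5x Quality, 1.7x Balanced, 2.0x Performance) make the internal render resolutions easy to work out. A quick illustrative snippet:

```python
# FSR 1.0 per-axis scale factors as published by AMD.
FSR_MODES = {
    "ultra_quality": 1.3,
    "quality": 1.5,
    "balanced": 1.7,
    "performance": 2.0,
}

def fsr_render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal resolution FSR renders at before upscaling to the output."""
    scale = FSR_MODES[mode]
    return round(out_w / scale), round(out_h / scale)

print(fsr_render_resolution(1920, 1080, "performance"))  # (960, 540)
print(fsr_render_resolution(3840, 2160, "quality"))      # (2560, 1440)
```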
 
I would say that not supporting something in an official way doesn't mean it won't work. It just means AMD isn't testing or specifically optimising for those cards; most likely the decision to let you enable it or not will lie with the devs of each game, not with AMD.
 
It's funny how AMD adds support for great new features to their old GPUs. Nvidia, on the other hand, took ages to add RTX Voice to older GPUs.

The 1060 launched 5 years ago. How long do you want to keep a GPU when tech such as DLSS and raytracing is now almost 3 years old? Do note that Nvidia did add software support for RT on Pascal.
 
You can again do the same thing with audio - introduce outside information into an MP3, for example - but it's still lossy. I doubt we will see an implementation able to render the same scene without the same resources to run it; in that scenario it may as well be native.

Theories are just that until proven, and they are broken all of the time - just look at the theory of relativity, for example.
 
Five years sounds like a long time until you realise there have only been two GPU generations since then, and the most recent generation hasn't even finished launching its low-end cards.
 
Erm, I don't think you know what a scientific theory is... And no, the theories of relativity (both of them) haven't been "broken" - what's more, they are among the most tested theories in science and have worked every single time, sans exceptions in places we don't really understand yet, like the centres of black holes (we're missing a good theory of quantum gravity) or quantum physics (which we're still far from really understanding). Even your GPS relies on them working perfectly, or its position fix would drift off by metres within minutes.
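
A back-of-envelope check on the GPS point, plugging in the standard textbook figures (roughly -7 us/day from velocity time dilation and +45 us/day from the weaker gravity at orbit altitude - assumed here, not derived):

```python
# The two relativistic effects on a GPS satellite clock nearly cancel
# but leave it running ~38 microseconds per day fast vs ground clocks.
C_KM_PER_US = 0.299792458        # light travels ~0.3 km per microsecond

special_rel_us_per_day = -7.0    # velocity time dilation: clock runs slow
general_rel_us_per_day = +45.0   # weaker gravity in orbit: clock runs fast
net_us_per_day = special_rel_us_per_day + general_rel_us_per_day   # ~38

range_error_km_per_day = net_us_per_day * C_KM_PER_US
print(f"~{range_error_km_per_day:.1f} km of ranging error per day")    # ~11.4
print(f"~{range_error_km_per_day * 1000 / (24 * 60):.0f} m per minute") # ~8
```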

I suspect what you meant to say was that a hypothesis must be proven before it can become a scientific theory, and hypotheses are "broken" all the time without ever becoming theories.
 
Pascal cards are still fine for playing games. They support everything aside from RT and DLSS. They aren't the fastest, but for 1080p with high details? Plenty fast at the top end (like the 1080 Ti), with enough VRAM too. And it's not as if anyone with a Pascal card can just go online or to a shop and buy anything better right now, is it? Thanks to Intel we had a big slow-down in CPU development for 10 years, which is now speeding up nicely again (thanks to AMD). GPUs also did not evolve much - between Pascal and Turing there are minimal differences in rasterisation speed, RT had zero use till recently, and DLSS 1.0 was useless too.

Fun fact - it would seem Epic, in the newest Unreal Engine, doesn't care one bit about RT or DLSS. The engine's default alternatives (Lumen for lighting, TSR for upscaling) are faster, just as good (actually better in the case of RT - much faster and more flexible), "just work" with close to zero fiddling needed from the game dev, and are hardware-agnostic. It would seem Epic has shown NVIDIA's closed tech a big middle finger, offering methods that are easier to implement, support all hardware, and get the same (or better) results. And yes, all of them work just fine on Pascal too.
 
Most people play games that don't need DLSS and most certainly don't have RT.

Nor will they ever need it.
 