Nobody at HUB wears glasses, media we can trust!
> It's not good for the industry having to maintain and support both DLSS and FSR. They should pick one or the other. Either let AMD and Intel support DLSS and abandon FSR, or abandon DLSS and then Nvidia can invest heavily in FSR. What AMD have done may be scummy, but surely Nvidia keeping DLSS closed is even more scummy.

Not when FSR is in consoles.
> Cyberpunk is a bad example since that game was messed up on launch, and FSR and DLSS were the least of its issues lol.

I suspect that the engine programmers spent far too much time on RT and DLSS rather than fixing actual problems. In a $250 million budget game, having so many resources dedicated to making an RT tech demo at Nvidia's behest always seemed like a very strange move. With that huge budget, any Nvidia sponsorship must have been relatively minor.
> This is a waste of AMD's time anyway; FSR will never win adoption over DLSS, and nobody buys AMD for FSR.

Probably never will get anyone to buy AMD for FSR, but the only way I can see FSR losing the adoption war is if Nvidia really puts the effort in to get DLSS working on every GPU, and especially on the default game target: consoles.
> But do they (MS/Bethesda) really have so little regard for their art and their customers that they're willing to compromise the visual quality of their game for a few extra bucks? I guess so.

Surely, by definition, adding any support for upscalers is 100% compromising the visual quality of their games?
> Surely, by definition, adding any support for upscalers is 100% compromising the visual quality of their games?

No, DLSS is in the unique position that, when implemented right, the visuals can and do become better. All the big channels have covered this in video reviews already, so just look them up. Essentially, at 1440p and above, DLSS can show more detail, better AA and better performance than the native image render.
> Surely, by definition, adding any support for upscalers is 100% compromising the visual quality of their games?

At least if a dev implements DLSS, they have the option of enabling DLAA, which is the best AA solution available to modern games.
(That and the poor AA which is the default nowadays and makes DLSS sometimes look better than poor-AA native.)
> Yup, Nvidia have an open-source solution called Streamline that implements their solutions all in one go, i.e. Reflex, DLSS and FG. Intel were on board with it, but not AMD, due to very iffy reasoning by their chief engineer (in the DF interview with Alex), even though it would have benefited not just consumers (being able to use what works best for their hardware) but also developers (doing all three in one go as opposed to separately), and it would also have benefited AMD, since uptake would have been larger and quicker for them.

Unsure how well it works.
Streamline (developer.nvidia.com)
GitHub - NVIDIAGameWorks/Streamline: Streamline Integration Framework (github.com)
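To illustrate the "all in one go" point: the model of a single integration framework like Streamline is that the engine integrates once, then queries at runtime which features (DLSS, Reflex, Frame Generation) the installed hardware and driver actually support. A minimal sketch of that shape, with entirely hypothetical names; none of this is the real Streamline API, which is documented in the repo linked above.

```cpp
#include <cassert>
#include <set>
#include <string>
#include <utility>

// Hypothetical stand-in for a single-integration feature framework.
// The engine calls one API; unsupported features simply report false
// instead of needing a separate code path per vendor.
class FeatureFramework {
public:
    // Simulate a runtime capability query; a real framework would ask
    // the driver/hardware rather than take a hard-coded set.
    explicit FeatureFramework(std::set<std::string> supported)
        : supported_(std::move(supported)) {}

    bool isSupported(const std::string& feature) const {
        return supported_.count(feature) != 0;
    }

    // Enabling an unsupported feature is a graceful no-op, so the same
    // engine build runs unchanged on any GPU.
    bool enable(const std::string& feature) {
        if (!isSupported(feature)) return false;
        enabled_.insert(feature);
        return true;
    }

    bool isEnabled(const std::string& feature) const {
        return enabled_.count(feature) != 0;
    }

private:
    std::set<std::string> supported_;
    std::set<std::string> enabled_;
};
```

On an RTX card the framework might report DLSS, Reflex and Frame Generation as supported; on other hardware the very same integration would just return false for those queries, which is the developer-side benefit being described above.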
There was a reason I put in:

> No, DLSS is in the unique position that, when implemented right, the visuals can and do become better. All the big channels have covered this in video reviews already, so just look them up. Essentially, at 1440p and above, DLSS can show more detail, better AA and better performance than the native image render.
> (That and the poor AA which is the default nowadays and makes DLSS sometimes look better than poor-AA native.)

This "better than native" nonsense is almost 100% because the default AA is so poor.
> There was a reason I put in:

Like I said, just watch any reviews that deep dive into comparing them all. Once Naughty Dog fixed the DLSS-equals-no-AA issue in the recent patches, the sharpness returned in The Last of Us, and DLSS offers the best IQ vs the others once again. That doesn't mean some reflections on water are perfect, but it's far better than FSR/XeSS in this game, and above all else the sharpness and AA quality are back where they should be.
> This "better than native" nonsense is almost 100% because the default AA is so poor.

"Better than native" against decent AA, no AA, or one of those computationally expensive old-fashioned AA methods is unlikely. But I believe render-ahead rendering is the technical reason why the old-fashioned AA methods can no longer work with most modern engines.
It would be quite hilarious if Starfield launches and still runs better on Nvidia cards!
I would bet solid money that this is exactly what does happen
> In Witcher 3 DLSS is once again much sharper with better image quality, and that was at DLSS Performance too on a 3080 at the time, so even better on more powerful cards with Quality mode. There are lots of other games too, some out of the box, some with DLSS injected in.

Have to say that when I tried TW3 I did not think so, but at the time I was mostly doing a bunch of RT on/off comparisons. I might go back and have another look.
> Are AMD scummy for this? Yup. Are Nvidia any different? Nope. Just accept it.

And there's the problem: if we tolerate this crap, things will never change and only get worse.
This isn't an 'either/or' - FSR, DLSS and XeSS all use the same inputs from the engine and all of them have tools to aid implementation - all PC games should be supporting all of these technologies.
Once you've implemented FSR you're already 90%+ of the way there to integrate the others.
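The "same inputs" point above can be sketched in code: FSR 2, DLSS and XeSS all consume essentially the same per-frame data (low-res color, depth, motion vectors, camera jitter), so a thin abstraction lets one integration feed any of them. All names below are illustrative, not the real FSR/DLSS/XeSS SDK types or calls.

```cpp
#include <cassert>
#include <memory>
#include <string>

// The per-frame inputs shared by all three temporal upscalers. Once the
// engine exposes these, wiring up the remaining backends is mostly plumbing,
// which is why implementing one gets you most of the way to the others.
struct UpscalerInputs {
    const void* color;          // low-resolution lit scene color
    const void* depth;          // depth buffer
    const void* motionVectors;  // per-pixel motion vectors
    float jitterX, jitterY;     // sub-pixel camera jitter for this frame
    int renderWidth, renderHeight;   // internal render resolution
    int outputWidth, outputHeight;   // display/output resolution
};

// One interface, three hypothetical backends.
class Upscaler {
public:
    virtual ~Upscaler() = default;
    virtual std::string name() const = 0;
    virtual void evaluate(const UpscalerInputs& in) = 0;
};

class Fsr2Upscaler : public Upscaler {
public:
    std::string name() const override { return "FSR 2"; }
    void evaluate(const UpscalerInputs&) override { /* dispatch FSR 2 here */ }
};

class DlssUpscaler : public Upscaler {
public:
    std::string name() const override { return "DLSS"; }
    void evaluate(const UpscalerInputs&) override { /* evaluate DLSS here */ }
};

class XessUpscaler : public Upscaler {
public:
    std::string name() const override { return "XeSS"; }
    void evaluate(const UpscalerInputs&) override { /* execute XeSS here */ }
};

// The settings menu picks a backend; the frame loop never changes.
std::unique_ptr<Upscaler> makeUpscaler(const std::string& choice) {
    if (choice == "dlss") return std::make_unique<DlssUpscaler>();
    if (choice == "xess") return std::make_unique<XessUpscaler>();
    return std::make_unique<Fsr2Upscaler>();
}
```

The engine's frame loop calls `evaluate()` on whichever backend the player selected; the expensive part of the integration (producing correct motion vectors and jitter) is done once and shared.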
Be sure to use a later version of the DLL file for DLSS with The Witcher, as it ships with an old version that didn't have the later revision which uses NIS instead of the legacy sharpening, etc.
Edit* The bit many will no doubt miss from the HUB video:
AMD's response to a similar question from Gamers Nexus? "No comment"... so basically a yes? That was a previous statement, whilst to HUB AMD said "we don't know what to say yet":
I mean, these are both just boilerplate PR guff. It's just that Nvidia's PR dept had a better statement prepped and ready to go, while AMD's didn't.