Nvidia DLSS 5 months on: a win or fail?

I'm honestly amazed that my upgrade to a 1070 Ti in the middle of the current RTX generation has yet to give me any buyer's remorse.

I play at 3440x1440 and knew relatively early on that RTX would be unusable at that resolution, and the warning signs were there that DLSS wouldn't support ultrawide resolutions. Other than that, DLSS looked set to be the saving grace of the Nvidia RTX line-up.

I couldn't have imagined DLSS would turn out to be so poor, or carry so many restrictions even at 16:9 resolutions.

The 2060 might've been a real threat, but I already see VRAM usage of 7 GB+ at my resolution, so its 6 GB rules it out.

As for the argument that I should've gone for a Vega 64 instead: overclocked, the 1070 Ti can equal or beat the Vega 64 in many circumstances, especially in Nvidia-favoured games, and I also enjoy the quieter operation and lower power draw. It's as if the 1070 Ti has sat in the eye of the storm and survived so far at 3440x1440, and it's picked up FreeSync compatibility in the meantime to boot.
 
They have to send their source code. Most devs will tell them to get lost.


No, they don't have to send the source code at all. They barely even need the executable; a 4K video with settings maxed across a few levels would suffice.


You spread ignorance in every post you make, all over the forum. Quite an achievement to be so wrong so often on so many topics.
 
No, they don't have to send the source code at all. They barely even need the executable; a 4K video with settings maxed across a few levels would suffice.

The DLSS model is trained on a mix of final rendered frames and intermediate buffers taken from a game’s rendering pipeline.

https://news.developer.nvidia.com/dlss-what-does-it-mean-for-game-developers/
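
To make that concrete: the "data" being talked about there is captured frames and buffers, not code. A rough sketch of what one training sample might contain (the field names and shapes are my own illustrative guesses, not Nvidia's actual format):

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class DlssTrainingSample:
    """One training pair captured from a game's rendering pipeline.

    Field names and shapes are illustrative assumptions, not
    Nvidia's real data format.
    """
    low_res_frame: np.ndarray   # aliased render at e.g. 1440p, shape (H, W, 3)
    target_frame: np.ndarray    # 64x supersampled "ground truth" frame at 4K
    motion_vectors: np.ndarray  # per-pixel velocity buffer, shape (H, W, 2)
    depth_buffer: np.ndarray    # depth, one of the intermediate buffers
```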

The velocity map and how it’s generated differ depending on each game engine. In order to support that aspect and to keep pixel jitter under control, we needed to modify parameters.

https://wccftech.com/ffxv-nvidia-dlss-substantial-fps-boost/
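
As an aside, the "pixel jitter" mentioned there is standard temporal-sampling machinery: the camera is offset by a sub-pixel amount each frame so successive frames sample slightly different positions. A minimal sketch of how such a jitter sequence is commonly generated (a Halton low-discrepancy sequence is a typical choice in TAA-style pipelines; I'm not claiming this is FFXV's exact implementation):

```python
def halton(index: int, base: int) -> float:
    """Return the index-th element of the Halton low-discrepancy sequence."""
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

def subpixel_jitter(frame_index: int, period: int = 16) -> tuple[float, float]:
    """Sub-pixel camera offset in [-0.5, 0.5) for a given frame.

    Uses the (2, 3) Halton sequence; the period is an illustrative
    assumption, not any particular engine's default.
    """
    i = (frame_index % period) + 1  # Halton is conventionally 1-indexed
    return halton(i, 2) - 0.5, halton(i, 3) - 0.5
```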
 
Which is all done at the driver level without ever having to look at source code. Thank you for proving my point right.

No the "AI" has to analyse the source code to optimise where and how DLSS is used. Just looking at drivers isn't enough and could be why it looks so ***** and blurs the entire scene. The "rendering pipeline" site higher than drivers.

You are trying to defend something which quite clearly is flawed and isn't going to be widely adopted.
 
No the "AI" has to analyse the source code to optimise where and how DLSS is used. Just looking at drivers isn't enough and could be why it looks so ***** and blurs the entire scene.


What complete and utter BS.
The AI is not touching the source code at all; it is not optimizing or changing the source code in the slightest.


DLSS training takes a high resolution 4K 64x SSAA image along with a lower resolution aliased image. These are the target image and the training image. Given a large dataset, the machine learning aims to find a set of network weights that can generate a 4K image from the low resolution training image that minimizes the differences from the target.

No source code is needed at all, and it's achievable with only video. The reason an executable is used is to control the stochastic nature of SSAA and the intermediate buffers, so both the target and training images have repeatable output.

This is how DLSS works:
https://arxiv.org/pdf/1609.04802.pdf


Go read that and then explain why any source code would be needed in the slightest.
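
If it helps, the whole "find weights that minimize the difference" objective fits in a few lines. A minimal sketch in PyTorch with a toy stand-in network (nothing here is Nvidia's actual model or loss; it just makes the training setup concrete):

```python
import torch
import torch.nn as nn

class TinyUpscaler(nn.Module):
    """Toy super-resolution net; a real model would be far larger."""
    def __init__(self, scale: int = 2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 3 * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),  # rearrange channels into a higher-res image
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

model = TinyUpscaler()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.L1Loss()  # "minimize the differences" between output and target

def train_step(low_res: torch.Tensor, target_64xssaa: torch.Tensor) -> float:
    """One step: upscale the aliased frame, compare it to the 64x SSAA
    ground truth, and nudge the weights to shrink the difference."""
    opt.zero_grad()
    loss = loss_fn(model(low_res), target_64xssaa)
    loss.backward()
    opt.step()
    return loss.item()
```

Notice there is no source code anywhere in sight: the inputs are just image tensors.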
 
Just read Nvidia's own FAQ ffs: https://news.developer.nvidia.com/dlss-what-does-it-mean-for-game-developers/

"At this time, in order to use DLSS to its full potential, developers need to provide data to NVIDIA to continue to train the DLSS model. The process is fairly straightforward with NVIDIA handling the heavy lifting via its Saturn V supercomputing cluster." That data is going to be game code (an exe is not considered data in the IT world) for Nvidia to analyse and it takes MONTHS to number crunch it.
 
Just read Nvidia's own FAQ ffs: https://news.developer.nvidia.com/dlss-what-does-it-mean-for-game-developers/

"At this time, in order to use DLSS to its full potential, developers need to provide data to NVIDIA to continue to train the DLSS model. The process is fairly straightforward with NVIDIA handling the heavy lifting via its Saturn V supercomputing cluster." That data is going to be game code (an exe is not considered data in the IT world) for Nvidia to analyse and it takes MONTHS to number crunch it.


No, you are just making up complete nonsense. Nowhere in that link does it mention anything about source code.

Data is explicitly not source code.

Keep digging that hole of yours.
 
Which is all done at the driver level without ever having to look at source code. Thank you for proving my point right.

I don't know how you can read "taken from a game’s rendering pipeline" and come to the conclusion that you've arrived at.

Here's from Wizzard at TPU:

 
Why does Nvidia's AI give worse image quality than normal upscaling?


 
I don't know how you can read "taken from a game’s rendering pipeline" and come to the conclusion that you've arrived at.

Here's from Wizzard at TPU:


Ah, look: you are posting more sources that back up exactly what I am saying. Thank you for proving my argument for me!



Game devs send a special build of their game to NVIDIA before release

So game devs send Nvidia the compiled executable, which Nvidia runs, all done before release. Exactly as I said it operated.
It is almost as if I am entirely accurate and you were wrong.
 
Great performance improvement, as long as you like your games covered in a THICC layer of vaseline.

To me it looks like FXAA all over again.

edit - actually no, it's worse than FXAA.

 
Great performance improvement, as long as you like your games covered in a THICC layer of vaseline.

To me it looks like FXAA all over again.

edit - actually no, it's worse than FXAA.

Way worse than FXAA, and Nvidia felt it was OK to release this... Goes to show how little they think of the PC gaming community.

Happy to see them getting all this negative press. They've got D.P. working overtime on here :p:D
 