Nvidia DLSS 5 months on: a win or fail?

FFXV managed to look better with DLSS. I think Metro and BFV needed more time in the supercomputer oven, tbh. I feel DLSS in the two games was rushed to market.

With FFXV we know it works, so maybe DLSS will get better with time and patches?


What videos did you watch?
The faults of DLSS have been pointed out since day one.
 
FFXV managed to look better with DLSS. I think Metro and BFV needed more time in the supercomputer oven, tbh. I feel DLSS in the two games was rushed to market.

With FFXV we know it works, so maybe DLSS will get better with time and patches?

The same issues pop up in FFXV as well; it just wasn't picked up by the community on the scale it has been now. The excuse that crops up is that it isn't a full game.

DLSS will always work better in a benchmark scene.
 
Do NVidia pay to have RTX included?

Is this a known fact or is it just unsubstantiated rumour?

I've had a quick look but as yet have found nothing to corroborate it either way.

Personally I don't think Nvidia, or AMD for that matter, have that much say in the titles they put their name to; it is down to the developing studio, or maybe the parent company in the case of companies like EA.

Well, if I was a developer I wouldn't bother with it unless paid to do so. It makes no business sense to dedicate resources to something like that which only 1 or maybe 2% of gamers will use.
 
FFXV managed to look better with DLSS. I think Metro and BFV needed more time in the supercomputer oven, tbh. I feel DLSS in the two games was rushed to market.

With FFXV we know it works, so maybe DLSS will get better with time and patches?

The GamersNexus video showed otherwise: in FF15 the game (not the demo) it actually looked worse, with various loss of detail on faces etc., and also shimmering whilst "not" in motion.
 
They probably are paying them. When have you ever seen a game not sponsored by Nvidia which includes support for their features? What on Earth happened to PhysX?

It will be the same story with RTX. It'll linger in a few (mostly rubbish anyway) AAA games like BFV as a novelty that very few people use, then after a while it will vanish and devs will just use an open platform alternative, once hardware is ACTUALLY capable of running it.
 
FFXV managed to look better with DLSS. I think Metro and BFV needed more time in the supercomputer oven, tbh. I feel DLSS in the two games was rushed to market.

With FFXV we know it works, so maybe DLSS will get better with time and patches?
At this stage it depends on how well the normal AA is implemented in the game. For FFXV it's not that great, so DLSS looks much better. For BFV the TAA implementation is quite good, so it makes DLSS look worse by comparison.
 
The killer for me is the comparisons of DLSS vs just dropping the resolution scale: in all the tests I've seen, if you drop the scaling down to 80% or so you get the same frame rates with a much sharper image, at which point DLSS seems like a complete waste of time :(
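For a rough sense of the pixel maths behind that comparison, here is a quick sketch. It assumes a 4K output, an 80% resolution scale, and a 2560x1440 internal render resolution for DLSS at 4K; that internal resolution is an assumption based on commonly reported figures, not an official number.

```python
# Rough pixel-count comparison: native 4K vs an 80% resolution scale vs the
# assumed DLSS internal render resolution. Illustrative only.

native = (3840, 2160)                           # native 4K output
scaled_80 = (int(3840 * 0.8), int(2160 * 0.8))  # 80% resolution scale
dlss_internal = (2560, 1440)                    # assumed DLSS render resolution at 4K

def pixels(res):
    """Total pixel count for a (width, height) pair."""
    return res[0] * res[1]

for name, res in [("native 4K", native),
                  ("80% scale", scaled_80),
                  ("DLSS internal", dlss_internal)]:
    share = 100 * pixels(res) / pixels(native)
    print(f"{name:13s} {res[0]}x{res[1]} = {pixels(res):,} px ({share:.0f}% of native)")
```

On those assumed numbers DLSS actually shades fewer pixels than an 80% scale, but it then spends extra time on the upscale pass, which would go some way to explaining why the end frame rates come out similar while the 80% image looks sharper.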
 
The killer for me is the comparisons of DLSS vs just dropping the resolution scale: in all the tests I've seen, if you drop the scaling down to 80% or so you get the same frame rates with a much sharper image, at which point DLSS seems like a complete waste of time :(

These kinds of comparisons make me think there is some kind of bug or misuse of the DLSS models under some settings.

Nvidia have state-of-the-art DL super-resolution expertise; their results are well published and far better than standard upscaling algorithms. So it is a bit of a mystery why in some cases DLSS is performing so badly.

Even more of a mystery is why Nvidia have let this go through their QA process. It is making them look bad, which is a shame because such techniques have huge potential to get us to high-quality, higher-resolution displays. At 4K there is an insane number of pixels, but the actual information content is far less.

I hope to see big improvements in the coming months, but undeniably this is a great balls-up on Nvidia's part. The worst part is they will now have an uphill battle to convince the skeptics that this is indeed a worthwhile approach for the future.
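For anyone wondering what "DL super-resolution" actually looks like in code, below is a minimal, purely illustrative sketch of a learned upscaler (an ESPCN-style sub-pixel convolution network in PyTorch). It is not Nvidia's DLSS network, which is not public and is certainly far larger and uses temporal data; the layer sizes here are arbitrary.

```python
# Minimal sketch of a learned super-resolution model (ESPCN-style sub-pixel
# convolution). Purely illustrative; not Nvidia's actual DLSS network.
import torch
import torch.nn as nn

class TinySuperRes(nn.Module):
    def __init__(self, scale: int = 2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            # Predict scale^2 * 3 channels, then rearrange them into a
            # scale-times-larger RGB image.
            nn.Conv2d(32, 3 * scale * scale, kernel_size=3, padding=1),
            nn.PixelShuffle(scale),
        )

    def forward(self, low_res: torch.Tensor) -> torch.Tensor:
        return self.body(low_res)

model = TinySuperRes(scale=2)
frame = torch.rand(1, 3, 270, 480)   # small stand-in frame; a real input would be e.g. 1080p
with torch.no_grad():
    upscaled = model(frame)
print(upscaled.shape)                # torch.Size([1, 3, 540, 960])
```

The training step (not shown) is where the "supercomputer oven" comes in: a network like this only produces plausible detail for content resembling what it was trained on, which is one reasonable explanation for why results vary so much from game to game.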
 
These kinds of comparisons make me think there is some kind of bug or misuse of the DLSS models under some settings.

Nvidia have state-of-the-art DL super-resolution expertise; their results are well published and far better than standard upscaling algorithms. So it is a bit of a mystery why in some cases DLSS is performing so badly.

Even more of a mystery is why Nvidia have let this go through their QA process. It is making them look bad, which is a shame because such techniques have huge potential to get us to high-quality, higher-resolution displays. At 4K there is an insane number of pixels, but the actual information content is far less.

I hope to see big improvements in the coming months, but undeniably this is a great balls-up on Nvidia's part. The worst part is they will now have an uphill battle to convince the skeptics that this is indeed a worthwhile approach for the future.

It's down to RTX (DXR/DLSS support) not having enough horsepower. Some of us have been saying this since launch; some just would not listen. This carry-on with Nvidia is nothing new, e.g. Maxwell's async compute, Pascal's HDR performance. All we can do is hope the 30 series goes some way to correcting the mess known as RTX.
 
It's down to RTX (DXR/DLSS support) not having enough horsepower. Some of us have been saying this since launch; some just would not listen. This carry-on with Nvidia is nothing new, e.g. Maxwell's async compute, Pascal's HDR performance. All we can do is hope the 30 series goes some way to correcting the mess known as RTX.

DLSS has absolutely nothing to do with RTX. It doesn't touch the RT cores at all. I understand your confusion because Nvidia marketed the two features on the same new hardware, but they are entirely different.

DLSS uses the Tensor cores to accelerate the deep-learning network inference. It does this just fine; the Tensor cores in Turing are extremely powerful. The upscaling does take a few milliseconds, so more Tensor cores would cut this down, but it wouldn't make a difference to image quality.

The performance of RTX is an entirely different debate.
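To put that "few milliseconds" in perspective, here is a quick frame-budget sketch; the per-frame upscaling costs used are made-up illustrative numbers, not measured DLSS timings.

```python
# How a fixed per-frame upscaling cost eats into the frame-time budget at
# different target frame rates. The costs below are illustrative assumptions.

upscale_cost_ms = [1.5, 2.0, 3.0]   # assumed per-frame upscaling cost (ms)
target_fps = [60, 100, 144]

for fps in target_fps:
    budget_ms = 1000.0 / fps        # total time available per frame
    for cost in upscale_cost_ms:
        share = 100.0 * cost / budget_ms
        print(f"{fps:3d} fps ({budget_ms:5.2f} ms/frame): "
              f"{cost:.1f} ms upscale = {share:4.1f}% of the frame")
```

The fixed cost matters more the higher the target frame rate, which fits the point above: more Tensor cores would shave frame time, but they would not change the image the network produces.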
 


"A report via PCGamesN places AMD's stance on NVIDIA's DLSS as a rather decided one: the company stands for further development of SMAA (Enhanced Subpixel Morphological Antialiasing) and TAA (Temporal Antialising) solutions on current, open frameworks, which, according to AMD's director of marketing, Sasa Marinkovic, "(...) are going to be widely implemented in today's games, and that run exceptionally well on Radeon VII", instead of investing in yet another proprietary solution. While AMD pointed out that DLSS' market penetration was a low one, that's not the main issue of contention. In fact, AMD decides to go head-on against NVIDIA's own technical presentations, comparing DLSS' image quality and performance benefits against a native-resolution, TAA-enhanced image - they say that SMAA and TAA can work equally as well without "the image artefacts caused by the upscaling and harsh sharpening of DLSS."

AMD Doesn't Believe in NVIDIA's DLSS, Stands for Open SMAA and TAA Solutions
 
Nowhere was it written or said what AMD likes or dislikes; their point of view is what is right. And what is right is confirmed by the majority of people, where 85% share AMD's opinion.
Are you sure it is 85%? I am rubbish at math but you are even worse lol :D :D
 
Things take time. It's new tech, which requires a big-ass supercomputer and the game code to make it work properly.

As it stands at the moment, DLSS is no good for me, but give it time and it will be.
 
Things take time. It's new tech, which requires a big-ass supercomputer and the game code to make it work properly.

As it stands at the moment, DLSS is no good for me, but give it time and it will be.

You've been bent over backwards by Nvidia and you're still defending their lacklustre tech?

The 2080 series has been a disgrace from the start.
 
DLSS will be a fail if Nvidia cannot produce these massive GPU dies with RTX cores in the future. DLSS is massively reliant on the ability of the RTX cores to perform the de-noising operations for both image quality and frame rate.

Currently it's not effective to use RTX and DLSS unless you have a 2080/2080 Ti, and those are still hugely underwhelming compared to what Nvidia has touted RTX and DLSS to be. There's no way they'll be able to do these massive 700 mm²+ GPU dies on 7nm; the cost for wafers at 7nm is almost 2x as much as on 14nm. Something as big as the Turing GPUs would most likely cost in excess of 2x over 14nm, so you'd be looking at easily £1,000+ for an xx80 and £1,500+ for an xx80 Ti.

I'm guessing that DLSS will never be as good as people want it to be and at some point in the next few generations, Nvidia will abandon it and just focus on ray tracing. There's way too much going against it, especially with 4K being increasingly common and 8K PC gaming set to debut in the near future.
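A back-of-the-envelope version of that cost argument is sketched below. The wafer prices, defect density and yield model are all illustrative assumptions, not real foundry figures; only the rough die size is based on the published TU102 (2080 Ti class) figure.

```python
import math

# Back-of-the-envelope die cost comparison between two nodes. Wafer prices,
# defect density and the simple yield model are illustrative assumptions.

WAFER_DIAMETER_MM = 300
DEFECTS_PER_CM2 = 0.1                  # assumed defect density

def dies_per_wafer(die_area_mm2):
    """Crude estimate that ignores edge losses and scribe lines."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    return int(wafer_area / die_area_mm2)

def yield_fraction(die_area_mm2):
    """Simple Poisson yield model: bigger dies are far more likely to catch a defect."""
    return math.exp(-DEFECTS_PER_CM2 * die_area_mm2 / 100)

def cost_per_good_die(die_area_mm2, wafer_cost):
    good = dies_per_wafer(die_area_mm2) * yield_fraction(die_area_mm2)
    return wafer_cost / good

DIE_AREA_MM2 = 754                     # roughly a TU102-sized die

# Assumed wafer prices, with 7 nm at about twice the cost of the older node.
for node, wafer_cost in [("14/12 nm", 4000), ("7 nm", 8000)]:
    print(f"{node}: ~${cost_per_good_die(DIE_AREA_MM2, wafer_cost):.0f} per good {DIE_AREA_MM2} mm^2 die")
```

On these made-up numbers, doubling the wafer price roughly doubles the cost of a good die of the same size, which is the crux of the argument; in practice 7 nm would also shrink the die, so this is the pessimistic end of the estimate.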
 