I haven't insulted you; I have just pointed out that you have been living in denial since these cards were announced. When there was no mention of performance in normal games during the launch event, some people started worrying that the performance might not be great, but you were like, no, the performance will be amazing. Then when the interview with Tom Petersen was released where he said what the performance was going to be, you had a ton of excuses for why that might not be the actual performance, and why it could be much better. Then finally the reviews came out and now it's, oh, when DLSS comes out in games these cards are going to rock. And you, and others, have also said that these cards will perform much better once the drivers mature. I'll come back to these two points later.
One game and one tech demo does not a card make. If this was an AMD card, you would laugh at me if I recommended purchasing one based on potential future performance from a marketing slide and one game.
Which leads on to driver maturity. Why are people hoping for a big boost in performance from drivers? Isn't this why people buy Nvidia? On countless driver threads people are always saying that they go Nvidia because they get 100% performance from the start. I am sure there will be the usual driver increases, a few percent here and there in some games, especially new ones. But those people waiting for a miracle driver boost in normal games will be waiting a long time.
Now, DLSS. Before the reviews were released, DLSS was the one thing I was most excited about with the Turing cards. I defended DLSS in a discussion with
@FoxEye and
@bru. Bru and I both had high hopes, and FoxEye, well, was being FoxEye. But since the reviews, and now that more info has come to light, I am beginning to think that FoxEye was right and Bru and I were wrong. I was really looking forward to the reviews because I wanted to see what DLSS was capable of. Not one game using DLSS in any review?? Does this not raise any red flags for you? Well, it did for me. I started having doubts. Then when Hardware Unboxed did their analysis of everything they knew about DLSS, I had even more doubts. They concluded that running a game with DLSS is about the same as running it at 1800p. Then of course there were other red flags: why were reviewers not allowed to test it against other forms of AA? If the performance gain comes from actually rendering at a reduced resolution, just one where it's really hard to tell the difference from real 4K, is it a real performance boost? For people who want to run at the highest settings, DLSS is then just the same as reducing the settings that make very little visual difference to get more performance.
And it still worries me that there are no games using DLSS yet.
Oh, and you are wrong about one thing too. For DLSS, Nvidia aren't relying on third parties at all; the performance will be entirely down to Nvidia. Developers just decide whether they want it or not. If you turn on DLSS in a game, you will be using Nvidia's algorithms, trained on their supercomputer and run on the Tensor cores on the card.
As for me defending my 1080 Ti? LOL, bizarre. In threads asking for advice between the 1080 Ti and the 2080, I recommend the 1080 Ti based on what we actually know; you recommend spending extra on the 2080 based on unknown future performance in two technologies that are only going to be used in a very small number of games.