• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

2070 is a TU106

Soldato
Joined
30 Nov 2011
Posts
11,375
All your DLSS does is run an algorithm that was created by training a neural network, rather than one explicitly written by an engineer.

You might as well say "render web pages in your browser, leveraging the power of 1000 software engineers!".

Mmm, sort of. Turing also has Tensor cores, which are optimised for this sort of work. Similarly to ray tracing, doing the same thing on a non-Turing card would take longer as well as taking cycles away from the main CUDA cores. Much like the roughly 6x improvement for RT, the Tensor cores do the "same maths" roughly 12x faster than a card without them, so they enable a technique that simply wouldn't be practical on a Pascal-based card.
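To make the "same maths, much faster" point concrete, here is a minimal sketch (mine, not from the post) of the kind of mixed-precision matrix multiply that Tensor cores accelerate. It assumes PyTorch and a CUDA GPU; the exact ratio you'd measure depends heavily on the card and the matrix sizes.

```python
# Rough illustration only: timing an FP32 matrix multiply on the ordinary
# CUDA cores against an FP16 one that is eligible to run on Turing's
# Tensor cores. Assumes PyTorch and a CUDA-capable GPU.
import time
import torch

def time_matmul(dtype, n=4096, iters=20):
    a = torch.randn(n, n, device="cuda", dtype=dtype)
    b = torch.randn(n, n, device="cuda", dtype=dtype)
    torch.cuda.synchronize()
    start = time.time()
    for _ in range(iters):
        a @ b
    torch.cuda.synchronize()
    return (time.time() - start) / iters

fp32 = time_matmul(torch.float32)   # plain CUDA-core path
fp16 = time_matmul(torch.float16)   # Tensor-core-eligible path on Turing
print(f"FP32 {fp32 * 1e3:.1f} ms, FP16 {fp16 * 1e3:.1f} ms, ratio {fp32 / fp16:.1f}x")
```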
 
Soldato
Joined
1 Dec 2003
Posts
6,476
Location
Kent
This might be a bit pedantic of me, but calling DLSS an algorithm and comparing it to traditional AA techniques (which are all purely algorithmic) isn't quite accurate. While I know what you mean, it's not really the correct terminology.

DLSS actually runs a trained AI over images; in this case, each image is a single frame of a given game.

1. A game is rendered at ultra high resolution by NVidia's supercomputer, and is also rendered at a traditional resolution.
2. The AI then scans the pixels of each frame/image, comparing the high- and low-resolution versions.
3. At the end of each scan of each frame, the AI receives feedback as to the difference between the pixels it has generated and the pixels in the ultra high resolution frame.
4. These operations are run over and over again, with the AI adjusting itself on each pass, until eventually it gets close to the high-res image.
5. The longer the AI is trained, the better the resulting output.
6. This training model is then saved and sent out to users, who then have an AI that is fully trained to make each game look as close to the ultra high resolution version as possible.
7. When the users run the game with DLSS on, the AI is engaged and scans the pixels each and every frame, using all of its precomputed training knowledge in order to modify the resulting output.

I expect that the end result will be combined in a pixel shader, which would have been operating over each pixel in the frame anyway, so in terms of computation required at the user level this should be very nearly free.

It really is a paradigm shift in terms of real-time graphics, and going forward, as the power of these AIs continues to grow, I expect that we will see this kind of technique improving the end result all over the place. A rough sketch of the training loop described above follows.
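Here is a hypothetical, heavily simplified sketch of that offline training loop. It is not NVIDIA's code: the UpscaleNet model, the layer sizes and the L1 loss are all stand-ins for whatever DLSS actually uses, but the step numbers map onto the list above.

```python
# Hypothetical sketch of the offline training loop described above: a network
# learns to turn a low-res frame into something close to the ultra-high-res
# "ground truth" render.
import torch
import torch.nn as nn

class UpscaleNet(nn.Module):
    """Toy 2x upscaling network standing in for the real DLSS model."""
    def __init__(self, scale=2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 3 * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),   # rearranges channels into a higher-res image
        )

    def forward(self, x):
        return self.body(x)

net = UpscaleNet()
opt = torch.optim.Adam(net.parameters(), lr=1e-4)
loss_fn = nn.L1Loss()                 # per-pixel difference vs the high-res frame

def training_step(low_res_frame, ultra_high_res_frame):
    predicted = net(low_res_frame)                    # step 2: network produces its own pixels
    loss = loss_fn(predicted, ultra_high_res_frame)   # step 3: feedback on the difference
    opt.zero_grad()
    loss.backward()                                   # step 4: the network adjusts itself
    opt.step()
    return loss.item()

# Steps 4-5: repeat over many frames; the longer it trains, the closer the
# output gets to the high-res render. The trained weights are then shipped
# to users (step 6), and only the cheap forward pass runs per frame (step 7).
```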
 
Soldato
Joined
20 Dec 2004
Posts
15,834
I don't think anyone is disagreeing here. It's a supersampling filter that's generated by a neural network, and the Turing cards have specialist cores for running the filter, so the AA workload is taken off the main rendering pipeline.

Performance win? Well, it takes the AA off the main rendering pipeline... but you've had to add silicon to do it. We'll have to see what the results are like once the NDAs are lifted. It's quite possible you'd get similar results by using the Tensor core die space for more raw GPU grunt.

I've not really looked into the API yet; frankly, it's the applications of the Tensor cores other than DLSS that are more interesting. Provided the benchmarks for the 2070 look decent, I'll be picking one up and getting stuck into the code myself.
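For what it's worth, here is a rough sketch (mine, not NVIDIA's actual API) of what "taking the filter off the main pipeline" looks like from the application side: a single forward pass of a pre-trained, per-game model over each rendered frame, run in FP16 so it can be dispatched to the Tensor cores. The model file name is made up.

```python
# Hypothetical sketch, not NVIDIA's API: running a shipped, pre-trained
# upscaling model as a per-frame post-process. Assumes PyTorch and a
# TorchScript model file standing in for whatever the driver/game provides.
import torch

# "dlss_game_profile.pt" is a made-up file name for the per-game model.
model = torch.jit.load("dlss_game_profile.pt").half().cuda().eval()

@torch.no_grad()
def post_process(low_res_frame: torch.Tensor) -> torch.Tensor:
    # low_res_frame: (1, 3, H, W) FP16 tensor already on the GPU.
    # This forward pass is the whole per-frame cost; in FP16 it can run on
    # the Tensor cores while the CUDA cores get on with other work.
    return model(low_res_frame)

# In a render loop this would be called once per frame, just before the
# final composite/tonemap step.
```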
 

bru

Soldato
Joined
21 Oct 2002
Posts
7,360
Location
kent
The rumours are that the 2080 is just 6% faster than the 1080 Ti, based on benchmarking figures uploaded to various sites - anonymously, of course.

https://www.dsogaming.com/news/nvid...pectively-than-gtx1080ti-in-final-fantasy-xv/

Yup, so in that one game the 2080 Ti is 33% faster, whereas the 2080 is only 6% faster, both without DLSS.

You can downplay it all you like, but the performance in that game is very good.
Better still, look at the graphs: the 2080 Ti kicking the Titan V's butt.
[Final Fantasy XV benchmark graphs: FF-bench1.jpg, FF-bench2.jpg]
 
Soldato
Joined
19 Dec 2010
Posts
12,027
Yup, so in that one game the 2080 Ti is 33% faster, whereas the 2080 is only 6% faster, both without DLSS.

You can downplay it all you like, but the performance in that game is very good.
Better still, look at the graphs: the 2080 Ti kicking the Titan V's butt.
[Final Fantasy XV benchmark graphs: FF-bench1.jpg, FF-bench2.jpg]


Well, it's only 13 to 16% faster than a non-gaming card. Hardly what I would call a butt kicking.

But anyway, all these leaks seem to be confirming one thing: Tom Petersen's numbers were spot on when he said 35 to 45% faster than the previous gen in normal games. That puts the 2080 on par with the 1080 Ti, and the 2080 Ti 35 to 45% faster, apart from a few outliers.
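A quick back-of-the-envelope version of that reasoning, using made-up frame rates (the real review numbers were still under NDA at this point) and a rough ~30% gap between the 1080 and the 1080 Ti:

```python
# Back-of-the-envelope check of the claim above, with hypothetical numbers.
gtx_1080 = 60.0                  # made-up fps in some game
gtx_1080_ti = gtx_1080 * 1.30    # the 1080 Ti is typically ~25-30% ahead of the 1080

for uplift in (0.35, 0.45):      # the quoted 35-45% gen-on-gen gain
    rtx_2080 = gtx_1080 * (1 + uplift)
    gap = rtx_2080 / gtx_1080_ti - 1
    print(f"+{uplift:.0%} uplift: RTX 2080 ~{rtx_2080:.0f} fps vs 1080 Ti ~{gtx_1080_ti:.0f} fps ({gap:+.0%})")

# A 35-45% jump over the 1080 lands the 2080 roughly level with (to slightly
# ahead of) the 1080 Ti, which is the "on par" claim; the same uplift applied
# to the 1080 Ti gives the 2080 Ti figure.
```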
 
Soldato
Joined
19 Dec 2010
Posts
12,027
Well, only a few more days and then one of us can come back and say "Yup, I was wrong". :)

Well, it's not looking like the 2070 will be faster than the Titan Xp, as the 2080 is basically on par with the 1080Ti.

There are no good reviews of DLSS apart from a couple of canned benchmarks. But, if the 2080 ends up 40% faster than the 1080 Ti using DLSS, that means the 2070 will still be slower than the Titan Xp even with DLSS on?

So we are sort of still in the same boat we were a couple of days ago. DLSS is still a mystery and 2070 performance is still up in the air.
 

bru

Soldato
Joined
21 Oct 2002
Posts
7,360
Location
kent
Well, it's not looking like the 2070 will be faster than the Titan Xp, as the 2080 is basically on par with the 1080Ti.

There are no good reviews of DLSS apart from a couple of canned benchmarks. But, if the 2080 ends up 40% faster than the 1080 Ti using DLSS, that means the 2070 will still be slower than the Titan Xp even with DLSS on?

So we are sort of still in the same boat we were a couple of days ago. DLSS is still a mystery and 2070 performance is still up in the air.

Yup, on the first reviews I've read it certainly looks like you were right and I was wrong; that's what I get for believing in CEO maths. :p

So irritating that we still have no idea about DLSS. I have high hopes for that, more so than for the ray tracing, which I'm sure will be good, but it is very early days yet.
 
Soldato
Joined
19 Dec 2010
Posts
12,027
Yup, on the first reviews I've read it certainly looks like you were right and I was wrong; that's what I get for believing in CEO maths. :p

So irritating that we still have no idea about DLSS. I have high hopes for that, more so than for the ray tracing, which I'm sure will be good, but it is very early days yet.

Well, lol, I wasn't saying you were wrong. You still might be right; he might have been talking about games, and there still might be that one outlier game that the 2070 is faster in. :p

And yeah, it sucks about DLSS. Surely they could have had one game ready for launch!!
 
Soldato
Joined
9 Nov 2009
Posts
24,827
Location
Planet Earth
This might be a bit pedantic of me, but calling DLSS an algorithm and comparing it to traditional AA techniques (which are all purely algorithmic) isn't quite accurate. While I know what you mean, it's not really the correct terminology.

DLSS actually runs a trained AI over images; in this case, each image is a single frame of a given game.

1. A game is rendered at ultra high resolution by NVidia's graphics card, and is also rendered at a traditional resolution.
2. The ANN (Artificial Neural Network) then scans the pixels of each frame/image, comparing the high- and low-resolution versions.
3. At the end of each scan of each frame, the ANN receives feedback as to the difference between the pixels it has generated and the pixels in the ultra high resolution frame.
4. These operations are run over and over again, with the ANN adjusting itself on each pass, until eventually it gets close to the high-res image.
5. The longer the ANN is trained, the better the resulting output.
6. This training model is then saved and sent out to users, who then have an ANN that is fully trained to make each game look as close to the ultra high resolution version as possible.
7. When the users run the game with DLSS on, the ANN is engaged and scans the pixels each and every frame, using all of its precomputed training knowledge in order to modify the resulting output.

Well, my pedantry got activated too!! :p

It's not really an AI in the traditional sense as most would understand it; AI is used as a marketing term to cover everything nowadays. They are using hardware-based neural networks to look for certain patterns in the native images to create a model which can then be applied on a per-game basis.
 

bru

Soldato
Joined
21 Oct 2002
Posts
7,360
Location
kent
Well, lol, I wasn't saying you were wrong. You still might be right; he might have been talking about games, and there still might be that one outlier game that the 2070 is faster in. :p

And yeah, it sucks about DLSS. Surely they could have had one game ready for launch!!

Nah, for the 2070 to have a chance the 2080 had to be a healthy 10-15% faster than the outgoing 1080 Ti, and from what I've read so far it looks lucky to stretch to 5%. As you say, there might be that single bench where it does it, but that's not really in the spirit of the phrase "the $499 2070 is higher performance than the $1200 Titan XP."
Maybe with DLSS, when they finally get to show us what it can do.
 