
OcUK Nvidia RTX series review thread

To the OP, would it be possible to have a sub-section for sites that didn't sign up to the Nvidia NDA?

I don't know about everyone else, but I never trust reviews from sites that are tied into NDAs.

TBF, I suspect most sites would have signed the NDA unless they bought the cards themselves.

Any site that has a review out now has signed the NDA. You had to sign the NDA to get proper drivers before release to do the review.
 
Turing is about 30% faster than Pascal.

Man, TU104 is 545 mm² with 13.6 billion transistors and a higher boost frequency, while GP102 is 471 mm² with 12 billion transistors.
The absolute performance difference is 8-9%.

If you normalise per transistor and per clock, Turing may actually be slower!
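
For what it's worth, here's a quick back-of-the-envelope version of that normalisation. The transistor counts and the 8-9% figure come from the post above; the boost clocks (roughly 1800 MHz for the RTX 2080's TU104 and 1582 MHz for the 1080 Ti's GP102) are assumed reference values, so treat the output as illustrative only.

```python
# Back-of-the-envelope normalisation of the figures quoted above.
# Boost clocks are assumed reference values, not measured in-game clocks.
tu104 = {"transistors_bln": 13.6, "boost_mhz": 1800}  # RTX 2080
gp102 = {"transistors_bln": 12.0, "boost_mhz": 1582}  # GTX 1080 Ti

relative_perf = 1.085  # midpoint of the 8-9% difference cited above

perf_per_transistor = relative_perf / (tu104["transistors_bln"] / gp102["transistors_bln"])
perf_per_clock = relative_perf / (tu104["boost_mhz"] / gp102["boost_mhz"])

print(f"Turing perf per transistor vs Pascal: {perf_per_transistor:.2f}x")  # ~0.96x
print(f"Turing perf per clock vs Pascal:      {perf_per_clock:.2f}x")       # ~0.95x
```

Of course, as the reply below points out, a chunk of those extra transistors goes on RT and tensor hardware that conventional benchmarks don't exercise, so per-transistor numbers only tell part of the story.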
 
If you normalise per transistor and per clock, Turing may actually be slower!

Fortunately we are not doing it on transistor count.
 
I don't mean to sound like a dick, but that's simply not true.

In fact, given that we can actually see the image quality in screenshots, and have several reviews that are in direct contradiction to your claim, I find your assertion to be quite bizarre.

While I'm not a games developer, I've spent nearly a decade as a professional app developer, several of those years focused almost exclusively on 3D graphics, and I can tell you categorically that taking a native 1440p image and interpolating the extra detail needed to build a result comparable to a native 4K image is a fantastic achievement.

The end result is only distinguishable in stills, and even then you can't tell the difference in many of them.

This is something that could only be achieved, at least with the quality that we're seeing, by using a trained AI. Old upscaling algorithms are simply dreadful in comparison.
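
To make that contrast concrete, here's a minimal sketch (NumPy, with hypothetical names) of what a classic interpolation upscaler actually does: every output pixel is just a weighted blend of pixels that already exist in the 1440p frame, so it can never add detail. A DLSS-style approach replaces that blend with inference from a network trained against very high quality ground-truth frames, which is why it can reconstruct detail that plain interpolation cannot.

```python
import numpy as np

def bilinear_upscale(img: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    """Classic bilinear interpolation for a 2D (grayscale) image.

    Each output pixel is a weighted average of its four nearest source
    pixels -- it can only blend existing information, never invent detail.
    """
    in_h, in_w = img.shape
    ys = np.linspace(0.0, in_h - 1.0, out_h)
    xs = np.linspace(0.0, in_w - 1.0, out_w)
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, in_h - 1)
    x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None]  # vertical blend weights
    wx = (xs - x0)[None, :]  # horizontal blend weights

    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

# 1440p -> 4K: the upscaler fills in 2.25x as many pixels purely by blending.
frame_1440p = np.random.rand(1440, 2560)
frame_4k = bilinear_upscale(frame_1440p, 2160, 3840)
# A DLSS-style reconstruction would replace the blend above with inference
# from a network trained on supersampled ground-truth frames.
```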

Here's what we know from the reviewers who've seen DLSS first hand.

- At 4K, DLSS X1 (1440p internally) can increase performance over native 4K by anything up to 50%.
- The detail loss between native 4K and 4K DLSS X1 is minimal and barely noticeable, but it does exist.
- In some aspects, particularly transparency, DLSS X1 at 4K is actually better than native 4K with TAA.
- DLSS X2 runs internally at full 4K and has all of the AA improvements over TAA, but should still see a small performance boost because it removes the need for algorithmic, software-based AA passes.

If you didn't use DLSS and merely dropped the resolution to 1440p, sure, you'd get the same sort of performance boost, but the picture would be hugely inferior, especially on a 4K monitor that's doing a poor job of upscaling it for you.

Nothing is as simple as "turning down the detail".
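
The raw pixel counts show why the "up to 50%" figure is plausible rather than magic. A quick sketch of the arithmetic:

```python
# Pixel-count arithmetic behind the "up to 50%" figure.
native_4k = 3840 * 2160        # 8,294,400 pixels
internal_1440p = 2560 * 1440   # 3,686,400 pixels
print(native_4k / internal_1440p)  # 2.25 -- the theoretical shading-work ratio

# Shading work roughly scales with pixel count, so rendering internally at
# 1440p leaves up to 2.25x of headroom. The observed "up to 50%" is lower
# because geometry, post-processing and the tensor-core upscale itself
# don't shrink with the internal resolution.
```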


I was using the video NV released to point out the lack of detail.

If you watch and zoom in on the girl, you can see the lack of detail with DLSS turned on.
So all it is doing is lowering the detail for you to get more FPS.

https://imgur.com/a/es22dfH

Click on each picture to see the difference
 

I cannot see the difference on the pad but they all look horribly compressed.
 
So all it is doing is lowering the detail for you to get more FPS.

Like I said, the end result is a very slight loss of detail when compared with native 4K, but saying that they’re merely “turning down the detail” to get a boost in performance is intellectually dishonest.

If you merely “turned down the detail” yourself, to a level that matched the performance of 4K DLSS X1, the image would look nowhere near as good as what we are seeing here.
 
So all it is doing is lowering the detail for you to get more FPS.

But it's not.

It seems like you didn't read gordyr's explanation, which, speaking as someone with game dev experience, is spot on btw.
 
Does DLSS x2 still boost performance as shown with DLSS in benchmarks?

No.

It is still actually rendering a 4K image, so there's no performance gain, although you supposedly get SSAA (DLSS 2X, to be exact) more cheaply, possibly for free.
 
So the reason DLSS performance is higher rather than lower in the benchmarks comes down to the lower internal resolution, I guess?
 
So the reason DLSS performance is higher rather than lower in the benchmarks comes down to the lower internal resolution, I guess?

Yes, but the tensor cores supposedly make up the difference by guessing what a 4K image actually looks like from a 1440p image. Think of it as a very advanced upscaler.

We have yet to see how it performs outside of what are basically cutscenes.
 
We have yet to see how it performs outside of what are basically cutscenes.

They're a step above cutscenes, as they use the game engine; it's just on rails, but it does give a decent indication. It won't be until developers get their hands on it and we see actual playable in-game results that we'll really know how effective it's going to be. It's very promising technology, though, and could put the 2080 well ahead of the 1080 Ti in the games that support it. There's certainly far more reason to be excited about DLSS than ray tracing, in the short term anyway.
 
Does DLSS x2 still boost performance as shown with DLSS in benchmarks?

There will still be a performance boost, as the GPU is effectively running the scene with no AA, whereas pretty much all other forms of AA use up some performance (except FXAA, but that's not really AA, just a blur filter).
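
A rough illustration of that point, with made-up frame times (the ~10% TAA cost below is an assumption, not a measurement):

```python
# Illustrative only: how skipping a software AA pass shows up as FPS.
frame_ms_no_aa = 16.0       # hypothetical native-4K frame time without any AA
assumed_taa_cost_ms = 1.6   # assume TAA adds roughly 10% to the frame

fps_with_taa = 1000 / (frame_ms_no_aa + assumed_taa_cost_ms)  # ~56.8 FPS
fps_dlss_2x = 1000 / frame_ms_no_aa                           # 62.5 FPS, if the
                                                              # tensor-core AA is ~free
print(f"TAA: {fps_with_taa:.1f} FPS, DLSS 2X: {fps_dlss_2x:.1f} FPS")
```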
 
It seems like you didn't read gordyr's explanation, which, speaking as someone with game dev experience, is spot on btw.


It looks like you never saw a side-by-side comparison of the two (DLSS vs TAA). DLSS has loads of detail missing.
 