
OcUK Nvidia RTX series review thread

Soldato
Joined
27 Feb 2015
Posts
12,621
Yeah, I consider that bonus performance; raw performance is something I expect to work in most games.

So to me, performance benefits from, say, DLSS are not considered raw performance.
 
Associate
Joined
30 Sep 2018
Posts
4
If Nvidia hadn't included the RTX and tensor cores then "raw" performance would probably be better. Maybe they should have done that, because can you imagine a 2080ti with twice as many shaders? It would be not only a 4K 60fps card but a 120fps one, and people wouldn't need to wait for new games supporting a totally new technology.

https://youtu.be/63dVCMQelAI
Finally YT gameplay videos from normal guys are starting to show up. Here's Far Cry 5 with nearly 100 fps maxed out.
 
Caporegime
Joined
18 Oct 2002
Posts
32,618


I thought about this, and on the face of it it makes even more sense: since the 10nm node wasn't suitable, the best way to maximise performance would have been to delay the RTX and Tensor cores until 7nm.

However, they wouldn't have doubled the number of cores; the RTX and tensor cores don't take up that much space. They would maybe have fitted another 30% more CUDA cores, but that might only give 15-20% more performance. That would have been enough to earn more positive reviews.

But I believe what Nvidia are trying to do is force a change in technology roadmaps that will put AMD off guard. It is no longer the case that AMD "only" has to worry about performance and power efficiency; they now have to add RTX cores and something to combat the increasing use of Tensor cores. You have to start somewhere, and by releasing RTX-capable cards now and working with developers, the environment will be very different when 7nm GPUs are released with twice the RTX power.


Nvidia are smart, and consistently make good decisions about what technologies and architectures to pursue, and when.
 
Soldato
Joined
27 Feb 2015
Posts
12,621
The reason I feel for the change of tack by Nvidia is that Pascal is almost an end-game generation, in the sense that it's almost fast enough that one would potentially never need a GPU upgrade again, or at least not for a long time.

They needed a new technology in graphics that slows down existing GPUs so people have a reason to upgrade, and raytracing is that tech. Traditionally what has kept people upgrading is ever-increasing resolutions, and more recently higher framerates as well. The successor to 4K will possibly come one day, but not early enough for the next gen of cards.
 
Caporegime
Joined
17 Feb 2006
Posts
29,263
Location
Cornwall
Do you realise how often people have said similar things in the past...

"You'll never need more than 100 MB disk space." This was back in 1990-something :p
 
Soldato
Joined
19 May 2012
Posts
3,633
Total performance improvement in current games ranges from an unimpressive 25% to a very impressive 68% in Wolfenstein 2. In the most demanding location the 1080ti gets 50fps minimum and 61fps average, while the 2080ti gets 90fps minimum and 103fps average (111fps with OC).

The average performance improvement is around 30%, but if I take into account just the games that run with problems on the 1080ti, then those particular games run around 40-50% better now.

Hellblade - 42fps on 1080ti, 69fps on stock 2080ti
Rise of the Tomb Raider - 59fps on 1080ti, 75fps on 2080ti
Shadow of the Tomb Raider - 40fps on 1080ti, 59fps on 2080ti; with high detail the 2080ti, unlike the 1080ti (50fps), runs the game at a very impressive 80fps.
The Witcher 3 - 59fps before on the 1080ti, now 81fps on the 2080ti, so basically the game will run above 60fps 99% of the time on the 2080ti, and below that target 99% of the time on the 1080ti. There are also other games like Crysis 3, Mass Effect Andromeda, The Evil Within 2, TC The Division, The Crew 2 and many others that run around 40-50fps on the 1080ti and now run either above 60fps or very close to it (a very basic detail tweak should already give locked 60fps gameplay).
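Just to show the arithmetic behind those numbers (the fps pairs are the ones quoted above, and the percentage is simply the relative change):

```python
# The quoted fps pairs (1080ti, stock 2080ti); the uplift is just
# (new - old) / old, expressed as a percentage.
results = {
    "Hellblade": (42, 69),
    "Rise of the Tomb Raider": (59, 75),
    "Shadow of the Tomb Raider": (40, 59),
    "The Witcher 3": (59, 81),
}
for game, (fps_1080ti, fps_2080ti) in results.items():
    uplift = (fps_2080ti - fps_1080ti) / fps_1080ti * 100
    print(f"{game}: {uplift:.0f}% faster")
```

Hellblade works out to about 64% and Shadow of the Tomb Raider to about 48%, with the other two in the high 20s to high 30s, which matches the 25-68% spread claimed.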

So I'd say 25-68% in current games is good performance already; 2080ti results look very similar to GTX 1080 SLI (the same cost as a single 2080ti), and future games will show even better performance. Native HDR support will help (in some games the 1080ti is 20% slower in HDR, while the 2080ti should not have problems like that), and new shading features like variable rate shading alone should improve performance by 20%. DLSSx1 should give an additional 50% up to 100% (min fps on the 2080ti in Final Fantasy is 18fps without DLSS and 40fps with DLSSx1), and DLSSx2 should provide SSAA quality, although we still don't know how much performance DLSSx2 will cost. DLSSx1 renders internally at 1440p, yet performance is much worse than native 1440p, so DLSSx2 will probably cost some additional fps too compared with native 4K, although probably not as much as native 4K plus SSAAx4.
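On the DLSS point, a quick pixel count shows why an internal 1440p render is so much cheaper than native 4K (this is pure resolution arithmetic, nothing Nvidia-specific):

```python
# Pixel counts for native 4K vs the 1440p internal render DLSS 1 uses.
pixels_4k    = 3840 * 2160   # 8,294,400 pixels
pixels_1440p = 2560 * 1440   # 3,686,400 pixels
ratio = pixels_1440p / pixels_4k
print(f"1440p shades {ratio:.0%} of the pixels of native 4K")  # 44%
```

So the upscaling and tensor-core work only has to cost less than the roughly 56% of shading saved for DLSS to come out ahead of native 4K.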

There's also the RTX feature. In the Star Wars RTX tech demo the 1080ti scores 9fps minimum and 11fps average while the 2080ti scores 45-65fps; that's a 6x improvement in performance, although we still don't know how RTX performance will look in real games. According to Digital Foundry, the Gamescom RTX tech demos were prepared in a rush for the RTX GPUs and made with the Titan V in mind first (and that card doesn't have RTX cores). So those tech demos were probably far from optimised, and Digital Foundry also mentioned a very interesting thing: apparently RTX effects are currently calculated 1:1 with the native resolution, and if developers would just render RTX effects at 1080p and then upscale them, the end result should still look great and performance would be good even at higher resolutions.

So to sum it up, although the Turing GPUs are already here we still don't know exactly how fast they are, but because current games only use half of the Turing chip, we can say for sure that with time the performance gap between Pascal and Turing will widen.

I was personally trying to deliberate whether that FPS boost is worth the extra £350 over a 2080.
 
Soldato
Joined
19 Dec 2010
Posts
12,031

Look at the games you play mostly, then look at the fps difference between the cards you are looking at in those games. If there is less than 10fps, you probably aren't going to notice a difference. If both cards are getting over 100fps, and one is getting 140 and the other is getting 110, you probably aren't going to notice the difference there either.

Then decide whether the extra money is worth it or not.
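That advice boils down to a simple decision rule; here's a hypothetical sketch of it (the 10fps gap and the 100fps "already smooth" cutoff are just the thresholds suggested above):

```python
def worth_noticing(fps_cheaper: float, fps_pricier: float) -> bool:
    """Rough rule of thumb: is the fps gap between two cards perceptible?"""
    if fps_pricier - fps_cheaper < 10:  # under ~10fps you won't feel it
        return False
    if fps_cheaper >= 100:              # both cards already very smooth
        return False
    return True

print(worth_noticing(55, 75))    # big gap at sub-60fps -> True
print(worth_noticing(110, 140))  # both well over 100fps -> False
```

Whether the cases where the gap *is* noticeable justify the extra money is, as the post says, still up to you.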
 
Soldato
Joined
19 May 2012
Posts
3,633


I'm gaming at 4k so the difference is mostly 10-20FPS.

Ultimately I've decided not to get it, as the 1080ti/2080 are usually around 60fps on most titles, so it's probably easier to just turn down some graphics options.

Also, getting a 2080ti might require me to buy a whole new PSU etc., which I'd rather not as I CBA. It's also a really slippery slope... when you just add a few hundred on to every component. I probably play <10 PC games a year, so I should probably calm down a little...
 
Soldato
Joined
19 Dec 2010
Posts
12,031

The number of games you play will increase a lot when you get VR. I had mostly stopped gaming before I got my Rift, and now I game all the time: I'm either boxing or playing table tennis to keep fit, having paintball battles in Rec Room, exploring space or the ocean, or meeting people in Bigscreen and VRChat, and that's just scratching the surface.
 
Caporegime
Joined
18 Oct 2002
Posts
39,379
Location
Ireland


Only Asus could put an incredibly cheap blower cooler on a reference board and still manage to price it higher than the FE card that has a superior heatsink on it. The stock asus turbo 2080 isn't listed on here but the ti version is at an LOL worthy £1350. It's even priced higher than third party cards with far better coolers on them. For £1350 you get a hairdryer, and for £150+ LESS you can get cards that are barely audible.
 
Soldato
Joined
28 May 2007
Posts
3,365
Location
Saturn’s moon Titan
It's the name you're paying for, methinks.
 
Soldato
Joined
28 Oct 2011
Posts
8,416
To be fair to ASUS though, the clue is in the name!
 