• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

OcUK Nvidia RTX series review thread

It's GFX card touting


Pretty much, just the same sad story of retailers digging the elbow in on GPU launches. Year in, year out, the same old song and dance: "shortages, Brexit, weak pound, supply and demand, subsidised first batch". Always something they can cite to tack on as much as they want, and it's getting ******* boring.
 
But the Ti isn’t aimed at 1440P, it’s aimed at those who want 4K. The 2080 would be fine @ 1440P, and I doubt the Ti is bottlenecked at 4K.
True, but it's the first time I've read about a CPU bottlenecking a GPU at 1440P, and especially a top-end gaming CPU, apparently the best gaming CPU. You can never have enough GPU power... unless you run out of CPU to drive it :).
 
Not really. If the monitor is a 144Hz one where people aim to push the frame rate close to that rather than 60fps, it's not uncommon for the CPU to be the bottleneck, especially if it's not overclocked.

There are also games whose engines don't make use of all CPU cores, which can also lead to a CPU bottleneck.

My 4670k bottlenecked my gtx 1070 at 1440p in FF15.
Is the game utilising the CPU fully though?
 

It will utilise all cores, yes. It's not a single-threaded game.

A guy has played FF15 on a 24-core Xeon and all cores were utilised in a pretty well balanced way.

But like most games it doesn't utilise HTT; I am talking real cores. The 4670K only has physical cores anyway, and yes, all four cores were briefly hitting saturation when the game stuttered.

Just remember that Task Manager, at its polling interval, reports the average usage for the polling period. So if, for example, the polling period is 2000ms, and for 100ms the CPU was at 100% load but for the other 1900ms it was fully idle, it will report 5% usage. I diagnosed it by seeing CPU utilisation spikes (typically 80% or higher) matching the times of stutter. Once the 8600K was put in the system the problem was resolved. FF15 also has a fair few code problems and isn't a very well optimised game; to get a completely smooth experience someone made a mod for it which reduces the required CPU load by adding optimisations that remove some CPU wastage.
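A quick sketch of that averaging arithmetic (the 2000ms window and 100ms spike are just the example figures above; this isn't how Task Manager is actually implemented, only the averaging it implies):

```python
# Rough sketch: a short 100% spike inside a 2000 ms polling window averages
# out to a small reported figure, which is why stutter can hide behind low
# CPU usage numbers.

def average_over_window(samples_ms):
    """samples_ms: list of (duration_ms, load_percent) pairs."""
    total_ms = sum(d for d, _ in samples_ms)
    weighted = sum(d * load for d, load in samples_ms)
    return weighted / total_ms

window = [(100, 100.0), (1900, 0.0)]   # 100 ms fully loaded, 1900 ms idle
print(average_over_window(window))     # -> 5.0 (% reported for the window)
```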
 
 
I would pick the Palit if buying a 2080; their heatsink is immense. Plus they are one of the cheaper brands.

Interesting that the 2080 uses more power than a 1080 Ti.
 
https://www.computerbase.de/2018-09/ryzen-7-2700x-core-i7-8700k-geforce-rtx-2080-ti/

Wow, Ryzen 2700X gaming performance with the GTX 1080 Ti, RTX 2080 and RTX 2080 Ti is absolutely rubbish compared to my 8700K.

Nothing new there... that's just what has been seen before at 1080p. The gap narrows at 1440p, further at 1440p UW, and at 4K there's very little between them. The 20xx series was never expected to change that.

That article provides a very narrow analysis given it ONLY shows 1080p stats, so it's only relevant if you game at that res.
 
4k here and very happy with my 2700x. 1080p is for ray tracing card owners. :p

Yeah, the 2700X at 4K is very close to the 8700K; I think it even beats it in some titles... but overall I think you'd be hard pressed to notice side by side. New cards won't change that. It will be interesting to see how the upcoming 9700K and 9900K perform, but again at 4K there really won't be much between them.

AMD and Intel will be trading blows for the foreseeable future when it comes to CPUs in gaming, especially at higher resolutions. Now if only they could manage that for GPUs... but I won't hold my breath.
 
AMD could make a comeback in GPUs considering the limitations they were under for several years. Intel rested on their laurels and got kicked square in the plums when Ryzen came out, and the same thing could easily happen to Nvidia. It just takes the GPU R&D budget to go up, and with Ryzen and its offspring doing so well there's no reason why the GPU division shouldn't get more resources to work with.
 
Total performance improvement in current games ranges from an unimpressive 25% to a very impressive 68% in Wolfenstein II. In the most demanding location the 1080 Ti gets 50fps minimum and 61fps average, while the 2080 Ti gets 90fps minimum and 103fps average (111fps with an OC).

Average performance improvement is around 30%, but if I take into account just the games that run with problems on the 1080 Ti, those particular games now run around 40-50% better (rough arithmetic from these figures is sketched after the list below).

Hellblade: 42fps on the 1080 Ti, 69fps on a stock 2080 Ti.
Rise of the Tomb Raider: 59fps on the 1080 Ti, 75fps on the 2080 Ti.
Shadow of the Tomb Raider: 40fps on the 1080 Ti, 59fps on the 2080 Ti; with high detail the 2080 Ti, unlike the 1080 Ti (50fps), runs the game at a very impressive 80fps.
The Witcher 3: previously 59fps on the 1080 Ti, now 81fps on the 2080 Ti, so the game will basically run above 60fps 99% of the time on the 2080 Ti, while on the 1080 Ti it sits below that target 99% of the time. There are also other games like Crysis 3, Mass Effect Andromeda, The Evil Within 2, The Division, The Crew 2 and many others that run at around 40-50fps on the 1080 Ti and now run either above 60fps or very close to it (a very basic settings tweak should already give locked 60fps gameplay).
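For reference, the rough arithmetic behind those claims, using only the fps figures quoted above (a quick sketch, nothing beyond the numbers already listed):

```python
# Percentage uplift of the 2080 Ti over the 1080 Ti, from the average fps
# figures quoted in this post: (1080 Ti fps, 2080 Ti fps).
results = {
    "Wolfenstein II (avg)":      (61, 103),
    "Hellblade":                 (42, 69),
    "Rise of the Tomb Raider":   (59, 75),
    "Shadow of the Tomb Raider": (40, 59),
    "The Witcher 3":             (59, 81),
}

for game, (fps_1080ti, fps_2080ti) in results.items():
    uplift = (fps_2080ti / fps_1080ti - 1) * 100
    print(f"{game}: {uplift:.0f}% faster")
# Roughly: Wolfenstein II 69%, Hellblade 64%, RotTR 27%, SotTR 48%, Witcher 3 37%
```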

So I'd say 25-68% in current games is good performance already; 2080 Ti results look very similar to GTX 1080 SLI (the same cost as a single 2080 Ti), and future games will show even better performance. Native HDR support will help (in some games the 1080 Ti is 20% slower in HDR, while the 2080 Ti should not have problems like that), new shading features like variable rate shading alone should improve performance by 20%, DLSSx1 should give an additional 50% up to 100% (minimum fps on the 2080 Ti in Final Fantasy without DLSS is 18fps, and 40fps with DLSSx1), and DLSSx2 should provide SSAA quality, although we still don't know how much of a performance cost DLSSx2 will carry. DLSSx1 renders internally at 1440p, yet performance is much worse than native 1440p, so DLSSx2 will probably cost some additional fps compared with native 4K too, although probably not as much as native 4K plus SSAAx4.
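As a rough illustration of why an internal 1440p render is so much cheaper than native 4K (plain pixel-count arithmetic, nothing DLSS-specific):

```python
# Native 4K vs the 1440p internal resolution mentioned above. Shading fewer
# pixels per frame is the main reason the internal-resolution approach has
# headroom left over, even after the cost of reconstructing the 4K image.
pixels_4k = 3840 * 2160      # 8,294,400 pixels
pixels_1440p = 2560 * 1440   # 3,686,400 pixels
print(f"4K has {pixels_4k / pixels_1440p:.2f}x the pixels of 1440p")  # -> 2.25x
```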

Also there's the RTX feature. In the Star Wars RTX tech demo the 1080 Ti scores 9fps minimum / 11fps average while the 2080 Ti gets 45-65fps; that's roughly a 6x improvement in performance, although we still don't know how RTX performance will look in real games. According to Digital Foundry the Gamescom RTX tech demos were prepared in a rush for the RTX GPUs and were made with the Titan V in mind first (and that card doesn't have RT cores), so these tech demos were probably far from optimised. Digital Foundry also mentioned a very interesting thing: apparently RTX effects are currently calculated 1:1 with native resolution, and if developers would just render the RTX effects at 1080p and then upscale them, the end result should still look great and performance would be good even at higher resolutions.
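Two quick bits of arithmetic behind that paragraph, using the demo figures above and standard resolution pixel counts (a sketch, not new measurements):

```python
# 1) Star Wars RTX demo improvement, from the fps figures quoted above.
min_1080ti, avg_1080ti = 9, 11
min_2080ti, avg_2080ti = 45, 65
print(f"min: {min_2080ti / min_1080ti:.1f}x, avg: {avg_2080ti / avg_1080ti:.1f}x")
# -> min: 5.0x, avg: 5.9x (roughly the 6x quoted)

# 2) Savings from evaluating RTX effects at 1080p and upscaling to 4K output.
pixels_4k = 3840 * 2160
pixels_1080p = 1920 * 1080
print(f"{pixels_4k / pixels_1080p:.0f}x fewer per-pixel ray evaluations")  # -> 4x
```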

So to sum it up, although Turing GPUs are already here we still don't know exactly how fast they are, but because current games only use half of the Turing chip we can say for sure that, with time, the performance gap between Pascal and Turing will widen.
 

Are these all quotes directly from some Nvidia Marketing Blurb??
 
I have just summed up already known data (results from reviews and user benchmarks) and pointed out some interesting facts and obvious future implications.

I haven't included links in my previous post and maybe that's the problem, but if you want to see any particular results I mentioned, just ask and I will direct you to the relevant review.

Here, for example, are results from Wolfenstein II, New Orleans level (the most demanding level in the entire game):
[Wolfenstein II benchmark charts]

Shadow of the Tomb Raider, high settings minus motion blur, and an impressive 83fps. The 1080 Ti scores around 50fps with the same settings, so clearly the 2080 Ti results are much better:
[Shadow of the Tomb Raider screenshot]


And maybe you think I have made up something about the new shading features? Here you can hear some details about them:
https://youtu.be/LCT7pF6HrIg
https://youtu.be/3CD4XC1bCIY

I can't afford a 2080 Ti, but I can look at the facts, and the fact is the 2080 Ti is extremely expensive but at the same time it's easily the fastest gaming GPU right now, and its true performance is still unknown because currently there are no games made with Turing features in mind.
 
Let's see games not recommended in the Nvidia review guide to see the raw performance jump figures.
Raw performance when only half of the GPU is being used? Only new games made with the Turing architecture in mind will show raw performance, and that's still a big unknown because currently there isn't even a single game like that available. The Turing launch is like a comedy, because reviewers can't even test the product features that were presented at Gamescom. Of course it's Nvidia's fault, because if they had made sure even a few games supported DLSS, RTX and the new shaders (with good results), people would at least see what the Turing architecture is capable of.
 