Geforce GTX1180/2080 Speculation thread

https://wccftech.com/nvidia-rtx-2080-ti-2080-2070-are-40-faster-vs-pascal-in-gaming/

HotHardware:
“What can gamers expect? Let’s say a gamer has a 1080 now, could they expect a 2080 to be a faster card? Is the experience with current games going to be better?”

NVIDIA Director of Technical Marketing, Tom Petersen:
“That’s a great question I think we could’ve done a little bit better on during the public announcement. Turing is a beast. It’s going to significantly improve the gaming experience on old games and it’s going to rock it when you adopt new technology…”

“We did share some data that showed a bunch of games and you’ll see the perf [performance] roughly somewhere between 35 to 45 percent better at roughly the same generation. So, 1080 Ti to 2080 Ti and of course that’s going to vary based on the game and based on the setting.”

Transcript by wccftech.com
 
I'm not ignoring that at all lol... I'd have thought it goes without saying that you can turn settings up and down. It's not true that a 1080 Ti can only "barely do" 4K; it's very game dependent, and some games run quite comfortably with things turned down only slightly. Yes, of course you will have to turn RTX settings down on a 2070/2080 vs a 2080 Ti. Really, I need to say that?? I never said anything about running balls to the wall maxed out on every GPU.

The point is, the indication from certain 'leaks', and the way many people are talking RTX down, is that IF a 2080 Ti can only just manage 60fps at 1080p, then it will really struggle at 1440p+... and a 2070/2080, even with details turned down, won't stand a chance (and many 2070/2080 buyers will own ultrawides and 4K monitors). So the performance WILL be better than is being suggested. Maybe not immediately, but in time (months), once devs have had time with the tech and drivers have matured.

I'm well aware how hard ray tracing is to pull off, but I don't for one second believe Nvidia would spend a decade developing this technology and then release it half baked. There's just no reason they would do that... it's not like they are under any competitive pressure.

I am sorry, I just don't understand. You say you know about the power needed, that you can turn down settings, etc. But then you refuse to accept that Nvidia would release a card only capable of doing 60fps at 1080p. If you really understood how much power real-time ray tracing needs, you wouldn't be as upset with the figures. And they aren't releasing it half baked; they are starting on the road to making ray tracing mainstream. To put it another way: it's only now that we are getting GPUs capable of running 4K properly. Ray tracing needs far more computational power, and yet people expect it to run at 4K? This is a start, that's all.

And I have no doubt that the games will run better on release, but they will be sacrificing full ray tracing for performance. They already demoed Enlisted running at 4K @ 60fps, using Vulkan and showing just global illumination.
 

damn it... that was the wrong question imo... we all know the 2080 is going to blow the 1080 out of the water. of course it is: more cores, faster RAM, etc... but the 2080 is a far more expensive card, costing even more than the 1080ti.

I reckon what most want to know is whether the experience with current games will be worse than, on a par with, or better than a 1080ti... because that is the card closer to price parity. :(
 

The AdoredTV video pretty much covers that, and now this interview with the Nvidia guy all but confirms it: the 2080 is going to be trading blows with the 1080ti, which means the 2070 isn't going to come close to a 1080ti the way the 1070 did to the 980Ti.

I'm more looking forward to picking up a cheap 2nd hand 1080ti in the future than any of these new 20 series lol
 
Getting scary this, but I agree with @melmac again: I think the 2080 will win in some games and be slower in others. I imagine quite a big win if using DLSS.
 

Unless you have worked on the Nvidia Turing team for the past decade, I don't see how you can speak to the specific performance capabilities of these cards. From a pure common-sense and business point of view, if Nvidia release a range of GPUs having waxed lyrical about their amazing new ray tracing capabilities, yet only those who fork over £1200 for the top-end model and then downgrade to 1080p monitors can actually enjoy it on any meaningful level, this just won't work out well for them. So my view is that performance at 1440p/ultrawide/4K will be a lot better than many people seem to fear, and more than playable/enjoyable (even if it does require some settings to be turned down). That is all.
 
I guess I am wrong... however, here's just a wild thought which you can blow out of the water: given that the ray tracing is done by "RT cores" and not the CUDA cores, I don't understand why the resolution is so limited.

Shouldn't the "RT cores" be able to do the ray tracing independently of the CUDA cores doing all the other stuff (and independently again of the tensor cores possibly doing DLSS)?

Looking at it from my simplistic view: if the CUDA cores can pump out 4K 60fps with ray tracing off, and with ray tracing on the fps is 45-60... but using a separate part of the GPU... in theory, shouldn't it be possible to decouple those two things? Perhaps the rays only run at 45fps (let's make it locked to 30 so it divides nicely into the 60fps of the game), so in effect the lighting just runs at a lower frame rate than the game.

Or is this not possible? (I am guessing not, seeing as that is not how it works.)

The ray tracing is done on a per-pixel basis: for each pixel displayed on screen, a number of rays are cast into the scene to calculate exactly what colour the pixel should be. 4K has four times the pixels of 1080p, so to maintain performance at 4K either the number of rays cast per pixel would need to drop to 1/4 of the 1080p figure, or the frame rate would have to be cut to 25%. The more rays per pixel, or the higher the resolution, the more processing power is required.
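To put rough numbers on that, here's a back-of-the-envelope sketch in Python (the rays-per-pixel and frame-rate values are illustrative assumptions, not Nvidia figures):

# Total rays per second scale with resolution, rays per pixel and frame rate.
def rays_per_second(width, height, rays_per_pixel, fps):
    return width * height * rays_per_pixel * fps

budget = rays_per_second(1920, 1080, 4, 60)        # hypothetical 1080p/60 budget
print(budget / 1e9)                                # ~0.50 billion rays/s

# 4K has 4x the pixels, so holding the same budget forces a trade-off:
print(rays_per_second(3840, 2160, 1, 60) / 1e9)    # same budget: 1/4 the rays per pixel
print(rays_per_second(3840, 2160, 4, 15) / 1e9)    # same budget: 1/4 the frame rate

That last line is also why the "lighting at a lower frame rate" idea above isn't crazy in principle: refreshing the rays half as often halves the ray budget, at the cost of lighting that lags behind the game.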
 
I honestly think DLSS will be the big thing for Nvidia here, especially on the 2080 and 2070.
See, what I am wondering is: in a scenario where someone uses DLSS instead of traditional AA and doesn't bother with ray tracing, there would be a nice gain right there, and an IQ improvement to boot.
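Nvidia hasn't published exactly how DLSS works internally, but the commonly assumed model is shading at a lower internal resolution and letting the tensor cores upscale to native. A quick sketch of the potential savings under that assumption (the 1440p internal resolution is my guess, not a confirmed figure):

# Assumed model: shade fewer pixels internally, tensor-core upscale to native 4K.
native_pixels   = 3840 * 2160    # 4K output
internal_pixels = 2560 * 1440    # assumed internal render resolution
print(internal_pixels / native_pixels)   # ~0.44 -> roughly 56% fewer pixels shaded

If that's anywhere near right, the CUDA cores shade well under half the pixels while the tensor cores, which would otherwise sit idle in traditional rendering, pick up the upscaling. That's where the gain would come from.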
 
 
Sorry, but I don't believe anything Nvidia says. Something's fishy about the performance of this card, with no benchmarks to back anything up.
 
Tom Petersen isn't that confident. What do you know that he doesn't?

Quote from the interview

HH: Is there a case where 2080 might outperform 1080 Ti?

TP: I think so. I would expect some cases where 2080 would beat 1080 Ti but I don't have the data in front of me.

Combined with his statement about generational performance, it's looking like the 2080 will be less than 10% faster than the 1080ti in standard games.
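A rough sanity check of that in Python, taking the midpoint of Petersen's 35-45% same-tier figure at face value, together with the commonly cited ~35% lead the 1080 Ti has over the 1080:

# If the 2080 is ~40% faster than the 1080 (mid of Petersen's 35-45% range)
# and the 1080 Ti is ~35% faster than the 1080:
gain_2080_over_1080   = 1.40
gain_1080ti_over_1080 = 1.35
print(gain_2080_over_1080 / gain_1080ti_over_1080 - 1)   # ~0.037 -> about 4% faster

That lands comfortably inside the "less than 10%" estimate.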

It's all speculation at this point. It's no different from financial experts, who often get things wrong. I have no idea who TP is, sorry.
What do I know? The balance of probabilities tells me the chances are it will probably be better, plus someone who should know better than most of us is stating that AMD will struggle to match the 1080 Ti, let alone the 2080 and 2080 Ti. Of course they may be wrong too, but I will be very surprised if the 2080 isn't a decent amount faster than the 1080 Ti, which itself is already about 35% faster than the 1080 was.
 
The performance figures aren't coming from me. One of the most experienced developers working with RTX has come out and said that they hope to have their game running at 1080p at 60fps using full ray tracing by next February!!

Why would people need to downgrade to 1080p monitors? You know that, firstly, 4K and 1440p monitors can display 1080p, right? Secondly, back to settings again: you will be able to turn settings down to a playable level, like the Enlisted developer did with his 4K demo. And thirdly, developers will be using a combination of old and new techniques to get better frame rates, instead of doing it all with ray tracing.

You are basing your figures on nothing at all. Just like when you called us all stupid because you couldn't believe the cards would be called RTX, insisting there was no way they would have ray tracing.
 