Geforce GTX1180/2080 Speculation thread

It's all speculation at this point. It's no different than financial experts who often get things wrong. I have no idea who TP is, sorry.
What do I know? The balance of probabilities tells me it will probably be better, and someone who should know better than most of us is stating that AMD will struggle to match the 1080 Ti, let alone the 2080 and 2080 Ti. Of course they may be wrong too, but I will be very surprised if the 2080 isn't a decent amount faster than the 1080 Ti, which itself is already about 35% faster than the 1080 was.

He is the technical marketing director at Nvidia.
 
See, what I am wondering is: in a scenario where one uses DLSS instead of traditional AA and doesn't bother with ray tracing, there would be a nice gain right there, and an IQ improvement to boot.

Well, I was going one step further: will DLSS be able to upscale a scene rendered using ray tracing at 1080p to 4K? Probably not.
 
The performance figures aren't coming from me. One of the most experienced developers with RTX has come out and said that they hope to have the game running at 1080p at 60 fps using full ray tracing by next February!

Why would people need to downgrade to 1080p monitors? You know that, firstly, 4K and 1440p monitors can display 1080p, right? Secondly, back to settings again: you will be able to turn the settings down to a playable level, like the Enlist developer did with his 4K demo. And thirdly, they will be using a combination of old and new technologies to get better frame rates instead of doing it all with ray tracing.

You are basing your figures on nothing at all. Just like when you called us all stupid because you couldn't believe that the cards would be called RTX and that there was no way they would have Ray Tracing.


I'm not giving 'figures', I'm merely suggesting performance will be better than is being suggested... and I'm not basing that on nothing. Unless you think common business sense is nothing. What is meant by "full ray tracing" exactly? That doesn't mean games won't be employing it and looking vastly improved in certain areas... and running well at resolutions above 1080p.

No, you're right, people won't need to buy 1080p monitors, but running their expensive ultrawides or 4K monitors at 1080p is hardly something most people will want to feel they have to do after spending more than £1K on a GPU! And yes, I keep acknowledging settings can and will be turned down... you don't need to keep repeating that.

What I said before has absolutely nothing to do with this, and I acknowledged I was wrong before the announcement, once the leaks made it abundantly clear RTX was the way they were going. But again, completely irrelevant to this topic.
 
Those saying you can't run 4K/60fps, I'm assuming you mean with maxed-out settings?

I know some people start having palpitations and breaking out in cold sweats if they hover over the lower settings options in the graphics menu, but I'm sure a 1080 Ti with modified settings, not much below the highest, can easily achieve 60fps @ 4K. Hell, I'm not that far off with my 980 Ti, and a 1080 Ti is what, another 30-40% faster?!

One would assume the minimum and not the average; an average still means you're getting drops below 60fps, which is not desirable, or acceptable for many. Some games even struggle to hit that average.

4K gaming isn't quite there yet, although saying this seems to really anger a lot of 1080Ti/Titan users, I can only guess as to why.
 
In that case he's sandbagging :p.
IMO, the 2080 will be a decent amount faster than the Ti. But then that's kinda what he's saying too: "in some cases". I suspect "in some cases" will be more often than not.

Leaks seem to confirm that the 2080 will be less than 10% faster than the 1080 Ti, while the technical director says the cards will be 35% to 45% faster than the previous generation. So that puts the 1080 Ti and the 2080 basically on par with each other. And lastly, he says "in some cases" and mentions something about not having figures. In the biggest launch of the last two years and he doesn't know exactly what the performance is?

Darren, I have a bridge to sell you.
 
Leaks seem to confirm that the 2080 will be less than 10% faster than the 1080 Ti, while the technical director says the cards will be 35% to 45% faster than the previous generation. So that puts the 1080 Ti and the 2080 basically on par with each other. And lastly, he says "in some cases" and mentions something about not having figures. In the biggest launch of the last two years and he doesn't know exactly what the performance is?

Darren, I have a bridge to sell you.
Sorry, you're just speculating. Let's wait and see, huh? People going round and round in circles with the same speculative rubbish, reading too much into every "leak" and word spoken :D.

Either way, I'll be surprised if you don't all want one when it comes to crunch day :D.

As I said above, "a decent amount faster" is not slower, is it? 10%, as you said, is faster, but I think it'll be better than that, though maybe not more than 20%. And I fully expect the 80 Ti to be 30%+ faster than the TXP. Like I said, you can apply the balance of probabilities to this.
 
So Nvidia state on the record 35-45% faster. Nvidia are a US company; the US has the FTC Act; the FTC Act makes it illegal to mislead consumers; IF Nvidia are lying, they are in violation of the FTC Act. Yet some people still think Nvidia are stupid and don't consult their lawyers before making public statements, and still insist on spreading conspiracy theories EVEN THOUGH Nvidia are now on record saying a 35-45% performance boost over Pascal.

It is almost as if there are a bunch of paid shills from AMD in this thread spreading FUD....
 
Sorry, you're just speculating. Let's wait and see, huh? People going round and round in circles with the same speculative rubbish, reading too much into every "leak" and word spoken :D.

Either way, I'll be surprised if you don't all want one when it comes to crunch day :D

I am not speculating anything; I am basing my figures on what the marketing director of Nvidia says. Or is he just speculating too?
 
So Nvidia state on the record 35-45% faster. Nvidia are a US company; the US has the FTC Act; the FTC Act makes it illegal to mislead consumers; IF Nvidia are lying, they are in violation of the FTC Act. Yet some people still think Nvidia are stupid and don't consult their lawyers before making public statements, and still insist on spreading conspiracy theories EVEN THOUGH Nvidia are now on record saying a 35-45% performance boost over Pascal.

It is almost as if there are a bunch of paid shills from AMD in this thread spreading FUD....

LOL, what the hell? 35 to 45% faster would put the 2080 on par with the 1080 Ti. How does that make people paid shills for pointing it out?
 
LOL, what the hell? 35 to 45% faster would put the 2080 on par with the 1080 Ti. How does that make people paid shills for pointing it out?
Did I say you? I guess if the hat fits....

And no, it doesn't put the 2080 "on par" with the 1080 Ti; it puts it generally faster than the 1080 Ti and "on par" with the Titan XP (shown in benchmarks and now backed up by Nvidia's public statement in the interview). So I guess your head is nice and warm after all...

According to: http://gpu.userbenchmark.com/Compare/Nvidia-GTX-1080-Ti-vs-Nvidia-GTX-1080/3918vs3603

The average FPS increase between the 1080 Ti and the 1080 is 14%, with the max overclocked bench 23% faster. So 35-45% over the 1080 is significantly faster than a 1080 Ti (around 12-22% vs a max overclocked 1080 Ti, and around 21-31% FPS increase otherwise). But yeah, keep making crap up, because that's so much easier than using facts...
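To make that concrete, here's a toy calculation using only the percentages quoted in this thread (14% average 1080 Ti uplift over the 1080 from the UserBenchmark link, and the claimed 35-45% over Pascal); it takes ratios rather than just subtracting percentages, so treat the exact outputs as illustrative only:

```python
# Purely illustrative arithmetic using the figures quoted above.
gtx1080 = 1.00               # baseline performance
gtx1080ti = gtx1080 * 1.14   # ~14% average FPS over the 1080 (UserBenchmark)
rtx2080_low = gtx1080 * 1.35   # low end of the claimed 35-45% uplift
rtx2080_high = gtx1080 * 1.45  # high end of the claimed uplift

# How much faster the claimed 2080 range would be than the average 1080 Ti:
low = rtx2080_low / gtx1080ti - 1    # about 18% faster
high = rtx2080_high / gtx1080ti - 1  # about 27% faster
print(f"{low:.1%} to {high:.1%} faster than a stock 1080 Ti")
```

Taking proper ratios gives roughly 18-27% over the average 1080 Ti, a little lower than the naive 21-31% obtained by subtracting percentages, but the conclusion is the same: 35-45% over the 1080 is clearly ahead of a stock 1080 Ti.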
 
Did I say you? I guess if the hat fits....

And no, it doesn't put the 2080 "on par" with the 1080 Ti; it puts it generally faster than the 1080 Ti and "on par" with the Titan XP (shown in benchmarks and now backed up by Nvidia's public statement in the interview). So I guess your head is nice and warm after all...

LOL, epic fail. I asked you "how does that make people paid shills for pointing it out?" I didn't mention myself at all.

And 35 to 45% puts it faster in some cases and slower in others. Which is exactly what the guy you are quoting said in his interview.
 
LOL, epic fail. I asked you "how does that make people paid shills for pointing it out?" I didn't mention myself at all.

And 35 to 45% puts it faster in some cases and slower in others. Which is exactly what the guy you are quoting said in his interview.
At 35-45% faster it is at worst 12% faster than the 1080 Ti (see my previous response for the data).
 
or the frame rate would have to be cut to 25%

That's not true because in a given frame, more than just ray tracing is done. The performance of the ray tracing part of frame generation would be cut to 25% yes, but not the overall frame rate.

As for the portion of frame generation time that ray tracing takes up, well that's anyone's guess.

Theoretically, given that the RT cores are dedicated, it might even be possible to do in parallel with the rasterisation portion of frame generation, who knows?

We would need to understand more about the architecture to know anything like that for certain.
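The point above is essentially Amdahl's law applied to frame time, and a rough sketch makes it clear. The numbers here are hypothetical, not from any benchmark: if ray tracing takes some fraction of the frame time and only that part slows down 4x (e.g. 4x the rays at 4K), the overall frame rate falls far less than 4x.

```python
# Hedged sketch: only the ray-tracing share of the frame slows down,
# so overall FPS drops by less than the RT slowdown (Amdahl-style).
def new_fps(fps, rt_fraction, rt_slowdown):
    """fps: original frame rate; rt_fraction: share of frame time spent
    on ray tracing; rt_slowdown: factor by which RT work grows."""
    frame_time = 1.0 / fps
    rt_time = frame_time * rt_fraction
    raster_time = frame_time - rt_time
    return 1.0 / (raster_time + rt_time * rt_slowdown)

# Hypothetical: 60 fps at 1080p, RT taking half the frame time,
# then 4x the rays for 4K:
print(round(new_fps(60, 0.5, 4), 1))  # 24.0 fps, not 60/4 = 15 fps
```

This also ignores any overlap between the RT cores and rasterisation; if they really can run in parallel as speculated above, the drop would be smaller still.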
 
I am not speculating anything; I am basing my figures on what the marketing director of Nvidia says. Or is he just speculating too?
Given that he said he didn't have the figures to hand, I would say yes, a bit of speculation; you're coming to speculative conclusions over a few words, not certainties, and also over "leaks".
Ever heard of sandbagging? It's where a company will underplay something.
Don't get me wrong, I'm not expecting big numbers, but I thought this was more about the 2080 being the same as the old Ti? I'm saying it will be a decent amount faster. 10%? 20%? I'd go with more like 20% myself.
But it is all just speculation :D.
I'll be happy to be proven wrong, but as I said, I'm applying a bit of simple logic, the balance of probabilities, to my conclusion. I don't need to look at all the fake leaks out there and a dodgy interview with someone who says they have no exact figures to hand. It's making me laugh how the same discussions are still happening over a week later: pure speculation.

So what about when the tensor cores are put to good use with new games? We have no idea how they're going to perform vs the "old" cards. We do know visuals are going to be improved at least.
Regardless of what people currently say, the serious amount of speculation and interest the cards are drumming up shows most people want them. Otherwise, people really wouldn't care, right?

I think many are hoping the reviews are bad rather than expecting them to be. If things look pretty decent then folks have to decide whether to splash out or keep to the "old" stuff :P.
 
That's not true because in a given frame, more than just ray tracing is done. The performance of the ray tracing part of frame generation would be cut to 25% yes, but not the overall frame rate.

As for the portion of frame generation time that ray tracing takes up, well that's anyone's guess.

Theoretically, given that the RT cores are dedicated, it might even be possible to do in parallel with the rasterisation portion of frame generation, who knows?

We would need to understand more about the architecture to know anything like that for certain.
I think what would potentially make this very interesting is if, thanks to the tensor cores, they are able to use AI/machine learning to predict ray tracing ahead of the frame for x% of rays, reducing the time needed to render the rays in real time. I don't know if they are exploring this, but it could even mean that game performance increases over time (as the AI predictions become more accurate based on learning from historical data). There may well be potential in these cards that has not even been realised yet; DLSS might be only the beginning for the tensor cores.
 
That's not true because in a given frame, more than just ray tracing is done. The performance of the ray tracing part of frame generation would be cut to 25% yes, but not the overall frame rate.

As for the portion of frame generation time that ray tracing takes up, well that's anyone's guess.

Theoretically, given that the RT cores are dedicated, it might even be possible to do in parallel with the rasterisation portion of frame generation, who knows?

We would need to understand more about the architecture to know anything like that for certain.

What is not true? To ray trace a 4K scene with the same number of rays/pixel as 1080p requires 4x the number of rays per frame, and therefore requires 4x the performance from the RT cores.

Without ray tracing, BFV will easily run at 144fps @ 1440p, let alone 1080p, on an RTX 2080 Ti. It is clear from the demos that the ray tracing pass is a bottleneck to frame generation. Asking the RT cores to do 4x the work is not going to improve those frame rates. As for the AI tensor cores, they're already using them.
 
What is not true? To ray trace a 4K scene with the same number of rays/pixel as 1080p requires 4x the number of rays per frame, and therefore requires 4x the performance from the RT cores.

Without ray tracing, BFV will easily run at 144fps @ 1440p, let alone 1080p, on an RTX 2080 Ti. It is clear from the demos that the ray tracing pass is a bottleneck to frame generation. Asking the RT cores to do 4x the work is not going to improve those frame rates. As for the AI tensor cores, they're already using them.
But having settings which allow you to change the rays/pixel ratio will allow for performance tweaks. So if you have a 1080p scene with (max) 8 rays per pixel RT but a 4K scene with (max) 2 rays per pixel, then the RT cores only have to do the same level of compute for both scenes, meaning a 4x loss in RT quality (but probably still a very nice effect; we will need to wait and see) but little to no loss in frame rate.
 
But having settings which allow you to change the rays/pixel ratio will allow for performance tweaks. So if you have a 1080p scene with (max) 8 rays per pixel RT but a 4K scene with (max) 2 rays per pixel, then the RT cores only have to do the same level of compute for both scenes, meaning a 4x loss in RT quality (but probably still a very nice effect; we will need to wait and see) but little to no loss in frame rate.

The ray tracing is done on a per pixel basis. For each pixel displayed on screen, a number of rays are cast into the scene to calculate exactly what colour the pixel should be. At 4K vs 1080p, either the number of rays cast per display pixel would need to be 1/4 of the rays used at 1080p to maintain performance, or the frame rate would have to be cut to 25%. The more rays/pixel or the higher the resolution, the more processing power required.
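The ray-budget trade-off being described is just resolution times rays-per-pixel. A quick sanity check using the example numbers from the posts above (8 rays/pixel at 1080p vs 2 rays/pixel at 4K):

```python
# Toy arithmetic for the per-pixel ray budget discussed above.
def rays_per_frame(width, height, rays_per_pixel):
    """Total rays the RT hardware must trace for one frame."""
    return width * height * rays_per_pixel

r_1080p = rays_per_frame(1920, 1080, 8)  # 8 rays/pixel at 1080p
r_4k = rays_per_frame(3840, 2160, 2)     # 2 rays/pixel at 4K

# 4K has 4x the pixels, so quartering the rays/pixel keeps the
# total RT workload identical:
print(r_1080p == r_4k)  # True
```

So holding rays/pixel constant at 4K would indeed mean 4x the rays per frame, while quartering rays/pixel keeps the RT workload flat at the cost of per-pixel RT quality, exactly the trade-off described above.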
 
Not disagreeing - clarifying. Based on what some developers have said this seems to be the intended model - to give varying degrees of RT quality rather than just On/Off.
 