
Geforce GTX1180/2080 Speculation thread

Nvidia are soon to release a 65-inch 4K 144 Hz monitor with HDR and G-Sync... but wait, there's no current tech that can do that, right?

If it doesn't have HDMI 2.1, you can call it a 98 Hz display, if you want your colours shown properly and not crunched, like they are on the 27" HDR G-Sync monitors above that refresh rate.
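For anyone wondering where a figure like 98 Hz comes from, here is a rough back-of-the-envelope sketch. The link rates (DisplayPort 1.4 HBR3 at 32.4 Gbit/s raw with 8b/10b coding, HDMI 2.0 at 18 Gbit/s raw) are real, but the blanking overhead factor is my own assumed estimate, so treat the outputs as ballpark numbers, not spec values:

```python
# Back-of-the-envelope: max refresh rate a display link can drive at a
# given resolution and bit depth. The blanking overhead factor (1.06)
# is an assumed estimate, not a spec figure.

def max_refresh_hz(link_gbps_effective, width, height, bits_per_pixel,
                   blanking_overhead=1.06):
    """Effective link rate (after coding overhead) divided by the bits
    needed per frame, inflated by an assumed blanking overhead."""
    bits_per_frame = width * height * bits_per_pixel * blanking_overhead
    return link_gbps_effective * 1e9 / bits_per_frame

# DisplayPort 1.4 HBR3: 32.4 Gbit/s raw, 8b/10b coding -> 25.92 Gbit/s effective
dp14 = 32.4 * 8 / 10

# 4K, 10-bit RGB (30 bits/pixel): roughly the ~98 Hz ceiling the 27" HDR
# G-Sync monitors hit before dropping chroma resolution
print(round(max_refresh_hz(dp14, 3840, 2160, 30)))   # ~98

# HDMI 2.0: 18 Gbit/s raw -> 14.4 Gbit/s effective; same signal tops out
# far lower, which is why HDMI 2.1 matters for a 4K 144 Hz HDR display
print(round(max_refresh_hz(14.4, 3840, 2160, 30)))   # ~55
```

Same arithmetic, different link: without HDMI 2.1 (or chroma subsampling / DSC), the bandwidth simply isn't there for 4K 10-bit at 144 Hz.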

In the meantime, there is the 65NU8000: a 60 Hz but FreeSync HDR TV at a quarter of the price of the Nvidia TV (there is also a 55" model, and curved ones).
And there are new 120 Hz TVs coming out next year, all of which are going to be much cheaper than the Nvidia TV, because their market is normal households, not cash cows who want to feel "exclusive".
 
The PC market shunned AMD because the performance wasn't there, and even when it was, there was often a tradeoff. You don't buy a GPU based on hopes and prayers.

The price/performance has been there in every generation since the 4000 series and only really tailed off at the high end from Fury onwards. AMD's market share had dropped well before the Fury X ever hit the scene. AMD's mid-range cards have often been cheaper with similar performance, and came with more VRAM. Since most sales are low/mid-range, I can't remember AMD ever not having competitive products in this area, yet sales dropped off a cliff.
 
8% better than a 1080Ti mate :p

It needs to be 15-20% better imo.

I hope that’s a joke? Cause it’s not funny.

The 1080 was on average just under 10% faster than the 980 Ti, although if you go back and look at the 1080/Pascal announcement, they talked about it being revolutionary and the "Most Advanced Gaming GPU Ever Created", which hardly gave people the impression they would only be seeing a less than 10% average improvement in games vs a 980 Ti. You can certainly expect the same rhetoric again, which will give the impression this thing has been crafted by the hands of [insert chosen deity here] him/her/itself!

That said, the gap between the 980 Ti and 1080 was only a year... it's been longer since the 1080 Ti, so it's certainly reasonable for people to expect more of a performance bump given they've been waiting longer. And I don't doubt Nvidia COULD deliver that... but lack of competition means they don't HAVE to.

I would like to see 15-20%, but when all is said and done and the benchmarks are in, it could indeed be a single-figure % difference. We shall just have to wait and see, but I don't expect the Gamescom announcement to provide this info; we'll have to wait until the cards have been properly reviewed.
 
According to this TechPowerUp review, the reference 980 Ti to 1080 performance difference was 30-35%: https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/26.html
 
Same performance as the 1080 Ti for £200 less for the 1170 would be a good place to start.
According to the "leak" the 1170 is going to be nowhere near a 1080Ti. At least 20% slower.

e: And it's going to be £500... so... worse value for money? Odd.
 
http://gpu.userbenchmark.com/Compare/Nvidia-GTX-980-Ti-vs-Nvidia-GTX-1080/3439vs3603

[attached image: 980 Ti vs 1080 comparison chart]
 
No, the numbers are as they are.
I was just hinting at looking at the process node rather than anything else: is there any real difference between 16 nm and 12 nm?
I don't think there is any significant difference, although TSMC supposedly have a special 12FFN ("12 nm FinFET Nvidia") node which is custom for Nvidia, and I believe the Titan V uses it. I still think it doesn't make any real noticeable difference over 16 nm, though.
 
All the other reviews say the difference is much greater than this. Also, you have to remember to compare stock Founders Edition vs stock Founders Edition, as that is what Nvidia does. A heavily overclocked 980 Ti can come close to a stock 1080.

I hope the leaks are fake, otherwise I'll be disappointed, unless the ray tracing these cards can do is truly revolutionary.

Well, we will see. As I say, I'd certainly love to see a 20% bump over the 1080 Ti, but even if we do, it won't be that across the board in ALL games. Depending on what people play (and at what resolution), they may indeed only see a single-figure % boost, and be paying A LOT more for the privilege. In some instances it could be a very poor-value upgrade over the 1080 Ti if Nvidia price it too high.

We really just don't know, and I don't suspect we will for a while... it will require professional reviews, and probably a bit of driver maturity, before the 11-series comes into its own. We don't even know they will have ray tracing... and I kind of suspect they won't, as no games are taking advantage of it yet. I'm thinking ray tracing is more likely going to be for their Quadro pro cards, not the gaming ones.
 

There is a performance-track library for 12FF that is unofficially the "Nvidia" node. Nominally, the performance/density/power differences for 12FF over 16FF are in the 14-20% range, depending on what you are aiming for. With recharacterization, and for specific applications, a 35% performance increase is possible with some sacrifice of density.
 