
Geforce GTX1180/2080 Speculation thread

Yeah, that theory has been doing the rounds for a while and it certainly makes sense. What doesn't make sense is Turing. At all. There really is no reason for it to exist at this moment in time. Absolutely none. Even if it weren't stupidly priced, there's obviously no utilisation for it yet, and it will be a long time before we have a slew of games that actually take advantage of RTX features. Even then, we probably aren't going to see fully ray-traced gaming this generation... nothing outside of 1080p anyway, at best. All we'll probably get is half-baked pretty effects of some sort, and DLSS of course, but that's largely been derided by the tech press as nothing you can't already achieve with upscaling 1440p to 1800p. I do think ray tracing holds great promise, but its realisation is years away.

I don't know, time will tell, but it's all a big mess right now and what with Intel's insanely priced 9-series, overall just an awful time to be buying new hardware.

It makes sense that Turing was developed for the VFX market, as Nvidia has talked about it and so have investment sites like Nasdaq. So this is Nvidia making a play for that market, just like Fermi was a play for compute in commercial markets. From Maxwell onwards, Nvidia had two sets of GPU lines - one for gaming and one for professional markets - and it looks like they have unified these to a degree, which saves on maintaining so many lines and so should cut costs somewhat. Hence all the fully enabled GPUs appear to be in the Quadro RTX line, with the rest in the gaming cards. But of course Nvidia needed to find a use for the RT cores and tensor cores, hence RTX and DLSS. Since the focus was more on commercial usage, though, the software stack for gaming was probably not quite there, which is why devs seem behind the curve on this.

Basically I see gamers as kind of funding the foray into the VFX market.

If RTX had been developed for gaming first, it would have made more sense to produce another line of FP32-focused cards with larger chips than Pascal, which would have been a bigger jump. Imagine the TU104 chip in the RTX 2080 if it had standard CUDA cores! Then develop one high-end RTX-enabled chip, the TU102, release it as a professional card for commercial RT work, and seed it to games devs to get the software stack for games up and running, then refine the tech and release 7nm cards in a year or so.
 
As others have mentioned in this thread, is it worth taking the Founders Edition 2080Ti (which is now in stock) over the other AIB cards? It seems there will be a decent wait for the MSI Trio and maybe even the Aorus cards.
 
So it's basically Nvidia's Vega: a card developed for one market doing double duty, though obviously one is substantially faster than the other.

This is how it used to be until Maxwell. Except in the case of Turing (unlike Vega), they have managed to find a way to use the RT cores and tensor cores in some fashion. The most AMD was able to do was use FP16 in some effects to a limited degree.
 
I don't remember AMD or Nvidia getting this much flak when the first DX10, 11, or 12 cards came out, and they weren't usable for those features at the time either.

Flak for the price I can agree with but the forward looking features not so much.
 

I think there are several contributing factors... the price obviously plays a part in that flak, so it's perhaps hard to separate it from the rest. If the cards had launched at Pascal prices, we'd merely be seeing minor annoyance at not having any RTX games to play. That lack of game content is another factor, of course... being asked to pay such high prices with no promise of anything that's actually going to take advantage of those features. Furthermore, from all we've heard and seen, even the 2080Ti is going to struggle to push 60fps at 1080p... this is not something someone who is being asked to pay £1100 (minimum) for a state-of-the-art GPU wants to hear, not least because they are almost certainly gaming at 1440p/ultrawide or 4K.

The problem is that here and now there is really very little, if anything, to be positive about with RTX. The 2070 and 2080? Pfft, forget it: little to no performance bump over Pascal, yet more money. The 2080Ti is a beast at 4K, but dreadful value. No RTX content, no guarantee you'll even be able to play that content at the resolution you want when it arrives, and to rub salt in the wound, rumours that this entire GPU line is a stop-gap anyway until 7nm cards arrive in the next year or so!

The flak is unprecedented, but so is everything Nvidia have done with this launch, which in every way almost seems to have been designed to turn gamers against them and these cards! It's all quite bizarre really. :rolleyes:
 
They know gamers have short memories, mate. All they have to do is release their next GPUs at Pascal prices and they'll be hailed as heroes, and those GPUs will be made out to be bargains by people comparing them with 20-series pricing.

Even if they did not, it seems more than enough people are paying the inflated prices for them to get away with selling lower quantities for similar profit anyway. We'll see when the next few quarterly financial reports come out, I suppose.
 
I hope you don’t have a 750D as that’s what mine will be going into. :)

Nope, you should have plenty of space and cooling in your case. I have a very small case that fits a full ATX motherboard, but only just: a Meshify C. I changed the position of the case (which unfortunately sits on the floor, over underfloor heating) so that the back vents aren't stuck against the wall.

I've also changed the fan curves on the CPU, and now I'm seeing 70°C max in the Shadow of the Tomb Raider benchmark, 60°C in FH4 gameplay, and around 50°C in SCVI, with +500 on memory and +103 on the GPU. So it seems to be improved.
 

Well, that seems a bit more respectable. Thanks for the info, as I was suddenly a little nervous about the XC Ultra. TBH availability has been crap for most cards, and there are actually very few reviews for anything other than the FEs.

I originally went for this card due to its smaller size, but now that I'm building a new PC the size doesn't really matter. A shipping date of 31/10 makes me reluctant to order a bigger card elsewhere, though, and the decent Tis are still rarer than hen's teeth unless you want to pay a £200+ premium.
 

Well I hope you get yours soon. I was lucky I got mine a few days after launch.
 
I don't remember AMD or Nvidia getting this much flak when the first DX10, 11, or 12 cards came out, and they weren't usable for those features at the time either.

Flak for the price I can agree with but the forward looking features not so much.

I do.
The 8800 GTX came out a year before Crysis, even though it was advertised as "a card you could play Crysis on in DX10". Yet the performance was atrocious when Crysis came out a year later and you tried to play in DX10.

Same thing we're seeing right now: when proper RT games come out, the performance these Turing cards provide will be atrocious.

And let's not forget that when the G80 came out it was overpriced, like Turing is. The next gen was much better and better priced too.
 
The GT was a fair bit slower than the GTX, wasn't it?

It depended a bit on what you were doing - the GTX was the slightly older, more brute-force design, while the GT used a more refined approach. So in some things the GT held up well against the GTX, but when things like raw fill rate or memory bandwidth were a factor, it could fall behind.
 