
GeForce GTX 1180/2080 Speculation thread

I'm dubious, and if it's true it would be interesting to see the sales figures. The performance jump would have to be huge, but I doubt that is the case.


It is also frequently the case that Nvidia won't release official prices until the last minute to keep things secret; they might offer a ceiling price that some AIBs use as a placeholder.

Remember, the Titan has been delineated from the rest of the line-up, and that may continue; the 2080 Ti may be the top-end consumer card. Wouldn't be the first time they pulled the ol' bait and switch. The Ti always made the Titan look like a waste of money, so the price meeting in the middle makes sense to me. I don't think the performance leap would have to be huge either, just land in that tiny middle ground between the Titan and the Ti. I mean, Christ, look at the CUDA core disparity: 1,500 more on the Ti if the link is to be believed. I reckon it'll be an absolute monster. I think those hoping for 4K 144Hz will be disappointed, though. I'd still buy it, as long as it obliterates UW 1440p 144Hz, or 4K at 80+ fps; roughly the performance I expect. As I've said before, it's the next generation that will bring 4K into the hands of gamers en masse, not Turing.

And given that 1080 Ti cards costing that much in the UK were being hoovered up, is it really that unlikely?
 
So will all the 2080 Ti cards have no DVI? I have an old 120Hz Asus monitor which I run via DVI-D, and the HDMI on it is only 1.4. So what are my options for getting 120Hz on the new cards with my current monitor? Will some come with an adapter, or will some brands possibly include a DVI connection?
 
My guess, pulled out of nowhere, is:

each new card matches the performance of the card one tier up from the previous generation

£350 1070 = 2060 £400
£450 1080 = 2070 £500
£650 1080 Ti = 2080 £700
£950 Titan = 2080 Ti £1000

with the 10 series selling for, say, £50 less to get rid of stock
 
No HDMI 2.1 either, to run full 4K 144Hz with 4:4:4 chroma or 8K over a single cable. Would have been good for future-proofing.

HDMI 2.1 means VRR, and Nvidia can't opt out of it like they did with DisplayPort adaptive sync.
So expect them to stick with HDMI 2.0 for a long time.
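
To put the bandwidth point above into numbers, here is a rough back-of-envelope sketch in Python (my own figures and assumptions: it ignores blanking intervals and link-encoding overhead, and assumes 8 bits per colour channel at 4:4:4):

# Rough estimate of why 4K 144Hz with 4:4:4 chroma is out of reach for HDMI 2.0.
# Sketch only: ignores blanking intervals and link-encoding overhead,
# and assumes 8 bits per colour channel (24 bits per pixel at 4:4:4).

def video_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel):
    """Minimum uncompressed video data rate in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

uhd_144 = video_bandwidth_gbps(3840, 2160, 144, 24)

print(f"4K 144Hz 4:4:4 8-bit needs roughly {uhd_144:.1f} Gbit/s of pixel data")
print("HDMI 2.0 tops out at 18 Gbit/s, while HDMI 2.1 raises that to 48 Gbit/s")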
 
My guess, pulled out of nowhere, is:

each new card matches the performance of the card one tier up from the previous generation

£350 1070 = 2060 £400
£450 1080 = 2070 £500
£650 1080 Ti = 2080 £700
£950 Titan = 2080 Ti £1000

with the 10 series selling for, say, £50 less to get rid of stock

Yet prices are up, and Gibbo said they will go even higher for Pascal cards. The 1070 Ti is already selling at a higher price than the Vega 64...
 
At these prices I'll keep my 1080 Ti, thanks. Even current prices make gaming on a PC look crazy compared to a console. It's sad, because I am a true PC gamer at heart, but the PC gaming market is pricing itself into a niche and that's not a good thing.

Look at an Xbox One X or a PlayStation 4 Pro (£400).

That £400 won't even buy you a GPU that matches the performance of the entire console! Then add the CPU, motherboard, case, PSU, storage and RAM.

To be honest, I do wonder why I keep fuelling that kind of crazy market.
 
The 1070 Ti is already selling at a higher price than the Vega 64...

That depends on where you look. The thing is, while the AMD deal is stellar (especially for the RD 64), the Devil is a huge card, it seems to run a bit hotter or noisier (take your pick), and quite a few people seem to be having teething issues with them (purely going by the 64 owners' thread on here).

They both appeal for different reasons. I honestly can't decide and I don't have a particular monitor making the decision easy for me.
 
People need to look at the size of the chip in the GTX 2080 Ti and the new Titan card. If it is the rumoured 754mm², then it will be the full-fat die, and it would be the first time in a very long time that Nvidia has launched a generation with its large GPU first.

If it is under 450mm², then it is probably one of the smaller chips of the new generation, and there is room for a card with a larger GPU above the GTX 2080 Ti.
 
It is also frequently the case that Nvidia won't release official prices until the last minute to keep things secret; they might offer a ceiling price that some AIBs use as a placeholder.

Exactly this. Until Jensen is standing there with the card in his hand and the price up on the screen behind him, all these leaks are just guesses/rumours. It doesn't matter whether it is Jim from Adored (lots of people seem to think he really knows what is happening), Wccftech (we print everything, so eventually we must be right) or PNY's own website (it's our own site, so it must be legit). Nvidia are the only ones who really know what these cards will be listed at on Monday.

As for the 'it's AMD's fault for not competing' line, just remember that these cards take years to produce; from conception to design to actual manufacturing and getting to the consumer, it can take three or four years. So when Nvidia started on these, it could well have been the 290s or their refreshes that were AMD's best at the time.
You design and build the best product you think you can, and then as the time gets closer you see where its performance sits and take things from there.

I would imagine that Nvidia will be using the best chips for the Quadros, as I bet demand for them will be very high, so that leaves us consumers with cut-down cards, which is to be expected.
 
That depends on where you look. The thing is, while the AMD deal is stellar (especially for the RD 64), the Devil is a huge card, it seems to run a bit hotter or noisier (take your pick), and quite a few people seem to be having teething issues with them (purely going by the 64 owners' thread on here).

They both appeal for different reasons. I honestly can't decide and I don't have a particular monitor making the decision easy for me.

Nash's issues were with his B350 motherboard. The others had to use DDU in safe mode.
 
HDMI 2.1 means VRR, and Nvidia can't opt out of it like they did with DisplayPort adaptive sync.
So expect them to stick with HDMI 2.0 for a long time.

A friend of mine told me yesterday that with HDMI 2.1, VRR is only optional. I've looked online but I cannot find anything definitive either way; I thought it was a compulsory part of the spec, so I'll put it out there to see if anyone else has any thoughts.
 
If a 2080 and 2080 Ti launch at the same time and the non-Ti offers around 25% more performance than the 1080 Ti for £600, I'll bite; I'm not spending £800+ on the Ti when the non-Ti will still offer a good amount of graphical grunt!
 
A 2080 Ti with 4352 cores will most likely have gaming performance on the level of the Titan V, which was about 25% faster than the 1080 Ti stock for stock but had problems utilising all of its 5120 cores in gaming situations. I'm guessing Nvidia fixed that somewhat with Turing, so 4352 cores at gaming boost clocks should be around the performance level of a Titan V clocked at around 1.5GHz.
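
For what it's worth, here is a crude back-of-envelope sketch of that guess in Python (my own assumptions, nothing confirmed: gaming throughput scaling purely with cores times clock, equal per-core efficiency between Volta and Turing, and a guessed ~1.8GHz gaming boost clock for the 2080 Ti):

# Crude check: do 4352 Turing cores land near a Titan V running at ~1.5GHz?
# Sketch only: assumes throughput scales with cores * clock and per-core
# efficiency is equal; the 1.8GHz boost clock for the 2080 Ti is a guess.

def relative_throughput(cores, clock_ghz):
    """Rough proxy for shader throughput: core count times clock speed."""
    return cores * clock_ghz

titan_v = relative_throughput(5120, 1.5)      # Titan V at ~1.5GHz
turing_ti = relative_throughput(4352, 1.8)    # rumoured 2080 Ti at a guessed 1.8GHz

print(f"Titan V @ 1.5GHz : {titan_v:.0f}")
print(f"2080 Ti @ 1.8GHz : {turing_ti:.0f} ({turing_ti / titan_v:.0%} of the Titan V figure)")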
 
At these prices I'll keep my 1080 Ti, thanks. Even current prices make gaming on a PC look crazy compared to a console. It's sad, because I am a true PC gamer at heart, but the PC gaming market is pricing itself into a niche and that's not a good thing.

Look at an Xbox One X or a PlayStation 4 Pro (£400).

That £400 won't even buy you a GPU that matches the performance of the entire console! Then add the CPU, motherboard, case, PSU, storage and RAM.

To be honest, I do wonder why I keep fuelling that kind of crazy market.
Yeah... I still haven't upgraded, because even putting the graphics card aside, after 7 years the CPU, motherboard and RAM worth upgrading to from something like a Sandy Bridge i5 platform would now cost nearly double what they did back then...
 