** The Official Nvidia GeForce 'Pascal' Thread - for general gossip and discussions **

Man of Honour
Joined
13 Oct 2006
Posts
92,197
The 16FF+ Pascal dies will be a lot more expensive than the far more mature 28nm Maxwell ones. The wafers are almost twice the price.

I wouldn't expect NV to eat that cost. It'll probably blow way past the £1000 mark on release. Unless they leave it and stick with the 1080 until costs can be brought in line with what the market will bear. Then again, people will certainly pay £1200 or more for a Titan. But how many?

If the big core is also what they use to fulfil commercial compute obligations, then that might subsidise costs a bit on the consumer cards - though it might mean a slightly-less-than-full-spec "Ti"-type card at a fairly reasonable price point, and then the fully loaded card at a fairly eye-watering price point.
 
Soldato
Joined
30 Nov 2011
Posts
11,358
The 16FF+ Pascal dies will be a lot more expensive than the far more mature 28nm Maxwell ones. The wafers are almost twice the price.

I wouldn't expect NV to eat that cost. It'll probably blow way past the £1000 mark on release. Unless they leave it and stick with the 1080 until costs can be brought in line with what the market will bear. Then again, people will certainly pay £1200 or more for a Titan. But how many?

They might be nearly double now, but that will drop as they ramp to volume... Big Pascal will cost Nvidia around $110 per die. Even assuming some hefty costs for HBM, there is still plenty of room for a Titan-esque card at $1000 and a Ti-type card below that.
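For anyone wanting to sanity-check that $110 figure, here's a rough back-of-the-envelope sketch. Every input below (wafer price, die size, yield) is an assumption for illustration only - none of them are confirmed numbers:

```python
import math

# All inputs are guesses for illustration - none are confirmed figures.
WAFER_PRICE_USD = 6000    # assumed early 16FF+ wafer price
WAFER_DIAMETER_MM = 300   # standard wafer size
DIE_AREA_MM2 = 600        # assumed "Big Pascal" die size
YIELD_FRACTION = 0.60     # assumed fraction of dies that work

# Classic gross-dies-per-wafer approximation: wafer area / die area,
# minus a correction term for partial dies lost at the wafer edge.
radius = WAFER_DIAMETER_MM / 2
gross_dies = (math.pi * radius ** 2) / DIE_AREA_MM2 \
    - (math.pi * WAFER_DIAMETER_MM) / math.sqrt(2 * DIE_AREA_MM2)

good_dies = gross_dies * YIELD_FRACTION
cost_per_good_die = WAFER_PRICE_USD / good_dies

print(f"Gross dies per wafer: {gross_dies:.0f}")          # ~91
print(f"Good dies per wafer:  {good_dies:.0f}")           # ~54
print(f"Cost per good die:    ${cost_per_good_die:.0f}")  # ~$110
```

With those (made-up) inputs the maths lands right around $110 a die, so the figure is at least plausible.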
 
Soldato
Joined
18 Oct 2002
Posts
19,395
Location
Somewhere in the middle.
I can't quite understand what we are gonna realistically gain with these next-level cards.

The games out by the time of this release are going to be almost identical to what we have now with regards to requirements.

So let's say it will be able to play The Witcher 3 at 4K 100fps. Wow... playing a game that you probably already completed 1.5 years previously, but this time after paying £1000 to get more FPS. How about Far Cry 4 at 4K... uh huh... we've been there... Batman... GTA V...

I appreciate the advancement of technology, but I would be surprised if there is anything remotely mind-blowing by the time the next gen comes out.

Fingers crossed developers really do push it to the max with games, but since the market for a £1k card is so small, I doubt they will see a return on their invested time.
 
Man of Honour
Joined
13 Oct 2006
Posts
92,197
There are one or two games that look like they might take advantage of what DX12 compute can provide - the Dawn Engine (powering the new Deus Ex game) looks like it'll push GPUs with everything turned up.
 
Soldato
Joined
26 May 2014
Posts
2,983
No doubt Fallout 4 will be modded until the engine is creaking under the weight and requires some ridiculous GPU grunt, like modded Skyrim does. There's a new (and no doubt horribly PC-ported) Ass Creed on the way, Rise of the Tomb Raider, the new Homefront running on CryEngine, Just Cause 3, The Division, Doom, a new Hitman, Deus Ex: Mankind Divided (which is supposed to have lots of DX12 goodness), ARK, Dead Island 2... and that's just from now until the first half of next year.

Not that a 980 Ti is likely to have much trouble running most of those acceptably, but there's always going to be some shiny new graphical ****fest coming along for people to test their shiny new toys on. ;)
 
Caporegime
Joined
17 Feb 2006
Posts
29,263
Location
Cornwall
Will the 1070 be priced like the 970?

If it costs more to make, and is seriously quicker... then no.

I'm betting £350-£400 for the 1070.

My personal hope is that the 1060 performs better than a 970. I've allocated about £250 for a 1060, which is what I'm expecting that card to cost (£230-£250).

Who knows, they could be even more expensive than this at launch. I think we've been on 28nm for so long, they can basically charge whatever they want to start with. Even more so if AMD are late to the party (anyone want to bet AMD will /not/ be hit with some kind of delay?)

The 980 Ti releasing just slightly after the Titan X, for nearly half the price, hurt Nvidia's credibility a little imho. I for one won't fall for that trick again.

Even at that price, there are people on this very forum who will buy 8 on release. Just because they can. 8-way Titan P SLI...
 
Caporegime
Joined
24 Sep 2008
Posts
38,280
Location
Essex innit!
I can't quite understand what we are gonna realistically gain with these next-level cards.

The games out by the time of this release are going to be almost identical to what we have now with regards to requirements.

So let's say it will be able to play The Witcher 3 at 4K 100fps. Wow... playing a game that you probably already completed 1.5 years previously, but this time after paying £1000 to get more FPS. How about Far Cry 4 at 4K... uh huh... we've been there... Batman... GTA V...

I appreciate the advancement of technology, but I would be surprised if there is anything remotely mind-blowing by the time the next gen comes out.

Fingers crossed developers really do push it to the max with games, but since the market for a £1k card is so small, I doubt they will see a return on their invested time.

The current gen of DX11 games already has the GFX turned up to 10, but when devs have the extra grunt of a single card they can go that extra notch and turn the GFX up even more. The biggest problem I see is the consoles and how they take top priority whilst we PC gamers are just an afterthought - and the majority of PC gamers are on really low-spec machines...
 
Soldato
Joined
14 Jul 2004
Posts
2,549
I genuinely hope I'm wrong, but AMD's new "we're premium too!" malarkey does not fill me with confidence for a new price war :eek:

Having fallen to a 20% market share, AMD might need to rethink that! But there's no way to know anything for sure - a lot can happen in a year.
 
Soldato
Joined
15 Dec 2007
Posts
16,565
I can't quite understand what we are gonna realistically gain with these next-level cards.

The games out by the time of this release are going to be almost identical to what we have now with regards to requirements.

So let's say it will be able to play The Witcher 3 at 4K 100fps. Wow... playing a game that you probably already completed 1.5 years previously, but this time after paying £1000 to get more FPS. How about Far Cry 4 at 4K... uh huh... we've been there... Batman... GTA V...

I appreciate the advancement of technology, but I would be surprised if there is anything remotely mind-blowing by the time the next gen comes out.

Fingers crossed developers really do push it to the max with games, but since the market for a £1k card is so small, I doubt they will see a return on their invested time.

We're bust until the next console gen. See you in 6 years.
 
Soldato
Joined
31 Oct 2002
Posts
9,962
The 980 Ti releasing just slightly after the Titan X, for nearly half the price, hurt Nvidia's credibility a little imho. I for one won't fall for that trick again.

The problem is the Ti version of Pascal will likely only release once AMD have something out to compete with it.

This could take months - so I expect the same situation as with the 900 series: we'll get the Pascal version of the 980 first, followed by the full-fat Pascal Titan, followed by the Pascal 980 Ti equivalent only if/once AMD release a competitive product.

Many will be unable to wait months, so they'll go for the £850-£900 Titan - and why not, if you've got the excess funds lying around? It will be the top card for a long time.
 
Soldato
Joined
22 Jun 2012
Posts
3,732
Location
UK
I can't quite understand what we are gonna realistically gain with these next-level cards.

The games out by the time of this release are going to be almost identical to what we have now with regards to requirements.

So let's say it will be able to play The Witcher 3 at 4K 100fps. Wow... playing a game that you probably already completed 1.5 years previously, but this time after paying £1000 to get more FPS. How about Far Cry 4 at 4K... uh huh... we've been there... Batman... GTA V...

I appreciate the advancement of technology, but I would be surprised if there is anything remotely mind-blowing by the time the next gen comes out.

Fingers crossed developers really do push it to the max with games, but since the market for a £1k card is so small, I doubt they will see a return on their invested time.

The only reason to get high-end Pascal is for 4K, or if you want to play at 1440p and have 100+ fps with 4xAA in every game. The FPS at 1440p on a 980 Ti is already very good.

Up until now, Nvidia's mid-high-end cards (970/980) have always been a bit crap at higher resolutions because of the small memory bus, but if there are going to be HBM mid-high-end cards, then that won't be a problem. So unless you want 4K 60fps or 1440p at 100+ fps in every game, the 970/980 Pascal equivalents should do better than they have on GDDR5. There might even be a 980 Pascal equivalent which is faster than a 980 Ti with a lower TDP than a 980.
 
Soldato
Joined
7 Aug 2013
Posts
3,510
The only reason to get high-end Pascal is for 4K, or if you want to play at 1440p and have 100+ fps with 4xAA in every game. The FPS at 1440p on a 980 Ti is already very good.
I agree with your post.

The jump from 1080p to 1440p is not the same thing as the jump from 1440p to 2160p, though. I think that's important to point out.

I'd say the 980 Ti is the perfect 1440p/60fps card. But I think the leap to 2160p is underestimated by many. It is a huge, huge leap in terms of pixels needing to be rendered. I really think we need at least 60-70% more power than the 980 Ti to have a 2160p/60fps-capable card, even allowing a little for the fact that 2016 games will keep raising demands.

4K is truly an incredible resolution leap - bigger than anything we've dealt with before. Just a reminder: it is FOUR times the pixel count of 1080p.
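For reference, the raw pixel maths behind that claim - nothing here is assumed, it's just arithmetic on the standard resolutions:

```python
# Pixel counts for the resolutions being discussed.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "2160p (4K)": (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    print(f"{name}: {count:,} pixels")

print(f"4K vs 1080p:    {pixels['2160p (4K)'] / pixels['1080p']:.2f}x")  # 4.00x
print(f"4K vs 1440p:    {pixels['2160p (4K)'] / pixels['1440p']:.2f}x")  # 2.25x
print(f"1440p vs 1080p: {pixels['1440p'] / pixels['1080p']:.2f}x")       # 1.78x
```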
 
Soldato
Joined
22 Jun 2012
Posts
3,732
Location
UK
I agree with your post.

The jump from 1080p to 1440p is not the same thing as the jump from 1440p to 2160p, though. I think that's important to point out.

I'd say the 980 Ti is the perfect 1440p/60fps card. But I think the leap to 2160p is underestimated by many. It is a huge, huge leap in terms of pixels needing to be rendered. I really think we need at least 60-70% more power than the 980 Ti to have a 2160p/60fps-capable card, even allowing a little for the fact that 2016 games will keep raising demands.

4K is truly an incredible resolution leap - bigger than anything we've dealt with before. Just a reminder: it is FOUR times the pixel count of 1080p.

It's over twice the pixel count of 1440p, so to run games at 4K a card over twice the speed of a 980 Ti would be ideal; to run every game at 60+ fps with 2xAA, or even at 60-144fps once 144Hz 4K panels are released, you'd want a card 3-4x the speed of a 980 Ti! As you said, even if Pascal is 50% faster than a 980 Ti, that is still not great for 4K.
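A crude way to put numbers on that, assuming fps scales linearly with pixel count (in practice it usually scales a bit better than linear, as the post above notes, so treat these as rough upper bounds). The 60fps baseline for a 980 Ti at 1440p is an assumed figure for illustration:

```python
# Naive estimate of how much GPU is needed vs a 980 Ti at 1440p.
# Assumes fps scales linearly with pixel count - real scaling is usually
# somewhat better than linear, so these are rough upper bounds.
PIXEL_RATIO_4K_VS_1440P = 8294400 / 3686400  # = 2.25
BASELINE_1440P_FPS = 60                      # assumed 980 Ti figure

def required_speedup(target_4k_fps):
    """Multiple of 980 Ti performance needed to hit target_4k_fps at 4K."""
    return PIXEL_RATIO_4K_VS_1440P * target_4k_fps / BASELINE_1440P_FPS

print(f"4K @ 60 fps:  ~{required_speedup(60):.1f}x a 980 Ti")   # ~2.2x
print(f"4K @ 144 fps: ~{required_speedup(144):.1f}x a 980 Ti")  # ~5.4x
```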
 
Associate
Joined
8 Jul 2015
Posts
886
Location
Sheffield.
I'm hoping someone releases an actual 100Hz 21:9 IPS G-Sync panel by the time Pascal hits next year; I think that'd be perfect to accompany a beastly graphics card such as a 1080/Titan-whatever/1080 Ti.

When will DisplayPort 1.3 become a thing? The extra refresh rate at 4K might make people more inclined to upgrade, especially if a single card can provide that kind of performance.
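The bandwidth maths on that question is roughly as follows. This sketch ignores blanking intervals (reduced-blanking timings make that nearly true), so these are best-case numbers:

```python
# Effective (post-8b/10b-encoding) bandwidth of DisplayPort versions, Gbit/s.
# DP 1.2: 4 lanes x 5.4 Gbps raw, 80% usable after encoding.
# DP 1.3: 4 lanes x 8.1 Gbps raw, 80% usable after encoding.
DP_EFFECTIVE_GBPS = {"DP 1.2": 4 * 5.4 * 0.8, "DP 1.3": 4 * 8.1 * 0.8}

def mode_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Approximate data rate of a video mode, ignoring blanking intervals."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

for hz in (60, 120, 144):
    needed = mode_gbps(3840, 2160, hz)
    verdicts = ", ".join(
        f"{ver}: {'fits' if needed <= bw else 'too much'}"
        for ver, bw in DP_EFFECTIVE_GBPS.items()
    )
    print(f"4K @ {hz} Hz needs ~{needed:.1f} Gbps -> {verdicts}")
```

So 4K at 120Hz squeezes into DP 1.3, while DP 1.2 tops out around 4K/60 - which is exactly why 1.3 matters for high-refresh 4K.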
 
Caporegime
Joined
17 Feb 2006
Posts
29,263
Location
Cornwall
Up until now, Nvidia's mid-high-end cards (970/980) have always been a bit crap at higher resolutions because of the small memory bus, but if there are going to be HBM mid-high-end cards, then that won't be a problem. So unless you want 4K 60fps or 1440p at 100+ fps in every game, the 970/980 Pascal equivalents should do better than they have on GDDR5. There might even be a 980 Pascal equivalent which is faster than a 980 Ti with a lower TDP than a 980.

Do we know that all Pascal cards in the 10xx series will use HBM? Same with AMD?

Is there a chance they'll stick with GDDR5 for the lower-end offerings?
 
Soldato
Joined
22 Jun 2012
Posts
3,732
Location
UK
Do we know that all Pascal cards in the 10xx series will use HBM? Same with AMD?

Is there a chance they'll stick with GDDR5 for the lower-end offerings?

Don't know, but it would be a bit crap if they stick with GDDR5 and only have HBM on the £500+ cards.
 
Soldato
Joined
7 Aug 2013
Posts
3,510
It's over twice the pixel count of 1440p, so to run games at 4K a card over twice the speed of a 980 Ti would be ideal; to run every game at 60+ fps with 2xAA, or even at 60-144fps once 144Hz 4K panels are released, you'd want a card 3-4x the speed of a 980 Ti! As you said, even if Pascal is 50% faster than a 980 Ti, that is still not great for 4K.
I make some allowance for the fact that you don't always need 2x the power to drive 2x the resolution (it depends on various factors), and also that HBM2 will remove a bottleneck at such high resolutions. AMD's Fury line-up on HBM1 has already shown markedly better competitiveness at 4K against Nvidia's GDDR5 cards, and HBM2 will provide another doubling of bandwidth over that. That may not mean a doubling of performance over HBM1, but it certainly means that 4K won't require a raw quadrupling of power over 1080p.

Do we know that all Pascal cards in the 10xx series will use HBM? Same with AMD?

Is there a chance they'll stick with GDDR5 for the lower-end offerings?
We don't know anything yet, but I think it's possible that either GDDR5 or HBM1 ends up on the lower-end cards. GDDR5 definitely on the commercial low-end cards (below the x50 tier).
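To put rough numbers on the bandwidth-doubling point above: the GDDR5 and Fury X figures below are the shipping specs, while the HBM2 line assumes the spec's 2 Gbps-per-pin rate with four stacks, which actual cards may not use:

```python
# Peak memory bandwidth (GB/s) = bus width (bits) x per-pin rate (Gbps) / 8.
def bandwidth_gb_s(bus_width_bits, gbps_per_pin):
    return bus_width_bits * gbps_per_pin / 8

configs = {
    "GTX 980, 256-bit GDDR5 @ 7 Gbps": (256, 7.0),
    "980 Ti, 384-bit GDDR5 @ 7 Gbps": (384, 7.0),
    "Fury X, 4x HBM1 stacks @ 1 Gbps/pin": (4 * 1024, 1.0),
    "Hypothetical 4x HBM2 stacks @ 2 Gbps/pin": (4 * 1024, 2.0),
}

for name, (width, rate) in configs.items():
    print(f"{name}: {bandwidth_gb_s(width, rate):.0f} GB/s")
# -> 224, 336, 512 and 1024 GB/s respectively
```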
 
Caporegime
Joined
17 Feb 2006
Posts
29,263
Location
Cornwall
Don't know, but it would be a bit crap if they stick with GDDR5 and only have HBM on the £500+ cards.

Tbh, I think it's quite likely. I'm sure the 1080 and above will have HBM; the 1070, I have no idea about.

But the 1060 and below I can see still being on GDDR5. There will probably only be 4GB on those cards, and nV might well conclude there just isn't the need for HBM bandwidth to feed those cut-down cards.

Plus, HBM2 supply is likely to be a little constrained, so they won't want the whole line-up using it.

So I don't think we've seen the last of GDDR5 quite yet. I think this will apply to AMD's new line-up too. Mid-range and below on GDDR5.
 
Soldato
Joined
22 Jun 2012
Posts
3,732
Location
UK
It'll probably be 2017 before you can get a £350-450 card with HBM. I have a bad feeling that the 1080 Ti and Titan are going to be ridiculously expensive.
 