
** The Official Nvidia GeForce 'Pascal' Thread - for general gossip and discussions **

You missed the important part, although you highlighted it: the 1070, whilst replacing the 970, is priced like the 980.

Nvidia are price hiking again.

I think the draw of the 970 was the sub-£300 price point on release, and when OC'd it could hang in there with the 780 Ti. If they release the 1070 at £400 plus, it puts itself in a completely different market.

I'm not a hobbyist, I'm a price-conscious gamer. I'm not too fussed about having the best. The 970 was the perfect card for me.

Personally, I have no allegiance to either Nvidia or AMD, but I am hoping that AMD can be competitive with Polaris and beyond. It has to be good for the consumer. We certainly don't need Nvidia being allowed to charge inflated prices. The Nvidia shareholders won't mind, though.
 
Now there are some reports that the 1080 will have fewer CUDA cores than the 980 Ti, with a similar transistor count to Maxwell's top end. I know the clocks can go up, but will that offset the loss of CUDA cores (which will themselves be enhanced a bit, if at all)?

That doesn't mean anything. The 980 had 2048 cores compared to the 780 Ti's 2880.

That means the 780 Ti had over 40% more cores than the 980 but was still slower.
 
I think the draw of the 970 was the sub-£300 price point on release, and when OC'd it could hang in there with the 780 Ti. If they release the 1070 at £400 plus, it puts itself in a completely different market.

I'm not a hobbyist, I'm a price-conscious gamer. I'm not too fussed about having the best. The 970 was the perfect card for me.

Personally, I have no allegiance to either Nvidia or AMD, but I am hoping that AMD can be competitive with Polaris and beyond. It has to be good for the consumer. We certainly don't need Nvidia being allowed to charge inflated prices. The Nvidia shareholders won't mind, though.

I would find it surprising if Nvidia priced the X70 card in the £400+ range, unless they know AMD's Polaris cards are going to be a lot slower.

The X70 card has traditionally been around the £300 mark (£280 - £330 ish).
 
I would find it surprising if Nvidia priced the X70 card in the £400+ range, unless they know AMD's Polaris cards are going to be a lot slower.

The X70 card has traditionally been around the £300 mark (£280 - £330 ish).

There are several factors that would likely push the price up: profitability at some of the companies that supply the other parts that make up the GPU has been hit by recent natural disasters and other economic pressures, and 16nm is not the cheapest process to produce products on.
 
That doesn't mean anything. The 980 had 2048 cores compared to the 780 Ti's 2880.

That means the 780 Ti had over 40% more cores than the 980 but was still slower.

+1

You cannot compare cores from one family of GPUs with cores from another family of GPUs.

It is a bit like counting wheels on a vehicle: just because a motorbike only has 2 wheels does not mean it is going to be slower than a car with 4.
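To put rough numbers on that analogy, here's a back-of-envelope sketch in Python using the two cards discussed above. The core counts are the ones quoted in this thread; the reference boost clocks are from memory, so treat them as illustrative. The point is that even a naive cores-times-clock figure still favours the 780 Ti, yet the 980 was faster, because per-core throughput differs between families:

```python
# Core counts quoted above, plus (illustrative) reference boost clocks in MHz.
# The cores * clock product deliberately ignores per-core throughput (IPC),
# which is exactly why such comparisons mislead across GPU families.
cards = {
    "GTX 780 Ti (Kepler)": {"cores": 2880, "boost_mhz": 928},
    "GTX 980 (Maxwell)":   {"cores": 2048, "boost_mhz": 1216},
}

for name, c in cards.items():
    raw = c["cores"] * c["boost_mhz"]  # naive 'shader throughput' proxy
    print(f"{name}: {c['cores']} cores x {c['boost_mhz']} MHz = {raw:,}")

extra = (2880 - 2048) / 2048 * 100
print(f"The 780 Ti has {extra:.0f}% more cores, yet the 980 was faster in games.")
```

The ~41% figure matches the "over 40% more cores" claim above, and the raw product is still higher for the 780 Ti, so neither metric predicts the real-world result.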
 
Or whether people are basing pricing on some vague rumours. Nvidia, like any company, will set prices that maximize profits, pure and simple. Maxwell has very large margins, so even if GP104 cards are more expensive to produce initially, reducing margins in the short term isn't a big deal. Then again, if the 1080 is decently faster than the 980 Ti then it will easily sell at similar prices. Once the 1080 Ti appears, Nvidia can lower 1080 prices back to the middle ground, which will likely coincide with when yields are far better.
 
I think the draw of the 970 was the sub-£300 price point on release, and when OC'd it could hang in there with the 780 Ti. If they release the 1070 at £400 plus, it puts itself in a completely different market.

I'm not a hobbyist, I'm a price-conscious gamer. I'm not too fussed about having the best. The 970 was the perfect card for me.

Personally, I have no allegiance to either Nvidia or AMD, but I am hoping that AMD can be competitive with Polaris and beyond. It has to be good for the consumer. We certainly don't need Nvidia being allowed to charge inflated prices. The Nvidia shareholders won't mind, though.

How Nvidia set UK prices will be completely different again from how they set US prices. In Europe, and especially the UK, companies can generally maximize profit by setting higher relative prices than in the US.
 
Well, if Nvidia keep price hiking as people suggest, then people will stop buying them; either that, or we will be (or already have been) paying stupid money for really low-end parts. A £300 entry-level card, anyone? I cannot see that selling well.

Companies will sell at whatever price the market will sustain; even AMD will sell at the price they can get. For example the Nano: it didn't sell particularly well at £550, so it is now under £400, and it sells quite well at that price point.
 
That doesn't mean anything. The 980 had 2048 cores compared to the 780 Ti's 2880.

That means the 780 Ti had over 40% more cores than the 980 but was still slower.

You are comparing Kepler cores, with plenty of FP64 grunt, with lean Maxwell cores ;) It was much easier to clock Maxwell higher and add some performance because all the compute stuff was chopped off.
Now all that compute stuff is coming back with Pascal, and the core count is still going down ;)

I do understand that each CUDA core iteration is different, but the leaked numbers do not indicate the massive leap in performance some of you are dreaming about, unless Nvidia manages to clock the new chips to kingdom come.
 
Well, if Nvidia keep price hiking as people suggest, then people will stop buying them; either that, or we will be (or already have been) paying stupid money for really low-end parts. A £300 entry-level card, anyone? I cannot see that selling well.

Companies will sell at whatever price the market will sustain; even AMD will sell at the price they can get. For example the Nano: it didn't sell particularly well at £550, so it is now under £400, and it sells quite well at that price point.

No they won't :D The Titan X/980 Ti and original Titan/780 Ti situations showed that Nvidia users are willing to pay whatever they are charged. ;)
I hope I am wrong this time, but Nvidia will sell at whatever they price their products at ;)
 
Now all that compute stuff is coming back with Pascal, and the core count is still going down ;)

citation needed (and not info about the Tesla P100, as for all we know that is completely unrelated to the GeForce GP104 cards)

but the leaked numbers do not indicate the massive leap in performance some of you are dreaming about, unless Nvidia manages to clock the new chips to kingdom come.

citation needed

Also, few think the 1080 will be a "massive leap", but I think it is sensible to assume it will be at least a decent bit quicker (say 20% or so).

Nvidia are not going to release a flagship card over a year after the old one, with a whole new generation and name, a whole new architecture and die shrink, at the same price with no increase in performance. It would be commercial suicide, especially with AMD releasing new cards as well.
 
I do understand that each CUDA core iteration is different, but the leaked numbers do not indicate the massive leap in performance some of you are dreaming about, unless Nvidia manages to clock the new chips to kingdom come.

To be fair, what we have seen so far with other products on 16nm and 14nm, and the information on GP100, tends to suggest that pretty significant jumps in clock speed are possible - though companies like Apple seem to have gone for a balance of increased clock speed and power efficiency over raw performance.
 
Just to be clear, I'm not making a case for the overall suitability of a 780 at 1440p these days
Fair enough. I see a lot of people make really exaggerated claims about what is required for what, and I feel compelled to call it out - not because I like to prove people wrong on the internet, but because I fear that some people reading may get the wrong idea and base a purchase on false or misleading information.

- nonetheless, there is a huge gulf in out-of-the-box performance across the range - something like 20% from an early A1 to one of the last B1s - and what might be OK with one 780 might fall flat on its face on another.
I'd say any performance increases are far more likely to be from driver improvements than actual hardware iteration. At least the bulk of it.

Regarding my own setup - one aspect here is that I'm running G-Sync, which makes it a lot more tolerable when you are getting drops just below 60fps - otherwise the card would have been replaced by now, as it is certainly stretched at 1440p.
Well, you having G-Sync would explain a lot. I'm guessing you're not actually hitting a consistent 60fps in many games. For me, I'm still sensitive to that, variable refresh rate or not.

But take games like GTA V - you can't even max that out at 1080p on a 980 Ti. And there are *tons* of games that won't be playable at 1440p/60fps with high/ultra settings on a 780 or a 970. Seriously, most modern AAA games and quite a few graphics-heavy indie games, too. We're not just talking the odd game here and there.
 
There are several factors that would likely push the price up: profitability at some of the companies that supply the other parts that make up the GPU has been hit by recent natural disasters and other economic pressures, and 16nm is not the cheapest process to produce products on.
Factors that could push the price up, yes. I'd stop short of saying 'likely', though.

Personally, I don't expect to see another £270 X70 card, but I also wouldn't expect anything like a £400 card either. Maybe £320-350. A lot will depend on production numbers and stock, too. If they can be confident of having a large number of GPUs to sell, they'd be wise to go for another very good price like they did with the 970, which sold gangbusters as a result.

It'll also depend on what AMD do, or at least on what Nvidia think AMD are doing.
 
Fair enough. I see a lot of people make really exaggerated claims about what is required for what, and I feel compelled to call it out - not because I like to prove people wrong on the internet, but because I fear that some people reading may get the wrong idea and base a purchase on false or misleading information.

I get that - I often post from a similar perspective.

I'd say any performance increases are far more likely to be from driver improvements than actual hardware iteration. At least the bulk of it.

Something that seems to have passed a lot of people by is that nVidia did a second spin of the Kepler 2.0 silicon - partly due to improvements at TSMC, and partly, I suspect, to test some changes that would later give Maxwell better potential clock speeds and power efficiency. A1-stepping cards typically have an actual out-of-the-box boost of 1006MHz, while B1 cards are typically in the range 1137-1207MHz, and the maximum overclock potential generally shows a similar delta. Long rant edited out, but basically I find many of the review-site numbers for the 780 way too low: having both a 970 and a 780 here to compare against, even a poor-boosting A1 card would struggle not to exceed the numbers posted in recent articles on many sites.
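For reference, the stepping delta quoted above works out as follows (a quick check in Python using only the boost figures given in this post):

```python
# GTX 780 boost clocks by silicon stepping, as quoted above (MHz).
a1_boost = 1006           # typical early A1 card
b1_boost_range = (1137, 1207)  # typical range for later B1 cards

for b1 in b1_boost_range:
    delta = (b1 - a1_boost) / a1_boost * 100
    print(f"B1 at {b1} MHz is {delta:.0f}% above a typical A1 card")
```

That gives roughly 13% at the bottom of the B1 range and 20% at the top, consistent with the "something like 20%" out-of-the-box gulf mentioned earlier in the thread.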

Well you having G-sync would explain a lot. I'm guessing you're not actually hitting a consistent 60fps in many games. For me, I'm still sensitive to that, variable refresh rates or not.

But games like GTA V - you cant even max that out at 1080p on a 980Ti. And there's *tons* of games that wont be playable at 1440p/60fps with high/ultra settings using a 780 or a 970. Seriously, like most modern AAA games and quite a few graphics-heavy indie games, too. We're not just talking the odd game here and there.

Not hitting a constant 60+fps with ultra-type settings, no - but generally it is only dips into the 50s, which with G-Sync is tolerable in single player. If I didn't have G-Sync I suspect I'd have ditched the card by now, but in all honesty I've played, or am playing, many recent games with ultra settings and it's perfectly playable.
 
Has this been posted?

http://videocardz.com/59266/nvidia-pascal-gp104-gpu-pictured-up-close

Quite interesting, especially as the leaker said the 1080 has GDDR5X.

It seems the leaker was wrong; the 1080 does not have GDDR5X.

http://wccftech.com/nvidia-pascal-gp104-gpu-pictured-leaked/

The 1080 has K4G80325FB-HC25 memory chips on the PCB.

I checked the Samsung PDF below, which confirms that K4G80325FB-HC25 is GDDR5, not GDDR5X.

http://www.samsung.com/semiconductor/global/file/insight/2015/08/PSG2014_2H_FINAL-1.pdf

NVIDIA GeForce Pascal GP104 GPUs Are Ready For DirectX 12, Vulkan, VR and Compute Preemption

Compute Preemption looks like Nvidia's answer to AMD's Async Compute.

Before Pascal, on systems where compute and display tasks were run on the same GPU, long-running compute kernels could cause the OS and other visual applications to become unresponsive and non-interactive until the kernel timed out. Because of this, programmers had to either install a dedicated compute-only GPU or carefully code their applications around the limitations of prior GPUs, breaking up their workloads into smaller execution timeslices so they would not time out or be killed by the OS.

Indeed, many applications do require long-running processes, and with Compute Preemption in Pascal, those applications can now run as long as they need when processing large datasets or waiting for specific conditions to occur, while visual applications remain smooth and interactive—but not at the expense of the programmer struggling to get code to run in small timeslices.

Compute Preemption also permits interactive debugging of compute kernels on single-GPU systems. This is an important capability for developer productivity. In contrast, the Kepler GPU architecture only provided coarser-grained preemption at the level of a block of threads in a compute kernel. This block-level preemption required that all threads of a thread block complete before the hardware can context switch to a different context. via NVIDIA
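The pre-Pascal workaround described in that quote - manually breaking a long-running job into timeslices so the display watchdog doesn't kill it - can be sketched in plain Python. This is a CPU-side analogy only: `process_in_timeslices` and `SLICE_BUDGET_S` are made-up names for illustration, not any real CUDA API, and the yield point stands in for ending and relaunching a kernel:

```python
import time

# Stay well under a hypothetical display-watchdog limit (illustrative value).
SLICE_BUDGET_S = 0.05

def process_in_timeslices(data, work_fn):
    """Run work_fn over data in short slices, yielding between slices.

    Pre-Pascal, a GPU programmer sharing the display GPU had to chop work
    up like this so no single kernel ran long enough to trip the OS
    timeout; Pascal's compute preemption lets the hardware context-switch
    mid-kernel instead, making this chunking unnecessary.
    """
    results, i = [], 0
    data = list(data)
    while i < len(data):
        slice_start = time.monotonic()
        # Do as much work as fits in this slice's time budget.
        while i < len(data) and time.monotonic() - slice_start < SLICE_BUDGET_S:
            results.append(work_fn(data[i]))
            i += 1
        # Control returns here between slices, so the OS/display (or, in
        # the GPU analogy, other contexts) stay responsive.
    return results

squares = process_in_timeslices(range(10_000), lambda x: x * x)
```

With instruction-level preemption, the whole loop could be a single long-running call and the scheduler would still be able to interrupt it, which is the developer-productivity point the quoted passage is making.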

An interesting thing about the GP104 qualification sample: the 1614A1 marking means it was manufactured in week 14, which was two weeks ago in early April. It was the final test sample before full volume production.
 