Didn't take you long to get over the Fury X hype I see.
Lol, I'm always looking forward

Pascal looks like it could be an absolute monster: new architecture, die shrink and HBM 2.0. Should be a big step.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Didn't take you long to get over the Fury X hype I see.
^^ Yup.
The flagship Titan-class Pascal GP200 could well have 12GB / 16GB of HBM, with the normal flagship GP204 having 6GB / 8GB of HBM.
They are going to be pricey for sure, but performance is going to be mind-blowing.
We're getting Gen 2.0 HBM along with a massive die shrink, down from 28nm to either 16nm or 14nm, plus a new architecture. These cards are going to be scary!
There is no way they will manage to make it twice as fast, not in real terms anyway. Not in the first release.
You only need to look at the jump from 40nm to 28nm to see where things will end up in the first year.
Exciting stuff for sure. IBM are out of the fab business now, though, aren't they? Presumably they intend to license this technology to Samsung and GloFo.
Why people think you'll be able to pay the same money in a year and a half and get double the performance is beyond me.
The first Pascal cards will be an improvement, but not 80% better than what we have today (TX/Ti).
It's always possible of course that Nvidia will drip-feed progressively faster versions of what's possible and/or take advantage of increased efficiency to reduce their own costs rather than heading directly towards performance.
http://www.kitguru.net/components/c...0nm-risk-production-actual-chips-due-in-2017/
Maybe this is why NV were courting Samsung, who btw have just announced an even more aggressive push towards 10nm. What I read from all this is that 14/16nm will be disappointing; not sure why else they would be suicidally throwing themselves at such a tricky and expensive engineering problem, aside from Apple's patronage of course.
I won't. Some companies are pushing ray-tracing hybrid GPUs for the next consoles, which would be way better than a 980 Ti, or better than a Titan X in some ways, if it comes about.
I will be shocked if the next consoles have anything better than a 980Ti.
That could very well be the case, but I wouldn't be shocked either way. Or they might have 5-year-old off-the-shelf x64 parts.