Don't tell me the GTX 3xx series is going to be one of their $$$ massive - "right at the edge of the TSMC die cutter spacer settings" - dies again... Fail
Bob
Why not? GTX 280 was good, and it's still the best single gpu card around.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
The first lot, as they said, is going to be buggy; it's expected to be buggy, and any working parts at all are considered a bonus... it has little direct correlation to the results of the second run.
I didn't think they started talking about yields until further along the line, so it seems someone has come up with their own interpretation of what the experts were saying.
Personally, I'm only gonna start worrying if we see low numbers of working parts off debugged dies - or if they never even manage to get enough working dies to debug at that stage.
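Since "numbers of working parts" is really a statement about defect density versus die area, here's a minimal sketch using the textbook Poisson yield model. The die sizes and defect densities below are made-up illustrative numbers, not anything published by TSMC or NVIDIA, so treat it as a trend only.

[code]
import math

def poisson_yield(die_area_mm2, defect_density_per_cm2):
    # Classic Poisson yield model: Y = exp(-D0 * A)
    area_cm2 = die_area_mm2 / 100.0
    return math.exp(-defect_density_per_cm2 * area_cm2)

# Purely illustrative: compare a ~330mm2 die against a ~500mm2 die
# at a few guessed defect densities for an immature 40nm process.
for d0 in (0.3, 0.6, 1.0):  # defects per cm^2 (assumed values)
    print(f"D0={d0:.1f}/cm^2: 330mm2 die ~{poisson_yield(330, d0):.0%}, "
          f"500mm2 die ~{poisson_yield(500, d0):.0%}")
[/code]

The point of the sketch is just that yield falls off exponentially with die area, so a much bigger die gets hit far harder by the same defect density.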
EDIT: Rroff, it's also worth pointing out this is in no way their first spin of silicon. ATi will easily be on their 2nd spin, if not their third, while Nvidia have already had 1-2 spins fail and are still at unreleasable yields; non-working product after the 3rd/4th spin would be abysmal for Nvidia. I still say it's TSMC's fault, but clock speed and die size are a huge issue for leakage, which is the 40nm process's biggest problem.

I do wonder if Nvidia might have to simply go after the same areas as ATi this round, i.e. a lower-clocked, smaller core, where the highest-end 280 GTX equivalent is scrapped completely and they cut down the core size and shader count to produce a smaller chip. Nvidia have a far larger core and, problematically, fewer shaders clocked massively higher: ATi's highest-clocked parts are what, 750-800MHz on the 5870, while Nvidia, even with no increase over the previous gen, will be at 1.3-1.5GHz on their shaders, which again makes the leakage worse.
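To put rough numbers on the clock-speed point, here's a quick sketch of the usual f * V^2 relation for dynamic switching power. The supply voltages are guesses, the two designs obviously don't switch the same capacitance, and leakage isn't modelled at all, so this only shows the direction of the trend, not a claim about actual cards.

[code]
def relative_dynamic_power(freq_ghz, volts, ref_freq_ghz, ref_volts):
    # Dynamic switching power scales roughly with f * V^2
    # (switched capacitance held equal; leakage ignored entirely).
    return (freq_ghz / ref_freq_ghz) * (volts / ref_volts) ** 2

# Hypothetical comparison: a ~850MHz core domain vs a ~1.4GHz shader domain,
# with assumed supply voltages of 1.10V and 1.15V respectively.
ratio = relative_dynamic_power(1.4, 1.15, ref_freq_ghz=0.85, ref_volts=1.10)
print(f"~{ratio:.1f}x the switching power per unit of capacitance")  # ~1.8x
[/code]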
This is not good for any of us, including nVidia (for which it's very, veeeeeery bad). We need decent competition to ATi dammit, gotta keep those prices down!
Indeed we do. I'd be gutted if those 5800 cards were still over £200 this time next year.
Originally Posted by future Semi-Accurate article
The downward-spiraling wreckage of fail continues today as it has been revealed by reliable sources that NVIDIA's "next-gen" GT300 GPU will use more power than any GPU in the history of the industry, no doubt in a reckless, zealous bout to reclaim some semblance of long-lost glory and prestige the company might have once had (though apparently too long ago for anyone to remember). Instead of any sort of responsible or daresay competent target power envelope for their last desperate effort in the high-end market, NVIDIA has decided to abandon any restraint in gluttony for a staggering 400W consumption level just for the card alone, putting even entire desktop machines to shame.
The same source also revealed a few tidbits of the performance characteristics we might expect from such a ghastly beast, and despite high expectations for hardware that could bring an industrial-strength power generator to its knees, reality bites back to paint quite a different picture. Information reveals performance should be around 10% less than current-gen GTX 295 overall, despite completely broken and unusable driver suites powering the bloated predecessor. It should then come as no surprise that NVIDIA has not only lost the consumer-confidence crown as we've become accustomed to, but also the sheer performance race that it seems they alone care about these days. NVIDIA has truly become worthless even by their own standards. Perhaps they should have just renamed their old parts instead, since that seems to be one thing they're good at.
In other news, NVIDIA sucks, I hate them and they are big fat doodoo heads.
I'm assuming that's half satirical... there's no way it would be 10% less than the 295 GTX, even with horribly broken drivers, if it's even close to the specs nVidia are heading for... the power consumption could be true though.
Buggy is one thing; frankly the bugs should be few, easy to fix, and not a real issue. Yields are a HUGE issue, bigger than bugs. Lots of errata end up in full production chips: i7/i5, P2s, 4870s and the 280 all tend to have bugs that need software workarounds, or simply failed features that never work in the final product.
I can't see it being true either.
That is, unless they're scrapping the jump to 40nm and trying to shoehorn as much power onto a 55nm die as they can.
That might well do it.
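For a sense of what staying on 55nm would cost in area, here's a back-of-envelope under ideal scaling assumptions; real cell libraries never shrink perfectly between nodes, so the figure is only a rough lower bound on how much bigger the die would get.

[code]
# Ideal-scaling back-of-envelope: the same layout moved from 40nm back to 55nm.
area_scale = (55 / 40) ** 2
print(f"Roughly {area_scale:.1f}x the die area on 55nm")  # ~1.9x
[/code]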