nVidia GT300 - GeForce GTX 380 yields are sub-30%

ATI have R800 ready and waiting, to launch alongside Windows 7 at the latest, according to FUD.

Unless some DX11 games come along that drop jaws the way Crysis did, it's mostly a propaganda victory though.

So why did people buy the G80 then? It wasn't for DX10 but for DX9.
 
Because the G80 absolutely owned at DX9 compared to any other card out at the time...

Whereas there's very little on DX9 at the moment that doesn't run silly fast on current high-end cards, and it's likely to stay that way for a while. In fact, as far as DX9 goes, neither the new ATI cards nor the GT300 are going to be that much faster; it's DX10+ where they will shine (or not).
 
This arch was being worked on for like 3 years, it takes a lot longer than you think to design a GPU.

That's fair, but I hope GT400 will be a new direction - I don't think nVidia can take this much further than they already have without switching to something a lot smaller and more efficient.
 
PC hardware is so boring these days compared to what it used to be...

Agreed. Not a fanboy, but particularly Nvidia's recent hardware is all basically rebranded except for a few parts, unlike AMD's, where every time they bring out a new card it actually differs, like the HD 4770 with a 40nm die. I mean, wtf? lol
 
Can't say I would agree with that. There's a huge amount going on.
The problem is it seems like tech takes too long to get into consumers' hands, and when it does arrive it's out of date.
The internet information effect.

Granted, Nvidia have been riding those old cores a bit but the economic climate is not the best.
 
The Global Foundries move reeks of potential for dodgy price fixing :S

Would do if TSMC wasn't a fall back option.

As I've been saying for a while, TSMC have screwed ATI on 65nm, screwed Nvidia on 55nm (though in fairness that was largely down to Nvidia not playing ball with TSMC, by all accounts) and screwed both companies on 40nm. Basically they've screwed up every single process shrink in the past four years, and are rather a joke of a company.

Global Foundries getting up and running, and significantly getting the New York fab up and running asap, will be good for everyone: they can take on large orders in a very quick, high-yield, high-tech plant, meaning reduced costs for all end users. TSMC will finally have real competition for their contracts, as right now there is no viable large-scale alternative for Nvidia/ATI (unless Intel start selling off spare fab capacity to both companies, which, considering they closed three fabs down due to the current economy, would actually be a smart move for them ;) ). That will mean TSMC spend more on R&D, more on upgrading their kit to more current/bleeding-edge equipment, and do better with their shrinks. Once TSMC become competitive, Global Foundries will do the same and get cheaper.


As for saying all new parts have poor yields, it's utterly ridiculous; Intel, AMD, Samsung, IBM and plenty of others have no such problems. TSMC's problem was cutting 200 people from their R&D department and cutting spending by 50%, which they have since reversed since GloFo got serious.

But TSMC have been milking Nvidia/ATI for years, and many other chip makers too. Because there was no competition there was no alternative, so TSMC continued to put as little cash in as possible, didn't lose any business, and made a tonne more money than they "should" have been making. Mistakes didn't matter, as where was anyone going to go? They'll finally be competing for business, so these problems should disappear.


There is still the issue that Nvidia, like ATI, knew all this. They saw how ATI got utterly screwed with their "monolithic" R600, which was supposed to release on 65nm and on time, but was late, on the wrong process, cost a bomb and made no money. Nvidia, with three years' notice, still haven't made ATI's move to small cores. ATI are still highly susceptible to TSMC's screw-ups, but leakage quite literally affects a smaller core to a lesser degree, and smaller cores will almost always (very rarely not) yield higher than a larger core. Nvidia really needed to copy ATI's move to be competitive on cost and yield, and to survive TSMC's screw-ups. It's no coincidence that ATI's cards were first to 65, 55 and 40nm: small core, simple design, less current leakage and higher yields. It's cost Nvidia hugely in the past year, and it's getting worse.
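To put rough numbers on the small-core yield argument, here's a sketch using the classic Poisson defect-density yield model. The defect density and die areas below are illustrative assumptions (not TSMC's actual figures), just to show how hard a big die gets punished on an immature process:

```python
import math

def poisson_yield(die_area_mm2, defects_per_cm2):
    """Classic Poisson yield model: Y = exp(-D * A)."""
    area_cm2 = die_area_mm2 / 100.0
    return math.exp(-defects_per_cm2 * area_cm2)

# Assumed defect density for an immature process (illustrative only)
d0 = 0.5  # defects per cm^2

big_die = 470.0    # mm^2, roughly a GT200-class monolithic core (approximate)
small_die = 137.0  # mm^2, roughly an RV740-class small core (approximate)

print(f"large die yield: {poisson_yield(big_die, d0):.1%}")
print(f"small die yield: {poisson_yield(small_die, d0):.1%}")
```

With those assumed numbers the big die lands around 10% yield while the small die is around 50%, which is the whole point of the small-core strategy: the same defect density hurts a large core exponentially harder.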
 
ATI have R800 ready and waiting, to launch alongside Windows 7 at the latest, according to FUD.

Unless some DX11 games come along that drop jaws the way Crysis did, it's mostly a propaganda victory though.

What, are you nuts?

Of course it won't be a propaganda victory; they will be out first and will sell a boatload of 5870s.

A few die-hard Nvidia fanboys will wait for GT300, but most people on this forum will move straight to the 5870.
 

I don't think 'most' people will go for ATI until they see how Nvidia compares on price/performance.
 
If ATI release a 5870X2 and it has a decent improvement over the 4xxx series, they will shift them no problem, if pricing is as competitive as it has been. TSMC have needed competition for a long while, and now they have it, so they are going to have to start making more of an effort than they have in the past. This is a problem for Nvidia and there is no getting away from that, but how well the G300 performs will decide how much of a setback this ultimately turns out to be. Also, there's no getting away from the fact that if ATI do release the 5xxx this year it is more than just a propaganda victory. Being honest, I don't really expect the initial 5xxx to be the ultimate DX11 killer tech, but it should finally run DX10 very well, as that is usually how it works: the first card for a new API is better at the old API than at the new one.
 
Well, there's some news/rumours that Nvidia will release DX10.1 graphics cards for desktop in September; I bet that's when ATI will respond with the DX11 RV870.

source : http://www.fudzilla.com/content/view/14383/65/

The problem being, the 4770, compared to a 40nm version of, say, the GTX 280, is less than half the size (maybe closer to a third), and it has yield, heat and power problems all over the shop. So add in a much larger core, with massively lower yields, and, well, you see the issue.
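Die size feeds straight into cost per chip, too, since it sets how many candidate dies you get off each wafer. Here's a rough sketch using a standard gross-dies-per-wafer approximation on a 300mm wafer; the die areas are my own approximate figures for an RV740-class and a GT200-class core, not official numbers:

```python
import math

def gross_dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Rough gross die count before yield loss.

    Common approximation: N = pi*(d/2)^2 / A  -  pi*d / sqrt(2*A)
    (wafer area over die area, minus partial dies lost at the edge).
    """
    r = wafer_diameter_mm / 2.0
    return int(math.pi * r * r / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

# Approximate die areas in mm^2 -- illustrative assumptions
print(gross_dies_per_wafer(300, 137))  # small RV740-class core
print(gross_dies_per_wafer(300, 470))  # large GT200-class core
```

On those assumptions the small core gets you well over three times as many candidate dies per wafer, before yield differences even enter the picture, so a small-die strategy wins twice over.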

Now the worrying thing is that Nvidia have mobile DX10.1 parts on 40nm, low yield as I hear, but just doable. Why? They're about a quarter the size of the current GTX 280, with (iirc) a massively smaller bus and 96 SPs, instead of a 512-bit bus and 240 SPs. My guess would be mobile parts branded to desktop that will be, let's be honest, embarrassingly slow. A GX2 version of mobile parts, like they did generations ago, would still have only the raw SP power of an original GTX 260, just with DX10.1 (which shouldn't be undervalued), but would be an expensive dual-core card and a total waste of cash.


My bet would be mobile parts in desktop packaging, or a never-to-appear card announced to make people wait and not spend cash on ATI cards, hoping they keep waiting all the way to Nvidia's new cards.

Let's be honest, they've designed and seemingly had two or three respins of their NEW core on 40nm, of which none are working. Other than being marginally smaller, the GTX 280 design is no less complicated, but it would cost a lot to port it over to the new process, especially for a short production run. If they were forced to make a GTX 280 + DX10.1 + 40nm part just for a couple of months, it begs the question: is it "only" the size and 40nm that are the problem with the new core, or are there fundamental problems with the new core not actually working very well?


But as I've said before, Nvidia are morons who refuse to adapt. Nvidia's own CEO types were saying TWO YEARS AGO that you can't compete in this market without an entire platform. To be fair, they probably can't do much in terms of getting a CPU option out, but they've done very little to win the integrated mobo sector, from which they will now be wiped out in the next year or so. It was also clear from TSMC's past three years, and from ATI's two-year shift to value-market pricing and a massive drop in core size, that it's the only way to compete, and yet two years ago they didn't start working on small cores. They've blindly gone on trying to win the smallest market around with massive, expensive cores that are hard to make, at a company that can't deliver new processes on time.

ATI/AMD have adapted to what's going on around them and prospered; Nvidia have seen the problems, ignored them, and run blindly into worsening problems without a change in strategy.

Which isn't great for any of us: if Nvidia stop being competitive, it's bad news for all of us. Though Intel stepping in might keep the fight over performance and prices going.
 
Would do if TSMC wasn't a fall back option.

I don't mean by GF or even AMD directly... I mean if Nvidia were to use GF, it would be a massive amount of business for GF; then between Nvidia, themselves and ATI they could negotiate prices to make as much money off the consumer as possible... bye-bye cheap cards from either company.
 