ATI next gen tapes out

I haven't found a game yet that makes my GTX260 cry... guess I'll be waiting for an upgrade.

Maybe Mafia 2 will make it cry. I'll probably still be waiting then, as no doubt the new models won't run it great either.
 
SI is a hybrid of Evergreen and NI? It sounds interesting... doesn't surprise me if it does stay on 40nm. When will AMD move onto GlobalFoundries?
 
I read as far as "if all goes really well we could" and then stopped.

Don't get disheartened. If you heard "if all goes really well we could" out of Nvidia's CEO's gob then you could be sure whatever he's promising won't happen... i.e. "if all goes really well Fermi will be available 23rd November"...... :p


For AMD, "if all goes really well we could" has generally turned out to be the case for 30 months or so ;)

It's always going to be a hard ask to get some decent parts out for a trade show 2-3 months in advance of retail release, more so on a dodgy 40nm process. AMD will likely be increasing shader count, which most likely means a bigger core, which means running into yield problems because TSMC's process is such a complete turd.

As they increase core size, knowing exactly which clock speeds and shader counts will give the best yields is very hard to call today.

It's a tough call on performance. Given that they simply know the manufacturing better than Nvidia, and have tweaked their design to suit the process unlike Nvidia, they "could" go for a similarly sized core and get higher yields. But even at 30-35% yields versus Nvidia's 20-ish%, the price would be significantly higher than 58xx series costs.

I reckon they'll go for something about midway between the current 58xx size and Fermi size: slightly tweaked shaders and a better-performing uncore that hopefully takes a little less space, leaving room for a 20-25% bump in shader count. That would increase the die size, but not to disastrous Fermi yield proportions.

I doubt they'd want to go beyond 2.5 billion transistors, up from 2.15 billion and still short of Nvidia's 3.1 billion. There's not a huge amount of room for increasing performance.

A "refresh" would be the same core at higher clocks, its not a refresh, its getting significant but not massive changes, even with a massive architecture change, if they stay around 2.15billion transistors they aren't likely to gain more than 5-10% efficiency, more performance costs more transistors.

If TSMC had their 32nm on time then they'd have room from the shrink to go to 4 billion transistors and stay close to the same size as a 58xx core; stuck at 40nm, they are massively limited on transistor increases.
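As a rough sanity check on that transistor budget, here is a minimal sketch assuming idealized geometric scaling from one node to the next (real processes only approximate this, so treat the output as ballpark only; the 2.15 billion starting point is the 58xx-class core discussed above, and the 28nm case anticipates the next paragraph):

```python
# Rough estimate of how many transistors fit in the same die area after a
# process shrink, assuming idealized geometric (quadratic) density scaling.
def scaled_transistor_budget(transistors: float, old_node_nm: float, new_node_nm: float) -> float:
    density_gain = (old_node_nm / new_node_nm) ** 2   # area per transistor shrinks ~quadratically
    return transistors * density_gain

cypress = 2.15e9   # ~2.15 billion transistors in the current 58xx-class core
print(f"40nm -> 32nm: {scaled_transistor_budget(cypress, 40, 32) / 1e9:.2f} billion in the same area")
print(f"40nm -> 28nm: {scaled_transistor_budget(cypress, 40, 28) / 1e9:.2f} billion in the same area")
# Prints roughly 3.36 and 4.39 billion, which is the ballpark behind the
# "room to go to 4 billion transistors at about the same die size" argument.
```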

28nm is a huge drop from 40nm though, and even bigger when you move from a crap TSMC 40nm to an excellent GloFo 28nm. It looks set to be maybe the 2nd quarter of next year before GloFo is up and running with bulk 28nm. TSMC is supposedly due in the 2nd quarter too, but most analysts and industry people think that's BS: TSMC haven't hit a release target for a process in 5 years, often miss them by 6-12 months, and often don't get good yields for a further 6-12 months.


EDIT:- Are we sure the Nvidia CEO promised stock before Xmas and on 23rd November, and also said 2009 when he said that? Maybe he meant 2010 ;)
 
Nvidia is simply in almost the same position ATI were in with the 2900XT. It consumes too much power, runs too hot and doesn't perform as well as expected. Well, let's be honest, they are quick cards (unlike the 2900XT) but overpriced.

I think Nvidia is confused about what to do. They see the PC gaming market dwindling (which drives sales and advertising), so they try to branch out with what the cards can do, but at the expense of their main role. ATI are still just going the gaming route, so in some ways hats off to Nvidia, even if they are a bit misguided and risking so much.

Of course the TSMC issues don't improve matters, nor does a global meltdown, but that's business: you have to ride it out. This is what Nvidia will have to do, just like ATI did, and play heavy on the marketing.

I would love to have a Fermi card if money were not an issue, but for the majority it is, and these cards just are not priced competitively. ATI is simply better value at present. It will change, even if one company bows out.
 
Interesting stuff. Nvidia will get bailed out though if they really do get into deep water, as they are the only competition ATI has in that market, so they have to survive; otherwise all consumers will get the shaft, with ATI pricing their parts however they want, lol.
 

The problem with the 2900XT comparison is that Fermi is being made on the 40nm process as intended and is still struggling with power, heat and noise. The 2900 was designed for 65nm, but because of TSMC it had to be built on the 80nm process, which gave the 2900 most of the problems it had. I still think on 65nm the 2900 would have had problems competing with the 8800GTX, but it would have been a lot closer than it was and it would not have suffered with heat, noise and power like it did.
 
I think if ATI do manage this launch fairly quickly, NV will be in deep do-do. A respin of a design that already seems better than NV's would make Fermi look well overpriced. However, as we all know the bread-and-butter sales are what really matter, and if they can improve yields and performance at the lower end, that is where they will hurt the competition the most. Will be interesting to see how this pans out...
 
They do not cost £50 more than a 58xx to produce, that's utter rubbish. A wafer costs $5k; that's not under discussion, anyone in the world would tell you that. If you're getting around 15 usable cores per wafer across 480s and salvaged 470s, that's $5k/15 = $333 just for the core; if you're getting fewer than that, the price goes up.

That's just the CORE; most generations the core would cost something along the lines of $40-80, and they'd make 100% profit selling it at $80-160.

$333 is JUST the core. It's also got more memory, and 512MB of the fastest memory in the world costs a good $20-30 extra, so probably $70-100 for the memory in total (prices are not good for memory of any type right now); $50 for a very complex PCB; military-grade VRMs/caps is another $10-20; the cooler will cost $30; then DVI/HDMI connectors, putting it all together, the bundle with the cards, boxing and shipping. They likely cost a minimum of $500, absolute bare minimum, probably closer to $600-650.
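As a rough illustration of the arithmetic in this post, here is a minimal sketch of the cost model; the wafer price, the ~15 usable dies per wafer and the bill-of-materials figures are just this post's assumptions, and the final assembly/shipping line item is a placeholder of mine since the post doesn't put a number on it:

```python
# Back-of-envelope GTX470/480 card cost under the assumptions in this post.
WAFER_COST = 5_000           # assumed $5k per 40nm wafer
GOOD_DIES_PER_WAFER = 15     # assumed usable GTX480/470 dies per wafer

die_cost = WAFER_COST / GOOD_DIES_PER_WAFER          # ~$333 per core

# Rough bill of materials, straight from the post's guesses.
bom = {
    "GDDR5 memory": 85,                           # midpoint of the $70-100 estimate
    "PCB": 50,
    "VRMs / caps": 15,                            # midpoint of $10-20
    "cooler": 30,
    "connectors, assembly, box, shipping": 30,    # placeholder, not from the post
}
card_cost = die_cost + sum(bom.values())
print(f"core ~${die_cost:.0f}, card ~${card_cost:.0f}")
# Roughly $333 for the core and $543 for the card, inside the post's
# "$500 bare minimum, probably $600-650" range.
```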

You are assuming, here, that yields are around 15%, when even back on the A2 revision of the die in November they were getting around 25% (http://www.brightsideofnews.com/news/2010/1/21/nvidia-gf100-fermi-silicon-cost-analysis.aspx).
The shipping A3 is almost certainly better than that, up above 30%, which puts usable chips at a cost of at most $150. Add $20 for the PCB, $20 for board components, $40 for 12 GDDR5 chips, $25 for the HSF, $10 assembly and $10 packaging, and my estimate puts it at around $275 to manufacture. With a $349 RRP on the GTX470 and $499 on the GTX480 there's a profit margin on both cards. Those margins may be slim compared to what Nvidia is used to, but they aren't bankrupting themselves by losing hundreds of dollars on every card.

Until someone leaks the final yield numbers there's no way to work out what the actual total manufacturing costs are.
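For comparison, the same sort of back-of-envelope calculation under this reply's assumptions (every figure below is one quoted in the reply, none are confirmed numbers):

```python
# The reply's alternative assumptions: ~30% yield putting a usable die at ~$150.
die_cost = 150
bom = {"PCB": 20, "board components": 20, "12 GDDR5 chips": 40,
       "HSF": 25, "assembly": 10, "packaging": 10}
card_cost = die_cost + sum(bom.values())                 # $275 total

for name, rrp in (("GTX470", 349), ("GTX480", 499)):
    print(f"{name}: ~${card_cost} to build, ${rrp - card_cost} left before channel costs")
# ~$275 per card, leaving $74 (GTX470) and $224 (GTX480) before the AIB,
# distributor and retailer cuts that are debated later in the thread.
```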
 

You asked for leaked numbers?

http://forums.overclockers.co.uk/showthread.php?t=18138407

It's enough to send the investors into a panic anyway. The info about how good the yields were with A2 was Nvidia spin, as can be seen clearly now. Fact is, Nvidia has only made 10,000 Fermi chips full stop so far, which is appalling.

If yields were higher, there would be many more gtx4xx cards available.

So 94 maximum cores per wafer means that Nvidia are getting between 19 and 28 usable GTX480/470 cores per wafer, which gives them a price of between $178 and $263 per core. Not as bad as Drunkenmaster said, but still appalling against the roughly $71 cost for ATI. That makes Fermi between $100 and $200 per core more expensive.
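Just to show where that $178-$263 band comes from: with the 94 candidate dies per wafer from the leak and the $5k wafer price assumed earlier in the thread, the two ends of a 20-30% yield band work out as follows (a sketch using only the figures quoted here):

```python
# Per-core cost band implied by the leaked wafer figures.
WAFER_COST = 5_000     # assumed $5k per wafer, as earlier in the thread
DIE_SITES = 94         # maximum candidate dies per wafer, from the leak

for yield_pct in (20, 30):
    good_dies = round(DIE_SITES * yield_pct / 100)        # 19 or 28 usable dies
    print(f"{yield_pct}% yield -> {good_dies} dies -> ${WAFER_COST / good_dies:.0f} per core")
# 20% yield -> 19 dies -> $263 per core
# 30% yield -> 28 dies -> $179 per core
```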

As Drunkenmaster said, the rest of the components on the Fermi cards are more expensive than ATI's as well.

And lastly, even going by your figures and getting to $275, you are mistakenly assuming that Nvidia gets all the RRP money at $349 and $499. Nvidia's partners need a profit, plus distribution costs, plus the retailer's profit. Even with your figures, each GTX470 will be a loss for Nvidia, with perhaps a small profit on the GTX480. If the yields on some wafers are as little as 20%, as reported, then even the GTX480 will be losing Nvidia money on each one sold.

So with the 5870 having an RRP of $449 and Nvidia having a core cost of between $100 and $200 more, you can see that even if Nvidia are just making a little money per card, ATI are making $50 to $150 more per card. That is a lot when you have shipped 6 million cards compared to Nvidia's 10,000.
 

I'm having a hard time understanding where these numbers come from. From the reports that were released recently, operating income for AMD's graphics segment was $47m for Q1/2010 and $50m for Q4/2009. That means a total of $97m for the last 6 months. If they shipped 6 million DX11 cards in the last 6 months then we are talking about less than $16 per card shipped on average (considering that they would have also shifted a fair number of non-DX11 cards in the same period). Also bear in mind that operating income does not include taxes.
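The ~$16 figure here is just the quoted operating income divided by the claimed shipments; trivially (figures are the ones quoted in this post):

```python
# Per-card operating income implied by the figures quoted in this post.
operating_income = 47e6 + 50e6   # Q1/2010 + Q4/2009 for AMD's graphics segment
cards_shipped = 6e6              # claimed DX11 card shipments over roughly the same period
print(f"~${operating_income / cards_shipped:.2f} of operating income per card shipped")
# ~$16.17, and even less once non-DX11 cards sold in the period are counted.
```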
 

Exactly, I wasn't saying ATI made $50 to $150 per card. I am saying they make $50 to $150 more than Nvidia. If ATI only made $16 per card shipped then that just goes to prove that Nvidia must be losing a fortune per Fermi.
 
Let's not forget the profit margin will vary from card to card in the range. I'm sure ATI have far lower margins on low/mid-range products than they do on the high end. Since the convo seems to be focusing on the high end, it wouldn't surprise me if ATI make on average ~$100 per card.
 
Firstly, BSN have never been accurate: when they made their numbers up completely, he had the cores per wafer calculated completely wrongly, which just made him look a complete moron. He comes out with these odd articles fairly frequently, but it helps to get the basic maths right when you make crap up, to make it sound remotely believable.

When analysts talk about numbers, a lot of them don't really know what they are talking about in all honesty; it's guessing. They are also most likely guesstimating yields based on the wafer output possible from the time production started, which they've most likely got wrong as well.

These cards we're seeing now are hot lots: risk wafers that get made at the same time as the silicon is taping out. It saves 6 weeks, or gives you 6 weeks of extra production.

But if they are seeing 10,000 cores sold, for instance, and assuming production started in Feb, you'd get say 3.3k cores per month on say 1k wafers a month, and come up with a guess at yields.

The problem is they probably started production as a hot lot, at the same time A3 was sent to tape out, which would be mid-November. That means a LOT more wafers have been run over those extra 6 weeks to get to the number they've come up with, which would actually show far lower yields.

I.e. they could be assuming, as I said, 1k wafers a month for 3 months and 10k cores, when in fact it's been 1k wafers a month for 4.5 months and still the same amount of supply; that would bring the implied yields down significantly.
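A small sketch of that point, using only the guesses in this post plus the 94 die sites per wafer from the earlier leak; the absolute percentages are only as good as those guesses, the direction of the effect is the point:

```python
# How the assumed production window changes the implied yield.
DIE_SITES = 94            # candidate dies per wafer, from the earlier leak
WAFERS_PER_MONTH = 1_000  # guessed wafer allocation
CORES_SHIPPED = 10_000    # reported Fermi cores so far

def implied_yield(months_in_production: float) -> float:
    candidates = WAFERS_PER_MONTH * months_in_production * DIE_SITES
    return CORES_SHIPPED / candidates

print(f"3 months of production   -> {implied_yield(3.0):.1%} implied yield")
print(f"4.5 months of production -> {implied_yield(4.5):.1%} implied yield")
# The extra 6 weeks means 1.5x as many wafers for the same output,
# cutting the implied yield by a third.
```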

They also work in "bands" because you can't accurately give a yield per wafer, as it's variable. 20-30% means the band they think the yield is in; it's likely at the VERY bottom of that band, not anywhere near 30%, and every piece of info suggests it's not even as high as 20%.

Basically, for AMD 5-8k cores would be 700-1000 wafers, something they could do in a month. Nvidia could have been making wafers since mid-November, they have a higher allocation, and they have less than 10k cores out so far with no signs of big new shipments.


Even then, as Greebo said, you've got AIBs marking up the cards, and distributors and retailers all taking a cut, meaning that even at the incredibly, well, inaccurate number of $279 a card, there's 0% chance of Nvidia taking anything but a loss to get them into stores at $399. They'd need to leave Nvidia at maybe $200 WITH profit, to let the other guys take their cut and still sell at under $400 at a profit.
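To make the channel argument concrete, here is a heavily hedged sketch; the retailer/distributor/AIB percentages are purely illustrative placeholders (the thread doesn't give them), the point being only that each cut shrinks what actually flows back to the GPU vendor well below the shelf price:

```python
# Working back from street price to what the GPU vendor actually receives.
# All three channel margins below are made-up illustrative figures.
street_price = 399
retailer_margin = 0.10      # hypothetical retail margin
distributor_margin = 0.05   # hypothetical distribution margin
aib_margin = 0.15           # hypothetical board-partner margin + build overhead

to_aib = street_price * (1 - retailer_margin) * (1 - distributor_margin)
to_vendor = to_aib * (1 - aib_margin)
print(f"AIB receives ~${to_aib:.0f}, GPU vendor receives ~${to_vendor:.0f}")
# Even with these fairly gentle cuts only ~$290 of a $399 card flows back;
# steeper, more realistic cuts head toward the ~$200 figure argued above.
```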
 

Operating income is revenue minus expenses. Profit from cards sold has to cover all the R&D money pumped into the teams making the next gen, and the next gen after that. Operational expenses are huge when you've got thousands of employees, high R&D costs and a process that offers less profit per core than ever before.

I've been trying to point out for a long time that the 4850 was a low-profit card designed to sell in massive quantity, and the 5850 is the same kind of card offering a VERY similar level of profit, i.e. not much at all. The cost of making a 5850 compared to a 4850 has more than doubled; memory prices have increased, and remember the 4850 had incredibly cheap GDDR3 rather than uber-expensive GDDR5. The difference in price between the 5850 and 4850 is completely accounted for by increased wafer cost, lower yields, a relatively bigger core, more memory, and far more expensive memory.

But R&D for CPUs and GPUs is in the hundreds of millions a year. Operating expenses are huge, and the number you'd want in order to tell how much profit they make per card, rather than after everything is taken into account, is revenue; operating income, without knowing the operating expenses, tells you next to nothing.

The operating expenses could be £5 million, meaning the income behind that figure is incredibly poor, or they could be £500 million, meaning their income is outstanding.
 
I don't get what all the excitement is about. Charlie has indeed been right on a number of points but he has also been proved wrong on so many too!!!

Fermi isn't really a flop as such, I plan to get a couple myself.

The ATI refresh isn't really that big a deal either, as it's just going to be the same old architecture team red will spew out, with some very minor modifications to the pipeline, hopefully bringing much-needed tessellation and AA performance.

As for prices, not much will change and things will be comparable. ATI will be cheaper as they don't include much innovation in their engineering, opting for high clock rates. And you will still, as always, have to pay more for the added engineering effort in an Nvidia card.

Value consumers will still go for ATI and enthusiasts will still go for Nvidia.

And at the end of the day we will all be happy with our buying choices and will still be playing some cool games!!

Who genuinely cares about any of this crap? The sun is shining and there is cold beer available :D
 