I was commenting on this:
very, very few people are going to buy a laptop built around a "470" mobile chip with any regard for battery life.
Not sure how hard this is to understand. People were saying "nice range", and I said no, it won't sell, because the "performance" segment of laptops is about 1/1,000,000th the size of the performance segment in desktops. High-power parts fail here. Very few people will buy the 470M at all. They'll hope to sell 10 million mobile GPUs, of which they'd be happy if 50k were 470Ms. 99% of people buying laptops do not buy high-end, high-power stuff, and those that do don't care about power. What's your point? I haven't said they do. The point, for the umpteenth time, is that sales are ruled by volume: the low-end parts make ALL the profit, and their low-end parts are pretty abysmal.
As for direct competition, Harlequin: a 5570 is essentially 1/4 of a 5870, while a GT 420 is pretty much 1/10th of a GTX 480. They aren't even close to direct competition; the 5570 should be pushing three times as fast as the GT 420, and it uses less power. That's a truly awful situation to be in. The 80-shader part won't be significantly slower either. Basically a GT 420 will be horrible for gaming, as 80-shader parts are; a 400-shader part will probably be faster than even the 96-shader GT 420, and the 400-shader 5570 uses less juice than the 48-shader Nvidia part.
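Just to spell out the arithmetic behind those fractions, here's a minimal sketch using the shader counts as this post assumes them (1600 for the 5870, 400 for the 5570, 480 for the GTX 480, 48 for the GT 420), and ignoring that real performance doesn't scale perfectly with shader count:

```python
# Shader counts as assumed in this post, not verified spec-sheet figures.
parts = {
    "HD 5870 (flagship)": 1600,
    "HD 5570": 400,
    "GTX 480 (flagship)": 480,
    "GT 420": 48,
}

def fraction_of_flagship(part: str, flagship: str) -> float:
    """What fraction of the flagship's shader count a part carries."""
    return parts[part] / parts[flagship]

print(fraction_of_flagship("HD 5570", "HD 5870 (flagship)"))  # 0.25 -> ~1/4
print(fraction_of_flagship("GT 420", "GTX 480 (flagship)"))   # 0.10 -> ~1/10
```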
I'm also expecting one: I'm not sure if the minimum 6xxx series card will have 160 shaders, but AMD really needs a filler card between the 80- and 400-shader mark. Performance-wise, a GT 420 will probably be on par with a 160-shader AMD part; they just don't make one.
With Sandy Bridge now equalling the 80-shader part, I'm not sure if it's better to add an extra SKU at the 160- or even 240-shader mark (which should destroy a GT 420), or to make the baseline performance 160 shaders. It's a difficult call because performance isn't even remotely required at the low end; the idea is to make something cheap enough to get a display working and that's it, so moving from 80 to 160 shaders cuts into profit. But performance does matter when comparing two low-end cards: people like value for money and won't turn down twice the performance at the same cost. So when the choice is between the lowest-end AMD part being twice as fast as Sandy Bridge or merely matched by it, it looks FAR better if it's twice as fast.
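For what it's worth, here's a rough, hypothetical sketch of where each candidate baseline would sit against the parts mentioned above, assuming performance scales roughly with shader count and taking this post's "Sandy Bridge roughly equals an 80-shader part" as given:

```python
# Crude positioning sketch: assumes performance ~ shader count, and takes
# "Sandy Bridge IGP ~= 80-shader part" from the post rather than from benchmarks.
rivals = {"Sandy Bridge IGP": 80, "GT 420": 96}

for baseline in (80, 160, 240):  # candidate shader counts for AMD's lowest SKU
    standing = ", ".join(
        f"~{baseline / count:.1f}x the {name}" for name, count in rivals.items()
    )
    print(f"{baseline}-shader baseline: {standing}")
```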
Of course the 6 series could change the number of shaders in a cluster, in which case the minimum might be different from 80 anyway. I think it's time they bumped the minimum performance up, for the good of everyone.
Anyway, it's an even bigger failure than I thought earlier, because the little tidbit they left out is that they're retaining the GT 310/315 parts. I wondered why, then it hit me.
They can't get a 5-10W part out because they changed the granularity of the core: instead of 16-shader clusters, it's basically a 3x shader cluster with altered core logic for each cluster, mostly adjusting the ROP/TMU ratio per cluster.
The reason the lowest mobile GPU has 48 shaders is that GF104 doesn't scale down below 48 shaders, which is either a laughable oversight or they actually planned on having to rely on last gen's low end, which again is a laughably bad situation to be in.
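A small sketch of why that granularity bites, assuming (as this post does) that the 48-shader cluster really is the smallest unit the design can be cut down to:

```python
# If every SKU has to be built from whole 48-shader clusters (the post's premise),
# the possible configurations are just multiples of 48 -- there's nothing small
# enough to build a 5-10W, display-only part from.
CLUSTER_SIZE = 48  # assumed minimum granularity

possible_shader_counts = [CLUSTER_SIZE * clusters for clusters in range(1, 9)]
print(possible_shader_counts)  # [48, 96, 144, 192, 240, 288, 336, 384]
```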
So I was wrong: the GT 310/315 will outsell the entire 4xx mobile line by a factor of 10,000 to 1, because they are the low-end parts. The problem is, Dell and co love new cards; they like selling "new" computers with brand new, latest-gen parts. Relying on last gen's low end for the bulk of sales is just bad business, even more so because the GT 310/315 have been losing market share hand over fist as the likes of Dell move towards AMD, and more so again as Apple moves away from Nvidia in the mobile sector, which is seemingly happening in the next year.
The naming gets even odder, because they'll now have a line-up of GT 310/315, then GT 415 up to GT 470. Normally you'd expect Nvidia to rebrand those bottom two as GT 410/415 and start the new generation from GT 420 and up. Though I guess because the GT 310 was already rebranded from the GT 210, and the GT 3xx range is already a joke with DX10.1 and DX10 parts mixed together, a GT 4xx range with varying DX support wouldn't be justifiable.
Seriously though, designing a new generation and forgetting to include the ability to make a truly low-end part is very, very odd.