
Nvidia Is Happy With The Performance Of Fermi

The GT240, GT220 and G210 are from that line of graphics cards (they're all 40nm and support DX10.1), so I'd say Rroff is likely correct. The GT212, which should've been a high-end GT200 derivative on the 40nm process, was allegedly canned, as reported by Charlie Demerjian (take that for what you will, but it's certainly not an invention of Rroff's).


I never read anything about a DX10.1 card for November, and Nvidia said nothing as far as I know.

I did hear a rumour of shrunk (45nm?) 200s for Q1-Q2 of 2010, but that's a nine-month difference.
 
I'm sure Nvidia knew when Win7 would launch with its new DX11, because ATI certainly did and had cards ready before the OS even launched.
The fact is Nvidia missed the boat and ATI didn't.
They are late and are paying the price in lost sales.
 
Oh yay, how much are these things going to go for? Oh well, I don't care too much; I know they'll be a lot more expensive than ATI's. I'm also hoping they push ATI's prices down a bit (that doesn't make me a fanboi).

If these cards go for around the same price as ATI's and are worth it, my next card will be an Nvidia, although I don't like them either.
 
This is recent news from Taipei. It seems another rework is on the cards.


Edit: Fermi first, then a GeForce (GF104/105?) range a month or two after. It could be closer to June; I hope it's not a hot-running card.
 
As much as I hate to link to fud...

http://www.fudzilla.com/index.php?option=com_content&task=view&id=11237&Itemid=1

Originally slated for Q2 2009, but delayed 4-6 months by the 40nm process fiasco and then ditched as the AMD 5800 series made them irrelevant.

Sorry, but that's really a ridiculous statement. So they spent nine months working on a shrink, and then AMD, magically out of the blue (in the same month everyone had known it was coming for over six months), brought out a largely more expensive and probably larger core (compared to a GT200b on 40nm), and so Nvidia threw millions in R&D, time and effort down the sink because it was obsolete? Nonsense.

They couldn't get it working, nothing more or less. It would be a core not far off half the size of a GT200b. Considering they had to stop producing the GT200b because it was basically selling at cost, and considering they've only sold a handful of cards over £100 in the last couple of months (and maybe for another three), do you really think a shrunk GT200b, at roughly half the cost to produce, which they could sell as a finished card at £150 and make a little cash on, is pointless?

A GT200b on 40nm, with the SAME clocks, would use less power, cost Nvidia half as much to make, and could be sold at a price similar to the 5770, which it's a decent amount faster than. But they've supposedly canned it, because the £100 bracket between the 5770 and 5850 isn't worth fighting for? If the core worked, it would be out there right now, most likely as a 340/355 GTS selling between £120-150 and making a profit. Now let's see: would Nvidia choose to make a profit on a card they can sell, and not have to say no to Dell, Apple, HP, etc. when they ask for midrange cards, or would they choose simply not to make it because a card in a completely different price range is faster?
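For what it's worth, the "half the cost to produce" bit isn't pulled from nowhere: die area, and with it the rough cost per die, scales with the square of the feature size, so a 55nm design shrunk to 40nm takes just over half the silicon. A quick back-of-the-envelope sketch (the node sizes are the only real inputs here; ideal scaling and ignoring yield are assumptions):

```python
# Back-of-the-envelope die-shrink maths. Assumes ideal linear scaling of
# features with process node and ignores yield, which was exactly the
# problem on TSMC's early 40nm process.

def shrink_area_ratio(old_nm: float, new_nm: float) -> float:
    """Ideal die-area (and roughly cost-per-die) ratio after a shrink."""
    return (new_nm / old_nm) ** 2

if __name__ == "__main__":
    # GT200b at 55nm vs a hypothetical 40nm version of the same design
    ratio = shrink_area_ratio(55, 40)
    print(f"ideal area ratio: {ratio:.2f}")  # ~0.53, i.e. roughly half the silicon
```

Of course, that ideal ~0.53 only holds if the design actually works at the new node, which is the whole argument here.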

Seriously, pull the other one. If it worked, it would be out there with a vengeance, with a load of review sites being paid to test Batman and smeg all else. It's not out there, hence we know it doesn't work. It's a core fundamentally designed for 65nm, with most likely very, very little thought put into it (quite fairly, and without blame) for combating leakage on a crap 40nm process that wouldn't be available for two years. It only just managed to get to 55nm with some trouble; it's simply a design that doesn't work at 40nm.


The problem is that Nvidia clearly need it not just to plug the gap now, but likely until their next-next gen has a midrange part based on the next-gen Fermi. I.e. they'd most likely have been planning to sell the 40nm GT200b as the midrange under the Fermi high end until 2011.
 
It would seem Nvidia are just fumbling in desperation to get something out the door ASAP. I have a feeling Fermi wasn't even designed for 40nm, never mind Rroff's 40nm GT200s.

Could we be looking at another GeForce 5900-style fiasco?
 

You kinda missed my original point... which was that Nvidia aren't really behind schedule as such, despite all the news sites saying Fermi is delayed... they originally planned the 212 cores to be out by Sept/Nov at the latest... and Fermi was never designed to come out before them... it was pencilled in for the end of Q1 2010 for a long time.

As for why the 212 never saw the light of day... well, it's arguing over semantics really, but it's pretty conclusive that the 5800 series finished it off, so what I said isn't really so ridiculous... no one's going to buy a high-end DX10.1 card now with DX11 on the table.
 
Given the problems with the 40nm process, it's not surprising they ditched the efforts on the 200 refresh and concentrated on the 300 series... it's going to be a pretty painful time for them, but once they crack it, it should pay off. In the long run it's the better design: you get better performance returns as you scale upwards, compared to ATI's design, which is going to start seeing sharply diminishing returns in the long run without a redesign.

EDIT: Having said that, look at what happened with the P4 and Athlon... the P4 is kinda like the ATI cards and the Athlon like Nvidia's design (in a way)... the P4 ran out of headroom, but the stronger Athlon design, for whatever reason, didn't carry them into the next generation very well, whereas Intel came back with the Core 2... so it could go either way.
 
The problem with this theory is that ATI have said the 6-series cards will be their real next gen, so nobody knows if it's going to be based on the same kind of architecture we've seen in the 2, 3, 4 and 5 series. It sounds like the 6 series will be a whole new architecture, so maybe ATI know they're hitting the limit with their current design.
 
There's nothing wrong with the theory... I haven't said ATI won't be expecting it; they seem to be good at looking ahead lately...
 
Apart from the fact Nvidia are late with the GF100, and it's playing catch-up to ATI's old tech.

I don't believe Nvidia ever planned a 40nm shrink for the 200s. It's more likely they felt ATI was bluffing and the 200 cards would see them through until summer, when the GF100 would be ready.
 
It's late to the table relative to the market's movement... it's not delayed, as some places are making out... unless they do slip and can't make a release around March.

You'd be wrong about the 40nm shrink: they were well down the road of making a 40nm DX10.1 refresh for the 200 series. That's finally seen the light of day in the 215/216 cores... but the high-end parts were abandoned around the time ATI came up trumps, for whatever reason.
 
I think your assumption is a little off.

Nvidia said in November the GF100 was almost ready and pre-Christmas production was the plan. They gave the impression it was ready and waiting, but so far Nvidia have given us pictures of a block of wood, and an old fat guy with two blocks of wood.

ATI will have an all-new generation of cards ready within three months of the GF100's release. That will most likely make the card obsolete.

It is very late, it simply has been delayed, and the rumours are it will be even later than expected.

The GF100 needed to be on sale six months ago to be on time, but you seem too blinkered to see that.
 