New Nvidia card: Codename "GF100"

Looks impressive, and regardless of how it turns out against the 5xxx series it should put downward pressure on prices, which should please everyone.
 
The architecture of graphics silicon is usually designed around trends in games and how it will perform in current and future titles, for instance balancing the ratio of memory bandwidth to processing power.
I can't see any need for double-precision support in games, and supporting it must take up a huge amount of silicon area. I have a feeling they haven't targeted this chip's architecture at computer games...
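
To put a rough number on the bandwidth-to-compute balance mentioned above, here's a trivial back-of-the-envelope calculation (my own round, illustrative figures, roughly GTX 285-class, not exact specs):

```
// Break-even arithmetic intensity: how many FLOPs the shaders must do per
// byte fetched before a chip stops being memory-bandwidth-bound.
// The figures below are round, illustrative numbers, not exact specs.
#include <cstdio>

int main()
{
    const double peak_gflops   = 1000.0; // ~1 TFLOP/s single-precision peak
    const double bandwidth_gbs = 160.0;  // ~160 GB/s memory bandwidth

    const double flops_per_byte = peak_gflops / bandwidth_gbs;
    printf("break-even arithmetic intensity: %.1f FLOPs per byte\n",
           flops_per_byte); // ~6 FLOPs per byte for these figures
    return 0;
}
```

Anything doing less work than that per byte it touches is limited by memory bandwidth rather than shader power, which is why that ratio matters so much for game workloads.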

EDIT: reading that AnandTech preview, it says exactly the same!
 
The architecture of graphics silicon is usually designed around trends in games and how it will perform in current and future titles, for instance balancing the ratio of memory bandwidth to processing power.
I can't see any need for double-precision support in games, and supporting it must take up a huge amount of silicon area. I have a feeling they haven't targeted this chip's architecture at computer games...

Just relying on gamers is not enough any more.
 
Hmmm...

Good launch of the card. Great to see. ECC on a graphics card? I nearly wet myself! LOL :D

This card is great if you want to calculate weather cycles and fluid dynamics, but it will still run Crysis under 50fps.

I've changed my price estimate for this card to £600 when it first comes out.

Imagine how quick it will be in Badaboom?

Yeah boy
 
Hmmm...

Good launch of the card. Great to see. ECC on a graphics card? I nearly wet myself! LOL :D

This card is great if you want to calculate weather cycles and fluid dynamics, but it will still run Crysis under 50fps.

I've changed my price estimate for this card to £600 when it first comes out.

Imagine how quick it will be in Badaboom?

Yeah boy

Unfortunately, all I want is a gaming card, so all this extra bs that adds to the price is useless to me. I think the only thing that will save this card from a gamer's point of view is if it's much faster than a 5870; if it isn't and it costs the earth to buy, then its core market is going to hate it. Nvidia earned $10 million from the Tesla side of the GTX 200 series, which in the grand scheme of things is nothing next to the £1 billion profit they earned, so I hope for the sake of the company they don't get on the wrong side of their real supporters, who come from the gaming community.
 
which in the grand scheme of things is nothing next to the £1 billion profit they earned, so I hope for the sake of the company they don't get on the wrong side of their real supporters, who come from the gaming community.

Please tell us how you got to the $1 billion figure when they earned just $38 million in Q2:

"The company reported revenue of 777 million greenbacks in Q2 2009, 16.9% more than in a previous quarter and achieved a profit of 38 million USD"

An awful return, IMO, on such revenue.
 
Please tell us how you got to the $1 billion figure when they earned just $38 million in Q2:

"The company reported revenue of 777 million greenbacks in Q2 2009, 16.9% more than in a previous quarter and achieved a profit of 38 million USD"

An awful return, IMO, on such revenue.

http://www.anandtech.com/video/showdoc.aspx?i=3651&p=8

I think I may have got it wrong then, but this says that at its peak they grossed $1 billion in a single quarter; I'm not sure if that's profit or not, or when it was. Below is what he says.

Last quarter the Tesla business unit made $10M. That's not a whole lot of money for a company that, at its peak, grossed $1B in a single quarter. NVIDIA believes that Fermi is when that will all change. To borrow a horrendously overused phrase, Fermi is the inflection point for NVIDIA's Tesla sales.
 
http://www.anandtech.com/video/showdoc.aspx?i=3651&p=8

I think I may have got it wrong then, but this says that at its peak they grossed $1 billion in a single quarter; I'm not sure if that's profit or not, or when it was. Below is what he says.

Last quarter the Tesla business unit made $10M. That's not a whole lot of money for a company that, at its peak, grossed $1B in a single quarter. NVIDIA believes that Fermi is when that will all change. To borrow a horrendously overused phrase, Fermi is the inflection point for NVIDIA's Tesla sales.

They probably grossed $1 billion in a quarter, but they certainly didn't make that much. If they had, I'd probably have bought a few shares by now ;)

They're struggling for sure, and I can't imagine Tesla taking off to such an extent in the next couple of years that it'll keep them out of trouble. Let's remember that they're effectively turning their back on the gfx market in search of something bigger...

But of course circumstance may be forcing their hand, given AMD's and Intel's further encroachment into the gfx arena (coupled with Nvidia's lack of an x86 license).
 
I think that card is just a GTX 285 with a few add-ons and more memory. The price on that sort of card is always a lot more than what the gaming version will retail for. The GT300 will be much more powerful than that and cost much less.
 
http://www.youtube.com/watch?v=UlZ_IoY4XTg
Tesla S1070
RRP £1,125.85
This thing has many of the innovations that Fermi has: NVIDIA Parallel, double precision, a modified kernel.
Doesn't look good for Fermi's price :(

There's always a massive markup on HPC-grade parts, even more so than on the Quadro series. You can expect the Fermi HPC parts, with 3GB and 6GB of memory, to still go for well over £1,000. The graphics/gaming retail part will still be priced to compete with ATI's cards, though.

I'm a bit less sceptical than most of you guys about the direction Nvidia are heading. I work with a lot of people who do scientific modelling (FE/FV simulations of solid and fluid mechanics, etc.) and who would love to port their codes to GPUs. So far it's just been that bit too much effort for the potential rewards in most cases, but with this card running C++ natively, having a shared cache and error checking, and being able to run double precision at "sensible" speeds... I really think the interest could snowball rapidly. This card really does implement all the features that people have been asking for.
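
To give an idea of what porting one of these codes looks like, here's a minimal sketch of the kind of double-precision kernel they boil down to (illustrative only; the kernel name, grid size and constants are made up, and it assumes CUDA on a Fermi-class card, e.g. built with nvcc -arch=sm_20):

```
// One explicit finite-difference step of the 1D heat equation, entirely in
// double precision on the GPU.
#include <cstdio>
#include <utility>
#include <vector>
#include <cuda_runtime.h>

__global__ void heat_step(const double* u, double* u_new, int n,
                          double r /* = alpha * dt / dx^2 */)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    if (i == 0 || i == n - 1) {
        u_new[i] = u[i];  // fixed (Dirichlet) boundaries
    } else {
        // Second-order central difference, all in double precision.
        u_new[i] = u[i] + r * (u[i - 1] - 2.0 * u[i] + u[i + 1]);
    }
}

int main()
{
    const int n = 1 << 20;            // 1M grid points
    std::vector<double> h_u(n, 0.0);
    h_u[n / 2] = 1.0;                 // initial heat spike in the middle

    double *d_u = nullptr, *d_u_new = nullptr;
    cudaMalloc(&d_u, n * sizeof(double));
    cudaMalloc(&d_u_new, n * sizeof(double));
    cudaMemcpy(d_u, h_u.data(), n * sizeof(double), cudaMemcpyHostToDevice);

    const dim3 block(256);
    const dim3 grid((n + block.x - 1) / block.x);
    const double r = 0.4;             // alpha * dt / dx^2, kept below 0.5 for stability
    for (int step = 0; step < 1000; ++step) {
        heat_step<<<grid, block>>>(d_u, d_u_new, n, r);
        std::swap(d_u, d_u_new);      // ping-pong the buffers between steps
    }

    cudaMemcpy(h_u.data(), d_u, n * sizeof(double), cudaMemcpyDeviceToHost);
    printf("centre value after 1000 steps: %f\n", h_u[n / 2]);

    cudaFree(d_u);
    cudaFree(d_u_new);
    return 0;
}
```

The GPU-specific part is tiny; the real effort in FE/FV codes is restructuring the data so thousands of threads can work on it at once, which is exactly where a shared cache and sensible double-precision speed make the porting decision much easier.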

Of course, if the thing can't compete with the 5800 series, Nvidia will lose a lot of their core business. But if it can at least compete, I think they could well be onto a winning solution, losing a small amount of their graphics share in order to dominate a new and expanding market. Only time will tell, but I can guarantee at least a few dozen will be sold at my place of work...
 
anandtech said:
NVIDIA claims that by pre-announcing Fermi's performance levels it would seriously hurt its existing business. It's up to you whether or not you want to believe that.

Um, just a thought, but ATI's current performance levels are starting to hurt Nvidia, and will do at least until we see the GTX 3xx cards. If Fermi really is that good, then releasing figures that proved it might slow down the exodus to the red team, as people might be prepared to wait.
 
Hmmm...

Good launch of the card. Great to see. ECC on a graphics card? I nearly wet myself! LOL :D

This card is great if you want to calculate weather cycles and fluid dynamics, but it will still run Crysis under 50fps.

I've changed my price estimate for this card to £600 when it first comes out.

Imagine how quick it will be in Badaboom?

Yeah boy

Let's face it: based on all your previous posts, all you seem to do is slate anything related to Nvidia. Your signature says it all.

Also, I like the fact that you keep coming up with random 'facts' about price and performance that will probably be as far from the truth as I am from the moon.
 
Let's face it: based on all your previous posts, all you seem to do is slate anything related to Nvidia. Your signature says it all.

Also, I like the fact that you keep coming up with random 'facts' about price and performance that will probably be as far from the truth as I am from the moon.

Most of my facts come from the SemiAccurate forums, mostly from Charlie himself. And let's be honest, he knows a hell of a lot more than you. Go and visit sometime and educate yourself. So please, in the nicest way, be quiet. It's a $600 iPod encoder. :-)
I don't like the way Nvidia is running their business anymore, and I think many people would agree.
 
Most of my facts come from the SemiAccurate forums, mostly from Charlie himself. And let's be honest, he knows a hell of a lot more than you. Go and visit sometime and educate yourself. So please, in the nicest way, be quiet. It's a $600 iPod encoder. :-)
I don't like the way Nvidia is running their business anymore, and I think many people would agree.

Many would, but we wouldn't agree with someone who finishes their post with "Yeah boy".

If you really want to slate Nvidia, go to the TWIMTBP Batman restricted-graphics thread. A perfectly viable hunting ground for green prey.

The application of these new technologies to gaming is dubious, and when a card manufacturer struggles to push game performance significantly even with an entirely new architecture, I begin to worry.
 
The funniest part of it all is that with the shift away from gaming-specific features, the design for GPGPU, and it being more of a parallel series of CPUs than a GPU shader, it is for all intents and purposes what Larrabee is intended to be. Except that, hilariously, Nvidia have been saying Larrabee is completely the wrong way to go, that it will suck at gaming and won't be optimised enough for gaming.

Frankly, over the next couple of years we'll see Nvidia leave the gaming market and Intel enter it. Intel, with the ability to make their own silicon far more cheaply on working processes with twice the yield, plus the fact that they have a PLATFORM they can sell to anyone who wants GPGPU, will blow Nvidia out of the water in that arena as well. They can also afford to make a 'GPGPU' with extra die space dedicated to graphics acceleration, because again it's just going to be FAR cheaper for them to produce; they can simply make a bigger chip for a lot less than Nvidia will be able to.

I can't see any reason in the future for Nvidia to aggressively go after the graphics market. The 5870 won't make a huge profit or sell in massive numbers, but the tech and design that trickle down to half-size and quarter-size parts, the 57xx and 55xx, will make ATI a lot of money. Nvidia has no equivalent reason to offer mid/low-end versions of a massive and ludicrously expensive GPGPU part. People who want it for number crunching won't want a half- or quarter-power version that's slower than other things doing the same job, and mid/low-end gamers won't want expensive, slow parts either. Integrated is simply not a market Nvidia can compete in when the low-end integrated stuff sits on the CPU die.

As people have said, $777 million turnover for Nvidia and $38 million profit is awful; that's not at a level that can even fund future R&D. They haven't been making any real profit on their last generation, mostly because ATI forced them to sell most of it at cost rather than at a profit. If everything had sold for 30-50% more, as intended and as a core of that size required, obviously the profit would have been higher.

They simply cannot compete financially with ATI with a big core versus a core half its size; doing so last generation completely wiped out the company's profits. Doing it again, with a more expensive part and even lower yields, will only make things worse. Considering their mid/low end from last generation has only just launched, and ATI's new-generation mid/low end is not far away now, they are struggling severely.


The only way for them to have stayed viable in the graphics market would have been to make two high-end cards: one purely for GPGPU, with no acceleration of gaming features, and one very slim core in the ATI style that accelerated every last feature, was efficient as hell and was cheap to produce.

EDIT: I'm also not knocking Nvidia. Without the CPU/chipset/motherboard platform that AMD and Intel have, they can't compete, and it's really that simple. They have done well to almost create the GPGPU market themselves, and they will now move into it and try to be the leader there.

It also shows what a stupendously brilliant move AMD made in buying ATI; had they not, ATI, Nvidia and AMD could each separately have been in trouble from one another and from Intel.
 