Nvidia GT 220 Reviews

The GT220 isn't even a real gaming GPU. Like the reviews say, it's a desktop GPU. Even then, the 53xx will probably kick its ass, because it's not designed for gaming.
 
That's the big problem though, isn't it? High-end Intel GMA chips and low-end ATI chips compete in this not-really-for-gaming segment: you can play Warcraft and The Sims on them, and they handle the sparkly Aero stuff in Windows 7. This chip is aimed at OEMs, but as an OEM, why would I bother?
 
I'm just wondering what Apple will do... their current laptop offerings run nVidia (9400- and 9600-based)... they must be facing stiff laptop competition too.
 
Yes, I see that, but what do they compete with? The 9500GT?

They are meant to be used in 90% of all PCs made and sold: internet browsing boxes. Jesus, why can't people understand this? Nvidia are allowed to make crap graphics cards for people who don't need them for gaming. It's a money maker.
 
Exactly. Partly it's the 'GeForce' brand carried down through their line. Companies aren't going to change; there's too much cost in redesigning their products. But when they next change their model, they'll find nVidia will increase the 9400 volume cost if the company goes elsewhere... though by then ATI could make a big play for it at a company-strategy level, stabbing nVidia right in the jugular. Fermi must be a success by that point; perhaps nVidia will simultaneously release a low-end Fermi as an OEM part too...
 
But this is coming right on top of the ill will NVIDIA generated with the 8600/9600M chipsets. It might be technically good enough, but if you have to instigate a $300m recall on your products, you might be wary about using NV as a supplier again, especially after they so spectacularly denied all knowledge. If I were going to make a hundred thousand laptops, I'd ask Intel or AMD to sort out the graphics.
 
I understand that, but:

When compared to the one-year-old Radeon HD 4670, which sits at an even lower price point, the HD 4670 wins on performance, price, perf/Watt and perf/dollar. Of these criteria, the most important for a low-end card is price. Just price, not price/performance: 3D performance doesn't matter for a 2D/Aero desktop. Right now GT 220 cards are going for $70-$80, which is clearly too high. Thanks to the 40 nm process, NVIDIA can make these GPUs really, really cheaply. This, in my opinion, is the whole point of the product: cheaper GPUs for NVIDIA, better margins. To be able to compete in the retail market, the price of these cards has to come down to the $50 region.

I agree the OEM market may be different if the margins are high at the current retail price.
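
For what it's worth, the perf/dollar the review talks about is just average frame rate divided by street price. A minimal sketch of the calculation, with made-up frame rates (only the prices echo the thread's ballpark figures):

```python
# Toy perf/dollar comparison. The avg_fps values are hypothetical,
# purely to show the shape of the calculation; they are not taken
# from the review.
cards = {
    "GT 220":  {"avg_fps": 25.0, "price_usd": 75.0},
    "HD 4670": {"avg_fps": 33.0, "price_usd": 60.0},
}

for name, card in cards.items():
    print(f"{name}: {card['avg_fps'] / card['price_usd']:.3f} fps per dollar")

# With these placeholder numbers the HD 4670 wins on price, performance
# and perf/dollar all at once, which is the review's point.
```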
 
Looks OK for its price point; obviously the 4670 is a better buy, but some people would go Nvidia over ATi just because.

Isn't the 9600GT faster and cheaper as well, though?

It's just a bad card start to finish, and the price is nuts. The only reason to launch a new, worse-performing card at a price above what the old one sells for is that yields still aren't great on 40 nm, so they can't price it at the $60 or below it needs to be.

The problem is that ATi's next versions are as little as two months away (three at most, it seems, with availability probably in January). So the added features that only just bring it level with the far faster 4670 will leave it trailing on features again almost instantly.

The painful part is that the 4670 sells well, makes a profit, and is the better card, as well as having been out for ages. The GT220 costs more, is slower, and will probably only make a profit if sold at $79; if it's cut to compete with the 4670 on price, it might not make any money at all.

They might have been delivering the high end roughly on time, and ahead of AMD, for the past few generations, but their mid/low end has been embarrassingly slow to follow. Stupidly so, and that's one of the biggest things that has hurt Nvidia and will continue to hurt them over the next six months.
 
They are meant to be used in 90% of all PCs made and sold: internet browsing boxes. Jesus, why can't people understand this? Nvidia are allowed to make crap graphics cards for people who don't need them for gaming. It's a money maker.

It doesn't matter that it's crap; that's not the issue. The problem is that it's crap COMPARED to other cards that cost the same.

People want value for money. It doesn't matter if they won't use the power: show anyone two of anything, tell them they're the same price but one is faster, lighter, tastes nicer, looks better, etc., etc., and they'll choose the better product.

If the card were noticeably cheaper than the 4670 it would start to offer value again, but it's MORE expensive than a card that's faster than it with basically identical features. Who purposely pays more money for a worse product? Very, very few people. And as someone else said, think of all the OEMs that have had to eat crow from customers whose laptops failed purely because of failing Nvidia parts, while Nvidia refused to accept responsibility. These companies are STILL paying to repair breaking Nvidia products, with Sony now extending the warranty to four years on affected laptops and eating millions of pounds in replacements and repair work. You want to convince these people to keep buying Nvidia cards that cost more and are slower than the competition?

That is a hard sell...
 
To be fair, the 4670 is a bloody good card... I'm not surprised it's still king of the hill...

It really doesn't matter that the 4670 is faster; the problem is that it's faster AND COSTS LESS, which makes the Nvidia card utterly devoid of worth.

Even worse, the 4670 is actually quite a good card for lower-resolution gaming, and it can even cope with older games at higher resolutions, whereas the new Nvidia cards really can't cope with much of anything (though they're surprisingly good in Batman: AA, bizarrely).

This was the issue I had with the GT240: in reality it will lose to a 4850, which is itself being replaced by a card that can make a higher profit at the same £80-90 price. Nvidia need their worse-performing GT240 to be cheaper than the 4850, which will get even cheaper once the 5750 launches. In a week the 4850 will be going for what, £65-75, to clear stocks? That means for the GT240 to sell (I'm not sure when it's slated for release and availability), it would really need to be cheaper than that again. Their whole pricing structure is messed up by overly large cores that make little money.


Seriously, has it not been clear for two years that the only way forward, not least because TSMC screw up constantly, is a small-die, efficient-use-of-transistors strategy? Yet Nvidia stick two fingers up at the idea and end up with parts three quarters late, so big they can't compete on cost with last-generation AMD parts that have been out for a year.

If Nvidia don't go for a smaller die part, they are screwed at every price point.

The problem is, look at the 285GTX vs the 4890: a die with almost half the transistors and no 1.5GHz shaders is matching and beating it in several games. The 285GTX isn't being bought, and the 260GTX is soundly beaten by the 4890, a much cheaper part to make, so it's sold at a loss to maintain sales. It's nuts, but the same die design, only smaller, is what we have here: far, far too big and expensive.

By putting those shaders in clusters of five, AMD cut out a buttload of control logic; that's the biggest reason for the difference in size. Nvidia will have to go the same route, which incidentally will work out great for us, because all game makers will have to focus on designing for that style of core. :)
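
To put the clusters-of-five point in toy-model form (the area constants below are invented for illustration and bear no relation to real transistor counts): if one block of scheduler/control logic drives a whole five-wide cluster instead of a single shader, the control overhead is amortised across the cluster, so the same ALU count fits in a smaller die.

```python
# Toy die-area model: scalar shaders (one control block per ALU) vs
# five-wide clusters (one control block per five ALUs). Both constants
# are made up purely to show the shape of the saving.
ALU_AREA = 1.0       # hypothetical area of one shader ALU datapath
CONTROL_AREA = 0.8   # hypothetical area of one scheduler/control block

def scalar_die_area(n_alus: int) -> float:
    # every ALU carries its own scheduling/control logic
    return n_alus * (ALU_AREA + CONTROL_AREA)

def clustered_die_area(n_alus: int, width: int = 5) -> float:
    # one control block per cluster; the compiler, not the hardware,
    # has to find `width` independent ops to pack into each bundle
    return n_alus * ALU_AREA + (n_alus / width) * CONTROL_AREA

print(scalar_die_area(800))     # 1440.0 "area units"
print(clustered_die_area(800))  #  928.0 for the same 800 ALUs
```

The flip side, of course, is that the clustered design only hits peak throughput when the compiler can actually fill all five slots.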
 
But this is coming right on top of the ill will NVIDIA generated with the 8600/9600M chipsets. It might be technically good enough, but if you have to instigate a $300m recall on your products, you might be wary about using NV as a supplier again, especially after they so spectacularly denied all knowledge. If I were going to make a hundred thousand laptops, I'd ask Intel or AMD to sort out the graphics.

True (I have one of the affected series of Apple MacBook Pros). On the other hand, it's entirely possible for a CEO to step in and rectify a situation because of its serious nature. I've seen companies with the odd issue remain suppliers because of the speed of their response... no, wait, companies had to sue nVidia to get them to admit it, making it a big PR nightmare... :D
 
If Nvidia don't go for a smaller die part, they are screwed at every price point.

Fermi is estimated to be ~50% larger in area on an equal fabrication process. As a rough rule of thumb: die area ≈ processing throughput × floating-point bit width.

An old games-only GPU requires just 16-bit floating point and no additional logic for IEEE compliance, overflow handling, etc. A GPGPU, by contrast, starts at 32-bit IEEE single-precision FP, and if nVidia want to move into supercomputing they will need at least 64-bit (double) or 128-bit (quad) precision.

Although it's possible to use error-compensation tricks to stretch 32-bit, in the end you still lose information at high precision, or you cut the processing speed roughly in half. Remember that all this talk of FLOPS for graphics cards is usually quoted at 32-bit single precision rather than, say, 64/128-bit.
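
As a concrete sketch of that trade-off, here is compensated (Kahan) summation in NumPy: one classic error-compensation trick that recovers most of the precision a naive 32-bit sum loses, at the cost of extra operations per element. This is only an illustration of the principle, not a claim about how any GPU implements it.

```python
import numpy as np

def kahan_sum_f32(values):
    # Compensated (Kahan) summation in float32: a second variable
    # carries the rounding error left over from each add, trading
    # extra arithmetic for precision.
    total = np.float32(0.0)
    comp = np.float32(0.0)
    for v in values:
        y = np.float32(v - comp)
        t = np.float32(total + y)
        comp = np.float32((t - total) - y)
        total = t
    return total

vals = [np.float32(0.1)] * 1_000_000

naive = np.float32(0.0)
for v in vals:
    naive = np.float32(naive + v)   # rounding error accumulates

print(naive)                         # drifts noticeably away from 100000
print(kahan_sum_f32(vals))           # very close to the true sum
print(sum(float(v) for v in vals))   # 64-bit reference (~100000.0015)
```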

So without 64/128-bit they lose the top-end supercomputer market, whilst the cost of fabricating Fermi makes it a high-cost commodity sale (even in volume), so they suffer a much reduced market or slow acceptance (unless they release a mid-range Fermi at the same time, but the fab probably won't be viable initially for low-cost parts).
 
[image: 10aub.jpg]


Desperate much... :D

'WTF edition' is more apt... :p
 
Just reading some of the EVGA whitewash rubbish:

"Premium Windows 7 Experience
Graphics processing units (GPUs) are an essential element of today’s PCs, enabling more visual and more interactive experiences. As a leader in visual computing and the inventor of the GeForce GPU, NVIDIA worked closely with Microsoft on the development of Windows 7 to ensure that its GPUs take full advantage of the great new features and functionality. If you use your PC to enhance photos, watch or edit videos, play games, or if you simply desire a fast and efficient graphical interface, NVIDIA GPUs will surely delight your senses and offer a premium experience for Windows 7."

No it doesn't. Where's DX11?
 