Is this the longest Nvidia have ever gone without a new series of graphics cards?

I'm just curious. AMD/ATI is moving forward with its cards, including top-end parts like the 4890 right down to the superb midrange 4770.

Meanwhile Nvidia is stagnating with its ancient 8/9/250 series cards, which have been around for three years now in one form or another, and the massive GT200-based cards at the top end.

Where is the innovation, where are the new designs, and where are the midrange and low-end parts that used to fill out their portfolio?

I know the GTX260 is now down to £130, but that's still a large, power-hungry card; it's not a midrange part, nor has it been priced as one until the past few weeks.

Where are Nvidia's 40nm parts? Are they due to release a midrange 40nm part, like AMD have, to test the waters?
I've pretty much only bought Nvidia in 10-12 years of building computers, but right now it seems AMD are making all the moves, and frankly, if I were to build a midrange, low-powered PC right now, I would move to a 4770 graphics card.
 
I think graphics cards are hitting a point like CPUs did a few years ago.

Back then, Intel and AMD were locked in a clock speed battle, with the Pentium being pushed as hard as possible towards 4GHz, and the Athlon closing in on 3GHz.

Since then, there has been a massive shift towards more efficient CPUs that do far more work per clock, with more conservative clock speeds and/or die sizes.
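As a rough sketch of that shift: performance scales roughly with instructions-per-clock times frequency, so a lower-clocked but wider core can come out ahead. The IPC figures below are illustrative assumptions, not measured benchmarks:

```python
# Back-of-envelope: performance ~ IPC x clock frequency.
# The IPC values below are made-up illustrative numbers, not benchmarks.
cpus = {
    "Pentium 4 (high clock, low IPC)": {"ipc": 1.0, "clock_ghz": 3.8},
    "Core 2 (lower clock, high IPC)": {"ipc": 2.0, "clock_ghz": 2.4},
}
for name, c in cpus.items():
    relative_perf = c["ipc"] * c["clock_ghz"]  # relative work per second
    print(f"{name}: relative performance ~{relative_perf:.1f}")
```

With those assumed numbers, the chip clocked over a gigahertz slower still comes out about 25% ahead.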

Now look at GT200. It was a huge chip with a large transistor count compared to RV770. It can't have been cheap to produce in large quantities, and even with its high launch-day price, nVidia can't have been making a large profit on it.

Then they got caught with their pants down, quite simply, by the HD4870. It matched the GTX260 for a lot less, and suddenly they had to cut prices to compete; there was no alternative. But I doubt yields were good enough at that point to make those prices profitable, so nVidia just had to take a massive loss.
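To put rough numbers on why the big die hurt, here's a back-of-envelope sketch using the commonly quoted die sizes (~576mm² for the 65nm GT200, ~256mm² for RV770) and the standard dies-per-wafer approximation. The defect density is purely an assumed figure for illustration:

```python
import math

WAFER_DIAMETER_MM = 300   # standard 300 mm wafer
DEFECTS_PER_CM2 = 0.4     # assumed defect density, purely illustrative

def dies_per_wafer(die_area_mm2: float) -> int:
    """Standard approximation: wafer area / die area, minus an edge-loss term."""
    r = WAFER_DIAMETER_MM / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2))

def yield_fraction(die_area_mm2: float) -> float:
    """Simple Poisson yield model: exp(-defect_density * die_area)."""
    return math.exp(-DEFECTS_PER_CM2 * die_area_mm2 / 100)

for name, area in [("GT200 (~576 mm^2)", 576), ("RV770 (~256 mm^2)", 256)]:
    candidates = dies_per_wafer(area)
    good = candidates * yield_fraction(area)
    print(f"{name}: ~{candidates} candidates/wafer, ~{good:.0f} good dies")
```

Under those assumptions the small die produces several times more good chips per wafer, because it both fits more candidates on the wafer and is far less likely to catch a defect. That's the whole cost story in miniature.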

And I'll bet they would rather take their time than let that happen again. Expect something smaller and more efficient rather than simply a case of packing as many transistors into a die as possible. Not that it didn't work: the GTX280 was a hell of a card, and still is. But in terms of a business model and making a profit, nVidia got stung, hard.
 
Yeah, that's very true. I personally no longer care about outright power; I want the best blend of power consumption and performance. That's why I'm so stumped that Nvidia have not released a new range of low- to mid-range cards based on 40/55nm tech to compete with ATI, especially now that Mini-ITX/ATX systems have taken off for small, lower-power HTPC setups.
 
Because with those setups you should be using onboard video like the Nvidia Ion chipset, which is basically a 9300/9400 GPU.
 
Don't forget these older models were once the best cards on the planet, and it's only fair that they use the chips again in mid-range cards. It saves money on R&D. Or would you rather pay more for the same performance but a new design?

The GeForce 256 is also the GeForce 4 MX series with enhancements.
GeForce 6 = GeForce 7
Radeon 9700 = X600
Pentium M = Core i7

I'd like to think they'll reuse the GTX 200 series as the new midrange GTX 300 series, with modifications for DX11 and eventually a re-release at 40nm (I think G300 will be released at 55nm despite what is claimed on the wiki). I also think Nvidia will learn from ATI and use a 256-bit bus with GDDR5 to reduce costs, maybe keeping a 512-bit bus for the top few cards.
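The bandwidth arithmetic shows why that works: bandwidth is bus width (in bytes) times effective data rate, so GDDR5's higher transfer rate lets a 256-bit bus land in the same ballpark as 512-bit GDDR3. The clocks below are approximate launch specs:

```python
# Memory bandwidth = (bus width in bits / 8) * effective transfer rate.
# Clock figures are approximate launch specs for the two cards.
def bandwidth_gb_s(bus_bits: int, effective_mt_s: float) -> float:
    return bus_bits / 8 * effective_mt_s / 1000

# GTX 280: 512-bit GDDR3 at ~1107 MHz (2214 MT/s effective, double data rate)
print(f"GTX 280: {bandwidth_gb_s(512, 2214):.0f} GB/s")
# HD 4870: 256-bit GDDR5 at ~900 MHz (3600 MT/s effective, quad-pumped)
print(f"HD 4870: {bandwidth_gb_s(256, 3600):.0f} GB/s")
```

The HD 4870 gets within striking distance of the GTX 280's bandwidth on half the bus width, and a narrower bus means a simpler PCB and a cheaper memory controller.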

I'd also love to see the venerable G92b reduced in bus width, fitted with GDDR5, die-shrunk, and sold as a budget card or even as onboard graphics. Suck on that, Intel.

A budget card that spits out Crysis at 30+ fps at 1280x1024 - nice.
 

The GeForce 256 was before the GeForce 4 MX.

Also, IIRC the Nvidia engineers couldn't get the GT200 design to shrink to 40nm, so they are probably re-engineering the design to allow that. The G92 is only DirectX 10; soon we will have DirectX 11 cards, so it would be stupid for Nvidia to reuse the old core while its bigger brothers are on DX11.
 
noob!!!! ;) :p

The GeForce 8800GTX wasn't superseded by new tech for more than 2 1/2 years!!!

The GT200 series came out about this time last year, maybe even a little later.

:o:D
 