
GT200 GTX280/260 "official specs"

Well, I'm interested in the ATI X2 48xx, which should be a significant boost over the 8800GT considering it has 1GB of RAM and the extra stream processors, so 1920x1200 with FSAA will be a breeze for me.
 
460 quid? Are you serious?

prices are going up and performance is actually going down

No, your bang for buck is going down, but that's always the way with the latest high end.

Just because it's going to be 4 times the cost of an 8800GT doesn't mean you will get 4 times the performance!

When will people learn :rolleyes:
 
Anyway, wait 2-3 months for the die shrink and the initial price drop and I reckon they will be £300-£350.

Still not great on the bang-for-buck chart, but if you need the power and have the money, why not?
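To put rough numbers on the bang-for-buck point, here's a quick sketch; the performance index and the GTX280 figure are made-up illustrative values, not benchmarks:

```python
# Hypothetical bang-for-buck sums; prices in £, perf is an arbitrary
# performance index (illustrative only, not real benchmark data).
cards = {
    "8800GT": {"price": 170, "perf": 100},
    "GTX280 (rumoured)": {"price": 460, "perf": 160},
}

for name, c in cards.items():
    ratio = c["perf"] / c["price"]
    print(f"{name}: {ratio:.3f} perf points per pound")
```

Even if the GTX280 were 60% faster, at 460 quid it would deliver roughly 0.35 perf points per pound versus the 8800GT's roughly 0.59; nearly three times the price for nowhere near three times the speed.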
 
I for one will NOT be buying anything that costs more than £300, not after buying an 8800GT for £170 and having it perform close to an 8800GTX, which cost more than double at the time. It's just not going to happen.

ATI for me for the first time ever, I think, this time round.

That's all very well, and to an extent I agree, but don't forget the 8800GTs didn't come out until long after the 8800GTX was released. I don't think anyone can argue that launch-day buyers of the 8800GTX were ripped off.
 
The 8800GTX remained at double the price even well after the 8800GT launch, and only when shops realised nobody was buying a GTX anymore did they drop the price (OcUK included). Some still even sell them at the premium price!
 

Yes, they normally do stay at a high price even when a cheaper, more powerful card has been released; go on Google Shopping and check the price of a 7800GTX, you'll be in for a pleasant surprise. The point I was making, though, was that the people who bought that card on launch day at a premium had the advantage of having the most powerful card on the market for a good year before anything else was available. Once the GTs came out, of course they were the card to go for, but on day one that option wasn't available.

TBH, for me, by the time the GTs came out I was already tired of the GTX and was hoping a next-gen GPU would be released. I was happy for all the newcomers, though, as they had the chance to get a brilliant GPU for chump change; it certainly brought the cost of building a decent gaming rig right down, so in that respect it was pretty cool.
 
Personally I'm just going to wait for the next "8800GT" that can run Crysis maxed at 1920x1200 affordably. It's not like there are many, if any, other games that need so much power coming out.
 
It's like buying a brand new car instead of one a few years old with one previous owner; it does not make sense! Why spend so much and end up losing so much when new cards are out every year? Unless, of course, you forecast the new cards coming and their likely prices, then sell your high-end card at minimal loss while sitting on a mid-range card for a few months until the new, cheaper, more powerful ones are out, which I deem the wisest choice if you have to buy high end right away!
 

TBH that's what most of us high-end GPU users do on here. As soon as I found out about the GTX280 and confirmed when it was coming out, I put my 8800GTX up for sale; I now have a 6200 in my rig :o. I got £130 for it; take that away from the £479 I paid for the 8800GTX and that's £349 for one of the best GPUs ever released, and the one that held the crown the longest. Certainly money well spent for any GPU enthusiast.
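The sums in that post work out like this; the purchase and resale prices are the ones quoted, while the ownership length is my own rough assumption, not from the post:

```python
# Cost-of-ownership sums for the 8800GTX example above. Prices are the
# figures quoted in the post; months_owned is a hypothetical estimate.
purchase = 479      # £ paid for the 8800GTX at launch
resale = 130        # £ recovered selling it ahead of the GTX280

net_cost = purchase - resale
months_owned = 18   # assumption: roughly launch to mid-2008
print(f"net cost: £{net_cost} (~£{net_cost / months_owned:.0f}/month)")
```

So under those assumptions the card cost under £20 a month to own, which is the sell-before-the-next-launch strategy in a nutshell.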

I didn't quite get this: "It's like buying a brand new car instead of a few yrs old one with 1 previous owner". Did you mean it was better to wait and buy a secondhand one?
 
Anyway, wait 2-3 months for the die shrink and

Where are you getting that from? I wouldn't presume Nvidia will even manage 55nm; it is plausible that they simply won't without some major changes in design, delaying them even further. They may bin it altogether and go for 45nm (again taking longer). They wouldn't be launching it as a 65nm part if they could help it.

On a similar note, going from 65nm to 55nm would help with the heat problems we are all expecting, but it's not a huge jump. ATi did a huge jump going from the 2900XT to the 3870, I think that was 80nm > 55nm, and that sorted the temperature and power woes. Nvidia are targeting a much smaller heat/consumption saving on a part that could run even hotter than a 2900XT; again, assuming they even manage it.

Martyn
 

Because they were always meant to be, but they have been having problems and so have resorted to 65nm just to get it out to compete with ATI. However, the 55nm GTX280 has already been taped out (allegedly).

Can't be bothered finding the posts on here and the links; do a search and you will see that the GTX280 is coming out on 65nm for maybe just a few months, until they get the 55nm fixed. EDIT: see here: http://www.theinquirer.net/gb/inquirer/news/2008/05/29/nvidia-gt200-sucessor-tapes

Cooler chips and potentially higher clocks will be the gain.

How miffed will you be after buying a hot card on launch day for £460 which overclocks poorly, when they release a "cheaper" 55nm card three months later which runs cooler and overclocks higher? Your card won't be worth much second-hand then and you will wish you had waited.

Of course it was from the Inquirer, but what if it's true? I for one am going to wait a couple of months, or buy the ATI card if its performance is good enough for the money.

Like you said, "They wouldn't be launching it as a 65nm part if they could help it", which sums it up really. They couldn't.
 
65nm -> 55nm is nothing like the 80nm -> 55nm the 2900 XT went through to get the 3870, the power drop won't be nearly as significant, and the die size will only shrink by 6% or so... Frankly I doubt the GT200b will even be worth waiting for.
 
It's like buying a brand new car instead of a few yrs old one with 1 previous owner, it does not make sense!

That statement doesn't make sense. I bought a brand new car because I can afford one; I don't worry about MOTs or whether it has a dodgy history or warranty issues. I bought an 8800GTX at £330 and it was probably the best PC buy I have made, as I've been playing the latest games at high res for (it seems like) years.
 
65nm -> 55nm is nothing like the 80nm -> 55nm the 2900 XT went through to get the 3870, the power drop won't be nearly as significant, and the die size will only shrink by 6% or so... Frankly I doubt the GT200b will even be worth waiting for.

Yeah, but dare you take the risk? If 6% equates to 6% higher clocks you are talking about 36MHz on a 600MHz GPU. And shouldn't the die shrink be about 15% anyway? :confused: In which case the difference might be 65nm cards launched at 600MHz stock and 55nm cards launched at 700MHz stock.
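A back-of-envelope on those shrink numbers, assuming (purely for illustration) that clock headroom scales with the linear feature-size ratio, which real silicon rarely achieves:

```python
# Rough 65nm -> 55nm shrink arithmetic (illustrative assumptions only).
base_clock = 600.0                    # MHz, the 65nm launch clock used above

linear_ratio = 65.0 / 55.0            # ~1.18: features shrink ~15% linearly
area_ratio = (55.0 / 65.0) ** 2       # ideal optical shrink: remaining die area

clock_6pct = base_clock * 1.06            # the 6% figure -> 636 MHz
clock_linear = base_clock * linear_ratio  # optimistic ceiling -> ~709 MHz

print(f"ideal die area: {area_ratio:.0%} of the 65nm die")
print(f"+6% clocks: {clock_6pct:.0f} MHz; linear scaling: {clock_linear:.0f} MHz")
```

So an ideal optical shrink cuts die area by roughly 28%, which is where the hope of a ~700MHz stock part comes from; a mere 6% bump only gets you to ~636MHz.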

It will also mean, if true, that the card you bought on day one was an inferior card made for only 3 months, and hence will always be worth less money.

Throw in the fact that, as with all recent new cards from Nvidia, prices are artificially high for the first few weeks, and IMO it does not make sense buying one at launch week unless money is meaningless.

Okay, if the GTX280 was going to be 2-3 times quicker than an 8800GTX then it probably wouldn't matter, but the point is, it is not going to be that much of a gain.

On your basis, if the GT200b isn't worth waiting for, then the GTX280 (GT200) isn't worth waiting for either.
 
So I think ATI still have a very good chance. It is based on price, after all, not just performance. Why do you think the PS3 has a lousy 7900? It could have had an 8800, and then it would have destroyed the 360 for graphics.

The PS3 has a 7-series based GPU because at first Sony had the bizarre idea that they could get all the games to run on the Cell alone, with no need for a GPU. After realising the Cell was no good for this, they rushed to Nvidia for the graphics, and since NV would need a very long time to design a custom GPU for Sony, they had little choice but to dump NV's latest GPU (at the time) into the console.
 

You're working on the assumption that the shrunk die would equate to a better performing card (i.e. higher frequencies and what have you). Who said it would?

Since it is just an assumption, it is also plausible to assume there is no performance increase and the shrunk die will simply reduce the heat generated, and that is about it. The whole point of wanting away from 65nm is the heat problems; bringing those frequencies up would probably put you back to square one (pointless).

I think I'm right in saying Nvidia also said the die shrink would occur but there would be no change to the naming structure (similar to the G80 > G92 8800 GTS scenario), or even the card's performance; simply a shrink to improve yields and what have you. (I'm aware, btw, that in the G80 > G92 scenario there was a performance boost in there, but 90 > 65nm is a bigger step than 65 > 55nm, and the G80 GTS wasn't exactly famous for huge amounts of heat.)

However, I'm speculating as you are; I doubt even Nvidia know what they're doing, and R700 is going to dictate massively what Nvidia do with their 55nm part (assuming they manage it). My money is on a 55nm version with 256 SPs, going under the name 280 Ultra, with at best a minimal overclock for marketing reasons. I don't see a GX2 any time soon.

Martyn
 

Some good points, but as you have said, die shrink = less heat = higher stock speeds.

The G80 GTS came at 575MHz stock, but you get the G92 GTS at 700MHz stock, so if they do a die shrink they won't just settle for less heat; they'll use it to release cards with higher stock clocks, or if not, people will get higher overclocks out of them.

Granted, 90 > 65nm is a much bigger step, which is why I can run my watercooled G92 GTS at 850/2200 speeds. You could never do that with a G80 GTS.

And if it is just a heat issue and no better clocks will come of it, doesn't that mean the 65nm first-released cards will have a heat problem? Otherwise there's not much point trying to iron out the problems with the 55nm process, except yields, which are reported to be lower, not higher.
 
I think the problem with people assuming die shrink = low, low prices is this: the difference in price between a 55nm 1.3 billion transistor chip and a 65nm 1.3 billion transistor chip, in the face of an extremely complex PCB and lots of high-end GDDR3 memory chips, is probably rather minimal. I say this as opposed to the recent new cheap cards, which have both had much more significant die shrinks and bus cuts. I wouldn't be surprised if nVidia released them as exactly the same product, maybe with an 'energy efficient' label or some such, and charged extra for the privilege.
 
The GTX280 and 260 will stay high in price in my eyes.
The yields they will get will be very low due to the size of the chip, and a 55nm shrink will only help a little.
 