
GTX 280 Refreshes?

Soldato · Joined 17 Jan 2005 · Posts 8,806 · Location Liverpool
I read somewhere that the GTX 280 and 260 will be getting a refresh soon with more shaders onboard.

Now I'm thinking of getting a 280 to upgrade my 8800GTX, but I was wondering if it's worth waiting if the refresh is going to happen soon!

Any idea if what I heard was true? And if it is, how far away is it?

Cheers,
Andy
 
Supposedly it's to come with the die shrink, but it may only be the 260GTX getting the extra shaders. I don't think there are any left on the GTX 280 to unlock?

However, rumours say it's October for the die shrink and the new GTXs.
 
Hmm... Not sure what to do now. Is it worth holding on for the smaller die size and possible GDDR5, or should I just buy one now!?
 
The 280 will have the same number of shaders; it's just being die-shrunk to 55nm, and is also supposedly getting its memory changed to GDDR5.

If you look at what they said after announcing that the 200b would NOT be shown at Nvision, they went on to claim that the 'next' 280 (though with no clarification on whether that means GT200b or something else) would be available no earlier than late October, and then gave no specifics except to 'expect 55nm and possibly GDDR5.'


Then suddenly they announce a possible 260+.

If the 260+ catches up any more to the 280 (even though it's some texture RAM short), then the 280 vs 260 price point is going to be a very odd situation.

Which makes you think the logical thing to do is drop a 280+ with a 1600MHz minimum for the shaders; it would definitely surpass the 4870 X2, which turned out not to be as impressive as it first sounded. The use of 55nm wouldn't even be necessary.


This points to the possibility of the GT300, with its 55nm, GDDR5 and whatever other goodies, being the introduction of Nvidia's next flagship series.


But putting all of that aside, if this 260+ comes to light, then something must be done about the current 280s soon, or it's going to be a sticky situation.
 
So am I best off waiting till the end of October then, really, and seeing what's coming out? My 8800GTX is still OK at the moment; it's just that I'd like a new card for a few of the games I'm looking forward to at the end of the year.

Or I could sell it and get a 4870 on the cheap to tide me over till the new Nvidia cards come out! Decisions...
 
Yeah, I would expect the prices to be back up at the £300+ mark for the 280s.

I certainly can't see them releasing a new high-end card with a £240 price tag :P
 
Which makes you think the logical thing to do is drop a 280+ with a 1600MHz minimum for the shaders; it would definitely surpass the 4870 X2, which turned out not to be as impressive as it first sounded.
What rubbish.

The 4870X2 decimates the 280 and beats 280 SLI in many cases. If you think a breathed on 280 at 55nm could beat an X2, you're living in a dream world.
 
Feel free to enlighten us on where an X2 "decimates" a 280?

First and foremost, the majority of modern games are easily handled by non-top-end flagship cards, such as the 4870, 4850, 3870 X2, 260, 9800 GTX (or GTX+), 8800 GTS 512, 8800 GT, 8800 Ultra/GTX, 8800 GTS 640, and even some of the newer budget cards like the 9600s.

Any review of such cards will easily point this out.

Second, cards like the 280 and the 4870 X2 are complete overkill for them, unless you're going 2560 res with 16xAA; but then you're part of an extreme enthusiast market, so naturally those are the products for you.
Which means that unless you're benchmarking or running synthetic tests, the times when you will NEED one of these two cards for real-world performance are going to be acute situations, e.g. Crysis or a modern game at the above-mentioned resolution and settings.

Crysis brings GPUs to their knees, and sometimes I don't agree that we should use it to set the bar, so to speak, but time and time again we do, and so do all the reviews.

This time around, the 280 and 4870 x2 don't fare that much better than their predecessors. There are plenty of results that show this right here >

http://images.tweaktown.com/imagebank/saph487x2_g_08.gif

At Very High, both cards continue to struggle. Neither is the clear victor, nor the loser. We can also see some other truths:

GX2 - slightly higher frames; GX2 vs GTX 280 was all the buzz after the 280 launched. But despite the 280 being a few frames slower on the top end, the GX2 had nasty stutter, terrible minimum frame rates, and serious compatibility issues.

4870 - Does well; I've tested Crysis with a 4870, that's almost dead on the type of results I received.

9800 GTX with an OC - Another card doing well; I've also tested Crysis with a 9800, that's almost dead on the type of results I received.

4870 Crossfire - A bit better than the single 4870, but most of us should know by now that Crossfire through the driver doesn't work well in Crysis.

4870 X2 Crossfire - We actually see a reduction in frames. Even though the Crossfire link is on the cards themselves, when two cards are paired together it relies on the driver. It's quite a problematic situation when faced with Crysis not allowing Crossfire to scale well. A loss of performance is definitely not surprising.



You can go off and find a review where they show Vista with everything on "High" at 1920 resolution, and there you will see most of the cards jump into the 30s range. The X2 will have a slight lead, coming in around four to five frames ahead, whether AA is turned on or not. What you will also find is that the 280 has slightly better minimum frame rates. It's nothing to write home about, but in a game like Crysis, minimum frame stability is KING. Ultimately, once again, performance is not considerably one-sided.
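To make the minimum-vs-average point concrete, here's a small illustrative sketch. The frame times are made up for the sake of the example, not taken from any review; it just shows how two cards can post the same average fps while one has a far worse minimum.

```python
# Illustrative only: made-up frame times, not benchmark data.
def fps_stats(frame_times_ms):
    """Return (average fps, worst-case fps) from per-frame render times in ms."""
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    worst_ms = max(frame_times_ms)
    return 1000.0 / avg_ms, 1000.0 / worst_ms

# Card A renders steadily; Card B averages the same but stutters badly.
card_a = [33.3] * 6            # steady frame times, ~30 fps throughout
card_b = [20.0] * 5 + [100.0]  # same ~30 fps average, but one 100 ms spike

print(fps_stats(card_a))  # roughly (30, 30)
print(fps_stats(card_b))  # roughly (30, 10) - same average, far worse minimum
```

Both cards would show "30 fps" in an average-only chart, but Card B's worst frame is three times slower, which is exactly the stutter you feel in play.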


But let's move away from Crysis. People can always complain that it's poorly coded, or that it's not optimised for ATi cards.

Let's try a different game; something recent, demanding, etc.

Age of Conan. Unfortunately, not enough people use this game in their reviews; however, it's worth doing so. In the few that we have seen, and I can vouch for this as I've tested it myself, the game seems to like ATi cards. The single 4870 does really well vs a GTX 280, and the 4870 X2 does almost impressively well, WITH transparency/adaptive AA applied. The 280 holds its own, and the majority of people would be absolutely happy with its performance at 1920x with max settings, bloom, transparency AA and 8xAA or even 16xAA, with a few frames' loss. However, the X2 gains a noticeable twenty-plus fps in some areas. Once you pass sixty, what does it matter? But in a spirit that's quite unlike ATi cards, it at times actually has a higher minimum frame rate than the 280's average frame rate altogether.

Unfortunately, this is the only game (at least the only 'common' game) where the X2 actually shines.


If we look around at all the other reviews, with other games and applications in mind, the majority of them make one thing apparent when it comes to performance: the 280 and the X2 duke it out, with one being the victor here and the other the victor the next time, but the performance is close enough that the average player wouldn't know the difference whatsoever.


Take that into consideration, and let's look at the rest of the reality.

The X2 is two GPUs. Some people say "I don't buy GPUs, I buy GFX cards", and to that I say 'sure, fair enough', but it's a matter of perspective and principle.

If someone says the X2 is the fastest card out there, OK, that's easy to agree with, but it comes nowhere near "decimating" the 280, or even a 260/4870 for that matter. Getting 200 fps in Far Cry is irrelevant. Potential frames are useless as a measurement of 'performance' if you spend 80% of your time in the low 20s, struggling to get a smooth gameplay experience.
So that's not how we're measuring the power of the X2; we're measuring it in realistic conditions, vs a card (the 280) that performs more or less the same, but in some classes slightly slower, and THAT is what we should agree upon to determine which is faster. Because one GPU, two GPUs, three GPUs or four, it still is only slightly faster.

That is perspective. If you have the right perspective, you see what the truth is in the matter.

Now let's move on to principle. Some would argue that for the amount of horsepower the X2 has under its hood, it isn't turning out impressive results. We could sit around and argue on the grounds that its drivers are more immature than the 280's, but we know that the performance will not change dramatically either way. There's nothing inherently flawed about the ATi drivers, nor the Nvidia ones. They'll get better, but not THAT much better.

So how is it that a dual GPU with GDDR5 and all the other bells and whistles doesn't even decimate a single 4870 in a lot of situations? I honestly don't know the answer. Obviously, in some instances the scaling of Crossfire is very poor; but when you think back on all the multi-GPU cards, when has SLI/Crossfire ever scaled really well? When have two GPUs actually equalled more than a 40% gain?
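As a rough sanity check on that 40% figure, here's a tiny illustrative calculation (the fps numbers are hypothetical, not measured results) of how far short of the ideal 100% a second GPU typically falls:

```python
# Illustrative only: hypothetical fps figures, not measured results.
def scaling_gain_pct(single_fps, dual_fps):
    """Percentage gain of a dual-GPU setup over one GPU (100% = perfect scaling)."""
    return (dual_fps / single_fps - 1.0) * 100.0

# A second GPU that "should" double throughput often lands well short:
print(scaling_gain_pct(30.0, 60.0))  # 100.0 -> the ideal, rarely seen
print(scaling_gain_pct(30.0, 42.0))  # 40.0  -> closer to typical scaling
```

In other words, going from 30 fps to 42 fps is only a 40% gain, even though you've doubled the silicon.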

I am not surprised at how dismal the X2's performance is, despite the power it possesses. It's just the unfortunate nature of modern multi-GPU cards. However, my perspective on the matter doesn't change the reality. And to some people, paying for something that should be at least 50% or more better and not getting such results is unacceptable - principle.


Next, let's consider the remaining variables.

The power draw on an X2 is almost a hundred watts more at load.

The heat at idle is almost 1 1/2 times higher than a 280's, and at load it's 1 1/4 times higher.

The cost (although I never argue price when dealing with enthusiast products, someone else always does, so we might as well discuss it anyway) is still $100 more, or 100 quid more, whichever applies to you.

It weighs quite a bit more.

It doesn't have GPU physics support (whether that's relevant in today's market is debatable, as so few programs use physics on GPUs, but it's still a relevant issue to someone who wants a card that is ready to make use of GPU physics without having to wait for an unknown future physics-capable card from ATi. Future-proofing isn't only for CPUs).



So you see, their performance is competitive, whether across the mass of games or in acute situations. The difference is not the performance but the other factors, and these are all relevant factors; maybe not to you, maybe not to me, maybe not to 25% of the market, but when viewing all angles, all variables in the equation must be present and accounted for.


Lastly, moving on to the 'rehashed' 280. The 280's weak point is its shaders, no doubt about it. It's not so much that an increase from 1300MHz to 1600MHz is going to be a big deal in itself, but rather that there's a frequency wall somewhere in the 1500-1600MHz range, much like you experience on your motherboard when overclocking. People have done successful volt mods and reached near 1600MHz, however it's very unstable. Having the card natively stable at 1600MHz would let it overcome that frequency block and give the shaders some room to stretch their legs.

This won't allow the 280 to make a fool out of the X2, but given how close their performance is at this time, it would give the 280 an edge in some situations; in others, nothing would change at all.


The more important thing is that the 280 would be more like it should have been at launch.



Now, you can make up whatever excuses you want, but reviews and testimonies tell a different story than the hype.


What I want others to understand is that neither the 280 nor the X2 is required for the mass of consumers, and if you do get into that range of cards, be prepared to face the reality when you get home and put your X2 in: it won't blow the doors off the 280, and there are other negatives to go with it.

If that's ok with you, then by all means... feel free.

Just don't try to BS people, because you might just run into someone who has an X2 and a 280 in multiple machines sitting right beside him, and he'll tell you how it REALLY is ;)
 
But sometimes that's why people buy: because it's overkill getting 200fps now, in 3 years' time they can play a contemporary game on max and still get 40 fps. Thinking about the future, not having to upgrade in another 6 months.

But you do pay a price that could often buy you 2 lower-end cards, and I can almost guarantee you that any ~£100 card in 3 years' time will beat whatever you can buy now.
 
But sometimes that's why people buy: because it's overkill getting 200fps now, in 3 years' time they can play a contemporary game on max and still get 40 fps. Thinking about the future, not having to upgrade in another 6 months.

That's what I did with my GF3, although I didn't know crap about gfx cards at the time. :D

I have read that the new Cats really give the X2 a boost. Still, it seems like the 280+/290/whatever could be "good enough" at 1920x1200 and below, to the point where an X2 would be throwing good money after bad. IF nV launch it at a sensible price... oh, what am I saying.
 