Feel free to enlighten us on where an X2 "decimates" a 280?
First and foremost, the majority of modern games are easily handled by cards below the current top-end flagships, such as the 4870, 4850, 3870 X2, 260, 9800 GTX (or GTX+), 8800 GTS 512, 8800 GT, 8800 Ultra/GTX, 8800 GTS 640, and even some of the newer budget cards like the 9600s.
Any review of such cards will easily point this out.
Second, cards like the 280 and the 4870 X2 are complete overkill for those games, unless you're running at 2560 with 16x AA, but then you're part of an extreme enthusiast market, so naturally those are the products for you.
Which means that unless you're benchmarking or running synthetic tests, the times when you will NEED one of these two cards for real-world performance are going to be acute situations, i.e. Crysis, or a modern game at the above-mentioned resolution and settings.
Crysis brings GPUs to their knees, and sometimes I don't agree that we should use it to set the bar, so to speak, but time and time again we do, and so do all the reviews.
This time around, the 280 and 4870 x2 don't fare that much better than their predecessors. There are plenty of results that show this right here >
http://images.tweaktown.com/imagebank/saph487x2_g_08.gif
At Very High, both cards continue to struggle. Neither is a clear victor, nor a clear loser. We can also see a few other truths:
GX2 - slightly higher frames; GX2 vs GTX 280 was all the buzz after the 280 launched. But even though the 280 was a few frames slower on the top end, the GX2 had nasty stutter, terrible minimum frame rates, and serious compatibility issues.
4870 - Does well; I've tested Crysis with a 4870, and that's almost dead on the type of results I received.
9800 GTX with an OC - Another card doing well; I've also tested Crysis with a 9800, and again the results are almost dead on what I received.
4870 Crossfire - A bit better than the single 4870, but most of us should know by now that Crossfire through the driver doesn't work well in Crysis.
4870 X2 Crossfire - We actually see a reduction in frames. Even though the Crossfire link is on the cards themselves, pairing two X2s together still relies on the driver. It's quite a problematic situation when set against the odds of Crysis not allowing Crossfire to scale well. A loss of performance is definitely not surprising.
You can go off and find a review showing Vista with everything on "High" at 1920 resolution, and there you will see most of the cards jump into the 30s. The X2 will have a slight lead, coming in around four to five frames ahead, whether AA is turned on or not. What you will also find is that the 280 has slightly better minimum frame rates. It's nothing to write home about, but in a game like Crysis, minimum frame stability is KING. Ultimately, once again, performance is not considerably one-sided.
But let's move away from Crysis. People can always complain that it's poorly coded, or that it's not optimized for ATi cards.
Let's try a different game; something recent and demanding.
Age of Conan. Unfortunately, not enough people use this game in their reviews, but it's worth doing so. In the few reviews we have seen, and I can vouch for this as I've tested it myself, the game seems to like ATi cards. The single 4870 does really well against a GTX 280, and the 4870 X2 does almost impressively well WITH transparency/adaptive AA applied. The 280 holds its own, and the majority of people would be absolutely happy with its performance at 1920 with max settings, bloom, transparency AA and 8x AA or even 16x AA, at the cost of a few frames. However, the X2 gains a noticeable twenty-plus fps in some areas. Once you pass sixty, what does it matter? But in a spirit that's quite unlike ATi cards, at times it actually has a higher minimum frame rate than the 280's average frame rate altogether.
Unfortunately, this is the only game (at least the only 'common' game) where the X2 actually shines.
If we look around at all the other reviews, with other games and applications in mind, the majority of them make one thing apparent when it comes to performance: the 280 and the X2 duke it out, with one the victor here and the other the victor the next time, but the performance is close enough that the average player wouldn't know the difference whatsoever.
Take that into consideration, and let's look at the rest of the reality.
The X2 is two GPUs. Some people say "I don't buy GPUs, I buy GFX cards," and to that I say 'sure, fair enough,' but it's a matter of perspective and principle.
If someone says the X2 is the fastest card out there, OK, that's easy enough to agree with, but it comes nowhere near "decimating" the 280, or even a 260/4870 for that matter. Getting 200 fps in FarCry is irrelevant. Potential frames are useless as a measurement of 'performance' if you spend 80% of your time in the low 20s, struggling to get a smooth gameplay experience.
So that's not how we're measuring the power of the X2. We're measuring it in realistic conditions, against a card (the 280) that performs more or less the same, and in some classes slightly slower, and THAT is what we should agree upon to determine which is faster. Because one GPU, two GPUs, three GPUs or four: it is still only slightly faster.
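If you want to see why average fps is such a hollow number, here's a rough sketch in Python. The frame times are completely made up to illustrate the point, not taken from any real benchmark:

```python
# Made-up frame times (in ms) for one hypothetical benchmark run.
# A few long spikes ruin the feel of a game even when the average
# frame rate looks acceptable on paper.
frame_times_ms = [30, 32, 28, 31, 29, 125, 30, 33, 110, 31]

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
min_fps = 1000 / max(frame_times_ms)

print(f"average fps: {avg_fps:.1f}")  # ~20.9 - the number on the bar chart
print(f"minimum fps: {min_fps:.1f}")  # ~8.0 - the stutter you actually feel
```

Two cards can post the same average while one of them dips like that every few seconds, which is exactly why the minimum matters more than the peak.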
That is perspective. If you have the right perspective, you see the truth of the matter.
Now let's move on to principle. Some would argue that for the amount of horsepower the X2 has under its hood, it isn't turning out impressive results. We could sit around and argue that the drivers are more immature than the 280's, but we know the performance will not change dramatically either way. There's nothing inherently flawed about the ATi drivers, nor the Nvidia ones. They'll get better, but not THAT much better.
So how is it that a dual-GPU card with GDDR5 and all the other bells and whistles doesn't even decimate a single 4870 in a lot of situations? Honestly, I don't know the answer. Obviously in some instances the scaling of Crossfire is very poor; but when you think back on all the multi-GPU cards, when has SLI/Crossfire ever scaled really well? When have two GPUs actually equalled more than a 40% gain?
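To put a number on 'scaling,' the math is trivial. A quick sketch, with purely hypothetical fps figures chosen to land in that typical 30-40% range:

```python
# Hypothetical numbers: one GPU vs. two of the same GPU in SLI/Crossfire.
single_gpu_fps = 30.0
dual_gpu_fps = 41.0  # a typical real-world multi-GPU result, not 60

# Gain over the single card, as a percentage of the single-card rate.
gain_pct = (dual_gpu_fps / single_gpu_fps - 1) * 100
print(f"scaling gain: {gain_pct:.0f}%")  # ~37%, nowhere near the ideal 100%
```

You pay for 200% of the hardware and get roughly 137% of the performance, and that's on a good day.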
I am not surprised at how dismal the X2's performance is, despite the power it possesses. It's just the unfortunate nature of modern multi-GPU cards. However, my perspective on the matter doesn't change the reality. And to some people, paying for something that should be at least 50% better and not getting such results is unacceptable - principle.
Lastly, let's consider the remaining variables.
The power draw on an X2 is almost a hundred watts higher at load.
Its idle temperature is almost one and a half times that of a 280, and its load temperature about one and a quarter times.
The cost (although I never argue price when dealing with enthusiast products, someone else always does, so we might as well discuss it anyway) is still $100 more, or 100 quid more, whichever applies to you.
It weighs quite a bit more.
It doesn't have any GPU physics support (whether that's relevant in today's market is debatable, as so few programs use physics on GPUs, but it's still a real issue for someone who wants a card ready to make use of GPU physics without waiting for an unknown future ATi card that will be physics capable. Future-proofing isn't only for CPUs).
So you see, their performance is competitive, whether across the mass of games or in acute situations. The difference is not the performance but the other factors, and these are all relevant factors; maybe not to you, maybe not to me, maybe not to 25% of the market, but when viewing all angles, all variables in the equation must be present and accounted for.
Lastly, moving on to the 'rehashed' 280. The 280's weak point is the shaders, no doubt about it. It's not so much that an increase from 1300MHz to 1600MHz is going to be a big deal, but rather that there's a frequency wall somewhere in the 1500-1600MHz range, much like you experience on your motherboards when overclocking. People have done successful volt mods and reached near 1600MHz, but it's very unstable. A card that was natively stable at 1600MHz would overcome that frequency block and give the shaders some room to stretch their legs.
This won't allow the 280 to make a fool out of the X2, but given how close their performance is at this time, it would give the 280 an edge in some situations, and in others nothing would change at all.
The more important thing is that the 280 would be more like it should have been at launch.
Now, you can make up whatever excuses you want, but reviews and testimonies tell a different story than the hype.
What I want others to understand is that neither the 280 nor the X2 is required by the mass of consumers, and if you do get into that range of cards, be prepared to face the reality when you get home and put your X2 in: it won't blow the doors off the 280, and there are other negatives to go with it.
If that's ok with you, then by all means... feel free.
Just don't try to BS people, 'cause you might just run into someone who has an X2 and a 280 in multiple machines sitting right beside him, and he'll tell you how it REALLY is ;)