
GT200 GTX280/260 "official specs"

Why are you posting Crysis benchmarks? One dodgy game that is so badly coded it's unreal. Please look at the majority of the other benches from the same site; the increase over the GTX is massive in many games. Crysis, LMAO.
People were saying the same thing about FEAR when that was released, when in fact they were in denial that their PCs couldn't run it maxed at a steady 60fps. I find it amusing that a few years on, the same peeps (who are now running FEAR at 250+ fps) are saying the same carp about Crysis. *oh noes, I've spent x amount on my rig and it doesn't play Crysis, there's only one explanation for it - it must be badly coded!*

Granted, Crysis may have its fair share of bugs and whatnot, but the only reason it ain't running at 60+ fps for anyone is because we're still waiting for the hardware to catch up. Like someone said a page or so back, if the GTX280 could run Crysis maxed at a steady 60 fps then it'll more than likely run any game maxed for the next few years (think 8800GTX), in which case I and many others wouldn't mind shelling out £450-£500 for one (or two for the really greedy). As it stands though, spending £450 on one of these "new gen" cards (imo) wouldn't be a wise move when they already get brought to their knees by the latest and greatest graphically intensive game. I guess the bottom line is, Nvidia really need to take a leaf out of ATI's book and price these cards in line with the performance they offer.
 
People were saying the same thing about FEAR when that was released, when in fact they were in denial that their PCs couldn't run it maxed at a steady 60fps. I find it amusing that a few years on, the same peeps (who are now running FEAR at 250+ fps) are saying the same carp about Crysis. *oh noes, I've spent x amount on my rig and it doesn't play Crysis, there's only one explanation for it - it must be badly coded!*

7800gtx could run FEAR at 1600x1200 no problem
 
People were saying the same thing about FEAR when that was released, when in fact they were in denial that their PCs couldn't run it maxed at a steady 60fps. I find it amusing that a few years on, the same peeps (who are now running FEAR at 250+ fps) are saying the same carp about Crysis. *oh noes, I've spent x amount on my rig and it doesn't play Crysis, there's only one explanation for it - it must be badly coded!*

Completely agreed.

It's far too easy to say a game is "badly coded" rather than accept that their hardware (and in fact any current hardware) is not capable of running it at full detail levels. It's sad really, because people are always talking about "future-proofing" etc, but when it's actually staring them in the face (in the form of 'high' and 'very_high' detail levels in Crysis that they can enable in 2 or 3 years to play through the game again) they cry "badly coded!"

Seriously - how many of you who are crying "badly coded" would recognise unoptimised code if it was put in front of your face? Can't people just accept that the developers have included graphical modes to extend the life of the product, without fretting that they can't access those modes right now?




As it stands though, spending £450 on one of these "new gen" cards (imo) wouldn't be a wise move when they already get brought to their knees by the latest and greatest graphically intensive game. I guess the bottom line is, Nvidia really need to take a leaf out of ATI's book and price these cards in line with the performance they offer.

Have to disagree here though. Nvidia's marketplace is defined by the other hardware available, not the software market. There will always be a heavy price-tag associated with "the fastest that money can buy", and Nvidia are right to charge a heavy price for this. That's not to say they won't later be forced to reduce this price to compete with ATI's offerings - it's shaping up to look like they will. But still, it's the hardware (not software) market which will cause this change.
 
I think a lot of the pricing also incorporates what CUDA can offer. Unfortunately, for the serious gamer, that really isn't much at the moment and, aside from physics, may not be much in the future.

Let's not forget there is starting to be a market outside of gaming for these cards.

Having said that, I'm not justifying the price - way too much for too little an increase in gaming performance for me to consider upgrading.
 
Crysis is just like when Doom 3 came out, and FEAR as well. As already said, it was said before the game was even released that it wouldn't be playable with everything maxed for a couple of years. So why buy the highest-end card to try what is impossible? :confused: Better off just getting a decent card, being happy with that, and saving a shed load of cash. Also, with how fast new cards are coming out now, I don't think you'll see massive gains anymore anyway.
 
The 8800GTX was the top end and is still sought after (for good reason if you like high res/AA/AF). It has stood the test of time like few other gfx cards. My GTX cost £330 probably 2 years ago and I think that was money well spent. Can anybody expect NV's latest card to be cutting the mustard in 2 years' time?

Who knows? When the 8800GTX was released, no one knew that it would last as long as it did.

Everybody is complaining about the 280 because of its price. The fact is, the card is significantly more powerful than the previous single-GPU generation, does not have the SLI problems associated with the GX2, and its price is in line with previous-generation high-end cards. ATI is unable to compete at this level and that is why they are playing the "bang for buck" game; if they could produce a more powerful card, they would, and would price it high.

Whether it is justified to spend £460 on a card depends on your income, some people can afford it and some can't. But that does not detract from the card being a significant step forward in GPU technology.
 
Everyone's forgetting a small word that means something in all this .......


"DRIVERS"!


The 8800GTX drivers are near perfect now; try uninstalling them and using the original driver disk to play Crysis/COD4 and see what happens - the performance hit will be huge.

I think all this slagging off of the GTX280 will be over when ATI release their offering.........





................... and it's the same performance or slower!!!!!

Let's face it, I doubt it'll be quicker; the only thing they can do is make it cheaper!
 
I can afford the cost of a GTX280, but I have common sense, and common sense tells me spending £460 on THIS "high end" part is an immense waste of money - unlike the 8800GTX at the time, because it was the best card out by a long, long, looong way.

The GTX280 is nowhere near as "awesome" compared to previous cards as the 8800GTX was in its day.

If it was £350 then yes, the price would be fully acceptable, but at over £100 more than this it simply isn't - especially since it does not have DX10.1, which the ATI cards do have, making them technologically more advanced.

But back then most people knew the G92 was coming, and right now I have a feeling the exact same will happen with the next-gen cards. Like Apple re-releasing the iPhone with better specs and a cheaper price, nVidia are using this time to grab as much cash as possible so the next-gen ones can be good and cheap, perhaps...
 
If it was £350 then yes, the price would be fully acceptable, but at over £100 more than this it simply isn't - especially since it does not have DX10.1, which the ATI cards do have, making them technologically more advanced.


Have a quick look and you'll find them at £400; wait till the ATI cards come out and they will be £350.

This time the fastest GPU on the market is £50 cheaper than it was 3 years ago when I bought my 8800GTX... doesn't the cost of living and such go up in price?

They could quite easily charge £500 (OCUK £550); how long can you expect to buy the same thing at less cost?
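A back-of-the-envelope sketch of that argument, using only the figures quoted in this post (the ~3% annual inflation rate is my own illustrative assumption, not a sourced number):

```python
# Rough inflation-adjustment of the poster's claim.
# Assumption: ~3% annual inflation (illustrative figure only).
price_8800gtx = 450          # GBP: implied launch price (£50 more than today's £400)
price_gtx280 = 400           # GBP: GTX280 price quoted above
years = 3
inflation = 0.03

# What the 8800GTX launch price would be in today's money:
adjusted = price_8800gtx * (1 + inflation) ** years
print(round(adjusted))                  # ≈ 492
print(round(adjusted - price_gtx280))   # ≈ 92: the gap in real terms, not just £50
```

So if prices had merely tracked inflation, the nominal £50 drop would actually be closer to £90 in today's money, which is the point being made.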
 
Cost of living? What's that got to do with graphics cards, LMAO! If this were the case then the ATI cards would be similar in price, but they are not; the US price differences alone are proof of this, and you can work out the final UK price quite accurately, because while a GTX280 goes for ~$640-$700, the Radeon 4850 and 4870 are going for $199-$299.
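To put that gap in numbers, a quick sketch using only the US prices quoted above (taking the midpoint of the GTX280 range is my own simplification):

```python
# Rough price gap between the cards, from the US prices quoted above.
gtx280_usd = (640 + 700) / 2   # midpoint of the quoted $640-$700 range
hd4870_usd = 299               # top of the quoted ATI range
ratio = gtx280_usd / hd4870_usd
print(round(ratio, 2))         # ≈ 2.24: over twice the price of a 4870
```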
 
Cost of living? What's that got to do with graphics cards, LMAO!


Doesn't your house (if you have one) and your wages increase each year? Doesn't it cost a little more to buy a can of coke one year after the next? Doesn't the latest PlayStation cost that bit more each time one comes out?

To spell it out for you: everything increases in price, if you haven't noticed.
 
Not "most" technological items, unless the maker wants to charge a premium for something that is not premium performance. Look at the iPhone again as an example of how a newer model with more features can be released at less than half the cost of the original, released mere months ago.

As in, to spell it out, it has nothing to do with living expenses... this is not petrol.
 
The problem with Crysis is not that it runs poorly on max details but that it scales very badly: run it at medium and the framerate is still crap when you apply 4xMSAA/16xAF, and it doesn't even look that good anymore compared to other games.

And in a few weeks time the 9800GX2 will be EOL and the price of the GTX280 will have fallen dramatically.
 