
Can this on the Inquirer be believed?

Meh, at the end of the day, even if those scores are accurate (they might be), the biggest jumps, as in the 100-3000% ones, all scream of "our previous card didn't have enough memory for this resolution before" type performance increases. Take them out and it looks less impressive. Frankly, look at the 1920x1200 scores: in most games at the moment the 9800GTX is fine enough, and except in Crysis, which I'll probably never play again, there's not going to be a noticeable difference in most games. Maybe some 8xAA over 2x or 4x, but AA gives an ever-decreasing IQ improvement; 2xAA has the biggest effect.

But again, for the 1-2 games that will feel noticeably different, do I want to spend, what, maybe £400+, when I can have even 75-80% of the performance for £180-220 from ATi, with drivers I trust?

I just don't care about raw speed; I care about the price/performance ratio, but only up to a point. If I can't get playable performance in Crysis but everything else rocks along at 90fps, I don't care. For Crysis, maybe the £400 GTX 280 beats the 9800GTX on price/performance, but I'd still buy a 9800GTX out of the two for £150.
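To put some rough numbers on the price/performance argument above, a quick sketch (the fps figures are purely hypothetical; only the prices come from the post — £400 for the GTX 280, £150 for the 9800GTX, and the claimed ~75-80% relative performance):

```python
# Rough price/performance sketch. The fps numbers are made-up
# placeholders; only the prices and the ~78% ratio come from the post.
cards = {
    "GTX 280":  {"price": 400, "fps": 100},  # hypothetical baseline fps
    "9800 GTX": {"price": 150, "fps": 78},   # ~78% of the GTX 280
}

for name, c in cards.items():
    print(f"{name}: {c['fps'] / c['price']:.3f} fps per £")
```

Even with the faster card winning outright on fps, the cheaper card wins comfortably on fps per pound, which is the whole point being made.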
 

Indeed. All very true.

I'm going to hold off on upgrading for a long time as my 8800GTS640 runs any and all games I want to play without problems.

Maybe I'll upgrade once some uber "must-have" game comes out that brings it to its knees, but there's nothing on the horizon that tempts me...
 
I think they can best be viewed as "a good starting point". It is not the increase we (I hope I can speak for everyone) hoped for, but in situations where it isn't CPU-limited, there seems to be at least a 25% increase.

For me (at 1280x1024) there's no point spending that much money on a gfx card, as unless I can get above the threshold of the CPU limit, I'll never see a decent framerate improvement compared to the 8800GTX, or even the G92 8800GTS/GT.

Looks like I'll have to get a new monitor to get the best outta these latest cards.

EDIT: Although a lot of peeps are discounting it as a benchmark now, I'm actually pretty surprised how poorly Crysis performs with the GTX 280 (if these benches are legit). I wonder what it can do with the CPU at, say, 3.6GHz rather than 3.0GHz.

Matthew
 
(I think it's Irish humour :p)

It'll be nice to see how the cards develop over driver iterations. Let's be honest, the only real disappointment with these new cards is price vs performance. Way too much for too little increase at the moment. I'm sure they'll get better over time.

Matthew
 

Yeah, well, the 177.26 drivers are supposed to give quite a boost, and that includes G92 cards. The GTX 280 is coming out with 177.34 drivers, which are meant to be even better (they're the ones reviewers have been told to use), so I expect them to be tweaked for Crysis, 3DMark06, Vantage, World in Conflict etc. performance.

Hopefully current G92 users will see similar gains from these as well.

Plus the 177.34 may have physics in them, so life gets even more interesting...
 
Meh, at the end of the day, even if those scores are accurate (they might be), the biggest jumps, as in the 100-3000% ones, all scream of "our previous card didn't have enough memory for this resolution before" type performance increases. Take them out and it looks less impressive. Frankly, look at the 1920x1200 scores: in most games at the moment the 9800GTX is fine enough, and except in Crysis, which I'll probably never play again, there's not going to be a noticeable difference in most games. Maybe some 8xAA over 2x or 4x, but AA gives an ever-decreasing IQ improvement; 2xAA has the biggest effect.

But again, for the 1-2 games that will feel noticeably different, do I want to spend, what, maybe £400+, when I can have even 75-80% of the performance for £180-220 from ATi, with drivers I trust?

I just don't care about raw speed; I care about the price/performance ratio, but only up to a point. If I can't get playable performance in Crysis but everything else rocks along at 90fps, I don't care. For Crysis, maybe the £400 GTX 280 beats the 9800GTX on price/performance, but I'd still buy a 9800GTX out of the two for £150.

To me these scores (if true) seem quite impressive; on some games there is more than a 100% improvement. This is a high-end card, and price/performance at the high end is never good.

This card is as fast as a GX2 without having to use SLI, which has problems with many games and doesn't allow multi-monitor use. So for me, it's quite an improvement.
 
Yeah... I was expecting a backlash after I posted it, but he took it in good humour :) No doubt we'll get some knight in shining armour appearing at some point to defend his honour.
 
http://www.theinquirer.net/gb/inquirer/news/2008/06/14/complete-gtx280-scores-here

Not an Nvidia fan, but it looks interesting depending on cost.

Possibly posted elsewhere on here; I only checked the first three pages.

Bowza

Yes, it was posted in the huge "Nvidia 260/280" thread that has been at the top of this section for about 2 weeks: http://forums.overclockers.co.uk/showthread.php?t=17876541

What's this obsession with "/tread", pyshcas?

Don't you mean "/thread"?

His command of the English language is only slightly better than that of an antelope. :D
 
Why are people interested in physics being available? The games that support it are the same games that support Ageia, as in none. You'd still need a spare card to run them, and money's best spent on a 2nd card for SLI/Crossfire over physics. Realistic physics don't do anything for games; time spent on levels = better, more destructible, realistic-feeling levels. Physics has smeg all to do with whether a designer programs a wall as a solid indestructible thing or a breakable, tactics-influencing, realistically destroyable wall.

Even if more games support the physics from Nvidia, which is unlikely as both Intel and AMD support Havok and will almost certainly both be integrating graphics on the CPU die, Havok will be available in 99% of systems in a year or two, with Ageia/Nvidia available only in those Nvidia systems with a 2nd/3rd unused graphics card. There's simply no reason to move physics off the CPU core; not a single thing shown so far improves gameplay with "better" physics.
 
Yeah... I don't really get the whole "wow0rz physicz" attitude either. It's going to make very little, if any, difference to our gaming experience.
 
Why are people interested in physics being available? The games that support it are the same games that support Ageia, as in none. You'd still need a spare card to run them, and money's best spent on a 2nd card for SLI/Crossfire over physics. Realistic physics don't do anything for games; time spent on levels = better, more destructible, realistic-feeling levels. Physics has smeg all to do with whether a designer programs a wall as a solid indestructible thing or a breakable, tactics-influencing, realistically destroyable wall.

Even if more games support the physics from Nvidia, which is unlikely as both Intel and AMD support Havok and will almost certainly both be integrating graphics on the CPU die, Havok will be available in 99% of systems in a year or two, with Ageia/Nvidia available only in those Nvidia systems with a 2nd/3rd unused graphics card. There's simply no reason to move physics off the CPU core; not a single thing shown so far improves gameplay with "better" physics.

Well, you have made a few assumptions which, as far as I know, are incorrect. The Nvidia driver will mean that any 8-series or above card can run physics, and nobody is sure whether it will require a 2nd card or not.

For example, if you are running a GTX 280 and getting 160fps in, say, COD4, why not use some of the shaders for physics and drop the framerate down to 100fps?
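In frame-time terms, the trade-off above is easy to see: dropping from 160fps to 100fps frees the difference in per-frame time for physics work. A quick check of the numbers from the post:

```python
# Frame-time view of trading fps for physics: the gap between the two
# per-frame budgets is what the shaders would spend on physics instead.
def frame_time_ms(fps):
    return 1000.0 / fps

budget_for_physics = frame_time_ms(100) - frame_time_ms(160)
print(f"{budget_for_physics:.2f} ms per frame freed for physics")  # 3.75 ms
```

So giving up 60fps at that framerate only costs 3.75ms of each frame, which is the budget the physics work would get.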

Secondly, if you have PCI-E slots spare and are looking at upgrading, instead of selling your 8800GT or 9600GT for £20-£40 or less (in the future; GTs are already only worth £50 on the bay), why not keep the GT for physics?

So we will know for sure how the physics is implemented in a few days' time.

Lastly, GPUs are much better at doing the calcs for physics than CPUs ever will be. That said, once we have 8/12/16-core CPUs where only two or four are being used for games, it would make more sense to use the spare 4+ cores for physics calculation.
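The "spare cores" idea above can be sketched in a few lines: hand the physics step to a worker pool while the main loop does something else. The `physics_step` function here is a stand-in for illustration, not any real engine's API:

```python
# Minimal sketch of offloading physics to spare cores: the physics step
# runs in a worker thread while the main loop would keep rendering.
from concurrent.futures import ThreadPoolExecutor

def physics_step(positions, velocities, dt):
    # naive Euler integration as a stand-in for real physics work
    return [p + v * dt for p, v in zip(positions, velocities)]

pool = ThreadPoolExecutor(max_workers=4)  # the "spare" cores

positions = [0.0, 1.0, 2.0]
velocities = [1.0, 1.0, 1.0]

# kick off physics for the next frame, render the current one meanwhile
future = pool.submit(physics_step, positions, velocities, 0.016)
positions = future.result()
pool.shutdown(wait=True)
print(positions)
```

A real engine would pipeline this (physics for frame N+1 overlapping rendering of frame N), but the division of labour is the same.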
 