Yay, finally. The Vista performance for the GTX looks terrible. Also, am I the only one not able to see an image quality difference between DX9 and DX10 in that?
Guys, remember this is an article from [H].
As such they are doing their usual dumbass apples-to-oranges comparisons.
The reason the GTX performance looks terrible in Vista is that its benchmark was run with 4xAA, whereas all the others had no AA. Likewise, the 2900XT numbers may look OK in DX10, until you remember that they decided to run it at a lower resolution than all the other cards.
I stopped reading HardOCP around four years ago, when they introduced this crazy benchmarking philosophy, not to mention rejected standardised timedemos in favour of random FRAPS recordings.
Standardised timedemos can have optimisations put in place in the drivers by Nvidia and ATI. Remember Nvidia got caught cheating at 3DMark a while back? That's one of the reasons they stick to actual gameplay.
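Roughly speaking, a FRAPS-style run just timestamps every frame during live play and works out FPS after the fact, so there's no fixed camera path for a driver to detect and special-case. A minimal sketch of the idea (render_frame is a hypothetical stand-in for the game's render call, not a real API):

```python
import time

def capture_frametimes(render_frame, duration_s=60.0):
    """Record per-frame times over a live gameplay run (FRAPS-style),
    rather than replaying a canned timedemo a driver could recognise."""
    frametimes = []
    start = time.perf_counter()
    prev = start
    while time.perf_counter() - start < duration_s:
        render_frame()                 # hypothetical: the game's render call
        now = time.perf_counter()
        frametimes.append(now - prev)  # seconds spent on this frame
        prev = now
    return frametimes

def summarise(frametimes):
    avg_fps = len(frametimes) / sum(frametimes)  # frames over total time
    min_fps = 1.0 / max(frametimes)              # worst single frame
    return avg_fps, min_fps
```

The flip side, of course, is that no two FRAPS runs cover exactly the same frames, which is why people complain the numbers aren't repeatable the way a timedemo is.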
Looks like it's running much better on the PC than a certain group of fanatics predicted.
It's Unreal Engine 3. It was originally designed on PC, and as we saw with BioShock and UT3, it runs damn well.
I never worried for a second that it would run anything other than brilliantly.