Gears of War PC performance article

Looks to be scaled pretty well. Should play well on my rig then, as I play at 1280x1024 :) Thanks for the link.
 
Yay, finally. The Vista performance for the GTX looks terrible. Also, am I the only one not able to see an image quality difference between DX9 and DX10 in that?
 

Seems to be a recurring theme, with DX9 and DX10 looking identical. Now to wait for someone to come into the thread and spout the "omg dx10 is for better performance not more eye candy" rhetoric. :D
 
Guys, remember this is an article from [H].
As such they are doing their usual dumbass apples-to-oranges comparisons.

The reason the GTX performance looks terrible in Vista is that the benchmark was run with 4xAA, whereas all the others had no AA. Likewise the 2900XT numbers may look OK in DX10, until you remember that they've decided to run it at a lower resolution than all the other cards.

I stopped reading HardOCP around 4 years ago or so when they introduced this crazy benchmarking philosophy, not to mention rejecting standardised timedemos in favour of random FRAPS recordings.
 


Standardised timedemos can have optimisations put in place in the drivers by Nvidia and ATI. Remember Nvidia got caught cheating at 3DMark a while back? That's one of the reasons they stick to actual gameplay.
 

Why are they so retarded in their testing? :confused:
 

Standardised timedemos can only be gamed like that if you stick to the generic ones everyone uses - if they created their own timedemos for each comparison review it would be fine. Obviously some games don't have this capability, but I'm thinking more in terms of Doom 3 and the like, which is when they started bringing this in.

Using FRAPS recordings is a lottery: they are just playing through the game, so different things can happen each time. When cards are only separated by a few percent, that variation can make all the difference.

Sure, you can use it to get an overall 'feel' for how the different cards perform, but I can't take it seriously as a proper benchmarking resource.
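To put a rough number on that point, here is a quick hypothetical illustration (completely made-up FPS figures, nothing to do with [H]'s actual data) of how the run-to-run spread from manual playthroughs can easily be bigger than a few-percent gap between two cards:

# A minimal sketch, with invented numbers, of why run-to-run variance matters
# when two cards are only a few percent apart on average.
from statistics import mean, stdev

# Five hypothetical FRAPS playthroughs per card (average FPS for each run).
card_a_runs = [61.2, 58.7, 63.4, 59.9, 62.1]
card_b_runs = [59.8, 62.5, 58.1, 61.0, 60.3]

for name, runs in (("Card A", card_a_runs), ("Card B", card_b_runs)):
    print(f"{name}: mean {mean(runs):.1f} fps, run-to-run stdev {stdev(runs):.1f} fps")

# Here the means differ by roughly 1%, but individual runs swing by a few
# percent either way, so a single manual playthrough can easily rank the
# cards the wrong way round.

A scripted timedemo removes that variation, which is exactly why it is useful for close comparisons.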
 
"We are quite disappointed in the trend lately with Unreal Engine 3 games having a lack of antialiasing control"

"This is very disappointing for a next generation gaming engine. It is as if AA is an afterthought with Unreal Engine 3 based games, and this disappoints us greatly. Visual quality should be moving forward, it is now late 2007, almost 2008, there is no reason current games should not have in-game controls for AA and AF."



I wish Epic would sort that out.
 
Also, did anyone notice that texture quality was set to "medium" on the GTS 320 & 2900XT, rather than the "highest" used on the GT/GTS 640/GTX, in both DX10 & DX9?
 
[H] reviews are rubbish, totally unfair. It doesn't matter if the GTS 320 or 2900 could not run properly with high settings; for the purpose of the comparison they should all be on high. They should also use better charts.

I also don't believe that it is unplayable with 4xAA on an 8800GT. This game is based on UE3, and yet I'm able to play UT3 with 16xAA just fine, so I hardly think 4xAA is going to make the game lag.
 
Looks like it's running much better on the PC than a certain group of now-gutted fanatics predicted.

Glad I only finished Act 1 on the Xbox 360 version, going to sell it now and get the PC version when it's out.
 
It's Unreal Engine 3. It was originally designed on the PC and, as we saw with BioShock and UT3, it runs damn well :)

I never worried for a second that it would run anything other than brilliantly.
 
Considering Rainbow Six Vegas ran like a crippled tortoise, I doubt it. :)
 