Gears of War PC performance article

Guys, remember this is an article from [H].
As such, they are doing their usual dumbass apples-to-oranges comparisons.

The GTX performance looks terrible in Vista because that benchmark was run with 4xAA, whereas all the others had no AA. Likewise, the 2900XT numbers may look OK in DX10 until you remember that they decided to run it at a lower resolution than all the other cards.

I stopped reading HardOCP around four years ago, when they introduced this crazy benchmarking philosophy, not to mention rejecting standardised timedemos in favour of random FRAPS recordings.

Going on from that, you'll also note that they used 4xAA for Vista but not for XP :rolleyes: hence the 'terrible' Vista performance.

These guys are cowboys; they really don't know how to do things properly :rolleyes::rolleyes:
 
In the demo you'll probably find an 8800GT can use AA, just like it's been able to use 16xAA in UT3, so at least 4xAA should be possible in GOW unless its graphics are miles better than UT3's.

You won't get 16xAA, since the UE3 engine games (BioShock, UT3) will only do a max of 4xAA in DX9, which is why setting 16xAA is utterly pointless: the driver will just default it back to 4xAA or not apply any AA at all.

You would not have 16xAA playable on a GT even if it did support it.
 
And I've just received an email to say my copy has already shipped!! :eek: As my copy of TimeShift arrived earlier than expected, I'm hoping this will be the same.
 
What's wrong with [H] reviews?

I care more about the maximum resolution/detail/AA settings a card can play a game smoothly at than I do about whether I get 72 or 76 fps.
 
[TW]Fox;10428494 said:
What's wrong with [H] reviews?

I care more about the maximum resolution/detail/AA settings a card can play a game smoothly at than I do about whether I get 72 or 76 fps.

Because of statements like this:
Review said:
There definitely was an improvement in image quality by running with 4X AA in this game.

Djeez, really?! I had no idea!!
 
You won't get 16xAA, since the UE3 engine games (BioShock, UT3) will only do a max of 4xAA in DX9, which is why setting 16xAA is utterly pointless: the driver will just default it back to 4xAA or not apply any AA at all.

You would not have 16xAA playable on a GT even if it did support it.

Weird how the fps changes from 4xAA to 8xAA+ then, isn't it? And as for your GTX, if what you're saying is true, then the GTX won't be playable with 16xAA either.

Yeah, but doesn't 4xMSAA have pretty much the same performance impact as 16xCSAA?

Unless you mean 16xQ, which I don't think any card could run UT3 with. :eek:

No, when I chose 16xQ AA in the NVCP the fps dropped to an awful low. I mean the standard 16xAA.
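
For reference, here's a rough sketch of how the G80 AA modes usually break down. These are the commonly quoted sample counts, not figures from the review, so treat it as a ballpark guide:

```python
# Commonly quoted sample counts for the G80-era NVIDIA AA modes.
# Colour/Z samples are what cost real memory and bandwidth;
# coverage samples (CSAA) are comparatively cheap.
aa_modes = {
    "4x MSAA":  {"colour_z": 4, "coverage": 4},
    "8x CSAA":  {"colour_z": 4, "coverage": 8},
    "16x CSAA": {"colour_z": 4, "coverage": 16},
    "8xQ":      {"colour_z": 8, "coverage": 8},
    "16xQ":     {"colour_z": 8, "coverage": 16},
}

for name, s in aa_modes.items():
    print(f"{name:>8}: {s['colour_z']} colour/Z samples, {s['coverage']} coverage samples")
```

Which is why plain 16x (CSAA) costs about the same as 4x MSAA (the colour/Z work stays at 4 samples), while 16xQ hammers the framerate because it doubles the colour/Z samples to 8.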

Going on from that, you'll also note that they used 4xAA for Vista but not for XP :rolleyes: hence the 'terrible' Vista performance.

These guys are cowboys; they really don't know how to do things properly :rolleyes::rolleyes:

I think you're right, they might be cowboys: crap charts, silly methods, silly things they say, and I just find their wording unprofessional. I know it sounds stupid, but it's just how they explain things; sometimes it seems like a 15-year-old has written it.
 
Weird how the fps changes from 4xAA to 8xAA+ then, isn't it? And as for your GTX, if what you're saying is true, then the GTX won't be playable with 16xAA either.

The GTX has higher memory bandwidth and more memory, so I would have thought it would handle 16xAA better than the GT if it were possible in the game (albeit at a **** poor fps). It's well known that you'll take the fps hit but the IQ won't change when you push the AA level past 4x; look at the review, the highest AA level available in this game is 4xAA even in DX10.
 
The GTX has higher memory bandwidth and more memory, so I would have thought it would handle 16xAA better than the GT if it were possible in the game (albeit at a **** poor fps). It's well known that you'll take the fps hit but the IQ won't change when you push the AA level past 4x; look at the review, the highest AA level available in this game is 4xAA even in DX10.

No, your GTX would not handle 16xAA if the GT can't; you're just trying to make the GTX look a lot better than a GT now, which it ain't.

So if the fps hit is there but there's no IQ change, then it must still reflect the performance you would get.

I'll check the AA in UT3 when I get home; if I see no IQ difference, I'll raise the white flag, OK?

When is the GOW demo out? Apparently today??
 
No, your GTX would not handle 16xAA if the GT can't; you're just trying to make the GTX look a lot better than a GT now, which it ain't.
He's not though; the GTX is miles ahead in terms of memory bandwidth and sheer amount of memory, and that's what matters for AA/AF. :confused:
 
He's not though; the GTX is miles ahead in terms of memory bandwidth and sheer amount of memory, and that's what matters for AA/AF. :confused:

Well, I can't see it getting 16xAA; that's 3x more than 4xAA. Does the GTX have 3x more memory bandwidth? No.
 
Well, I can't see it getting 16xAA; that's 3x more than 4xAA. Does the GTX have 3x more memory bandwidth? No.
You can't compare 4xMSAA with 16xCSAA as "3x more AA" though, since 4xMSAA and 8xCSAA give about the same performance hit. I think you should stop being sarky and quit before you dig yourself into a hole here.
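
To put rough numbers on it, here's a back-of-the-envelope sketch. The 1920x1200 resolution is just my example, the per-sample sizes (32-bit colour + 32-bit Z) are a simplification that ignores framebuffer compression, and the card figures are the standard published 8800 GT / GTX specs rather than anything from this thread:

```python
# Rough framebuffer cost per AA mode at an example 1920x1200 resolution.
# Assumes 32-bit colour + 32-bit depth/stencil per sample and ignores
# compression, so the numbers are ballpark only.
WIDTH, HEIGHT = 1920, 1200
BYTES_PER_SAMPLE = 4 + 4  # colour + depth/stencil

def buffer_mb(colour_z_samples):
    return WIDTH * HEIGHT * colour_z_samples * BYTES_PER_SAMPLE / 2**20

# 16x CSAA stores only 4 colour/Z samples; the extra coverage samples are tiny.
for mode, samples in [("No AA", 1), ("4x MSAA", 4), ("16x CSAA", 4), ("8xQ/16xQ", 8)]:
    print(f"{mode:>9}: ~{buffer_mb(samples):.0f} MB of colour/Z storage")

# Published specs for the two cards being argued over.
cards = {
    "8800 GT":  {"memory_mb": 512, "bandwidth_gbs": 57.6},
    "8800 GTX": {"memory_mb": 768, "bandwidth_gbs": 86.4},
}
ratio = cards["8800 GTX"]["bandwidth_gbs"] / cards["8800 GT"]["bandwidth_gbs"]
print(f"GTX vs GT bandwidth: ~{ratio:.1f}x")
```

So the GTX has roughly 1.5x the bandwidth and an extra 256 MB, which is why it copes better with high AA settings, but 16x CSAA doesn't need "3x the bandwidth" of 4x MSAA in the first place, because it keeps the same 4 colour/Z samples.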
 
I'm not being sarky......
The way you said "does it have 3x more memory bandwidth? No" just sounded a bit sarky to me.

Anyway here's a table that shows exactly which forms of AA do what. Good guide.

[Image: table of AA modes and what they do]
 
I'd never read a [H] review before and I probably won't try again. Apart from the confused benchmarking, it was just a chore to read. Does anyone else feel they managed to take the tedium of a benchmarking article and make it even harder to read?

GOW for PC is a third person game, which is totally weird 'cause the last game I played was a first person game. GOW is not a first person game but is actually played from a third person perspective. In GOW you control the action from behind the guy, and not in the guy like you might expect. Some people may find it hard not being in the guy, like using his eyes; instead you are behind the guy, kinda like you are in another guy using his eyes and this guy is following you around. We didn't find being in another guy's eyes hard at all, in fact we thought it was pretty easy. You may find being in a guy's eyes following another guy whose eyes you don't occupy difficult because you're a PC gamer and only know how to be in the eyes of the guy you're controlling. This is what is meant by third person. Up next, the impact gears have had on wars.
 
I'd never read a [H] review before and I probably won't try again. Apart from the confused benchmarking, it was just a chore to read. Does anyone else feel they managed to take the tedium of a benchmarking article and make it even harder to read?

GOW for PC is a third person game, which is totally weird 'cause the last game I played was a first person game. GOW is not a first person game but is actually played from a third person perspective. In GOW you control the action from behind the guy, and not in the guy like you might expect. Some people may find it hard not being in the guy, like using his eyes; instead you are behind the guy, kinda like you are in another guy using his eyes and this guy is following you around. We didn't find being in another guy's eyes hard at all, in fact we thought it was pretty easy. You may find being in a guy's eyes following another guy whose eyes you don't occupy difficult because you're a PC gamer and only know how to be in the eyes of the guy you're controlling. This is what is meant by third person. Up next, the impact gears have had on wars.

Crap reviews. Can't believe mav wanted to wait for the [H] review before thinking about buying an 8800GT; I don't know who wants to trust [H] reviews. Tom's Hardware are better, at least the wording they use is more professional.
 