GEARS PC performance article

Well, I can't see it getting 16xAA. That's 3x more than 4xAA - does the GTX have 3x more memory bandwidth? No.
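If anyone wants to sanity-check the scaling argument, here's a rough back-of-envelope sketch (my own numbers, not from any review, and it assumes brute-force MSAA storage rather than CSAA's cheaper coverage samples):

```python
# Rough back-of-envelope: how framebuffer storage (and the memory traffic a
# resolve has to touch) grows if every AA sample is stored in full.
# Ignores colour/Z compression and CSAA's coverage-only samples, so real
# hardware does better than this.

def framebuffer_mb(width, height, samples, bytes_per_sample=8):
    # assume 4 bytes colour + 4 bytes depth/stencil per stored sample
    return width * height * samples * bytes_per_sample / (1024 ** 2)

for samples in (1, 4, 16):
    size = framebuffer_mb(1680, 1050, samples)
    print(f"{samples:>2}x samples at 1680x1050: ~{size:.0f} MB")

# ~13 MB with no AA, ~54 MB at 4x, ~215 MB at 16x -- i.e. storing 16 full
# samples costs 4x as much as 4xAA, which is exactly why CSAA only keeps
# 4 (or 8) real colour samples and fakes the rest with coverage data.
```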

LOL.

I love it when Will tries to debate something he has absolutely no clue about.

Keep digging, sunshine. Listen to Ulf, you might learn something ;)
 
LOL.

I love it when Will tries to debate something he has absolutely no clue about.

Keep digging, sunshine. Listen to Ulf, you might learn something ;)

I still don't believe your almighty, beastly powerful GTX will be able to push 16xAA in UT3 or GoW if my GT apparently struggles with 4xAA.
 
Listen to Ulf, you might learn something ;)
I don't hear that very often around here! :eek:

I still don't believe your almighty, beastly powerful GTX will be able to push 16xAA in UT3 or GoW if my GT apparently struggles with 4xAA.
Is there a UT3 demo yet? I can try it on my GTX today and tell you if he's lying.

Not sure why he would though, what would he possibly have to gain? I imagine UT3 runs a lot better than Oblivion as well, and my GTX handles 16xCSAA just fine in that to be honest.
 
[TW]Fox;10428494 said:
What's wrong with [H] reviews?

I care more about the maximum resolution/detail/AA setting a card can play a game smoothly at than I do about whether it gets 72 or 76 fps.

The problem is that VERY few people buy a gfx card for one single game.

In order to make an educated performance comparison between different cards, you need to have apples-to-apples comparisons (obviously at a range of different settings to get a feel for the kind of info you want).

Furthermore, their style of benchmarking is far too subjective - you are letting them make an arbitrary choice as to what settings you should be using. How do you know whether you could get away with higher settings for not much performance loss? Or maybe the min FPS at their chosen settings is too low for your liking, and you want to know how it fares at a lower resolution? What if you have a TFT monitor and don't like moving outside the native resolution (or maybe cannot even run the resolution they used)?

The advantage of standard benchmarks (preferably listing MIN fps in addition to the average) is that numbers are impartial. You can look at the results for different settings and work out what suits you best. You are not reliant on a decision taken by the reviewer as to what settings each card is or is not capable of running the game at. In fact, in some ways it smacks of lazy reviewing - a way of simply not having to run such a high number of benchmark runs (maybe half a dozen for each card instead of, say, 20).

Now, the [H] reviews may make a nice bit of reading and something to use in conjunction with 'normal' ones, but I certainly wouldn't draw any judgements from them on their own.
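To put the min/average point into practice, here's a tiny sketch of what I mean by impartial numbers - feed it a per-frame frametime log (the filename and one-value-per-line format are just made up for illustration) and the average and minimum FPS a standard benchmark would report fall straight out, for anyone to re-slice at whatever settings they care about:

```python
# Minimal sketch: turn a per-frame frametime log (milliseconds, one value
# per line; the filename is hypothetical) into the average and minimum FPS
# figures a standard benchmark table would list.

def fps_summary(frametimes_ms):
    fps = [1000.0 / ms for ms in frametimes_ms if ms > 0]
    return sum(fps) / len(fps), min(fps)

with open("ut3_demo_frametimes.txt") as log:
    frametimes = [float(line) for line in log if line.strip()]

avg_fps, min_fps = fps_summary(frametimes)
print(f"average: {avg_fps:.1f} fps, minimum: {min_fps:.1f} fps")
```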
 
I didn't think many people were getting this :( I thought the Seriously achievement would be easy against PC players who had never played GoW before! Oh how wrong I am, according to this thread :p
 
I don't hear that very often around here! :eek:

Is there a UT3 demo yet? I can try it on my GTX today and tell you if he's lying.

Not sure why he would though, what would he possibly have to gain? I imagine UT3 runs a lot better than Oblivion as well, and my GTX handles 16xCSAA just fine in that to be honest.

Well, according to Tom it defaults to 4xAA, so 16xAA does not actually work.

There has been a UT3 demo for ages lol. Where do you enable 16xCSAA in Oblivion? I don't see 16xCSAA anywhere - are you using nHancer? When I had my 8800GTS I ran Oblivion with 16xSSAA OK, but that was at 1280x1024.
 
I still don't believe your almighty, beastly powerful GTX will be able to push 16xAA in UT3 or GoW if my GT apparently struggles with 4xAA.

Quit the sarcy act, it doesn't suit you well.

The GTX has masses of memory bandwidth; the GT, albeit a good card, has a much more average amount. While the GTX won't push out OMGZ11 100fps with 16xAA if it were even possible, it would still manage it - I'd guess at a fairly low FPS level.

Your GT doesn't "apparently" struggle with AA, it flat out can't do it at any playable level, as shown in the [H] review - possibly the reason you're disagreeing with it ;):p

The memory bus determines how high a resolution you can use and how much anti-aliasing you can use. As shown in DX10, the GT just doesn't cut the mustard rendering in DX10 mode with AA slapped on.
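If it helps to put numbers on the bus-width point, the arithmetic is just bus width in bytes times the effective memory clock; the clocks below are the commonly quoted stock figures for the 8800 GT and GTX, so treat them as approximate:

```python
# Peak memory bandwidth = (bus width in bytes) * (effective memory data rate).
# Clocks are the commonly quoted stock figures; overclocked cards will differ.

def bandwidth_gb_s(bus_bits, effective_mhz):
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

cards = {
    "8800 GT  (256-bit, 1800MHz effective)": (256, 1800),
    "8800 GTX (384-bit, 1800MHz effective)": (384, 1800),
}

for name, (bus_bits, clock) in cards.items():
    print(f"{name}: ~{bandwidth_gb_s(bus_bits, clock):.1f} GB/s")

# Roughly 57.6 GB/s vs 86.4 GB/s -- about a 50% gap, which is where the
# "masses of memory bandwidth" headroom for AA comes from.
```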
 
There has been a UT3 demo for ages lol.
I've been too addicted to PGR4 and The Witcher, not even played the Crysis demo yet. :eek:

Going to download the UT3 demo now and let you know how 16xCSAA runs on my GTX. :)

Where do you enable 16xCSAA in Oblivion? I don't see 16xCSAA anywhere - are you using nHancer? When I had my 8800GTS I ran Oblivion with 16xSSAA OK, but that was at 1280x1024.
Set your AA to off in Oblivion and to 16xCSAA in Nvidia drivers or nHancer.

16x in the Nvidia drivers is CSAA, it doesn't do 16xMSAA.
 
Quit the sarcy act, it doesn't suit you well.

The GTX has masses of memory bandwidth; the GT, albeit a good card, has a much more average amount. While the GTX won't push out OMGZ11 100fps with 16xAA if it were even possible, it would still manage it - I'd guess at a fairly low FPS level.

Your GT doesn't "apparently" struggle with AA, it flat out can't do it at any playable level, as shown in the [H] review - possibly the reason you're disagreeing with it ;):p

The memory bus determines how high a resolution you can use and how much anti-aliasing you can use. As shown in DX10, the GT just doesn't cut the mustard rendering in DX10 mode with AA slapped on.

I haven't tried GoW...

I'm talking about UT3. In other games - HL2, Sega Rally and a number of others - I can manage 16xCSAA or 16xAA; in Sega Rally with 16xCSAA, or whatever the AA is, I get about 47fps minimum and a max of about 120.

So now you are trying to tell me UT3 is not playable on the GT with 4xAA?
 
I don't hear that very often around here! :eek:

Is there a UT3 demo yet? I can try it on my GTX today and tell you if he's lying.

Not sure why he would though, what would he possibly have to gain? I imagine UT3 runs a lot better than Oblivion as well, and my GTX handles 16xCSAA just fine in that to be honest.


Just for the record, I never did say my GTX would handle 16xAA in UT3 - it's actually not possible. 4xAA is the max amount of samples you can have, driver tweak or not. :p
 
I haven't tried GoW...

I'm talking about UT3. In other games - HL2, Sega Rally and a number of others - I can manage 16xCSAA or 16xAA; in Sega Rally with 16xCSAA, or whatever the AA is, I get about 47fps minimum and a max of about 120.
The GT and GTX are neck and neck in just about every area in HL2 though, as it's CPU-limited.

Just for the record, I never did say my GTX would handle 16xAA in UT3 - it's actually not possible. 4xAA is the max amount of samples you can have, driver tweak or not. :p
Oh, that's lame. :( *Stops downloading UT3 demo.*
 
The GT and GTX are neck and neck in just about every area in HL2 though, as it's CPU-limited.

Oh, that's lame. :( *Stops downloading UT3 demo.*

No, the GT and GTX are not. With 16xQCSAA the GTX is about 50fps ahead at 1680x1050, all high; Lost Coast is a little less, at about 10-20.
 
I haven't tried GoW...

I'm talking about UT3. In other games - HL2, Sega Rally and a number of others - I can manage 16xCSAA or 16xAA; in Sega Rally with 16xCSAA, or whatever the AA is, I get about 47fps minimum and a max of about 120.

I'm sure you can - those games are all less heavy on the shaders and memory, so you will get away with it. This thread is about discussing GoW performance. UT3 performance will be relative to GoW's since it's the same engine and both will be similar builds of each other.

I have also never said your card couldn't flat out handle 16xAA - that would be incorrect. It's just that some games differ and rely more heavily on memory and the bus width.
 
What's QCSAA? Are you just randomly throwing quality supersampling into the mix now or what?

I thought we were talking about 16xCSAA.

There is an option called 16xQ CSAA in CSS, HL2, TF2, HL2:EP2 etc.

I'm sure you can - those games are all less heavy on the shaders and memory, so you will get away with it. This thread is about discussing GoW performance. UT3 performance will be relative to GoW's since it's the same engine and both will be similar builds of each other.

I have also never said your card couldn't flat out handle 16xAA - that would be incorrect. It's just that some games differ and rely more heavily on memory and the bus width.

So if it's relative to GoW, then [H] are BS'ing, since I know I CAN handle 4xAA in UT3 with it being perfectly playable - so if it's relative to UT3 performance, it should be playable in GoW at 4xAA.

I know AA was there, I'm not blind to jaggies.
 
There is an option called 16xQ CSAA in CSS, HL2, TF2, HL2:EP2 etc.
16xQ CSAA is nothing like 16x CSAA. The Q is for quincunx and means it's been massively supersampled, so you're bound to take a much bigger performance hit with it.

Your AA comparisons are worse than [H] benchmarks for being apples-to-oranges. :p
 
There is an option called 16xQ CSAA in CSS, HL2, TF2, HL2:EP2 etc.



So if it's relative to GoW, then [H] are BS'ing, since I know I CAN handle 4xAA in UT3 with it being perfectly playable - so if it's relative to UT3 performance, it should be playable in GoW at 4xAA.

I know AA was there, I'm not blind to jaggies.

DX9 and DX10 performance with AA is obviously going to be massively different, as the APIs handle it very differently. I believe DX10 uses shaders to apply the AA, which, if that is the case, is obviously why the GT dies a death, since shader AA hammers memory bandwidth.
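Assuming the shader-resolve theory above is right (I haven't seen it confirmed anywhere, so take it as a guess), the bandwidth angle is easy to put rough numbers on: the resolve has to read every stored colour sample for every pixel, every frame, so its traffic scales with the sample count:

```python
# Crude sketch of why a shader-based AA resolve leans on memory bandwidth:
# the resolve pass reads all of a pixel's stored colour samples each frame.
# Back-of-envelope only; ignores compression and cache reuse.

def resolve_gb_per_s(width, height, samples, fps, bytes_per_sample=4):
    sample_reads = width * height * samples * bytes_per_sample
    resolved_writes = width * height * bytes_per_sample
    return (sample_reads + resolved_writes) * fps / 1e9

for samples in (4, 16):
    traffic = resolve_gb_per_s(1680, 1050, samples, fps=60)
    print(f"{samples}x resolve at 1680x1050, 60fps: ~{traffic:.1f} GB/s for the resolve alone")

# ~2.1 GB/s at 4x vs ~7.2 GB/s at 16x for the resolve pass itself, and
# that's before the cost of writing and re-reading the fat multi-sample
# buffers through the rest of the frame.
```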
 
16xQ CSAA is nothing like 16x CSAA. The Q is for quincunx and means it's been massively supersampled.

Your AA comparisons are worse than [H] benchmarks for being apples-to-oranges. :p

I ain't got no AA comparisons, I don't have the time to do 'em.

So 16xCSAA is better than 16xQCSAA?

DX9 and DX10 performance with AA is obviously going to be massively different, as the APIs handle it very differently. I believe DX10 uses shaders to apply the AA?

I haven't tried UT3 in DX10 with AA - I was in XP. I'm dual-booting XP/Vista again now though.
 