GEARS pc performance article

Differences between 4x and 8x, or 8x and 16x?

I have some images which I will show you later (there is a definite difference between 4x and 8x), but I've got to drive back to college now, so I'll be back later.

On yer bike save the planet.
 
So if GoW can't do 4xAA with a GT, is UT3 the last game with awesome graphics the GT will do with AA? Because if it is, I'm getting a GTX.

Well, as H has shown, it isn't possible in DX10, but we all know how overrated DX10 is, so just use driver-forced AA in DX9, which will work fine on your GT (as it has done in UT3). Right, must go; back later.
 
Differences between 4x and 8x, or 8x and 16x?

I have some images which I will show you later (there is a definite difference between 4x and 8x), but I've got to drive back to college now, so I'll be back later.

8x and 16x in any game of your choice, at a decent resolution please :p (so no 128x1024!)
 
Why do people need 8x/16x AA? Do members of OcUK have fighter-pilot vision?
 
Why do people need 8x/16x AA? Do members of OcUK have fighter-pilot vision?
I for one can definitely tell the difference in image quality between 4x MSAA and 8x CSAA without getting a magnifying glass, and they have roughly the same performance hit, so you're basically getting better image quality for free. I agree with you about 16x CSAA, though, and as such I pretty much never use it.
 
Except on screenshots I don't notice anything over 4xAA :p

Running the UT3 demo and I barely notice having AA off! lol


(Running in 1920x1200 btw ;))
 
The GTX has higher memory bandwidth and more memory, so I would have thought it would handle 16xAA better than the GT, if it were possible in the game (albeit at **** poor fps). It's well known that you take the FPS hit but the IQ won't change when you raise the AA level past 4x; look at the review, the highest AA level available in this game is 4xAA, even in DX10.

While this may well be true, if I still get the performance hit of running 16xAA then it is quite possible to play UT3 on an 8800GT at 16xAA, as I have done (even if it hasn't made a visible difference).

All at the same scene -

No AA, solid 62fps (Vsync)
4xAA, 54fps
16xAA, 43fps

So ner :p
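For anyone wondering how big those hits actually are, a quick sketch of the percentage cost worked out from the numbers above (the 62fps vsync'd figure is taken as the baseline):

```python
# Percentage frame-rate cost of each AA level, relative to the
# no-AA baseline quoted above (62fps with vsync).
def fps_drop(base_fps, aa_fps):
    # Fractional drop, expressed as a percentage of the baseline.
    return (base_fps - aa_fps) / base_fps * 100

print(f"4xAA:  {fps_drop(62, 54):.1f}% slower")   # 12.9% slower
print(f"16xAA: {fps_drop(62, 43):.1f}% slower")   # 30.6% slower
```

So going from 4x to 16x costs more than twice the frames, for a difference you apparently can't see.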

The only reason people think the GT can keep up with the GTX is that EVERY review that came out at release was completely bogus, and people still haven't caught on. Every single review on every site deviated from its normal benchmarking patterns. Anandtech didn't put all their GTX/GTS/GT/2900 numbers on the same pages, and when you compare those numbers to their previous reviews for HL2, UT3 and BioShock, you can see they've taken numbers from various different levels, which give quite different performance, and tried to make them look like the same levels. It's plain to see if you check the UT3 numbers: they've taken GTS numbers from the fastest map, so the GTS (at least in the Anandtech review) looks fairly close to the GT, but just behind, because there's no AA. They've taken middle-of-the-range GTX numbers to make sure it's ahead of the GT, but not too far; costs on the 90nm part are massively higher, and they simply don't want to make GTXs any more. Then they took 2900XT numbers from the lowest-performing map. There's no two ways about this: on the lowest-performing map the 2900XT is 30% ahead of the GTS, and on the highest-performing map the 2900XT is 30% ahead of the GTS, but put the 2900XT's lowest-map numbers against the GTS's highest-map numbers and the GTS seems almost on par with the 2900XT.

EVERY single site has ignored AA, used weird resolutions, and separated out numbers with no indication that they were all done on the same maps; in Anandtech's case, they clearly weren't.

The GT is nowhere near as good as people think. It's better than the GTS without AA, and even fairly close to the GTX at lower resolutions with no AA; with AA the 640MB GTS often beats it, and with lots of AA the 640MB GTS and the GTX pretty much always beat the GT. With the GTS only £10 more now, it's the better card for everything but power consumption.

Now reviews are slowly starting to show the small differences between the GT and GTS WITHOUT AA. You can bet the GT's 2fps lead at 1600x1200 with no AA is completely gone when AA is enabled; simple as that, the GT has a lot less memory bandwidth.
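To put a rough number on that bandwidth gap, here's a quick sketch; the bus widths and effective memory clocks are the reference-card specs as I remember them, so treat them as assumptions rather than gospel:

```python
# Rough memory-bandwidth comparison for the cards being argued about.
# Specs assumed: reference bus width (bits) and effective memory clock (MHz).
def bandwidth_gbs(bus_bits, eff_clock_mhz):
    # GB/s = (bus width in bytes) * effective transfers per second
    return bus_bits / 8 * eff_clock_mhz * 1e6 / 1e9

cards = {
    "8800GT (256-bit @ 1800MHz)":      bandwidth_gbs(256, 1800),  # 57.6 GB/s
    "8800GTS 640 (320-bit @ 1600MHz)": bandwidth_gbs(320, 1600),  # 64.0 GB/s
    "8800GTX (384-bit @ 1800MHz)":     bandwidth_gbs(384, 1800),  # 86.4 GB/s
}
for name, bw in cards.items():
    print(f"{name}: {bw:.1f} GB/s")
```

If those clocks are right, the GT gives up around 10% of the GTS 640's bandwidth and a third of the GTX's, which is exactly where heavy AA hurts.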

Did you actually read the reviews? It beats the 640MB in everything as far as I can see (apart from Ep2 at low res; it beats it at higher resolutions), with or without AA, at high and low resolutions.

So, you own a GTS 640 or the GTX?
 
While this may well be true, if I still get the performance hit of running 16xAA then it is quite possible to play UT3 on an 8800GT at 16xAA, as I have done (even if it hasn't made a visible difference).

All at the same scene -

No AA, solid 62fps (Vsync)
4xAA, 54fps
16xAA, 43fps

So ner :p



Did you actually read the reviews? It beats the 640MB in everything as far as I can see (apart from Ep2 at low res; it beats it at higher resolutions), with or without AA, at high and low resolutions.

So, you own a GTS 640 or the GTX?

What res, 640x480? :eek:
 
You sure it's even working properly? Last I read, in the UT3 demo with AA on, some things were getting AA and some weren't. I've not tried it myself. :confused:
 
Yeah, quite sure...



No AA @ 200% zoom

noAA.jpg


16xAA (4xAA according to Tom|NBK, but it was set to 16xAA and took the performance hit) @ 200% zoom

16xAA.jpg
 
Why oh why didn't R6 Vegas run as well as UT3? OK, the level structure in Vegas was more complex, but that still doesn't explain why it ran 10x slower. :(

Because they didn't use the optimised PC version of the engine; they used the console one and then made small tweaks so it would run on the PC. But, as you stated, not very well at all, hence it being a classic buggy console port.

Oh, and Gears uses a newer version of the engine with more features and tweaks.
 
I'm shocked at the number of people who don't like HardOCP's review methods. Why would you rather know the raw FPS than the maximum eye-candy settings your card can manage? Smacks of people stuck in a 9700-and-earlier mindset to me.
 