Gears of War PC performance article

16xQ CSAA should look better than 16x CSAA but should also take a much bigger performance hit.
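For what it's worth, the difference between the two comes down to how many full colour/depth samples back up the 16 coverage samples. Rough sketch of the usual pairings below; the numbers are my understanding of NVIDIA's naming, not something taken from the game or the drivers, so treat them as a guide only.

Code:
# Rough mapping of NVIDIA AA mode names to (coverage samples, full colour/depth samples).
# These pairings are my understanding of how CSAA is usually described (not taken
# from the game), so treat them as illustrative.
csaa_modes = {
    "4x MSAA":   (4, 4),
    "8x CSAA":   (8, 4),
    "8xQ":       (8, 8),
    "16x CSAA":  (16, 4),
    "16xQ CSAA": (16, 8),
}

for name, (coverage, colour) in csaa_modes.items():
    print(f"{name:>10} -> {coverage} coverage samples, {colour} colour/depth samples")

16xQ keeps twice the colour/depth samples of plain 16x, which is where the extra performance hit comes from.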

Personally I'd stick with regular CSAA as it's great.

16xQ CSAA gives me an average of 176 fps in the CSS Stress Test, so I'm happy using that in TF2, CSS, HL2:EP2, Portal etc.
 
That's because those Source games are all massively CPU-limited. If you see any difference in performance between a GT and a GTX in them, I would honestly be surprised.
 
DX10 On/Antialiasing


In order to enable antialiasing you must select the “On/Antialiasing” option. When this option is selected the game uses 4X AA by default, and there are no in-game options to change the level of AA being used. We looked at the “WarEngineUserSettings.ini” file (which contains the user quality settings) and found that “MaxMultisamples=4” was being used. We tried to change this value to 2, but every time we started the game it defaulted back to 4. We also tried overriding AA from the control panel, but this did not work either. Therefore the only AA option available to us was the default 4X AA with the DX10 On/Antialiasing option. There are also no in-game options for selecting the anisotropic filtering level, but we had success forcing 16X AF from the control panel on each video card.
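If anyone wants to poke at that setting themselves, here's a minimal sketch; the file location varies by install, so the path is an assumption you'll need to adjust.

Code:
# Minimal sketch: report MaxMultisamples from the Gears of War user settings file.
# The path is an assumption; point it at your own install's Config folder.
from pathlib import Path

ini_path = Path("WarEngineUserSettings.ini")  # assumed location, adjust as needed

text = ini_path.read_text()
for line in text.splitlines():
    if line.strip().lower().startswith("maxmultisamples"):
        print("Current setting:", line.strip())

# As the article found, a manual edit like the one below gets reset to 4 the next
# time the game starts, so don't expect it to stick:
# ini_path.write_text(text.replace("MaxMultisamples=4", "MaxMultisamples=2"))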

See... the game is hard-locked to 4x, which will more than likely be the case in DX9 as well since they share the same INI files, although obviously the API calls will differ between DX9 and DX10.
 
I haven't got any AA comparisons; I don't have the time to do them.

So 16x CSAA is better than 16xQ CSAA?



I haven't tried UT3 in DX10 with AA, I was in XP; I'm dual-booting XP/Vista again now though.

The UT3 demo is DX9 only, so save yourself the hassle and don't bother :)

16xQ is the better AA level and it's the one I always aim for if possible.
 
That's because those Source games are all massively CPU-limited. If you see any difference in performance between a GT and a GTX in them, I would honestly be surprised.

Well, Tom did a comparison, and I got 176; slightly higher overclocked I got 188 average. This was in XP, 1680x1050, everything on high with 16xQ CSAA, and Tom got around 250 I think. I've checked my performance and it's on par with other people's GTs, although my CPU is at 3.4 GHz and his is at 3.6, but surely that won't make a 60 fps difference :confused:
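Quick back-of-envelope check, assuming the stress test is completely CPU-limited and fps scales linearly with clock speed (a generous assumption):

Code:
# Back-of-envelope: if fps scaled linearly with CPU clock, how much of the gap
# between 176 fps at 3.4 GHz and ~250 fps at 3.6 GHz could the clock explain?
# Linear scaling with clock is an assumption, and a generous one.
my_fps, my_clock = 176, 3.4
toms_fps, toms_clock = 250, 3.6

predicted = my_fps * (toms_clock / my_clock)
print(f"Clock scaling alone predicts ~{predicted:.0f} fps")      # roughly 186
print(f"Gap left unexplained: ~{toms_fps - predicted:.0f} fps")  # roughly 64

So the 0.2 GHz clock difference only accounts for about 10 fps of it.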
 
Q FOR THE LOSE! :p

Also DX10 FOR THE LOSE! Look at the difference between DX10 and 9 in GoW: no difference except the performance loss. DX10 is a waste of time, just there to get people to move to Vista, although I moved to Vista because it's just better. I don't care about DX10 any more really, although if it's smooth I might as well run it, lol, but no big deal if it's laggy.

Are we still arguing over insanely high AA levels that don't make any difference? lol

I'm sorry, but from 4x AA to 8x AA there is a difference; from 8x AA to 16x AA there is not much difference, but I use it because I can and it just makes you think there is a difference.
 
The only reason people think the GT can keep up with the GTX is that EVERY review that came out at release was completely bogus, and people still haven't caught on. Every single review on every site deviated from its normal benchmarking patterns. AnandTech didn't put all their GTX/GTS/GT/2900 numbers on the same pages, and when you compare those numbers to their previous reviews for HL2, UT3 and BioShock you can see they've taken numbers from different levels, which give quite different performance, and tried to make it look like they ran the same levels. It's plain to see if you check the UT3 numbers: they've taken GTS numbers from the fastest map, so the GTS (at least in the AnandTech review) looks fairly close to the GT but just behind, because there's NO AA. They've taken middle-of-the-range GTX numbers to make sure it stays ahead of the GT, but not too far ahead (costs on the 90nm part are massively higher; they simply don't want to make GTXs any more). Then they took 2900XT numbers from the lowest-performing map. There's no two ways about this: on the lowest-performing map the 2900XT is 30% ahead of the GTS, and on the highest-performing map the 2900XT is 30% ahead of the GTS, but comparing the 2900XT's lowest map against the GTS's highest makes the GTS seem almost on par with the 2900XT.

Every single site has ignored AA, used weird resolutions, and separated out numbers with no indication of whether they were all done on the same maps, which in AnandTech's case they clearly weren't.

The GT is nowhere near as good as people think. It's better than the GTS without AA, and even fairly close to the GTX at lower resolutions with no AA. With AA the 640MB GTS often beats it, and with lots of AA the 640MB GTS and the GTX pretty much always beat the GT. With the GTS only being £10 more now, it's the better card for everything but power consumption.

Now reviews are slowly starting to show the small differences between the GT and GTS WITHOUT AA. You can bet the GT's 2 fps lead at 1600x1200 with no AA is completely gone once AA is enabled; simple as that, the GT has a lot less memory bandwidth.
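To show the map-mixing trick with some invented numbers (these fps figures are made up purely to illustrate the effect, they're not from any review):

Code:
# Invented numbers, purely to illustrate how mixing maps skews a comparison.
gts = {"fast_map": 90.0, "slow_map": 60.0}
hd2900xt = {name: fps * 1.30 for name, fps in gts.items()}  # "30% ahead on every map"

# Like-for-like: the 2900XT leads by the same 30% on either map.
print(hd2900xt["fast_map"] / gts["fast_map"])  # 1.3
print(hd2900xt["slow_map"] / gts["slow_map"])  # 1.3

# Mixed maps: quote the GTS on its fastest map against the 2900XT on its slowest
# map and the lead vanishes; the GTS even appears to be ahead.
print(hd2900xt["slow_map"] / gts["fast_map"])  # ~0.87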
 
I'm sorry, but from 4x AA to 8x AA there is a difference; from 8x AA to 16x AA there is not much difference, but I use it because I can and it just makes you think there is a difference.


holy contradictory overload batman!!!!

Will, I never said 8x FSAA wasn't an improvement over 4x (which actually is debatable, considering the umpteen million different FSAA modes we have now), but 16x FSAA is just pointless. It's for people who have an FSAA e-peen; I don't care for it. Now come on, don't try to convince me of anything; I've had a GTX for almost a year now, don't forget. I like my games pretty and fast.
 
The GT is nowhere near as good as people think. It's better than the GTS without AA, and even fairly close to the GTX at lower resolutions with no AA. With AA the 640MB GTS often beats it, and with lots of AA the 640MB GTS and the GTX pretty much always beat the GT. With the GTS only being £10 more now, it's the better card for everything but power consumption.

Now reviews are slowly starting to show the small differences between the GT and GTS WITHOUT AA. You can bet the GT's 2 fps lead at 1600x1200 with no AA is completely gone once AA is enabled; simple as that, the GT has a lot less memory bandwidth.

Well, I know one thing: with AA my GT is faster than the GTS 640 I had (I owned them at different times), and the GT is miles better, with or without AA, than my 2900XT as well. And judging by the benchmarks in Crysis with no AA, the GT is 2 fps slower than a GTX.
 
Well, Tom did a comparison, and I got 176; slightly higher overclocked I got 188 average. This was in XP, 1680x1050, everything on high with 16xQ CSAA, and Tom got around 250 I think. I've checked my performance and it's on par with other people's GTs, although my CPU is at 3.4 GHz and his is at 3.6, but surely that won't make a 60 fps difference :confused:

Nah, I only got 215 fps average, Will.

The Source results are posted up in the GT overclocking section, Ulf, if you CBA to sift through all the "OMGZBBQ1 LOL GT" chat :p
 
So if GoW can't do 4x AA with a GT, is UT3 the last game with awesome graphics the GT will manage with AA? Because if it is, I'm getting a GTX.
 