The way I see it, in DX10 MSAA also needs some sort of shader AA resolve alongside it; I'm not entirely sure, but perhaps the 2900s are just better at this. Don't hold me to it though, my knowledge of how the DX10 API works is very vague.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
That doesn't sound logical, but who knows.

Tom|Nbk said:
The way I see it, in DX10 MSAA also needs some sort of shader AA resolve alongside it; I'm not entirely sure, but perhaps the 2900s are just better at this. Don't hold me to it though, my knowledge of how the DX10 API works is very vague.
Dutch Guy said:
I guess the ATI card is better at shader AA, as that is what the card uses instead of MSAA, and as such ATI optimised that performance while Nvidia optimised their drivers for MSAA; maybe Boogle knows more about this?
It seems pixel shaders have access to something to do with MSAA in DX10, but beyond that I know nothing 
Looks like more work for the Nvidia Vista driver team; I am glad I don't work in that department.

Boogle said:
I have no idea. It seems pixel shaders have access to something to do with MSAA in DX10, but beyond that I know nothing.

mmj_uk said:
If AMD were really clever they'd have made it so that upon detecting an NVidia card it would render the whole benchmark (not just AA) in software mode.
Would probably be a bit TOO obvious though.
And there have been other instances in the past where changing the executable name changed the performance. I think both ATI and Nvidia did this in the past, but luckily they stopped doing it a long time ago.

Cyber-Mav said:
ATI did this before with Half-Life 2: when Nvidia cards ran the Nvidia shader path they got shafted on speed big time. But then someone forced Nvidia cards to run the ATI shader path, and image quality was the same as the Nvidia shader path yet speed increased phenomenally.
It was a bad day for Valve back then, since it created an uproar, and they say piracy of HL2 went up by bucketloads since no one wanted to pay money for a game that purposely ran crippled on their hardware.
http://www.theinquirer.net/default.aspx?article=40401

Dutch Guy said:
That doesn't sound logical, but who knows.
I guess the ATI card is better at shader AA, as that is what the card uses instead of MSAA, and as such ATI optimised that performance while Nvidia optimised their drivers for MSAA; maybe Boogle knows more about this?
All DirectX 10 graphics hardware which supports MSAA is required to expose a feature called 'shader-assisted MSAA resolves' whereby a pixel shader can be used to access all of the individual samples for every pixel. This allows the graphics engine to introduce a higher quality custom MSAA resolve operation. The DirectX 10 version of 'Call of Juarez' leverages this feature to apply HDR-correct MSAA to its final render, resulting in consistently better anti-aliasing for the whole scene regardless of the wide variations in intensity present in HDR scenes.
Dutch Guy said:
Looks like AMD wanted to make the HD2900XT look a little better than the 8800 cards, "3DMark optimising" all over again!
All the things HD2900XT cards are worse at are disabled.

Dutch Guy said:
And there have been other instances in the past where changing the executable name changed the performance. I think both ATI and Nvidia did this in the past, but luckily they stopped doing it a long time ago.