
CoJ DX10 Nvidia Press Release

The way I see it, in DX10 MSAA also needs some sort of shader AA resolve alongside it. I'm not entirely sure, but perhaps the 2900s are just better at this. Don't hold me to it though, my knowledge of how the DX10 API works is very vague.
 
Tom|Nbk said:
The way I see it, in DX10 MSAA also needs some sort of shader AA resolve alongside it. I'm not entirely sure, but perhaps the 2900s are just better at this. Don't hold me to it though, my knowledge of how the DX10 API works is very vague.
That doesn't sound logical but who knows.

I guess the ATI card is better at shader AA, as that is what the card uses instead of MSAA, so ATI optimised for that while Nvidia optimised their drivers for MSAA. Maybe Boogle knows more about this?
 
Dutch Guy said:
I guess the ATI card is better at shader AA, as that is what the card uses instead of MSAA, so ATI optimised for that while Nvidia optimised their drivers for MSAA. Maybe Boogle knows more about this?

I have no idea :( It seems pixel shaders have access to something to do with MSAA in DX10, but beyond that I know nothing :p
 
Boogle said:
I have no idea :( It seems pixel shaders have access to something to do with MSAA in DX10, but beyond that I know nothing :p
Looks like more work for the Nvidia Vista driver team, I am glad I don't work in that department :p
 
mmj_uk said:
If AMD were really clever they'd have made it so that upon detecting an NVidia card it would render the whole benchmark (not just AA) in software mode. ;)

Would probably be a bit TOO obvious though. :p


ATI did this before with Half-Life 2: when Nvidia cards ran the Nvidia shader path they got shafted on speed big time. But then someone forced Nvidia cards to run the ATI shader path and image quality was the same as the Nvidia shader path, yet speed increased phenomenally.
It was a bad day for Valve back then since it created an uproar, and they say piracy of HL2 went up by bucket loads since no one wanted to pay money for a game that purposely ran crippled on their hardware.
 
Cyber-Mav said:
ATI did this before with Half-Life 2: when Nvidia cards ran the Nvidia shader path they got shafted on speed big time. But then someone forced Nvidia cards to run the ATI shader path and image quality was the same as the Nvidia shader path, yet speed increased phenomenally.
It was a bad day for Valve back then since it created an uproar, and they say piracy of HL2 went up by bucket loads since no one wanted to pay money for a game that purposely ran crippled on their hardware.
And there have been other instances in the past where changing the executable name changed the performance. I think both ATI and Nvidia did this at one point, but luckily they stopped doing it a long time ago.
 
Dutch Guy said:
That doesn't sound logical but who knows.

I guess the ATI card is better at shader AA, as that is what the card uses instead of MSAA, so ATI optimised for that while Nvidia optimised their drivers for MSAA. Maybe Boogle knows more about this?
http://www.theinquirer.net/default.aspx?article=40401
All DirectX 10 graphics hardware which supports MSAA is required to expose a feature called 'shader-assisted MSAA resolves' whereby a pixel shader can be used to access all of the individual samples for every pixel. This allows the graphics engine to introduce a higher quality custom MSAA resolve operation. The DirectX 10 version of 'Call of Juarez' leverages this feature to apply HDR-correct MSAA to its final render, resulting in consistently better anti-aliasing for the whole scene regardless of the wide variations in intensity present in HDR scenes.
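The HDR angle in that quote is the key point: averaging raw HDR samples and then tone-mapping gives a different (and worse) edge result than tone-mapping each sample first, which is exactly what a custom per-sample resolve lets an engine do. A minimal sketch of the arithmetic, using made-up sample values and a simple Reinhard tone-map operator (not real D3D10 API code):

```python
# Illustrative sketch: why a shader-based per-sample MSAA resolve matters
# for HDR. Sample values and the tone-map operator are assumptions for
# demonstration, not taken from Call of Juarez.

def tone_map(x):
    # Simple Reinhard operator: maps HDR intensity into [0, 1)
    return x / (1.0 + x)

# 4x MSAA samples for one pixel straddling an HDR edge
# (three dim samples, one very bright one)
samples = [0.1, 0.1, 0.1, 20.0]

# Fixed-function style resolve: average raw samples, then tone-map
plain = tone_map(sum(samples) / len(samples))

# Shader-assisted custom resolve: tone-map each sample, then average
custom = sum(tone_map(s) for s in samples) / len(samples)

print(round(plain, 3), round(custom, 3))  # 0.835 0.306
```

The bright sample dominates the plain resolve, washing the edge out to near-white, while the per-sample resolve keeps the dim samples' contribution and produces the smooth gradient the press release calls "HDR-correct" anti-aliasing.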
 
Dutch Guy said:
Looks like AMD wanted to make the HD2900XT look a little better than the 8800 cards, "3DMark optimising" all over again :(

All the things HD2900XT cards are worse at are disabled.


This reminds me of the HL2 scam pulled by Valve back in the day. I expect this game to go the same way if they purposely cripple its performance on Nvidia cards. It's gonna end up being pirated like mad :(
 
Dutch Guy said:
And there have been other instances in the past where changing the executable name changed the performance. I think both ATI and Nvidia did this at one point, but luckily they stopped doing it a long time ago.


Yes, you're right. Although they still do it, they now give you an option to turn it off, like Catalyst AI and Nvidia's trilinear, mipmap and other optimisations you can enable and disable.
 