
Nvidia causes DX10.1 to be removed from Assassin's Creed?

Soldato | Joined: 17 Aug 2005 | Posts: 4,297
I posted this here because the main reason we buy video cards is for the games, and new cards offer new effects etc.

Interesting article, and I don't think Nvidia and Ubisoft would tell the truth if the DX10.1 code was really removed because Nvidia didn't want to promote the game under its "The Way It's Meant to Be Played" banner while the DX10.1 code lets ATI cards run much quicker. Obviously Ubisoft would like all the extra promotion it can get, and so de facto the DX10.1 code gets removed, even if it wasn't due to an implicit agreement between the two companies.

We have been following a brewing controversy over the PC version of Assassin's Creed and its support for AMD Radeon graphics cards with DirectX 10.1 for some time now. The folks at Rage3D first broke this story by noting some major performance gains in the game on a Radeon HD 3870 X2 with antialiasing enabled after Vista Service Pack 1 is installed—gains of up to 20%. Vista SP1, of course, adds support for DirectX version 10.1, among other things. Rage3D's Alex Voicu also demonstrated some instances of higher quality antialiasing—some edges were touched that otherwise would not be—with DX10.1. Currently, only Radeon HD 3000-series GPUs are DX10.1-capable, and given AMD's struggles of late, the positive news about DX10.1 support in a major game seemed like a much-needed ray of hope for the company and for Radeon owners.

After that article, things began to snowball, first with confirmation that Assassin's Creed did indeed ship with DX10.1 support, and then with Ubisoft's announcement about a forthcoming patch for the game. The announcement included a rather cryptic explanation of why the DX10.1 code improved performance, but strangely, it also said Ubisoft would be stripping out DX10.1 in the upcoming patch.
TR: Is this "render pass during post-effect" somehow made unnecessary by DirectX 10.1?

Beauchemin: The DirectX 10.1 API enables us to re-use one of our depth buffers without having to render it twice, once with AA and once without.

TR: What other image quality and/or performance enhancements does the DX10.1 code path in the game offer?

Beauchemin: There is no visual difference for the gamer. Only the performance is affected.

TR: What specific factors led to DX10.1 support's removal in patch 1?

Beauchemin: Our DX10.1 implementation was not properly done and we didn't want the users with Vista SP1 and DX10.1-enabled cards to have a bad gaming experience.

TR: Finally, what is the future of DX10.1 support in Assassin's Creed? Will it be restored in a future patch for the game?

Beauchemin: We are currently investigating this situation.

So we have confirmation that the performance gains on Radeons in DirectX 10.1 are indeed legitimate. The removal of the rendering pass is made possible by DX10.1's antialiasing improvements and should not affect image quality. Ubisoft claims it's pulling DX10.1 support in the patch because of a bug, but is non-committal on whether DX10.1 capability will be restored in a future patch for the game.


http://techreport.com/discussions.x/14707
 