Nvidia causes DX10.1 to be removed from Assassin's Creed?

It doesn't say DX10.1 in dxdiag; it still shows as running DX10.

Shame we won't get the improvements shown in this DX10.1 FAQ (on page 3), as it's never going to get used now that Nvidia's going to be paying developers not to use it. :(

http://www.elitebastards.com/cms/index.php?option=com_content&task=view&id=103&Itemid=29&limit=1&limitstart=0

Bet you're all glad you have dual/quad-cores now, eh. :p

I think he has something against DX10.1; he's made a few weird references to it.

I'm starting to think NV can't do DX10.1. I thought they loved to get on the latest feature bandwagons? The Assassin's Creed business shows it's not irrelevant and pointless. The fact that NV is so against it really makes it seem like they don't like it because they haven't cracked it yet, which brings back the original story about NV lobbying for the original DX10 spec to be changed because they couldn't do that either. I think DX10.1 is what DX10 was meant to be in the first place, which would explain why DX10 so far has been pretty underwhelming.
 
I think the problem is that DX10.1's specification says you MUST do anti-aliasing on the shaders, in order to give the developer more control over image quality. AMD's current architecture allows this, whereas nVidia's takes the faster, more traditional route of doing it on the ROPs... It may well be that doing AA that way on nVidia's current cards carries a huge performance hit that they don't want getting out. Of course, that's all just speculation, but it doesn't seem unlikely.
 
That would explain why the 3800s get a performance boost on DX10.1, and it would also explain the lack of support/interest from Nvidia. Also, does this mean that 2900s are technically capable of DX10.1?
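Whether the spec really mandates shader-based AA is speculation, but one documented DX10.1 addition in this area is shader access to a multisampled depth buffer, which is reportedly what the Assassin's Creed DX10.1 path used to save a rendering pass with AA enabled. A rough sketch of what that looks like at the API level, assuming a 10.1 device and a 4x MSAA depth target (the helper name and setup here are illustrative, not Ubisoft's actual code):

#include <d3d10_1.h>

// Illustrative sketch: on D3D10.1 a *multisampled* depth buffer can also be
// bound as a shader resource, so a shader can read per-sample depth directly
// instead of the engine re-rendering depth into a separate readable texture.
HRESULT MakeReadableMsaaDepth(ID3D10Device1* dev, UINT w, UINT h,
                              ID3D10Texture2D** tex,
                              ID3D10DepthStencilView** dsv,
                              ID3D10ShaderResourceView1** srv)
{
    D3D10_TEXTURE2D_DESC td = {};
    td.Width = w;
    td.Height = h;
    td.MipLevels = 1;
    td.ArraySize = 1;
    td.Format = DXGI_FORMAT_R24G8_TYPELESS;   // typeless so two views can alias it
    td.SampleDesc.Count = 4;                  // 4x MSAA
    td.Usage = D3D10_USAGE_DEFAULT;
    td.BindFlags = D3D10_BIND_DEPTH_STENCIL | D3D10_BIND_SHADER_RESOURCE;
    HRESULT hr = dev->CreateTexture2D(&td, nullptr, tex);
    if (FAILED(hr)) return hr;

    D3D10_DEPTH_STENCIL_VIEW_DESC dd = {};
    dd.Format = DXGI_FORMAT_D24_UNORM_S8_UINT;        // written as depth/stencil
    dd.ViewDimension = D3D10_DSV_DIMENSION_TEXTURE2DMS;
    hr = dev->CreateDepthStencilView(*tex, &dd, dsv);
    if (FAILED(hr)) return hr;

    D3D10_SHADER_RESOURCE_VIEW_DESC1 sd = {};
    sd.Format = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;    // read back as 24-bit depth
    sd.ViewDimension = D3D10_1_SRV_DIMENSION_TEXTURE2DMS;
    // On a plain 10.0 feature level, SRV creation over MSAA depth typically
    // fails, which is why the 10.0 path needs the extra pass.
    return dev->CreateShaderResourceView1(*tex, &sd, srv);
}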
 
You've got to be ******* joking me?! A 3850 Pro for about £88 has this? Lol, even the 256MB version is only £76 and has this feature as well.

If a feature that's on a £76 card can boost performance, then why, oh why have Nvidia not included it? Is this DX10.1 spec 100% concrete?

The 3450 can do it also, and that's sub-£30 :D
 
I think he has something against DX10.1; he's made a few weird references to it.


Against it, no.
It's just that I only have DX10.0, so I couldn't care less about support for something that doesn't exist.

"It doesn't say 10.1 in dxdiag"... because it's DX10.0.
When I went from 9a to 9b, dxdiag reflected the change. MS putting out new versions without changing the version (or even build) number? Not tremendously likely, is it?

I'm sure when DX10.1 is available in Windows Update, everything will be fluffy kittens; until then, I don't give a stuff. (And the "Assassin's Creed crashed all the time without DX10.1" claims are just incorrect... unless... maybe I've GOT super-secret DX10.1 after all, and that's why the game runs flawlessly for hours.)

FUD, much much FUD.
 

Wow :eek:. Are you actually suggesting that DX10.1 doesn't exist? :confused: If it doesn't exist, where did the performance increases in Assassin's Creed come from? Maybe this is an example of someone disregarding something because they haven't seen or experienced it themselves. :rolleyes:
 
No, I was just thinkin with me arse really.

Assumed that since I was still seeing my DX reported as 10, 10.1 was still a twinkle in an MS beta tester's eye, or was on release only to developers, or, errrm, something.

I think folks have tried to explain it to me before but failed, as my brain is such a small target. I now comprehend.
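For what it's worth, dxdiag reporting "DirectX 10" doesn't settle much either way: DX10.1 ships with Vista SP1, and whether it's usable on a given machine comes down to whether a feature-level-10.1 device can actually be created. A minimal sketch of the probe a game might do (the helper name is illustrative):

#include <d3d10_1.h>
#pragma comment(lib, "d3d10_1.lib")

// Try for a DX10.1 device first, then fall back to plain DX10.0.
// Creation only succeeds at 10.1 if the runtime, driver and GPU all support it.
ID3D10Device1* CreateBestDevice()
{
    const D3D10_FEATURE_LEVEL1 levels[] = {
        D3D10_FEATURE_LEVEL_10_1,   // needs a 10.1-capable card (e.g. HD 3000 series)
        D3D10_FEATURE_LEVEL_10_0,   // fallback for 10.0-only hardware
    };
    for (D3D10_FEATURE_LEVEL1 level : levels)
    {
        ID3D10Device1* device = nullptr;
        HRESULT hr = D3D10CreateDevice1(
            nullptr,                     // default adapter
            D3D10_DRIVER_TYPE_HARDWARE,
            nullptr,                     // no software rasterizer
            0,                           // no creation flags
            level,
            D3D10_1_SDK_VERSION,
            &device);
        if (SUCCEEDED(hr))
            return device;               // caller can confirm via GetFeatureLevel()
    }
    return nullptr;
}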
 
http://www.fudzilla.com/index.php?option=com_content&task=view&id=7615&Itemid=1

When asked about the plans of getting the D3D 10.1 back to Assassin's Creed, Ubisoft responds that there is no plan to re-implement support for DX10.1. Like we expected anything different from a TWIMTBP title.

Pretty obvious Nvidia wouldn't let them put it back in. :p

Also pretty obvious that if the GT200s had DX10.1 added, the above would have been something like this:

When asked about the plans of getting the D3D 10.1 back to Assassin's Creed, Ubisoft responds that there are plans to re-implement support for DX10.1 in another forthcoming patch, which they are hoping to have ready sometime in June.

:D
 

Absolute class m8 :D (applauds)

 
Nothing like a bit of corporate bullying to keep your products in the lead. Given that Nvidia can't seem to get DX10.1 working on their cards, maybe people should remember this when they go on about how a lack of competition has allowed Nvidia to take it easy. Maybe not being able to implement DX10.1 also means the G280, or whatever it's called, has problems that we don't know about yet.
 
The reason for the removal of DX10.1 from Assassin's Creed:

IMPORTANT NOTE: The in-game "Graphic Quality" setting does not successfully enable AF on the AMD's ATI Radeon HD 3870 series of video cards. The only way to enable AF on these video cards, therefore, is to force it from the control panel. However, forcing AF caused our performance to plummet during testing. Let's hope that Ubisoft sees fit to fix this problem in a future patch.

The problem lies in the DX10.1 code: when DX10.1 mode is enabled, AF is disabled, which artificially gives the game a large performance increase due to the lack of AF. Hence the patch was released to disable DX10.1 until a workaround can be sorted out for the anisotropic filtering.

This has been posted on the Rage3D forums, and even on the Ubisoft forums. Nvidia has had nothing to do with the removal of DX10.1 from this game.
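For context on how AF can silently vanish in one code path: in D3D10, anisotropic filtering is just per-sampler state set by the renderer, so a path whose samplers are created with a different filter loses AF (and gains frames) without any error. A minimal sketch, with an illustrative helper name:

#include <d3d10_1.h>

// Illustrative: AF is enabled per sampler. A code path that creates its
// samplers with, say, D3D10_FILTER_MIN_MAG_MIP_LINEAR instead simply
// renders without AF - faster, but not a like-for-like comparison.
ID3D10SamplerState* MakeAnisoSampler(ID3D10Device* dev, UINT maxAniso)
{
    D3D10_SAMPLER_DESC sd = {};
    sd.Filter = D3D10_FILTER_ANISOTROPIC;
    sd.AddressU = D3D10_TEXTURE_ADDRESS_WRAP;
    sd.AddressV = D3D10_TEXTURE_ADDRESS_WRAP;
    sd.AddressW = D3D10_TEXTURE_ADDRESS_WRAP;
    sd.MaxAnisotropy = maxAniso;              // 1..16
    sd.ComparisonFunc = D3D10_COMPARISON_NEVER;
    sd.MaxLOD = D3D10_FLOAT32_MAX;            // allow the full mip chain
    ID3D10SamplerState* sampler = nullptr;
    dev->CreateSamplerState(&sd, &sampler);   // 'sampler' stays null on failure
    return sampler;
}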
 
Sure, it's an Nvidia game, whatever it's called, and normally it would be patched and fixed, not removed, but that's just my feeling. It's like DX10 was meant to be 10.1, but you can't blame them; almost all companies would do the same.
 
I do wonder how the 2900 would have performed if DX10 had been left as it was originally going to be, as the 3*** series did seem to get a boost. But it's meant to be buggy code, and to be honest, if I had a 3*** card and it worked OK, I don't think I'd care, as long as I was getting a big increase in frames.
 
This has been posted on the Rage3D forums, and even on the Ubisoft forums. Nvidia has had nothing to do with the removal of DX10.1 from this game.
So why not fix it rather than completely remove it? Or at least give people the choice to use it. Inform them what will happen if they turn it on, then leave it up to the user. I mean, isn't that what the graphics options are all about - allowing you to sacrifice image quality for speed?

I still see no reason to forcibly remove it, and then state there are no plans to ever reintroduce it.
 
So why not fix it rather than completely remove it?

Exactly, but that's the thing: they can't. It had to be removed because it's an NVIDIA 'The Way It's Meant To Be Played' game, and games in that programme are written to take advantage of Nvidia features, so they wouldn't have DX10.1 in them if Nvidia's cards can't do it. So everyone trying to say it wasn't Nvidia who had it removed is wrong, as it clearly was; anyone with half a brain can see that. :p
 
"So why not fix it rather than completely remove it"

At the minute Nvidia seem unable to implement DX10.1 in their cards, and this gave the ATI cards a speed boost in a game that is meant to be totally Nvidia's. It must have been a little embarrassing for Nvidia to have a TWIMTBP title that performed better on their rival's hardware with DX10.1 implemented.
 