There's no proof Nvidia were directly involved in DX10.1 support being removed from AC. From screenshots at Rage3D and [H] I can see a drop in IQ: contrast looks messed up and textures look blurry on the ATI cards tested. The developers themselves said the performance increase was due to a bug, so it's pure paranoia on everyone's part to believe Nvidia paid them to remove it.
Of course we will see DX10.1 games when Nvidia support it; developers will have no reason not to then. Right now most users with high-end cards don't have DX10.1-capable hardware, and SP1 has only just been released, in fact it's still not available to the public via Windows Update.
From that article:
There should be no doubt that Assassin's Creed contains DirectX 10.1 technology, and that that technology provides AMD's video cards with a solid performance boost. However, it is not to last. A few days ago, we received this statement from Andy Swanson of Ubisoft via NVIDIA:
Ubisoft plans to release a patch for the PC version of Assassin’s Creed that addresses the majority of issues reported by fans. In addition to addressing reported glitches, the patch will remove support for DX10.1. The performance gains seen by players who are currently playing Assassin’s Creed with a DX10.1 graphics card are in large part due to the fact that our implementation removes a render pass during post-effect which is costly.
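For context on what "removes a render pass during post-effect" likely refers to (my own reading, not anything Ubisoft have confirmed): DX10.1 lets a shader read a multisampled depth buffer directly, whereas under DX10.0 an engine typically has to re-render or copy scene depth into a separate texture before running depth-based post effects with AA on. A rough C++ sketch of that 10.1-level capability, purely as an illustration and nothing to do with Ubisoft's actual code:

#include <d3d10_1.h>

// Create an MSAA depth buffer that can also be read in shaders.
// Assumes 'device' is an ID3D10Device1* and msaaCount matches the back buffer.
// On 10.1-class hardware the SRV below is legal; on a plain 10.0 device,
// reading multisampled depth isn't allowed, hence the extra depth pass.
HRESULT CreateReadableMsaaDepth(ID3D10Device1* device, UINT width, UINT height,
                                UINT msaaCount,
                                ID3D10Texture2D** tex,
                                ID3D10DepthStencilView** dsv,
                                ID3D10ShaderResourceView** srv)
{
    D3D10_TEXTURE2D_DESC td = {};
    td.Width = width;
    td.Height = height;
    td.MipLevels = 1;
    td.ArraySize = 1;
    td.Format = DXGI_FORMAT_R24G8_TYPELESS;   // typeless so depth and SRV views can share it
    td.SampleDesc.Count = msaaCount;
    td.Usage = D3D10_USAGE_DEFAULT;
    td.BindFlags = D3D10_BIND_DEPTH_STENCIL | D3D10_BIND_SHADER_RESOURCE;
    HRESULT hr = device->CreateTexture2D(&td, NULL, tex);
    if (FAILED(hr)) return hr;

    // Depth-stencil view used while rendering the scene.
    D3D10_DEPTH_STENCIL_VIEW_DESC dd = {};
    dd.Format = DXGI_FORMAT_D24_UNORM_S8_UINT;
    dd.ViewDimension = D3D10_DSV_DIMENSION_TEXTURE2DMS;
    hr = device->CreateDepthStencilView(*tex, &dd, dsv);
    if (FAILED(hr)) return hr;

    // Shader resource view used by post effects to sample the same depth
    // buffer directly (Texture2DMS in HLSL), skipping a separate depth pass.
    D3D10_SHADER_RESOURCE_VIEW_DESC sd = {};
    sd.Format = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
    sd.ViewDimension = D3D10_SRV_DIMENSION_TEXTURE2DMS;
    return device->CreateShaderResourceView(*tex, &sd, srv);
}

If that (or something like it) is where the speed-up comes from, it's a legitimate DX10.1 feature rather than a bug, which is what makes the "bug" explanation hard to swallow.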
I agree that the IQ is better with the Nvidia card in that [H] article. It makes the image stand out more and the difference is easy to spot in still images, but I don't think it will matter much when you're actually playing, as the ATI output isn't terrible by any means. Also, with no DX10.1-capable Nvidia cards to compare against, it's hard to determine whether Nvidia would be affected by this too.
It's not that I'm saying Nvidia paid for this to be removed, but who's to say future money wasn't threatened or something similar? (Speculation, but do you think this stuff would be common knowledge? We don't get to find out everything through the Internet.) It is a very strange move to remove this feature already, though it comes down to whether the user patches their game or not.

I'm also not saying that Nvidia has paid Futuremark for DX10.1 to be left out of Vantage. I just think that technology that can boost your performance should be part of that benchmark. It wouldn't be fair on Nvidia as they are behind, but in all honesty they have had 6 months over ATI to implement DX10.1 and a year and a half in the lead overall. How on earth could ATI go from the 2900 to the 3870 and then nearly have the 4870 land on our doorstep within a year, with better technology at each step?
Yes, this thread is about DX10.1, but why ignore Nvidia's lack of progress when it clearly affects the consumer? So because of Nvidia, all the Assassin's Creed players who own ATI DX10.1 cards should do without, just because Nvidia don't have the technology? As James.Miller said, it doesn't push the ATI cards past Nvidia's performance in this game, so what's the harm in leaving it in?
What about the date of the latest 8800GTX drivers compared to the date of the latest 9800Pro drivers?
To name just a few on the ATI side: 9800Pro, X800, X1800, X1900, 2900XT, 3800 and 4xxx, yet only the 9-series Nvidia cards have had any WHQL drivers this year. How can the ancient 9800Pro have April 2008 drivers while the 8800GTX is still sitting on last year's drivers?
Assassin's Creed, from what we know so far, has shown DX10.1 to be pretty decent for its first appearance in a game, so why are Nvidia trailing on DX10.1 support behind the company they've been leading for the last year and a half?
Microsoft changed the DX10 requirements so Nvidia could be Vista compliant. If a company of that size can bend over for Nvidia, then you have to admit it's plausible that Futuremark might have done the same. I seem to remember Futuremark saying that DX10.1 arrived when they were too far into development to implement it, which is believable, but we'll never get to know what happens behind closed doors. When you put Microsoft doing Nvidia a favour over DX10 next to a report that Assassin's Creed is to have a feature removed, one that gives the opposing cards with the newest technology better AA performance, that gives you food for thought (speculation regarding Futuremark of course, but given past experience it seems possible).
I've done most of my gaming with ATI. Since I started getting decent cards I've had the 9800Pro, X800XT PE, 7600GT OC, X1900XT, 2900Pro and now the 8800GTX. I'm no fanboy though, or I wouldn't have bought this card or the 7600GT. I still agree that the GTX (and Ultra) are the best single-GPU cards for high-res, high-IQ gaming. The 3870 X2 was tempting, but for the small increase I thought I'd rather wait, as the G80 is the fastest card I've had the pleasure of owning. I'm just mentioning this before anyone thinks I hate Nvidia. I don't. I did think it was poor that the 7 series couldn't do AA and HDR together and had worse IQ, but I've had nothing but praise for the G80/G92. When the G92 turns up again as the 9800GTX, and then I see the Assassin's Creed report, on top of knowing about Microsoft changing Vista for Nvidia and the 9800Pro being better supported than my GTX, I have every right to question the lacklustre advances from the company that is comfortably in the lead in the high-end GPU market.
I advise just about everyone to get the GT, GTS or G80 GTX. I know how the 3870XT and the GTX compare from owning the less efficient equivalent of the 3870XT (a 2900Pro overclocked). So for this round I have a good indication of ATI's and Nvidia's high-end single-GPU performance, and I know the GT, GTS and GTX all beat the 3870XT. I just admire ATI's advances and resilience, while being disappointed in Nvidia's support, as well as them not having DX10.1 already given the time they've had.