I wasn't talking about implementing DX10.1 - I was talking about the original proposed DX10 specification.
OK, a guess then, whatever. I have a reasonable idea of the costs involved: as you increase the complexity of your core, cost goes up because your yield per wafer goes down, it goes up again because you have to spend a LOT more time developing your drivers, and there's an even bigger increase in the amount of cross-testing you have to do. Your software development and testing workload grows exponentially with added features.
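To put a rough shape on the yield part of that, here's a back-of-the-envelope sketch. The wafer cost, defect density and die sizes are made-up illustrative numbers and the Poisson yield model is a simplification, so it only shows the mechanism (bigger die, fewer good chips per wafer, higher cost per chip), not the real figures for G80 or R600.

```cpp
// Back-of-the-envelope: how die size feeds into cost per good chip.
// All numbers are illustrative, not real foundry data.
#include <cmath>
#include <cstdio>

int main()
{
    const double wafer_cost  = 5000.0;   // assumed cost of one 300mm wafer (USD)
    const double wafer_area  = 70000.0;  // rough usable area of a 300mm wafer (mm^2)
    const double defect_rate = 0.002;    // assumed defects per mm^2

    // A "lean" die versus a fatter one carrying extra features.
    const double die_areas[] = { 330.0, 480.0 };  // mm^2, illustrative only

    for (double area : die_areas)
    {
        double dies_per_wafer = wafer_area / area;             // ignoring edge losses
        double yield          = std::exp(-defect_rate * area); // simple Poisson yield model
        double good_dies      = dies_per_wafer * yield;
        double cost_per_die   = wafer_cost / good_dies;
        std::printf("%.0f mm^2 die: yield %.0f%%, ~%.0f good dies, ~$%.0f each\n",
                    area, yield * 100.0, good_dies, cost_per_die);
    }
    return 0;
}
```

It's a crude model, but it shows why yield is the thing that bites as the die grows, and the chip is still only one part of the card's bill of materials.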
The point is, looking at the specification of the 2900, you can tell that what DX11 is now is what DX10 should have been in the first place.
There are too many similarities to dismiss it.
Claiming it would have cost the end user 3x as much is complete BS. Are you honestly saying that an 8800GTX, had it kept to the original DX10 spec, would have been priced at £1200?
If you are, then you're talking more nonsense than I thought.
There's being clued up about graphics hardware, and then there's just making things up.
If ATi were able to incorporate the majority of the original features, how can you claim it would have been too expensive for nVidia to do the same? What makes it worse still is that ATi then halved the price of the 2900s when they moved R600 over to a 55nm process and turned it into the 3800s.
Whatever nVidia is doing, I don't think cost is the factor. I wouldn't be surprised if it's as simple as they can't get it working.
DX10.1 has demonstrated performance improvements that nVidia shouldn't be ignoring but implementing in their hardware. nVidia being nVidia, though, rather than innovate, they'll just pay off whoever it takes to cripple the competition.
Look at the propaganda they spread about DX10.1 being useless, despite the performance gains we all saw in Assassin's Creed. nVidia couldn't have that, so they got it removed.
DX10.1 was a good thing; nVidia would have you think otherwise so they don't look bad. They wanted to create the impression that 'ATi are implementing pointless features that will never be used, whereas we're giving users what they actually want.'
The reality is that it's down to nVidia: they've stifled innovation. They're basically abusing their position as market 'leader' in terms of market share. They know that if they rubbish something, most devs will follow them, because the majority of cards out there lack DX10.1 support.
Ubisoft's implementation of it made a huge amount of sense, though: by reading the existing multisampled depth buffer directly, the DX10.1 path reportedly saved a whole render pass, so it was less work for them and less work for the GPU. Yet nVidia said it was broken? A feature that didn't even run on their cards, and therefore had no direct impact on their cards' performance.
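For anyone wondering what that "less work" looks like in code terms, here's a minimal sketch of the D3D10.1 side of it: binding the multisampled depth buffer as a shader resource so the AA/post-processing shaders can read it directly, instead of the engine re-rendering depth in an extra pass. The setup uses the standard D3D10.1 API; exactly how Ubisoft wired it into their renderer is my assumption, and error handling is omitted.

```cpp
// Minimal sketch: with a feature level 10.1 device, a multisampled depth
// buffer can be created with a shader-resource binding and read directly in
// the AA / post-processing shaders (Texture2DMS<float> in HLSL), so the
// engine doesn't need a second pass just to lay down depth again.
// Error handling omitted; how the game actually used this is assumed.
#include <d3d10_1.h>
#pragma comment(lib, "d3d10_1.lib")

void CreateMsaaDepthReadback(UINT width, UINT height)
{
    // Ask for a 10.1 device; on plain DX10.0 hardware this combination of
    // multisampling + shader-resource binding on a depth buffer isn't allowed.
    ID3D10Device1* device = nullptr;
    D3D10CreateDevice1(nullptr, D3D10_DRIVER_TYPE_HARDWARE, nullptr, 0,
                       D3D10_FEATURE_LEVEL_10_1, D3D10_1_SDK_VERSION, &device);

    // 4x MSAA depth buffer, typeless format so it can be viewed both as a
    // depth-stencil target and as a shader resource.
    D3D10_TEXTURE2D_DESC td = {};
    td.Width            = width;
    td.Height           = height;
    td.MipLevels        = 1;
    td.ArraySize        = 1;
    td.Format           = DXGI_FORMAT_R24G8_TYPELESS;
    td.SampleDesc.Count = 4;
    td.Usage            = D3D10_USAGE_DEFAULT;
    td.BindFlags        = D3D10_BIND_DEPTH_STENCIL | D3D10_BIND_SHADER_RESOURCE;
    ID3D10Texture2D* depthTex = nullptr;
    device->CreateTexture2D(&td, nullptr, &depthTex);

    // View used while rendering the scene geometry.
    D3D10_DEPTH_STENCIL_VIEW_DESC dsvd = {};
    dsvd.Format        = DXGI_FORMAT_D24_UNORM_S8_UINT;
    dsvd.ViewDimension = D3D10_DSV_DIMENSION_TEXTURE2DMS;
    ID3D10DepthStencilView* dsv = nullptr;
    device->CreateDepthStencilView(depthTex, &dsvd, &dsv);

    // View over the same memory that the post-process / AA shader samples,
    // per MSAA sample, instead of the engine rendering depth again to a
    // second texture as it has to on DX10.0.
    D3D10_SHADER_RESOURCE_VIEW_DESC srvd = {};
    srvd.Format        = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
    srvd.ViewDimension = D3D10_SRV_DIMENSION_TEXTURE2DMS;
    ID3D10ShaderResourceView* depthSRV = nullptr;
    device->CreateShaderResourceView(depthTex, &srvd, &depthSRV);
}
```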
The difference was that it gave ATi a much-needed performance boost over nVidia's cards, making nVidia's cards look even worse value for money.
The same happened with 3DMark Vantage too.
There are so many things nVidia does wrong that I don't get how people even begin to defend their actions as if there's no issue.