So AMD supports part of the DX12 spec better than Nvidia and Nvidia supports part of the DX12 spec better than AMD.
Please tell me how this is different from DX9, DX10 or DX11?
Not sure why there's all this arguing, I really don't.
Do people have such short memories? Do we not remember the rumours that Nvidia was basically asking MS to add fluff features to the DX12 spec that only Nvidia could support?
Do we remember Nvidia basically forcing MS to remove certain DX10 features that sped up hardware because they didn't support them, so ALL gamers had to suffer as a result of Nvidia making a worse card than they should have? Then, when games added support for those features via DX10.1, Nvidia PAID developers to REMOVE DX10.1 from the game because it made them look bad.
What seems pretty clear is that DX12 is ostensibly based on GCN architecture, and AMD fully supports effectively every useful performance feature. Nvidia threw their toys out of the pram at basically being asked to support Mantle reincarnated, and got a few additional pointless throwaway features that won't get much support or offer much benefit, just so they could market having more features/DX12 support.
That is how the situation reads to me, and how it appears to be playing out in terms of which DX12 features are being used, which offer performance improvements, which are being adopted by devs, and what Nvidia appears to be lacking.
These are all the reasons I can't stand Nvidia: at every stage they hold back features and performance for the sake of marketing and appearing to be the best. One company is stifling new features because they didn't bother to support them, pushing back the adoption of those features by years. Tessellation, the features that wound up in DX10.1 instead of DX10. Nvidia knew the DX10 spec for ages, failed to meet it, and asked MS to **** everyone. Nvidia screwed their own customers by not supporting performance-enhancing features, and rather than ride out the bad press that would come with that, they got MS to remove those performance-enhancing features instead. They are so utterly anti-consumer I can't stand it, and I get irked by those who blindly support them while Nvidia go out of their way to screw their own customers. It's madness.
Time and time again Nvidia choose to inhibit new features, and then they go the other way too: they use over-tessellation as a weapon to win benchmarks while providing THEIR OWN USERS with no benefit. Think about it, you literally can't see any IQ difference beyond a certain level of tessellation. Actively designing the hardware to tessellate more takes transistors away from other functions that could provide a real performance benefit. You're paying for the die space that feature takes up, and for the time and money the design team spent building it, JUST to win a benchmark that offers no benefit at all to their own users. How much better would Nvidia cards be if that time and money went into something that increased performance or IQ for their own users? Rather than spending time getting MS to remove useful features from DX10, maybe they could have just supported them. Rather than paying devs to remove DX10.1 from their game, add DX10.1 to their refreshes or the next generation of cards... nope.