No one I know who games on PC has an Xbox. They might have a PS3/PS4, but not an Xbox; they simply don't believe in having to pay to play.
Because they are able to show better performance in games. Take that away and no one will pay £280 for a CPU when it's no better than a £140 CPU.
I have a PC and an Xbox. I find it much harder playing games with my console friends if I don't have a console. There are also some games that are console exclusives. You don't have to have an Xbox Live Gold account to be able to play games, just to play games online, and you do get 24 free games over the course of the year for the ~£30 that it costs. I probably spend more than £30 a year on Steam and get fewer than 24 games for it!
To be honest, the thing that stops me gaming on consoles more is partly down to not being able to use a keyboard and mouse.
As for the second point, maybe some people use their PC for more than just gaming?
I think outside of gaming there are still scenarios where an Intel CPU is faster than an AMD CPU (I imagine a lot of people now have their fingers in their ears going "la la la, I can't hear you" at the suggestion that something exists that is better than the AMD version). For me, one of those scenarios is video encoding. I've used an FX-8350 for encoding, as well as a 4770K and a 3930K, and in my experience the 3930K is the fastest, followed by the 4770K and finally the FX-8350.
If all someone does on their PC is play games then maybe they could just get a console?

I mean, it is sort of what they're designed for, and they're cheaper (which seemed important with the CPUs earlier). In PC terms the console hardware might not be high end, but they do tend to make better use of it. Even with Mantle/DX12/OGL I doubt PCs will be able to optimise as well as consoles.
Regarding the title (which I know Greg is only quoting), I'm not sure that even if DX12 is successful (by that I mean it does what it says and is widely adopted) it's the end of Mantle. Mantle will still be as good as it was always going to be. The time when Mantle might have been the only supported API was so far down the line that developers wanting to make use of it would have had to support it alongside DirectX, which they would still be free to do with DX12.
With Mantle supporting such a limited number of cards, and OpenGL having more stuff thrown at its codebase in the hope that something good sticks, I think DirectX will be with us for a while to come yet. If we can have a better DirectX, then so much the better.
I mean, apparently Mantle is what developers wanted (who'd have thought they wanted an API that supports only a small sub-section of AMD graphics cards?), so why would that suddenly change just because DirectX gets some improvements? Also, I notice it is said that Mantle IS what developers want, not that Mantle WILL BE, when finished, what developers want. This suggests that they want Mantle as it is in its current state.
For me personally, if DirectX 12 is able to come close to how Mantle performs then I will be happy. If Mantle is a 45% improvement over DX11 and DX12 is a 35-40% improvement, or something like that, then I would be happy. The thing with DirectX is that I would, in theory, be able to switch between AMD and Nvidia cards as I saw fit, something that is very useful to me as I like to buy a card because I want it rather than because it has the logo I support on it.
I think it will be good if Mantle does stick around, even if DirectX 12 is the vast improvement we hope it is, as it should serve to keep Microsoft honest and hopefully stop the stagnation. We do have to hope that whoever AMD hand Mantle over to manages to keep it competitive and that it doesn't go the OpenGL route.
I think the problem Mantle has is that, out of principle, it might not get adopted by Intel or Nvidia even after it is handed over to that third party. DirectX has historically supported Nvidia, Intel and AMD cards. Surely this will factor into developer take-up?
I also have to laugh a bit at everyone giving credit to AMD for this, as this whole thing is cyclical, isn't it?
Would AMD have come up with Mantle had it not been for DirectX and OpenGL before it? Before that, OpenGL and DirectX drove each other; one probably wouldn't have done X if the other hadn't done Y. Maybe we should blame AMD for not coming up with Mantle earlier? If they'd come up with Mantle back when DirectX and OpenGL started out, then maybe we wouldn't be in this situation now.
So maybe we should blame DirectX's stagnation on AMD taking so long?
