Once again, some points that make this all fairly obvious. Current desktop-level GPUs don't have any outright dedicated ray tracing hardware. When you implement something in hardware rather than in software (via general-purpose, non-dedicated shaders), you get a dramatic increase in performance. Top desktop GPUs don't have it because the market still isn't ready for it: you would have to give up die area currently spent on 'normal' GPU features to fit the ray tracing hardware in. So AMD could have come along and changed the architecture for Fiji in a way that made ray tracing applications run, say, 100 times faster than they do today, at the cost of half the core. Meaning it would be 100 times faster in the exact zero ray-traced games that are out, and roughly 50% slower in the 100% of games that actually exist.
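The area-for-speed tradeoff above can be put in toy numbers. This is only a sketch: the 100x speedup, the half-the-die split, and the linear area-to-throughput scaling are the post's hypotheticals, not measurements.

```python
# Toy model of the die-area tradeoff described above.
# All numbers are hypotheticals from the argument, not real benchmarks.

def average_speed(rt_area, rt_speedup, rt_game_share):
    """Average relative performance across a game library.

    rt_area:       fraction of the die given to ray tracing hardware
    rt_speedup:    RT performance multiplier when that hardware is used
    rt_game_share: fraction of games that are actually ray traced
    """
    raster_perf = 1.0 - rt_area  # raster throughput ~ remaining die area (assumed linear)
    rt_perf = rt_speedup * (rt_area * 2.0)  # "100x" assumed at half the die
    return (1 - rt_game_share) * raster_perf + rt_game_share * rt_perf

# Fiji-style hypothetical: half the die on RT, zero ray-traced games shipped.
print(average_speed(rt_area=0.5, rt_speedup=100, rt_game_share=0.0))  # 0.5
```

With no ray-traced games on the market, the average player only ever sees the 0.5 — i.e. the 50% raster regression — no matter how large the RT speedup is.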
When mobile gaming consists of games that barely tax the existing hardware to begin with, you can double the GPU transistor count with a die shrink within the same size and power budget, put half of those transistors toward ray tracing, and effectively change nothing about current gaming performance, because every single game targets the lowest common denominator, mostly able to run on phones with a quarter of the power and without lots of settings.
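The mobile case reduces to the same toy arithmetic. The doubling-per-shrink figure and the half-and-half split are assumptions carried over from the paragraph above:

```python
# Toy arithmetic for the mobile scenario: a die shrink roughly doubles
# the transistor budget at the same area and power (assumption).

old_transistors = 1.0                     # normalized budget on the old node
new_transistors = 2.0 * old_transistors   # after the shrink

rt_transistors = new_transistors / 2      # half the new budget to ray tracing
raster_transistors = new_transistors - rt_transistors

# The raster budget is unchanged, so games that already ran fine still do;
# the RT half is a pure addition rather than a tradeoff.
assert raster_transistors == old_transistors
print(raster_transistors, rt_transistors)  # 1.0 1.0
```

That is the key asymmetry with desktop: mobile can add RT hardware without taking anything away, because the existing workload never needed the extra transistors in the first place.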
It's a different market with different targets and different goals.
Then there's the biggest problem: ray tracing itself... is not that big a deal.
1990-1995 graphics vs ray tracing was night and day, in both quality and performance. Today the graphical difference is a tiny fraction of what it once was, and the designers make as much difference as the graphical capabilities of the chips.
Nvidia/AMD GPUs can run games that look significantly better than Fallout 4 at far higher frame rates, yet Fallout 4 runs like crap and looks like ass... designers. They put in just horrible textures; with ray-traced lighting, those textures would still look like ass. The general design of the world is quite good, which saves it, but the lacking graphical quality is there for all to see. They even tacked on an attempt at god rays. Had those been done with ray tracing, and with perfect performance, the game wouldn't look much different, because it's the textures that are the problem. It's arguable whether god rays are 'better' lighting (in regards to Fallout 4 specifically, the implementation is poor), but either way they don't make the textures look better.
Ray tracing has been held up since the early 90s as the bullet to end all graphical issues in gaming, the thing that would bring us real-world-looking games, completely unmatched. Despite the fact that absolutely none of that is true, the belief has stuck around. Ray tracing is not the silver bullet that brings gaming to photorealism, and the performance still isn't there at all. The PC ecosystem is still nowhere near mobile's position, and the ability to just switch half a GPU over to ray tracing won't work well there. It will play out like tessellation did: a very small unit added on a new process, minimal ray tracing used in a very few games, then a step-by-step move towards Nvidia ******* everyone with crappy, overdone implementations.
