Yes, but the issue is that it's quite clear AMD, Nvidia and even Intel are moving through transitory phases, and the whole industry will have to as well. The other companies will have to work through all the issues with multi-chip systems at a later date, and if this is problematic for very experienced companies like AMD, Intel and Nvidia, it's not going to be easy for the others either. Fujitsu with its A64FX (and a long history of server CPU designs) had to invest a lot of effort into developing a lowish-power I/O fabric, and AMD had to really work on dropping power too.
Apple is also hampered by having to maintain very high margins, which is why it lost a huge amount of smartphone marketshare worldwide to Android. Apple cares far more about margins than volume.
Intel has been stung by its failures in the fab arm of the company, but since it can still output enough volume it is still doing OK (AMD being limited by volume). However, the US government is not going to want to allow TSMC/Samsung free rein forever (hence the money being funnelled into Intel now), so once their node cadence gets closer to the competition they will rebound IMHO, especially if they simply have more experience with heterogeneous manufacturing by then.
This video from Ian Cutress (of AT fame) was quite interesting:
https://www.youtube.com/watch?v=oaB1WuFUAtw
It's less about the Intel products themselves, and more about the changes behind the scenes. AMD identified this years before - it's quite possible that if AMD had gone with a large 7 nm monolithic design, Zen 2 and Zen 3 would have been better overall, and possibly lower power (chiplet designs do carry power penalties). But that is the thing: it would be less efficient to manufacture. It's like some of the ARM server CPUs which have been tested recently - they look really solid, but again they are huge monolithic dies. Eventually costs and yields will be the problem, not the performance.