And if it hadn't been for Intel's behaviour, Dell and other builders might have used AMD chips in a fairer market back then, which likely would have led to performance gains sooner.
This. For context, because it's worth reminding every generation.
Around 1978 Intel created the x86 architecture with the 8086, an instruction set that basically made it so you didn't need a PhD in computer science to use a computer.
It was a revolutionary breakthrough, no question about that. IBM saw its potential and built a machine around it, with Microsoft supplying the operating system (PC DOS), the precursor to Windows.
So together Intel, IBM and Microsoft created the Personal Computer (PC) and called it exactly that, the IBM Personal Computer, launched in 1981: every home could have one.
IBM was worried about how much supply Intel alone could provide, so part of the deal was that Intel would also license a second source for manufacturing: Advanced Micro Devices (AMD), then already one of the major semiconductor manufacturers.
Intel never sent AMD the engineering tapes, so AMD reverse engineered the CPUs and manufactured them anyway. They were allowed to do that; AMD held the x86 licence.
AMD manufactured and sold copies of Intel's x86 CPUs through the 1980s. By the end of that decade AMD was tired of reverse engineering iterations of Intel's designs and started designing its own x86 CPUs, and through the 1990s (the K6 and especially the original Athlon) AMD's designs caught up with and at times surpassed Intel's in performance and efficiency.
AMD was small, with only about 10% market share, but on the strength of those x86 designs it started to grow, and to grow at Intel's expense.
By the late 1990s 32-bit x86 was hitting its limits, above all the 4 GiB address-space ceiling, and everyone was trying to develop 64-bit architectures. That is not easy; it's a very complex thing, and AMD was late to the party: by the time they started, others had been trying for close to a decade. In 2003 AMD cracked it and AMD64 was born, but the real genius was designing it as a direct extension of x86 (x86_64), so existing software only needed a modification rather than a rewrite. Pretty quickly there were two versions of Windows XP: a 32-bit version that ran on Intel and AMD, and a 64-bit version that at launch ran only on AMD's Athlon 64 and Opteron CPUs.
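To make that 32-bit ceiling concrete: a 32-bit pointer can only name 2^32 distinct byte addresses, which is 4 GiB, and that is the wall machines were running into. The sketch below is mine, not from the post; build it twice on an x86_64 Linux box (plain `gcc` for 64-bit, `gcc -m32` for 32-bit, assuming the 32-bit libraries are installed) and the same C source runs in both modes, which is exactly the compatibility trick AMD64 pulled off at the instruction-set level, and exactly what Itanium threw away.

```c
#include <stdio.h>

int main(void) {
    /* Pointer width decides how much memory a process can address:
     * 32 bits -> 2^32 bytes = 4 GiB, the wall x86 was hitting.
     * 64 bits -> 2^64 bytes, effectively unlimited for the era.   */
    unsigned bits = (unsigned)(sizeof(void *) * 8);
    printf("Pointer width: %u bits\n", bits);
    if (bits == 32)
        printf("Addressable memory tops out at 4 GiB (2^32 bytes)\n");
    else
        printf("Addressable memory: 2^%u bytes, far beyond 4 GiB\n", bits);
    return 0;
}
```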
This was a potential disaster for Intel. Their own 64-bit answer, Itanium, which they had been working on with HP for close to a decade and had shipped in 2001, wasn't x86-compatible; it was massive, slow, inefficient and broken in practice, and it was nowhere near ready to carry the market.
By the mid 2000s AMD had reportedly tipped past Intel in some segments, briefly taking more than half of US retail desktop CPU sales, and it was growing rapidly. Intel could see the writing on the wall.
Intel licensed AMD64 (shipping it as EM64T, later "Intel 64"); they had to, no choice.
AMD at this point was up and coming, but unlike Intel it didn't have decades of accumulated wealth. Intel did, and it used that wealth to try to buy AMD out of the market. When AMD's orders abruptly stopped, they tried to figure out what was going on. They offered Dell a million CPUs for free; Dell told them they couldn't accept them, because if they did they would actually lose money. AMD launched an investigation and found that Intel had been paying vendors like Dell, HP and the rest not to use AMD CPUs. AMD focused on Dell and revealed that Intel had effectively been giving Dell its CPUs for free and, on top of that, had reportedly been paying Dell on the order of $850 million annually not to use AMD CPUs in any of its products. It was the same with HP and the others on a smaller scale.
AMD took Intel to court and eventually put a stop to it (Intel settled with AMD for $1.25 billion in 2009, and the EU fined Intel over a billion euros the same year), but by this point AMD was nearly broke. They spent what reserves and credit they had buying ATI for roughly $5.4 billion in 2006 and tried to make a success of GPUs, which they did with parts like the HD 4870, a damned good GPU I might add. Unlike Intel's first attempt at a GPU, it actually worked, and worked well.
In 2008 the market crashed, which added to AMD's problems, and then Bulldozer (2011) was a flop that almost finished them. The rest is history.
To those of you who vowed, for whatever reason, never to have anything AMD: if you're on a 64-bit x86 machine you're running an AMD-designed base architecture, even if the chip is branded Intel. The native multi-core design found in today's x86 CPUs was also pioneered on x86 by AMD and adopted by Intel.