Your 2nd mistake was getting a 13900K when you really should have gotten a 13700K or even a 13600K. You had no use for so many E-cores on a 13900K, and they only add more heat and power draw — no wonder you couldn't cool it, especially with HT on as well!
The maximum power usage I see on my overclocked CPU is about 140-150W (the same as my overclocked 9700K). I played Jedi Survivor yesterday with HWInfo running from just after I started the game, and after 3 hours my average CPU power consumption was 98W — not as good as the ~65W of my 7800X3D, but nowhere near the figures in the Raptor Lake horror stories you often read about.
Your 3rd mistake was assuming that because your system passes X or Y synthetic "stability" program, your system is stable. If you were getting WHEA errors, your system simply isn't stable and you have to spend extra time refining it. The 7800X3D is so much easier in this regard, as there is far less overclocking headroom — AMD already "do it for you" by maxing out the silicon.
This is a key difference with Intel, as historically there can be decent overclocking headroom depending on the silicon lottery. Even now with Raptor Lake, I often saw people take a 13600K with a default maximum turbo of 5.1GHz up to 6GHz (Linus, for one).
I wanted 8 good cores, so the 13600K was out for me as it only has 6 P-cores. While 6 is enough for gaming for now, 8 is a more future-proof option.
Yes, that is why I am now on a 7800X3D.