Thank you for all the replies!
Up until yesterday, I had apparently been running it on PPT 500, TDC 210, EDC 280, which I now see are motherboard values that correspond to "let 'er rip".
I never minded these values because I thought they were safe, and they probably are, but they're a wasteful way of getting better (and not even the best, mind you) performance in all-core loads. I don't understand why just enabling PBO (what an end-user would do: set it to "Enabled" in the BIOS) results in these wasteful values.
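For perspective, here's a quick sketch of how far past stock those board values sit. The stock limits are an assumption on my part, since I haven't named the exact chip: PPT 142 W / TDC 95 A / EDC 140 A are AMD's defaults for a 105 W TDP Zen 3 part like the 5900X.

```python
# Rough comparison of the board's "Auto" PBO limits against AMD stock limits.
# Stock numbers assume a 105 W TDP Zen 3 part (e.g. 5900X); adjust if yours differ.
STOCK = {"PPT_W": 142, "TDC_A": 95, "EDC_A": 140}   # AMD defaults
BOARD = {"PPT_W": 500, "TDC_A": 210, "EDC_A": 280}  # the "let 'er rip" values

for key in STOCK:
    ratio = BOARD[key] / STOCK[key]
    print(f"{key}: stock {STOCK[key]} -> board {BOARD[key]} ({ratio:.1f}x)")
```

So the board was handing the chip roughly 2x to 3.5x headroom on every limit, which explains the "wasteful" part.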
What I did *not* understand is how my CPU ran hotter and got better all-core boosts when I lowered EDC. More limited, less power draw, right? But after watching and reading a lot, I now know there's an optimum to tuning these parameters and that they work together. After continuous tweaking of just the PBO2 parameters, I landed on PPT 180, TDC 120 and EDC 165. That gives an all-core clock of 4.35 GHz at 72 °C in CB R23.
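The way I now picture it: PPT caps socket power in watts, TDC caps sustained current in amps, EDC caps peak current in amps, and boost backs off at whichever limit saturates first. Here's a minimal sketch of that idea using my tuned profile; the telemetry numbers are made up for illustration, not measured from my system:

```python
# Which PBO limit is binding? Boost throttles on whichever limit is closest
# to saturation: PPT (socket watts), TDC (sustained amps), EDC (peak amps).
LIMITS = {"PPT_W": 180, "TDC_A": 120, "EDC_A": 165}  # my tuned profile

def binding_limit(socket_power_w: float, sustained_a: float, peak_a: float) -> str:
    """Return the limit closest to saturation for the given telemetry."""
    usage = {
        "PPT_W": socket_power_w / LIMITS["PPT_W"],
        "TDC_A": sustained_a / LIMITS["TDC_A"],
        "EDC_A": peak_a / LIMITS["EDC_A"],
    }
    name, frac = max(usage.items(), key=lambda kv: kv[1])
    return f"{name} at {frac:.0%} of limit"

# Hypothetical all-core load; real values would come from HWiNFO or similar.
print(binding_limit(socket_power_w=175.0, sustained_a=118.0, peak_a=150.0))
```

That framing made it click for me why lowering one limit can shift which constraint the boost algorithm runs into, instead of just uniformly cutting power.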
Since this is mostly a gaming system, I decided against dialing these in and just went back to stock. The in-game gains are non-existent or very minimal, even in games that like a lot of threads (Warzone, Darktide). Lower thermals and lower power usage are beneficial for my chip and for the environment, for now.
I'll keep that PBO2 profile handy, though, for when I encounter a situation where I need more oomph, and maybe have a look at Curve Optimizer later.
My stock Cinebench R23 all-core scores are in the low 22,000s, but my RAM probably isn't the best (3600 MHz with 16-19-19-19 timings), and I've read that recent AGESA versions introduced some odd clock-speed bugs.
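For what it's worth, the first-word latency of that kit isn't actually bad; a quick back-of-the-envelope:

```python
# First-word latency from rated speed and CAS: ns = CL * 2000 / (MT/s).
# 3600 MT/s at CL16 works out to ~8.9 ns, which is respectable for DDR4.
def first_word_latency_ns(cl: int, mt_per_s: int) -> float:
    return cl * 2000 / mt_per_s

print(first_word_latency_ns(16, 3600))  # ~8.89 ns (my kit)
print(first_word_latency_ns(16, 3200))  # ~10.0 ns (a common slower kit)
```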
Does this all sound reasonable?