For anyone interested, I took a snapshot of my 5700X versus my 5700X3D in the games I had installed. This is extremely unscientific: 3440x1440 resolution at roughly ultra settings, so generally expected to be GPU-limited. I don't even have MSI Afterburner set to show 1% / 0.1% lows, just averages (a quick sketch for computing proper lows from a frametime log follows the results). Here are the findings anyway:
Cyberpunk in a high-population area:
Cyberpunk in the built-in benchmark:
Rise of the Tomb Raider in the built-in benchmark:
Shadow of the Tomb Raider in the built-in benchmark:
The Last of Us Part 1 - latest save, where I was noticing an annoying FPS drop when entering an NPC-filled area:
Cinebench after 5 mins:
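Side note on those 1% / 0.1% lows: if your overlay only shows averages, they're easy enough to compute yourself from a frametime log. Here's a minimal sketch in Python, assuming a plain text export with one frametime per line in milliseconds (the file name and format are placeholders, not any specific tool's output):

```python
# Minimal sketch: average FPS and 1% / 0.1% lows from a frametime log.
# Assumes a plain text file with one frametime per line, in ms
# (file name is a placeholder).
import statistics

with open("frametimes.csv") as f:
    frametimes_ms = [float(line) for line in f if line.strip()]

frametimes_ms.sort(reverse=True)  # worst (longest) frames first

def percentile_low(frames, pct):
    """Average FPS over the slowest pct% of frames."""
    n = max(1, int(len(frames) * pct / 100))
    return 1000 / statistics.mean(frames[:n])

print(f"Average FPS: {1000 / statistics.mean(frametimes_ms):.1f}")
print(f"1% low:      {percentile_low(frametimes_ms, 1):.1f}")
print(f"0.1% low:    {percentile_low(frametimes_ms, 0.1):.1f}")
```

That's the common "average of the worst N% of frames" definition; some tools report a single percentile frametime instead, so numbers won't match across tools exactly.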
These are some extremely weird results. Largely they point to the synthetic benchmarks not doing the 5700X3D justice in this instance.
ROTTR was notable in that there was no judder / single-frame stutter at the beginning of Geothermal Valley, which has always happened on every processor I've previously tested it with (FX-8350, Ryzen 2700, and Ryzen 5700X).
Temperature results are also very strange. I set the fan curves the same for the 5700X and 5700X3D, though I must admit I suspect I used different thermal paste (I think I used Arctic Silver on the 5700X, but a Thermalright paste on the 5700X3D).
The 5700X3D goes thermonuclear in Cinebench (hitting 70C, where all my fan curves are set to really ramp up) compared with the 5700X, yet the opposite is true in games, with the 5700X3D running cooler and drawing less wattage than the 5700X.
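To illustrate what I mean by the curves ramping at 70C, here's a minimal sketch of a piecewise-linear fan curve in Python; the points are made up for illustration, not my actual BIOS settings:

```python
# Illustrative piecewise-linear fan curve: CPU temperature (C) mapped
# to fan duty cycle (%). Points are made up, not my actual settings;
# the idea is a steep ramp starting at 70C.
CURVE = [(30, 25), (50, 35), (70, 50), (75, 80), (85, 100)]

def fan_duty(temp_c: float) -> float:
    """Linearly interpolate fan duty (%) for a given temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]

for t in (40, 65, 70, 72, 80):
    print(f"{t}C -> {fan_duty(t):.0f}% fan")
```

So with a curve shaped like this, the 5700X3D sitting at 70C in Cinebench is exactly where the fans take off, while the games never push it that far.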
Weird findings. I'm sure I'll get a better feel for the difference as I go.