Core usage without figures doesn't actually show anything though.
If the performance is worse, then more cores aren't better.
Eh?
Unless the software is putting dummy loads on the CPU, then of course it shows something.
Metro Last Light performed better on the Intel, even with the crap clock speed. Better than a 4.9GHz AMD. Tomb Raider also performed better, and both games are known to quite like AMD.
Look at the Crysis 2 result. There we see a poorly threaded game threading, well, poorly. The performance was also a bit naff. I deliberately ran a game that doesn't thread well to show that games which don't thread well, well, don't thread well, and that the OS isn't just spreading the load around to make the results look good.
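The point above can be sketched in a few lines. This is not the actual benchmark, just a minimal Python illustration (the `burn` and `timed` helpers are made-up names): a workload that stays in one big serial chunk can only ever occupy one core, no matter how many workers you hand it, whereas the same total work split into independent chunks can spread across cores. On a multi-core machine the split version should finish faster; on a single core the two look the same.

```python
import time
from multiprocessing import Pool

def burn(n):
    # CPU-bound busy loop: a hypothetical stand-in for a game's per-frame work.
    s = 0
    for i in range(n):
        s += i * i
    return s

def timed(workers, chunks, n):
    # Run `chunks` pieces of work of size `n` across `workers` processes
    # and return the wall-clock time taken.
    start = time.perf_counter()
    with Pool(workers) as p:
        p.map(burn, [n] * chunks)
    return time.perf_counter() - start

if __name__ == "__main__":
    N = 200_000
    # "Poorly threaded": all the work in one chunk, so only one core is busy.
    serial = timed(4, 1, 8 * N)
    # "Well threaded": eight independent chunks spread over four workers.
    parallel = timed(4, 8, N)
    print(f"one big chunk: {serial:.2f}s, eight chunks: {parallel:.2f}s")
```

Same total work both times; only the ability to split it differs, which is exactly why a game that doesn't thread well can't gain anything from extra cores.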
My issue is that Crysis 2 is an awfully old game by now. You wouldn't say, "Hey, I think I'll moochy on down to OCUK and buy a load of parts to build a rig to play Crysis 2 on!"
When people buy PCs they usually do it to upgrade so that they can play new games, not old ones. And the new ones all like to have cores and thread well.
Which is what I've been saying for the best part of a year, and have now proven.
Core support has been expected for ages, simply because of the new consoles and because PC games are all sloppy seconds to the console payday. Sheesh, even my dead nan would know that.