You do realise that repeatedly entering threads about old CPUs just to say they're useless and can't compete is basically threadcrapping, right? We all know that in raw benchmarks Kaby Lake is faster than Nehalem and Sandy Bridge. That isn't the point and you know it, so I wish you'd give it a rest.
The actually interesting points of discussion are mostly around how unusual a time this is: such old systems are still perfectly viable for modern applications so many years later, which has historically not been the case. Yes, an i7-7700K @ 5 GHz will produce better gaming results than an i5-2500K @ 4.5 GHz in most cases, but the point is
how much better and
at what cost? A new i7-7700K + Z270 + DDR4 rig costs rather a lot, and at the moment there just aren't that many scenarios where such an upgrade is worth the money, particularly when that money could equally be spent on a new GPU, which would produce a much greater jump in performance. This will continue to change over time to the point where Sandy Bridge becomes less and less viable, but we're 6 years on and not there yet, which is pretty crazy. Of course, if you already have a GTX 1080 and are chasing those elusive solid 16.6 ms frame times, then it's probably a good upgrade for you. For many people, it just isn't. Law of diminishing returns and all that. Let's not even start on the fact that more and more games are using more than 4 cores now and yet Intel still don't have a 6-core mainstream chip.
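For anyone wondering where the "16.6 ms" figure comes from: it's just the per-frame time budget at 60 fps (1000 ms / 60 frames ≈ 16.7 ms, often quoted as 16.6). A quick sketch of the arithmetic, with a hypothetical helper name:

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds available per frame at a given frames-per-second target."""
    return 1000.0 / fps

# The 60 fps budget the post refers to (≈16.7 ms per frame):
print(round(frame_time_ms(60), 1))   # → 16.7
# For comparison, a 144 Hz target leaves far less headroom:
print(round(frame_time_ms(144), 1))  # → 6.9
```

The point being: holding a *solid* 16.6 ms means the CPU must never take longer than that on any single frame, which is exactly where an older quad-core starts to show stutter even when its average fps looks fine.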
I don't use any Pentium 4 systems (never have, actually), but my main PC at work is still a Core 2 Duo. When paired with an SSD, it's surprisingly usable. It can't handle much in the way of video playback, but there is a GPU for that. Any intensive processing is done on clustered servers and VMs. Wouldn't use it for gaming or anything, though.