The demand for high FPS at 1080p is strong; people like having 400 FPS in CS:GO even on a 60 Hz screen.
I actually remember reading about competitive gamers, and many apparently drop settings in-game to make it look utterly terrible. Why? It not only boosts FPS, but in a number of games it cuts down on foliage etc., making it easier to spot your enemy from a distance.
Most gamers, in my experience over the last 20 years, are not that competitive. It's why consoles that target 30~60 FPS still seem acceptable, and also why, if you look at the Steam survey, most people are really at GTX 1060-level performance and below, game at 1080p, and probably have a 4C/4T or 2C/4T CPU. Also, the most common "expensive" card is the GTX 1070, which is outranked by the GTX 960 IIRC.
It's why even Digital Foundry and channels like Hardware Unboxed kind of mock themselves when they state that they are not testing a "normal" situation by running a GTX 1080 Ti at 1080p, since they are actively looking for "CPU limitations".
They state it in their reviews FFS!!
I mean, don't get me wrong, I would rather have a Core i5 8400 over a Ryzen 5 1600 for the games I play, like FO4, since I have modded it to no end and have massive settlements.
However, amongst all of my mates who play the game, I am the only one who has taken modding and settlement building and run with it; I am at the mod limit, for example.
All of them do perfectly fine on slower CPUs, but I am aware I am a niche case.
OTOH, I have a mate who has a Ryzen 5 1600 and an RX 570, and for the games he runs, like Overwatch, he has not complained one bit about performance. Another mate is going to get a Ryzen 7 1700, less for gaming and more for some of his work-related stuff.
It's good to have a choice now!!