It doesn't work like that. Capping frames does nothing other than match your screen's refresh rate to stop screen tearing and reduce heat.
It is literally pointless having your game running at several hundred frames per second: you're just creating more heat, using more power and wearing out your hardware for nothing.
There is an argument that running frames above, say, 60 Hz (if your monitor is capped at that) can reduce input lag and latency in general, but you're talking niche competitive gaming there, and at that point you may as well just get a higher refresh rate monitor.
Where a slower CPU will affect your gaming is with minimum frame rate, not maximum.
Thinking about it, it's actually really hard to explain...
OK, let me give you my example from PUBG.
I'm running an Nvidia GTX 1070 at 1440p. I've recently switched from a ten-year-old i7-2600K to a new Ryzen 5800X.
So... overall, my frame rate hasn't significantly changed. That's because the graphics settings are the same and the graphics card is the same.
However, on my old CPU, in-game moments with a lot of action, e.g. lots of players around, would make my frame rate drop. That wasn't caused by my graphics card; it was my CPU struggling a bit. With the new CPU, although my overall frame rate isn't much different, I don't get that drop any more when there's lots going on.
You'll see that effect even more at lower resolutions, where the graphics card is doing less work to maintain the frame rate.
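If it helps, here's a toy way to picture it: each frame is only ready when the slower of the CPU and GPU has finished its work, so a CPU that spikes during heavy action drags the minimum frame rate down even when the average barely moves. All the numbers below are made up purely for illustration, not measured from PUBG.

```python
# Toy model: a frame takes max(cpu_ms, gpu_ms), since whichever chip
# finishes last sets the pace. Workloads below are invented examples.
gpu_ms  = [12] * 10              # GPU cost is steady: same settings, same card
old_cpu = [8] * 9 + [25]         # old CPU spikes when lots is going on
new_cpu = [5] * 9 + [10]         # new CPU shrugs off the busy frame

def fps(cpu, gpu):
    # A frame is done when the slower of the two parts has finished.
    frame_ms = [max(c, g) for c, g in zip(cpu, gpu)]
    avg   = 1000 * len(frame_ms) / sum(frame_ms)   # average frame rate
    worst = 1000 / max(frame_ms)                   # minimum frame rate
    return round(avg), round(worst)

print(fps(old_cpu, gpu_ms))   # old CPU: average barely different, minimum tanks
print(fps(new_cpu, gpu_ms))   # new CPU: minimum now matches the GPU-set pace
```

Run it and you'll see the average FPS is fairly close between the two CPUs, but the worst-case (minimum) FPS is much lower on the "old" one, which is exactly the drop during busy fights.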
So in a way, you've almost got to think of the CPU and GPU as doing two different jobs in your games.
I'm not sure if that helped or confused you further; it's hard to explain, but once you get your head around it, you'll understand.