Final8y, you are very confused on this matter.
FPS is directly linked to what the monitor can display. It's not that you won't notice frames rendered above 60fps on a 60Hz screen; it's that they are not displayed at all. 60Hz means the screen refreshes at most 60 times a second, so it physically cannot display any more than that.
Having a card internally rendering 80, 100, 200 or 400 fps is of little use if the card/monitor does not output it. You cannot see it, because the hardware does not display it. There is no visible difference between a card rendering 60 fps and one rendering 400 fps on a 60Hz screen; if you think you can see a higher fps, it's purely placebo. The same goes for 120 vs 400 fps on a 120Hz screen.
If your card's minimum fps is higher than the max Hz of your screen, you cannot see the extra frames. They are never actually displayed, so they are only useful for epeen numbers in benchmarks.
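If it helps to see the arithmetic, here's a rough Python sketch (nothing to do with any real graphics API; the function name and numbers are made up for illustration) that counts how many distinct rendered frames can actually reach a fixed-refresh screen:

```python
def displayed_frames(render_fps: float, refresh_hz: float, seconds: float = 1.0) -> int:
    """Count how many distinct rendered frames get scanned out when the display
    refreshes refresh_hz times per second and, at each refresh, shows whatever
    frame the renderer finished most recently."""
    shown = set()
    refreshes = int(refresh_hz * seconds)
    for r in range(refreshes):
        t = r / refresh_hz                  # time of this refresh
        shown.add(int(t * render_fps))      # index of the newest completed frame
    return len(shown)

for fps in (60, 120, 400):
    print(f"{fps} fps rendered on a 60Hz screen -> {displayed_frames(fps, 60)} frames shown")
# Every case prints 60: the screen physically cannot show more than that.
```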
There is an exception to the above, which is when vsync is not enabled. If your card is rendering more fps than your display can output, you get tearing, which doesn't increase the fps you see, it just makes a mess.
In the example shown above you can see the card has pushed parts of multiple frames into the same displayed image, resulting in an image worse than what you would see if the fps were locked to the Hz of the monitor.
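Here's an equally rough sketch of that vsync-off case, assuming a simplified model where the screen scans rows top to bottom over one refresh while the card keeps overwriting the framebuffer at its own rate; the row count and fps figures are made up for illustration:

```python
def frames_in_one_refresh(render_fps: float, refresh_hz: float, rows: int = 1080) -> int:
    """Count how many different rendered frames contribute rows to a single
    refresh when the framebuffer is overwritten mid-scanout (no vsync)."""
    refresh_time = 1.0 / refresh_hz
    sources = set()
    for row in range(rows):
        t = (row / rows) * refresh_time     # moment this row is scanned out
        sources.add(int(t * render_fps))    # frame sitting in the buffer then
    return len(sources)

print(frames_in_one_refresh(400, 60))  # ~7 partial frames stitched into one image (tearing)
print(frames_in_one_refresh(60, 60))   # 1 in this idealised model where frame times line up
```

That stitched-together image is the "mess" above: more frames rendered, but nothing extra you can actually use.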
To sum up: 60Hz = 60fps max, 120Hz = 120fps max.