Skill is obviously #1 here; a crap player will be just as crap at 60Hz as they would at 240Hz. The issue I've noticed as a twitch-FPS gamer (Quake etc.) is with two equally skilled players, one on a 60Hz monitor and the other on a 120Hz monitor: when you're whizzing around the map rocket jumping, trying to hit a target on the other side of the map as they zip past a small doorway, the one with the 120Hz monitor WILL have the advantage, both in latency and in smoother motion/more frames.
The latency argument really has two parts. There's certainly a noticeable difference in input latency between monitors, more than you'd think, especially if you've been gaming on PC for a long time, and it can mean the difference between winning and losing a game. Not because "OMG the lagz!!111", but because of motion-to-photon latency, which we're very perceptive to. It is possible to adjust to higher latency and still thrash similarly skilled players when we're talking 10-20ms total latency (that's input latency, frametime latency and pixel latency combined); any higher and it will start to affect your skill if you're in the top bracket of players.

Having tried 50Hz, 60Hz, 75Hz, 90Hz, 100Hz, 120Hz, 144Hz and 240Hz, the most noticeable difference, both visually and in skill improvement/comfort, was between 60 and 100; much above 100 the returns diminish sharply. I'd say 120Hz is a nice spot to aim for as it's 2x 60Hz: videos/cutscenes, or even windowed games, that play at 30fps or 60fps get a whole number of refreshes per frame, so you get even, smooth motion. Something like 90Hz means some frames have to either wait or skip and you get micro-stuttering. Again, that's only a minor issue, and only with locked-fps games or videos/cutscenes.
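If you want to see both effects in numbers, here's a quick back-of-envelope sketch. The 5ms input-chain and 4ms pixel-response figures are assumptions I picked for illustration (real values vary per mouse/panel), and the helper names (motion_to_photon_ms, display_pattern) are just mine, not from any library:

    import math

    # --- 1. Motion-to-photon latency budget (illustrative numbers) ---
    # total ~= input-chain latency + frametime latency + pixel response
    def motion_to_photon_ms(input_ms, refresh_hz, pixel_ms):
        frametime_ms = 1000.0 / refresh_hz  # worst case: wait a full refresh
        return input_ms + frametime_ms + pixel_ms

    for hz in (60, 90, 120, 240):
        total = motion_to_photon_ms(input_ms=5.0, refresh_hz=hz, pixel_ms=4.0)
        print(f"{hz:>3} Hz -> ~{total:.1f} ms motion-to-photon")

    # --- 2. Frame pacing: why 120Hz is smooth for 60fps content ---
    # A frame can only appear on a refresh tick, so each content frame
    # stays on screen for a whole number of ticks. Integer math here
    # avoids floating-point rounding surprises with ceil().
    def display_pattern(content_fps, refresh_hz, n=6):
        appear = [-(-i * refresh_hz // content_fps) for i in range(n + 1)]
        return [appear[i + 1] - appear[i] for i in range(n)]

    print("60 fps @ 120 Hz:", display_pattern(60, 120))  # [2, 2, 2, 2, 2, 2]
    print("60 fps @  90 Hz:", display_pattern(60, 90))   # [2, 1, 2, 1, 2, 1]

With those assumed numbers, 60Hz lands around ~26ms total and 120Hz around ~17ms, while 120Hz to 240Hz only saves another ~4ms, which lines up with the diminishing returns above 100-120. And the pacing output shows the micro-stutter directly: at 120Hz every 60fps frame gets exactly 2 refreshes, at 90Hz frames alternate between 1 and 2.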