Higher refresh rates are better. Full stop. But at what point is it 'good enough'?
Because CRTs scan the image line by line, most of the screen is dark between frames/fields, so we see flicker at 60Hz. Still, 60Hz was the industry standard. Most people couldn't stand it and moved to 72/75/85Hz as soon as possible. I used to run my Diamondtron at 100Hz.
As soon as LCDs started taking over, nobody complained about flicker any more, because the image is simply left in place until it gets replaced (this is called 'sample and hold'). There is no black image between frames.
However, people suffer from what is known as the 'sample-and-hold effect'. Our eyes track moving objects, and on an impulse display like a CRT each frame is only flashed briefly, so tracked motion looks sharp. On a sample-and-hold display the frame stays put while your eyes keep moving, so we see motion blur or smear instead. This annoys different people to different degrees (a bit like DLP rainbows do).
This artifact is even more apparent on 24p material such as Blu-ray or properly deinterlaced film DVDs, because each frame is held on screen for that much longer.
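To put some very rough numbers on the smear (just a back-of-envelope sketch - it assumes your eyes track the object perfectly, the panel holds each frame for the whole refresh period, and the 1000 px/s speed is simply a number I picked):

```
# Rough estimate of sample-and-hold motion smear: while your eye tracks a
# moving object, the frame sits still for the whole hold time, so the image
# is smeared across the retina by roughly (speed x hold time).
# Assumptions: perfect eye tracking, full-period hold, example speed 1000 px/s.

def smear_pixels(speed_px_per_s: float, frame_rate_hz: float) -> float:
    hold_time_s = 1.0 / frame_rate_hz
    return speed_px_per_s * hold_time_s

for label, rate in [("24p film", 24), ("60Hz LCD", 60), ("120Hz LCD", 120)]:
    print(f"{label}: ~{smear_pixels(1000, rate):.0f} px of smear at 1000 px/s")

# 24p film:  ~42 px of smear at 1000 px/s
# 60Hz LCD:  ~17 px of smear at 1000 px/s
# 120Hz LCD: ~8 px of smear at 1000 px/s
```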
So - now that the LCD marketplace is maturing, manufacturers in the home entertainment arena are starting to use refresh rates greater than 50/60Hz and interpolate new frames in between. See Sony MotionFlow, for example, and similar tech from pretty much all the major players this year. It seems 96Hz is being settled on as a minimum (so that's what 24p goes to), with PAL going to 100Hz and NTSC to 120Hz - each a whole multiple of the source frame rate.
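As a very crude sketch of what the interpolation is doing - note this is only a naive cross-fade between neighbouring frames; the MotionFlow-style processors in real TVs estimate motion vectors and shift pixels along them, which is far harder:

```
import numpy as np

def interpolate_frames(frames: list[np.ndarray], factor: int) -> list[np.ndarray]:
    """Insert (factor - 1) blended frames between each pair of source frames,
    e.g. factor=4 to take 24p up to 96Hz. This is a plain cross-fade only -
    real TV processors estimate motion and shift pixels instead."""
    out = []
    for a, b in zip(frames, frames[1:]):
        for i in range(factor):
            t = i / factor
            out.append((1 - t) * a + t * b)   # linear blend between frames
    out.append(frames[-1])
    return out

# 24 source frames -> (24-1)*4 + 1 = 93 output frames here; on a continuous
# stream this approaches 4x the input rate (24p -> 96Hz).
src = [np.full((2, 2), float(i)) for i in range(24)]
print(len(interpolate_frames(src, 4)))
```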
(It should be said that not everyone likes the effect of the interpolation - especially with movies - so some are experimenting with other methods; Sony's 'Dark Frame Insertion' is one.)
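For what it's worth, the generic version of that idea looks something like this (I'm not claiming this is exactly how Sony's implementation works - it's just black-frame insertion as I understand it, with a 50% duty cycle assumed):

```
# Generic black-frame insertion idea: run the panel at 2x the source rate and
# show a dark frame every other refresh. The image is only held for half the
# frame period, so the tracked smear from the sketch above roughly halves,
# at the cost of brightness and some reintroduced flicker.

def smear_with_bfi(speed_px_per_s: float, frame_rate_hz: float,
                   duty_cycle: float = 0.5) -> float:
    # duty_cycle = fraction of the frame period the real image is on screen
    return speed_px_per_s * (duty_cycle / frame_rate_hz)

print(smear_with_bfi(1000, 60))   # ~8 px instead of ~17 px at 60Hz
```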
BUT... most computer displays aren't showing moving images all the time - the vast majority of users still look at static apps all day long. You would only see the negative effects on moving images (especially 24p) and games (less so, because you will likely be running at 60p). The computer market is even more price-sensitive than the home entertainment market, and nobody is really driving for this to happen.
Someone in a company will suggest a higher refresh rate for their new monitor, and then a project manager will ask the techies how much more it will cost to implement (it might need a different Genesis scaling chip, a different input board, a higher-spec panel, plus more dev/testing time). So they come up with a price.
They then go to marketing and ask, "How many more units will we sell if we have this feature?" At the moment, I'm guessing the answer will come back very low indeed, so it doesn't happen - just look at this thread for evidence.
Whereas if ATI/NVIDIA/IBM/etc. had an effective interpolation feature and the points above were more widely known, then maybe they could sell an extra 10-20%. That might start making things happen.
The good thing about a PC is that it's one of the few areas where you actually have a variable frame-rate source, unlike video, where we are stuck with 24/50/60 for the foreseeable future.
BTW - there are plenty of 75+Hz-capable displays out there, but they are all aimed at the high-end CAD/video market and cost roughly four times the price of an equivalent Dell.
Anyway - thought it might be useful to some

I'm sure we'll get to a point where 60Hz isn't seen as enough any more.