Is this question for real? You want to know how I can tell? Well, if you can't see it yourself, then even with 40 years' worth of viewing experience, your eyes aren't the ones to judge.
But as they say, you're never too old to learn, so here goes! Since the electrical system runs at 60 Hz (in the US; 50 Hz in the UK), TVs are designed to refresh their screens at that same frequency. Developers once used to set matching the TV refresh as their goal, getting games to run at 60 fps to line up with the display. The end result is super fluid, smooth gameplay.

Unfortunately, too many shortcuts get taken: unless system resources are pushed to their max to handle vsync being on, which carries its own overhead, frame rates will fluctuate, so they lock them in at 30 fps instead, saving time and money. That's evident in 80% of titles these days, shipped just to get 'em out the door on time. And no one seems to care, or worse, even see the difference. Umm, a little LIKE YOU!
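To make concrete what "locking it in at 30 fps" actually means, here's a minimal sketch of a frame limiter in C++. The `update_game()` and `render_frame()` functions are hypothetical stand-ins for an engine's per-frame work, not any real library's API; the point is only to show how a cap works: each frame is given a fixed time budget, and the loop sleeps away whatever is left over, so the game never presents faster than 30 fps even when the hardware could.

```cpp
#include <chrono>
#include <thread>

// Hypothetical per-frame work; in a real engine these would be the
// simulation update and the draw/present calls.
void update_game() { /* advance simulation state */ }
void render_frame() { /* draw and present the current frame */ }

int main() {
    using clock = std::chrono::steady_clock;

    // Target 30 fps: each frame gets 1/30 s, roughly 33.3 ms.
    constexpr auto frame_budget = std::chrono::microseconds(33333);

    auto next_frame = clock::now();
    for (int frame = 0; frame < 300; ++frame) {   // run ~10 seconds, then exit
        update_game();
        render_frame();

        // Sleep until the next 33.3 ms boundary. Even if rendering finished
        // early, the frame is held back -- which is exactly why a 30 fps cap
        // feels less fluid than a game actually hitting the display's 60 Hz.
        next_frame += frame_budget;
        std::this_thread::sleep_until(next_frame);
    }
    return 0;
}
```

A 60 fps target works the same way with a ~16.7 ms budget; the catch is that every single frame's work has to fit inside that budget, or vsync forces a missed refresh and the motion stutters, which is why studios under time pressure settle for the roomier 30 fps cap.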