If I am playing a game at 60 FPS on a 144Hz monitor, would I still get 144Hz?
Think of it as your graphics card outputting 60 fps, but the monitor taking snapshots of this 144 times a second.
Now say in a different game your graphics card was able to output 90 fps. With a 144Hz monitor you would still be able to see all of those 90 frames, because 144 is higher than 90.
Now imagine the same situation but with a 60Hz monitor. Because your refresh rate (60) is now lower than your fps (90), some of those frames will go to waste because your refresh rate can't keep up.
So having a higher refresh rate makes it possible to see more of the fps your card is outputting. Your graphics card can be outputting 200 fps, but on a 60Hz monitor you'll only ever see a maximum of 60 distinct frames per second. A 144Hz monitor therefore raises that upper limit to 144.
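If it helps to see the 'snapshot' idea in action, here's a rough Python sketch of a monitor sampling whatever frame the graphics card last finished. The numbers are just the examples from above, not measurements from real hardware:

```python
# Toy model of the 'snapshot' analogy: once per refresh, the monitor shows
# whatever frame the GPU most recently completed. Counting the distinct
# frames that actually reach the screen shows the refresh-rate ceiling.

def distinct_frames_seen(fps, refresh_hz, seconds=1.0):
    """Count how many different GPU frames the monitor actually displays."""
    seen = set()
    for i in range(int(refresh_hz * seconds)):
        # At the i-th snapshot, the frame on screen is the latest one the
        # GPU has finished so far: frame number floor(elapsed_time * fps).
        seen.add(int(i * fps / refresh_hz))
    return len(seen)

for fps in (60, 90, 200):
    for hz in (60, 144):
        n = distinct_frames_seen(fps, hz)
        print(f"{fps} fps on a {hz}Hz monitor -> {n} distinct frames shown per second")
```

Running it gives the numbers from the explanation: 90 fps fits comfortably inside 144Hz, while 90 or 200 fps on a 60Hz monitor still only shows 60 distinct frames per second.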
Keeping with this 'snapshot' analogy, you can also appreciate that there can be a syncing problem between your fps and refresh rate. If your refresh rate does not match your fps exactly, i.e. each new snapshot does not coincide perfectly with a new frame, then your graphics card can swap to a new frame partway through a snapshot, and the end result is that you'll see a snapshot made from two different frames. This is called screen tearing and is highly distracting.
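To make the tearing part concrete, here's a tiny sketch (the split point is an arbitrary illustration, not a real measurement). The monitor reads the image top to bottom during a refresh, so if the graphics card swaps frames partway through that read, the top of the snapshot comes from the old frame and the bottom from the new one:

```python
# Rough sketch of screen tearing: the frame swap lands partway through the
# monitor's top-to-bottom scanout, so one snapshot mixes two frames.

SCREEN_ROWS = 1080            # rows scanned out per refresh
flip_at_fraction = 0.4        # GPU delivers a new frame 40% into the scanout (made up)

tear_row = int(SCREEN_ROWS * flip_at_fraction)
snapshot = (["frame N"] * tear_row +                     # rows already scanned: old frame
            ["frame N+1"] * (SCREEN_ROWS - tear_row))    # remaining rows: new frame

print(f"Rows 0-{tear_row - 1} show frame N, rows {tear_row}-{SCREEN_ROWS - 1} "
      f"show frame N+1 -> a visible tear at row {tear_row}")
```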
To combat this you can use vsync, where your graphics card waits for your screen to refresh (i.e. take a snapshot) before sending a new frame. This can work really well, but as you can imagine it introduces some lag during the 'waiting' process. On the other hand, if your graphics card can't keep up with the monitor's refresh rate, it will just send the same frame as before and keep doing so for each subsequent refresh until the new frame is ready. This creates a very nasty juddering effect, because the fps is momentarily reduced to a factor of your refresh rate. So if your refresh rate is 60 but your graphics card can't keep up and has to send the same frame again, you'll effectively be seeing 30 fps for that brief moment. Similarly, if your graphics card still doesn't have a new frame ready for the next refresh, it has to send the same frame once again, meaning your fps effectively drops to 20 for that moment. This is why you see juddering.
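The judder numbers (60, then 30, then 20) fall out of simple division: with vsync on, each frame stays on screen for a whole number of refreshes, so the momentary frame rate can only be the refresh rate divided by an integer. A quick sketch of that ladder:

```python
# With vsync, a frame is held for a whole number of refresh intervals, so the
# momentary frame rate on a 60Hz monitor can only be 60 / 1, 60 / 2, 60 / 3, ...

refresh_hz = 60
for refreshes_held in range(1, 5):
    effective_fps = refresh_hz / refreshes_held
    print(f"frame held for {refreshes_held} refresh(es) -> "
          f"effectively {effective_fps:.0f} fps for that moment")
```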
Alternatively you can use something like g-sync (Nvidia) or freesync (AMD), which instead of having the graphics card wait for the monitor, makes the monitor wait for the graphics card. In other words, the monitor only refreshes the screen when it detects a new frame from the graphics card. This is a far more elegant solution, as your graphics card is free to output frames without having to wait for the next refresh, meaning you won't experience the lag and juddering associated with vsync. Of course you still have a maximum refresh rate of 60 or 144 (or whatever your monitor supports), so if your graphics card's fps breaches that limit, it will have to either switch to normal vsync or disable the syncing altogether and reintroduce screen tearing. For g-sync and freesync you get to choose between these two behaviours. Also, being able to use g-sync or freesync requires both a compatible monitor and a compatible graphics card.
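If you want to see the timing difference, here's a hedged little sketch comparing when frames appear under plain vsync versus adaptive sync (g-sync/freesync style). The frame completion times are made up purely for illustration:

```python
# Simplified comparison: plain vsync holds a finished frame until the next tick
# of a fixed refresh clock; adaptive sync refreshes as soon as the frame is
# ready, limited only by the panel's minimum refresh interval.

import math

refresh_hz = 144
min_gap = 1.0 / refresh_hz                    # the monitor can't refresh faster than this
frame_ready = [0.010, 0.022, 0.041, 0.055]    # hypothetical GPU finish times in seconds

prev_adaptive = -min_gap
for ready in frame_ready:
    # Plain vsync: the frame waits for the next tick of the fixed refresh clock.
    vsync_shown = math.ceil(ready / min_gap) * min_gap
    # Adaptive sync: shown as soon as it's ready, provided the minimum refresh
    # interval has passed since the previous frame was shown.
    adaptive_shown = max(ready, prev_adaptive + min_gap)
    prev_adaptive = adaptive_shown
    print(f"frame ready at {ready*1000:5.1f} ms -> "
          f"vsync shows it at {vsync_shown*1000:5.1f} ms, "
          f"adaptive sync at {adaptive_shown*1000:5.1f} ms")
```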