After a quick scan of the thread I think I understand the VSYNC issue in relation to microstutter:
Say you have a display running at 60 Hz; your graphics card will want to output 60 fps. What if it can't do 60 fps? Well, 55 fps is no good because it would cause tearing, which is exactly what VSYNC is trying to avoid. So it has to drop to the next framerate it can handle while still staying in sync, which (with double buffering) is half the refresh rate: 30 fps. So because you drop from 60 -> 30 and back up to 60, you've effectively "microstuttered".
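Here's a rough sketch of the idea in Python (my own toy model, not real driver code): with double buffering, a finished frame can only be swapped in on a vblank, so a frame that takes even slightly longer than one refresh interval has to wait for the *next* vblank, and the effective framerate drops to a whole divisor of the refresh rate.

```python
import math

# Toy model of double-buffered VSYNC: swaps only happen on vblank boundaries.
REFRESH_HZ = 60
INTERVAL_MS = 1000 / REFRESH_HZ  # ~16.67 ms per refresh

def effective_fps(render_ms):
    """Effective framerate if every frame takes render_ms to draw
    and can only be displayed on the next vblank after it finishes."""
    # How many whole refresh intervals each frame occupies.
    intervals = math.ceil(render_ms / INTERVAL_MS)
    return REFRESH_HZ / intervals

print(effective_fps(10))  # fits in one interval -> 60.0 fps
print(effective_fps(18))  # just misses 16.67 ms -> 30.0 fps
print(effective_fps(35))  # spills into a third interval -> 20.0 fps
```

So a card that "almost" manages 60 fps (say, 18 ms per frame instead of 16.67 ms) doesn't get 55 fps under VSYNC, it gets 30, which is why the drop feels so abrupt.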
So if you take a game like HL2, it's easy for your card to output 60 fps all the time. I dunno what GRID is like, but I'd suspect it's harder to pull off the frames required to stay at 60 fps.
Please don't kill me if I'm wrong, but I think that's how it works!