I just noticed that if I turn on V-Sync in the options, the update rate on my net_graph display drops to 60 (the refresh rate of my monitor). The same thing happens if I turn V-Sync off in-game and instead force it on through the NVIDIA control panel.
I believe the update rate is the number of times per second that my computer (the client) sends an update to the server about my position, movement, firing angles, and so on. Does this mean that turning V-Sync on puts me at a (slight) disadvantage?
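For context, here is my rough mental model of what one of those client updates contains. This is only a sketch; the field names are my own, loosely modelled on what I've read about Source's user commands, not the actual engine code:

```cpp
// Hypothetical shape of a single client->server update ("user command").
// Field names are illustrative, loosely based on descriptions of Source's CUserCmd.
#include <cstdint>

struct UserCmd {
    int32_t  command_number;  // sequence number so the server can order/ack commands
    int32_t  tick_count;      // client tick this command was created on
    float    viewangles[3];   // where I'm aiming (pitch, yaw, roll)
    float    forwardmove;     // WASD movement expressed as analog axes
    float    sidemove;
    uint32_t buttons;         // bitfield: attack, jump, duck, use, ...
};
```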
Has anyone else noticed this? Why should the frequency at which I send data to the game server depend on my frame rate? Surely my mouse movements and keyboard inputs are the actual content of the data being sent, so as long as there is data to send (and there is!), shouldn't I still be updating at 100?
This obviously applies to a 100-tick server.
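To make my confusion concrete, here is a toy loop showing what I think is going on, assuming the engine only builds and sends one command per rendered frame. That assumption is the whole question, and the names and numbers are mine, not the engine's:

```cpp
// Toy model: if commands are only generated once per rendered frame,
// capping the frame rate (V-Sync at 60 Hz) also caps the send rate,
// regardless of the server tick rate or the rate I asked for.
#include <algorithm>
#include <iostream>

int main() {
    const int server_tickrate = 100;  // 100-tick server
    const int desired_cmdrate = 100;  // what I'd like to send (updates/sec)

    for (int fps : {300, 100, 60}) {  // 60 = V-Sync on a 60 Hz monitor
        // One command per frame, never more than the rate I asked for,
        // and never more than the server can consume.
        int effective_rate = std::min({fps, desired_cmdrate, server_tickrate});
        std::cout << fps << " fps -> " << effective_rate
                  << " updates/sec sent to the server\n";
    }
}
```

If that model is right, then with V-Sync on I'm physically limited to 60 updates per second on a 100-tick server, which is exactly what net_graph seems to be showing.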
Any comments? I don't know an awful lot about this; a friend mentioned it to me. It seems to be a trade-off between tearing (with V-Sync off) and a lower update rate (with V-Sync on).