In almost every case, the clients of a server will have different update rates: the rate at which a client sends updates to and receives updates from the server. The server must send updates to a client whenever they are needed, or the client will not know where the other clients are. If a client knew the update rates of the other clients, it could in theory do without the intermediate packets the server generates to fill in the gaps; in practice, however, clients' update rates are always changing, and everything would fall out of sync.
Take two clients as an example: one with an update rate of 60 Hz (updates per second) and the other with an update rate of 30 Hz. If both clients are standing still, the server repeatedly sends the same coordinates to each client at its preferred update rate.
If the client with the higher update rate is moving, the server simply sends every other packet to the client with the lower update rate (as long as the higher rate is exactly twice the lower one).
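Here is a minimal sketch of that per-client pacing. The `clients` list, the `send_update` stand-in, and the field names are all hypothetical, not from the original post; the point is only that when the server ticks at the faster rate, the slower client naturally receives every other update.

```python
import time

# Hypothetical client records: each client has a preferred update rate in Hz.
clients = [
    {"name": "fast", "rate_hz": 60, "last_sent": 0.0},
    {"name": "slow", "rate_hz": 30, "last_sent": 0.0},
]

def send_update(client, state):
    # Stand-in for the real network send; here we just print.
    print(f"-> {client['name']}: {state}")

def broadcast(state):
    """Send the current world state to each client at its own rate."""
    now = time.monotonic()
    for client in clients:
        interval = 1.0 / client["rate_hz"]
        if now - client["last_sent"] >= interval:
            send_update(client, state)
            client["last_sent"] = now

# Driven from a hypothetical 60 Hz server loop:
# while True:
#     broadcast(current_world_state())
#     time.sleep(1 / 60)
```

Driven at 60 Hz, the 30 Hz client is served on roughly every other tick, which is the "every other packet" behaviour described above.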
If the client with the lower update rate decides to move, the server must extrapolate its position to send to the client with the higher update rate. Halfway through the slower client's update interval, the server knows that client's most recent reported position and the position it reported before that. From these two samples it can estimate where the client is now, and where it will be if the client lags a bit. Subtracting the earlier position from the later one gives the client's displacement over one update interval, and dividing by that interval gives its velocity; within reason, since the client may have changed direction since its last report. The server then adds this velocity, scaled by the time elapsed since the last report, to the last known position to get where the client is now.
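A minimal sketch of that extrapolation step, with hypothetical names (`extrapolate`, per-axis tuples for positions) that are illustrative rather than taken from the post:

```python
def extrapolate(prev_pos, prev_time, last_pos, last_time, target_time):
    """Estimate where a slow-updating client is at target_time.

    Velocity is derived from the client's last two reported positions;
    the estimate is only trustworthy for a fraction of one update
    interval past last_time (the 'within reason' above).
    """
    dt = last_time - prev_time
    if dt <= 0:
        return last_pos  # no usable history; fall back to last known position
    # Per-axis velocity from the two most recent samples.
    velocity = tuple((l - p) / dt for l, p in zip(last_pos, prev_pos))
    elapsed = target_time - last_time
    return tuple(l + v * elapsed for l, v in zip(last_pos, velocity))

# Example: client reported (0, 0) at t=0.000 s and (1, 0) at t=0.033 s (30 Hz).
# Halfway to its next update (t = 0.050 s) the server estimates:
print(extrapolate((0.0, 0.0), 0.000, (1.0, 0.0), 0.033, 0.050))
# -> roughly (1.5, 0.0): the client is assumed to keep its last velocity.
```

Note that the subtraction is signed, not a "whichever is greater" magnitude: a negative component simply means the client is moving in the negative direction along that axis, and the final addition moves the estimate the right way.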