Several blogs/forums suggest applying a constant time lag (say 100ms) between the server time and the client time, since humans can adjust to a constant input lag more easily than to random jitter.
For more details, refer to section 7.2 of this blog. (I am not referring to interpolation and other such techniques.)
But in my case, I am trying to create a p2p game where the position (and other properties) of the sprites are synced between the players, and that sync needs some time to complete. So I am trying to implement a delay on when the sprites are rendered. The problem is that I want to use Phaser physics to update the state, but render from old states.
For example, suppose the game physics steps occur at a constant delta of 16ms, but at render time I want to draw the sprite as it was 160ms ago (a constant delay of 160ms, or 10 steps).
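Something like the following ring-buffer of snapshots is what I have in mind; a minimal sketch, where the class and method names (`DelayedStateBuffer`, `recordSnapshot`, `getDelayedSnapshot`) are just placeholders I made up:

```typescript
// Sketch: a fixed-size ring buffer of past sprite states.
// recordSnapshot() would be called once per physics step (every 16ms);
// the render loop reads the state from delaySteps steps ago.

interface SpriteSnapshot {
  x: number;
  y: number;
  rotation: number;
  // ...any other synced properties
}

class DelayedStateBuffer {
  private readonly buffer: SpriteSnapshot[];
  private head = 0;  // index where the next snapshot is written
  private count = 0; // how many snapshots have been recorded so far

  constructor(private readonly delaySteps: number) {
    // One extra slot so the delayed read never collides with the write head.
    this.buffer = new Array(delaySteps + 1);
  }

  // Called from the fixed-step physics update.
  recordSnapshot(state: SpriteSnapshot): void {
    this.buffer[this.head] = { ...state };
    this.head = (this.head + 1) % this.buffer.length;
    this.count++;
  }

  // Called from the render loop: returns the state from delaySteps ago,
  // or the oldest available state while the buffer is still filling up.
  getDelayedSnapshot(): SpriteSnapshot | undefined {
    if (this.count === 0) return undefined;
    const stepsBack = Math.min(this.delaySteps, this.count - 1);
    const index =
      (this.head - 1 - stepsBack + this.buffer.length * 2) % this.buffer.length;
    return this.buffer[index];
  }
}
```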
I am trying to figure out the best way to achieve this in Phaser, in particular how to wire such a buffer into the scene's update/render cycle.
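To make the question concrete, here is roughly how I imagine the wiring in a Phaser 3 scene, reusing the `DelayedStateBuffer` above. This is only a sketch: it assumes one physics step per `update()` call for simplicity (if the physics rate differs from the render rate, the snapshot would have to be recorded from the physics world's step event instead), and it assumes a `'player'` texture has already been loaded:

```typescript
import Phaser from 'phaser';

const DELAY_STEPS = 10; // 10 steps * 16ms = 160ms

class DelayedRenderScene extends Phaser.Scene {
  private physicsSprite!: Phaser.Physics.Arcade.Image;
  private displaySprite!: Phaser.GameObjects.Image;
  private history = new DelayedStateBuffer(DELAY_STEPS);

  create(): void {
    // The "true" object: simulated by Arcade physics but never shown.
    this.physicsSprite = this.physics.add
      .image(100, 100, 'player')
      .setVisible(false);
    this.physicsSprite.setVelocity(100, 0);

    // The visible object: drawn from the delayed state only.
    this.displaySprite = this.add.image(100, 100, 'player');
  }

  update(): void {
    // Record the current physics-driven state...
    this.history.recordSnapshot({
      x: this.physicsSprite.x,
      y: this.physicsSprite.y,
      rotation: this.physicsSprite.rotation,
    });

    // ...and render the state from DELAY_STEPS steps ago.
    const past = this.history.getDelayedSnapshot();
    if (past) {
      this.displaySprite.setPosition(past.x, past.y);
      this.displaySprite.setRotation(past.rotation);
    }
  }
}
```

Is splitting the object into a hidden physics-driven sprite and a separate display sprite like this a reasonable approach, or is there a better way?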