I need to adapt my code to calculate the difference between the clock of the server hosting the site and the clock of the user's computer, so that I don't have to query the server constantly and overload my system.
I am setting up a table that will show the time of another server, and it needs to update every second.
So I need three pieces of data:

- the time of the site server (the main reference);
- the time of the user's device;
- the fixed offset between the game server and the site server.
I have defined the third item, the game-server offset, in JavaScript as follows:
<script>
// Offset, in seconds, of the game server relative to the site server.
var difJogo = -145;
</script>
This means the game server is 145 seconds behind the site server (the main reference). For example, when the site server reads 12:00:00, the game server reads 11:57:35.
To get the time of the site server I use:
<?php
// Site-server time, pinned to the São Paulo time zone for everyone.
date_default_timezone_set('America/Sao_Paulo');
$cH = date('G'); // hours, 0-23, no leading zero
$cM = date('i'); // minutes, with leading zero
$cS = date('s'); // seconds, with leading zero
echo $cH .':'. $cM .':'. $cS;
?>
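Since that formatted string is awkward to do arithmetic on, I was thinking the server clock could also be exposed as a Unix timestamp; a minimal sketch, where the variable name serverNow is my own:

<script>
// Unix timestamp (seconds) of the site server at the moment the page
// was rendered; PHP's time() is timezone-independent.
var serverNow = <?php echo time(); ?>;
</script>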
And to get the time of the user's device I use:
<script>
function hat() {
    // Seconds elapsed since midnight UTC according to the device clock
    // (Date.now() is milliseconds since the Unix epoch).
    var sAg = ( Date.now() / 1000 ) % 86400;
    return sAg; // return the reading so callers can actually use it
}
</script>
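With serverNow from the sketch above, the device/server difference would only need to be measured once, at page load; again, the name devOffset is my own:

<script>
// How many seconds the site-server clock is ahead of (positive) or
// behind (negative) the device clock, captured once at page load.
var devOffset = serverNow - ( Date.now() / 1000 );
</script>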
What I cannot do is put all these pieces together. The site server's time has to be the basis for everything, because people access the site from different parts of the world, and I want everyone to see the same clock, pinned to the São Paulo time zone.
The final logic is: server time - device time + game offset.
With all of that, I need a function that displays the game server's time and refreshes it every second.
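Here is a rough sketch of how I imagine the pieces fitting together, assuming serverNow and devOffset from the sketches above and a target element with id relogio (a name I made up); I am not sure this is the right approach:

<script>
function tickGameClock() {
    // Reconstruct the site-server "now" from the ticking device clock:
    var siteNow = ( Date.now() / 1000 ) + devOffset;
    // Apply the game-server offset (difJogo is -145 here):
    var gameNow = siteNow + difJogo;
    // Render as São Paulo wall-clock time so every visitor sees the same value:
    var text = new Date(gameNow * 1000).toLocaleTimeString('pt-BR', {
        timeZone: 'America/Sao_Paulo',
        hour12: false
    });
    document.getElementById('relogio').textContent = text;
}
setInterval(tickGameClock, 1000); // refresh once per second
tickGameClock(); // draw immediately instead of waiting one second
</script>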