Decode JSON return faster


The system captures the user's address from the ZIP (CEP) code using the ViaCep service. I validate on the front end with jQuery Validate and on the back end with PHP. The jQuery validation fetches the user's address very quickly and shows it in a disabled field, but the PHP side takes a while to process, which is slowing the script down. What is the best way to speed up reading JSON from a page with PHP?

ViaCep return example:

{
  "cep": "01001-000",
  "logradouro": "Praça da Sé",
  "complemento": "lado ímpar",
  "bairro": "Sé",
  "localidade": "São Paulo",
  "uf": "SP",
  "unidade": "",
  "ibge": "3550308",
  "gia": "1004"
}

URL: link

In PHP:

$cep = 'xxxx';

$cepUrl = "https://viacep.com.br/ws/{$cep}/json/";

$ch = curl_init();

curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_URL, $cepUrl);
$result = curl_exec($ch);

curl_close($ch);
$result = json_decode($result, true);
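A quick note on where the time actually goes: `json_decode()` on a payload this small is effectively instantaneous; the slow part is the HTTP round trip to ViaCep. A sketch of the same request with tight cURL timeouts (the values and helper names are assumptions, not part of the question) keeps a slow remote response from stalling the whole page:

```php
<?php
// Sketch: the bottleneck is the HTTP round trip, not json_decode().
// Helper names and timeout values are illustrative assumptions.

function decode_cep_json($raw)
{
    if ($raw === false || $raw === null) {
        return null;                      // transport failure
    }
    $data = json_decode($raw, true);
    // ViaCep answers {"erro": true} for unknown CEPs.
    return (is_array($data) && empty($data['erro'])) ? $data : null;
}

function fetch_cep($cep)
{
    $ch = curl_init("https://viacep.com.br/ws/{$cep}/json/");
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_CONNECTTIMEOUT => 2,  // seconds to open the connection
        CURLOPT_TIMEOUT        => 4,  // total seconds for the request
    ]);
    $raw = curl_exec($ch);
    curl_close($ch);
    return decode_cep_json($raw);
}
```

With this split, the decoding step can be reasoned about (and tested) separately from the network step.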
    
asked by anonymous 29.11.2016 / 04:10

1 answer


From JavaScript the request is made client-side, by the browser, and once a URL has been fetched the browser can serve its content from the local cache.

On the server side nothing is cached automatically when you access a URL. To optimize the backend, save the result in a cache of your own.

Note also that other factors influence response time, such as the network environment and various other backend processes.

In general, what can greatly optimize the process is to save a result cache.

Of course, you should first evaluate whether caching is feasible. The data returned by the URL may change; a zip code and its address do not change easily, but it is not impossible. That decision is up to your business model.

Enough blah blah blah; let's get down to business.

An optimization, saving results cache with PHP:

$cep = '01001000';

/*
Where the cache will be saved.
It is recommended to organize this better, to avoid tens of thousands of files in a single folder, but that is not the focus of this question.
*/
$file = __DIR__.DIRECTORY_SEPARATOR.'cep/'.$cep.'.php';

if (file_exists($file)) {
    /*
    A cache file was found, so the result is read from it.
    The file is already PHP, so there is no need to convert from JSON to a PHP array.
    */
    $result = include $file;
} else {

    $cepUrl = 'https://viacep.com.br/ws/'.$cep.'/json/';

    $ch = curl_init();

    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_URL, $cepUrl);
    $result = curl_exec($ch);

    curl_close($ch);
    $result = json_decode($result, true);

    /*
    Save the cache already in PHP format, so json_decode() is not needed
    on later reads. var_export() emits valid PHP and escapes quotes, so
    values such as "Praça da Sé" cannot break the generated file.
    */
    $content = '<?php'.PHP_EOL.'return '.var_export($result, true).';';
    file_put_contents($file, $content);
    unset($file, $content);
}

print_r($result);

In the first query, when there is no cache yet, the process takes from 1 to 1.8 seconds. When the result comes from the cache, it takes around 0.0001 to 0.00016 seconds.
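Timings like these can be reproduced with a small `microtime()` wrapper (the helper name is illustrative, not part of the original answer):

```php
<?php
// Illustrative micro-benchmark helper: run a callable once and return
// the elapsed wall-clock time in seconds.
function time_call(callable $fn)
{
    $start = microtime(true);
    $fn();
    return microtime(true) - $start;
}

// Decoding a cached ViaCep payload is effectively instantaneous:
$json = '{"cep":"01001-000","logradouro":"Praça da Sé","uf":"SP"}';
$elapsed = time_call(function () use ($json) {
    json_decode($json, true);
});
printf("decode took %.6f s\n", $elapsed);
```

The interesting comparison is wrapping the cURL request in the same helper: virtually all of the 1 to 1.8 seconds is spent on the network, not on decoding.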

The example above is purely didactic. I suggest a more sophisticated control that checks, for example, the date the cache was saved: if it is older than, say, 6 months, force the script to fetch the data online and refresh the cache. This ensures greater integrity.
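That expiry check can be sketched with `filemtime()`. A minimal version, assuming the same one-file-per-CEP layout as above (the function name and the 6-month figure are illustrative):

```php
<?php
// Didactic sketch of the expiry check suggested above: treat the cache
// file as stale once it is older than $maxAge seconds.
function cache_is_fresh($file, $maxAge)
{
    return file_exists($file) && (time() - filemtime($file)) < $maxAge;
}

// Usage: refresh roughly every 6 months (~15552000 seconds).
// if (!cache_is_fresh($file, 15552000)) { /* fetch online, rewrite cache */ }
```

A missing file counts as stale, so the same branch handles both "no cache yet" and "cache expired".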

In JavaScript you can force no-cache queries like this:

    $().ready(function() {
        $("#button").click(function() {
            $("div").html("");
            var d = new Date();
            var n = "?" + d.getTime(); // unique query string defeats the browser cache
            console.log(n);

            $.getJSON("https://viacep.com.br/ws/01001000/json/" + n, function(result) {
                $.each(result, function(i, field) {
                    $("div").append(field + "<br>");
                });
            });
        });
    });

    <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js"></script>
    <input type="button" id="button" value="request">
    <br>
    <div class="foo"></div>

Even with caching disabled, the browser still responds quickly. There is a small delay from the connection, but it is obviously still much faster than querying from the backend without a cache, because fewer processes are involved.

If you think about it, you could even drop PHP cURL and leave that query job to JavaScript. If you still want to save or access the data on the backend, the browser can then send the JSON to the server, which saves the results. Of course, this should not be open to the public; implement this logic only in private environments where you can trust the user.
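The receiving side of that idea can be sketched as a small endpoint that validates the posted payload before caching it. All names here (the function, the `cep/` directory, the endpoint file) are assumptions for illustration, and real code would need stronger authentication than shown:

```php
<?php
// Hypothetical endpoint helper: the browser fetches ViaCep itself and
// POSTs the raw JSON here; the server validates it and writes the same
// PHP-format cache file used earlier.
function store_cep_payload($json, $dir)
{
    $data = json_decode($json, true);
    // Only accept a payload that at least looks like a ViaCep result.
    if (!is_array($data) || !preg_match('/^\d{5}-?\d{3}$/', $data['cep'] ?? '')) {
        return false;
    }
    $cep = str_replace('-', '', $data['cep']);
    // var_export() emits valid PHP, so quoted values cannot break the file.
    $php = '<?php' . PHP_EOL . 'return ' . var_export($data, true) . ';';
    return file_put_contents($dir . '/' . $cep . '.php', $php) !== false;
}

// In the endpoint script itself (e.g. save_cep.php):
// store_cep_payload(file_get_contents('php://input'), __DIR__ . '/cep');
```

Anything the client sends must be treated as untrusted, which is why the payload is re-validated server-side instead of being written verbatim.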

Anyway, it's just an idea off the top of my head. As mentioned above, you can refine and adapt it to suit your needs.

    
29.11.2016 / 10:00