Running a function asynchronously is one thing; calling something external without waiting for the response (or picking the answer up later) is another. I will not go into detail because, to be honest, I do not know Guzzle.
PHP only really supports threads with pthreads, yet it is possible to simulate something similar. Personally, I do not see much need to run a function asynchronously in a PHP script that serves a web page; maybe I do not fully understand the need for it. I think it would be more interesting in a CLI (command-line interface) script, since it could run multiple tasks and keep running indefinitely (or until all "events" finish).
Calling a PHP script without waiting for it to respond
Need aside, an example of something similar to "asynchronous" (in fact, it is a totally separate process) would be calling a PHP script from another script in CLI mode. It would look something like this:
/**
 * Start a script in another process
 *
 * @param string $script  Path to the script
 * @param string $php_exe Path to the PHP interpreter (optional)
 * @param string $php_ini Path to php.ini (optional)
 * @return void
 */
function processPhpScript($script, $php_exe = 'php', $php_ini = null) {
    $script = realpath($script);
    $php_ini = $php_ini ? $php_ini : php_ini_loaded_file();

    if (stripos(PHP_OS, 'WIN') !== false) {
        /* Windows */
        $exec = 'start /B cmd /S /C ' . escapeshellarg($php_exe . ' -c ' . $php_ini . ' ' . $script) . ' > NUL';
    } else {
        /* unix-like: redirect output and background with & so popen() returns at once */
        $exec = escapeshellarg($php_exe) . ' -c ' . escapeshellarg($php_ini) .
                ' ' . escapeshellarg($script) . ' >/dev/null 2>&1 &';
    }

    $handle = popen($exec, 'r');

    if ($handle) {
        pclose($handle);
    }
}

processPhpScript('pasta/script1.php');
processPhpScript('pasta/script2.php');
On Windows, the scripts will run as if they were launched in CMD with the start command, so there is no waiting:
start /B cmd /S /C "c:\php\php.exe -c c:\php\php.ini c:\documents\user\pasta\script1.php" > NUL
start /B cmd /S /C "c:\php\php.exe -c c:\php\php.ini c:\documents\user\pasta\script2.php" > NUL
In a unix-like environment they will run with their output redirected to /dev/null instead of being returned:
php -c /etc/php/php.ini /home/user/pasta/script1.php >/dev/null 2>&1
php -c /etc/php/php.ini /home/user/pasta/script2.php >/dev/null 2>&1
This way, the script that called the processPhpScript function does not wait for a response.
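To see that the call really does return immediately, here is a small self-contained check (a sketch that assumes a unix-like shell with a sleep command; the 2-second sleep stands in for a slow script):

```php
<?php
// Launch a 2-second command in the background and measure how long
// popen()/pclose() take to return (assumes a unix-like shell).
$start = microtime(true);

$handle = popen('sleep 2 >/dev/null 2>&1 &', 'r');

if ($handle) {
    pclose($handle);
}

$elapsed = microtime(true) - $start;
echo 'Returned after ', round($elapsed, 3), 's', PHP_EOL;
```

Because the shell backgrounds the command and exits right away, pclose() has nothing to wait for and the elapsed time stays far below the 2 seconds the command actually runs.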
cURL and fsockopen
I believe that when we speak of "asynchronous" in Guzzle we are actually talking about external requests whose response you do not have to wait for (because you do not want it), or multiple requests that run concurrently and are delivered as they finish. cURL itself can do something like this:
<?php
// Start two cURL handles
$ch1 = curl_init("https://pt.stackoverflow.com/");
$ch2 = curl_init("https://meta.pt.stackoverflow.com/");

// This part only disables SSL verification; it is just for the example
curl_setopt($ch1, CURLOPT_SSL_VERIFYPEER, 0);
curl_setopt($ch2, CURLOPT_SSL_VERIFYPEER, 0);
curl_setopt($ch1, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch2, CURLOPT_RETURNTRANSFER, 1);

// Create the multi handle and add the cURL handles
$mh = curl_multi_init();
curl_multi_add_handle($mh, $ch1);
curl_multi_add_handle($mh, $ch2);

// Execute the requests simultaneously
$running = null;
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh); // wait for activity instead of busy-looping
} while ($running > 0);

// Clean up the multi handle
curl_multi_remove_handle($mh, $ch1);
curl_multi_remove_handle($mh, $ch2);
curl_multi_close($mh);

// Get the content
$response1 = curl_multi_getcontent($ch1);
$response2 = curl_multi_getcontent($ch2);

// Show the responses
echo $response1, PHP_EOL;
echo $response2, PHP_EOL;
Now, note that we issued two requests at the same time, but to obtain the responses we had to wait for everything to finish. I believe Guzzle's requestAsync works with something like a Promise: whatever it fetches is delivered to a callback as the response. I am not sure exactly how Guzzle does it internally, but something similar can be simulated with fsockopen:
<?php
function createRequest($url, &$errorno, &$errorstr) {
    $parsed = parse_url($url);
    $isHttps = $parsed['scheme'] === 'https';

    $host = ($isHttps ? 'ssl://' : '') . $parsed['host'];
    $port = isset($parsed['port']) ? $parsed['port'] : ($isHttps ? 443 : 80);
    $path = isset($parsed['path']) ? $parsed['path'] : '/';

    $socket = fsockopen($host, $port, $errorno, $errorstr);

    if ($socket) {
        $out  = "GET " . $path . " HTTP/1.1\r\n";
        $out .= "Host: " . $parsed['host'] . "\r\n";
        $out .= "Connection: close\r\n\r\n";

        fwrite($socket, $out);
        return $socket;
    }

    return false;
}

function checkStatus(&$promises, \Closure $done) {
    if (empty($promises)) {
        return false;
    }

    $nocomplete = false;

    foreach ($promises as &$promise) {
        if (feof($promise['socket']) === false) {
            $nocomplete = true;
            $promise['response'] .= fgets($promise['socket'], 1024);
        } else if ($promise['complete'] === false) {
            $promise['complete'] = true;
            fclose($promise['socket']);
            $done($promise['url'], $promise['response']);
        }
    }

    return $nocomplete;
}

function promiseRequests(array $urls, \Closure $done, \Closure $fail)
{
    $promises = array();

    foreach ($urls as $url) {
        $current = createRequest($url, $errorno, $errorstr);

        if ($current) {
            $promises[] = array(
                'complete' => false,
                'response' => '',
                'socket' => $current,
                'url' => $url
            );
        } else {
            $fail($url, $errorno, $errorstr);
        }
    }

    $processing = true;

    while ($processing) {
        $processing = checkStatus($promises, $done);
    }
}

// URLs to request
$urls = array(
    'http://localhost/',
    'http://localhost/inphinit/'
);

promiseRequests($urls, function ($url, $response) {
    var_dump('Success:', $url, $response);
}, function ($url, $errorno, $errorstr) {
    var_dump('Failed:', $url, $errorno, $errorstr);
});
In general, what I did was make the requests run at the same time. What I find interesting is that you can take whichever result finishes first and handle it however you wish, though I think that is not always useful.
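For comparison, in Guzzle itself the same idea is expressed with requestAsync and promises. The following is only a sketch of that style, assuming guzzlehttp/guzzle was installed via Composer (the URL is just an example; the block degrades gracefully when Guzzle is absent):

```php
<?php
// Sketch of Guzzle's requestAsync/Promise style (assumption: guzzlehttp/guzzle
// is installed via Composer; the URL is only an example).
if (file_exists('vendor/autoload.php')) {
    require 'vendor/autoload.php';
}

if (class_exists('GuzzleHttp\Client')) {
    $client = new GuzzleHttp\Client();

    $promise = $client->requestAsync('GET', 'https://pt.stackoverflow.com/');

    // then() registers the callbacks that receive the result later
    $promise->then(
        function ($response) use (&$status) {
            $status = 'Status: ' . $response->getStatusCode();
        },
        function ($reason) use (&$status) {
            $status = 'Failed: ' . $reason->getMessage();
        }
    );

    $promise->wait(false); // drive the request; wait(false) does not rethrow
} else {
    $status = 'Guzzle not installed';
}

echo $status, PHP_EOL;
```

The difference from the fsockopen simulation is that Guzzle's promise object carries the pending result around, so you can compose several requests and only block when you actually call wait().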
Where threads would be interesting
Neither of the examples above uses threads: one is a separate process that avoids waiting (since the call is completely detached), and the other is multiple HTTP requests. The only place where I see that threads might be interesting is a PHP script that keeps running continuously, for example a CLI script, as I already mentioned, or a socket server for WebSocket (which, by the way, is also CLI).

Taking WebSocket as the example: a socket server handles multiple connections and responds to each client only when it wishes. Imagine 5 people connect to the socket through WebSocket and make requests. It would be interesting to move the requests to threads and deliver each one as it finishes; otherwise the server would have to process them in the order they arrived, and a user who made a simple request would end up waiting for the users whose requests take longer to process. With threads this could improve a bit and handle the concurrency (provided, of course, the script is well written).

As soon as possible I will post an example with WebSocket.
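Meanwhile, here is a minimal sketch of what offloading a slow task to a thread looks like with pthreads. It assumes a ZTS PHP build with the pthreads extension loaded; the SlowTask class and the 1-second sleep are illustrative, not part of pthreads (the block falls back to a message when the extension is absent):

```php
<?php
// Minimal pthreads sketch (assumption: ZTS PHP build with the pthreads
// extension; SlowTask is an illustrative class, not part of pthreads).
if (extension_loaded('pthreads')) {
    class SlowTask extends Thread
    {
        public $result;

        public function run()
        {
            sleep(1); // simulate a time-consuming request
            $this->result = 'done';
        }
    }

    $task = new SlowTask();
    $task->start();   // SlowTask::run() executes in a new thread
    // ... the main thread is free to serve other clients here ...
    $task->join();    // collect the result when it is needed
    $status = $task->result;
} else {
    $status = 'pthreads not available';
}

echo $status, PHP_EOL;
```

In a socket server, start() would be called as each request arrives and the results delivered as each thread finishes, instead of blocking the whole loop on one slow client.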
Installing PThreads
It can be installed via PECL, using the command:
pecl install pthreads
But if you do not have PECL: on Windows you can download the binaries here link; if you use macOS or Linux, or there is no binary for your PHP version, then you will have to compile it after downloading link. Of course, the PHP executable must also have been compiled on your machine with the same compiler (I will not go into detail, as that is not the focus of the question).