Parallel function on website


I have a class that generates an .xls file for download. The problem is that it has more than 30,000 entries, so it takes about 10 minutes to generate, and during that time I have to keep the page open and loading. I remember file-conversion sites where, once the upload finished, I could close the page, close the browser, and even shut down the computer, and when the process was finished I would receive an email with my file attached.

My intention is to do the same with this file: the user clicks to download, is redirected to a page saying he will receive the file by email, and meanwhile he can close the page and do something else.

In my searches I ended up finding pcntl_fork(), which seemed perfect, until I found out that it only works when the .php file is run directly from the terminal. I have not found anything that works on websites.
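For reference, a minimal sketch of what pcntl_fork() looks like in a CLI script. This is illustrative only: it requires the pcntl extension, which typical web SAPIs (mod_php, PHP-FPM) do not load, which is exactly why it cannot be used from a web page.

```php
<?php
// CLI-only sketch: pcntl_fork() duplicates the current process.
// Most web SAPIs do not ship the pcntl extension, so guard for it.
if (!function_exists('pcntl_fork')) {
    echo "pcntl is not available in this SAPI\n";
    $finished = true;                       // nothing to fork
} else {
    $pid = pcntl_fork();
    if ($pid === -1) {
        fwrite(STDERR, "fork failed\n");
        $finished = false;
    } elseif ($pid === 0) {
        // Child process: the long-running job (e.g. building the
        // .xls) would go here.
        exit(0);
    } else {
        pcntl_wait($status);                // parent: reap the child
        $finished = true;
    }
}
```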

asked by anonymous 14.10.2016 / 20:11

2 answers


I solved it very simply. I changed the "generate .xlsx" button so that, instead of generating the file in the request itself, it calls the generating script from the terminal with shell_exec("php nomeDoArquivo.php"). But shell_exec() still waits for the command's output, so it kept taking just as long.

To solve this, since the terminal output does not interest me¹, I discard it by modifying the command to:

$your_command = "php nomeDoArquivo.php";
// Redirect stdout and stderr to /dev/null and append "&" so the
// shell returns immediately instead of waiting for the script.
shell_exec( $your_command . " > /dev/null 2>/dev/null &" );

Source: Mat

[1]: Be very careful when discarding the output: if the process fails, you will never know. Since my script already sends an email with the attachment, I set it up so that, on failure, it emails me the error message instead.
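A hedged sketch of that error handling inside the backgrounded worker: since the terminal output is discarded, the script itself must report failures. The function names generateXlsx() and sendMail(), and the email addresses, are placeholders, not names from the original post.

```php
<?php
// Worker-side error handling for a detached job: on success, email
// the file to the user; on failure, email the error to the
// maintainer instead of letting it vanish into /dev/null.
function runExportJob(callable $generateXlsx, callable $sendMail): string
{
    try {
        $file = $generateXlsx();                       // the slow step
        $sendMail('user@example.com', 'Your export is ready', $file);
        return 'sent';
    } catch (Throwable $e) {
        $sendMail('admin@example.com', 'Export failed', $e->getMessage());
        return 'failed';
    }
}
```

Called from the CLI entry point (nomeDoArquivo.php in the answer above), this keeps the web request fast while still surfacing errors.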

21.10.2016 / 15:44

One solution, perhaps not very elegant, would be to put the request in a queue (managed in a database table and/or a directory) and have a PHP script, run from a cronjob, do the work of generating the file and emailing it from time to time.
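A minimal sketch of that queue, under stated assumptions: the table name export_queue, its columns, and SQLite (used here as an in-memory stand-in for the real database) are all illustrative choices, not details from the answer.

```php
<?php
// Queue-based approach: the web request inserts a cheap row, and a
// cron-driven worker does the heavy lifting later. A crontab entry
// that would drive it, e.g. every 5 minutes:
//   */5 * * * * php /path/to/process_queue.php
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec("CREATE TABLE export_queue (
    id INTEGER PRIMARY KEY,
    email TEXT NOT NULL,
    status TEXT NOT NULL DEFAULT 'pending')");

// The web request only does this insert, then tells the user the
// file will arrive by email:
$pdo->prepare("INSERT INTO export_queue (email) VALUES (?)")
    ->execute(['user@example.com']);

// The cron-run worker picks up pending jobs and marks them done:
$jobs = $pdo->query("SELECT id, email FROM export_queue WHERE status = 'pending'")
            ->fetchAll(PDO::FETCH_ASSOC);
foreach ($jobs as $job) {
    // ... generate the .xls and email it to $job['email'] here ...
    $pdo->prepare("UPDATE export_queue SET status = 'done' WHERE id = ?")
        ->execute([$job['id']]);
}
```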

14.10.2016 / 22:03