I would like to know if there is a way to create a queue of processes.
For example, whenever I run the processar.php file, if it is already running, the new run should wait until the other one finishes, executing only when it is "free".
This seems very simple.
From what you described in the question, I do not think you need threads, queues, etc. It is enough to create a flag.
When processo.php
starts, create a flag indicating that it is running.
Example:
/*
Place at the very top of the file, before anything else runs.
*/
if (!file_exists('processo.txt')) {
    // raise the flag
    file_put_contents('processo.txt', 1);
} else {
    exit; // already running
}

/*
... do the actual work here ...
*/

/*
At the very end, when everything is done, remove the flag.
*/
unlink('processo.txt');
A simple hint to improve this is to compare the date and time the flag file was created. If the process takes about 10 minutes on average and, on a later run, the flag file is more than 10 minutes old, then something wrong or unexpected happened. What to do in that case is up to each one: you can automate an alert via email, or simply delete the file, start a new process, write a log, and so on. I cannot go into those details because they depend on each one's business model, and trying to explain them here would make the answer very long and tiresome, losing focus.
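That age check could be sketched like this (a sketch only; the 10-minute threshold and the choice to delete the stale flag are assumptions you should adapt to your own case):

```php
<?php
$flag = 'processo.txt';

if (file_exists($flag)) {
    // filemtime() returns the file's last modification time
    $ageInSeconds = time() - filemtime($flag);

    if ($ageInSeconds > 600) { // more than 10 minutes: probably a crashed run
        // here you could send an alert, write a log entry, etc.
        unlink($flag); // remove the stale flag and let a new run proceed
    } else {
        exit; // a run is genuinely in progress
    }
}

file_put_contents($flag, 1); // raise the flag for this run
```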
I believe a solution should be simple, efficient, legible, and portable before it's elegant.
The example above is as simple and portable as possible.
It is portable in the sense that you can use it in any environment (Linux, Windows, Mac) and even on a restricted hosting server, such as shared hosting.
Resource consumption is minimal, much faster than keeping a flag in a database.
You could also choose to keep the flag in the operating system's environment variables; Linux systems even have POSIX features for this. However, that introduces a complication that is unnecessary in most cases.
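One caveat: between the file_exists() check and the file_put_contents() call there is a small window where two processes could both pass the check. If that matters in your case, PHP's flock() gives an atomic and equally portable alternative (a sketch; the lock file name is just an example):

```php
<?php
$fp = fopen('processo.lock', 'c'); // create the lock file if it does not exist

// try to take an exclusive lock without blocking
if (!flock($fp, LOCK_EX | LOCK_NB)) {
    exit; // another instance holds the lock, i.e. is already running
}

// ... do the actual work here ...

flock($fp, LOCK_UN); // release the lock
fclose($fp);
```

A side benefit: the operating system releases the lock automatically if the script dies, which avoids the stale-flag problem entirely.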
I think using threads will not work for what you want: from my research, a thread is a feature that allows blocks to run in parallel, while as I understand it you want sequential execution of code blocks. See what this link says about threads:
What are threads?
Before answering this question, it must be said that there are two thread types: Kernel Level Thread (KLT) and User Level Thread (ULT). In this article, we will focus only on ULT-type threads, which are supported by PHP.
An ULT is a set of instructions that can be executed in parallel with other instructions from the same program. This way, when we have a multi-processor device (computer) or processor with multi-cores, you can put two (or more) threads running in parallel, each in one processor or core. The result is a potential improvement in process performance. That is, instead of executing instruction by instruction sequentially, you can perform some blocks of instructions in parallel and, at certain points in the code, require synchronization to ensure that the "results" or processing performed by the parallel tasks have been completed.
An important feature to note is that threads share the same memory region of the process that started them, so multiple threads can work on the same memory. Because they are part of a single process, using threads usually performs better than running multiple equal processes in parallel. On the other hand, threads need some sort of control over access to shared data, since there may be concurrent access.
Using a session, as our friend suggested, can solve this if your concern is "freeing" the file processa.php
per user, since a session is created for each user. That is, you will not be able to block execution of the PHP file for one user while another user is running it.
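A per-user session flag would look something like this (a sketch; the key name 'processando' is just an example, and note that PHP already serializes concurrent requests from the same session while the session file is held open):

```php
<?php
session_start();

if (!empty($_SESSION['processando'])) {
    exit; // this user already has a run in progress
}

$_SESSION['processando'] = true;
session_write_close(); // release the session lock so other requests are not blocked

// ... do the actual work here ...

session_start();
unset($_SESSION['processando']); // free the flag for this user
session_write_close();
```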
My suggestion is that you try to solve this using the database, creating an execution queue. You will need a PHP script, executed asynchronously, that fetches the next request to be processed from the database and calls processa.php.
In the database, you can create columns to store the relevant variable information, or even the $_POST data you would use.
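A minimal sketch of that queue, using SQLite through PDO just to keep the example self-contained (table and column names are assumptions; in production you would point PDO at your real database and run the worker part in a separate, asynchronous script):

```php
<?php
// open (or create) the queue database
$pdo = new PDO('sqlite:fila.db');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->exec("CREATE TABLE IF NOT EXISTS fila (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    dados TEXT,                       -- e.g. the serialized \$_POST you would use
    status TEXT DEFAULT 'pendente'
)");

// producer side: the page the user hits only enqueues the request
$stmt = $pdo->prepare('INSERT INTO fila (dados) VALUES (?)');
$stmt->execute([json_encode(['exemplo' => 1])]);

// worker side: fetch the next pending request and process it
$row = $pdo->query("SELECT id, dados FROM fila WHERE status = 'pendente' ORDER BY id LIMIT 1")
           ->fetch(PDO::FETCH_ASSOC);
if ($row) {
    $pdo->prepare("UPDATE fila SET status = 'executando' WHERE id = ?")->execute([$row['id']]);
    // here the worker would call processa.php with $row['dados']
    $pdo->prepare("UPDATE fila SET status = 'concluido' WHERE id = ?")->execute([$row['id']]);
}
```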
Some time ago I had to make PHP interact with a pipeline written in bash with many steps, which called many Python and R functions sequentially. So I thought of the whole thing as sub-processes, where process 2 only started when process 1 finished, 3 only after 2, and so on up to the last process. I think this is what you are looking for, correct?
The solution is to nest this combination (note that PHP variable names cannot contain hyphens, so the control variables use underscores):

...
// Process 1 started the work... and generated a file => fileResultProcesso-1.txt
...
$control_2 = true;
while ($control_2) {
    if (file_exists('/path/fileResultProcesso-1')) {
        // If the file exists, process 1 has already finished.
        // So start process 2,
        // save the result of process 2 (or some tag)
        // and set $control_2 = false
        //////////////////////// -- Start of process 3
        $control_3 = true;
        while ($control_3) {
            if (file_exists('/path/fileResultProcesso-2')) {
                // ... continue until all your processes are done ...
            } // end of the if for the process-2 file
            sleep(1); // avoid busy-waiting while polling
        } // end of the process-3 loop
    } // end of the if for the process-1 file
    sleep(1); // avoid busy-waiting while polling
} // end of the process-2 loop
Note: regarding the process breaking off halfway, you have to make a decision: do you want to restart the whole process, or continue from where it left off?
My algorithm deleted all the files and folders generated at the end of the process. But I could have kept them all, to restart from where it left off, with one small change in the code: check whether the file (or its tag) exists; if it does, go to the next step.
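That "continue where you left off" idea can be sketched as a simple checkpoint loop (file names and step bodies are just examples; each step writes a result file that doubles as its checkpoint):

```php
<?php
// each step writes a result file; if that file already exists,
// the step was completed in a previous run and can be skipped
$steps = [
    'fileResultProcesso-1.txt' => function () { /* run process 1 */ return '1 done'; },
    'fileResultProcesso-2.txt' => function () { /* run process 2 */ return '2 done'; },
    'fileResultProcesso-3.txt' => function () { /* run process 3 */ return '3 done'; },
];

foreach ($steps as $resultFile => $run) {
    if (file_exists($resultFile)) {
        continue; // checkpoint found: this step finished in an earlier run
    }
    file_put_contents($resultFile, $run()); // run the step, then save its checkpoint
}
```

If the script dies in the middle, rerunning it skips every step whose result file already exists and resumes at the first missing one.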