How to kill the process automatically with Popen?


I'm running a parallel process in Python:

process = subprocess.Popen(['python',path, title, uid])

The program takes about a minute to finish and runs normally. The process has a PID that I can capture via process.pid; in one example it was 29058. I have another program that handles these PIDs and checks which ones have ended, using this function:

import os

def check_pid(pid):
    try:
        os.kill(pid, 0)
    except OSError:
        return False
    else:
        return True

That also works normally.

But even after the program I launched with Popen has ended (and I'm sure it has ended), Ubuntu still lists the process as running, even though it consumes no memory or anything:

My main program, from which I issue the Popen call above, is a Bottle server that will not end unless I want it to. When I kill the Bottle server process, this PID 29058 process also dies. What I want to know is: is there any parameter I can pass to Popen that makes the process die automatically when it finishes, instead of lingering like this?

asked by anonymous 06.09.2018 / 19:53

1 answer


If you have the object created by the call to Popen (the process variable, in the example you gave), then once you detect that the process's task has completed, just call the wait method on that object. Until the parent calls wait, the operating system keeps the exited child around as a "zombie" (shown as <defunct> in ps) so that its exit status can still be read, which is why it keeps showing up in Ubuntu's process list.

In other words, call process.wait() after the task is finished.
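A minimal sketch of that reaping step. Here sys.executable and a trivial '-c' script stand in for the original path, title and uid arguments, and poll() is used instead of a blocking wait() so the loop mirrors the check_pid polling in the question:

```python
import subprocess
import sys
import time

# Stand-in for the original Popen call; sys.executable and the
# '-c' script replace the path, title and uid of the question.
process = subprocess.Popen([sys.executable, '-c', 'pass'])

# poll() returns None while the child is still running; once the
# child exits, poll() collects its status and removes the
# <defunct> entry from the process table.
while process.poll() is None:
    time.sleep(0.1)

print(process.returncode)  # exit status of the reaped child
```

After the loop, check_pid(process.pid) would return False, since the zombie entry is gone.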

Since you are calling an external process from a web application, however, there are a couple of other approaches worth considering:

You could use ProcessPoolExecutor from the concurrent.futures module. This gives you a fixed pool of external worker processes, and concurrent.futures takes care of reaping processes that have finished their task and keeping them ready for the next one.

The other way is to rethink the architecture; this is how this kind of task is usually handled in "production" on large systems: it involves using Celery. Celery is a framework that connects different running processes through queues, but it does so almost transparently: you write what looks like a plain function call in Python, Celery puts the parameters on a queue outside the process, and an external process coordinated by Celery (which can even be defined in the same ".py" file as the calling program) executes the call. That external process is a worker, and it can live on the same server as your web app or on any other machine on the network (thus allowing workload distribution).
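A minimal sketch of what the Celery route looks like. The broker URL and the generate task are assumptions for illustration, and a broker such as Redis or RabbitMQ must already be running, so this is configuration-level code rather than something runnable on its own:

```python
from celery import Celery

# Hypothetical broker URL; assumes a Redis instance is running.
app = Celery('tasks', broker='redis://localhost:6379/0')

@app.task
def generate(title, uid):
    # Stand-in for the work the external script used to do.
    return f'{title}:{uid}'

# In the Bottle handler this looks like a plain function call,
# but the work actually runs in a separate Celery worker process:
#     generate.delay(title, uid)
```

The worker itself is started separately, e.g. `celery -A tasks worker`, on the same machine or any other that can reach the broker.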

answered 06.09.2018 / 20:09