Run multiple processes on Linux with Python


I receive some URLs as a parameter via MQTT; after extracting them, I run an FFmpeg command through os.system to record them. But this works for only one process at a time, and I need to run N simultaneously.

I come from Java and I cannot figure out how to do this in Python...

import paho.mqtt.client as paho
import json
import os


def on_message(client, userdata, message):
    content = str(message.payload.decode("utf-8"))
    conversor(content)


def on_connect(client, userdata, flags, rc):
    client.subscribe("cameras/gravacao")


def on_disconnect():
    connect_to_mqtt()


def connect_to_mqtt():
    client = paho.Client("id")
    client.username_pw_set("", "")
    client.on_connect = on_connect
    client.on_disconnect = on_disconnect
    client.on_message = on_message
    client.connect("localhost", 1883, 60)
    client.loop_forever()


def conversor(content):
    data = json.loads(content)
    for n in range(data.get("videos")):
        os.system("ffmpeg -i " + data.get("remote_urls")[n]['url'] + " -acodec copy -vcodec copy "
                                                                     "/home/user/Vídeos/output.mp4")


connect_to_mqtt()
    
asked by anonymous 05.12.2018 / 17:58

2 answers


You would have to do something like this:

import asyncio

async def command(*args):
    # Launch the external process and wait for it without blocking the loop.
    process = await asyncio.create_subprocess_exec(*args, stdout=asyncio.subprocess.PIPE,
                                                   stderr=asyncio.subprocess.PIPE)
    cria_arquivo_com_pid(process.pid)  # helper defined elsewhere that records the PID in a file
    stdout, stderr = await process.communicate()
    return stdout.decode() if process.returncode == 0 else stderr.decode()

def gravacao():
    comandos_gravacao = []
    for url in URLS:  # URLS: the list of URLs extracted from the MQTT message
        comandos_gravacao.append(command('ffmpeg', '-i', url, '-acodec', 'copy', '-vcodec', 'copy',
                                         dirs + 'video.mp4'))  # dirs: the output directory

    loop = asyncio.get_event_loop()
    processes = asyncio.gather(*comandos_gravacao)
    loop.run_until_complete(processes)
    loop.close()

gravacao()

This is somewhat more advanced Python.
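On Python 3.7+, the same pattern can be driven with asyncio.run instead of managing the event loop by hand. A minimal self-contained sketch, using echo as a stand-in for the ffmpeg invocations:

```python
import asyncio

async def command(*args):
    # Start the subprocess and wait for it, capturing its output.
    process = await asyncio.create_subprocess_exec(
        *args, stdout=asyncio.subprocess.PIPE, stderr=asyncio.subprocess.PIPE
    )
    stdout, stderr = await process.communicate()
    return stdout.decode() if process.returncode == 0 else stderr.decode()

async def main():
    # gather() runs all the subprocesses concurrently; the real ffmpeg
    # command lines would go here instead of echo.
    return await asyncio.gather(
        command("echo", "one"),
        command("echo", "two"),
    )

results = asyncio.run(main())
print(results)  # ['one\n', 'two\n']
```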

    
08.01.2019 / 12:01

os.system does the job in the interactive environment, or in very small scripts that replace a small shell script.

To have better control over processes started from a Python program, the relevant features are grouped in the subprocess module.

In your case you do not need multiple threads in the Python program, nor any other complication: you simply need to launch a subprocess, and it runs in parallel with your program, coordinated by the operating system. Later you can check on the process to know whether it has finished or whether there was an error, but depending on the degree of sophistication you want, you may not even need that.

In short, to call an external process and continue executing the Python code, replace your call with os.system with:

import subprocess

...
    proc = subprocess.Popen(["ffmpeg", "-i", data.get("remote_urls")[n]['url'],  "-acodec", "copy", "-vcodec", "copy"])

That is, the biggest difference is that instead of a single command string as it would be typed into the terminal, each parameter must be an element of a list, which is passed to Popen. The return value is a Popen object, documented in the subprocess module, which can be used to query whether the process is still running and, with other parameters passed to Popen, to inspect its output on stdout or stderr.
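For example, a minimal sketch of launching a process without blocking and checking on it later (sleep stands in for ffmpeg here):

```python
import subprocess

# Popen returns immediately; the child runs in parallel.
proc = subprocess.Popen(["sleep", "1"])

# poll() returns None while the child is still running,
# or its exit code once it has finished.
print(proc.poll())  # None

# wait() blocks until the child exits and returns its exit code.
code = proc.wait()
print(code)  # 0
```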

If the list of options becomes too unwieldy, nothing prevents you from writing it as a string and using the split method to turn it into a list:

executavel = 'ffmpeg'
url = data.get("remote_urls")[n]['url']
parametros = "-acodec copy -vcodec copy /home/user/Vídeos/output.mp4".split()
proc = subprocess.Popen([executavel, "-i", url] + parametros)
    
05.12.2018 / 18:57