So, this can range from "easy" to "complicated", especially if you put it into production and want some security around who is allowed to call it.
I will try to list some options and sketch the path for some of them:
You may not need more than one process:
You have to see how much you want to "complicate" your life: if one process sits in a loop reading values from the serial port, and you just want the camera controller code to see those values, nothing prevents everything from living in a single program, and it is just a matter of refactoring your code.
A good way to do this is to turn the serial-reading code (and, optionally, the camera code) into co-routines, using "yield".
So, supposing your code currently looks something like this:
import time

def principal():
    # code to set up the serial port
    while True:
        valor = funcao_pra_ler_da_serial()
        # I want the camera code to receive "valor"
        #
        # In a continuous loop it is important to force a pause
        # so as not to use more CPU than necessary:
        time.sleep(0.05)

principal()
You can then rewrite this to be something like this:
import time

def le_valores():
    # code to set up the serial port
    while True:
        valor = funcao_pra_ler_da_serial()
        yield valor

def principal():
    gerador_de_valores = le_valores()
    while True:
        time.sleep(0.05)
        valor = next(gerador_de_valores)
        if <expression_using "valor">:
            funcao_que_tira_foto()

principal()
Multiprocessing - Single Code Base
If one program can import the other, or you can write a program that imports both, you can use Python's multiprocessing. You create one function for each of the operations you want, run each one in a separate process, and use a multiprocessing.Queue to pass data between the two parts of your code.
This will be the simplest method.
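As a rough sketch of the shape this takes (reusing the hypothetical funcao_pra_ler_da_serial and funcao_que_tira_foto placeholders from the example above; the names and details are assumptions, not working code for your hardware):

import time
from multiprocessing import Process, Queue

def leitor_serial(fila):
    # runs in a child process: reads values from the serial port
    # and puts them on the queue (funcao_pra_ler_da_serial is your own code)
    while True:
        valor = funcao_pra_ler_da_serial()
        fila.put(valor)
        time.sleep(0.05)

def controlador_camera(fila):
    # runs in the main process: blocks until a value arrives on the queue
    while True:
        valor = fila.get()
        funcao_que_tira_foto()

if __name__ == "__main__":
    fila = Queue()
    Process(target=leitor_serial, args=(fila,), daemon=True).start()
    controlador_camera(fila)

Since fila.get() blocks until the other process puts something on the queue, the camera side does not even need its own sleep.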
Celery
celery is the "standard way" to call out-of-process functions in production environments nowadays. What is more, unlike multiprocessing, it lets you set up configurations where the processes are on different machines, different operating systems, etc.
The downside is that it needs an intermediary system that acts as a broker to pass messages from one process to another. This is a separately installed service, such as redis or rabbitmq. To run on a local machine this is quite simple, and the default values are enough. For production, you will have to dig into the documentation and make your broker bulletproof, with authentication and so on...
It also needs one process to be able to import the code of the other, although only to "know" the names of the functions, which are called as Celery tasks. I think the first example in the tutorial is quite clear:
link
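Just to give an idea of the shape of it, a minimal sketch along the lines of that first tutorial example, assuming a Redis broker running locally with default settings (the module and task names here are hypothetical):

# tasks.py - importable by both processes
from celery import Celery

app = Celery("tasks", broker="redis://localhost:6379/0")

@app.task
def tira_foto(valor):
    # camera-control code goes here; "valor" comes from the calling process
    print("taking a picture for value:", valor)

The serial-reading process then just does "from tasks import tira_foto" and calls tira_foto.delay(valor); a worker started with "celery -A tasks worker" picks the call up and executes it.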
xmlrpc or jsonrpc
This is an approach that used to be easier to configure. Today, following the existing examples it is still possible, but it may take a bit more work, because of fixes introduced over time to close security holes, and also because the method has not evolved to handle Unicode transparently with Python 3.
But it is basically the simplest way that does not require one of the processes to be able to import the code of the other, and it still allows each process to run on a different version of Python (Python 2 and Python 3, or you can mix PyPy, Jython, IronPython, etc...).
As I mentioned earlier, one of your processes will then have to play the role of the "server": it exposes methods that can be called by outside processes:
link
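A rough sketch using only the standard library (the method name and port are assumptions): the camera process exposes a function through xmlrpc.server, and the serial-reading process calls it through xmlrpc.client.

# servidor.py - the process that controls the camera
from xmlrpc.server import SimpleXMLRPCServer

def tira_foto(valor):
    # camera-control code goes here
    return "photo taken for value %s" % valor

servidor = SimpleXMLRPCServer(("localhost", 8000), allow_none=True)
servidor.register_function(tira_foto)
servidor.serve_forever()

And on the other side:

# cliente.py - the process that reads the serial port
from xmlrpc.client import ServerProxy

proxy = ServerProxy("http://localhost:8000", allow_none=True)
print(proxy.tira_foto(42))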
Final considerations:
Note that none of these methods will let you simply "read a variable" from another process in real time, the way you read a variable from an imported module: most of them require you to call a function remotely and then use that function's return value. That means you will probably have to refactor your code anyway. The exception is the multiprocessing method, which can use a Queue object to share the data.