Problem using multiprocessing between files in Python


I was trying to run an application using the OpenBCI kit, using their Python programs. My objective is to analyze the data obtained by this program in real time. However, from what I can tell, it only streams for a predetermined time and only then returns the data, and the program needs to be restarted, which makes me lose that data. So I thought of doing parallel processing with the "multiprocessing" library, but I'm not able to write and read the data simultaneously, which causes the program to close. Here is my code:

import time
from open_bci_v3 import OpenBCIBoard
from multiprocessing import Process
eegdata = []
class EEG():
    def __init__(self):
        self.data = []
    def printdata(self):
        print(self.data)
    def include_data(self,data):
        time.sleep(0.5)
        self.data.append(data)
        self.printdata()
def handle_sample(sample):
    eegdata.append(sample.channel_data)


#Establish connection with the board
board = OpenBCIBoard()
board.print_register_settings()
eeg = EEG()
stream_time = 5  # seconds; naming this "time" would shadow the time module
procs = Process(target = eeg.include_data, args = (eegdata,))  # args must be a tuple
procs.start()
board.start_streaming(handle_sample, stream_time)
procs.join()

In short, I would like to run the start_streaming command from open_bci_v3 at the same time that I include the data in the EEG class.
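A minimal sketch of one way to share samples between two processes, assuming a multiprocessing.Queue carries each sample from a streaming process to the main one. The stream_samples function below is a hypothetical stand-in for board.start_streaming with handle_sample (it only generates fake channel data, so it runs without the OpenBCI hardware):

```python
from multiprocessing import Process, Queue

def stream_samples(queue, n_samples):
    # Stand-in for board.start_streaming(handle_sample, stream_time):
    # in the real program, handle_sample would call
    # queue.put(sample.channel_data) instead of appending to a global list.
    for i in range(n_samples):
        queue.put([i * 0.1] * 8)  # fake data for 8 channels
    queue.put(None)               # sentinel: streaming finished

if __name__ == "__main__":
    queue = Queue()
    proc = Process(target=stream_samples, args=(queue, 5))
    proc.start()
    data = []
    while True:
        sample = queue.get()      # blocks until a sample arrives
        if sample is None:
            break
        data.append(sample)       # analyze each sample here, in real time
    proc.join()
    print(len(data))              # -> 5
```

The key point is that a plain list such as eegdata is not shared between processes (each process gets its own copy of the memory), so the consumer has to receive samples through an explicit channel like a Queue.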

asked by anonymous 08.05.2018 / 18:44

0 answers