I'm using multiprocessing to create a sub-process for my Python app. I would like to share data between the parent process and the child process. It's important to mention that I need to share it asynchronously, meaning both the child process and the parent process will update the data while the code is running.
What would be the best way to do that?
3 Answers
This is a simple example from the Python documentation:
from multiprocessing import Process, Queue

def f(q):
    q.put([42, None, 'hello'])   # the child pushes data onto the shared queue

if __name__ == '__main__':
    q = Queue()
    p = Process(target=f, args=(q,))
    p.start()
    print(q.get())               # prints "[42, None, 'hello']"
    p.join()
You can use a Pipe as well; refer to https://docs.python.org/2/library/multiprocessing.html for more details.
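For reference, a minimal Python 3 sketch of the Pipe variant, mirroring the Queue example above (the function name f is just for illustration):

from multiprocessing import Process, Pipe

def f(conn):
    conn.send([42, None, 'hello'])   # send a picklable object to the parent
    conn.close()

if __name__ == '__main__':
    parent_conn, child_conn = Pipe()            # two connected Connection objects
    p = Process(target=f, args=(child_conn,))
    p.start()
    print(parent_conn.recv())                   # prints "[42, None, 'hello']"
    p.join()

Unlike a Queue, a Pipe has exactly two endpoints, so it suits one-to-one communication between the parent and a single child.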
From Python 3.8 onward it is possible to use shared_memory.
This example is taken from the docs:
>>> from multiprocessing import shared_memory
>>> shm_a = shared_memory.SharedMemory(create=True, size=10)
>>> type(shm_a.buf)
<class 'memoryview'>
>>> buffer = shm_a.buf
>>> len(buffer)
10
>>> buffer[:4] = bytearray([22, 33, 44, 55]) # Modify multiple at once
>>> buffer[4] = 100 # Modify single byte at a time
>>> # Attach to an existing shared memory block
>>> shm_b = shared_memory.SharedMemory(shm_a.name)
>>> import array
>>> array.array('b', shm_b.buf[:5]) # Copy the data into a new array.array
array('b', [22, 33, 44, 55, 100])
>>> shm_b.buf[:5] = b'howdy' # Modify via shm_b using bytes
>>> bytes(shm_a.buf[:5]) # Access via shm_a
b'howdy'
>>> shm_b.close() # Close each SharedMemory instance
>>> shm_a.close()
>>> shm_a.unlink() # Call unlink only once to release the shared memory
However, you should be careful with race conditions. I personally recommend a message system, or join as mentioned in the other answers. That said, if you only have one writing process and the rest are readers, it should be quite safe.
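If you do need multiple writers, one option is to guard the buffer with a multiprocessing.Lock. A minimal sketch, assuming a single 16-byte block whose name is passed to the children (the writer/reader function names and the sizes are purely illustrative):

from multiprocessing import Process, Lock, shared_memory

def writer(name, lock):
    shm = shared_memory.SharedMemory(name=name)   # attach to the existing block
    with lock:                                    # serialize access to the buffer
        shm.buf[:5] = b'hello'
    shm.close()

def reader(name, lock):
    shm = shared_memory.SharedMemory(name=name)
    with lock:
        print(bytes(shm.buf[:5]))
    shm.close()

if __name__ == '__main__':
    lock = Lock()
    shm = shared_memory.SharedMemory(create=True, size=16)
    w = Process(target=writer, args=(shm.name, lock))
    r = Process(target=reader, args=(shm.name, lock))
    w.start(); w.join()          # write first so the reader sees the data
    r.start(); r.join()
    shm.close()
    shm.unlink()                 # release the block once everyone is done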
Here's an example of multiprocessing combined with multithreading, sharing a couple of variables:
import time
from multiprocessing import Process, Queue, Value, Manager
from ctypes import c_bool
from threading import Thread

ps = []

def yourFunc(pause, budget):
    while True:
        print(budget.value, pause.value)
        # set the shared value
        pause.value = True
        ...

def multiProcess(threads, pause, budget):
    ts = []
    for _ in range(threads):
        t = Thread(target=yourFunc, args=(pause, budget,))
        t.start()
        ts.append(t)
        time.sleep(3)

if __name__ == '__main__':
    pause = Value(c_bool, False)
    budget = Value('i', 5000)
    for i in range(2):
        p = Process(target=multiProcess, args=(2, pause, budget))
        p.start()
        ps.append(p)
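The Manager import above isn't actually used. If you need to share richer structures than a Value between processes, a Manager proxy is one option; a minimal sketch, where the worker function and the dict keys are just illustrative:

from multiprocessing import Process, Manager

def worker(shared):
    shared['budget'] = shared['budget'] - 100   # updates are visible to the parent
    shared['paused'] = True

if __name__ == '__main__':
    with Manager() as manager:
        shared = manager.dict({'budget': 5000, 'paused': False})
        p = Process(target=worker, args=(shared,))
        p.start()
        p.join()
        print(dict(shared))   # {'budget': 4900, 'paused': True}

A Manager is slower than Value or shared_memory because every access goes through a proxy to the manager process, but it handles dicts, lists, and nested updates without manual locking of the container itself.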