Is there any way to make a SharedMemory object created in Python persist between processes?
If the following code is invoked in an interactive Python session:
>>> from multiprocessing import shared_memory
>>> shm = shared_memory.SharedMemory(name='test_smm', size=1000000, create=True)
it creates a file in /dev/shm/ on a Linux machine.
$ ls /dev/shm/test_smm
/dev/shm/test_smm
But when the Python session ends I get the following:
/usr/lib/python3.8/multiprocessing/resource_tracker.py:216: UserWarning: resource_tracker: There appear to be 1 leaked shared_memory objects to clean up at shutdown
warnings.warn('resource_tracker: There appear to be %d
and test_smm is gone:
$ ls /dev/shm/test_smm
ls: cannot access '/dev/shm/test_smm': No such file or directory
So is there any way to make a shared memory object created in Python persist across process runs?
Running with Python 3.8
You can unregister the shared memory object from the resource tracker so that it is not unlinked at interpreter shutdown:
$ python3
Python 3.8.6 (default, Sep 25 2020, 09:36:53)
[GCC 10.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from multiprocessing import shared_memory, resource_tracker
>>> shm = shared_memory.SharedMemory(name='test_smm', size=1000000, create=True)
>>> resource_tracker.unregister(shm._name, 'shared_memory')
>>> exit()
$ ls /dev/shm/test_smm
/dev/shm/test_smm
I don't know whether this is portable, and it doesn't look like a supported way of using the multiprocessing module (both resource_tracker.unregister() and shm._name are internal APIs), but it works on Linux at least.
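A minimal end-to-end sketch of this trick, assuming Linux and Python 3.8+ (the segment name persist_demo is illustrative, and resource_tracker.unregister() and shm._name are internal APIs, so this may break in future versions):

```python
# Create a shared memory segment that survives the creating process,
# then re-attach to it by name (possibly from a different process).
from multiprocessing import shared_memory, resource_tracker

def create_persistent(name, size):
    shm = shared_memory.SharedMemory(name=name, size=size, create=True)
    # Stop the resource tracker from unlinking /dev/shm/<name> at exit.
    # Note: unregister() and shm._name are internal, undocumented APIs.
    resource_tracker.unregister(shm._name, 'shared_memory')
    return shm

# Producer: write some data, then close (not unlink) the local mapping.
shm = create_persistent('persist_demo', 16)
shm.buf[:5] = b'hello'
shm.close()

# Consumer (could be a different process, any time later): attach by name.
shm2 = shared_memory.SharedMemory(name='persist_demo')
print(bytes(shm2.buf[:5]))  # b'hello'
shm2.close()
shm2.unlink()  # final cleanup once no process needs the segment anymore
```

Note that in Python 3.8 merely attaching (create=False) registers the segment with the attaching process's resource tracker again, so a long-lived segment should be unregistered after every attach, or unlinked explicitly once it is no longer needed.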
Comments:
Alternatively, write the data to a *.npy file and read it back with numpy.lib.format.open_memmap(); that way you don't take up precious RAM. The third-party sysv_ipc package is another option: it exposes System V shared memory, which multiprocessing's resource tracker does not manage.
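A short sketch of the .npy approach from the comment, assuming NumPy is installed (the file path /tmp/demo_data.npy is illustrative):

```python
import numpy as np
from numpy.lib.format import open_memmap

# Writer: create a disk-backed .npy array without materializing it in RAM.
arr = open_memmap('/tmp/demo_data.npy', mode='w+',
                  dtype=np.float64, shape=(1000,))
arr[:] = 1.5
arr.flush()  # make sure the data reaches the file
del arr      # drop the mapping; the file stays on disk

# Reader (can be a completely different process, any time later):
view = open_memmap('/tmp/demo_data.npy', mode='r')
print(view[0], view.shape)  # 1.5 (1000,)
```

Unlike /dev/shm segments, the .npy file survives reboots, and pages are only read in on demand.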