I'm using a multiprocessing.Queue to communicate between my worker processes and a daemon process. The daemon takes output from the queue and writes it to a file in an infinite loop. The file object is opened inside the daemon's target function, printToFile, itself.
resultqueue = Queue()
p = Process(target=printToFile, args=(resultqueue,))  # note: args must be a tuple
p.daemon = True
p.start()
processes = []
for si, ei in ranges:
    pr = Process(target=processing, args=(si, ei, resultqueue))
    pr.start()
    processes.append(pr)
for pr in processes:
    pr.join()
My problem is that printToFile doesn't write anything to the file, even though it prints the output it gets from the queue to the screen. When I remove the line that makes it a daemon process and manually kill the program with Ctrl+C, everything works fine. Can someone please help me understand what is going on? I don't know where to start debugging.
I am not calling fileObject.close(), since the daemon dies when the program finishes execution. But I don't think that is the problem, because the program does write to the file when I use Ctrl+C without making the process a daemon. Another (maybe unrelated) problem: when the file object is not instantiated within printToFile but is a global, even using Ctrl+C doesn't get the output into the file. I can live with that, but I would still like to understand what's going on.
1 Answer
Since my guess in the comments turned out to be correct, I'll spell out what I said there.
The open() function has an optional third argument, buffering:
The optional buffering argument specifies the file’s desired buffer size: 0 means unbuffered, 1 means line buffered, any other positive value means use a buffer of (approximately) that size (in bytes). A negative buffering means to use the system default, which is usually line buffered for tty devices and fully buffered for other files. If omitted, the system default is used.
Passing 0 as the third argument opens the file in unbuffered mode, so writes appear in the file immediately. That matters here because a daemon process is terminated abruptly as soon as the main program exits: whatever is still sitting in the file object's buffer at that moment is simply lost, which is why your file ends up empty even though the data was read from the queue and printed.
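One caveat the quoted (Python 2) docs don't cover: in Python 3, unbuffered mode is only allowed for binary files, so passing 0 in text mode raises a ValueError; the closest text-mode option is line buffering. A small sketch (the file name results.txt is arbitrary):

```python
# Python 3: unbuffered writes require binary mode.
f = open("results.txt", "wb", 0)
f.write(b"hello\n")   # reaches the OS immediately, no buffer involved
f.close()

# In text mode the closest option is line buffering: the buffer is
# flushed every time a newline is written.
g = open("results.txt", "w", 1)   # re-opening with "w" truncates the file
g.write("world\n")
g.close()
```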
Have you tried passing 0 as the third parameter of the open function, or calling file.flush() when you need to make the results available?