I have a Python program that continuously reads the output of another program launched via subprocess.Popen and connected via subprocess.PIPE.
The problem I am facing is that it sometimes loses a significant portion of the launched program's output.
For example, monitoring for inotify events via a pipe to inotifywait loses many events.
This is the relevant code:
process = subprocess.Popen(["inotifywait", "-q", "-r", "-m",
                            "--format", "%e:::::%w%f", srcroot],
                           stdout=subprocess.PIPE, stderr=subprocess.PIPE)
polling = select.poll()
polling.register(process.stdout)
process.stdout.flush()
while True:
    process.stdout.flush()
    if polling.poll(max_seconds*1000):
        line = process.stdout.readline()
        if len(line) > 0:
            print line[:-1]
Executing the command inotifywait -q -r -m --format %e:::::%w%f /opt/fileserver/ > /tmp/log1 and moving some files around (to generate inotify events) gives a file with more than 8000 lines. On the other hand, running my script as ./pscript.py > /tmp/log2 gives a file with only about 5000 lines.
1 Answer
You're ignoring stderr completely in your example. If the child writes enough to its stderr pipe, the pipe's buffer fills up and the child blocks on the next write, which stalls it and can cause events to be dropped. Try creating the process like this:
process = subprocess.Popen(["inotifywait", "-q", "-r", "-m",
"--format", "%e:::::%w%f", srcroot], stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
Furthermore, I'd use inotify directly via one of its Python bindings rather than spawning an inotifywait process.
Before making the change, you can check whether the child is actually writing diagnostics to stderr by dumping it once the process has exited: print process.stderr.read()
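To give a flavor of talking to inotify directly, here is a minimal sketch (Python 3, Linux-only). Rather than assuming any particular binding's API, it calls the kernel interface through ctypes and glibc; the directory name and file name are made up for the demonstration.

```python
import ctypes
import os
import struct
import tempfile

# Load the C library (Linux/glibc); inotify_init and inotify_add_watch
# are the same calls the Python bindings wrap.
libc = ctypes.CDLL(None, use_errno=True)

IN_CREATE = 0x00000100       # file or directory created in the watched dir
IN_CLOSE_WRITE = 0x00000008  # file opened for writing was closed

fd = libc.inotify_init()
assert fd >= 0, os.strerror(ctypes.get_errno())

watched = tempfile.mkdtemp()
wd = libc.inotify_add_watch(fd, watched.encode(), IN_CREATE | IN_CLOSE_WRITE)
assert wd >= 0, os.strerror(ctypes.get_errno())

# Generate events: create and close a file inside the watched directory.
with open(os.path.join(watched, "hello.txt"), "w") as f:
    f.write("x")

# Each event is a struct inotify_event: int wd; uint32 mask, cookie, len;
# followed by `len` bytes of NUL-padded file name.
buf = os.read(fd, 4096)
events = []
offset = 0
while offset < len(buf):
    wd_, mask, cookie, length = struct.unpack_from("iIII", buf, offset)
    name = buf[offset + 16:offset + 16 + length].rstrip(b"\0").decode()
    events.append((mask, name))
    offset += 16 + length

print(events)
```

Because the events arrive on a file descriptor you own, there is no intermediate process whose stdout/stderr pipes can fill up; a binding such as pyinotify wraps this same mechanism with a higher-level API.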