I am running a command via Popen, catching stderr, and using it to update a display elsewhere. That part works correctly, updating every time stderr outputs something. I am also trying to save the output to a log file in real time. While this does write the output to a file, the file doesn't seem to update very often. Is there a way I can get the output written to the file every time there is something to write? Here is the code:
self.process1 = Popen(command, startupinfo=startupinfo, stderr=subprocess.PIPE)
logFile = open(logFilePath, "a")
while True:
    line222 = self.process1.stderr.readline().decode('utf-8')
    logFile.write(line222)
for line in iter(self.process1.stderr.readline, b""):
    line222 = line.decode('utf-8')
    logFile.write(line222)
    logFile.flush()
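For context, here is a minimal self-contained sketch of that loop. The command and log-file name are placeholders (and the OP's startupinfo argument is omitted, since it is Windows-specific):

```python
import subprocess
import sys

# Stand-in command that writes two lines to stderr, for illustration only.
command = [sys.executable, "-c", "import sys; sys.stderr.write('one\\ntwo\\n')"]

process = subprocess.Popen(command, stderr=subprocess.PIPE)
with open("run.log", "a") as log_file:
    # iter() with a b"" sentinel stops cleanly at EOF on both Python 2 and 3.
    for line in iter(process.stderr.readline, b""):
        log_file.write(line.decode("utf-8"))
        log_file.flush()  # push each completed line to the file immediately
process.wait()
```

The flush() after each write is what keeps the log file current: without it, the data sits in the file object's buffer until the buffer fills.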
- logFile.write(line.decode('utf-8')) breaks on Python 2 because logFile expects str, not unicode (an exception on the first non-ascii character). And on Python 3, iter(self.process1.stderr.readline, "") never stops, because '' != b'' there.
- The earlier revision of your answer redirected stderr to the file directly (only logFile.fileno() was used), but the OP wants both to capture stderr ("every time stderr outputs something") and to save it to a file.
- On .decode('utf-8'): I just forgot the b. I am not sure I understand the point about stderr; I don't see that anywhere in the OP's code.
- Use for line in self.process1.stderr: instead -- you don't need iter() here, as I've mentioned in my answer. To understand the point about stderr, look at the revision of your answer that my comment refers to.
- universal_newlines is implemented using io.TextIOWrapper, which has a Python implementation that you could look at.
- Everything has a downside. Even by fixing a bug, you could break somebody's workflow.

By default, files use block buffering, i.e., nothing is written to disk until the buffer overflows. The block size is typically 4K or 8K bytes.
In your case, it should be enough to make the file line-buffered:
#!/usr/bin/env python
from __future__ import print_function
import sys
from subprocess import Popen, PIPE

p = Popen(command, startupinfo=startupinfo,
          stderr=PIPE, bufsize=1,      # `1` means line-buffered (our end)
          universal_newlines=True)     # convert to text, normalize newlines

with p.stderr, open(log_filename, "a", 1) as log_file:  # `1` means line-buffered
    for line in iter(p.stderr.readline, ''):
        for file in [sys.stderr, log_file]:
            print(line, end='', file=file)
p.wait()
If you are on Python 3 then you don't need iter() here (the read-ahead bug is fixed there); you could use just for line in pipe: instead.
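A Python 3 sketch of that simplification, with a stand-in command (the real command and log-file name would come from your own code):

```python
import subprocess
import sys

# Stand-in command that writes two lines to stderr.
command = [sys.executable, "-c", "import sys; sys.stderr.write('a\\nb\\n')"]

p = subprocess.Popen(command, stderr=subprocess.PIPE,
                     bufsize=1, universal_newlines=True)
with p.stderr, open("py3.log", "a", 1) as log_file:
    for line in p.stderr:  # no iter() needed on Python 3
        log_file.write(line)
p.wait()
```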
I assume that you've used utf-8 as a place-holder for user's locale character encoding.
- logFile.flush() immediately after logFile.write() will make it work.
- There is something on Popen, which I currently can't remember, that you can use as the condition to the while loop and that will give you the output line by line... Need to check in the docs.
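To illustrate the first comment's point in isolation: flushing right after each write makes the data visible to other readers of the file immediately, even while the writing handle stays open (the file name here is arbitrary):

```python
log_file = open("demo.log", "a")
log_file.write("first line\n")
log_file.flush()  # push the buffered bytes to the OS right away

# A second handle opened now already sees the line, even though
# log_file is still open and block buffering would otherwise delay it.
with open("demo.log") as reader:
    assert "first line\n" in reader.read()
log_file.close()
```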