Python monitoring stderr and stdout of a subprocess

I'm trying to start a program (HandBrakeCLI) as a subprocess or thread from within Python 2.7. I have gotten as far as starting it, but I can't figure out how to monitor its stderr and stdout.

The program outputs its status (% done) and info about the encode to stderr and stdout, respectively. I'd like to be able to periodically retrieve the % done from the appropriate stream.

I've tried calling subprocess.Popen with stderr and stdout set to PIPE and then using Popen.communicate(), but it just sits and waits until the process is killed or completes and only then returns the output. That doesn't do me much good.
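
Roughly what I tried (the input/output paths here are just placeholders):

import subprocess

# Placeholder command; the exact options don't matter for the problem.
proc = subprocess.Popen(['HandBrakeCLI', '-i', 'in.mkv', '-o', 'out.mp4'],
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE)
# communicate() blocks until the process exits, so the progress text
# only becomes available after the whole encode is finished.
out, err = proc.communicate()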

I've got it up and running as a thread, but as far as I can tell I still have to eventually call subprocess.Popen to execute the program, and I run into the same wall.

Am I going about this the right way? What other options do I have, or how do I get this to work as described?


I have accomplished the same thing with ffmpeg. This is a stripped-down version of the relevant portions. bufsize=1 means line buffering and may not be needed.

import subprocess

def Run(command):
    # Start the encoder with stderr merged into stdout so a single
    # stream carries both the progress and the encode info.
    proc = subprocess.Popen(command, bufsize=1,
        stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
        universal_newlines=True)
    return proc

def Trace(proc):
    # Poll until the process exits, reading one line at a time.
    while proc.poll() is None:
        line = proc.stdout.readline()
        if line:
            # Process output here, e.g. parse the "% done" figure
            print 'Read line', line

proc = Run([ handbrakePath ] + allOptions)
Trace(proc)
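
If the encode needs to run in the background (as in the question), the same Trace loop can run in a thread instead of being called directly; a minimal sketch reusing Run and Trace from above:

import threading

proc = Run([ handbrakePath ] + allOptions)
reader = threading.Thread(target=Trace, args=(proc,))
reader.daemon = True   # don't keep the interpreter alive just for the reader
reader.start()
# ... do other work; call proc.wait() or reader.join() when finished ...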

Edit 1: I noticed that the subprocess (HandBrake in this case) needs to flush its output after each line for this to work (ffmpeg does).

Edit 2: Some quick tests reveal that bufsize=1 may not actually be needed.
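
One more caveat I haven't verified for every build: if the child only rewrites its progress with carriage returns (no newline), readline() may not return until a real newline arrives. An untested variation that reads character by character instead, keeping the stream untranslated so '\r' comes through:

import subprocess

def RunRaw(command):
    # Like Run(), but without universal_newlines so '\r' is not translated.
    return subprocess.Popen(command, stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT)

def TraceChars(proc):
    buf = ''
    while True:
        ch = proc.stdout.read(1)
        if ch == '':          # EOF: the pipe closed, the process is done
            break
        if ch in ('\r', '\n'):
            if buf:
                print 'Read', buf   # parse the "% done" figure here
            buf = ''
        else:
            buf += ch
    proc.wait()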
