Python subprocess hangs at Popen when piping output

I've been through dozens of the "Python subprocess hangs" articles here, and I think the code below addresses all of the issues those articles raise.

My code intermittently hangs at the Popen call. I am running 4 threads using multiprocessing.dummy.apply_async; each thread starts a subprocess, reads its output line by line, and prints a modified version of it to stdout.

import os
import subprocess
import sys
import textwrap

def my_subproc():
    # env, device, print_lock and LINE_WRAP_DEFAULT are defined elsewhere in the full script
    exec_command = ['stdbuf', '-i0', '-o0', '-e0',
                    sys.executable, '-u',
                    os.path.dirname(os.path.realpath(__file__)) + '/myscript.py']

    proc = subprocess.Popen(exec_command, env=env, stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT, bufsize=1)
    print "DEBUG1", device

    for line in iter(proc.stdout.readline, b''):
        with print_lock:
            for l in textwrap.wrap(line.rstrip(), LINE_WRAP_DEFAULT):
                print l

The code above is run from apply_async:

pool = multiprocessing.dummy.Pool(4)
for i in range(0,4):
    pool.apply_async(my_subproc)

Intermittently the subprocess hangs at subprocess.Popen and the "DEBUG1" statement is never printed. Sometimes all four threads work; sometimes as few as one of the four does.

I'm not aware that this exhibits any of the known deadlock situations for Popen. Am I wrong?


This appears to be a bad interaction with multiprocessing.dummy. When I use multiprocessing (not the .dummy threading interface), I'm unable to reproduce the error.
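For reference, a minimal sketch of that workaround, swapping the thread pool for a process pool (the run_all wrapper and the explicit close/join are my additions for illustration; my_subproc is assumed to be defined at module level so it can be pickled):

import multiprocessing

def run_all():
    # Process-based pool instead of the thread-based multiprocessing.dummy.Pool;
    # each worker runs in its own interpreter process, so Popen is never called
    # from several threads of the same process at once.
    pool = multiprocessing.Pool(4)
    for i in range(0, 4):
        pool.apply_async(my_subproc)
    pool.close()
    pool.join()  # wait for all four workers to finish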
