Need to avoid subprocess deadlock without using communicate()

I need to execute a command that produces a lot of output and takes a long time to run (> 30 minutes). I was thinking of using subprocess.Popen to do it. I need to capture the output of the command, so I pass PIPE for stdout and stderr.

The deadlock that can occur when using Popen.wait() with pipes is well documented on many forums, and Popen.communicate() is the usual way to avoid it. The problem with that solution is that communicate() blocks until the command has completed. I need to print everything that arrives on stdout while the command is running: if there is no output for 20 minutes, the script's execution will be killed.
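
To make the problem concrete, here is a rough sketch of the communicate() pattern I want to avoid (cmd is just a placeholder for the real command):

    from subprocess import Popen, PIPE

    cmd = ['some_long_running_command']   # placeholder for the real command
    p = Popen(cmd, stdin=PIPE, stdout=PIPE, stderr=PIPE)
    out, err = p.communicate()            # blocks until the child exits; nothing is printed in the meantime
    print out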

Here are some constraints that I need to respect:

  • My Python version is 2.4.2 and I can't upgrade.
  • If the solution is still to use subprocess, I need to pass subprocess.PIPE to all std handles to avoid this bug: http://bugs.python.org/issue1124861

Is there a way to do it?


To work around Python bug #1124861 on Python 2.4, you can attach stdin to the NUL device:

    import os
    from subprocess import PIPE, STDOUT, Popen

    lines = []
    # cmd is the long-running command to execute (a list of arguments)
    p = Popen(cmd, bufsize=1, stdin=open(os.devnull), stdout=PIPE, stderr=STDOUT)
    for line in iter(p.stdout.readline, ''):
        print line,          # print to stdout immediately
        lines.append(line)   # capture for later
    p.stdout.close()
    p.wait()
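
Two details worth noting: stdin=open(os.devnull) gives the child a real (empty) stdin so that stdin does not have to be a pipe, which is the workaround for issue 1124861 mentioned above, and stderr=STDOUT merges the error stream into stdout so there is only one pipe to drain. Reading that single pipe line by line is what lets the script print output as it arrives instead of blocking in communicate() or deadlocking in wait().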

Have you tried pexpect?
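
A minimal sketch of what that could look like, assuming pexpect's spawn/readline interface ('some_long_running_command' is a placeholder):

    import pexpect

    child = pexpect.spawn('some_long_running_command', timeout=None)
    for line in iter(child.readline, ''):
        print line,                  # echo output as soon as it arrives
    child.close()
    print 'exit status:', child.exitstatus

pexpect runs the child in a pseudo-terminal, so stderr is already merged into the output it captures.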


It sounds like you need to do a non-blocking read on the filehandles attached to the pipes.

This question addresses some ways to do that for Windows & Linux: Non-blocking read on a subprocess.PIPE in python
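
One common way to get that effect without real non-blocking I/O is to do the blocking reads in a background thread and hand the lines back through a queue, roughly along the lines of what that question discusses. A rough sketch (cmd and the helper name _reader are placeholders made up for this example):

    import os
    import subprocess
    import threading
    import Queue                          # the module is named "queue" on Python 3

    def _reader(pipe, q):
        # drain the pipe line by line and hand each line to the main thread
        for line in iter(pipe.readline, ''):
            q.put(line)
        pipe.close()
        q.put(None)                       # sentinel: the child has closed its output

    cmd = ['some_long_running_command']   # placeholder for the real command
    p = subprocess.Popen(cmd, stdin=open(os.devnull),
                         stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    q = Queue.Queue()
    t = threading.Thread(target=_reader, args=(p.stdout, q))
    t.setDaemon(True)                     # Python 2.4 spelling; newer versions use t.daemon = True
    t.start()

    while True:
        line = q.get()                    # returns as soon as a line (or the sentinel) arrives
        if line is None:
            break
        print line,                       # echo immediately so the 20-minute watchdog sees activity
    p.wait()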
