Popen log management question

Problem: I have a monitor program in Python that uses subprocess's Popen to start new processes. These processes have the potential to run for a very long time (weeks to months). I'm passing a file handle to the stdout argument of Popen, and I'm worried that this file will easily grow huge. Is there a way I can safely move or remove the data in that log file? Important note: this is on a Windows system, so any solution must be Windows-compatible. Code snippet: this is how I create the process: try: logFile = file(self.logFileName, 'w') self.pr
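
A minimal sketch of one common approach (not taken from the question): have the parent read the child's output through a pipe and write it via a RotatingFileHandler, so the log can be capped without touching a file the child holds open on Windows. The command and log path below are placeholders.

    import logging
    import subprocess
    from logging.handlers import RotatingFileHandler

    CMD = ["my_long_running_tool.exe"]   # placeholder command
    LOG_PATH = "monitor_child.log"       # placeholder log file

    # Rotate at ~10 MB and keep 5 backups, so the log never grows without bound.
    handler = RotatingFileHandler(LOG_PATH, maxBytes=10 * 1024 * 1024, backupCount=5)
    logger = logging.getLogger("child_output")
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)

    # Read through a pipe instead of handing the child a file handle; the parent
    # decides where each line goes, so rotation is safe while the child runs.
    proc = subprocess.Popen(CMD, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    for line in proc.stdout:
        logger.info(line.rstrip().decode("utf-8", errors="replace"))
    proc.wait()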

Python subprocess & stdout

I have a simulation program which is piloted through stdin and provides output on stdout. A C++/Qt program that runs it in a QProcess works well. A Python program that runs it under Linux also works well, using: p = subprocess.Popen(cmd, stdin=subprocess.PIPE, stdout=subprocess.PIPE) together with p.stdin.write, p.stdout.readline, and p.wait. However, under Windows the program runs and receives its commands through stdin (this has been verified by debugging the subprocess), but the Python program deadlocks at any p.stdout.readline and p.wait.
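
A minimal sketch of the usual workaround, assuming the deadlock comes from buffering: request line-buffered text pipes on the Python side and flush stdin after every command. The simulator itself must also flush its own stdout after each line when it is not attached to a console. The command and the "step 1" protocol below are placeholders.

    import subprocess

    cmd = ["simulator.exe"]   # placeholder for the simulation program

    # bufsize=1 + universal_newlines=True gives line-buffered text pipes on the
    # parent side; the child still has to flush its own stdout per line.
    p = subprocess.Popen(cmd, stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                         bufsize=1, universal_newlines=True)

    p.stdin.write("step 1\n")
    p.stdin.flush()               # make sure the command actually reaches the child
    reply = p.stdout.readline()   # returns as soon as the child emits (and flushes) a line
    print(reply.strip())

    p.stdin.close()
    p.wait()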

When should I use `wait` instead of `communicate` in subprocess?

In the documentation of wait (http://docs.python.org/2/library/subprocess.html#subprocess.Popen.wait), it says: Warning: This will deadlock when using stdout=PIPE and/or stderr=PIPE and the child process generates enough output to a pipe such that it blocks waiting for the OS pipe buffer to accept more data. Use communicate() to avoid that. From this, I think communicate() could replace all usage of wait() if the retcode is not needed. Even when stdout or stdin is not a PIPE, can I also replace wait() with communicate()?
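
A short sketch (Python 3) contrasting the two calls: wait() alone can deadlock once a piped child outgrows the OS pipe buffer, while communicate() drains the pipes while waiting and sets returncode, so it can stand in for wait() whenever pipes are involved; without pipes, plain wait() is fine. The oversized child below is contrived for illustration.

    import subprocess
    import sys

    # A contrived child that writes far more than an OS pipe buffer can hold.
    child = [sys.executable, "-c", "print('x' * 1000000)"]

    # Risky pattern from the warning: the pipe fills, the child blocks on write,
    # and wait() never returns.
    # p = subprocess.Popen(child, stdout=subprocess.PIPE)
    # p.wait()                    # may hang
    # data = p.stdout.read()

    # Safe pattern: communicate() reads the pipes while waiting, then reaps the
    # child and sets p.returncode, so a separate wait() is unnecessary.
    p = subprocess.Popen(child, stdout=subprocess.PIPE)
    out, _ = p.communicate()
    print(len(out), p.returncode)

    # With no pipes at all, wait() cannot deadlock and is slightly simpler.
    q = subprocess.Popen(child, stdout=subprocess.DEVNULL)
    q.wait()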

Python monitoring stderr and stdout of a subprocess

I am trying to start a program (HandBreakCLI) as a subprocess or thread from within Python 2.7. I have gotten as far as starting it, but I can't figure out how to monitor its stderr and stdout. The program outputs its status (% done) and info about the encode to stderr and stdout, respectively. I'd like to be able to periodically retrieve the % done from the appropriate stream. I have tried calling subprocess.Popen with stderr and stdout set to PIPE and using subprocess.communicate, but it waits until the process terminates or finishes before retrieving the output. No good for me.
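
A sketch of the usual fix, assuming the goal is incremental progress rather than everything at exit: give each pipe its own reader thread and avoid communicate(). Written against Python 3, but the same threading structure applies to the 2.7 in the question; the HandBrake flags are illustrative only, and progress lines that end in carriage returns rather than newlines may need a character-level reader instead of readline().

    import subprocess
    import threading

    # Illustrative invocation; the real HandBrake flags will differ.
    cmd = ["HandBrakeCLI", "-i", "input.mkv", "-o", "output.mp4"]

    def pump(stream, label):
        # Read incrementally so progress appears as it is produced, instead of
        # all at once when the process exits (which is what communicate() does).
        for chunk in iter(stream.readline, b""):
            print(label, chunk.rstrip().decode("utf-8", errors="replace"))
        stream.close()

    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)

    # One thread per pipe keeps the unread pipe from filling up and stalling
    # the encoder while the other one is being read.
    threads = [threading.Thread(target=pump, args=(proc.stdout, "stdout:")),
               threading.Thread(target=pump, args=(proc.stderr, "stderr:"))]
    for t in threads:
        t.start()
    proc.wait()
    for t in threads:
        t.join()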

Python subprocess.Popen pipe custom fd

I currently have this code that pipes stdout or stderr depending on the context: def execteInstruction(cmd, outputNumber): do_stdout = DEVNULL do_stderr = DEVNULL if outputNumber != 2: do_stdout = subprocess.PIPE else: do_stderr = subprocess.PIPE return subprocess.Popen(cmd, shell=True, stderr=do_stderr, stdout=do_stdout) I then read the result with communicate(). This works fine, except that I need to read from a custom fd, because I use sub
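
A POSIX-only sketch of one way to read an extra descriptor alongside stdout/stderr, assuming Python 3.2+ where Popen accepts pass_fds: create a pipe, hand its write end to the child, and read the other end in the parent. The shell command below is a stand-in, and a child that writes a lot on the extra fd would need its own reader thread to avoid blocking.

    import os
    import subprocess

    read_fd, write_fd = os.pipe()

    # Stand-in command that writes something to the extra descriptor; pass_fds
    # keeps write_fd open (with the same number) across the exec.
    cmd = "echo 'normal output'; echo 'custom channel' >&%d" % write_fd

    proc = subprocess.Popen(cmd, shell=True,
                            stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE,
                            pass_fds=(write_fd,))
    os.close(write_fd)             # the parent's copy is no longer needed

    out, err = proc.communicate()  # drain the standard pipes as before
    custom = os.read(read_fd, 65536)
    os.close(read_fd)
    print(out, err, custom)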

python subprocess pipe unbuffered behaviour

I have the piece of code below to read data from a child process as it is generated and write it to a file: from subprocess import Popen, PIPE proc = Popen('..some_shell_command..', shell=True, stdout=PIPE) fd = open("/tmp/procout", "wb") while True: data = proc.stdout.read(1024) if len(data) == 0: break fd.write(data) fd.close() Popen's default bufsize is 0 => unbuffered. What happens if, for some reason, the write-to-file operation experiences a huge delay?
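
A sketch probing what bufsize=0 actually means here: it only disables buffering of the parent-side file object; a kernel pipe buffer still sits between the two processes, so when the consumer stalls, the child keeps writing until that buffer fills and then blocks in write(), and no output is lost. 'yes' stands in for the shell command from the question (POSIX only).

    import time
    from subprocess import Popen, PIPE

    # 'yes' prints lines forever, standing in for '..some_shell_command..'.
    proc = Popen(["yes", "some output line"], stdout=PIPE, bufsize=0)

    time.sleep(2)                   # simulate a stalled consumer: read nothing for a while
    data = proc.stdout.read(1024)   # the child's earliest output is still there, intact
    print(len(data))

    proc.kill()                     # stop the infinite producer
    proc.stdout.close()
    proc.wait()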

Python, subprocess, pipe and select

I have a Python program where I continuously read the output of another program launched via subprocess.Popen and connected via subprocess.PIPE. The problem I am facing is that it sometimes loses a significant portion of the output from the launched program. For example, monitoring inotify events via a pipe to inotifywait loses many events. These are the relevant functions: process = subprocess.Popen(["inotifywait", "-q", "-r", "-m", "--format", "%e:::::%w%f", srcroot], stdout=subprocess.PIPE, stderr=subprocess.PIPE) polling
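
A sketch of a poll()-based reader, assuming the lost events come from mixing select/poll with buffered reads: data already pulled into Python's buffer by readline() is invisible to the next poll(), so reading the raw descriptor with os.read() and splitting lines manually is the safer pattern. srcroot is a placeholder path.

    import os
    import select
    import subprocess

    srcroot = "/some/watched/dir"   # placeholder
    process = subprocess.Popen(
        ["inotifywait", "-q", "-r", "-m", "--format", "%e:::::%w%f", srcroot],
        stdout=subprocess.PIPE, stderr=subprocess.PIPE)

    poller = select.poll()
    poller.register(process.stdout, select.POLLIN)

    # Note: stderr is left unread here for brevity; a real program should drain
    # it too (register it as well, or redirect it) so it cannot fill up.
    buf = b""
    while True:
        for fd, _event in poller.poll(1000):   # wait up to one second
            buf += os.read(fd, 65536)          # grab everything currently available
            *lines, buf = buf.split(b"\n")     # keep any partial line for next round
            for line in lines:
                print(line.decode("utf-8", errors="replace"))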

Python windows script subprocess continues to output after script ends

Hi, I am writing a Python script on Windows and using subprocess. I have a line like: results = subprocess.Popen(['xyz.exe'], stdout=subprocess.PIPE) After the script ends and I get back to the prompt in cmd, I see more output from the script being printed out. I'm seeing stuff like Could Not Find xxx_echo.txt being printed out repeatedly. How do I properly close the subprocess in Windows? (From an answer:) Could Not Find xxx_echo.txt looks like an error message, probably printed on stderr. Your call to Popen() will not
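
A sketch (Python 3.3+ for the timeout) of the usual remedy: wait for the child and collect its output before the script exits, and terminate it if it will not finish, so nothing is printed to the console after control returns to cmd. xyz.exe is the placeholder name from the question.

    import subprocess

    proc = subprocess.Popen(["xyz.exe"], stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE)

    try:
        # Collect both streams and reap the child before the script returns,
        # so nothing spills onto the console afterwards.
        out, err = proc.communicate(timeout=30)
    except subprocess.TimeoutExpired:
        proc.terminate()            # on Windows this calls TerminateProcess()
        out, err = proc.communicate()

    print(out.decode(errors="replace"))
    print(err.decode(errors="replace"))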

Subprocess error in python

from subprocess import Popen, PIPE from Tkinter import * root = Tk() calc = Frame(root) calc.grid() root.title("Calculator") bt = Button(calc, text="3") bt.grid() process = subprocess.Popen(['python','imacap.py'], stderr=subprocess.STDOUT, stdout=subprocess.PIPE) In the given code I created a GUI using Tkinter in Python. While displaying the GUI app, I want to run a camera-capturing app at the same time, so after googling I found that I should use subprocess.
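
A sketch of the likely immediate fix: the snippet imports Popen and PIPE from subprocess but then refers to subprocess.Popen, which raises NameError because the module itself was never imported. Using the imported names (or importing the module) launches the capture script without blocking the Tkinter mainloop; imacap.py is the script named in the question.

    from subprocess import Popen, PIPE, STDOUT
    from Tkinter import *        # Python 2 as in the question; "tkinter" in Python 3

    root = Tk()
    calc = Frame(root)
    calc.grid()
    root.title("Calculator")
    bt = Button(calc, text="3")
    bt.grid()

    # Use the imported Popen directly; imacap.py is the camera-capture script.
    process = Popen(['python', 'imacap.py'], stderr=STDOUT, stdout=PIPE)

    root.mainloop()              # the GUI runs while the child keeps capturing
    process.terminate()          # stop the capture script when the window closes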

does order matter for p.stdout.read() and p.wait()?

Questions about Python's subprocess.Popen() objects. (Please assume a scenario where the number of bytes being generated for stdout/stderr does not fill up the OS pipe buffers and create a deadlock waiting for the OS pipe buffers to accept more data.) 1) Does it matter what order p.stdout.read() and p.wait() are in? 2) Does read() on a stdout/stderr subprocess.PIPE block until the process terminates? 3) Are the stdout/stderr subprocess.PIPE file objects and their data still available even after the process has terminated? import subprocess process = subprocess.Popen(args=
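
A short sketch (Python 3) walking through the three questions under the stated assumption that the output fits in the OS pipe buffer; the tiny -c child is contrived for illustration.

    import subprocess
    import sys

    p = subprocess.Popen([sys.executable, "-c", "print('hello')"],
                         stdout=subprocess.PIPE)

    # 2) read() with no size argument blocks until EOF, i.e. until the child
    #    exits or closes its stdout, not merely until some data is available.
    data = p.stdout.read()

    # 1) Under the stated assumption either order works; read()-then-wait() is
    #    the generally safer order, since wait()-first can deadlock once the
    #    output outgrows the pipe buffer.
    p.wait()

    # 3) The parent-side pipe object and the bytes already read stay usable
    #    after the child terminates; only new data stops arriving.
    print(data, p.returncode)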