python as a "batch" script (i.e. run commands from python)

I'm working in a Windows environment (my laptop!) and I need a couple of scripts that run other programs, pretty much like a Windows batch file. How can I run a command from Python such that the program, when run, replaces the script? The program is interactive (for instance, unison) and keeps printing lines and asking for user input all the time. So just running the program and printing its output is not enough; the program has to take over the script's input/output, equivalent to running a command from a .bat file. I tried os.execl, but it kept telling me "invalid argument", and it did not find the program name.
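A hedged sketch of the usual workaround: the os.exec* family is awkward on Windows (it does not search PATH the way a batch file does, and argument handling differs), so a batch-style script can instead let the child inherit the console via subprocess.call and then exit with the child's return code. The trivial child command below is a placeholder for the real interactive program, not the asker's unison invocation.

```python
import subprocess
import sys

def run_interactive(cmd):
    # No stdout/stderr/stdin arguments: the child inherits the script's
    # console, so it can print and prompt for input freely, just like a
    # command line in a .bat file.
    return subprocess.call(cmd)

# Placeholder child: Python itself printing one line.
rc = run_interactive([sys.executable, "-c", "print('hello from child')"])
```

To mimic exec semantics fully, the script would end with `sys.exit(rc)` so its exit status mirrors the child's.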

Read subprocess stdout and stderr concurrently

I'm trying to run a lengthy command within Python that outputs to both stdout and stderr. I'd like to poll the subprocess and write the output to separate files. I tried the following, based on the answer to "Non-blocking read on a subprocess.PIPE in python": import subprocess from Queue import Queue, Empty from threading import Thread def send_cmd(cmd, shell=False): """ Send cmd to the shell """ if not isinstance(cmd, list): cmd = shlex.split(cmd)
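A minimal Python 3 sketch of the reader-thread pattern the excerpt is reaching for (the question's `Queue` import is Python 2): one thread per pipe drains stdout and stderr concurrently, so neither OS pipe buffer can fill up and deadlock the child. The lists stand in for the separate output files.

```python
import subprocess
import sys
from threading import Thread

def pump(pipe, sink):
    # Drain one pipe on its own thread; readline returns b"" at EOF.
    for line in iter(pipe.readline, b""):
        sink.append(line)
    pipe.close()

# Placeholder child that writes one line to each stream.
child = subprocess.Popen(
    [sys.executable, "-c",
     "import sys; sys.stdout.write('out\\n'); sys.stderr.write('err\\n')"],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE)

out_lines, err_lines = [], []
threads = [Thread(target=pump, args=(child.stdout, out_lines)),
           Thread(target=pump, args=(child.stderr, err_lines))]
for t in threads:
    t.start()
child.wait()
for t in threads:
    t.join()
```

In a real script each `sink.append` would be a write to the corresponding log file.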

live output from subprocess command

I'm using a Python script as a driver for a hydrodynamics code. When it comes time to run the simulation, I use subprocess.Popen to run the code, collect the output from stdout and stderr into a subprocess.PIPE, and then I can print (and save to a log file) the output information and check for any errors. The problem is, I have no idea how the code is progressing. If I run it directly from the command line, it prints what iteration it is on, what time it has reached, what the next time step will be, and so on. Is there a way to both store the output (for logging and error checking) and also see it live?
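One common sketch: instead of waiting for communicate(), read the merged stream line by line as it is produced, echoing each line to the terminal while keeping a copy for the log. The child command here is a stand-in for the hydrodynamics code.

```python
import subprocess
import sys

# Stand-in for the solver: prints two progress lines.
cmd = [sys.executable, "-c", "print('step 1'); print('step 2')"]

log = []
with subprocess.Popen(cmd, stdout=subprocess.PIPE,
                      stderr=subprocess.STDOUT,  # merge stderr into stdout
                      text=True) as proc:
    for line in proc.stdout:   # yields lines as the child produces them
        print(line, end="")    # live progress on the terminal
        log.append(line)       # copy kept for the log file
```

This works as long as the child flushes its output per line; a solver that block-buffers when writing to a pipe may still appear silent until it exits.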

Catching and outputting stderr at the same time with python's subprocess

(Currently using Python 3.2.) I need to be able to: run a command using subprocess; print both stdout and stderr of that command to the terminal in real time (it doesn't matter whether they both come out on stdout or stderr); and, at the same time, know whether the command printed anything to stderr (and preferably what it printed). I've played around with subprocess pipes, with odd pipe redirections in bash, and with tee, but so far nothing has worked. Is this possible?
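A hedged sketch of one way to satisfy both requirements: pipe stdout and stderr separately, echo each to the terminal from its own thread (a software tee), and keep the stderr lines so the caller can check whether anything was written there. Interleaving order between the two streams is not guaranteed.

```python
import subprocess
import sys
from threading import Thread

def tee(pipe, copies):
    # Echo each line as it arrives and remember it.
    for line in pipe:
        sys.stdout.write(line)
        copies.append(line)
    pipe.close()

# Placeholder command that writes to both streams.
cmd = [sys.executable, "-c",
       "import sys; print('fine'); print('oops', file=sys.stderr)"]
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE, text=True)
out_copy, err_copy = [], []
t_out = Thread(target=tee, args=(proc.stdout, out_copy))
t_err = Thread(target=tee, args=(proc.stderr, err_copy))
t_out.start(); t_err.start()
proc.wait()
t_out.join(); t_err.join()

had_stderr = bool(err_copy)   # did the command print anything to stderr?
```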

Pipe subprocess standard output to a variable

I want to run a command in Python, using the subprocess module, and store the output in a variable. However, I do not want the command's output to be printed to the terminal. For this code: def storels(): a = subprocess.Popen("ls",shell=True) storels() I get the directory listing in the terminal, instead of having it stored in a. I've also tried: def storels(): subprocess.Popen("ls > tmp",shell=True) a = open("./tmp") [Rest of Code] storels() This also prints the output of ls to my terminal.
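The usual fix: without stdout=subprocess.PIPE the child inherits the terminal, so nothing is captured. subprocess.check_output wraps that plumbing and returns the output directly (raising CalledProcessError on a nonzero exit code). The child command below is a portable stand-in for ls.

```python
import subprocess
import sys

# Placeholder for "ls": a child that prints one name.
a = subprocess.check_output([sys.executable, "-c", "print('file1')"],
                            text=True)
# a now holds the output; nothing reached the terminal.
```

A list of arguments is preferred over shell=True with a command string, both for safety and so no shell quoting is needed.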

Constantly print Subprocess output while process is running

To launch programs from my Python scripts, I'm using the following method: def execute(command): process = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT) output = process.communicate()[0] exitCode = process.returncode if (exitCode == 0): return output else: raise ProcessException(command, exitCode, output)
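communicate() only returns after the child exits, which is why nothing appears while the process runs. A sketch of the same helper reworked to echo lines as they arrive while still returning the full output and checking the exit code (CalledProcessError stands in for the asker's ProcessException):

```python
import subprocess
import sys

def execute(cmd):
    """Run cmd, echoing its output live, and return it all at the end."""
    lines = []
    with subprocess.Popen(cmd, stdout=subprocess.PIPE,
                          stderr=subprocess.STDOUT, text=True) as proc:
        for line in proc.stdout:   # streamed, not buffered by communicate()
            print(line, end="")
            lines.append(line)
    # Popen.__exit__ has waited for the child, so returncode is set now.
    if proc.returncode != 0:
        raise subprocess.CalledProcessError(proc.returncode, cmd)
    return "".join(lines)

out = execute([sys.executable, "-c", "print('working...')"])
```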

Python subprocess.Popen strange behavior

I'm trying to run a process and, when it's done, read its stderr and stdout. The only way I found to do this in Python is to create a subprocess.Popen with subprocess.PIPE as the stderr and stdout arguments, and then read the Popen.stderr and Popen.stdout file descriptors. In [133]: subprocess.call(['python', '-c', "import sys; sys.exit(1)"]) Out[133]: 1 In [134]: p = subprocess.Popen(['python', '-c', "import sys; sys.exit(1)"]) In [135]: p.returncode In [136]:
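The "strange behavior" in the transcript (In [135] printing nothing) is that Popen.returncode stays None until the process has been reaped by wait(), poll(), or communicate(). A short sketch, using sys.executable instead of a bare 'python' for portability:

```python
import subprocess
import sys

p = subprocess.Popen([sys.executable, "-c", "import sys; sys.exit(1)"])
before = p.returncode   # None: the child has not been waited on yet
p.wait()                # wait() (or poll()) reaps the child...
after = p.returncode    # ...and only then is returncode set
```

subprocess.call, by contrast, waits internally, which is why it returned 1 immediately in the transcript.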

python subprocess Popen environment PATH?

I'm confused about how subprocess searches for the executable when using Popen(). It works if I give an absolute path to the child process, but I'm trying to use a relative path. I've found that if I set the environment variable PYTHONPATH, then I can import modules from that path fine, and PYTHONPATH shows up in sys.path, but it doesn't seem to affect the behaviour of subprocess.Popen. I also tried editing the sitecustomize.py file, adding PYTHONPATH to os.environ, like this: # copy PYTHONPATH environment variable into PATH to allow our stuff
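The core point: PYTHONPATH only affects module imports; executable lookup goes through the PATH environment variable. A sketch that sidesteps platform-specific lookup rules by resolving the bare name explicitly with shutil.which against a modified PATH (the interpreter's own directory is used as the example directory to add):

```python
import os
import shutil
import subprocess
import sys

# Put the directory containing the wanted executable first on PATH
# (here: the directory holding the Python interpreter itself).
env = os.environ.copy()
env["PATH"] = os.path.dirname(sys.executable) + os.pathsep + env["PATH"]

# Resolve the bare program name against that PATH explicitly, then run
# the absolute path -- this avoids relying on how each OS searches PATH.
exe_name = os.path.basename(sys.executable)
resolved = shutil.which(exe_name, path=env["PATH"])
rc = subprocess.call([resolved, "-c", "pass"], env=env)
```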

capturing stderr of subprocesses with their shell stdout alive

Here's what I'm trying to do: the Python process captures stderr of multiple subprocesses, in order to watch those subprocesses; each subprocess runs in a separate window, displaying stdout. When I use Popen(command, stderr=fp4tempfile), (good) the Python process can capture stderr of the subprocesses, but (bad) the subprocess shells stop displaying stdout. When I use Popen(command), (good) each subprocess shell keeps displaying its stdout.
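One sketch of the middle ground: redirect only stderr, leaving stdout untouched so it still goes to the child's own console or window. The child command below is a placeholder for the watched subprocesses.

```python
import subprocess
import sys

# Placeholder child: stdout stays on the console, stderr is captured.
cmd = [sys.executable, "-c",
       "import sys; print('visible'); print('logged', file=sys.stderr)"]

proc = subprocess.Popen(cmd, stderr=subprocess.PIPE, text=True)
_, err = proc.communicate()   # stdout was not piped, so it comes back None
# err now holds everything the child wrote to stderr.
```

With several children, each would get its own Popen (and, if needed, a reader thread per stderr pipe, or stderr=open(tempfile) as in the question but without touching stdout).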

Python subprocess/Popen with a modified environment

I believe that running an external command with a slightly modified environment is a very common case. That's how I tend to do it: import subprocess, os my_env = os.environ my_env["PATH"] = "/usr/sbin:/sbin:" + my_env["PATH"] subprocess.Popen(my_command, env=my_env) I've got a gut feeling that there's a better way; does it look alright? [Answer] I think os.environ.copy() is better if you don't intend to modify os.environ for the current process: import subprocess, os my_env = os.environ.copy()
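The answer's point made concrete: assigning through os.environ mutates the parent process's environment, while os.environ.copy() gives an independent dict to hand to the child. A runnable sketch (the child just echoes its PATH back so the change is observable):

```python
import os
import subprocess
import sys

# Copy first, so the parent's own environment is left untouched.
my_env = os.environ.copy()
my_env["PATH"] = "/usr/sbin:/sbin:" + my_env.get("PATH", "")

# The child sees the modified PATH; os.environ in this process does not.
out = subprocess.check_output(
    [sys.executable, "-c", "import os; print(os.environ['PATH'])"],
    env=my_env, text=True)
```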