How can I read the memory of a process in Python on Linux?
I'm trying to use Python and python-ptrace to read the memory of an external process. I need to work entirely in Python, and I've been trying to read and print out the memory of a process on Linux.
So, for example, I've tried the following code, which keeps giving me IOError:
proc_mem = open("/proc/%i/mem" % process.pid, "r")
print proc_mem.read()
proc_mem.close()
Mostly I just want to repeatedly dump the memory of a process and look for changes over time. If this is the correct way to do it, what is my problem? Or is there a more appropriate way to do this?
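For reference on why the plain read fails: /proc/<pid>/mem cannot be read end-to-end; only offsets that fall inside the target's mapped regions (listed in /proc/<pid>/maps) are readable, and the reader needs ptrace-level permission over the target. Below is a minimal sketch of pairing the two files, assuming the caller is allowed to trace the process; the region filtering and error handling here are illustrative, not a drop-in solution.

def dump_readable_regions(pid):
    # Collect readable address ranges from /proc/<pid>/maps.
    regions = []
    with open("/proc/%d/maps" % pid) as maps_file:
        for line in maps_file:
            fields = line.split()
            addr_range, perms = fields[0], fields[1]
            if not perms.startswith("r"):
                continue                  # skip unreadable mappings
            start, end = [int(x, 16) for x in addr_range.split("-")]
            if end > 2 ** 63:
                continue                  # skip kernel-side mappings like [vsyscall]
            regions.append((start, end))

    # Read each region's bytes out of /proc/<pid>/mem.
    dump = {}
    with open("/proc/%d/mem" % pid, "rb") as mem_file:
        for start, end in regions:
            try:
                mem_file.seek(start)      # only mapped offsets are readable
                dump[start] = mem_file.read(end - start)
            except (IOError, OSError):    # region may vanish or refuse reads
                continue
    return dump

Comparing the dictionaries returned by two calls made at different times gives the change-over-time view described above.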
Call a shell command from Python using the subprocess module:
import subprocess
# Equivalent to: ps -ux | grep 1842  (assuming 1842 is the process id; replace it with the one you want)
p1 = subprocess.Popen(["ps", "-ux"], stdout=subprocess.PIPE)
p2 = subprocess.Popen(["grep", "1842"], stdin=p1.stdout, stdout=subprocess.PIPE)
p1.stdout.close() # Allow p1 to receive a SIGPIPE if p2 exits.
output = p2.communicate()[0]
print output
and parse the output to see the process's memory utilization (for example, the VSZ and RSS columns), as sketched below.
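A sketch of that parsing step, assuming the default BSD-style ps -ux column order (USER PID %CPU %MEM VSZ RSS ...), with VSZ and RSS reported in kilobytes; the helper name is just illustrative:

def memory_usage_kb(ps_output, pid):
    # Return (VSZ, RSS) in kilobytes for the given pid, or None if it is absent.
    for line in ps_output.splitlines():
        fields = line.split()
        if len(fields) > 5 and fields[1] == str(pid):
            return int(fields[4]), int(fields[5])
    return None

Calling memory_usage_kb(output, 1842) on the pipeline's output returns those two columns, which can be logged on each iteration (under Python 3 the bytes from communicate() would need to be decoded first).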
You may be interested in gdb's reverse debugging, which records all changes to process memory. Here is the tutorial (google cache).
There is also Robert O'Callahan's Chronicle/Chronomancer work, if you want to play with the raw recording tools.