Python: how to share an object instance across multiple invocations of a script

I'm using a library which provides a Python interface to an external program. This allows me to create:

foo = Foo()

The code above starts a new instance of the Foo program that I can control from within python.

I have a Python script which needs to be invoked multiple times and, depending on external parameters, tell a single instance of the external Foo program to do different things. Obviously I can't do

foo = Foo() every time,

since that creates a new instance of Foo every time my script runs.

What I want to do is create foo = Foo() once and have multiple invocations share the same instance. Currently I'm thinking of just creating it once, serializing it, and having my scripts deserialize it. Does this approach work? Is there a better alternative?

Thanks!!


This can be done if you follow an approach similar to that given in this answer. Or you can use Pyro, which is compared to multiprocessing in this answer.


You might be able to use pickle, though note that it only persists the Python object's state, not a live connection to the running external program. Here's a simple example:

import os, pickle

class Foo(object):
    def __init__(self, bar):
        self.bar = bar

# reuse the previously pickled instance if one exists
if os.path.exists('foo.pickle'):
    # pickle files must be opened in binary mode
    with open('foo.pickle', 'rb') as f:
        foo = pickle.load(f)
else:
    foo = Foo(42)

# print the current foo.bar
print(foo.bar)

# change foo.bar and pickle it again
foo.bar = input('new bar: ')
with open('foo.pickle', 'wb') as f:
    pickle.dump(foo, f)

You could also change the design so that Foo() just connects to an already-running process, and add a new function, say startFoo(), that you call once beforehand (or whenever Foo() fails to connect). Even better would be to make the program that Foo() connects to a service you reach over a socket. You might also want to switch to the multiprocessing module.
