Posted: 2014-01-07 17:03:59
Problem description:
I have a Python script that spawns a daemon process. Inside that process I use a multiprocessing.Pool to run 1 to 4 processes concurrently.
When I run it outside the daemon it works perfectly (i.e., when I set run_from_debugger=True, see the code below), but if I run the code through the daemon (run_from_debugger=False), async_function is never executed.
Is it possible to use a multiprocessing.Pool inside a daemon process?
I am using python-daemon 1.6 as my daemon package, in case that matters.
Code:
from multiprocessing import Pool
from daemon import runner

def loop_callback(params):
    # Spawn the task in the pool.
    # Because loop_callback is called many times, often faster than
    # async_function executes, submitting the tasks to a pool allows
    # them to run in parallel.
    pool.apply_async(async_function, params)

def run_service():
    # loop() calls loop_callback repeatedly, and it will call
    # loop_callback faster than the code in async_function executes.
    loop(alignment_watch_folder, sleep_duration)

# Class declaration
app = App()

# Declare a pool of processes (processes=1 would mean serial execution)
pool = Pool(processes=4)

# Either run from a daemon process or not
run_from_debugger = False

# Run the daemon process
if run_from_debugger:
    run_service()
else:
    daemon_runner = runner.DaemonRunner(app)
    daemon_runner.do_action()
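For context, a likely culprit is that python-daemon forks and detaches the process when DaemonRunner.do_action() runs, so a Pool created before daemonization loses its worker processes. A minimal stdlib-only sketch of the workaround, creating the Pool inside the service entry point instead (daemonization omitted; async_function is simplified to a toy addition here, and loop/App from the code above are left out):

```python
from multiprocessing import Pool

def async_function(x, y):
    # Stand-in for the real worker task.
    return x + y

def run_service():
    # Create the Pool *inside* the (daemonized) service entry point,
    # after any fork performed by the daemon library, so the worker
    # processes are children of the daemon itself.
    pool = Pool(processes=4)
    results = [pool.apply_async(async_function, (i, i)) for i in range(4)]
    pool.close()
    pool.join()
    return [r.get() for r in results]

if __name__ == "__main__":
    print(run_service())  # [0, 2, 4, 6]
```

With this structure, run_service() can be passed to the daemon runner unchanged, and the pool is only constructed once the daemon is running.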
Any suggestions would be greatly appreciated.
Discussion:
Tags: python multiprocessing daemon