[Posted]: 2017-03-25 02:27:44
[Question]:
The number of entries in my shared dictionary object is inconsistent. It should hold 500, but most test runs end up with between 450 and 465. I have also tried map and Process instead of apply_async.
map does slightly better: the shared dictionary ends up with around 480 entries instead of around 450, but it is still inconsistent and still short of the expected 500.
I also tried Process, but that left the fewest entries in the shared dictionary, around 420.
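All three variants perform a non-atomic read-copy-write on the shared dictionary: a worker reads the list, appends to its private copy, and writes the copy back. Two workers that read the same snapshot will overwrite each other's append. A minimal single-process sketch of that lost-update interleaving, with hypothetical values:

```python
# Lost-update interleaving, simulated sequentially (hypothetical values):
# both "workers" read the same snapshot, so one append is overwritten.
d = {"width": []}

snapshot_a = list(d["width"])  # worker A reads the current list
snapshot_b = list(d["width"])  # worker B reads the same, still-empty list
snapshot_a.append(100)         # A appends its image's width
snapshot_b.append(200)         # B appends its image's width
d["width"] = snapshot_a        # A writes back -> [100]
d["width"] = snapshot_b        # B writes back -> [200], A's entry is lost
print(d["width"])              # prints [200], not [100, 200]
```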
Here is the complete code using apply_async:
import numpy as np
from PIL import Image
from os import listdir
from multiprocessing import Manager, Pool

def processImage(path, d):
    image = np.array(Image.open(source + "/" + path))
    # Copy lists from shared dictionary since updates don't work otherwise
    w = d["width"]
    h = d["height"]
    w.append(image.shape[0])
    h.append(image.shape[1])
    d["width"] = w
    d["height"] = h

if __name__ == "__main__":
    source = "./sample/images"
    p = Pool()
    m = Manager()
    d = m.dict()
    d["width"], d["height"] = [], []
    for path in listdir(source):
        p.apply_async(processImage, (path, d))
    p.close()
    p.join()
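One thing worth noting about the apply_async variant (an aside, not from the original post): apply_async returns an AsyncResult, and an exception raised inside a worker stays hidden until .get() is called on that result, so failed tasks can silently shrink the entry count. A minimal sketch, with a hypothetical square function standing in for processImage:

```python
from multiprocessing import Pool

def square(x):
    # Hypothetical stand-in for processImage
    return x * x

if __name__ == "__main__":
    with Pool() as pool:
        # Keep the AsyncResult handles instead of discarding them
        handles = [pool.apply_async(square, (i,)) for i in range(5)]
        # .get() re-raises any exception that occurred in the worker
        values = [h.get() for h in handles]
        print(values)  # [0, 1, 4, 9, 16]
```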
Here is the complete code using map (note that the original snippet was missing the itertools import):
import itertools

def processImage(obj):
    image = np.array(Image.open(source + "/" + obj[1]))
    w = obj[0]["width"]
    h = obj[0]["height"]
    w.append(image.shape[0])
    h.append(image.shape[1])
    obj[0]["width"] = w
    obj[0]["height"] = h

if __name__ == "__main__":
    source = "./sample/images"
    p = Pool()
    m = Manager()
    d = m.dict()
    d["width"], d["height"] = [], []
    p.map(processImage, zip(itertools.repeat(d), listdir(source)))
Here is the complete code using Process (note that the original snippet was missing the Process import and created an unused Pool that was immediately shadowed):
from multiprocessing import Process

def processImage(path, d):
    image = np.array(Image.open(source + "/" + path))
    w = d["width"]
    h = d["height"]
    w.append(image.shape[0])
    h.append(image.shape[1])
    d["width"] = w
    d["height"] = h

if __name__ == "__main__":
    source = "./sample/images"
    m = Manager()
    d = m.dict()
    d["width"], d["height"] = [], []
    jobs = []
    for img in listdir(source):
        p = Process(target=processImage, args=(img, d))
        p.start()
        jobs.append(p)
    for j in jobs:
        j.join()
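A race-free alternative (my sketch, not part of the original post): let each worker return its pair of dimensions and let Pool.map collect the results, so no shared state is mutated at all. The names imageSize and collectSizes below are hypothetical:

```python
import numpy as np
from PIL import Image
from os import listdir, path
from multiprocessing import Pool

def imageSize(filepath):
    # Each worker returns its result; nothing shared is written
    image = np.array(Image.open(filepath))
    return image.shape[0], image.shape[1]

def collectSizes(source):
    # Pool.map returns exactly one pair per file, in input order,
    # so the totals always match the number of images
    files = [path.join(source, f) for f in listdir(source)]
    with Pool() as pool:
        sizes = pool.map(imageSize, files)
    widths = [s[0] for s in sizes]
    heights = [s[1] for s in sizes]
    return widths, heights
```

Calling collectSizes("./sample/images") would then replace all three shared-dictionary variants above with a result of guaranteed length.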
[Discussion]:
Tags: python multiprocessing pool