【Question Title】: Celery - task succeeds before its subtasks
【Posted】: 2017-06-27 13:01:54
【Question】:

I can't figure out why my task is considered finished before all of its subtasks have completed.

tasks.scan_user.delay(1)

Code:

from celery import chain

@task()
def scan_chunk(ids):
    # scan every occurrence attached to the given product ids
    occs = Occurence.objects.filter(product_id__in=ids)
    result = scan_occurences_dummy_pool((x.id, x.url, x.xpath) for x in occs)
    return result

@task()
def scan_user(id):
    chunks = generate_chunks_from_user(id)
    # build a chain of scan_chunk subtasks and launch it
    ch = chain(scan_chunk.si([x.id for x in chunk]) for chunk in chunks)
    return ch()

Here is the Celery output. As you can see, scan_user succeeds before all of the scan_chunk tasks have completed, which is a problem because I want to use scan_user as a step inside another chain.

[2017-02-09 14:27:03,493: INFO/MainProcess] Received task: engineapp.tasks.scan_user[ed358a98-a685-4002-baac-993fdc7b64cf]
[2017-02-09 14:27:05,721: INFO/MainProcess] Received task: engineapp.tasks.scan_chunk[35b74e01-f9fa-471f-8c20-ecbf99a89201]
[2017-02-09 14:27:06,740: INFO/MainProcess] Task engineapp.tasks.scan_user[ed358a98-a685-4002-baac-993fdc7b64cf] succeeded in 3.24300003052s: <AsyncResult: 442f9373-d983-4696-a42a-ba42a8ce7761>
[2017-02-09 14:27:22,178: INFO/MainProcess] Received task: engineapp.tasks.scan_chunk[36a94ad4-3c9e-4f7d-a040-5c2a617a0d8f]
[2017-02-09 14:27:23,204: INFO/MainProcess] Task engineapp.tasks.scan_chunk[35b74e01-f9fa-471f-8c20-ecbf99a89201] succeeded in 17.4779999256s: [

I would like to create another task that runs scan_user sequentially for every user, but I don't think that is possible, since in practice everything runs in parallel.
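
For illustration, this is roughly what I have in mind (a rough sketch only; get_user_ids is a placeholder helper, not real code from my project):

from celery import chain

@task()
def scan_all_users():
    user_ids = get_user_ids()  # placeholder for however the user ids are fetched
    # chain one scan_user per user, intending them to run one after another
    return chain(scan_user.si(uid) for uid in user_ids)()

But since scan_user returns as soon as it has dispatched its own inner chain, the next scan_user in this outer chain starts before the previous user's scan_chunk tasks have finished.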

【Question Comments】:

    Tags: python python-2.7 celery celery-task


    【Solution 1】:

    ch() just starts the chain without waiting for its result. If you want to wait for the result, do the following:

    ch = chain(scan_chunk.si([x.id for x in chunk]) for chunk in chunks)()
    return ch.get()
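
    Applied to the scan_user task from the question, that would look roughly like this (a sketch under the same imports as the question; note that Celery generally discourages waiting on subtask results from inside a task, because ch.get() blocks the worker process until every scan_chunk in the chain has finished):

    @task()
    def scan_user(id):
        chunks = generate_chunks_from_user(id)
        # launch the chain of scan_chunk subtasks ...
        ch = chain(scan_chunk.si([x.id for x in chunk]) for chunk in chunks)()
        # ... and block until the last scan_chunk has completed,
        # so scan_user is only marked as succeeded after its subtasks
        return ch.get()

    With this, scan_user no longer reports success immediately, which also makes it usable as a step in the outer per-user chain mentioned in the question, provided there are enough worker processes left for the scan_chunk tasks to be picked up while scan_user is waiting.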
    

    【Comments】: