【Posted】: 2021-05-09 03:23:48
【Problem Description】:
In my Django project I have two tasks, in two different apps, that I want to run periodically with Celery.
The worker seems to pick up the tasks and beat seems to pick up the schedule. However, beat gets stuck after starting (it does not sync the schedule) and never delivers the tasks to the worker.
The command celery --app=bozonaro worker --loglevel=debug --beat (bozonaro is the name of my Django project) gives me the following output:
[2021-02-04 18:23:48,080: DEBUG/MainProcess] | Worker: Preparing bootsteps.
[2021-02-04 18:23:48,103: DEBUG/MainProcess] | Worker: Building graph...
[2021-02-04 18:23:48,104: DEBUG/MainProcess] | Worker: New boot order: {Timer, Hub, Pool, Autoscaler, StateDB, Beat, Consumer}
[2021-02-04 18:23:48,257: DEBUG/MainProcess] | Consumer: Preparing bootsteps.
[2021-02-04 18:23:48,257: DEBUG/MainProcess] | Consumer: Building graph...
[2021-02-04 18:23:48,413: DEBUG/MainProcess] | Consumer: New boot order: {Connection, Events, Heart, Agent, Mingle, Gossip, Tasks, Control, event loop}
-------------- celery@LAPTOP-E5L3SQ6N v5.0.5 (singularity)
--- ***** -----
-- ******* ---- Linux-4.19.128-microsoft-standard-x86_64-with-glibc2.29 2021-02-04 18:23:48
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app: bozonaro:0x7fc00b16d4c0
- ** ---------- .> transport: redis://localhost:6379//
- ** ---------- .> results: redis://localhost:6379/
- *** --- * --- .> concurrency: 8 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
.> celery exchange=celery(direct) key=celery
[tasks]
. actions.tasks.post_action_tweet
. bozonaro.celery.debug_task
. celery.accumulate
. celery.backend_cleanup
. celery.chain
. celery.chord
. celery.chord_unlock
. celery.chunks
. celery.group
. celery.map
. celery.starmap
. quotes.tasks.post_quote_tweet
[2021-02-04 18:23:48,468: DEBUG/MainProcess] | Worker: Starting Hub
[2021-02-04 18:23:48,468: DEBUG/MainProcess] ^-- substep ok
[2021-02-04 18:23:48,468: DEBUG/MainProcess] | Worker: Starting Pool
[2021-02-04 18:23:48,538: DEBUG/ForkPoolWorker-2] Using selector: EpollSelector
[2021-02-04 18:23:48,574: DEBUG/ForkPoolWorker-3] Using selector: EpollSelector
[2021-02-04 18:23:48,608: DEBUG/ForkPoolWorker-4] Using selector: EpollSelector
[2021-02-04 18:23:48,641: DEBUG/ForkPoolWorker-5] Using selector: EpollSelector
[2021-02-04 18:23:48,675: DEBUG/ForkPoolWorker-6] Using selector: EpollSelector
[2021-02-04 18:23:48,708: DEBUG/ForkPoolWorker-7] Using selector: EpollSelector
[2021-02-04 18:23:48,741: DEBUG/ForkPoolWorker-8] Using selector: EpollSelector
[2021-02-04 18:23:48,773: DEBUG/MainProcess] ^-- substep ok
[2021-02-04 18:23:48,773: DEBUG/MainProcess] | Worker: Starting Beat
[2021-02-04 18:23:48,774: DEBUG/MainProcess] ^-- substep ok
[2021-02-04 18:23:48,775: DEBUG/MainProcess] | Worker: Starting Consumer
[2021-02-04 18:23:48,775: DEBUG/MainProcess] | Consumer: Starting Connection
[2021-02-04 18:23:48,776: DEBUG/ForkPoolWorker-9] Using selector: EpollSelector
[2021-02-04 18:23:48,781: INFO/Beat] beat: Starting...
[2021-02-04 18:23:48,784: INFO/MainProcess] Connected to redis://localhost:6379//
[2021-02-04 18:23:48,784: DEBUG/MainProcess] ^-- substep ok
[2021-02-04 18:23:48,784: DEBUG/MainProcess] | Consumer: Starting Events
[2021-02-04 18:23:48,790: DEBUG/MainProcess] ^-- substep ok
[2021-02-04 18:23:48,790: DEBUG/MainProcess] | Consumer: Starting Heart
[2021-02-04 18:23:48,792: DEBUG/MainProcess] ^-- substep ok
[2021-02-04 18:23:48,792: DEBUG/MainProcess] | Consumer: Starting Mingle
[2021-02-04 18:23:48,792: INFO/MainProcess] mingle: searching for neighbors
[2021-02-04 18:23:48,810: DEBUG/Beat] Current schedule:
<ScheduleEntry: post-quote-tweet-every-day-at-18h quotes.tasks.post_quote_tweet() <crontab: 0 18 * * * (m/h/d/dM/MY)>
<ScheduleEntry: post-action-tweet-every-day-at-8h actions.tasks.post_action_tweet() <crontab: 0 8 * * * (m/h/d/dM/MY)>
[2021-02-04 18:23:48,810: DEBUG/Beat] beat: Ticking with max interval->5.00 minutes
[2021-02-04 18:23:48,811: DEBUG/Beat] beat: Waking up in 5.00 minutes.
[2021-02-04 18:23:49,808: INFO/MainProcess] mingle: all alone
[2021-02-04 18:23:49,809: DEBUG/MainProcess] ^-- substep ok
[2021-02-04 18:23:49,809: DEBUG/MainProcess] | Consumer: Starting Gossip
[2021-02-04 18:23:49,813: DEBUG/MainProcess] ^-- substep ok
[2021-02-04 18:23:49,813: DEBUG/MainProcess] | Consumer: Starting Tasks
[2021-02-04 18:23:49,831: DEBUG/MainProcess] ^-- substep ok
[2021-02-04 18:23:49,831: DEBUG/MainProcess] | Consumer: Starting Control
[2021-02-04 18:23:49,833: DEBUG/MainProcess] ^-- substep ok
[2021-02-04 18:23:49,833: DEBUG/MainProcess] | Consumer: Starting event loop
[2021-02-04 18:23:49,833: DEBUG/MainProcess] | Worker: Hub.register Pool...
[2021-02-04 18:23:49,833: INFO/MainProcess] celery@LAPTOP-E5L3SQ6N ready.
[2021-02-04 18:23:49,833: DEBUG/MainProcess] basic.qos: prefetch_count->32
[2021-02-04 18:28:48,850: DEBUG/Beat] beat: Synchronizing schedule...
[2021-02-04 18:28:48,856: DEBUG/Beat] beat: Waking up in 5.00 minutes.
[2021-02-04 18:33:48,909: DEBUG/Beat] beat: Synchronizing schedule...
[2021-02-04 18:33:48,918: DEBUG/Beat] beat: Waking up in 5.00 minutes.
As you can see, beat fails to synchronize the schedule, which means no entries get written, and so on...
I have already wasted a lot of time looking for a solution, but everything I found either does not work or refers to older versions of Celery. So, what am I missing?
The rest of my configuration:
bozonaro/__init__.py
from .celery import app as celery_app
__all__ = ("celery_app",)
bozonaro/settings.py
import os  # needed for os.environ below
from decouple import config as secret
CELERY_BROKER_URL = os.environ.get("REDIS_URL", secret("REDIS_URL"))
CELERY_RESULT_BACKEND = os.environ.get("REDIS_URL", secret("REDIS_URL"))
CELERY_ACCEPT_CONTENT = ["application/json"]
CELERY_RESULT_SERIALIZER = "json"
CELERY_TASK_SERIALIZER = "json"
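Just to be sure these settings actually reach Celery, a quick check can be run from a Django shell (a sketch assuming the project layout shown here, not part of my files):
# Run inside `python manage.py shell`; confirms that config_from_object(..., namespace="CELERY")
# (see bozonaro/celery.py below) picked up the CELERY_* settings.
from bozonaro import celery_app
print(celery_app.conf.broker_url)      # should print the REDIS_URL value (from CELERY_BROKER_URL)
print(celery_app.conf.result_backend)  # should print the REDIS_URL value (from CELERY_RESULT_BACKEND)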
bozonaro/celery.py
import os
from celery import Celery
from celery.schedules import crontab
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "bozonaro.settings")
app = Celery("bozonaro")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()
app.conf.beat_schedule = {
    "post-action-tweet-every-day-at-8h": {
        "task": "actions.tasks.post_action_tweet",
        "schedule": crontab(minute=0, hour=8),
    },
    "post-quote-tweet-every-day-at-18h": {
        "task": "quotes.tasks.post_quote_tweet",
        "schedule": crontab(minute=0, hour=18),
    },
}

@app.task(bind=True)
def debug_task(self):
    print(f"Request: {self.request!r}")
actions/tasks.py
from celery import shared_task
from core.tasks_helper import get_generic_request, post_entity_tweet
from .views import get_random_action
@shared_task
def post_action_tweet() -> None:
    action = get_random_action(request=get_generic_request()).data
    post_entity_tweet(infos=action, type_="action")
quotes/tasks.py
from celery import shared_task
from core.tasks_helper import get_generic_request, post_entity_tweet
from .views import get_random_quote
@shared_task
def post_quote_tweet() -> None:
    quote = get_random_quote(request=get_generic_request()).data
    post_entity_tweet(infos=quote, type_="quote")
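For completeness, the tasks can also be queued by hand from a Django shell, which separates the worker from beat (sketch only, not something in my project files):
# Run inside `python manage.py shell`; bypasses beat entirely.
from actions.tasks import post_action_tweet
from quotes.tasks import post_quote_tweet

result = post_action_tweet.delay()   # enqueue immediately
print(result.id, result.state)       # the worker log should show the task being received/executed
post_quote_tweet.delay()
If these run fine on the worker, the problem is isolated to beat delivery.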
I am using Python 3.9.1, Ubuntu 20.04 (on WSL), Celery 5.0.5, and Redis 3.5.3.
Thanks a lot!
Edit:
"What is not working?"
Beat is not sending the tasks to the worker.
"What do you mean when you say celery-beat fails to synchronize the schedule?"
That is just my reading of the log. As you can see in its last lines, beat endlessly repeats the following pattern:
beat: Synchronizing schedule...
beat: Waking up in 5.00 minutes.
beat: Synchronizing schedule...
beat: Waking up in 5.00 minutes.
"Also, please explain what you expected to happen versus what actually happened - or did not happen."
I expected beat to send the tasks to the worker at the scheduled times. The schedule itself looks correct...
Current schedule:
<ScheduleEntry: post-quote-tweet-every-day-at-18h quotes.tasks.post_quote_tweet() <crontab: 0 18 * * * (m/h/d/dM/MY)>
<ScheduleEntry: post-action-tweet-every-day-at-8h actions.tasks.post_action_tweet() <crontab: 0 8 * * * (m/h/d/dM/MY)>
...but nothing happens on the worker side.
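One further check I could do (a sketch only, assuming the default "celery" queue shown in the [queues] section of the worker banner) is to look at the queue length in Redis, to tell apart "beat never enqueues anything" from "messages pile up but the worker never consumes them":
import redis  # redis-py is already installed, since Redis is the broker

r = redis.Redis(host="localhost", port=6379)
print(r.llen("celery"))  # 0 if nothing is waiting; a growing number means unconsumed messages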
【Comments】:
- Can you explain in more detail what exactly is not working? What do you mean when you say celery-beat fails to synchronize the schedule? Also, please explain what you expected to happen versus what actually happened - or did not happen. Thanks.
- @damon I just edited the question. Thanks!
Tags: python django redis celery celerybeat