```python
import os

from celery import Celery

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings.local")

app = Celery("app_name")

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object("django.conf:settings", namespace="CELERY")

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
```
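(For context: `namespace="CELERY"` means Celery reads only Django settings carrying a `CELERY_` prefix, strips the prefix, and lowercases the rest to its own option names. A stand-alone sketch of that filtering — this is an illustration of the concept, not Celery's actual implementation, and the settings values are hypothetical:)

```python
def extract_namespaced_settings(settings: dict, namespace: str = "CELERY") -> dict:
    """Illustrative sketch: with namespace='CELERY', only settings with a
    CELERY_ prefix are picked up; the prefix is stripped and the remainder
    lowercased to match Celery's own option names."""
    prefix = namespace + "_"
    return {
        key[len(prefix):].lower(): value
        for key, value in settings.items()
        if key.startswith(prefix)
    }


# Hypothetical Django settings module contents:
django_settings = {
    "DEBUG": True,  # ignored: no CELERY_ prefix
    "CELERY_BROKER_URL": "redis://localhost:6379/0",
    "CELERY_RESULT_BACKEND": "redis://localhost:6379/0",
}

print(extract_namespaced_settings(django_settings))
# {'broker_url': 'redis://localhost:6379/0', 'result_backend': 'redis://localhost:6379/0'}
```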
What happened?
The worker rejected the task: `Received unregistered task of type 'users.tasks.get_users_count'`.
What should've happened instead?
The task should run and return the user count.
I tried many things.
1. Created a new app with its own tasks.py
I ran `startapp shop` and added a tasks.py to the shop app:
# tasks.py
Result: works correctly:
[2023-12-24 19:36:28,767: INFO/MainProcess] Task shop.tasks.add[c6eaabd8-cf80-4d21-9974-000bf7a9fe9d] received
[2023-12-24 19:36:28,767: DEBUG/MainProcess] TaskPool: Apply <function fast_trace_task at 0x106824700> (args:('shop.tasks.add', 'c6eaabd8-cf80-4d21-9974-000bf7a9fe9d', {'lang': 'py', 'task': 'shop.tasks.add', 'id': 'c6eaabd8-cf80-4d21-9974-000bf7a9fe9d', 'shadow': None, 'eta': None, 'expires': None, 'group': None, 'group_index': None, 'retries': 0, 'timelimit': [None, None], 'root_id': 'c6eaabd8-cf80-4d21-9974-000bf7a9fe9d', 'parent_id': None, 'argsrepr': '(1, 2)', 'kwargsrepr': '{}', 'origin': '[email protected]', 'ignore_result': False, 'replaced_task_nesting': 0, 'stamped_headers': None, 'stamps': {}, 'properties': {'correlation_id': 'c6eaabd8-cf80-4d21-9974-000bf7a9fe9d', 'reply_to': 'bad5509d-676c-3c89-bd15-64929f7e7df0', 'delivery_mode': 2, 'delivery_info': {'exchange': '', 'routing_key': 'celery'}, 'priority': 0, 'body_encoding': 'base64', 'delivery_tag': '4a28df74-63a1-4027-ae9b-a30433cc53ab'}, 'reply_to': 'bad5509d-676c-3c89-bd15-64929f7e7df0', 'correlation_id': 'c6eaabd8-cf80-4d21-9974-000bf7a9fe9d', 'hostname': '[email protected]', 'delivery_info': {'exchange': '',... kwargs:{})
[2023-12-24 19:36:28,772: INFO/ForkPoolWorker-8] Task shop.tasks.add[c6eaabd8-cf80-4d21-9974-000bf7a9fe9d] succeeded in 0.004394083996885456s: 3
2. Added the function in app_name/app_name/users/tasks.py
# app_name/config/celery_app.py
# tasks.py
3. Tried a Stack Overflow answer, which didn't work either:
[Celery Received unregistered task of type (run example)](https://stackoverflow.com/questions/9769496/celery-received-unregistered-task-of-type-run-example?page=1&tab=votes#tab-top)
Additional details
Host system configuration:
- Version of cookiecutter CLI (`cookiecutter --version`): 2.5.0
- OS name and version: macOS 13.3.1 (arm64)
- Python version (`python3 -V`): 3.10.11

Full worker log:
[2023-12-24 19:25:47,761: DEBUG/MainProcess] | Worker: Preparing bootsteps.
[2023-12-24 19:25:47,762: DEBUG/MainProcess] | Worker: Building graph...
[2023-12-24 19:25:47,762: DEBUG/MainProcess] | Worker: New boot order: {StateDB, Timer, Hub, Pool, Autoscaler, Beat, Consumer}
[2023-12-24 19:25:47,764: DEBUG/MainProcess] | Consumer: Preparing bootsteps.
[2023-12-24 19:25:47,764: DEBUG/MainProcess] | Consumer: Building graph...
[2023-12-24 19:25:47,769: DEBUG/MainProcess] | Consumer: New boot order: {Connection, Events, Mingle, Gossip, Tasks, Control, Agent, Heart, event loop}
[email protected] v5.3.6 (emerald-rush)
macOS-13.3.1-arm64-arm-64bit 2023-12-24 19:25:47
[config]
.> app: app_name:0x106891ba0
.> transport: redis://localhost:6379//
.> results: redis://localhost:6379/
.> concurrency: 10 (prefork)
.> task events: ON
[queues]
.> celery exchange=celery(direct) key=celery
[tasks]
. celery.accumulate
. celery.backend_cleanup
. celery.chain
. celery.chord
. celery.chord_unlock
. celery.chunks
. celery.group
. celery.map
. celery.starmap
. app_name.users.tasks.add
. app_name.users.tasks.get_users_count
. shop.tasks.add
[2023-12-24 19:25:47,785: DEBUG/MainProcess] | Worker: Starting Hub
[2023-12-24 19:25:47,785: DEBUG/MainProcess] ^-- substep ok
[2023-12-24 19:25:47,785: DEBUG/MainProcess] | Worker: Starting Pool
[2023-12-24 19:25:48,019: DEBUG/MainProcess] ^-- substep ok
[2023-12-24 19:25:48,019: DEBUG/MainProcess] | Worker: Starting Consumer
[2023-12-24 19:25:48,019: DEBUG/MainProcess] | Consumer: Starting Connection
[2023-12-24 19:25:48,020: WARNING/MainProcess] /Users/user_name/.pyenv/versions/3.10.11/envs/app_name/lib/python3.10/site-packages/celery/worker/consumer/consumer.py:507: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine
whether broker connection retries are made during startup in Celery 6.0 and above.
If you wish to retain the existing behavior for retrying connections on startup,
you should set broker_connection_retry_on_startup to True.
warnings.warn(
[2023-12-24 19:25:48,026: INFO/MainProcess] Connected to redis://localhost:6379//
[2023-12-24 19:25:48,026: DEBUG/MainProcess] ^-- substep ok
[2023-12-24 19:25:48,026: DEBUG/MainProcess] | Consumer: Starting Events
[2023-12-24 19:25:48,028: DEBUG/MainProcess] ^-- substep ok
[2023-12-24 19:25:48,028: DEBUG/MainProcess] | Consumer: Starting Mingle
[2023-12-24 19:25:48,028: INFO/MainProcess] mingle: searching for neighbors
[2023-12-24 19:25:49,037: INFO/MainProcess] mingle: all alone
[2023-12-24 19:25:49,038: DEBUG/MainProcess] ^-- substep ok
[2023-12-24 19:25:49,038: DEBUG/MainProcess] | Consumer: Starting Gossip
[2023-12-24 19:25:49,048: DEBUG/MainProcess] ^-- substep ok
[2023-12-24 19:25:49,049: DEBUG/MainProcess] | Consumer: Starting Tasks
[2023-12-24 19:25:49,053: DEBUG/MainProcess] ^-- substep ok
[2023-12-24 19:25:49,053: DEBUG/MainProcess] | Consumer: Starting Control
[2023-12-24 19:25:49,058: DEBUG/MainProcess] ^-- substep ok
[2023-12-24 19:25:49,059: DEBUG/MainProcess] | Consumer: Starting Heart
[2023-12-24 19:25:49,061: DEBUG/MainProcess] ^-- substep ok
[2023-12-24 19:25:49,061: DEBUG/MainProcess] | Consumer: Starting event loop
[2023-12-24 19:25:49,061: DEBUG/MainProcess] | Worker: Hub.register Pool...
[2023-12-24 19:25:49,062: INFO/MainProcess] [email protected] ready.
[2023-12-24 19:25:49,062: DEBUG/MainProcess] basic.qos: prefetch_count->40
[2023-12-24 19:25:53,429: ERROR/MainProcess] Received unregistered task of type 'users.tasks.get_users_count'.
The message has been ignored and discarded.
Did you remember to import the module containing this task?
Or maybe you're using relative imports?
Please see
https://docs.celeryq.dev/en/latest/internals/protocol.html
for more information.
The full contents of the message body was:
b'[[], {}, {"callbacks": null, "errbacks": null, "chain": null, "chord": null}]' (77b)
The full contents of the message headers:
{'lang': 'py', 'task': 'users.tasks.get_users_count', 'id': '119c005b-f4f2-4823-991b-6645c8a5fa18', 'shadow': None, 'eta': None, 'expires': None, 'group': None, 'group_index': None, 'retries': 0, 'timelimit': [None, None], 'root_id': '119c005b-f4f2-4823-991b-6645c8a5fa18', 'parent_id': None, 'argsrepr': '()', 'kwargsrepr': '{}', 'origin': '[email protected]', 'ignore_result': False, 'replaced_task_nesting': 0, 'stamped_headers': None, 'stamps': {}}
The delivery info for this task is:
{'exchange': '', 'routing_key': 'celery'}
Traceback (most recent call last):
File "/Users/user_name/.pyenv/versions/3.10.11/envs/app_name/lib/python3.10/site-packages/celery/worker/consumer/consumer.py", line 658, in on_task_received
strategy = strategies[type_]
KeyError: 'users.tasks.get_users_count'
The worker does register the tasks:
. massage_queen.users.tasks.add
. massage_queen.users.tasks.get_users_count
. shop.tasks.add
but calling get_users_count still fails with the unregistered-task error.
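(The logs point at the likely cause: the worker registers the task under a project-prefixed name, e.g. `app_name.users.tasks.get_users_count`, while the failing message carries the bare name `users.tasks.get_users_count`. Celery names a task after the dotted module path under which its file was imported, so importing the same tasks.py via two different paths yields two different task names. A stand-alone sketch of that naming rule — a simplification of Celery's behavior, with the module paths taken from the logs above:)

```python
def gen_task_name(module: str, func_name: str) -> str:
    """Simplified sketch of how Celery derives a task's registered name:
    the dotted module path at import time, plus the function name."""
    return f"{module}.{func_name}"


# The worker imported tasks.py as "app_name.users.tasks", so it registered:
registered = gen_task_name("app_name.users.tasks", "get_users_count")
print(registered)  # app_name.users.tasks.get_users_count

# The caller imported the same file as "users.tasks", so it sent:
sent = gen_task_name("users.tasks", "get_users_count")
print(sent)  # users.tasks.get_users_count

# The names differ, so the worker's lookup fails with
# KeyError: 'users.tasks.get_users_count'.
print(registered == sent)  # False
```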
Result
tasks.py in the shop app works fine, but the tasks.py generated by the cookiecutter does not. It's fine not to use it, of course, but as a developer I'd like to understand why it behaves this way.