Stack Overflow
0 votes · 0 answers · 32 views

I am using Django + Celery with Redis as both the broker and the result backend. Here is my Celery configuration: # Celery settings CELERY_BROKER_URL = 'redis://redis:6379/1' CELERY_RESULT_BACKEND = '...
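For context, a minimal sketch of the kind of settings this excerpt starts from, assuming the Django `CELERY_` settings prefix (the values and companion options are illustrative, not the asker's actual config):

```python
# settings.py — illustrative values only
CELERY_BROKER_URL = 'redis://redis:6379/1'
CELERY_RESULT_BACKEND = 'redis://redis:6379/1'
# Common companions when Redis is both broker and result backend:
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_RESULT_EXPIRES = 3600  # seconds before stored results are cleaned up
```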
0 votes · 1 answer · 166 views

I’m encountering an issue when running Celery with PgBouncer and PostgreSQL after enabling idle connection timeouts. My stack includes: Django (served via Tornado) Celery (workers + beat) ...
1 vote · 1 answer · 63 views

I am hoping to use Celery to manage the task queue for my application, and am wondering if it is capable of managing tasks that themselves use multiprocessing, called from an external library. For ...
0 votes · 0 answers · 57 views

I use celery 5.5.0 with RabbitMQ 4.1.1 as a message broker. After a switch from classic queues to quorum queues I got quite a lot of queues called celery_delayed_0-27. In total 28 new queues. As far ...
2 votes · 0 answers · 117 views

I'm getting a bunch of logs like this: [2025-11-29 16:13:15,731] def group(self, tasks, result, group_id, partial_args, add_to_parent=0): return 1 [2025-11-29 16:13:15,732] def xmap(task, it): ...
1 vote · 0 answers · 58 views

Beat seems to be sending the messages into SQS very slowly, about 100/minute. Every Sunday I have a sendout to about 16k users, and they're all booked for 6.30pm. Beat starts picking it up at the ...
1 vote · 1 answer · 49 views

I have a Celery Chain chain( first_task.s(), second_task.s(), third_task.s(), ).apply_async() The tasks implement the on_success and on_failure handler functions to send messages to ...
Arttu · 1,013
0 votes · 1 answer · 78 views

I am using redis and celery in docker-compose. I noticed that I am losing celery tasks when I restart docker-compose (docker-compose down, and then docker-compose up). How to improve my redis ...
mascai · 1,770
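A common fix pattern for this class of problem is enabling Redis persistence plus a named data volume, sketched below as a docker-compose fragment (service and volume names are assumed, not the asker's actual file; note `docker-compose down` keeps named volumes, but `down -v` deletes them):

```yaml
services:
  redis:
    image: redis:7
    command: redis-server --appendonly yes   # AOF: replay writes after a restart
    volumes:
      - redis-data:/data                     # data survives container removal
volumes:
  redis-data:
```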
0 votes · 1 answer · 53 views

I want to create tests, but every time I run a test it triggers Celery, and Celery creates instances in my local DB. That means that if I run those tests on the prod or dev servers, then it will ...
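One widely used way to keep unit tests from dispatching real Celery work is eager mode, sketched here as a test-settings override (setting names follow the Django `CELERY_` prefix convention; combine this with the framework's test database so nothing touches prod or dev data):

```python
# test settings — run tasks synchronously, in-process, with no broker
CELERY_TASK_ALWAYS_EAGER = True
CELERY_TASK_EAGER_PROPAGATES = True  # re-raise task exceptions inside the test
```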
4 votes · 2 answers · 223 views

I’m using Django + Celery for data crawling tasks, but the memory usage of the Celery worker keeps increasing over time and never goes down after each task is completed. I’m using: celery==5.5.3 ...
tanng · 41
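A common mitigation for steadily growing worker memory is recycling child processes, sketched below (Django `CELERY_` prefix assumed; the thresholds are illustrative, not a recommendation for this workload):

```python
# Recycle a worker child after N tasks, or once it exceeds a memory cap
CELERY_WORKER_MAX_TASKS_PER_CHILD = 100
CELERY_WORKER_MAX_MEMORY_PER_CHILD = 200_000  # KiB; child is replaced after the current task
```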
0 votes · 2 answers · 143 views

I’m running into an issue when using asyncio.run together with SQLAlchemy (async) inside a Celery task. When I call the function the first time, it works fine. On the second call, I get: RuntimeError: ...
2 votes · 1 answer · 70 views

I want to be sure that a task is killed at a certain time if it is still running. The context is an overloaded worker, where tasks are not picked up straight away. Imagine this "busy" worker (...
Guillaume · 3,081
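Celery's time limits are the standard tool for killing a still-running task, sketched below; note they count from when the task *starts executing*, so on an overloaded worker a queued task's clock has not started yet. To drop tasks that have waited too long in the queue instead, `apply_async(..., expires=...)` revokes them if they have not started by the deadline.

```python
# Global limits (Django CELERY_ prefix assumed; values illustrative)
CELERY_TASK_SOFT_TIME_LIMIT = 300  # raises SoftTimeLimitExceeded inside the task
CELERY_TASK_TIME_LIMIT = 330       # hard limit: the worker child process is killed
```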
3 votes · 1 answer · 79 views

I’m running Celery with Django and Celery Beat. Celery Beat triggers an outer task every 30 minutes, and inside that task I enqueue another task per item. Both tasks are decorated to use the same ...
0 votes · 1 answer · 82 views

I am building a scanning service using FastAPI + Celery + PostgreSQL + SQLAlchemy + Nmap. I have a Celery task that performs WHOIS, DNS lookups, IP lookups, and then port scanning using python-nmap. ...
1 vote · 1 answer · 120 views

Here’s the pattern I want: Dispatch multiple tasks in parallel. Aggregate all their results into a final result. Remove the intermediate results right after the chord result is ready, without ...
