The current worker implementation is single-threaded, which limits throughput. Expand the worker to support concurrent processing controlled by a CLI flag: `--concurrency=12` should create 12 worker threads/processes, allowing the worker to process up to 12 tasks concurrently.
A rough design: a single thread is responsible for fetching tasks and sending RPC responses back, plus a pool of worker threads that execute tasks. Processing timeouts could be tricky to enforce, though, since Python has no way to interrupt a running thread from outside. Perhaps `celery worker` has some tricks we could replicate.