
How to route tasks to queues while using Docker


I'm setting up multiple workers that execute tasks from given queues within a Docker environment. I'm unable to see the queues that I have declared in settings.py.

I've noticed that only the first worker command in the container's command list gets executed; the rest are never run.

This is how I'm trying to route Celery tasks in settings.py:

# Celery settings


CELERY_BROKER_URL = 'redis://:secret@redis:6379/0'
CELERY_RESULT_BACKEND = 'redis://:secret@redis:6379/0'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
CELERY_TIMEZONE = TIME_ZONE
CELERY_TASK_ROUTES = {

    'app1.tasks.*': {'queue': 'emails'},
    'app2.tasks.*': {'queue': 'parser'},
    'app3.tasks.task1': {'queue': 'sms'}
}
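For context, Celery matches task names against these route patterns with shell-style globbing, and the first matching pattern wins. A minimal, hypothetical sketch of that lookup (illustration only, not Celery's actual internals) showing which queue a task name resolves to:

```python
from fnmatch import fnmatch

# Routing table mirroring CELERY_TASK_ROUTES above
TASK_ROUTES = {
    'app1.tasks.*': 'emails',
    'app2.tasks.*': 'parser',
    'app3.tasks.task1': 'sms',
}

def route_queue(task_name, default='celery'):
    """Return the queue a task name would route to (glob match, first hit wins)."""
    for pattern, queue in TASK_ROUTES.items():
        if fnmatch(task_name, pattern):
            return queue
    # Unrouted tasks fall back to Celery's default queue
    return default
```

So `app1.tasks.send_email` would land on `emails`, while a task matching none of the patterns goes to the default `celery` queue.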

This is how I'm writing my tasks.py file:

@app.task(bind=True, queue='parser')
def parser(self, extension, data, file):
    return True

This is how I'm giving the commands to a container in my docker-compose.yml file:

version: '3.3'
services:
  api:
    restart: always
    build: ./app
    volumes:
      - ./app:/app/
    depends_on:
      - redis
    command:  bash -c 'python3 manage.py makemigrations --noinput  &&
                       python3 manage.py migrate --noinput &&
                       celery -A app worker -l info -Q email-n worker1 &&
                       celery -A app worker -l info -Q parser-n worker2 &&
                       celery -A app worker -l info -Q sms-n worker3 &&
                       celery -A app  worker -l info  &&
                       celery -A app beat -l info  &&
                       python3 manage.py runserver 0.0.0.0:1337'
    ports:
      - 1337:1337
    expose:
      - 1337
    environment:
      - SECRET_KEY = qbsdk08)5&n*x4xdya7fbm0&lb)x!6!f_#ta(y-)*!w_wibc4c
      - SQL_ENGINE = django.db.backends.postgresql_psycopg2
      - SQL_DATABASE = postgres
      - SQL_USER = admin
      - SQL_PASSWORD = password
      - SQL_HOST = postgres_container
      - SQL_PORT = 5432
      - DATABASE = postgres
      - DJANGO_SETTINGS_MODULE = api.settings
    depends_on:
      - postgres_container

Expected result: I expect the list of queues to be displayed in the console and all of the workers to run.

Actual result: Only the first worker command given in the docker-compose file runs (in my case the command below); the rest never execute.

"celery -A app worker -l info -Q email-n worker1 &&"

Tech: Django v2.2, Celery v4, Redis v5
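For what it's worth, `bash -c 'a && b'` only starts `b` after `a` exits successfully, and a foreground `celery worker` never exits, so everything after the first `&&` is never reached. A common workaround is one compose service per worker. The fragment below is a sketch only, not tested against this project; it also assumes a space before `-n` (the file above has `-Q email-n worker1`) and uses the queue name `emails` to match the routes in settings.py:

```yaml
# Sketch: service names and build context mirror the compose file above.
# Each worker is its own service, so each foreground process keeps running.
services:
  api:
    build: ./app
    command: bash -c 'python3 manage.py migrate --noinput &&
                      python3 manage.py runserver 0.0.0.0:1337'
  worker_emails:
    build: ./app
    command: celery -A app worker -l info -Q emails -n worker1@%h
    depends_on:
      - redis
  worker_parser:
    build: ./app
    command: celery -A app worker -l info -Q parser -n worker2@%h
    depends_on:
      - redis
```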

