Answer by 2ps for How to route tasks to queues while using Docker

The issue is with the startup command. Because the commands are chained with &&, the workers start serially: celery -A app worker -l info -Q parser-n -n worker2 won't execute until after worker1 exits, and a Celery worker runs in the foreground until it is stopped.
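
For reference, the failing pattern from the question presumably looked something like this, with a single service chaining every worker:

    command: bash -c 'celery -A app worker -l info -Q email-n -n worker1 &&
                      celery -A app worker -l info -Q parser-n -n worker2 &&
                      celery -A app worker -l info -Q sms-n -n worker3'

Since worker1 never exits on its own, worker2 and worker3 never start. The easiest way to fix this is to give each worker its own Docker service: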

  email-n:
    restart: always
    build: ./app
    volumes:
      - ./app:/app/
    depends_on:
      - redis
    command: bash -c 'python3 manage.py makemigrations --noinput &&
                      python3 manage.py migrate --noinput &&
                      celery -A app worker -l info -Q email-n -n worker1'
...
  parser-n:
    restart: always
    build: ./app
    volumes:
      - ./app:/app/
    depends_on:
      - redis
    command: celery -A app worker -l info -Q parser-n -n worker2
  sms-n:
    restart: always
    build: ./app
    volumes:
      - ./app:/app/
    depends_on:
      - redis
    command: celery -A app worker -l info -Q sms-n -n worker3
  celery:
    restart: always
    build: ./app
    volumes:
      - ./app:/app/
    depends_on:
      - redis
    command: celery -A app worker -l info
  beat:
    restart: always
    build: ./app
    volumes:
      - ./app:/app/
    depends_on:
      - redis
    command: celery -A app beat -l info
  api:
    restart: always
    build: ./app
    volumes:
      - ./app:/app/
    depends_on:
      - redis
    command: python3 manage.py runserver 0.0.0.0:1337
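
Once the services are split out, you can sanity-check that each worker is bound to the queue you expect. One quick way, run against the running stack (the celery service name comes from the file above):

  $ docker-compose up -d
  $ docker-compose exec celery celery -A app inspect active_queues

inspect active_queues asks every worker node over the broker which queues it consumes, so worker1 should report only email-n, worker2 only parser-n, and so on.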

Alternatively, you can start one worker that consumes multiple queues, e.g., -Q parser-n,email-n,sms-n, as sketched below. Finally, you can also daemonize the Celery workers inside a single container, but then you need a graceful way to stop the daemonized processes when you are ready to stop the container; that is outside the scope of this question.
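
A single-worker variant might look like this (the service name consumer is made up for illustration):

  consumer:
    restart: always
    build: ./app
    volumes:
      - ./app:/app/
    depends_on:
      - redis
    command: celery -A app worker -l info -Q email-n,parser-n,sms-n -n worker1

One process then pulls from all three queues, at the cost of no longer being able to scale each queue independently.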

