I've already got a RabbitMQ container that launches alongside my Django container. This is the docker-compose.yml file:

version: "3.8"

services:
  
  web:    
    build: ./src
    ports:
      - "${PORT}:${PORT}"
    depends_on:
      - message
    tty: true    
    stdin_open: true
    container_name: ${COMPOSE_PROJECT_NAME}_app
    expose:
      - ${PORT}
    links:
      - message
    env_file:
      - .env
    restart: unless-stopped
    volumes:
      - ./src:/code

  message:
    container_name: ${COMPOSE_PROJECT_NAME}_broker
    image: rabbitmq:3.9.12-management
    ports:
      - "5672:5672"
      - "15672:15672"
    restart: always
    volumes: 
      - ./data:/var/lib/rabbitmq/
    env_file:
      - .env
    environment:
      - RABBITMQ_DEFAULT_USER=${RABBITMQ_DEFAULT_USER}
      - RABBITMQ_DEFAULT_PASS=${RABBITMQ_DEFAULT_PASS}
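
For reference, the .env file defines at least the variables referenced above. A minimal sketch, with placeholder values rather than my real ones:

# .env (placeholder values)
COMPOSE_PROJECT_NAME=book_shelf
PORT=8000
RABBITMQ_DEFAULT_USER=guest
RABBITMQ_DEFAULT_PASS=guest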

So after installing Celery from PyPI and following the First Steps with Django guide, I created a file **celery.py** in the same folder as urls.py and settings.py, and, as the docs describe, applied the same pattern to __init__.py.

celery.py:

from __future__ import absolute_import

import os 
from celery import Celery
from django.conf import settings


# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings.base')

app = Celery('project')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django apps.
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)


@app.task(bind=True)
def debug_task(self):
    print(f'Request: {self.request!r}')

__init__.py:

from __future__ import absolute_import

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app

__all__ = ('celery_app',)
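
To sanity-check the wiring, the app and its effective configuration can be inspected from a Django shell. A quick sketch, assuming the project package is named project (as in celery.py above):

from project.celery import app, debug_task

# With namespace='CELERY', only settings carrying the CELERY_ prefix
# end up here; an unprefixed key is silently ignored.
print(app.conf.broker_url)

# Once a worker is running, the sample task can be queued with:
debug_task.delay()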

Inside settings.py (base.py), the Celery configuration:

# Celery Configuration Options

BROKER_URL = 'amqp://guest:guest@book_shelf_rabbitmq:5672/'
CELERY_RESULT_BACKEND = 'rabbitmq'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'

The folder structure currently looks like this:

├── project
│   ├── asgi.py
│   ├── celery.py
│   ├── __init__.py
│   ├── settings
│   │   └── base.py

But inside the Django container, running `celery worker` I get this error:

consumer: Cannot connect to amqp://guest:**@127.0.0.1:5672//: [Errno 111] Connection refused.
Trying again in 2.00 seconds... (1/100)
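
For completeness, I start the worker along these lines (project is the package name from the layout above; the exact flags may differ):

celery -A project worker --loglevel=INFO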

I've tried changing the Celery settings, the broker URL, and so on. It looks like Django doesn't pick up the broker URL. Could someone help me?
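
For example, variants roughly along these lines (prefixed keys, since the namespace='CELERY' line in celery.py means every key is read with a CELERY_ prefix; the host message here matches the compose service name):

CELERY_BROKER_URL = 'amqp://guest:guest@message:5672//'
CELERY_RESULT_BACKEND = 'rpc://'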

Chuck.h5
  • There was a similar question about this that I answered, maybe check this out? https://stackoverflow.com/a/70633902/17851130 – DrummerMann Jan 08 '22 at 22:43

0 Answers