Add initial support for a Celery backend #64

Draft · wants to merge 4 commits into master

Conversation


@matiasb commented Jun 28, 2024

This is an initial PoC for a Celery backend implementation (still in progress, probably requiring some extra work, besides tests).

How to use it:

  • Using this django_tasks branch in your Django project environment, make sure to also install celery:
    $ pip install celery

  • Update your settings.py to set the Celery backend:

TASKS = {
    "default": {
        "BACKEND": "django_tasks.backends.celery.CeleryBackend"
    }
}
  • You can also set extra Celery config in settings.py (otherwise the defaults are used, which should be fine) by defining CELERY_*-prefixed settings; e.g. to set broker_url, add a CELERY_BROKER_URL setting (see the settings sketch after this list)

  • If you don't set a broker URL, a local RabbitMQ instance is expected by default. You can run one using Docker like this:
    $ docker run -d -p 5672:5672 rabbitmq

  • You shouldn't need to change any django_tasks related code in your project.

  • Finally, to run the Celery worker:
    $ DJANGO_SETTINGS_MODULE=<your_project.settings> celery -A django_tasks.backends.celery.app worker -l INFO
    (this uses a simple default Celery app (see app.py below) pulling config from Django settings; it can be customized per project if needed)
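
As an example of the CELERY_*-prefixed settings mentioned above (a sketch, assuming the backend reads them via Celery's usual CELERY_ settings namespace; the values are examples only and can be omitted to use Celery's defaults):

# settings.py, next to the TASKS setting above
CELERY_BROKER_URL = "amqp://guest:guest@localhost:5672//"  # example value; omit for the default local RabbitMQ
CELERY_TASK_ACKS_LATE = True  # any other Celery setting, uppercased with the CELERY_ prefix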

Your tasks should now be queued into RabbitMQ and picked up and run by the Celery worker.
(FWIW, I have been using this in a simple personal project, and things seem to work for me so far.)
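
For reference, defining and enqueueing a task is the same as with any other django-tasks backend; a minimal sketch (the app and task names are made up for illustration):

# myapp/tasks.py (hypothetical app/task names)
from django_tasks import task


@task()
def send_welcome_email(user_id: int) -> None:
    # Plain Python/Django code; the Celery backend only changes how it is queued and run.
    print(f"Sending welcome email to user {user_id}")


# e.g. from a view or a management command:
# result = send_welcome_email.enqueue(user_id=42)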

A few items to discuss:

  • Should it be possible to run the worker via a management command? (and then, setting the env var for settings shouldn't be necessary)
  • Should it be possible to configure a subset of the Celery config through django-tasks? (to eventually handle the settings general enough for all backends in the same way)
  • Right now this tries to keep the simplicity of the current implementation, just wrapping the minimal bits needed to queue tasks through Celery (and have the Celery worker handle them on the other side); eventually it could make sense to go deeper into Celery internals to allow for more flexible and/or complex scenarios (if needed).
  • Any other feedback, suggestions or expectations.

@RealOrangeOne (Owner)

You are an absolute legend!

Should it be possible to run the worker via a management command

I think deferring to however Celery runs its tasks even without django-tasks is the way to go. #7 can deal with that in future

Should it be possible to configure a subset of the Celery config through django-tasks?

I think it'd be ideal if we can. By the looks of it, you've got a custom Celery app with this, which we could point at django-tasks to find its settings?

As an aside, how viable is making this work without needing a custom app? Perhaps by pointing django-tasks to an existing app (optionally)?
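
To illustrate the idea (purely hypothetical, not something this PR implements; the "OPTIONS"/"APP" names below are made up), that could look like an extra backend option pointing at an existing app by dotted path:

TASKS = {
    "default": {
        "BACKEND": "django_tasks.backends.celery.CeleryBackend",
        "OPTIONS": {
            "APP": "myproject.celery.app",  # hypothetical option name and path
        },
    }
}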

just wrapping the minimal bits to queue tasks through Celery

I'd say that's absolutely fine! If it supports all the features django-tasks surfaces, that's good enough for now; surfacing extras can come with time. Especially true if said configuration comes from the app.

@matiasb (Author) commented Jul 2, 2024

Should it be possible to run the worker via a management command

I think deferring to however Celery runs its tasks even without django-tasks is the way to go. #7 can deal with that in future

Sounds good 👍

Should it be possible to configure a subset of the Celery config through django-tasks?

I think it'd be ideal if we can. By the looks of it, you've got a custom Celery app with this, which we could point at django-tasks to find its settings?

You mean defining a way to configure Celery via the TASKS setting and making the Celery app get its config from there?

As an aside, how viable is making this work without needing a custom app? Perhaps by pointing django-tasks to an existing app (optionally)?

You need to define a Celery app, which keeps the tasks registry and is the base used to start the worker. I'm adding a minimal default app here which sets the default queue name (as defined by django-tasks) and tries to get CELERY_* config settings from the DJANGO_SETTINGS_MODULE. You can still define your own app (as documented) and start the worker(s) from there instead (that custom app will also be preferred when registering tasks). If no app is given, Celery will set up a 'default' one, but it then gets tricky to make the worker find the registered tasks.
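
Roughly, such a minimal default app could look like the sketch below (based on the description above, not necessarily the exact contents of app.py in this PR; the queue name literal is an assumption):

# django_tasks/backends/celery/app.py (approximate sketch)
from celery import Celery

app = Celery("django_tasks")

# Pick up any CELERY_*-prefixed settings from DJANGO_SETTINGS_MODULE,
# e.g. CELERY_BROKER_URL -> broker_url.
app.config_from_object("django.conf:settings", namespace="CELERY")

# Match django-tasks' default queue name (assumed to be "default" here).
app.conf.task_default_queue = "default"

# Register task modules from the installed Django apps.
app.autodiscover_tasks()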

just wrapping the minimal bits to queue tasks through Celery

I'd say that's absolutely fine! If it supports all the features django-tasks surfaces, that's good enough for now; surfacing extras can come with time. Especially true if said configuration comes from the app.

Makes sense!

@RealOrangeOne (Owner)

My main thinking is around people wanting to adopt django-tasks slowly, without needing to rewrite all their Celery integration. If django-tasks defines the app, people can't use their own. Is there a way to achieve that? Even if the default is internal, we could let people specify the module path to their own.

It'd be great to be able to configure Celery directly through django-tasks (the built-in app), but I think we'd need that and custom apps for this to be easy to adopt.

@matiasb (Author) commented Jul 2, 2024

My main thinking is around people wanting to adopt django-tasks slowly, without needing to rewrite all their Celery integration. If django-tasks defines the app, people can't use their own. Is there a way to achieve that? Even if the default is internal, we could let people specify the module path to their own.

People will be able to use their own. If they are already using Celery with Django, they should already have an app, and they can keep using it (in that case there will already be a default_app set, and the worker should be run from that app too, which is likely what they were already doing).
In any case, I'm not sure how common migrating from Celery to django-tasks would be, since that could require multiple changes depending on how Celery is used (re-decorating tasks, the way they are queued, class-based tasks, etc.). I think django-tasks is great for getting started with background workers, and for making it smooth and easy to switch backends depending on your needs.

It'd be great to be able to configure Celery directly through django-tasks (the built-in app), but I think we'd need that and custom apps for this to be easy to adopt.

Yeah. Just in case: you can already use a custom app as things stand, following the Celery docs. If you define your app and make sure it gets loaded when your Django project starts, it will be set as the default app (and then you need to start the worker from that app instead, passing the right path to the celery -A <your-app> worker command). A sketch of that standard setup is below.
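
For completeness, the standard Celery-with-Django layout from the Celery docs (the project name is just an example):

# myproject/celery.py
import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")

app = Celery("myproject")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()

# myproject/__init__.py -- ensure the app is loaded when Django starts
from .celery import app as celery_app

__all__ = ("celery_app",)

The worker would then be started from that app instead:
    $ celery -A myproject worker -l INFO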

I will iterate on this a few more times in the coming days (fix all the lint issues, for example :-), check for any other possible improvements, and add some tests). Let me know if you have any other thoughts or feedback!

@auvipy commented Jul 7, 2024

I would like to follow this development....

@matthewhegarty

This looks very interesting. May I ask what the current status is? There's been no activity for a few months, and it looks like it got close. Is it likely to be merged soon?

@oxillix commented Nov 4, 2024

Interested in this as well
