celery

What is celery

Celery is a Python job-queue processing framework. Start the worker daemon to get an asynchronous processing mechanism, or start the beat daemon to run scheduled batch jobs. The documentation is well maintained.

In terms of processing flow, it is close to attaching a task scheduler to a function-as-a-service runtime such as AWS Lambda or IBM Bluemix OpenWhisk. It is nice to be able to split work into tasks without worrying about API definitions.

worker

If you want asynchronous jobs, start the worker daemon.

You start the daemon with `celery -A proj worker`, but how to specify `proj` is hard to understand!

The rough rules (as documented for the `--app` argument) are: given `celery -A proj`, Celery imports `proj` and looks for an attribute named `app`, then one named `celery`, then any attribute that is a Celery instance; if none is found, it tries the submodule `proj.celery` and repeats the same search there.

In other words, to give an example ...

If you look closely, the app shown in the startup banner changes slightly depending on the pattern.

```
celery -A somename worker
...
- ** ---------- [config]
- ** ---------- .> app:         __main__:0x106a950
```

```
celery -A somename worker
...
- ** ---------- [config]
- ** ---------- .> app:         somename:0x1585410
```

The cause of the confusion is the special handling of `proj.app` and `proj.celery`.

An extremely confusing case is one the Flask documentation does not explain. Flask code conventionally stores the Flask instance in a variable named `app`. Celery, however, assumes that any `app` variable it finds must be a Celery instance, so the two conventions collide.

A workaround would be, for example:

beat

If you want to create a scheduled batch, start a beat daemon.

celery -A somename beat

Start it separately from the worker daemon. Beat is basically not meant to be scaled out: only one beat instance should run at a time, or scheduled tasks will be enqueued in duplicate, so be careful.

Note that schedule settings are not registered dynamically at runtime; they are declared up front in the configuration (the `beat_schedule` setting).

by the way

I still don't fully understand the relationship between job queues and Celery... is that the right mental image...?

Postscript: these days there is also the option of Kafka + Faust.
