Celery is a Python job-queue processing framework. Keep the worker daemon running and you get an asynchronous processing mechanism; start the beat daemon and you get scheduled batch processing. The documentation is well maintained.
In terms of processing flow, it is close to attaching a task scheduler to a processing layer such as AWS Lambda or IBM Bluemix OpenWhisk. It is nice to be able to split up processing without worrying about API definitions.
If you want asynchronous jobs, start the worker daemon:

```
celery -A proj worker
```

But it's hard to understand how to specify `proj`!
The rough rules are like this:

- `modname:varname`: if there is a `:`, `varname` in module `modname` must be a Celery instance
- For a plain `proj` string:
    - if `proj.app` is not a module, it must be a Celery instance (hey)
    - if `proj.celery` is not a module, it must be a Celery instance (hmm)
    - if `proj.celery` is a module:
        - if `proj.celery.app` is not a module, it must be a Celery instance (hey)
        - if `proj.celery.celery` is not a module, it must be a Celery instance (hmm)
        - otherwise, scan `proj.celery`'s variables for instances of Celery subclasses
    - otherwise, scan `proj`'s variables for instances of Celery subclasses

In other words, to give some examples ...
- If there is a `somename.py` file in the current directory and it contains a variable `x` holding a Celery instance:
  `celery -A somename:x worker`, or `celery -A somename worker`.
- If `somename/__init__.py` exists as seen from the current directory and it contains a Celery instance variable `x`:
  `celery -A somename:x worker`, or `celery -A somename worker`.
- If `somename/__init__.py` exists as seen from the current directory and a Celery instance variable `x` exists in the `somename/celery.py` file:
  `celery -A somename.celery:x worker`, or `celery -A somename.celery worker`, or `celery -A somename worker`.
  (On Python 2 you need `from __future__ import absolute_import`.)
- If `somename/__init__.py` exists as seen from the current directory and a Celery instance variable `x` exists in the `somename/subname.py` file:
  `celery -A somename.subname:x worker`, or `celery -A somename.subname worker`.
- If an `app` variable exists in a `somename` module on the module search path, it is assumed to be a Celery instance:
  `celery -A somename:app worker`, or `celery -A somename worker`.
- If a Celery instance variable `x` exists in a `somename` module on the module search path:
  `celery -A somename:x worker`, or `celery -A somename worker`.
- If `somename.celery` can be loaded and a Celery instance variable `x` exists in it:
  `celery -A somename.celery:x worker`, or `celery -A somename.celery worker`, or `celery -A somename worker`.
  (On Python 2 you need `from __future__ import absolute_import`.)

If you look carefully, the reported app changes slightly depending on the pattern.
```
celery -A somename worker
...
- ** ---------- [config]
- ** ---------- .> app: __main__:0x106a950
```

```
celery -A somename worker
...
- ** ---------- [config]
- ** ---------- .> app: somename:0x1585410
```
The cause of the confusion is the special rules for `proj.app` and `proj.celery`.
An extremely confusing case is not explained in the Flask documentation. Flask applications generally create a Flask instance in an `app` variable. But when celery finds an `app` variable, it assumes it must be a Celery instance, so there is a clash.

A workaround would be, for example:

```
celery -A your_application:celery worker
```
If you want scheduled batch jobs, start the beat daemon:

```
celery -A somename beat
```

Start it separately from the worker daemon. It basically cannot be scaled out, so you have to be careful when using it.
Note that schedule settings are not embedded in the program itself; they are listed in the regular configuration file.
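A schedule entry in the configuration might look like this (Celery 4+ style; the task name `somename.add` and the 30-second interval are assumptions for illustration):

```python
# celeryconfig.py -- example beat schedule
beat_schedule = {
    'add-every-30-seconds': {
        'task': 'somename.add',  # name of a task registered by the worker
        'schedule': 30.0,        # seconds; a crontab() schedule also works
        'args': (16, 16),
    },
}
```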
I still don't really see the connection between job queues and celery, though ... is it just the image?
Postscript: these days, Kafka + Faust is also an option.