There are times when you want to put tasks in a queue and have a worker process them: long-running computations, mass deployments of something, and so on. For building such a mechanism in Python, Celery is convenient.
After reading this, you should be able to set up a simple asynchronous processing mechanism in Python and even monitor the status of the tasks.
apt-get install redis-server
pip install celery
pip install flower
main.py Enqueues the tasks and waits for their results.
import tasks

print('<first task>')
# Start the first task (run task)
worker = tasks.run.delay()
# Busy-wait until the task has finished
while not worker.ready():
    pass
# Print the return value
print(worker.result)

print('<second task>')
# Start the second task (calc task)
worker = tasks.calc.delay(100, 200)
# Busy-wait until the task has finished
while not worker.ready():
    pass
# Print the return value
print(worker.result)
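As a side note, the busy-wait loop above can be avoided: the AsyncResult returned by delay() has a get() method that blocks until the task finishes. A minimal sketch using the same tasks module:

import tasks

# get() blocks until the task completes and returns its result;
# it re-raises the task's exception if the task failed.
result = tasks.calc.delay(100, 200).get(timeout=60)
print(result)  # -> 300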
tasks.py Wrap the work you want to process asynchronously in functions and add the @task decorator, and they are ready to be called through Celery. Celery's serializer takes care of passing arguments and return values. Note that instances of your own classes cannot be serialized (see the sketch after the code below).
import time
from celery.decorators import task

@task
def run():
    time.sleep(10)
    print('Processing finished')
    return "I'm done"

@task
def calc(a, b):
    return a + b
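Because of the serialization constraint mentioned above, a custom class instance cannot be passed as an argument or returned as a result. A minimal sketch of the usual workaround (Point and shift are hypothetical names added for illustration, placed alongside the tasks above) is to pass plain dicts instead and rebuild the object inside the task:

# Hypothetical example inside tasks.py: convert custom objects to plain dicts,
# because only JSON-serializable types survive the trip through the broker.
class Point(object):
    def __init__(self, x, y):
        self.x = x
        self.y = y

@task
def shift(point_dict, dx):
    # Rebuild the object from the dict, do the work, return a dict again.
    p = Point(point_dict['x'], point_dict['y'])
    return {'x': p.x + dx, 'y': p.y}

# Caller side: tasks.shift.delay({'x': 1, 'y': 2}, 10) instead of passing Point(1, 2).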
celeryconfig.py The configuration file for running Celery. Since I want the data passed to and from the workers to be JSON, "json" is specified as the serializer for both tasks and results. The broker (BROKER) is set up to use Redis, but RabbitMQ can also be used (I'll leave that to you). In the example below, the worker loads tasks.py; list in CELERY_IMPORTS every module that contains functions to be processed asynchronously. With CELERYD_LOG_LEVEL set to INFO, the standard output of each task is also written to the log file (celeryd.log). In production it may be better to set it to ERROR.
Since CELERYD_CONCURRENCY = 1, the queue is handled one task at a time. It is better to adjust this according to the number of CPUs, as sketched after the config below.
BROKER_URL = 'redis://localhost/0'
CELERYD_CONCURRENCY = 1
CELERY_RESULT_BACKEND = 'redis'
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_RESULT_BACKEND = "redis"
CELERYD_LOG_FILE = "./celeryd.log"
CELERYD_LOG_LEVEL = "INFO"
CELERY_IMPORTS = ("tasks", )
First, start redis-server (required). Skip this step if the service is already running.
$ redis-server
Next, start the worker so it can handle the queue. Run it in the same directory so that celeryconfig.py is picked up by the default loader:
(env) docker@1824542bb286:~/workspace$ celery worker
/home/docker/.virtualenvs/env2/local/lib/python2.7/site-packages/celery/app/defaults.py:251: CPendingDeprecationWarning:
The 'CELERYD_LOG_LEVEL' setting is scheduled for deprecation in version 2.4 and removal in version v4.0. Use the --loglevel argument instead
alternative='Use the {0.alt} instead'.format(opt))
/home/docker/.virtualenvs/env2/local/lib/python2.7/site-packages/celery/app/defaults.py:251: CPendingDeprecationWarning:
The 'CELERYD_LOG_FILE' setting is scheduled for deprecation in version 2.4 and removal in version v4.0. Use the --logfile argument instead
alternative='Use the {0.alt} instead'.format(opt))
-------------- celery@1824542bb286 v3.1.23 (Cipater)
---- **** -----
--- * *** * -- Linux-3.13.0-24-generic-x86_64-with-Ubuntu-14.04-trusty
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app: default:0x7f068383f610 (.default.Loader)
- ** ---------- .> transport: redis://localhost:6379/0
- ** ---------- .> results:
- *** --- * --- .> concurrency: 1 (prefork)
-- ******* ----
--- ***** ----- [queues]
-------------- .> celery exchange=celery(direct) key=celery
[tasks]
. tasks.run
Now run the asynchronous processing:
docker@1824542bb286:~/workspace$ python main.py
<first task>
I'm done
<second task>
300
Finally, start flower to monitor the tasks.
(env2) docker@1824542bb286:~/workspace$ celery flower
/home/docker/.virtualenvs/env2/local/lib/python2.7/site-packages/celery/app/defaults.py:251: CPendingDeprecationWarning:
The 'CELERYD_LOG_LEVEL' setting is scheduled for deprecation in version 2.4 and removal in version v4.0. Use the --loglevel argument instead
alternative='Use the {0.alt} instead'.format(opt))
/home/docker/.virtualenvs/env2/local/lib/python2.7/site-packages/celery/app/defaults.py:251: CPendingDeprecationWarning:
The 'CELERYD_LOG_FILE' setting is scheduled for deprecation in version 2.4 and removal in version v4.0. Use the --logfile argument instead
alternative='Use the {0.alt} instead'.format(opt))
[I 160617 13:02:20 command:136] Visit me at http://localhost:5555
[I 160617 13:02:20 command:141] Broker: redis://localhost:6379/0
[I 160617 13:02:20 command:144] Registered tasks:
['celery.backend_cleanup',
'celery.chain',
'celery.chord',
'celery.chord_unlock',
'celery.chunks',
'celery.group',
'celery.map',
'celery.starmap',
'tasks.run']
[I 160617 13:02:20 mixins:231] Connected to redis://localhost:6379/0
By default, http://localhost:5555 is the URL of the flower monitoring interface. It is convenient because, in addition to monitoring, you can also adjust the number of worker processes from there.
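Flower also exposes an HTTP API, so the same information can be fetched programmatically. A rough sketch using GET /api/workers (this assumes flower is still running on localhost:5555; urllib2 is used to match the Python 2.7 environment above):

# List the workers currently known to flower via its HTTP API.
import json
import urllib2

response = urllib2.urlopen('http://localhost:5555/api/workers')
workers = json.loads(response.read())
print(workers.keys())  # worker names, e.g. ['celery@1824542bb286']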