Build a Task Queue Easily with PyQS

Celery is the standard for running task queues in Python, but I came across PyQS (https://github.com/spulec/PyQS), which uses Amazon SQS, and tried it out for a while. I had a hard time because its behavior differs slightly from the README, but it lets you build a task queue very easily, for which I am grateful.

Installation

Installation is a single pip command:

pip install pyqs

Environment Variables

The following environment variables are required.

- AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY ... credentials with read/write permissions on Amazon SQS
- PYTHONPATH ... so that the Python scripts executed from the queue can be imported
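
As a quick sanity check (my own addition, not part of PyQS), a snippet like the following verifies that these variables are set before starting anything; the filename check_env.py is hypothetical:

check_env.py


import os

# Fail fast if any variable PyQS (via boto) relies on is missing.
for name in ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "PYTHONPATH"):
    if name not in os.environ:
        raise RuntimeError("missing environment variable: {}".format(name))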

Tasks

Import the task decorator from pyqs and apply it to the functions you want to register with the queue.

qqq/tasks.py


from pyqs import task


@task('queue0')
def another_task(message):
    # Executed asynchronously by a PyQS worker listening on "queue0"
    print("another_task: message={}".format(message))


@task('queue0')
def send_email(subject):
    print("send_email: subject={}".format(subject))

Registering Tasks to the Queue

add_queue.py


from qqq.tasks import another_task, send_email
from settings import config

# .delay() does not run the function locally; it serializes the call
# and enqueues it as a message on SQS for a worker to pick up.
for i in range(100):
    send_email.delay(subject='hogehoge')
    another_task.delay(message='hogehogehoge')
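
To verify that the messages actually reached SQS, you can query the approximate queue size with boto3. This is a sketch of my own, not part of PyQS; it assumes boto3 is installed and the credentials above are configured:


import boto3

sqs = boto3.client('sqs')
# Look up the queue PyQS created from the name in the @task decorator.
queue_url = sqs.get_queue_url(QueueName='queue0')['QueueUrl']
attrs = sqs.get_queue_attributes(
    QueueUrl=queue_url,
    AttributeNames=['ApproximateNumberOfMessages'],
)
print(attrs['Attributes']['ApproximateNumberOfMessages'])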

Start Worker

run_queue.sh


#!/bin/bash

# The worker imports qqq.tasks, so PYTHONPATH must include the project root.
export PYTHONPATH=`pwd`
export QUEUE='queue0'
pyqs $QUEUE

According to the GitHub README, the queue name should be something like "queue0.tasks.send_email" or "queue0.tasks.another_task". In practice, however, SQS does not allow periods in queue names, so the queue0 declared in the @task decorator becomes the queue name. I chased the source code for about two hours before I understood this.
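
If you want to confirm this yourself, listing the queues shows that the name matches the decorator argument. The following is my own boto3 sketch, not part of PyQS, and assumes boto3 plus the credentials above:


import boto3

# With the tasks above, this prints a queue URL ending in "queue0",
# not "queue0.tasks.send_email".
response = boto3.client('sqs').list_queues(QueueNamePrefix='queue0')
for url in response.get('QueueUrls', []):
    print(url)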

Summary

The appeal is that you can create a task queue right away if you have access to Amazon SQS, without having to set up a message broker yourself. Also, SQS is almost free, which I appreciate.

GitHub

The source is here
