Here are the steps to implement a TweetBot on AWS Lambda. Some parts, such as the S3 settings, are omitted.
It is assumed that the JSON data already exists in S3, one file per day named yyyymmdd.json.
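The post does not reproduce the JSON structure, but the handler code below reads an array of scheduled tweets with hour, minute, text, and link fields, so it presumably looks something like this (the values are illustrative):

```json
[
  {"hour": 9,  "minute": 0,  "text": "Good morning", "link": "http://example.com/morning"},
  {"hour": 21, "minute": 30, "text": "Good night",   "link": "http://example.com/night"}
]
```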
First, the development environment. On a Mac, you will be working in the terminal.
After creating a working directory (tentatively lambda_function here), execute the following commands in order.
pip install virtualenv
virtualenv /{your path}/lambda_function
source /{your path}/lambda_function/bin/activate
cd /{your path}/lambda_function/
pip install python-lambda-local
pip install lambda-uploader
virtualenv creates an isolated Python environment in the specified directory, which lets you try out modules without touching your system installation.
python-lambda-local is a tool for executing Lambda functions in your local environment.
To use it, create a file called event.json containing a tentative input value and pass it to the command.
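For this bot the handler never reads the event payload, so a minimal placeholder event.json is enough:

```json
{}
```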
Run it with the following command:
python-lambda-local -f {handler function name} {source file name}.py ./event.json -t 5
The -t option at the end specifies Lambda's execution timeout in seconds (optional).
lambda-uploader finally uploads the source to Lambda. To use it, create lambda.json and requirements.txt and fill in the settings. lambda.json:
{
"name": "{Lambda function name}",
"description": "{Description}",
"region": "ap-northeast-1",
"handler": "{Executable file name(extension.No py)}.{Execution function name}",
"role": "{Specify arn of role}",
"timeout": 300,
"memory": 128
}
Create a role in advance that specifies the permissions the function will have when it runs.
requirements.txt:
requests_oauthlib
beautifulsoup4
pytz
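The post does not show the role's policy, but for this function it must be assumable by Lambda and allow reading the JSON file from S3 (plus the usual CloudWatch Logs permissions). A sketch of the permissions policy, with the bucket name as a placeholder:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::{your bucket name}",
        "arn:aws:s3:::{your bucket name}/*"
      ]
    },
    {
      "Effect": "Allow",
      "Action": ["logs:CreateLogGroup", "logs:CreateLogStream", "logs:PutLogEvents"],
      "Resource": "*"
    }
  ]
}
```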
Create the following file, canary.py (the filename is arbitrary). Note that the code uses urllib2, i.e. it targets the Python 2.7 Lambda runtime.
from boto3 import Session, resource
from requests_oauthlib import OAuth1Session
from bs4 import BeautifulSoup
import pytz
from pprint import pprint
from datetime import datetime,timedelta
import urllib2
import random
import os.path
import urllib
import json
# Twitter API
CK = '{your twitter CK}'
CS = '{your twitter CS}'
AT = '{your twitter AT}'
AS = '{your twitter AS}'
TMP_DIR = '/tmp'
UPDATE_URL = 'https://api.twitter.com/1.1/statuses/update.json'
UPDATE_MEDIA = 'https://upload.twitter.com/1.1/media/upload.json'
IMAGES_SELECTOR = 'img'
IMAGES_NUM = 4
AWS_S3_BUCKET_NAME = "{* enter your bucket name *}"
INTERVAL = 1
def _exists(bucket, key):
return 'Contents' in Session().client('s3').list_objects(Prefix=key, Bucket=bucket)
def _getTweetList(keyName):
    if not _exists(AWS_S3_BUCKET_NAME, keyName):
        print("No JSON FILE")
        return False
s3 = resource('s3', region_name='ap-northeast-1')
obj = s3.Bucket(AWS_S3_BUCKET_NAME).Object(keyName)
response = obj.get()
body = response['Body'].read()
return body.decode('utf-8')
def _getImages(url):
img_urls = []
html = urllib2.urlopen(url)
soup = BeautifulSoup(html, "html.parser")
for img in soup.select(IMAGES_SELECTOR):
img_urls.append(img['src'])
if len(img_urls) > IMAGES_NUM:
fetch_urls = random.sample(img_urls, IMAGES_NUM)
else:
fetch_urls = img_urls
filenames = []
count = 1
for img_url in fetch_urls:
name, ext = os.path.splitext(img_url)
filename = TMP_DIR+'/'+str(count)+ext
urllib.urlretrieve(img_url, filename)
filenames.append(filename)
count = count+1
return filenames
def _uploadTweetImage( images ):
media_ids = []
tw = OAuth1Session(CK, CS, AT, AS)
for image in images:
files = {"media": open(image, 'rb')}
req_media = tw.post(UPDATE_MEDIA, files = files)
if req_media.status_code == 200:
media_ids.append(json.loads(req_media.text)['media_id'])
else:
media_ids.append(req_media.status_code)
return media_ids
def _tweet(text, media_ids):
params = {"status": text, "media_ids": media_ids}
tw = OAuth1Session(CK, CS, AT, AS)
req = tw.post(UPDATE_URL, params = params)
if req.status_code == 200:
return text
else:
return req.status_code
def _testAllFunction(event, context):
ret = {}
ret['getImages'] = _getImages("http://yahoo.co.jp")
ret['uploadTweetImage'] = _uploadTweetImage([TMP_DIR+'/1.jpg', TMP_DIR+'/2.jpg', TMP_DIR+'/3.jpg', TMP_DIR+'/4.jpg'])
ret['tweet'] = _tweet("Hello", [])
ret['exists'] = _exists(AWS_S3_BUCKET_NAME, '20160209.json')
ret['getTweetList'] = _getTweetList('20160209.json')
return ret
def lambda_handler(event, context):
ret = {}
jst = pytz.timezone('Asia/Tokyo')
jst_now = datetime.now(jst)
today = jst_now.strftime("%Y%m%d")
object_name = today + ".json"
pprint(object_name)
json_data = _getTweetList(object_name)
if ( json_data != False ):
tweets = json.loads(json_data)
td_now = timedelta(hours=jst_now.hour, minutes=jst_now.minute)
ret['main'] = [{'now': str(jst_now.hour)+':'+str(jst_now.minute)}]
targetTweetList = []
for tweet in tweets:
td_tweet = timedelta(hours=tweet["hour"], minutes=tweet["minute"])
if(td_now < td_tweet and (td_tweet - td_now).seconds/60 <= INTERVAL):
pprint(tweet)
targetTweetList.append( { "text" : tweet["text"], "link": tweet["link"] } )
pprint(targetTweetList)
for ttweet in targetTweetList:
images = _getImages(ttweet["link"])
media_ids = _uploadTweetImage(images)
status = _tweet(ttweet["text"], media_ids)
ret['main'].append(status)
else:
ret['main'] = "no data"
return ret
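The core of the scheduler is the window check in lambda_handler: a tweet fires when its (hour, minute) lies within INTERVAL minutes after the current time. A minimal Python 3 sketch of just that check, extracted from the handler above (the function name is mine):

```python
from datetime import timedelta

INTERVAL = 1  # minutes; should match how often Lambda invokes the function


def is_due(now_hour, now_minute, tweet_hour, tweet_minute, interval=INTERVAL):
    """Return True if the tweet's scheduled time falls within the next
    `interval` minutes, exclusive of "right now", as in the handler."""
    td_now = timedelta(hours=now_hour, minutes=now_minute)
    td_tweet = timedelta(hours=tweet_hour, minutes=tweet_minute)
    return td_now < td_tweet and (td_tweet - td_now).seconds // 60 <= interval


# A 09:00 run picks up a tweet scheduled for 09:01,
# but not one at 09:02 (too far out) or 08:59 (already past).
print(is_due(9, 0, 9, 1))   # True
print(is_due(9, 0, 9, 2))   # False
```

One consequence of this comparison worth knowing: a tweet scheduled for 00:00 is never picked up by a 23:59 run, because the timedelta comparison does not wrap around midnight.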
A brief explanation of the flow.
After creating the file above, first install the libraries so you can try it locally:
pip install requests_oauthlib
pip install beautifulsoup4
pip install pytz
pip install boto3
Check locally.
python-lambda-local -f lambda_handler canary.py event.json
If there is no error, it is OK.
Fill in the following in lambda.json
{
"name": "Canary",
"description": "sugoroku schedule tweet",
"region": "ap-northeast-1",
"handler": "canary.lambda_handler",
"role": "{Specify arn of role}",
"timeout": 300,
"memory": 128
}
Since requirements.txt has already been created, just execute the following:
lambda-uploader
If it succeeds, a Canary function should now appear in the Lambda console in the AWS Management Console.
First, run it with the Test button in the management console and confirm it executes on Lambda. If there is no problem, set up the schedule: since INTERVAL is 1 minute, the function needs to run every minute (e.g. a Scheduled Event with rate(1 minute)).
Add it from Event sources > Add event source.
After that, put an actual scheduled post in the JSON file and confirm the bot tweets it.
As an aside, though I didn't go into it here, my impression of working with Python:
a few of the points above gave me trouble, but overall it was easy to work with.