I made a Slack bot that tweets in "Chama language". The topic is hololive-related, but technically this article covers the following, so it may be useful even if you don't care about the backstory:

・Creating a Slack bot with AWS Lambda: it reads text sent to a direct message with the bot and acts on it (and replies)
・Tweeting with the Twitter API
・Deploying AWS Lambda with serverless-framework
・Installing external Python modules with serverless-python-requirements

The other day, Captain Marine was streaming and talking about a game Haachama had proposed called **"Play in Chama language"**. (See the videos below.)
https://www.youtube.com/watch?v=T2yMNE_zb54
https://www.youtube.com/watch?v=IoOeMaCzuZY
Roughly speaking, Chama language **replaces every "to" sound in a sentence with "chama"**. For example:

"Akai Haato" -> "Akai Haachama"
"Tomato" -> "Chamamachama"
"Tonari no Totoro" (My Neighbor Totoro) -> "Chamanari no Chamachamaro"
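These examples boil down to plain string replacement once the text is written in kana. A minimal Python sketch of the rule (assuming hiragana-only input; this is just an illustration, not the bot's final code):

```python
def to_chama(hiragana_text):
    """Replace "to" (と) with "chama" (ちゃま) and "do" (ど) with "jama" (じゃま)."""
    return hiragana_text.replace('と', 'ちゃま').replace('ど', 'じゃま')

print(to_chama('とまと'))        # tomato -> ちゃままちゃま
print(to_chama('あかいはあと'))  # Akai Haato -> あかいはあちゃま
```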
And "Play in Chama language" itself is apparently a rhythm game, played in rotation, based on these rules.
Watching the stream, I was laughing at the meme-contaminated comment section along with the captain, and thought: "I'll make a 'Slack bot that tweets in Chama language' too, and write an article about how to build a Slack bot with AWS Lambda."
First, apply for and obtain a Twitter Developer account. (This is the most tedious part of the whole process.)
First, create a Lambda called chama-language-tweet-bot using serverless-framework.
(If you don't have serverless-framework, install it with `npm install -g serverless`.)
(If you haven't set up an AWS account and credentials yet, do that first. Searching for "how to use aws-cli" will turn up instructions; set your access keys with `aws configure`.)
```
$ mkdir chama-language-tweet-bot
$ cd chama-language-tweet-bot
$ npm init
$ serverless create --template aws-python3 --name chama-language-tweet-bot
$ ls
handler.py  node_modules  package-lock.json  package.json  serverless.yml
```
So far, this is the usual flow. (You can run `sls deploy` at this point to check that everything is configured properly.) This is just my preference, but handler.py is a hard name to navigate, so I renamed the file to slackbot.py here. Since we'll only create one function this time, leaving it as handler.py would be fine, but a single Lambda project usually holds multiple functions, and then handler.py becomes too ambiguous. So I usually create one .py file per function and point each function's handler at it.
During Slack bot setup, Slack runs a test to verify that the connection to your endpoint (here, Lambda) works. So before you reach that stage, the Lambda needs to be able to respond to that verification request.
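Concretely, Slack POSTs a `url_verification` event containing a `challenge` string, and the endpoint must echo that string back. A local sketch of the exchange (the token and challenge values below are made-up placeholders, not real Slack values):

```python
# Rough shape of Slack's url_verification request body
verification_body = {
    "token": "SLACK-VERIFY-TOKEN",
    "challenge": "test-challenge-string",
    "type": "url_verification",
}

def echo_challenge(body):
    """Return the challenge string Slack expects back during URL verification."""
    return body["challenge"]

print(echo_challenge(verification_body))  # → test-challenge-string
```

With the non-proxy (`integration: lambda`) setup used below, the request body arrives in the handler already parsed, which is why the Lambda code can read `event["body"]["challenge"]` directly.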
serverless.yml
```yaml
service: chama-language-tweet-bot
frameworkVersion: '2'

provider:
  name: aws
  runtime: python3.8
  stage: dev
  region: us-east-1

functions:
  slackbot:
    handler: slackbot.handler
    timeout: 200
    events:
      - http:
          path: slackbot
          method: post
          cors: true
          integration: lambda
```
slackbot.py
```python
# coding: utf-8
import json
import logging

# Log settings
logger = logging.getLogger()
logger.setLevel(logging.INFO)


def handler(event, context):
    logger.info(json.dumps(event))

    # Slack Event API authentication (URL verification)
    if "challenge" in event["body"]:
        return event["body"]["challenge"]

    return {
        'statusCode': 200,
        'body': 'ok'
    }
```
With the above in place, run `sls deploy` and make a note of the endpoint URL it prints.
Now let's create the Slack bot. Go to the Slack API site and create a new Slack App from "Create New App". Pick any name, and choose the workspace to install into under Development Slack Workspace. ~~Whatever you do, do not accidentally select your company workspace.~~
Once you have a Slack App, look for App Credentials and make a note of the Verification Token.
Next, enable the Slack API. This time we'll be using something called Slack Events, so open Event Subscriptions and turn Enable Events on.
After turning on Enable Events, put the Lambda endpoint in the Request URL. Then (if Lambda so far is done properly) it will be Verified. This completes the connection between Slack and Lambda.
Next, set which events the Event API reacts to. Select "message.im" from Add Bot User Event; this makes the bot react when a post is made in a direct message with it.
Next, give the bot permission to post messages. Find Scopes in OAuth & Permissions and add `chat:write` via "Add an OAuth Scope".
Now that you have set up SlackBot, press "Install to Workspace" in OAuth & Permissions to install SlackBot in your workspace.
Then, make a note of the OAuth Token displayed after that.
Now add the bot from the Slack client. Open "Add apps" in Slack, search for the name of the bot you just created, and add it. Then open the bot's profile and use "Copy member ID" from the "More" menu.
This completes the settings on the Slack side!
Now that the Slack bot exists, let's add the external modules Lambda will use. This time we need two external Python modules:
・tweepy
・pykakasi
We'll install these with the serverless-python-requirements plugin.
Now, introduce serverless-python-requirements.
```
$ npm install --save serverless-python-requirements
```
Then add the following to serverless.yml.
serverless.yml
```yaml
plugins:
  - serverless-python-requirements

custom:
  pythonRequirements:
    dockerizePip: true
```
Basically only `plugins` is required, but pykakasi is not a pure-Python module, so `custom` is needed as well. **(Also, Docker must be running when you run `sls deploy`.)**
Next, place requirements.txt in the same hierarchy as serverless.yml and write the external library you want to put.
requirements.txt
```
tweepy
pykakasi
```
Now `sls deploy` will bundle the external libraries along with your code.
Now, let's set the key here. Add environment to serverless.yml as follows.
serverless.yml
```yaml
service: chama-language-tweet-bot
frameworkVersion: '2'

provider:
  name: aws
  runtime: python3.8
  stage: dev
  region: us-east-1
  environment:
    SLACK_BOT_USER_ACCESS_TOKEN: ''
    SLACK_BOT_VERIFY_TOKEN: ''
    TWITTER_CONSUMER_KEY: ''
    TWITTER_CONSUMER_SECRET: ''
    TWITTER_ACCESS_TOKEN: ''
    TWITTER_ACCESS_TOKEN_SECRET: ''
    BOT_USER_ID: ''
```
The values are as follows:

- SLACK_BOT_USER_ACCESS_TOKEN: the OAuth Token shown when the bot was installed into the workspace
- SLACK_BOT_VERIFY_TOKEN: the Verification Token noted at the start of the Slack App setup
- TWITTER_CONSUMER_KEY / TWITTER_CONSUMER_SECRET / TWITTER_ACCESS_TOKEN / TWITTER_ACCESS_TOKEN_SECRET: the keys from your Twitter Developer account
- BOT_USER_ID: the bot's "member ID" copied from the Slack app
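All of these are read with `os.environ[...]` at runtime, so a missing variable fails with an opaque KeyError. A small helper (my own addition, not part of the original bot) can report which variables are still unset before anything runs:

```python
import os

# The environment variables the bot expects, as listed above
REQUIRED_VARS = [
    "SLACK_BOT_USER_ACCESS_TOKEN",
    "SLACK_BOT_VERIFY_TOKEN",
    "TWITTER_CONSUMER_KEY",
    "TWITTER_CONSUMER_SECRET",
    "TWITTER_ACCESS_TOKEN",
    "TWITTER_ACCESS_TOKEN_SECRET",
    "BOT_USER_ID",
]

def missing_env_vars(names=REQUIRED_VARS):
    """Return the names that are unset or empty in the environment."""
    return [n for n in names if not os.environ.get(n)]
```

You could call this at the top of the handler and return an error body early when the list is non-empty.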
Now, let's write the contents of Lambda. First, make sure your bot responds appropriately in a personal chat with your bot.
slackbot.py
```python
# coding: utf-8
import json
import os
import logging
import urllib.request

import tweepy
import pykakasi

# Log settings
logger = logging.getLogger()
logger.setLevel(logging.INFO)


def handler(event, context):
    logger.info(json.dumps(event))

    # Slack Event API authentication (URL verification)
    if "challenge" in event["body"]:
        return event["body"]["challenge"]

    # Check the token
    if not is_verify_token(event):
        return {'statusCode': 200, 'body': 'token error'}

    # Check for retransmission
    if "X-Slack-Retry-Num" in event["headers"]:
        return {'statusCode': 200, 'body': 'this request is retry'}

    # Ignore anything that is not a message event
    if not is_app_message(event):
        return {'statusCode': 200, 'body': 'this request is not message'}

    # Do not react to the bot's own posts
    if event["body"]["event"]["user"] == os.environ["BOT_USER_ID"]:
        return {'statusCode': 200, 'body': 'this request is not sent by user'}

    return {
        'statusCode': 200,
        'body': 'ok'
    }


def is_verify_token(event):
    token = event["body"]["token"]
    if token != os.environ["SLACK_BOT_VERIFY_TOKEN"]:
        return False
    return True


def is_app_message(event):
    return event["body"]["event"]["type"] == "message"
```
The code above does the following:

- Token check
- Retransmission check: if the Slack Events API doesn't receive statusCode 200 within 3 seconds, it retries the request on its own (up to three times). This Lambda finishes well under 3 seconds, but I added the check because a runaway loop had burned me before. The "X-Slack-Retry-Num" header stores the retry count and is absent on the first delivery, which is what this condition relies on.
- Check that the posted message was "posted in a direct message with the bot"
- Runaway prevention: do not respond if the posted message came from the bot itself. Without this, once the bot replies, it keeps reacting to its own posts and runs away.
*As an aside, Slack's system apparently identifies a post not by a unique message ID (the way Twitter assigns an ID to each tweet), but by the combination of the channel ID and the message's timestamp. So if you want to implement the bot so that it replies to a specific message, the channel ID and timestamp of the post are essential.*
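This bot doesn't need it, but if you wanted to reply in a thread under the triggering message, that channel + timestamp pair is exactly what the `thread_ts` parameter of Slack's chat.postMessage API expects. A sketch that only builds the payload (the channel ID and timestamp values are made-up examples):

```python
import json

def build_thread_reply(channel, thread_ts, text):
    """Build a chat.postMessage payload that replies in-thread.

    `thread_ts` is the `ts` field of the triggering message from the
    event body - Slack's channel-ID-plus-timestamp addressing in action.
    """
    return {
        "channel": channel,
        "thread_ts": thread_ts,
        "text": text,
    }

payload = build_thread_reply("D0123456789", "1609459200.000200", "done!")
print(json.dumps(payload))
```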
This time, when a post is made, I'd like to reply for confirmation, so I'll write this as well.
slackbot.py
```python
def post_message_to_channel(channel, message):
    url = "https://slack.com/api/chat.postMessage"
    headers = {
        "Content-Type": "application/json; charset=UTF-8",
        "Authorization": "Bearer {0}".format(os.environ["SLACK_BOT_USER_ACCESS_TOKEN"])
    }
    data = {
        "token": os.environ["SLACK_BOT_VERIFY_TOKEN"],
        "channel": channel,
        "text": message,
    }
    req = urllib.request.Request(url, data=json.dumps(data).encode("utf-8"), method="POST", headers=headers)
    urllib.request.urlopen(req)
```
If you pass the channel and text in the handler to this function as follows, the bot will reply "done!".
slackbot.py
```python
channel_id = event["body"]["event"]["channel"]
post_message_to_channel(channel_id, "done!")
```
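The code above ignores the API response, but chat.postMessage returns JSON with an `ok` field (and an `error` field on failure). A small parser (my own addition) makes failures visible in the logs:

```python
import json

def slack_response_ok(response_body):
    """Return (ok, error) parsed from a Slack Web API JSON response."""
    parsed = json.loads(response_body)
    return parsed.get("ok", False), parsed.get("error")

# Example responses in the shape the Slack Web API returns
print(slack_response_ok('{"ok": true}'))
print(slack_response_ok('{"ok": false, "error": "invalid_auth"}'))
```

In post_message_to_channel you could read `urllib.request.urlopen(req).read()` and pass it through this helper, logging the error string when ok is False.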
Next, translate it into "Chama language" and write the part to tweet the result.
slackbot.py
```python
def tweet(input_text):
    auth = tweepy.OAuthHandler(os.environ["TWITTER_CONSUMER_KEY"], os.environ["TWITTER_CONSUMER_SECRET"])
    auth.set_access_token(os.environ["TWITTER_ACCESS_TOKEN"], os.environ["TWITTER_ACCESS_TOKEN_SECRET"])
    api = tweepy.API(auth)

    # Convert kanji in the input to hiragana
    kakasi = pykakasi.kakasi()
    kakasi.setMode('J', 'H')
    conv = kakasi.getConverter()
    input_text = conv.do(input_text)

    # Chama-language replacements: "to" -> "chama", "do" -> "jama"
    input_text = input_text.replace('と', 'ちゃま')
    input_text = input_text.replace('ト', 'チャマ')
    input_text = input_text.replace('ど', 'じゃま')
    input_text = input_text.replace('ド', 'ジャマ')

    api.update_status(status=input_text)
```
slackbot.py
```python
# Call from the handler
input_text = event["body"]["event"]["text"]
tweet(input_text)
```
Here pykakasi converts the kanji in the input to hiragana, then every "to" (と/ト) in the text is replaced with "chama" (ちゃま/チャマ) and every "do" (ど/ド) with "jama" (じゃま/ジャマ). The replacement result is then tweeted.
Well, that's it. Run `sls deploy` and test it.
Yes, it works. Kanji conversion doesn't seem to be a problem either.
serverless.yml
```yaml
service: chama-language-tweet-bot
frameworkVersion: '2'

provider:
  name: aws
  runtime: python3.8
  stage: dev
  region: us-east-1
  environment:
    SLACK_BOT_USER_ACCESS_TOKEN: ''
    SLACK_BOT_VERIFY_TOKEN: ''
    TWITTER_CONSUMER_KEY: ''
    TWITTER_CONSUMER_SECRET: ''
    TWITTER_ACCESS_TOKEN: ''
    TWITTER_ACCESS_TOKEN_SECRET: ''
    BOT_USER_ID: ''

plugins:
  - serverless-python-requirements

custom:
  pythonRequirements:
    dockerizePip: true

functions:
  slackbot:
    handler: slackbot.handler
    timeout: 200
    events:
      - http:
          path: slackbot
          method: post
          cors: true
          integration: lambda
```
slackbot.py
```python
# coding: utf-8
import json
import os
import logging
import urllib.request

import tweepy
import pykakasi

# Log settings
logger = logging.getLogger()
logger.setLevel(logging.INFO)


def handler(event, context):
    logger.info(json.dumps(event))

    # Slack Event API authentication (URL verification)
    if "challenge" in event["body"]:
        return event["body"]["challenge"]

    # Check the token
    if not is_verify_token(event):
        return {'statusCode': 200, 'body': 'token error'}

    # Check for retransmission
    if "X-Slack-Retry-Num" in event["headers"]:
        return {'statusCode': 200, 'body': 'this request is retry'}

    # Ignore anything that is not a message event
    if not is_app_message(event):
        return {'statusCode': 200, 'body': 'this request is not message'}

    # Do not react to the bot's own posts
    if event["body"]["event"]["user"] == os.environ["BOT_USER_ID"]:
        return {'statusCode': 200, 'body': 'this request is not sent by user'}

    input_text = event["body"]["event"]["text"]
    channel_id = event["body"]["event"]["channel"]
    tweet(input_text)
    post_message_to_channel(channel_id, "done!")

    return {
        'statusCode': 200,
        'body': 'ok'
    }


def post_message_to_channel(channel, message):
    url = "https://slack.com/api/chat.postMessage"
    headers = {
        "Content-Type": "application/json; charset=UTF-8",
        "Authorization": "Bearer {0}".format(os.environ["SLACK_BOT_USER_ACCESS_TOKEN"])
    }
    data = {
        "token": os.environ["SLACK_BOT_VERIFY_TOKEN"],
        "channel": channel,
        "text": message,
    }
    req = urllib.request.Request(url, data=json.dumps(data).encode("utf-8"), method="POST", headers=headers)
    urllib.request.urlopen(req)


def is_verify_token(event):
    token = event["body"]["token"]
    if token != os.environ["SLACK_BOT_VERIFY_TOKEN"]:
        return False
    return True


def is_app_message(event):
    return event["body"]["event"]["type"] == "message"


def tweet(input_text):
    auth = tweepy.OAuthHandler(os.environ["TWITTER_CONSUMER_KEY"], os.environ["TWITTER_CONSUMER_SECRET"])
    auth.set_access_token(os.environ["TWITTER_ACCESS_TOKEN"], os.environ["TWITTER_ACCESS_TOKEN_SECRET"])
    api = tweepy.API(auth)

    # Convert kanji in the input to hiragana
    kakasi = pykakasi.kakasi()
    kakasi.setMode('J', 'H')
    conv = kakasi.getConverter()
    input_text = conv.do(input_text)

    # Chama-language replacements: "to" -> "chama", "do" -> "jama"
    input_text = input_text.replace('と', 'ちゃま')
    input_text = input_text.replace('ト', 'チャマ')
    input_text = input_text.replace('ど', 'じゃま')
    input_text = input_text.replace('ド', 'ジャマ')

    api.update_status(status=input_text)
```
requirements.txt
```
tweepy
pykakasi
```
I thought about testing it, but a bot with the default icon just isn't much fun. Since we've come this far, let's draw one. ...Yes, I drew it. ~~Drawing the icon at the end may take the longest of any step in this process.~~ The Slack bot's icon can be set from Display Information under the Slack App's Basic Information, like this.
Don't forget to press "Save Changes" after making changes.
Yes, looking good. This has a real "properly made" feel to it.
By the way, for converting input text to hiragana, MeCab with a suitable dictionary (UniDic, neologd, etc.) produces more accurate output than pykakasi. However, pykakasi is overwhelmingly easier to get into Lambda ~~and this is a joke project, so accuracy doesn't matter much~~, so I settled on pykakasi. If you want to embed MeCab and a dictionary in Lambda, you have to build them in an Amazon Linux environment, which means building on an EC2 instance (t2.medium or larger; t2.micro ran out of memory) and mounting the result on EFS. Is there an easy way to use MeCab (& UniDic) with Lambda...?