Cisco Spark has a bot feature. Instead of mentioning and talking to a person, you mention and talk to a bot; the bot treats the message as a command, has it processed, and returns the result as a message (strictly speaking, the bot only relays the message, and the actual processing is done by another program).
What kind of processing is done and what kind of result is returned is up to your idea. Here we build the simplest possible example: a bot that responds "pong" to the message "ping". AWS Lambda + Amazon API Gateway will be used for the actual processing part.
Note that when the bot HTTP POSTs to the Target URL in 2., the payload **does not contain the message itself, only the message ID**. To get the message itself, you need to do an HTTP GET from Lambda to Cisco Spark and retrieve the message corresponding to that ID. You prepare this program yourself (6 in the figure). You also need a program that posts the result back to Cisco Spark via HTTP POST (7 in the figure).
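For reference, the notification body that Cisco Spark POSTs to the Target URL looks roughly like the following (shown here as a Python dict with placeholder values; the exact set of fields may differ). The Lambda function built below reads the message ID from data['id'] and the room ID from data['roomId']:

webhook_notification = {
    'id' : 'xxxxxxxx',                  # webhook ID
    'name' : 'My Awesome Webhook',
    'resource' : 'messages',
    'event' : 'created',
    'data' : {
        'id' : 'xxxxxxxx',              # message ID (fetched via HTTP GET, 6 in the figure)
        'roomId' : 'xxxxxxxx',          # room ID (used for the HTTP POST reply, 7 in the figure)
        'personId' : 'xxxxxxxx',
        'personEmail' : 'user@example.com',
        'created' : '2017-08-04T00:00:00.000Z'
    }
}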
You need to log in as a real user to create a bot, because a bot is registered in association with an actual user.
Go to the following site and click "Create a Bot": https://developer.ciscospark.com/add-app.html
Fill in all three of the following items on the next screen:
If you enter them correctly, the Access Token for the bot will be displayed, so make a note of it. It does not expire, so handle it with care.
Open Cisco Spark, go to any room, and add the bot to the room in the same way you would add a person to a room (search by bot name, then add).
When you post a message to Cisco Spark, the bot relays it and passes it to another program via HTTP POST. That program can process the message in various ways, but usually the result is returned to the bot (a so-called callback) and displayed in Cisco Spark. This callback mechanism over HTTP is called a webhook.
The Target URL is the most important webhook setting: it is the URL, reachable on the Internet, to which the bot throws its HTTP POST.
The Target URL and the program behind it must be prepared by the bot's creator. The conventional way would be to prepare a server, install an OS, run a web server, assign an IP address, and then assign a URL, but that takes a lot of time and effort. Here we prepare the Target URL with API Gateway and run the actual program on Lambda.
To determine the Target URL, you need to configure Lambda + API Gateway first, and then set up the Cisco Spark webhook as described in 4-1.1.
This section is based on the tutorial [Using AWS Lambda with Amazon API Gateway (On-Demand Over HTTPS)](http://docs.aws.amazon.com/ja_jp/lambda/latest/dg/with-on-demand-https-example.html), with the steps customized as needed for the Cisco Spark bot.
It is assumed that the AWS Management Console and AWS CLI have already been set up and are accessible. If not, please complete the setup by referring to "Step 1: Preparation" in this article.
A deployment package is a zip archive of the following two items: the function code (LF4CiscoSpark.py) and the requests library it depends on.
Follow the steps below to create it:
$ vi LF4CiscoSpark.py
---
from __future__ import print_function
import boto3
import json
import requests
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

access_code = ''      # Enter the bot's Access Token
botDisplayName = ''   # Enter the bot name


# Get the message body that corresponds to the message ID in the webhook payload (6 in the figure)
def sendSparkGET(event):
    url = 'https://api.ciscospark.com/v1/messages/{0}'.format(event.get('data')['id'])
    headers = {
        'Authorization' : 'Bearer ' + access_code,
        'Content-Type' : 'application/json'
    }
    r1 = requests.get(url, headers = headers)
    return json.loads(r1.text)


# Post "pong" back to the room the original message came from (7 in the figure)
def sendSparkPOST(event, message_detail):
    url = 'https://api.ciscospark.com/v1/messages/'
    headers = {
        'Authorization' : 'Bearer ' + access_code,
        'Content-Type' : 'application/json'
    }
    payload = {
        "roomId" : event.get('data')['roomId'],
        "text" : 'pong'
    }
    r1 = requests.post(url, headers = headers, data = json.dumps(payload))
    return True


def handler(event, context):
    print("Received event: " + json.dumps(event, indent=2))
    message_detail = sendSparkGET(event)
    bot_command = message_detail['text']
    # The key must exactly match the message text (mention + command);
    # add a space between the bot name and 'ping' if the actual text contains one
    bot_commands = {
        botDisplayName + 'ping' : lambda x, y : sendSparkPOST(x, y)
    }
    if bot_command in bot_commands:
        return bot_commands[bot_command](event, message_detail)
    else:
        raise ValueError('Unrecognized operation')
---
(Run zip from inside site-packages so that the requests package sits at the root of the archive; otherwise Lambda cannot import it.)
$ cd /(path-to-site-packages)/site-packages
$ zip -r /(path-to-working-directory)/LF4CiscoSpark.zip requests
$ cd /(path-to-working-directory)
$ zip -g LF4CiscoSpark.zip LF4CiscoSpark.py
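To check that the archive has the expected layout (LF4CiscoSpark.py and the requests/ directory at the root), you can list its contents:

$ unzip -l LF4CiscoSpark.zip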
Mention the bot and send a "ping" message, and the bot will reply "pong". If you want to support a command other than "ping", add its handling to bot_commands.
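For example, a hypothetical "hello" command could be added like this (sendSparkHello is an illustrative function you would write yourself, e.g. a copy of sendSparkPOST that sends 'hello!' instead of 'pong'):

bot_commands = {
    botDisplayName + 'ping'  : lambda x, y : sendSparkPOST(x, y),
    botDisplayName + 'hello' : lambda x, y : sendSparkHello(x, y)
}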
IAM is the mechanism that controls which AWS resources users can access and how they can be used. The general setup procedure has three steps: 1) create a role, 2) create a policy, and 3) attach the policy to the role. Since a standard (AWS managed) policy is used this time, step 2) is not necessary. When the procedure is complete, you will see a string called the Role ARN; it is needed to create the Lambda function in the next step. A minimal CLI sketch is shown below.
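As one way to do this from the CLI (the role name lambda4ciscospark and the file name trust-policy.json are placeholders of my own choosing; you can equally do the same from the IAM console), first prepare the trust policy that lets Lambda assume the role:

$ vi trust-policy.json
---
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "lambda.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
---

Then create the role and attach the AWS managed policy AWSLambdaBasicExecutionRole. The "Arn" field in the create-role output is the Role ARN used in the next step:

$ aws iam create-role \
--role-name lambda4ciscospark \
--assume-role-policy-document file://trust-policy.json \
--profile adminuser
$ aws iam attach-role-policy \
--role-name lambda4ciscospark \
--policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole \
--profile adminuser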
$ aws lambda create-function --region us-west-2 \
--function-name LF4CiscoSpark \
--zip-file fileb://LF4CiscoSpark.zip \
--role arn:xxxxxxxxxxxxxxxxxx \
--handler LF4CiscoSpark.handler \
--runtime python3.6 \
--profile adminuser
$ aws lambda update-function-code \
--function-name LF4CiscoSpark \
--zip-file fileb://LF4CiscoSpark.zip
This time, we will use the AWS CLI to create the API Gateway. There are quite a few steps, but it is easier to follow if you check the result of each CLI command in the AWS console as you go.
$ aws apigateway create-rest-api \
--name API4CiscoSpark \
--region us-west-2 \
--profile adminuser
(Execution result below)
{
"name": "API4CiscoSpark",
"createdDate": 1501839827,
"id": "" # API ID
}
AWS console display result (API Gateway):
$ aws apigateway get-resources \
--rest-api-id (API ID)
(Execution result below)
{
"items": [
{
"path": "/",
"id": "" # ROOT RESOURCE ID
}
]
}
$ aws apigateway create-resource \
--rest-api-id (API ID) \
--parent-id (ROOT RESOURCE ID) \
--path-part Resource4CiscoSpark
(Execution result below)
{
"pathPart": "Resource4CiscoSpark",
"parentId": "",
"path": "/Resource4CiscoSpark",
"id": "" # RESOURCE ID
}
AWS console display result (API Gateway):
$ aws apigateway put-method \
--rest-api-id (API ID) \
--resource-id (RESOURCE ID) \
--http-method POST \
--authorization-type NONE
AWS console display result:
$ aws apigateway put-integration \
--rest-api-id (API ID) \
--resource-id (RESOURCE ID) \
--http-method POST \
--type AWS \
--integration-http-method POST \
--uri arn:aws:apigateway:(region):lambda:path/2015-03-31/functions/arn:aws:lambda:(region):(Account number):function:LF4CiscoSpark/invocations
AWS console display result (API Gateway):
$ aws apigateway put-method-response \
--rest-api-id (API ID) \
--resource-id (RESOURCE ID) \
--http-method POST \
--status-code 200 \
--response-models "{\"application/json\": \"Empty\"}"
$ aws apigateway put-integration-response \
--rest-api-id (API ID) \
--resource-id (RESOURCE ID) \
--http-method POST \
--status-code 200 \
--response-templates "{\"application/json\": \"\"}"
AWS console display result (API Gateway):
$ aws apigateway create-deployment \
--rest-api-id (API ID) \
--stage-name prod
AWS console display result (API Gateway):
The URL shown to the right of "Invoke URL" is the Target URL for the Cisco Spark webhook; with the settings above it has the form https://(API ID).execute-api.us-west-2.amazonaws.com/prod/Resource4CiscoSpark.
$ aws lambda add-permission \
--function-name LF4CiscoSpark \
--statement-id apigateway \
--action lambda:InvokeFunction \
--principal apigateway.amazonaws.com \
--source-arn "arn:aws:execute-api:us-west-2:(Account number):(API ID)/prod/POST/Resource4CiscoSpark"
AWS console display result (*AWS Lambda*):
To register the webhook, send an HTTP POST request to the Cisco Spark API as shown below. The roomId used in the filter can be found via the Rooms API (a minimal lookup sketch follows the webhook example).
import requests
import json

access_code = ''   # Enter the bot's Access Token here
url = 'https://api.ciscospark.com/v1/webhooks'
headers = {
    'Authorization' : 'Bearer ' + access_code,
    'Content-Type' : 'application/json'
}
payload = {
    'name' : 'My Awesome Webhook',
    'targetUrl' : 'https://example.com/mywebhook',   # Replace with the API Gateway Invoke URL (Target URL)
    'resource' : 'messages',
    'event' : 'created',
    'filter' : 'roomId=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'
}
r = requests.post(url, headers = headers, data = json.dumps(payload))
print(r.status_code, r.text)
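As a reference, here is a minimal sketch for looking up the roomId used in the filter above; it simply lists the rooms visible to the token you use via the Rooms API:

import requests

access_code = ''   # Access Token of the account whose rooms you want to list
url = 'https://api.ciscospark.com/v1/rooms'
headers = {
    'Authorization' : 'Bearer ' + access_code
}
r = requests.get(url, headers = headers)
for room in r.json()['items']:
    print(room['id'], room['title'])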
Mention the bot in Cisco Spark and type "ping" and the bot will return "pong".