The plan: create a Lambda function **that updates a group of Lambda functions**, as an attempt at continuous deployment for Lambda.
Code running on AWS Lambda can be deployed in three ways: editing it inline in the console, uploading a zip file, or pointing Lambda at a zip stored in S3.
Honestly, when I first saw those options, I assumed the S3 option meant Lambda would fetch the file from S3 on every invocation. If that were true, simply uploading to S3 would deploy the code on the spot.
As far as I can tell from the actual behavior, that is not the case. So, a little disappointed, I had been deploying from my local machine — until it occurred to me that the AWS SDK can also manipulate Lambda itself.
I wondered whether I could build a mechanism along those lines, so I wrote a bit of code and tried it out.
It is written in Python, purely for my own convenience.
Define a Lambda function whose inline code is the upsert_lambda.py shown in this Gist.
The function itself should not need much memory. Depending on what it has to process, though, it makes a fair number of API calls, so give it a generous timeout. (I set it to 30 seconds for the time being.)
(A rough walkthrough, with the details squeezed into the comments.)
upsert_lambda.py

```python
# -*- coding: utf-8 -*-
import json
import zipfile

import boto3
from botocore.exceptions import ClientError


class S3Object(object):
    """Small value class to simplify passing the S3 event details around."""
    # <__init__ and the like omitted>


def lambda_handler(event, context):
    """Entry point invoked by AWS Lambda on the S3 notification."""
    records = event.get('Records', [])
    for record in records:
        s3_object = S3Object(record['awsRegion'],
                             record['s3']['bucket']['name'],
                             record['s3']['object']['key'])
        _update_functions('sharequiz', 'test', s3_object)


def _update_functions(project, env, s3_object):
    # <Download the uploaded zip from s3_object to project_zip_path; omitted>
    # Read the function list (functions.json) out of the uploaded zip
    with zipfile.ZipFile(project_zip_path) as zfp:
        handlers_json = zfp.read('functions.json')
    functions = json.loads(handlers_json)
    _lambda = boto3.client('lambda')
    for function in functions:
        try:
            # Try to create the Lambda function first. As I recall, at the
            # time of writing, get_function also raised an exception for a
            # missing function, so create-and-catch it is.
            _lambda.create_function(
                # <Arguments omitted>
            )
        except ClientError as err:
            # It already exists: overwrite its configuration and code
            _lambda.update_function_configuration(**function)
            _lambda.update_function_code(
                # <Arguments omitted>
            )
```
S3 can notify Lambda of events such as object creation and invoke a specific Lambda function.[^1] Using this mechanism, **the Lambda function above gets called whenever a file is uploaded to the specified folder of the bucket**.
As long as you set this part up according to the Lambda documentation, there is nothing tricky about it.
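If you prefer to script the notification setup instead of clicking through the console, it can be done with boto3 as well. A sketch under assumptions: the bucket name, function name, ARNs, and the `deploy/` prefix are all placeholders, not values from the original setup.

```python
def build_notification_config(function_arn, prefix='deploy/', suffix='.zip'):
    # The shape expected by s3.put_bucket_notification_configuration:
    # fire on object creation under the given prefix/suffix.
    return {
        'LambdaFunctionConfigurations': [{
            'LambdaFunctionArn': function_arn,
            'Events': ['s3:ObjectCreated:*'],
            'Filter': {'Key': {'FilterRules': [
                {'Name': 'prefix', 'Value': prefix},
                {'Name': 'suffix', 'Value': suffix},
            ]}},
        }]
    }


def wire_up(bucket, function_arn):
    import boto3  # deferred so the config builder stays usable offline
    # S3 must be permitted to invoke the function before the
    # notification configuration is accepted.
    boto3.client('lambda').add_permission(
        FunctionName=function_arn,
        StatementId='allow-s3-invoke',
        Action='lambda:InvokeFunction',
        Principal='s3.amazonaws.com',
        SourceArn='arn:aws:s3:::{}'.format(bucket),
    )
    boto3.client('s3').put_bucket_notification_configuration(
        Bucket=bucket,
        NotificationConfiguration=build_notification_config(function_arn),
    )
```

Without the `add_permission` call, S3 rejects the notification configuration, which is an easy thing to trip over.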
One thing to note: include a JSON file in the zip (I named it functions.json) listing the arguments to pass to boto3's lambda.create_function and lambda.update_function_code.
Like this:
functions.json

```json
[{
    "Handler": "path/to/module.some_function",
    "Role": "arn:aws:iam::your-role",
    "FunctionName": "some_function",
    "Timeout": 3,
    "MemorySize": 128,
    "Description": "Easy processing"
}, {
    "Handler": "path/to/module.other_function",
    "Role": "arn:aws:iam::your-role",
    "FunctionName": "other_function",
    "Timeout": 30,
    "MemorySize": 1280,
    "Description": "Heavy processing"
}]
```
Upload the finished zip to the bucket and folder that the Lambda notification is configured for.
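Building and uploading that zip can itself be scripted. A sketch under assumptions — the function names, bucket, and `deploy/project.zip` key are placeholders for whatever your notification is configured to watch:

```python
import io
import json
import zipfile


def build_deploy_zip(manifest, source_files):
    # manifest: the functions.json list; source_files: {archive_name: contents}
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, 'w', zipfile.ZIP_DEFLATED) as zfp:
        zfp.writestr('functions.json', json.dumps(manifest, indent=2))
        for arcname, contents in source_files.items():
            zfp.writestr(arcname, contents)
    buf.seek(0)
    return buf


def deploy(buf, bucket, key='deploy/project.zip'):
    import boto3  # deferred so zip building stays usable offline
    # Uploading here is what fires the S3 notification that, in turn,
    # invokes the updater function defined above.
    boto3.client('s3').upload_fileobj(buf, bucket, key)
```

From there, a deploy is one call: `deploy(build_deploy_zip(manifest, sources), 'my-deploy-bucket')`.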
(If I find time after posting, I'll add a video or GIF here.)
That's it — an introduction to the code I wrote before I learned that lambda-uploader existed.