I recently needed to write (update) multiple records to DynamoDB at once, so I am writing this article to record the implementation and its caveats. I will show implementations in both Python and Node.js.
The table written in this article is assumed to be as follows.

Table name: `Users`

The columns and their types (DynamoDB itself is schemaless, but for convenience I treat each column as having the type of the data registered in it) are:

- `id`: Number (assumed to be the partition key)
- `name`: String
- `address`: String
- `friends`: List of Maps, each Map having `id`, `name`, and `address` keys

One of the things I want to demonstrate is that even a complicated type like `friends` (a List of Maps) can be written without problems, so I will set aside the question of whether this table design is good in the first place.
The code in this article runs on Lambda, so I assume a Lambda function is already set up.

For each language I will follow the order "full code example → brief commentary", and at the end I will cover points common to both.
```python
import boto3


def update_users(table, users_friends):
    # batch_writer() buffers put/delete requests and sends them
    # as BatchWriteItem calls, retrying unprocessed items for us.
    with table.batch_writer() as batch:
        for n in range(3):
            batch.put_item(
                Item={
                    "id": n + 1,
                    "name": "user" + str(n + 1),
                    "address": "address" + str(n + 1),
                    "friends": users_friends[n]
                }
            )


def lambda_handler(event, context):
    try:
        dynamoDB = boto3.resource("dynamodb")
        table = dynamoDB.Table("Users")
        user1_friends = [
            {"id": 2, "name": "user2", "address": "address2"},
            {"id": 3, "name": "user3", "address": "address3"}
        ]
        user2_friends = [
            {"id": 1, "name": "user1", "address": "address1"},
            {"id": 3, "name": "user3", "address": "address3"}
        ]
        user3_friends = [
            {"id": 1, "name": "user1", "address": "address1"},
            {"id": 2, "name": "user2", "address": "address2"}
        ]
        users_friends = [user1_friends, user2_friends, user3_friends]
        update_users(table, users_friends)
        return event["message"]  # what to return is up to you
    except Exception as e:
        print(e)
```
How the code is structured and what data it writes are not the point here, so feel free to skim them.

The important part is that `update_users` uses the `batch_writer` method available on the boto3 Table resource. That is, when you want to write multiple items, you issue the `PUT`s inside the following block:

```python
with table.batch_writer() as batch:
    # ...omitted
```

Each individual write is then done by calling `batch.put_item`, with the contents to write passed as its `Item` argument.

One more thing worth noting is that even a type like `List[Map]` can be written without problems. Essentially any supported type can be stored as the value of a given key, so in short you can pass the dictionary you want to write to `Item` as-is.
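To make that concrete, here is a minimal sketch of building such an `Item` dict. `build_user_item` is a hypothetical helper of my own, not part of boto3; one caveat worth knowing with the boto3 resource API is that Python `float` is not accepted for numbers, so use `int` or `decimal.Decimal` instead.

```python
def build_user_item(n, friends):
    """Hypothetical helper building the plain-dict Item for user n.

    Plain Python types go in as-is, nested lists/dicts included.
    Note: with boto3's DynamoDB resource, float is not accepted as a
    number type; use int or decimal.Decimal for numeric values.
    """
    return {
        "id": n,                       # int is fine
        "name": "user" + str(n),
        "address": "address" + str(n),
        "friends": friends,            # List[Map], passed through as-is
    }


item = build_user_item(1, [{"id": 2, "name": "user2", "address": "address2"}])
print(item["friends"][0]["name"])  # -> user2
```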
Finally, a small supplement: I put a `for` loop inside the `batch_writer` block, but of course you can also call `batch.put_item` multiple times explicitly, so if your writes don't fit neatly into a loop, you can simply write them out one by one.
Also, the maximum number of items a single batch write request can carry is 25, but `batch_writer` automatically splits the buffered items into chunks and resends any unprocessed items, so you can write your code without worrying about the item count.
```javascript
const AWS = require("aws-sdk");
const dynamoDB = new AWS.DynamoDB.DocumentClient({
  region: "ap-northeast-1"
});
const tableName = "Users";

exports.handler = (event, context) => {
  const user1Friends = [
    { id: 2, name: "user2", address: "address2" },
    { id: 3, name: "user3", address: "address3" }
  ];
  const user2Friends = [
    { id: 1, name: "user1", address: "address1" },
    { id: 3, name: "user3", address: "address3" }
  ];
  const user3Friends = [
    { id: 1, name: "user1", address: "address1" },
    { id: 2, name: "user2", address: "address2" }
  ];
  const usersFriends = [user1Friends, user2Friends, user3Friends];
  const params = {
    RequestItems: {
      [tableName]: usersFriends.map((e, i) => ({
        PutRequest: {
          Item: {
            id: i + 1,
            name: `user${i + 1}`,
            address: `address${i + 1}`,
            friends: e
          }
        }
      }))
    }
  };
  // what you do in the callback is up to you
  dynamoDB.batchWrite(params, (e, data) => {
    if (e) {
      context.fail(e);
    } else {
      context.succeed(data);
    }
  });
};
```
As with the Python version, the code and the data values themselves are not the point.

What matters here is the `batchWrite` method of `AWS.DynamoDB.DocumentClient`. Its parameters are rather quirky, so see [Class: AWS.DynamoDB.DocumentClient — AWS SDK for JavaScript](https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/DynamoDB/DocumentClient.html#batchWrite-property) for the details.

When I looked into multiple writes in Node.js, I often found articles using the low-level `batchWriteItem`, but many of those samples did not run as-is on the latest Node.js runtime available on Lambda at the time (as of September 2020), which tripped me up more than expected.

The big difference from `batchWriteItem` is that the objects under `Item` do not have to use DynamoDB's typed attribute format (like `name: {"S": "user"}`). In short, you can build plain JSON and pass it as the value of `Item` without worrying about types, and it works without problems. Naturally, that means even a complicated type like a List of Maps can be written by passing it as-is.
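To illustrate that difference, here is a toy converter of my own (covering only the types used in this article; the real SDK serializers handle many more) that turns a plain value into DynamoDB's typed attribute format:

```python
def to_typed(value):
    """Toy converter from a plain value to DynamoDB's typed format.
    Handles only the types this article uses (str, bool, numbers,
    list, dict); the real SDK serializers cover many more types."""
    if isinstance(value, str):
        return {"S": value}
    if isinstance(value, bool):        # check bool before int: bool is an int subclass
        return {"BOOL": value}
    if isinstance(value, (int, float)):
        return {"N": str(value)}       # numbers travel as strings
    if isinstance(value, list):
        return {"L": [to_typed(v) for v in value]}
    if isinstance(value, dict):
        return {"M": {k: to_typed(v) for k, v in value.items()}}
    raise TypeError(f"unsupported type: {type(value)!r}")


plain = {"id": 1, "name": "user1",
         "friends": [{"id": 2, "name": "user2"}]}
print(to_typed(plain))
# With the low-level batchWriteItem you would have to write the nested
# {"M": ...}/{"L": ...} form yourself; DocumentClient (and the boto3
# Table resource) accept the plain form instead.
```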
If you want to update items in a batch operation, you have to use `PUT`. The reason is simple: batch writes have no `UPDATE`, so it can't be helped, and each update must be described as a full `PUT` of the item. Also, although I did not cover it this time, batch writes can also `DELETE`; I'll just note that it exists.
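For reference, a delete in a batch write is expressed as a `DeleteRequest` carrying only the key, and puts and deletes can be mixed in one request. A sketch of building such parameters in plain Python (no AWS calls; the helper name is my own):

```python
def build_batch_params(table_name, puts, delete_keys):
    """Build BatchWriteItem-style parameters mixing puts and deletes.

    A put carries the whole item; a delete carries only the key.
    (With boto3's batch_writer you would instead call batch.put_item
    and batch.delete_item inside the same `with` block.)
    """
    requests = [{"PutRequest": {"Item": item}} for item in puts]
    requests += [{"DeleteRequest": {"Key": key}} for key in delete_keys]
    return {"RequestItems": {table_name: requests}}


params = build_batch_params(
    "Users",
    puts=[{"id": 4, "name": "user4", "address": "address4"}],
    delete_keys=[{"id": 1}],
)
print(params["RequestItems"]["Users"][1])  # -> {'DeleteRequest': {'Key': {'id': 1}}}
```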
So, what did you think? For batch operations I often see articles with code and only a line or two of explanation, and few that focus specifically on writing (for example, whether complicated types can even be written; in the worst case you only find out at run time). I stumbled a bit on this myself, so I wrote it down as a reminder. It may be a somewhat niche topic, but I hope you find it useful.
References:

- Amazon DynamoDB — Boto3 Docs 1.14.56 documentation
- Class: AWS.DynamoDB.DocumentClient — AWS SDK for JavaScript