When I wrote Lambda functions in Node.js, I kept getting tripped up by the asynchronous processing, so I decided to try out the recently released Python runtime for Lambda.
http://boto3.readthedocs.org/en/latest/guide/dynamodb.html
import boto3
import json
from boto3.dynamodb.conditions import Key, Attr
dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('MY_TABLE_NAME')
query
res = table.query(
    IndexName='MY_INDEX_NAME',
    # Key() takes the index's partition key attribute (assumed here to have the same name as the index)
    KeyConditionExpression=Key('MY_INDEX_NAME').eq(MY_INDEX_VALUE)
)
for row in res['Items']:
    print(row)
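Attr is imported above but not actually used in this snippet. As a rough sketch (the attribute name 'status' and its value are made up, not from the original table), a query can also be narrowed on a non-key attribute with a FilterExpression:
# hypothetical: filter query results on a non-key attribute after the key condition is applied
res = table.query(
    IndexName='MY_INDEX_NAME',
    KeyConditionExpression=Key('MY_INDEX_NAME').eq(MY_INDEX_VALUE),
    FilterExpression=Attr('status').eq('active')  # 'status' is an assumed attribute name
)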
delete_item
# the Key must contain the table's full primary key (partition key and, if defined, sort key)
table.delete_item(Key={'key1': key1, 'key2': key2})
put_item
table.put_item(
    Item={
        'key1': value1,
        'key2': value2
    }
)
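put_item overwrites any existing item with the same key. If you want to guard against that, it also accepts a ConditionExpression. A minimal sketch, assuming key1 is the table's partition key attribute:
from botocore.exceptions import ClientError

try:
    table.put_item(
        Item={
            'key1': value1,
            'key2': value2
        },
        # only write if no item with this partition key exists yet
        ConditionExpression='attribute_not_exists(key1)'
    )
except ClientError as e:
    if e.response['Error']['Code'] == 'ConditionalCheckFailedException':
        print('item already exists')
    else:
        raise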
get_item
# get_item returns at most one item, so the result is held in a single response dict
res = table.get_item(
    Key={
        'key1': key1
    }
)
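The matching item comes back under the 'Item' key of the response; if nothing matches, that key is simply absent. A small sketch of reading it defensively:
item = res.get('Item')
if item is None:
    print('not found')
else:
    print(item)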
batch_write
with table.batch_writer() as batch:
    for i in range(50):
        batch.put_item(
            Item={
                'account_type': 'anonymous',
                'username': 'user' + str(i),
                'first_name': 'unknown',
                'last_name': 'unknown'
            }
        )
http://boto3.readthedocs.org/en/latest/guide/dynamodb.html#batch-writing
DynamoDB's BatchWriteItem API only accepts up to 25 items per request, but batch_writer apparently takes care of splitting the items up and sending them 25 at a time, so you don't have to chunk them yourself.
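batch_writer can be used for bulk deletes in the same way. A sketch, where keys_to_delete is an assumed list of primary-key dicts using the placeholder key names from earlier:
with table.batch_writer() as batch:
    for key in keys_to_delete:  # e.g. [{'key1': ..., 'key2': ...}, ...]
        batch.delete_item(Key=key)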
list_tables
dynamodb = boto3.resource('dynamodb')
table_list = dynamodb.tables.all()
for table in table_list:
    print(table.table_name)