Trying out AWS Kinesis Firehose

Introduction

Kinesis Firehose was announced at AWS re:Invent 2015 in October 2015.

I was pleasantly surprised by this. Even if you don't have a server to collect logs (from a client app or IoT device), you can store logs in S3 or Redshift with zero server maintenance! Much like the Google Analytics library, if you just call a beacon-style API, logs accumulate in Redshift without you having to think about anything, and data analysis becomes very easy. This brings us one step closer to Amazon's 2-tier architecture!

So in this article I'll share the results of my research, including the Firehose settings and a Python script. Please note that this article reflects the state of things as of November 2015.

Conclusion

To state the conclusion first, the sections below show how Kinesis Firehose can be used at the moment.

By the way, I tried to incorporate the Java version of the AWS Firehose SDK into an Android app, but I couldn't get it working, either because my Android skills were lacking or because a conflict between the AWS Java SDK and the Android Java libraries could not be resolved.

So, let's run Firehose in Python below.

Set up Kinesis Firehose

First, configure the Firehose side so that received logs are saved to S3. From Kinesis in the AWS Console, select Go to Kinesis Firehose and then Create Delivery Stream. (Kinesis Firehose currently supports only some regions, such as US Oregon; Tokyo does not work yet.)

Step1 Create Delivery Stream. First, set where the received logs will be delivered.

Step1.png

Step2 Configuration. Set the timing for writing received logs to S3, whether to compress them, and so on. You can leave everything at the default, but I changed the Buffer interval to 60 (the minimum, 60 seconds) so that data is written to S3 every 60 seconds. Now, as long as logs are coming in, they will be written to S3 as a file every 60 seconds.

Step2.png

Step3 Review. This is the final confirmation. If everything looks good, complete the setup with Create Delivery Stream.
Step3.png
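The console steps above can also be scripted with boto3's `create_delivery_stream`. Here is a minimal sketch; the bucket name and stream name match this walkthrough, while the IAM role ARN is a hypothetical placeholder you would replace with your own delivery role:

```python
# Assumed names matching the console walkthrough; the role ARN is hypothetical.
STREAM_NAME = 'Kinesis-S3-Test'
BUCKET_ARN = 'arn:aws:s3:::kinesis-firehose-test1'
ROLE_ARN = 'arn:aws:iam::123456789012:role/firehose-delivery-role'

def build_s3_destination():
    """Build the S3 destination config: flush to S3 every 60 seconds, no compression."""
    return {
        'RoleARN': ROLE_ARN,
        'BucketARN': BUCKET_ARN,
        'BufferingHints': {
            'SizeInMBs': 5,            # flush when 5 MB have accumulated...
            'IntervalInSeconds': 60,   # ...or every 60 seconds (the minimum), whichever first
        },
        'CompressionFormat': 'UNCOMPRESSED',
    }

def create_stream():
    """Create the delivery stream (requires AWS credentials to actually run)."""
    import boto3  # lazy import: only needed when calling the API
    client = boto3.client('firehose', region_name='us-west-2')
    return client.create_delivery_stream(
        DeliveryStreamName=STREAM_NAME,
        S3DestinationConfiguration=build_s3_destination(),
    )
```

This mirrors the Step 2 settings: the 60-second `IntervalInSeconds` is what makes a file appear in S3 every minute while logs are flowing.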

Send Log to Kinesis Firehose on Python

Next, send logs from a Python script to the Delivery Stream named Kinesis-S3-Test created above.

IAM User settings

Create an IAM user for accessing Kinesis from the Python script. Grant it the `AmazonKinesisFirehoseFullAccess` permission and issue an access key / secret key.

Python Script settings

Create the following Python code.

What it does: every second, it sends the current time in the format `2015-11-13 13:34:21` to Kinesis Firehose with `client.put_record`.


```python
#!/usr/bin/env python
# -*- coding: utf-8 -*-

import boto3, datetime, time

def main():
  # Credentials of the IAM user created above (dummy values here).
  accesskey = 'AKIAABCDEFGHIJK2345PQ'
  secretkey = 'CUQVma+ilWkC7FOU8isueWKWUGk7GB'
  region    = 'us-west-2'
  client = boto3.client('firehose', aws_access_key_id=accesskey, aws_secret_access_key=secretkey, region_name=region)

  # Send the current time once a second, 100 times in total.
  # The trailing newline keeps records separated in the S3 file.
  count = 0
  while count < 100:
    response = client.put_record(
        DeliveryStreamName='Kinesis-S3-Test',
        Record={
            'Data': datetime.datetime.now().strftime('%Y-%m-%d %H:%M:%S') + '\n'
        }
    )
    print(response)
    count += 1
    time.sleep(1)

if __name__ == '__main__':
  main()
```


Check S3 bucket

Go to S3 in the AWS Console and open the bucket named `kinesis-firehose-test1`. A log file should have been created under a folder like `kinesis-firehose-test1/YYYY/MM/DD/HH`. Check that the logs sent by the Python script have been saved.
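Firehose prefixes object keys with the UTC delivery hour, so you can compute which "folder" to look in and list it from a script instead of the console. A small sketch (the bucket name is the one from this walkthrough; listing requires AWS credentials):

```python
import datetime

BUCKET = 'kinesis-firehose-test1'

def hourly_prefix(dt):
    """Return the YYYY/MM/DD/HH key prefix Firehose uses for objects delivered at dt (UTC)."""
    return dt.strftime('%Y/%m/%d/%H')

def list_delivered_logs(dt):
    """List the log objects delivered during the given hour."""
    import boto3  # lazy import: only needed when actually calling S3
    s3 = boto3.client('s3', region_name='us-west-2')
    resp = s3.list_objects_v2(Bucket=BUCKET, Prefix=hourly_prefix(dt))
    return [obj['Key'] for obj in resp.get('Contents', [])]
```

For example, `list_delivered_logs(datetime.datetime.utcnow())` returns the keys of the files Firehose has written in the current hour.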

Step4.png

Summary

For now, I've sent logs from Python to Kinesis Firehose. Combined with the Kinesis Agent, I think it makes a good replacement for fluentd. If you deliver logs to an S3 bucket and register them in Amazon Elasticsearch with a Lambda function as the trigger, Elasticsearch/Kibana can also be run entirely on Amazon.
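That Lambda-trigger idea can be sketched as follows. This is a hypothetical handler, not a tested deployment: it only extracts the bucket and key of each newly delivered log file from the standard S3 event payload; the actual indexing call into Amazon Elasticsearch would replace the `print`:

```python
def handler(event, context):
    """Hypothetical Lambda handler triggered by S3 PUTs from Firehose.

    Extracts the bucket/key of each delivered log file from the standard
    S3 event structure; indexing into Amazon Elasticsearch would go
    where the print is.
    """
    results = []
    for record in event['Records']:
        bucket = record['s3']['bucket']['name']
        key = record['s3']['object']['key']
        results.append((bucket, key))
        print('new log file: s3://%s/%s' % (bucket, key))
    return results
```

Attach this to the bucket's ObjectCreated events and each 60-second Firehose flush will invoke it with the new file's location.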

However, if Android and JavaScript are supported in the future, it should become usable like a beacon, so I'm looking forward to that.
