Build Elastic Stack with Docker and analyze IIS logs

Introduction

This is my article for day 23 of the Elasticsearch Advent Calendar 2020. I no longer wanted to do the ~~primitive~~ work of pasting IIS logs into Excel to build pivot tables and pivot charts, so I set up the Elastic Stack instead. Analysis became much easier, so I wrote up what I did in this article.

Table of contents

- Overview of each software
- Processing flow
- Building the Elastic Stack with Docker Compose
- Configuration

Overview of each software

elkb.png

- Elastic Stack is the general term for the product family consisting of Elasticsearch, Kibana, Beats, and Logstash.
- Beats are called data shippers and are used as data transfer tools. They automatically detect file updates and transfer the differences. This time we will use Filebeat.
- Logstash is called a data processing pipeline: it can ingest data, transform it, and store it in Elasticsearch.
- Elasticsearch is the well-known full-text search engine. Because it builds an inverted index internally as data is ingested, it can search a large number of documents at high speed.
- Kibana is used as a tool to visualize Elasticsearch data.

Processing flow

- The processing flow is Filebeat -> Logstash -> Elasticsearch -> Kibana.
- Filebeat monitors the IIS logs, and when it detects an update it forwards the new lines to Logstash.
- Logstash converts the data to JSON and submits it to Elasticsearch.
- Kibana visualizes the Elasticsearch data.

Build an Elastic Stack with Docker Compose

Build the stack with Docker Compose. Start the containers by running docker-compose up -d in the directory containing docker-compose.yml.

> docker-compose up -d

Configuration

- Store the IIS logs in ./filebeat/log.
- While the containers are running, the logs are automatically submitted to Elasticsearch.

.
├─docker-compose.yml
├─.env
├─elasticsearch
│  └─data
├─filebeat
│  ├─conf
│  │  └─filebeat.yml
│  └─log
│      └─u_exyyyymmdd.log
└─logstash
    └─pipeline
        └─logstash.conf

docker-compose.yml

- Builds Elasticsearch, Kibana, Logstash, and Filebeat.
- Elasticsearch runs as a single node.
- A local volume is mounted to persist Elasticsearch data. Graphs and dashboards created in Kibana are also stored there.
- Logstash reads its config file from the local directory.
- Filebeat reads its config file from the local directory.
- A volume is mounted so that Filebeat can see the local logs.
- Filebeat apparently refers to the Docker socket, so it is mounted as well.

version: "3"

services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.2.0
    environment:
      - discovery.type=single-node
      - cluster.name=docker-cluster
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms4096m -Xmx4096m"
    ulimits:
      memlock:
        soft: -1
        hard: -1
    ports:
      - 9200:9200
    volumes:
      - ./elasticsearch/data:/usr/share/elasticsearch/data
  kibana:
    image: docker.elastic.co/kibana/kibana:7.2.0
    ports:
      - 5601:5601
  logstash:
    image: docker.elastic.co/logstash/logstash:7.2.0
    ports:
      - 5044:5044
    environment:
      - "LS_JAVA_OPTS=-Xms4096m -Xmx4096m"
    volumes:
      - ./logstash/pipeline:/usr/share/logstash/pipeline
  filebeat:
    image: docker.elastic.co/beats/filebeat:7.2.0
    volumes:
      - ./filebeat/conf/filebeat.yml:/usr/share/filebeat/filebeat.yml
      - ./filebeat/log:/usr/share/filebeat/log
      - /var/run/docker.sock:/var/run/docker.sock
    user: root

.env

Allows Docker for Windows to mount /var/run/docker.sock.

COMPOSE_CONVERT_WINDOWS_PATHS=1

logstash.conf

- Set input to accept transfers from Filebeat.
- Process the IIS logs in the filter block.
- Set output to submit the result to Elasticsearch.

input {
  # input from Filebeat
  beats {
    port => 5044
  }
}

filter {
  # IIS W3C extended logs are space-separated
  dissect {
    mapping => {
      "message" => "%{ts} %{+ts} %{s-ip} %{cs-method} %{cs-uri-stem} %{cs-uri-query} %{s-port} %{cs-username} %{c-ip} %{cs(User-Agent)} %{cs(Referer)} %{sc-status} %{sc-substatus} %{sc-win32-status} %{time-taken}"
    }
  }
  date {
    match => ["ts", "YYYY-MM-dd HH:mm:ss"]
    timezone => "UTC"
  }
  ruby {
    code => "event.set('[@metadata][local_time]',event.get('[@timestamp]').time.localtime.strftime('%Y-%m-%d'))"
  }
  mutate {
    # sc-bytes / cs-bytes take effect only if those fields are actually
    # extracted (they are not part of the dissect mapping above)
    convert => {
      "sc-bytes" => "integer"
      "cs-bytes" => "integer"
      "time-taken" => "integer"
    }
    remove_field => "message"
  }
}
}

output {
  elasticsearch { 
    hosts    => [ 'elasticsearch' ]
    index => "iislog-%{[@metadata][local_time]}" 
  }
}
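To make the pipeline above easier to follow, here is a minimal Python sketch of what the dissect, date, and mutate filters do to one log line, and how the daily index name is derived. The sample log line is hypothetical, and unlike the real ruby filter (which uses the container's local time), the sketch derives the index date from UTC for simplicity.

```python
from datetime import datetime, timezone

# Field names after ts, in the order of the dissect mapping in logstash.conf
FIELDS = ["s-ip", "cs-method", "cs-uri-stem", "cs-uri-query", "s-port",
          "cs-username", "c-ip", "cs(User-Agent)", "cs(Referer)",
          "sc-status", "sc-substatus", "sc-win32-status", "time-taken"]

def parse_iis_line(line):
    """Mimic the Logstash filters: the first two tokens (date and time)
    are joined into 'ts', the rest map to FIELDS in order."""
    tokens = line.split(" ")
    event = {"ts": tokens[0] + " " + tokens[1]}
    event.update(zip(FIELDS, tokens[2:]))
    # date filter: parse ts as UTC into @timestamp
    event["@timestamp"] = datetime.strptime(
        event["ts"], "%Y-%m-%d %H:%M:%S").replace(tzinfo=timezone.utc)
    # mutate convert: time-taken becomes an integer
    event["time-taken"] = int(event["time-taken"])
    # daily index name, as in the output section (iislog-YYYY-MM-dd)
    index = "iislog-" + event["@timestamp"].strftime("%Y-%m-%d")
    return event, index

# hypothetical IIS log line for illustration
line = ("2020-12-23 01:23:45 10.0.0.1 GET /index.html - 80 - 192.168.0.10 "
        "Mozilla/5.0 - 200 0 0 31")
event, index = parse_iis_line(line)
```

Because the index name carries the event date, each day's requests land in their own index, which makes retention (deleting old daily indices) straightforward.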

filebeat.yml

- Set input to reference /usr/share/filebeat/log.
- Since ./filebeat/log is mounted at /usr/share/filebeat/log, storing the IIS logs in ./filebeat/log lets Filebeat pick them up automatically.
- Set output to forward to Logstash.

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /usr/share/filebeat/log/*.log
  exclude_lines: ['^#','HealthChecker']
  
output.logstash:
  hosts: ["logstash:5044"]
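The exclude_lines setting drops the W3C header lines (which start with #) and health-check noise before anything is shipped. A small sketch of that filtering in Python, assuming (as Filebeat does) that the patterns are applied as unanchored regular expressions; the sample lines are hypothetical:

```python
import re

# Patterns from exclude_lines in filebeat.yml
EXCLUDE = [re.compile(p) for p in (r"^#", r"HealthChecker")]

def keep(line):
    """A line is shipped only if no exclude pattern matches it."""
    return not any(p.search(line) for p in EXCLUDE)

lines = [
    "#Fields: date time s-ip cs-method ...",            # W3C header -> dropped
    "2020-12-23 01:23:45 10.0.0.1 GET /index.html ...", # real request -> kept
    "2020-12-23 01:23:46 10.0.0.1 GET /hc - 80 - 10.0.0.2 HealthChecker ...",  # dropped
]
shipped = [l for l in lines if keep(l)]
```

Filtering at the Filebeat side keeps header and health-check lines from ever reaching Logstash, so the dissect filter only sees well-formed request lines.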

Put the IIS log file

Place the IIS log file in ./filebeat/log and Filebeat will detect it and send it to Logstash. The transmitted data is processed by Logstash and submitted to Elasticsearch.
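For a quick end-to-end test, you can drop a small sample file into the watched directory yourself. A sketch (the file name u_ex_sample.log and its contents are hypothetical; any name matching *.log works, since that is the pattern in filebeat.yml):

```python
from pathlib import Path

# Write a minimal sample IIS log file where Filebeat is watching.
log_dir = Path("./filebeat/log")
log_dir.mkdir(parents=True, exist_ok=True)
sample = "\n".join([
    "#Fields: date time s-ip cs-method ...",  # header line, excluded by Filebeat
    "2020-12-23 01:23:45 10.0.0.1 GET /index.html - 80 - 192.168.0.10 "
    "Mozilla/5.0 - 200 0 0 31",
])
(log_dir / "u_ex_sample.log").write_text(sample + "\n", encoding="utf-8")
```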

Visualize with Kibana

Elasticsearch Index Check

Go to http://localhost:5601.

00.png

Click the gear icon, then click Elasticsearch/Index Management.

01.png

Make sure that the IIS log is indexed.

Create Kibana Index Pattern

Click Kibana/Index Patterns, then click Create Index pattern.

04.png

Enter the Index pattern and click Next step.

02.png

Select @timestamp for the Time Filter field name and click Create index pattern.

03.png

Select the Index pattern created here and create a graph.

04.png

Graph creation

05.png

Specify the Index pattern you created earlier.

06.png

Narrow down the display period on the upper right.

07.png

Specify the X axis. Set Aggregation to Date Histogram, Field to @timestamp, Minimum interval to Minute, and click ▷.

08.png

The graph now shows the number of requests per minute. To display the number of requests per minute for each function, click Add filter and specify Field (the item you want to filter), Operator (operator), and Value (value).

_09.png

_10.png

Dashboard creation

You can arrange the created graphs on the dashboard.

_kibana_dashboard.png
