Build an ELK stack on Mac OS X and output Java logs to Elasticsearch (log4j2)

Hi everyone, long time no see.

ELK stack construction

Preparation

Homebrew and the Java JDK are required to install the ELK stack. If you do not have them yet, install them first.

Homebrew

I installed it under my working path:

$ mkdir homebrew && curl -L https://github.com/Homebrew/brew/tarball/master | tar xz --strip 1 -C homebrew

Java JDK

Please refer to Oracle's **Java SE Downloads**. I chose JDK 8 just in case.

ELK

What is ELK

The ELK stack is a general term for three products from Elastic: Elasticsearch (analysis) + Logstash (collection) + Kibana (visualization). (When it actually runs, the data flows in L -> E -> K order.)

Elasticsearch

Elasticsearch is an open-source full-text search engine developed by Elastic. It can quickly extract the documents that contain a desired word from a large collection of documents.

You can't use SQL statements, because Elasticsearch is not a relational database; you operate it through a RESTful interface instead.
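For instance, basic document operations map to plain HTTP requests. A sketch of the shape of those calls (the index name `app-logs` and the document body are hypothetical, just for illustration):

```
PUT    /app-logs/_doc/1                    {"level": "ERROR", "message": "Error message"}   # index a document
GET    /app-logs/_doc/1                                                                     # fetch it by id
GET    /app-logs/_search?q=message:error                                                    # full-text search
DELETE /app-logs/_doc/1                                                                     # delete it again
```

Each of these can be issued with `curl` against http://localhost:9200 once Elasticsearch is running.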

This time, Elasticsearch 6.2.4 will be used.

Installation

$ brew install elasticsearch

Set up the host

First, locate `elasticsearch.yml`:

$ brew info elasticsearch

You should see output like this:

...(Blah Blah Blah)...
Data:    /usr/local/var/lib/elasticsearch/elasticsearch_JP23417/
Logs:    /usr/local/var/log/elasticsearch/elasticsearch_JP23417.log
Plugins: /usr/local/var/elasticsearch/plugins/
Config:  /usr/local/etc/elasticsearch/
...(Blah Blah Blah)...

`elasticsearch.yml` is under the path shown next to Config. Open it and set `network.host` to the IP address you actually use. This time I am building everything locally, so I uncommented the line and set it like this:

network.host: localhost

Operation check

Start Elasticsearch with `brew services start elasticsearch`. Elasticsearch runs on port 9200 by default, so if you open http://localhost:9200 you'll see this information:

{
  "name" : "ry1EXQC",
  "cluster_name" : "elasticsearch",
  "cluster_uuid" : "ZMYVLV-eR-G5hBrQ6QmSGA",
  "version" : {
    "number" : "6.2.4",
    "build_hash" : "ccec39f",
    "build_date" : "2018-04-12T20:37:28.497551Z",
    "build_snapshot" : false,
    "lucene_version" : "7.2.1",
    "minimum_wire_compatibility_version" : "5.6.0",
    "minimum_index_compatibility_version" : "5.0.0"
  },
  "tagline" : "You Know, for Search"
}

Logstash

Logstash is an open-source server-side data processing pipeline. It ingests data from a huge number of sources simultaneously, transforms it, and sends it to the stash of your choice. The recommended stash is, of course, Elasticsearch.

This time, Logstash 6.2.4 will be used.

Install and launch

$ brew install logstash
$ brew services start logstash

Logstash is driven entirely by its config files; you place a different config for each use case. The setup so far is enough to get Kibana running, so the config details are explained later.
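To give a feel for what such a config looks like, here is a minimal, hypothetical pipeline, unrelated to the one used later in this article, that simply reads lines from stdin and prints structured events to stdout:

```
input {
    stdin { }
}

output {
    stdout { codec => rubydebug }
}
```

The real config for this article (TCP input, Elasticsearch output) follows the same input/filter/output shape.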

Kibana

Kibana is a simple visualization tool for Elasticsearch data.

This time I will use Kibana 6.2.4.

Installation

$ brew install kibana

Set port and Elasticsearch

First, locate `kibana.yml`:

$ brew info kibana

You should see output like this:

...(Blah Blah Blah)...
Config: /usr/local/etc/kibana/
...(Blah Blah Blah)...

`kibana.yml` is under the path shown next to Config. Open it and set `server.port` and `elasticsearch.url` to the values you actually use. This time I kept the defaults:

...(Blah Blah Blah)...
server.port: 5601 

...(Blah Blah Blah)...

elasticsearch.url: "http://localhost:9200"
...(Blah Blah Blah)...

Verification

If everything worked, you should be able to open http://localhost:5601/status and see a screen like this. (Screenshot: Kibana status page)

Log output from Java (log4j2)

This time, we will create a small tool that sends logs to Elasticsearch over TCP.

Create Maven project

Most IDEs can generate a Maven project automatically, so please use that; any standard layout is fine. For reference: (Screenshot: generated project layout) The package name this time is `log4j2.local` (in practice, anything is fine).

Maven library placement

In other words, the contents of `pom.xml`. Add the following `<dependencies>` section:

pom.xml


<dependencies>
    <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-api</artifactId>
        <version>2.8.2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-core</artifactId>
        <version>2.8.2</version>
    </dependency>
</dependencies>

Placement of Log4j2

In other words, the contents of `log4j2.xml`. Log4j2 decides where to send the logs based on this file. This time the logs will of course be sent to Elasticsearch via Logstash, but for ease of debugging I also made them go to the console.

log4j2.xml


<configuration status="OFF">
    <appenders>
        <Socket name="Logstash" host="localhost" port="9601" protocol="TCP">
            <PatternLayout pattern="%d|%t|%p|%c|%L|%m%n" />
        </Socket>
        <Console name="Console" target="SYSTEM_OUT">
            <PatternLayout pattern="%d|%t|%p|%c|%L|%m%n" />
        </Console>
    </appenders>
    <loggers>
        <root level="all">
            <AppenderRef ref="Logstash"/>
            <AppenderRef ref="Console"/>
        </root>
    </loggers>
</configuration>

If you want to customize it further, please refer to **this page**.

Main class

This is the behavior to output the log.

Main.java


package log4j2.local;

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class Main {
    // The logger name "test" appears in the %c (logger) field of the pattern layout
    private static final Logger logger = LogManager.getLogger("test");

    public static void main(String[] args) {
        logger.error("Error message");
    }
}

When you run this, you'll see a log line like the following in the console. (Of course, nothing is stored in Elasticsearch yet, because Logstash is not connected.)

2018-09-12 12:55:13,962|main|ERROR|test|13|Error message

Place Logstash

Logstash config

Create `logstash-tcp.conf` in any path you like. (I named it this way simply because it uses TCP communication.) By default, Logstash can use any of the ports 9600-9700, so this time we will use port 9601. (**Be sure to use the same port** as in the `<Socket>` tag of `log4j2.xml`.)

logstash-tcp.conf


input {
    tcp {
        host => "localhost"
        port => "9601"
        mode => "server"
        type => "eslocallogger"
    }
}

filter {
    mutate{
        split => ["message","|"]
        add_field =>   {
            "Datetime" => "%{[message][0]}"
        }
        add_field =>   {
            "Classname" => "%{[message][1]}"
        }
        add_field =>   {
            "Level" => "%{[message][2]}"
        }
        add_field =>   {
            "Logger" => "%{[message][3]}"
        }
        add_field =>   {
            "Message" => "%{[message][5]}"
        }
    }
}
output {
    stdout {
        codec => rubydebug
    }
    elasticsearch {
        hosts => "localhost:9200"
    }
}

Here I used a filter to split the message into named fields, but even without it the default fields are parsed automatically. (The log lines arriving at Logstash are separated by `|`.) Note that with the `%d|%t|%p|%c|%L|%m` pattern, `%{[message][1]}` is actually the thread name (so the `Classname` field above really holds the thread) and `%{[message][4]}` is the source line number.
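The same split can be reproduced from the shell, which is handy for checking field indexes before editing the filter. A sketch using the log line from the console output above:

```shell
# Sample line in the %d|%t|%p|%c|%L|%m layout from log4j2.xml
line='2018-09-12 18:05:39,737|main|ERROR|test|13|Error message'

# Split on "|" just like the mutate filter; note that awk fields are
# 1-based while Logstash's %{[message][N]} indexes are 0-based.
echo "$line" | awk -F'|' '{
    print "[0] datetime = " $1
    print "[1] thread   = " $2
    print "[2] level    = " $3
    print "[3] logger   = " $4
    print "[4] line     = " $5
    print "[5] message  = " $6
}'
```

Running this prints each `%{[message][N]}` index next to the value the filter would assign it.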

Connect Logstash to config and restart

$ brew services stop logstash
$ logstash -f ./logstash-tcp.conf --experimental-java-execution

With the `--experimental-java-execution` flag, Logstash runs on the new Java execution engine. For more information, see **Meet the New Logstash Java Execution Engine**. It does indeed increase throughput considerably.

Let's run

Console

2018-09-12 18:05:39,737|main|ERROR|test|13|Error message

Logstash Console

[2018-09-12T18:05:40,464][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x5caa3237 sleep>"}
{
         "Level" => "ERROR",
    "@timestamp" => 2018-09-12T09:05:39.847Z,
        "Logger" => "test",
          "port" => 60614,
       "message" => [
        [0] "2018-09-12 18:05:39,737",
        [1] "main",
        [2] "ERROR",
        [3] "test",
        [4] "13",
        [5] "Error message"
    ],
          "type" => "eslocallogger",
       "Message" => "Error message",
      "Datetime" => "2018-09-12 18:05:39,737",
          "host" => "localhost",
      "@version" => "1",
     "Classname" => "main"
}

Kibana

If you go to Kibana's **Management** screen and reload, you should see a screen like this. (Screenshot: index pattern creation) Go to Next step, set the filter to `@timestamp`, and click Create index pattern to start visualizing the index. (Screenshot: index pattern settings) When you open the Discover screen, you can see the data on a time axis. (Screenshot: Discover screen) For details, please refer to **How to use Kibana**.

Reference source

Installing the ELK Stack on Mac OS X
ElasticSearch + Logstash + Kibana + log4j2 (official 6.1.1 edition)
Using the LogStash Filter
Elastic Stack
