Hello! I'm an engineer in charge of the product inspection process in the production engineering department. This article is a continuation of Analyzing and visualizing csv logs with Excel Elastic Stack (docker-compose) -- What is Elastic Stack.
This article is intended for those who are new to Elastic Stack and who are thinking about trying it out.
The goal this time is to parse a log timestamp in the following format and assign it to @timestamp:
Date,2020/10/30,12:20:50
I have put the full set of configuration files on GitLab, so please refer to them. The repository is here -> elastic-stack
First of all, here is what the relevant standard grok-patterns look like. DATE puts the fields in US or EU order rather than starting with the year, and TIMESTAMP_ISO8601 expects hyphen-separated dates, so neither matches the format above. The official description of custom patterns is here.
# Months: January, Feb, 3, 03, 12, December
MONTHNUM (?:0?[1-9]|1[0-2])
MONTHDAY (?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9])
# Years?
YEAR (?>\d\d){1,2}
# datestamp is YYYY/MM/DD-HH:MM:SS.UUUU (or something like it)
DATE_US %{MONTHNUM}[/-]%{MONTHDAY}[/-]%{YEAR}
DATE_EU %{MONTHDAY}[./-]%{MONTHNUM}[./-]%{YEAR}
TIMESTAMP_ISO8601 %{YEAR}-%{MONTHNUM}-%{MONTHDAY}[T ]%{HOUR}:?%{MINUTE}(?::?%{SECOND})?%{ISO8601_TIMEZONE}?
DATE %{DATE_US}|%{DATE_EU}
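Lining the sample timestamp up against these patterns shows why none of them fit (this is my reading of the regexes above, not something from the original repository):

# Sample timestamp:      2020/10/30,12:20:50
# DATE_US (MM/DD/YYYY):  expects the month first, so it cannot read "2020/10/30" as intended
# DATE_EU (DD/MM/YYYY):  at best matches "20/10/30" inside the string, treating "30" as the year
# TIMESTAMP_ISO8601:     wants the "2020-10-30T12:20:50" style with hyphens, not "/" and ","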
So we prepare DATE_JP with the components in year/month/day order, and define TIMESTAMP_JP as DATE_JP joined to the standard TIME pattern by a comma.
logstash/extra_patterns/date_jp
DATE_JP %{YEAR}[/-]%{MONTHNUM}[/-]%{MONTHDAY}
TIMESTAMP_JP %{DATE_JP}[,]%{TIME}
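Against the sample line, the pieces of TIMESTAMP_JP line up like this:

# Sample line: Date,2020/10/30,12:20:50
#   %{YEAR}     -> 2020
#   %{MONTHNUM} -> 10
#   %{MONTHDAY} -> 30
#   [,]         -> ,
#   %{TIME}     -> 12:20:50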
Mount the created date_jp file at /opt/logstash/extra_patterns in docker-compose.yml.
docker-compose.yml
  logstash01:
    build: ./logstash
    container_name: logstash01
    links:
      - es01:elasticsearch
    volumes:
      - ./logstash/config/logstash.yml:/usr/share/logstash/config/logstash.yml
      - ./logstash/config/jvm.options:/usr/share/logstash/config/jvm.options
      - ./logstash/config/log4j2.properties:/usr/share/logstash/config/log4j2.properties
      - ./logstash/config/pipelines.yml:/usr/share/logstash/config/pipelines.yml
      - ./logstash/pipeline/logstash.conf:/usr/share/logstash/pipeline/logstash.conf
      - ./logstash/extra_patterns/date_jp:/opt/logstash/extra_patterns
    networks:
      - esnet
Set the mounted extra_patterns path in patterns_dir, and capture the timestamp into a read_timestamp field using the custom pattern TIMESTAMP_JP.
logstash.conf
filter {
  grok {
    patterns_dir => ["/opt/logstash/extra_patterns"]
    match => { "message" => "%{TIMESTAMP_JP:read_timestamp}" }
  }
}
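If you want to try the pattern on its own, a throwaway config like the one below is enough (my own sketch, not part of the repository: it reads lines from stdin and prints the parsed event with the rubydebug codec). It can be run with bin/logstash -f grok_test.conf, with patterns_dir adjusted to wherever date_jp lives in that environment.

grok_test.conf
input { stdin {} }

filter {
  grok {
    # point at the directory (or file) holding the custom patterns
    patterns_dir => ["/opt/logstash/extra_patterns"]
    match => { "message" => "%{TIMESTAMP_JP:read_timestamp}" }
  }
}

output { stdout { codec => rubydebug } }

Pasting Date,2020/10/30,12:20:50 should show read_timestamp => "2020/10/30,12:20:50" as a plain string, while @timestamp is still just the time Logstash processed the event.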
Since Logstash captures grok fields as strings by default, read_timestamp is also a string and is not recognized as a timestamp. We therefore convert it to a date type with the date filter: set the filter's timezone to Asia/Tokyo and specify @timestamp as the target.
logstash.conf
filter {
  grok {
    patterns_dir => ["/opt/logstash/extra_patterns"]
    match => { "message" => "%{TIMESTAMP_JP:read_timestamp}" }
  }
  date {
    match => ["read_timestamp", "yyyy/MM/dd,HH:mm:ss"]
    timezone => "Asia/Tokyo"
    target => "@timestamp"
  }
}
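For the sample line, the end result should look roughly like this (Asia/Tokyo is UTC+9, so the UTC value stored in @timestamp is nine hours earlier):

# Input line:     Date,2020/10/30,12:20:50
# read_timestamp => "2020/10/30,12:20:50"       (string captured by grok)
# @timestamp     => 2020-10-30T03:20:50.000Z    (date; 12:20:50 JST converted to UTC)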
This article described how to parse dates and set the timezone using custom grok patterns. Next time, I would like to introduce how to handle csv files.