I'm running various servers, and I want to plot which countries they are being accessed from, using Elasticsearch + Kibana + Python3. However, I'm not using Logstash here (I can't use it for certain reasons), which is not the usual setup.
Environment: Ubuntu 18.04, Elasticsearch 7.4, Kibana 7.4
sudo pip3 install python-geohash
sudo pip3 install geoip2
sudo pip3 install elasticsearch
sudo pip3 install pytz
Download "GeoLite 2 City (MAXMIND DB version)" from the MAXMIND page MAXMIND
Log file example
DATE,IP,SRC_PORT,DST_PORT,SIZE,DATA
2019/11/05 19:00:00,1.2.3.4,44455,80,180,474554202F20485454502F312E31...
2019/11/05 19:00:00,2.3.4.5,44456,80,180,474554202F20485454502F312E31...
...
I want to plot these IPs on Kibana as location information ...
For Kibana to recognize the geohash field as location information, the mapping must declare it with "type": "geo_point". Open the Dev Tools screen in Kibana and run the following query.
PUT geoip_map
{
  "mappings": {
    "properties": {
      "add_time": {
        "type": "date",
        "format": "yyyy/MM/dd HH:mm:ss||yyyy/MM/dd||epoch_millis"
      },
      "data": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "geohash": {
        "type": "geo_point"
      },
      "ip": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "latitude": {
        "type": "float"
      },
      "longitude": {
        "type": "float"
      },
      "src_port": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "dst_port": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "size": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      }
    }
  }
}
The index is now ready to receive documents.
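If you want to double-check the mapping from Python before indexing anything, something like this should do (a quick sketch, assuming Elasticsearch is reachable on 127.0.0.1:9200 as in the script below):

from elasticsearch import Elasticsearch

es = Elasticsearch("127.0.0.1:9200")

# Read the mapping back and confirm that geohash is really a geo_point
mapping = es.indices.get_mapping(index="geoip_map")
print(mapping["geoip_map"]["mappings"]["properties"]["geohash"])
# Expected: {'type': 'geo_point'}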
import pytz
import geoip2.database
import geohash
from datetime import *
from elasticsearch import Elasticsearch

ELASTIC_HOST = "127.0.0.1"
elasticobj = Elasticsearch(ELASTIC_HOST + ":9200")

# Read the whole log file and split it into lines
with open("logfile.txt", "r") as f:
    logdata = f.read()
loglst = logdata.replace('\r', '').split('\n')

reader = geoip2.database.Reader('GeoLite2-City.mmdb')

for line in loglst:
    # Skip empty lines such as the trailing newline at the end of the file
    if not line:
        continue
    cols = line.split(',')
    # Set information such as latitude and longitude using geoip2
    resp = reader.city(cols[1])
    # Calculate the geohash (precision is the library default)
    str_geohash = geohash.encode(resp.location.latitude, resp.location.longitude)
    # Registering latitude/longitude is not required, but this code stores the raw values too.
    # Adjust the column indexes to match your actual log data;
    # replace() normalizes dash-formatted dates to the yyyy/MM/dd format of the mapping.
    elasticobj.index(index="geoip_map", doc_type="_doc", body={
        "geohash": str_geohash, "latitude": resp.location.latitude,
        "longitude": resp.location.longitude,
        "add_time": cols[0].replace("-", "/"), "ip": cols[1],
        "src_port": cols[2], "dst_port": cols[3],
        "size": cols[4], "data": cols[5]})

reader.close()
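Calling index() once per line is fine for small logs, but if you have a lot of lines, the bulk helper that ships with the elasticsearch package should be much faster. Here is a sketch of the same loop rewritten with helpers.bulk (same assumptions about logfile.txt and the column layout):

from elasticsearch import Elasticsearch, helpers
import geoip2.database
import geohash

elasticobj = Elasticsearch("127.0.0.1:9200")
reader = geoip2.database.Reader('GeoLite2-City.mmdb')

def generate_actions(path):
    # Yield one bulk action per log line instead of one index() call per document
    with open(path, "r") as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            cols = line.split(',')
            resp = reader.city(cols[1])
            yield {
                "_index": "geoip_map",
                "_source": {
                    "geohash": geohash.encode(resp.location.latitude, resp.location.longitude),
                    "latitude": resp.location.latitude,
                    "longitude": resp.location.longitude,
                    "add_time": cols[0],
                    "ip": cols[1],
                    "src_port": cols[2],
                    "dst_port": cols[3],
                    "size": cols[4],
                    "data": cols[5],
                },
            }

helpers.bulk(elasticobj, generate_actions("logfile.txt"))
reader.close()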
Making geoip_map visible in Kibana is easy.
Go to Management ⇒ Index Patterns, enter a pattern such as "geoip*", and click the Create button.
The date/time field should be detected automatically, so select add_time and click the Create button.
Now you can see the data on the Discover screen.
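If nothing shows up in Discover, check the time range first. You can also confirm from Python that the documents were actually indexed (a quick sketch, same host assumption as above):

from elasticsearch import Elasticsearch

es = Elasticsearch("127.0.0.1:9200")

# How many documents did the script index?
print(es.count(index="geoip_map")["count"])

# Peek at one document to confirm the geohash field is filled in
hits = es.search(index="geoip_map", size=1)["hits"]["hits"]
print(hits[0]["_source"] if hits else "no documents yet")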
Click Maps on the next screen (you can do much the same with Region Map and Coordinate Map).
Press Add layer and choose the option for using documents from an Elasticsearch index. Where it says Select index pattern, choose the pattern you just created ("geoip*" etc.). If you need to specify a location field, select the geohash field.
You can plot like this.
That's it for now! It looks like there are even nicer map options, so I'll try those later.