Let's estimate a city's development status from satellite images.

Overview

So far, we have acquired observation images from the **Copernicus Open Access Hub**, the distribution hub for Sentinel satellite imagery, and used them to track the occupancy of the Tokyo Disneyland parking lots and the number of vessels in Tokyo Bay.

I checked the usage status of the parking lot from satellite images.
I tried to find the trend of the number of ships in Tokyo Bay from satellite images.

This time, as before, we will obtain satellite images from **Google Earth Engine** and estimate how the suburbs of Tokyo have developed over the past five years.

For how to use Google Earth Engine (hereinafter GEE), please refer to the articles above. **If you have a Google account, you can use it from Google Colaboratory**, so no difficult environment setup is required and you can try it fairly easily.

Screenshot from 2020-09-22 10-44-47.png

This is a composite of Sentinel-1's **observation image from September 12, 2020** and an **observation image from September 21, 2015**, about five years earlier. The past image is assigned to red and the recent image to blue. Several relatively large blue patches are visible around Odaiba and Tokyo Disneyland, and it is estimated that these places have been **developed over the last five years**.

I will explain how this composite image was created. The final output is a kmz file for viewing the above satellite imagery on Google Earth. The file is available here, so if you want to try it in your own Google Earth, please download it. The code introduced in this article is also available on GitHub, so please use it if you want to try it yourself. **Google Colaboratory is used**, so you can run it on any PC as long as you have a network connection.

1. Introduction

This time we use images from a **SAR satellite**, a satellite that carries a radar (radio wave) sensor. SAR images are **difficult to interpret at the level of individual structures**, but you can qualitatively judge whether many structures are present, and observation is stable because it is unaffected by clouds. Note, however, that resolution differs even among SAR images, and a higher-resolution SAR image is preferable where available. The density of structures can be estimated to some extent from a SAR image. Since I am not much of an illustrator, I will borrow a figure from a related article.

Screenshot from 2020-08-16 10-34-57.png
Kihon of satellite data ~ Understanding, type, frequency, resolution, application examples ~

A SAR satellite irradiates the ground with radio waves and images the intensity of the reflected signal. The **radio waves are emitted obliquely with respect to the ground surface**, so if the surface is flat, as in the figure, most of the energy is reflected away from the satellite, in the direction opposite to the incident wave. Therefore, **where the target surface is flat, the received signal is weak and the image is dark; water surfaces such as lakes and the sea are extremely flat, so they appear darkest**. On the other hand, where there are structures such as buildings or cars, the surface is uneven, more energy is scattered back toward the satellite, and the image appears bright. For example, a radar image of Tokyo Disneyland looks like this.

Screenshot from 2020-08-16 10-41-01.png

The Disneyland facilities are clearly visible. As you will know if you have been there, the park is surrounded by a parking lot, which is wide and flat and therefore appears dark. According to the Disneyland website, a large-scale expansion of Tokyo DisneySea is planned on what was a parking lot in 2015, and the new attraction was under construction there at the time of this image. In addition, Odaiba and the bay area are being developed for events such as the Tokyo Olympics, and you may be able to see this in the satellite images. The satellites used this time are Sentinel-1 and Sentinel-2, developed and operated by the European Space Agency. For details on each, see the following article.

How to get the latest satellite imagery for free.

The observation images used for this analysis are from Sentinel-1, a SAR satellite. Sentinel-1 is a constellation of two satellites: **each has a repeat cycle of 12 days** (meaning it passes over the same area under the same conditions every 12 days), but with the two satellites together, many areas are observed **once every 6 days**. This **regular observation** is a major feature of the Sentinel satellites and an advantage for users.

Screenshot from 2020-08-16 10-50-29.png
[Sentinel-1 Observation Scenario](https://sentinel.esa.int/web/sentinel/missions/sentinel-1/observation-scenario)

In addition, the observation time of a satellite is nearly fixed: **around 10:30 a.m. for optical satellites, and around 6:00 a.m. and 6:00 p.m. for SAR satellites**.

Can you see people from artificial satellites? ~ By satellite, ground resolution / local time summary ~

Screenshot from 2020-09-22 11-03-12.png

For Sentinel-1, introduced above, the observation time is around 6 a.m. for satellite A and around 6 p.m. for satellite B. The two satellites illuminate the ground from different directions, so comparing an image from one satellite with an image from the other would pick up that difference as well, which is not appropriate. Therefore, this time we use only satellite B images: we acquire a past image to compare against a recent one and create a composite of the two. The purpose of the composite is to make changes visually easy to find. With the recent image in blue and the past image in red, a part that appears blue means the signal has become stronger, and a part that appears red means it has become weaker. Since signal strength depends on the presence and density of buildings, **blue indicates that development has progressed** and **red indicates that structures have disappeared** (the area was flattened).
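To illustrate the idea of this red/blue composite, here is a toy sketch with synthetic arrays (not the actual observations): a "structure" appears between the past and recent scenes, and shows up as a blue-green patch in the composite.

import numpy as np
import matplotlib.pyplot as plt

#Toy illustration with synthetic data (not the actual satellite images):
#a structure appears between the "past" and "recent" scenes.
past = np.zeros((100, 100))
recent = np.zeros((100, 100))
recent[40:60, 40:60] = 1.0  #new structure -> stronger backscatter in the recent scene

#Past image in the red channel, recent image in the green and blue channels.
rgb = np.dstack((past, recent, recent))

plt.imshow(rgb)  #the new structure appears as a blue-green (cyan) patch
plt.show()

Next, we acquire satellite images from GEE and evaluate them.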

2. Acquisition and evaluation of satellite images

2.1 Environment preparation

I have introduced this in previous articles, but some readers may be new, so I repeat it here; if you already know it, please skip to 2.4. For details on how to acquire satellite images with GEE, see the following article.

Artificial satellite image analysis by Google Earth Engine and Google Colab - Satellite image analysis starting for free (Introduction) -

**It is free and easy to use if you have a Google account.** It has become an amazing world. Acquired data such as satellite images are saved to Google Drive. Since code and data are deleted every time a Google Colaboratory session ends, it is convenient to keep the data in your own Google Drive. However, the free tier of Google Drive is 15 GB, which runs out quickly when you download large satellite images, so take care to delete files as appropriate. Now, let me introduce the code.

import ee
import numpy as np
import matplotlib.pyplot as plt

ee.Authenticate()
ee.Initialize()

First, run this cell to authenticate the connection to GEE. When you run it, a link is returned; click it, complete the authentication procedure, then copy the access code and paste it in.

Next, authenticate the connection to Google Drive. The flow is the same as for the GEE authentication.

from google.colab import drive
drive.mount('/content/drive')

Next, we install and import the modules needed to view the acquired satellite images and to process and analyze them.

#Package installation&import
!pip install rasterio
import numpy as np
import pandas as pd  #used later for handling the polygon coordinates
import matplotlib.pyplot as plt
import rasterio

import json
import os
import glob

import time
from datetime import datetime
from dateutil.parser import parse

Frequently used modules are preinstalled in Google Colaboratory, so they need no extra work, but this time we handle GeoTIFF, an image format with embedded map information, so we **install rasterio** for the image processing.
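As a quick check of what that map information looks like, rasterio lets you inspect the georeferencing of a GeoTIFF; a minimal sketch (the file name here is only a placeholder):

import rasterio

#Sketch: inspect the map information embedded in a downloaded GeoTIFF
#(the file name is a placeholder).
with rasterio.open('example_scene.tif') as src:
    print(src.crs)     #coordinate reference system, e.g. EPSG:4326
    print(src.bounds)  #bounding box in map coordinates
    print(src.count, src.width, src.height)  #number of bands and image size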

Next, install **folium**, a module for checking the selected target area on a map.

!pip install folium

import folium

Now that the environment is ready, let's acquire a satellite image from GEE.

2.2 Setting the area of interest (target area)

To acquire satellite images from GEE, you need to enter the **latitude and longitude** of the target area you are interested in. You may vaguely remember Japan's latitude and longitude from geography class or a globe, but for the coordinates of, say, your own home, most people would have to search around for a while (of course, I don't remember mine either). I therefore made the following tool so that the latitude and longitude of an area of interest can be looked up and obtained easily.

#Acquisition of polygon information of the area of interest.
from IPython.display import HTML
HTML(r'<iframe width="1000" height="580" src="https://gispolygon.herokuapp.com/" frameborder="0"></iframe>')

When you execute this, the following screen will be displayed.

Screenshot from 2020-08-16 11-17-18.png

After zooming in on your area of interest, select the square polygon tool from the icons on the left and draw a polygon over the area. Then click **Show Features** to display the polygon's geographic information in the window on the right, and click **Copy** at the bottom to copy it. For the Tokyo Bay area targeted this time, it looks like this.

Screenshot from 2020-09-22 11-07-07.png

Then paste the copied geographic information in as follows.

A = {"type":"FeatureCollection","features":[{"properties":{"note":"","distance":"56641.31 m","drawtype":"rectangle","area":"38085.35 ha"},"type":"Feature","geometry":{"type":"Polygon","coordinates":[[[139.7144973278046,35.51642373317183],[139.7144973278046,35.6731886200871],[139.95559573173526,35.6731886200871],[139.95559573173526,35.51642373317183],[139.7144973278046,35.51642373317183]]]}}]}

This geographic information is processed into the input format for GEE and for display in Folium.

#Set any file name to be used in the future. For example, the name of the area.
object_name = 'TokyoBay_RGB'
with open(str(object_name) +'_2.geojson', 'w') as f:
    json.dump(A, f)

json_file = open(str(object_name) +'_2.geojson')
json_object = json.load(json_file)

#Extract only the latitude / longitude information of the area of interest from json.
AREA = json_object["features"][0]["geometry"]['coordinates'][0]

area = pd.DataFrame(AREA, columns=['longtitude', 'latitude'])

area_d =[[area['longtitude'].min(), area['latitude'].max()],
 [area['longtitude'].max(), area['latitude'].max()],
 [area['longtitude'].max(), area['latitude'].min()],
 [area['longtitude'].min(), area['latitude'].min()],
 [area['longtitude'].min(), area['latitude'].max()]]

AREA = area_d

Now, let's check the set area of interest.

m = folium.Map([(AREA[0][1]+AREA[len(AREA)-2][1])/2,(AREA[0][0]+AREA[len(AREA)-3][0])/2], zoom_start=11)

folium.GeoJson(str(object_name) +'_2.geojson').add_to(m)
m

Output:

Screenshot from 2020-09-22 11-05-54.png

2.3 Acquisition of satellite images from GEE

GEE hosts a large number of satellite images and already-analyzed datasets. For details, see the Data Catalog. The Sentinel-1 and 2 entries are as follows.

Sentinel-1 SAR GRD: C-band Synthetic Aperture Radar Ground Range Detected, log scaling
Sentinel-2 MSI: MultiSpectral Instrument, Level-1C

According to these pages, **Sentinel-1 observation images are available from October 3, 2014**, and **Sentinel-2 data from June 23, 2015**. For Sentinel-2, an atmospherically corrected Level-2A product with the atmospheric haze removed is also provided, but it only covers images from March 28, 2017, when that processing became standard. If you want to use older observation images, use the Level-1C product above.
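For reference, here is a minimal sketch of selecting that Level-2A product; the GEE collection ID is COPERNICUS/S2_SR, and the sketch assumes the region geometry defined in the next step.

#Sketch: the atmospherically corrected Level-2A collection.
#COPERNICUS/S2_SR only contains scenes from 2017-03-28 onward.
Sentinel2_sr = ee.ImageCollection('COPERNICUS/S2_SR').filterBounds(region).filterDate('2017-03-28', '2020-09-20').filterMetadata('CLOUDY_PIXEL_PERCENTAGE','less_than', 10).select(['B4','B3','B2'])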

Now, let's acquire the Sentinel-1 and 2 images from GEE and save them to Google Drive.

First, prepare the format of the geographic information to be set in GEE.

region=ee.Geometry.Rectangle(area['longtitude'].min(),area['latitude'].min(), area['longtitude'].max(), area['latitude'].max())

Next, set the acquisition parameters: the observation period and the destination folder for the images. Since we want to acquire satellite images from different dates, we first acquire the most recent one; here, the reference date is September 20, 2020.

#Specify the reference date
import datetime

date = datetime.date(2020, 9, 20)

dt_now = datetime.date.today()
dt = dt_now - date

if dt.days < 6:
  dt_nd = datetime.timedelta(days=12)
  dt_pd = datetime.timedelta(days=0)
else:
  dt_nd = datetime.timedelta(days=6)
  dt_pd = datetime.timedelta(days=6)

#Observation period
from_date= date - dt_nd
to_date= date + dt_pd

from_date = from_date.strftime('%Y-%m-%d')
to_date = to_date.strftime('%Y-%m-%d')


#Folder name to save
dir_name_s1 = 'GEE_Sentinel1_' + object_name
dir_name_s2 = 'GEE_Sentinel2_' + object_name

Here, since Sentinel-1 observes a given area once every 12 days per satellite, if the specified date is close to the analysis date we search the 12 days before it, and otherwise we search the 6 days before and after it.

Now, let's set the image conditions for Sentinel-1 and 2. The Sentinel-2 images are not used this time, but we acquire them for reference.

def cloudMasking(image):
    qa = image.select('QA60')
    cloudBitMask = 1 << 10  
    cirrusBitMask = 1 << 11
    mask = qa.bitwiseAnd(cloudBitMask).eq(0).And(qa.bitwiseAnd(cirrusBitMask).eq(0))
    return image.updateMask(mask).divide(10000)

def ImageExport(image,description,folder,region,scale):
    task = ee.batch.Export.image.toDrive(image=image,description=description,folder=folder,region=region,scale=scale)
    task.start()

#To compare images observed under the same conditions, use only Ascending or Descending passes.
Sentinel1 = ee.ImageCollection('COPERNICUS/S1_GRD').filterBounds(region).filterDate(parse(from_date),parse(to_date)).filter(ee.Filter.eq('orbitProperties_pass', 'ASCENDING')).select(['VV'])
Sentinel2 = ee.ImageCollection('COPERNICUS/S2').filterBounds(region).filterDate(parse(from_date),parse(to_date)).filterMetadata('CLOUDY_PIXEL_PERCENTAGE','less_than', 10).map(cloudMasking).select(['B4','B3','B2'])

imageList_s1 = Sentinel1.toList(300)
imageList_s2 = Sentinel2.toList(300) 

Here, since only **satellite B observation images** are used for Sentinel-1, 'ASCENDING' is selected for 'orbitProperties_pass'. If you set it to 'DESCENDING' instead, you get the observations from around 6 a.m.
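For reference, a sketch of the same query for the morning (descending) passes; only the orbit filter changes:

#Sketch: the same collection filtered to the ~6 a.m. (descending) passes.
Sentinel1_desc = ee.ImageCollection('COPERNICUS/S1_GRD').filterBounds(region).filterDate(parse(from_date),parse(to_date)).filter(ee.Filter.eq('orbitProperties_pass', 'DESCENDING')).select(['VV'])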

Now, with the above acquisition conditions, we acquire the Sentinel-1 and Sentinel-2 images of the area of interest.

for i in range(imageList_s1.size().getInfo()):
    image = ee.Image(imageList_s1.get(i))
    ImageExport(image.reproject(crs='EPSG:4326',scale=10),image.get('system:index').getInfo(),dir_name_s1,region['coordinates'][0],10)
for i in range(imageList_s2.size().getInfo()):
    image = ee.Image(imageList_s2.get(i))
    ImageExport(image.reproject(crs='EPSG:4326',scale=10),image.get('system:index').getInfo(),dir_name_s2,region['coordinates'][0],10)

2.4 Displaying the satellite images and smoothing them with a filter

Now, let's display and check the satellite images acquired in 2.3. The images were saved to the directory (folder) we specified under My Drive on Google Drive, so we load them from there and display them. First, the Sentinel-2 observation images.

#Visualization in chronological order
s2_path = '/content/drive/My Drive/' + dir_name_s2 + '/'
files =os.listdir(s2_path)
files.sort()

plt.figure(figsize=(25, 25))
j=0

v = len(files)//5 +1 
for i in range(len(files)):
  #Acquire and visualize images one scene at a time
  with rasterio.open(s2_path + files[i]) as src:
      arr = src.read()
  j+=1#Shift and place the plot position of the image
  plt.subplot(v,5,j)
  arrayImg = np.asarray(arr).transpose(1,2,0).astype(np.float32)*2 #Brightness is doubled and brightness is corrected.
  plt.imshow(arrayImg)
  plt.title(files[i][0:8])#Get the date from the file name
  #plt.tight_layout()

Screenshot from 2020-09-22 11-13-15.png

**The resolution of Sentinel-2 images is 10 m**, so for a large structure you may be able to tell whether it has been built. Optical satellite images are familiar from Google Maps, where they are easy to use because you already know the location you are looking for. Likewise, you can spot a change if you already know that **a structure was probably built there**, but finding whether a structure has been built *somewhere* in this whole area is **difficult**, because there is no index pointing you to it. Therefore, this time we do not use the Sentinel-2 images for the evaluation; instead, by visualizing change through a comparison of Sentinel-1 images, **differences become easy to find**. Now, let's display the Sentinel-1 observation images.

#Visualization in chronological order
s1_path = '/content/drive/My Drive/' + dir_name_s1 + '/'
files =os.listdir(s1_path)
files.sort()

plt.figure(figsize=(20, 40))
j=0


v = len(files)//5 +1 
for i in range(len(files)):
  #Acquire and visualize images one scene at a time
  with rasterio.open(s1_path + files[i]) as src:
      arr = src.read()
  #print(arr[0].shape)
  j+=1#Shift and place the plot position of the image
  plt.subplot(v,5,j)
  plt.imshow(arr[0], cmap='gray')
  plt.title(files[i][33:41])#Get the date from the file name
  date0 = files[i][33:41]
  plt.tight_layout()

Screenshot from 2020-09-22 11-39-26.png

Two observation images with the same observation date have been acquired. The area of interest is split across two images because the standard scene size of a Sentinel-1 image is 100 km x 100 km and our area of interest straddles a scene boundary. Therefore, we first combine the two halves into a single image.

#Data reading
n = 0

with rasterio.open(s1_path + files[n]) as src:
    arr = src.read()

print(files[n][33:41])
#Visualization
plt.imshow(arr[0], cmap='gray')

#Replace no-data (NaN) pixels with 0
image_1 = np.nan_to_num(arr[0])

#Data reading
n = 1

with rasterio.open(s1_path + files[n]) as src:
    arr = src.read()

print(files[n][33:41])
#Visualization
plt.imshow(arr[0], cmap='gray')

#Replace no-data (NaN) pixels with 0
image_2 = np.nan_to_num(arr[0])

The parts of the area of interest not captured in each image (the white regions) contain no data (**NaN**), so we replaced them with **0**. Next, combine the two images.

image_BG = image_1 + image_2
plt.imshow(image_BG, cmap='gray')

Screenshot from 2020-09-22 11-43-36.png

Now we have the SAR satellite image from September 12, 2020. SAR images contain **speckle noise**, many bright spots caused by reflections from structures on the ground: because the radio waves used by SAR are coherent waves, they interfere on reflection. This noise can affect subsequent processing, so **smoothing** is required. This time we use the **Lee filter**, a standard smoothing method for SAR images. The Lee filter function is as follows.

from scipy.ndimage import uniform_filter, variance

def lee_filter(img, size):
    img_mean = uniform_filter(img, (size, size))
    img_sqr_mean = uniform_filter(img**2, (size, size))
    img_variance = img_sqr_mean - img_mean**2

    overall_variance = variance(img)

    img_weights = img_variance / (img_variance + overall_variance)
    img_output = img_mean + img_weights * (img - img_mean)
    return img_output

Now, let's apply the smoothing; here we used a 3-pixel window.

image_BG_lee = lee_filter(image_BG, 3)

plt.imshow(image_BG_lee,cmap='gray')

Screenshot from 2020-09-22 11-48-40.png

At this display size you cannot see the effect of the filter, so download the created image to check the difference. Next, we acquire the past satellite image for comparison; here, an image from five years earlier.

#Specify the observation date of the base image
date = datetime.date(2015, 9, 20)
print(date)
#Observation period

from_date= date - dt_nd
to_date= date + dt_pd

from_date = from_date.strftime('%Y-%m-%d')
to_date = to_date.strftime('%Y-%m-%d')


#Folder name to save
dir_name_s1 = 'GEE_Sentinel1_R_' + object_name
dir_name_s2 = 'GEE_Sentinel2_R_' + object_name

#Re-filter the collection with the new observation period; otherwise the 2020 images would be exported again.
Sentinel1 = ee.ImageCollection('COPERNICUS/S1_GRD').filterBounds(region).filterDate(parse(from_date),parse(to_date)).filter(ee.Filter.eq('orbitProperties_pass', 'ASCENDING')).select(['VV'])
imageList_s1 = Sentinel1.toList(300)

for i in range(imageList_s1.size().getInfo()):
    image = ee.Image(imageList_s1.get(i))
    ImageExport(image.reproject(crs='EPSG:4326',scale=10),image.get('system:index').getInfo(),dir_name_s1,region['coordinates'][0],10)

Now, let's display the acquired image.

#Visualization in chronological order
s1_path = '/content/drive/My Drive/' + dir_name_s1 + '/'
files =os.listdir(s1_path)
files.sort()

plt.figure(figsize=(20, 40))
j=0


v = len(files)//5 +1 
for i in range(len(files)):
  #Acquire and visualize images one scene at a time
  with rasterio.open(s1_path + files[i]) as src:
      arr = src.read()
  print(arr[0].shape)
  j+=1#Shift the plot position of the image
  plt.subplot(v,5,j)
  plt.imshow(arr[0], cmap='gray')
  plt.title(files[i][33:41])#Get the date from the file name
  date1 = files[i][33:41]
  plt.tight_layout()

Screenshot from 2020-09-22 11-51-34.png

This time, unlike before, the image is not split: it is a single image containing the entire area of interest. We load it into image_R in the same way as before and smooth it with the same **Lee filter**.
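The loading step is not shown above, so here is a minimal sketch of it, assuming the single GeoTIFF in the new folder and the same NaN handling as before:

#Sketch: load the past scene into image_R (a single file this time),
#replacing no-data (NaN) pixels with 0 as before.
with rasterio.open(s1_path + files[0]) as src:
    arr = src.read()

image_R = np.nan_to_num(arr[0])

Then apply the filter: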

image_R_lee = lee_filter(image_R, 3)
plt.imshow(image_R_lee,cmap='gray')

Next, we create an **RGB image** that combines the recent satellite image with the past one. This time, the **past image goes into the red channel** and the **recent image into the green and blue channels**.

2.5 Create RGB composite image of recent and past satellite images

Now, create a ** RGB composite image ** from each satellite image.

#RGB composition: R = past image, G = B = recent image
RGB = np.dstack((image_R_lee, image_BG_lee, image_BG_lee))

print(RGB.shape)
plt.figure(figsize=(15, 20))
plt.imshow(RGB)
plt.show()

Screenshot from 2020-09-22 12-00-05.png

In this image you can see blue spots in Odaiba and Maihama. **The red and blue dots in Tokyo Bay are ships**, each present only on its own observation date. Some other places appear red; presumably buildings there were removed, or merged into one large structure in the course of development. In that sense, red can also indicate development.

Next, save this image as a jpeg annotated with the observation dates and the credit.

import cv2
from PIL import Image, ImageDraw, ImageFont

img_bgr = np.dstack((image_R_lee, image_BG_lee, image_BG_lee))

#Normalize to 0-255 and convert to 8-bit
new_arr = ((img_bgr - img_bgr.min()) * (1/(img_bgr.max() - img_bgr.min()) * 255)).astype('uint8')

#Reorder channels so that cv2.imwrite (which expects BGR) writes past=red, recent=green/blue
im_rgb = cv2.cvtColor(new_arr, cv2.COLOR_BGR2RGB)
cv2.imwrite(str(object_name) +'.jpg', im_rgb )

img = Image.open(str(object_name) +'.jpg')

plt.figure(figsize=(15, 20))

plt.imshow(img)
plt.show()

Screenshot from 2020-09-22 13-03-09.png

As it is, the image is whitish overall and the changing blue and red areas are hard to see, so we adjust the contrast as follows. Tune this to your taste.

import PIL.Image
from PIL import ImageEnhance

IMAGE_PATH = str(object_name) +'.jpg'
CONTRAST = 2.0

img = PIL.Image.open(IMAGE_PATH)

#Change the contrast
contrast_converter = ImageEnhance.Contrast(img)
contrast_img = contrast_converter.enhance(CONTRAST)

#Save image
contrast_img.save(str(object_name) +'.jpg')

plt.figure(figsize=(15, 20))

plt.imshow(contrast_img)
plt.show()

Screenshot from 2020-09-22 13-04-41.png

The blue and red changes are now easier to see than before. Next, we write the observation dates and the credit of the satellite images onto this image. The font used here is one freely available on the net; this too is a matter of preference.

#Font file download and settings
!wget https://osdn.net/dl/mplus-fonts/mplus-TESTFLIGHT-063a.tar.xz

!xz -dc mplus-TESTFLIGHT-*.tar.xz | tar xf -

fontfile = "./mplus-TESTFLIGHT-063a/mplus-1c-bold.ttf"

Now, draw the text on the image.

img = Image.open(str(object_name) +'.jpg')

img = img.convert('RGB')


x = int(img.size[0]/1.21) #Setting the date description position
y = int(img.size[1]/20) #Setting the date description position
fs = int(img.size[0]/70) #Date font size setting

obj_draw = ImageDraw.Draw(img)
obj_font = ImageFont.truetype(fontfile, fs)
obj_draw.text((x, y), 'R: '+str(date1), fill=(255, 255, 255), font=obj_font)
obj_draw.text((x, y+1.1*fs), 'G: '+str(date0), fill=(255, 255, 255), font=obj_font)
obj_draw.text((x, y+2.2*fs), 'B: '+str(date0), fill=(255, 255, 255), font=obj_font)
obj_draw.text((img.size[0]/2.0, img.size[1]-y*0.1 - img.size[1]/30 ), 'Contains modified Copernicus Sentinel data (2020)', fill=(255, 255, 255), font=obj_font)

img = img.resize((int(img.size[0] / 2) , int(img.size[1] / 2)))

#img = img.convert('L')


img.save(str(object_name) +'.jpg')

plt.figure(figsize=(15, 20))

plt.imshow(img)
plt.show()

Screenshot from 2020-09-22 13-07-23.png

The image now carries the necessary information. Adjust the text position, size, color, and so on in the code above to your liking. Next, we create a **kmz file** so that the image can be viewed on a local PC with a **GIS application** (for example, **Google Earth**).

2.6 Create a kmz file for viewing with GIS software

For how to create a kmz file, see the following article, which explains it in detail.

Data analysis on the free environment platform "Tellus" for artificial satellite images. -Google Earth-

There are several ways to create a kmz file, but this time we use **simplekml**, a Python module. Install and import it.

!pip install simplekml

import simplekml
import zipfile

Next, read the latitude / longitude information of the image.

area = pd.DataFrame(AREA,
                  columns=['longtitude', 'latitude'])

north = area["latitude"].max()
south = area["latitude"].min()
east = area["longtitude"].max()
west = area["longtitude"].min()

Next, create a kml file (latitude / longitude information) of the above area of interest.

#Output geographic information in kml.
kml = simplekml.Kml()
ground = kml.newgroundoverlay(name=str(object_name))
ground.icon.href = str(object_name) +'.jpg'
ground.latlonbox.north = north
ground.latlonbox.south = south
ground.latlonbox.east = east
ground.latlonbox.west = west
ground.latlonbox.rotation = 0
kml.save(str(object_name)+".kml")

Then, link the created **kml file** with the image it references and bundle them into a **kmz file**.

#Outputs kmz that combines geographic information and images.
with zipfile.ZipFile(str(object_name)+'.kmz', "w", zipfile.ZIP_DEFLATED) as zf:
    zf.write(str(object_name)+".kml", arcname=str(object_name)+".kml")
    zf.write(str(object_name) +'.jpg', arcname=str(object_name) +'.jpg')

The kmz file is now created; you can see it in Google Colaboratory's file browser. Click the menu at the right of the file to display **Download**, and download it.
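Alternatively, you can trigger the download from code; a sketch using Colab's files module:

#Sketch: download the created kmz directly from the notebook.
from google.colab import files

files.download(str(object_name) + '.kmz')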

Screenshot from 2020-09-22 13-17-24.png

If **Google Earth** is installed on your device, double-clicking the downloaded **kmz file** automatically starts Google Earth and overlays the created image on the map.

Screenshot from 2020-09-22 13-19-43.png

Check a blue or red location in the image, then deselect the kmz overlay to see what is actually there. **The blue area in Odaiba turned out to be a container yard.** Elsewhere, **part of the Tokyo Disneyland parking lot is blue**, which is presumed to be the construction site of DisneySea's **new attraction**. (Google Earth's aerial imagery has a time slider, so check the imaging date and the past imagery of the relevant spots.)

3. Conclusion

We introduced how to use Google Earth Engine to acquire satellite images from desired dates and to see how a city is changing and where it is changing most. Evaluating mainly the Tokyo Bay area, we found that development is actively progressing in **Odaiba and at Tokyo Disneyland**. Applying the same approach elsewhere, you can pick out other developing areas. And since it reveals how a city changes over time, the same method can also identify damage caused by disasters rather than by development.

Guidebook for Utilization of Artificial Satellites in the Event of a Disaster Flood Damage Version / Satellite Basics

Personally, I think this kind of change extraction is the best use of satellite imagery. Satellite images capture surface information, so you can take long-term differences between past and present images, or follow trends between them (for example, the parking lot counts introduced in a previous article). The purposes vary, but the method is **change detection**. It is fun not just to browse satellite images but to compare past and present and infer what is happening. I hope this article helps more people become interested in satellite imagery. If you have any comments or questions, please feel free to comment.

Reference articles

I checked the usage status of the parking lot from satellite images.
I tried to find the trend of the number of ships in Tokyo Bay from satellite images.
How to get the latest satellite imagery for free.
Kihon of satellite data ~ Understanding, type, frequency, resolution, application examples ~
Can you see people from artificial satellites? ~ By satellite, ground resolution / local time summary ~
Sentinel-1 SAR GRD: C-band Synthetic Aperture Radar Ground Range Detected, log scaling
Sentinel-2 MSI: MultiSpectral Instrument, Level-1C
Guidebook for Utilization of Artificial Satellites in the Event of a Disaster Flood Damage Version / Satellite Basics
