This article was posted as Day 21 of the Cisco Advent Calendar 2019 by Cisco Systems G.K. You may be wondering what the connection is between Cisco and satellite data in the first place, so I would like to cover that as well.
- Understand what satellite data is
- Understand what the satellite data platform Tellus is
- Understand how to use the Tellus development environment
- Analyze satellite data using the Tellus development environment
- Understand what NDVI and NDWI are
- Display a high-definition satellite image for a selected date and position (Tue, 31 Jul 2018 12:42:49 GMT; latitude 35.605 to 35.675 N, longitude 139.745 to 139.835 E)
- Visualize vegetation status (green areas) from the satellite image
- Quantify vegetation status (NDVI)
Tellus is Japan's first open and free satellite data platform, aimed at creating a new business marketplace using government satellite data. It combines multiple data sources and provides all the functions needed to promote new business creation. https://www.tellusxdp.com/ja/about
Until now, using satellite data meant handling image data that costs hundreds of thousands of yen per scene, in specialized data formats. There were various issues in terms of price, storage, and computing resources, and despite the abundance of satellite data, its utilization was very limited. To solve these problems, the Ministry of Economy, Trade and Industry launched a three-year project running through 2020, the "Open & Free Government Satellite Data and Data Utilization Promotion Project", and Tellus was born out of it.
Regarding the relationship between Cisco and the space business, Cisco put IP routers into orbit in 2003 (an experimental satellite) and 2009 (a commercial satellite). There are four types of satellites: meteorological, observation, positioning, and communication satellites. Of these, Cisco has worked on communication satellites in the past. https://www.cisco.com/c/ja_jp/about/newsroom/archives/us/2010/prod-072010b.html
In Tellus, businesses, research institutes, and organizations from various fields participate as the xData Alliance, and Cisco has been participating since February 21, 2019. This time, we are aiming to create new business using observation satellites among the types above. https://www.xdataalliance.com
Recently, we have also been holding the Tokyo Moonshot Challenge, an idea creation and realization contest for service designs that provide a new urban experience, so if you are interested, please join us.
Now let's actually use Tellus to work with satellite data. The procedure is as follows.
Access Tellus, enter the required information, and register as a user. There are quite a few fields to fill in, but don't be discouraged.
After completing user registration, let's actually use Tellus. There are two usage environments in Tellus: the Tellus OS and the development environment. First, log in to Tellus OS from the button at the top right of the Tellus web page.
APIs that can be used from Python and R are available. In addition to the APIs, computing, storage, and GPU environments are also provided. The development environment is accessed from the Tellus OS by clicking the corresponding button.
An application is required to access the development environment.
Go to My Page
Select and apply for the required development environment from the development environment application section on My Page. Note that it may take some time, as applications are currently concentrated.
When the development environment becomes available, register an API token. You can register it from My Page; give the key any name you like.
Log in to Tellus OS and enter the development environment.
After logging in, go to the Work directory and create a new Python 3 Notebook.
I won't explain Jupyter Notebook itself here; it will be a familiar environment for anyone who has used it before. The modules used this time are listed below. First, import them.
Name | Overview |
---|---|
NumPy | Library for numerical computation in Python |
scikit-image | Image analysis library for Python |
Requests | HTTP library for Python (Apache2 licensed) |
io | Python standard module for I/O |
PIL | Image processing library for Python |
Matplotlib | Graph plotting library for Python |
math | Python standard math module |
#Import required modules and libraries
import numpy as np
from skimage import io, color, img_as_ubyte, filters, transform
import requests
from io import BytesIO
from PIL import Image
import matplotlib.pyplot as plt
%matplotlib inline
import math
Here, let me introduce the satellite used this time.
Satellite name | ASNARO-1 |
---|---|
Data provider | PASCO CORPORATION |
Operation period | 2014- |
Ground resolution | 0.5 m (panchromatic), 2 m (multispectral), 0.5 m (pansharpened) |
Usage example | Ship detection in waters using high-resolution optical satellite data |
Wavelengths | Panchromatic (black and white): 450-860 nm; Multispectral (color): Band 1: 400-450 nm (Ocean Blue), Band 2: 450-520 nm (Blue), Band 3: 520-600 nm (Green), Band 4: 630-690 nm (Red), Band 5: 705-745 nm (Red Edge), Band 6: 760-860 nm (NIR) |
Specify the access token.
# Store the API access token in API_TOKEN
API_TOKEN = 'YOUR TOKEN HERE'
This time, I would like to acquire an image of the area around Tokyo Bay, so specify the latitude and longitude as follows.
# Minimum latitude
min_lat = 35.605
# Minimum longitude
min_lon = 139.745
# Maximum latitude
max_lat = 35.675
# Maximum longitude
max_lon = 139.835
Specify the URL, header, and parameters for acquiring satellite data using the API.
# Specify the server URL, headers, and parameters for the satellite images to acquire
# URL for ASNARO-1
urls_ASNARO1 = 'https://gisapi.tellusxdp.com/api/v1/asnaro1/scene'
# Headers
headers = {'Authorization': 'Bearer ' + API_TOKEN}
# Parameters (minimum and maximum latitude and longitude)
parameter = { 'min_lat': min_lat, 'min_lon': min_lon, 'max_lat': max_lat, 'max_lon': max_lon}
You can find the server URLs and parameters in the Tellus API Reference: https://www.tellusxdp.com/ja/dev/api
There are two main satellite data APIs: one for searching scenes and one for acquiring the actual tile images.
First, let's check what scenes ASNARO-1 has.
# Use the requests module to search for scenes covering the specified area and get the results as JSON
r_ASNARO1 = requests.get(urls_ASNARO1, params=parameter, headers=headers).json()
# Display the first entry
r_ASNARO1[0]
{'acquisitionDate': 'Wed, 31 Oct 2018 12:31:16 GMT',
'clat': 35.6602740014593,
'clon': 139.701407725281,
'cloudCover': 20.0,
'entityId': '20190115064426419_AS1',
'max_lat': 35.7191765745438,
'max_lon': 139.786374532312,
'min_lat': 35.6013160074949,
'min_lon': 139.61632039109,
'path': 0,
'productId': '20190115064426419_AS1_D01_L1B',
'row': 0,
'thumbs_url': 'https://tile.tellusxdp.com/thums/ASNARO-1/20190115064426419_AS1.PNG'}
Let's check how many captures exist for this latitude and longitude range.
len(r_ASNARO1)
28
We can see there are 28 scenes. Let's display them.
# Show the ASNARO-1 search results
print("List of ASNARO-1 scenes [acquisition date, entityId, productId, cloudCover]")
for res in r_ASNARO1:
    print(", ".join([res['acquisitionDate'],
                     str(res['entityId']), str(res['productId']), str(res['cloudCover'])]))
List of ASNARO-1 scenes [acquisition date, entityId, productId, cloudCover]
Wed, 31 Oct 2018 12:31:16 GMT, 20190115064426419_AS1, 20190115064426419_AS1_D01_L1B, 20.0
Sat, 20 Oct 2018 12:52:04 GMT, 20181224064142828_AS1, 20181224064142828_AS1_D01_L1B, 19.0
Tue, 31 Jul 2018 12:42:49 GMT, 20181224064003777_AS1, 20181224064003777_AS1_D01_L1B, 0.0
Sat, 30 Jun 2018 12:17:46 GMT, 20181224062516884_AS1, 20181224062516884_AS1_D01_L1B, 13.0
Fri, 08 Jun 2018 12:25:21 GMT, 20190115062640786_AS1, 20190115062640786_AS1_D01_L1B, 20.0
Sat, 02 Jun 2018 12:36:46 GMT, 20181224062257563_AS1, 20181224062257563_AS1_D01_L1B, 18.0
Sat, 21 Apr 2018 12:38:02 GMT, 20181225033937173_AS1, 20181225033937173_AS1_D01_L1B, 0.0
Sun, 25 Mar 2018 12:27:00 GMT, 20181224062059947_AS1, 20181224062059947_AS1_D01_L1B, 0.0
Wed, 03 Jan 2018 15:30:59 GMT, 20181224061906786_AS1, 20181224061906786_AS1_D01_L1B, 0.0
Tue, 02 Jan 2018 02:16:36 GMT, 20181224061858253_AS1, 20181224061858253_AS1_D01_L1B, 0.0
Fri, 29 Dec 2017 15:24:58 GMT, 20181224061846634_AS1, 20181224061846634_AS1_D01_L1B, 1.0
Sun, 05 Nov 2017 13:55:39 GMT, 20181224061816204_AS1, 20181224061816204_AS1_D01_L1B, 0.0
Sun, 09 Jul 2017 15:40:45 GMT, 20181224061651885_AS1, 20181224061651885_AS1_D01_L1B, 0.0
Mon, 22 May 2017 14:06:36 GMT, 20181224061523296_AS1, 20181224061523296_AS1_D01_L1B, 1.0
Sat, 06 May 2017 15:29:27 GMT, 20181224061506752_AS1, 20181224061506752_AS1_D01_L1B, 8.0
Tue, 04 Apr 2017 15:27:01 GMT, 20181224061413592_AS1, 20181224061413592_AS1_D01_L1B, 1.0
Fri, 24 Mar 2017 13:57:55 GMT, 20181224061303679_AS1, 20181224061303679_AS1_D01_L1B, 0.0
Sat, 18 Mar 2017 15:39:14 GMT, 20181224061254113_AS1, 20181224061254113_AS1_D01_L1B, 2.0
Mon, 24 Oct 2016 14:10:19 GMT, 20181224060959386_AS1, 20181224060959386_AS1_D01_L1B, 0.0
Fri, 17 Jun 2016 15:39:32 GMT, 20181224045149110_AS1, 20181224045149110_AS1_D01_L1B, 29.0
Thu, 05 May 2016 14:08:24 GMT, 20181223152759390_AS1, 20181223152759390_AS1_D01_L1B, 0.0
Fri, 18 Mar 2016 15:31:58 GMT, 20181223152702426_AS1, 20181223152702426_AS1_D01_L1B, 0.0
Mon, 30 Nov 2015 14:31:57 GMT, 20190117124820618_AS1, 20190117124820618_AS1_D01_L1B, 33.0
Tue, 03 Nov 2015 14:32:11 GMT, 20181225123811442_AS1, 20181225123811442_AS1_D01_L1B, 0.0
Sun, 25 Oct 2015 03:35:55 GMT, 20181223152305607_AS1, 20181223152305607_AS1_D01_L1B, 0.0
Thu, 06 Aug 2015 03:12:35 GMT, 20181223060118912_AS1, 20181223060118912_AS1_D01_L1B, 13.0
Wed, 05 Aug 2015 14:11:45 GMT, 20181223060109423_AS1, 20181223060109423_AS1_D01_L1B, 0.0
Sun, 28 Jun 2015 12:41:27 GMT, 20181225122707116_AS1, 20181225122707116_AS1_D01_L1B, 10.0
Here, "cloud cover" is called the cloud cover rate, and it is a numerical value of how many clouds are included in the image. This time, it is not a synthetic aperture radar that is not affected by clouds called SAR, but a normal optical satellite image, so we will select the latest data below that has no cloud coverage.
Tue, 31 Jul 2018 12:42:49 GMT, 20181224064003777_AS1, 20181224064003777_AS1_D01_L1B, 0.0
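By the way, if you want to pick this scene programmatically rather than by eye, a minimal sketch like the following should work (it assumes the r_ASNARO1 search result above; the helper from email.utils parses the RFC 1123 date strings returned by the API):
# A minimal sketch: programmatically pick the most recent cloud-free scene
# (assumes r_ASNARO1 from the scene search above)
from email.utils import parsedate_to_datetime
cloud_free = [s for s in r_ASNARO1 if s['cloudCover'] == 0.0]
latest = max(cloud_free, key=lambda s: parsedate_to_datetime(s['acquisitionDate']))
print(latest['acquisitionDate'], latest['productId'])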
Next, let's get the image data for the scene using the second API. For ASNARO-1, the productId is used to get images. Let's also store the entityId here.
# Select the scene to acquire
# Store the productId and entityId of the selected ASNARO-1 satellite image
ASNARO1_entityId = "20181224064003777_AS1"
ASNARO1_productId = "20181224064003777_AS1_D01_L1B"
Here, let me explain image tiles and zoom levels. A tile is identified as (z = zoom level, x = tile_x, y = tile_y); in Tellus, 256 x 256 pixel tile images can be acquired through the API at each zoom level from z = 0 to 18. This scheme uses the Web Mercator projection, a cutout of a world map projected with the Mercator projection over the range 180° east to 180° west and 85.0511° north to 85.0511° south.
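For reference, the standard Web Mercator conversion from latitude and longitude to tile coordinates at zoom level $z$ is as follows (the code further below is equivalent; it works in pixel coordinates first and then divides by the 256-pixel tile size):

x_{tile} = \left\lfloor 2^{z} \cdot \frac{lon + 180}{360} \right\rfloor

y_{tile} = \left\lfloor \frac{2^{z}}{2} \left( 1 - \frac{\mathrm{arctanh}(\sin(lat \cdot \pi / 180))}{\pi} \right) \right\rfloor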
You can also look up the tile coordinates of the range you want to display from the Tellus OS, by selecting "Data selection" → "Tile image".
However, doing this by hand every time is tedious, so let's implement a method to convert latitude and longitude into tile coordinates in Python. The detailed derivation would take a while, so I will skip it, but the following code performs the conversion.
# Calculate the tile coordinates (x, y) containing the specified latitude and longitude at zoom level z
# A function that takes latitude, longitude, and zoom level z as inputs and returns the corresponding tile coordinates
# The zoom levels you can specify are listed in the API Reference on the Tellus Developers page
# https://www.tellusxdp.com/ja/dev/api
def LatLon2Tile(lat, lon, z):
    # Maximum latitude of the Web Mercator projection
    L = 85.05112878
    # Pixel coordinates, then divided by the 256-pixel tile size
    tileX = int((2**(z+7)) * ((lon / 180) + 1))
    tileY = int((2**(z+7) / math.pi) * (-1 * np.arctanh(math.sin(lat * math.pi / 180)) + np.arctanh(math.sin(L * math.pi / 180))))
    return int(tileX / 256), int(tileY / 256)
# Center coordinates of the specified area (latitude, longitude)
center_lat = (min_lat + max_lat) / 2
center_lon = (min_lon + max_lon) / 2
#This time, let the zoom level z be 12.
z = 12
#Get tile coordinates
x, y = LatLon2Tile(center_lat,center_lon, z)
Let's actually acquire a satellite image using this function. This time, we will acquire one image for the RGB bands and one for the NIR (near-infrared) band.
# Get an ASNARO-1 satellite image
#Specify the server URL
url_ASNARO1 = 'https://gisapi.tellusxdp.com/blend/asnaro1'
#Determine band assignment
query_ASNARO1 = "r=4&g=3&b=2"
res_ASNARO1 = requests.get("{}/{}/{}/{}/{}.png?{}".format(
url_ASNARO1,
ASNARO1_productId,
z, x, y,
query_ASNARO1),
headers={"Authorization": "Bearer " + API_TOKEN})
RGB_ASNARO1 = np.asarray(Image.open((BytesIO(res_ASNARO1.content))))
#Change data type to integer
RGB_ASNARO1 = RGB_ASNARO1.astype(np.uint8)
With this band assignment, the image is imported as R = Band 4, G = Band 3, B = Band 2. The acquired image is captured as a NumPy array with four channels, RGBA (A = alpha channel).
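As a quick sanity check, the array shape should show 256 x 256 pixels with the four RGBA channels:
# Check the shape of the acquired array: height x width x channels (RGBA)
print(RGB_ASNARO1.shape)  # (256, 256, 4)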
# Get ASNARO-1 single-band images (red, green, blue, near-infrared)
# Extract single-band two-dimensional arrays for red, green, and blue
R_ASNARO1 = RGB_ASNARO1[:,:,0]
G_ASNARO1 = RGB_ASNARO1[:,:,1]
B_ASNARO1 = RGB_ASNARO1[:,:,2]
# Change the ASNARO-1 band assignment (red: Band 6, green: Band 6, blue: Band 6)
query_ASNARO1 = "r=6&g=6&b=6"
# Get the ASNARO-1 satellite image
res_ASNARO1 = requests.get("{}/{}/{}/{}/{}.png?{}".format(
url_ASNARO1,
ASNARO1_productId,
z, x, y,
query_ASNARO1),
headers=headers)
#NumPy two-dimensional array of near-infrared single-band images
NIR_ASNARO1 = np.asarray(Image.open((BytesIO(res_ASNARO1.content))))
NIR_ASNARO1 = NIR_ASNARO1[:,:,0].astype(np.uint8)
This imports the near-infrared band as NIR = Band 6.
Let's display the captured image as a true color composite, a composition method close to what the human eye sees. The following produces the true color composite image.
# True color composite
#R: Red band, G: Green band, B: Blue band
# ASNARO-1
true_ASNARO1 = np.dstack((R_ASNARO1, G_ASNARO1, B_ASNARO1))
# Display the ASNARO-1 satellite image
plt.figure(figsize=(20,20)) #Specify display size in inches 20 x 20 inches
plt.imshow(true_ASNARO1)
plt.show()
Since the image is acquired at 256 x 256 pixels, it becomes considerably blurred when stretched to this size, but it seems we can still pick out some information, such as the Hamarikyu Gardens and the inflow of sediment from rivers.
So far we have written a fair amount of code, and some readers may be thinking that Google Earth or GCP would let you do this more conveniently. I thought so too at first. What matters is what you are trying to achieve, so it is perfectly fine if Google Earth or other data sources meet your goals.
However, satellite data has a number of features that other data sources lack. Although this is my own independent research, I have summarized the features of each below.
 | Satellite | Google Earth | Drone | Aircraft | Tellus |
---|---|---|---|---|---|
Price | ✕✕✕ | ◎ | △ | ✕ | ◎ |
Historical data | ◎ | △ | ✕ | ◯ | ◎[^1] |
Observation frequency | ◎[^2] | ✕ | △ | ✕ | ◎ |
Resolution | ◎ | ◎ | ◎ | ◎ | ◎ |
Visible-wavelength sensor (passive) | ◎ | ◎ | ◎ | ◎ | ◎ |
Other sensors (passive) | ◎ | ✕ | ✕ | ✕ | ◎ |
SAR (radar) | ◎ | ✕ | ✕ | ✕ | ◎ |
API integration | ✕ | ◎ | ✕ | ✕ | ◎ |
Non-satellite data | ✕ | ✕ | ✕ | ✕ | ◎ |
As shown above, the advantages of satellite data are the wealth of historical data, sensors covering various wavelengths, and active sensing with synthetic aperture radar (SAR). Using these, it becomes possible to obtain information that cannot be obtained with visible light alone.
In addition, Tellus also provides data obtained from sources other than satellites, such as temperature and people-flow data, which is another advantage.
So what kind of information cannot be obtained with visible light? https://sorabatake.jp/5192/
Observation satellites carry optical sensors for various wavelengths, so the wavelengths that can be acquired differ from satellite to satellite. Since the wavelengths of electromagnetic waves absorbed and reflected differ from object to object, various kinds of information can be obtained by combining them. Typical combinations are introduced at the link below.
https://sorabatake.jp/5192/
Among these, NDVI, an index of how densely an area is vegetated, is the most commonly used. There is also an index called NDWI, which indicates how much water is present. This time, each pixel of the image is represented as uint8 $(2^8 = 256)$, that is, a value from 0 to 255, while NDVI and NDWI are normalized to the range $-1$ to $1$.
NDVI = \frac{NIR - R}{NIR + R}
NDWI = \frac{NIR - SWIR}{NIR + SWIR}
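As a concrete example: healthy vegetation reflects strongly in the near-infrared and absorbs red, so a vegetated pixel with, say, $NIR = 200$ and $R = 50$ (hypothetical 8-bit values) gives

NDVI = \frac{200 - 50}{200 + 50} = 0.6

while water or bare ground, which reflects red about as strongly as NIR, comes out near zero or negative.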
Next, let's create a natural color composite. This is a color composition method that assigns the near-infrared band to the green channel (G = NIR instead of G = G), which displays vegetation in green. The resulting image is shown below.
# Natural color composite
#R: Red band, G: Near infrared band, B: Green band
# ASNARO-1
natural_ASNARO1 = np.dstack((R_ASNARO1, NIR_ASNARO1, G_ASNARO1))
# Display the ASNARO-1 satellite image
plt.figure(figsize=(20,20)) #Specify display size in inches 20 x 20 inches
plt.imshow(natural_ASNARO1)
plt.show()
The vegetation is visualized in green, and the vegetated areas stand out more clearly than in the true color composite.
Next, let's visualize the NDVI described above by plugging R and NIR into the formula.
# NDVI
#Change data type from uint8 to float32
NIR_ASNARO1 = NIR_ASNARO1.astype(np.float32)
R_ASNARO1 = R_ASNARO1.astype(np.float32)
#Calculate NDVI
NDVI_ASNARO1 = (NIR_ASNARO1 - R_ASNARO1) / (NIR_ASNARO1 + R_ASNARO1)
plt.figure(figsize=(20,20)) # Display size: 20 x 20 inches
plt.imshow(NDVI_ASNARO1, cmap='Reds')
plt.colorbar()
plt.show()
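Going one step further to quantify the green area, one minimal approach is to count the pixels whose NDVI exceeds a threshold. The 0.2 threshold below is a common rule of thumb for vegetated pixels, not a value defined by Tellus:
# A minimal sketch: ratio of pixels above a vegetation threshold
# (0.2 is a common rule of thumb, not a Tellus-defined value)
veg_mask = NDVI_ASNARO1 > 0.2
veg_ratio = np.count_nonzero(veg_mask) / veg_mask.size
print('Vegetated pixel ratio: {:.1%}'.format(veg_ratio))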
We calculated NDVI and quantified the vegetation index. Unfortunately, ASNARO-1 does not carry a SWIR (short-wavelength infrared) sensor, so NDWI cannot be calculated from its data. Even in such a case, however, NDWI can be calculated using data from Landsat 8, which is equipped with a SWIR sensor. Note that Landsat 8 only supports zoom levels up to z = 12, so that level of image coarseness is the limit. In this way, Tellus lets you switch between data from various satellites depending on your purpose.
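The NDWI calculation itself is the same one-liner as NDVI. Here is a minimal sketch, assuming you have already fetched NIR and SWIR single-band arrays from Landsat 8 tiles in the same way as above (NIR_L8 and SWIR_L8 are placeholder names, not variables defined in this article):
# A minimal NDWI sketch: NIR_L8 and SWIR_L8 are placeholder single-band arrays
# assumed to have been fetched from Landsat 8 tiles in the same way as above
NIR_L8 = NIR_L8.astype(np.float32)
SWIR_L8 = SWIR_L8.astype(np.float32)
NDWI_L8 = (NIR_L8 - SWIR_L8) / (NIR_L8 + SWIR_L8)
plt.imshow(NDWI_L8, cmap='Blues')
plt.colorbar()
plt.show()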
The resolution may look lower than the image shown at the beginning of the article, but this does not mean the web browser failed to load it. Each request captures a 256 x 256 pixel tile at the given zoom level, so to capture a higher-definition image you need to raise the zoom level. The available zoom levels vary from satellite to satellite, and even within the same satellite they vary from sensor to sensor. For ASNARO-1, the maximum zoom level is 18 for the combined RGB optical product, but for the per-band optical product used this time it is 16. The specs of each satellite are described in detail on the following page, so refer to it as necessary. https://www.tellusxdp.com/ja/dev/data
Now, if you request a high-resolution image (z = 16), the covered range is narrow because each tile is 256 x 256 pixels; conversely, if you widen the range (z = 12), the resolution drops for the same reason. In such cases, you can obtain an image with z = 16 resolution over a z = 12 range by concatenating multiple z = 16 tiles.
def get_combined_image(get_image_func, z, topleft_x, topleft_y, size_x=1, size_y=1, option={}, params={}):
    """
    Get a combined tile image
    Parameters
    ----------
    get_image_func : function
        Function that acquires a single tile image;
        its arguments are z, x, y, option
    z : int
        Tile zoom level
    topleft_x : int
        X coordinate of the top-left tile
    topleft_y : int
        Y coordinate of the top-left tile
    size_x : int
        Number of tiles to combine in the longitude direction
    size_y : int
        Number of tiles to combine in the latitude direction
    option : dict
        API path options (z, x, y excluded)
    params : dict
        Query parameters
    Returns
    -------
    combined_image : ndarray
        Combined tile image
    """
    rows = []
    # Blank (transparent) tile used when a tile request fails
    blank = np.zeros((256, 256, 4), dtype=np.uint8)
    for y in range(size_y):
        row = []
        for x in range(size_x):
            try:
                img = get_image_func(z, topleft_x + x, topleft_y + y, option)
            except Exception:
                img = blank
            row.append(img)
        rows.append(np.hstack(row))
    return np.vstack(rows)
The function above stitches the tiles together row by row: it fetches each tile image, substitutes a blank tile when a request fails, and joins the results with np.hstack and np.vstack. Note that each increase of z by 1 quadruples the number of tiles, so going from z = 12 to z = 16 means lining up $4^{4} = 256$ tiles.
def get_asnaro1_image(z, x, y, option={}, params={}, query=''):
    """
    Get an ASNARO-1 tile image
    Parameters
    ----------
    z : int
        Tile zoom level
    x : int
        X coordinate of the tile
    y : int
        Y coordinate of the tile
    option : dict
        API path options (productId)
    params : dict
        Query parameters
    query : str
        Band assignment query string
    Returns
    -------
    img : ndarray
        Tile image
    """
    url = 'https://gisapi.tellusxdp.com/blend/asnaro1/{}/{}/{}/{}.png?{}'.format(option['productId'], z, x, y, query)
    headers = {
        'Authorization': 'Bearer ' + API_TOKEN  # note the space after 'Bearer'
    }
    r = requests.get(url, headers=headers, params=params)
    if not r.status_code == requests.codes.ok:
        r.raise_for_status()
    # Read the PNG into an ndarray with scikit-image's io module
    return io.imread(BytesIO(r.content))
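For example, a single true color tile could be fetched with this generic function as follows (reusing the product ID and tile coordinates from earlier):
# Example: fetch one true color tile with the generic function
tile = get_asnaro1_image(z, x, y,
                         option={'productId': ASNARO1_productId},
                         query='r=4&g=3&b=2')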
The function above acquires the 256 x 256 tile images that serve as the source images for stitching. In the following, we define a wrapper that acquires images with R = Band 4, G = Band 3, B = Band 2 for the true color composite.
# Get an ASNARO-1 satellite image (true color bands)
def rgb_asnaro1(z, x, y, option):
    # Specify the server URL
    url_ASNARO1 = 'https://gisapi.tellusxdp.com/blend/asnaro1'
    # Band assignment: R = Band 4, G = Band 3, B = Band 2
    query_ASNARO1 = "r=4&g=3&b=2"
    res_ASNARO1 = requests.get("{}/{}/{}/{}/{}.png?{}".format(
        url_ASNARO1,
        option['productId'],
        z, x, y,
        query_ASNARO1),
        headers={"Authorization": "Bearer " + API_TOKEN})
    RGB_ASNARO1 = np.asarray(Image.open(BytesIO(res_ASNARO1.content)))
    # Change the data type to unsigned 8-bit integer
    return RGB_ASNARO1.astype(np.uint8)
In the following, we acquire the near-infrared band as R = Band 6, G = Band 6, B = Band 6, to be used for the natural color composite.
# Get an ASNARO-1 satellite image (near-infrared band)
def nir_asnaro1(z, x, y, option):
    # Specify the server URL
    url_ASNARO1 = 'https://gisapi.tellusxdp.com/blend/asnaro1'
    # Band assignment: R = G = B = Band 6 (NIR)
    query_ASNARO1 = "r=6&g=6&b=6"
    res_ASNARO1 = requests.get("{}/{}/{}/{}/{}.png?{}".format(
        url_ASNARO1,
        option['productId'],
        z, x, y,
        query_ASNARO1),
        headers={"Authorization": "Bearer " + API_TOKEN})
    RGB_ASNARO1 = np.asarray(Image.open(BytesIO(res_ASNARO1.content)))
    # Change the data type to unsigned 8-bit integer
    return RGB_ASNARO1.astype(np.uint8)
Below, the Top Left X and Top Left Y tile coordinates are calculated from the center coordinates. The +1 is a correction for the coordinates going back too far when half the size is subtracted from the center tile coordinates.
z = 16
size_x = 16
size_y = 16
x, y = LatLon2Tile(center_lat,center_lon, z)
topleft_x = int(x - size_x / 2 + 1)
topleft_y = int(y - size_y / 2 + 1)
option = {
'productId': ASNARO1_productId
}
query_ASNARO1 = "r=4&g=3&b=2"
Pass the above parameters to the function that stitches the tiles for the true color composite.
combined = get_combined_image(rgb_asnaro1, z, topleft_x, topleft_y, size_x, size_y, option)
Likewise, pass them to the function that stitches the near-infrared tiles for the natural color composite.
combined_nir = get_combined_image(nir_asnaro1, z, topleft_x, topleft_y, size_x, size_y, option)
The stitched images are then stored in arrays for each band.
# Get ASNARO-1 single-band images (red, green, blue, near-infrared)
# Extract single-band two-dimensional arrays for red, green, and blue
COMR_ASNARO1 = combined[:,:,0].astype(np.uint8)
COMG_ASNARO1 = combined[:,:,1].astype(np.uint8)
COMB_ASNARO1 = combined[:,:,2].astype(np.uint8)
#NumPy two-dimensional array of near-infrared single-band images
COMNIR_ASNARO1 = combined_nir[:,:,0].astype(np.uint8)
First, the true color composite.
# True color composite
#R: Red band, G: Green band, B: Blue band
# ASNARO-1
true_COMASNARO1 = np.dstack((COMR_ASNARO1, COMG_ASNARO1, COMB_ASNARO1))
# Display the ASNARO-1 satellite image
plt.figure(figsize=(20,20)) #Specify display size in inches 20 x 20 inches
plt.imshow(true_COMASNARO1)
plt.show()
We obtained quite a high-definition image. In terms of pixels, it is 256 x 16 = 4096 pixels per side.
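You can confirm the stitched size directly:
# Confirm the stitched size: 16 tiles x 256 px = 4096 px per side (RGBA)
print(combined.shape)  # Expected: (4096, 4096, 4)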
Next, the natural color composite.
# Natural color composite
#R: Red band, G: Near infrared band, B: Green band
# ASNARO-1
natural_COMASNARO1 = np.dstack((COMR_ASNARO1, COMNIR_ASNARO1, COMG_ASNARO1))
# Display the ASNARO-1 satellite image
plt.figure(figsize=(20,20)) #Specify display size in inches 20 x 20 inches
plt.imshow(natural_COMASNARO1)
plt.show()
Here too we obtained a fairly high-definition image. Finally, let's calculate NDVI from the stitched bands as well.
# NDVI
#Change data type from uint8 to float32
COMNIR_ASNARO1 = COMNIR_ASNARO1.astype(np.float32)
COMR_ASNARO1 = COMR_ASNARO1.astype(np.float32)
#Calculate NDVI
COMNDVI_ASNARO1 = (COMNIR_ASNARO1 - COMR_ASNARO1) / (COMNIR_ASNARO1 + COMR_ASNARO1)
plt.figure(figsize=(20,20)) # Display size: 20 x 20 inches
plt.imshow(COMNDVI_ASNARO1, cmap='Reds')
plt.colorbar()
plt.show()
A fairly high-definition image is produced for the NDVI as well.
This article has mainly covered how to use satellite data, but when it comes to images, you may well want to try image recognition. I won't go into it in this article, but of course image recognition is also possible on Tellus. The results of detecting moored vessels using a machine learning model [^3] are as follows.
Extracting detections with a confidence of 0.8 or higher, we found a total of 326 moored vessels. Vessels moored in the river toward the land side were also detected. Since vessels under way were excluded from the detection targets, you can see that sailing vessels are not detected.
Reference: Sorabatake, a news site about how to use Tellus and the space business
The opinions expressed on this site and in the corresponding comments are the personal opinions of the contributor, not those of Cisco. The content of this site is provided for informational purposes only and is not an endorsement or representation by Cisco or any other party. By posting on this website, you agree that you are solely responsible for the content of all information you upload by posting, linking, or otherwise, and you release Cisco from any liability regarding the use of this website.
[^1]: Depends on satellite data to be onboarded in the future
[^2]: High frequency is achievable with methods such as satellite constellations