It's been almost a year since I started using Fitbit, but I had barely touched the API because I was only developing clock faces and apps. This time, partly for my own research, I tried to acquire activity data in real time(?).
Writing to a database would have been fine, but since my knowledge there is still at a beginner level, I chose the quicker option of writing to a spreadsheet. (I want to study databases soon.)
The reason a "(?)" is attached to "real time" will be explained later.
Roughly, the work divides into the following steps.
Let's get started!
Application registration is required right from the start; please register and configure your app by referring to the article here. For reference, my application registration settings were as follows.
| Item | Contents entered |
|---|---|
| Application Name | HeartRate Watcher |
| Description | watch my heart rate |
| Application Website | http://localhost/ |
| Organization | personal |
| Organization Website | http://localhost/ |
| OAuth 2.0 Application Type | Personal |
| Callback URL | http://127.0.0.1:8080/ |
| Default Access Type | Read-Only |
get_hr.py

```python
import fitbit
from ast import literal_eval

CLIENT_ID = "XXXXXX"
CLIENT_SECRET = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
TOKEN_FILE = "token.txt"  # create token.txt in the same directory

tokens = open(TOKEN_FILE).read()
token_dict = literal_eval(tokens)
access_token = token_dict['access_token']
refresh_token = token_dict['refresh_token']

def updateToken(token):
    # Called by the client whenever the access token is refreshed
    with open(TOKEN_FILE, 'w') as f:
        f.write(str(token))

authed_client = fitbit.Fitbit(CLIENT_ID, CLIENT_SECRET,
                              access_token=access_token,
                              refresh_token=refresh_token,
                              refresh_cb=updateToken)
```
Once `authed_client` is authorized, you can retrieve the data however you like.
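For reference, `token.txt` is just the Python dict literal that `updateToken` writes out with `str()`, read back with `literal_eval`. A minimal sketch with placeholder values (the exact keys beyond `access_token`/`refresh_token` depend on the OAuth response, so `expires_at` here is purely illustrative):

```python
from ast import literal_eval

# Hypothetical token.txt contents (placeholder values)
sample = "{'access_token': 'aaaa', 'refresh_token': 'bbbb', 'expires_at': 1600000000.0}"

token_dict = literal_eval(sample)   # parse the dict literal back
print(token_dict['refresh_token'])  # bbbb
```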
This is the end of STEP1.
Since we will write to the spreadsheet from Python, we also need to register an API here.
STEP2 is also explained very thoroughly here, so configure it while following along.
get_hr.py

```python
import gspread
import json
from oauth2client.service_account import ServiceAccountCredentials

# Authorize with the service account key file downloaded during setup
scope = ['https://spreadsheets.google.com/feeds', 'https://www.googleapis.com/auth/drive']
credentials = ServiceAccountCredentials.from_json_keyfile_name('************.json', scope)
gc = gspread.authorize(credentials)

SPREADSHEET_KEY = '******************'
worksheet = gc.open_by_key(SPREADSHEET_KEY).sheet1
```
Once you have registered the above information, you can read and write as much as you like!
…is what I'd like to say, but there is a **limit of 100 requests per 100 seconds**, which is surprisingly easy to hit. Each cell read or update counts as one request, so try to reuse values where you can and avoid unnecessary writes. (I'm not sure I'm doing this well myself.)
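One way to stay under the quota is to send many rows per request instead of one. Recent versions of gspread provide `append_rows`, which writes a list of rows in a single call; a small batching helper could look like this (the batch size of 50 is my own arbitrary choice):

```python
def chunk_rows(rows, batch_size):
    # Split rows into batches so each API call carries many rows,
    # turning len(rows) single-row requests into a handful of batch requests.
    return [rows[i:i + batch_size] for i in range(0, len(rows), batch_size)]

batches = chunk_rows([[i, 'x'] for i in range(180)], 50)
print(len(batches))  # 4

# Hypothetical usage against a worksheet (not run here):
# for batch in batches:
#     worksheet.append_rows(batch)  # one request per batch
```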
This is the end of STEP2.
After registering various information, it is finally time to get the data.
```python
data_sec = authed_client.intraday_time_series('activities/heart', 'YYYY-MM-DD', detail_level='1sec')
heart_sec = data_sec["activities-heart-intraday"]["dataset"]
```
Set the date of the data you want to acquire with `YYYY-MM-DD`, and the data interval with `detail_level`. `detail_level` can be one of `1sec`, `1min`, or `15min`.
`heart_sec` contains the heart rate records in the form `{'time': 'hh:mm:ss', 'value': **}`.
You could leave it as is, but I wanted to include the date information, so I reshaped it slightly as follows before output.
```python
def get_heartrate_data(date):
    # Fetch one day of intraday heart rate and prepend the date to each record
    data_sec = authed_client.intraday_time_series('activities/heart', date, detail_level='1sec')
    heart_sec = data_sec["activities-heart-intraday"]["dataset"]
    for data in heart_sec:
        data['datetime'] = date + " " + data['time']
        del data['time']
    return heart_sec
```
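To see what the reshaping does, here is the same logic applied to a made-up sample record (the time and value are placeholders):

```python
def add_date(date, records):
    # Same reshaping as in get_heartrate_data, on local sample data
    for data in records:
        data['datetime'] = date + " " + data['time']
        del data['time']
    return records

sample = [{'time': '07:00:05', 'value': 62}]
print(add_date('2020-01-01', sample))
# [{'value': 62, 'datetime': '2020-01-01 07:00:05'}]
```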
After this the data gets thrown to the sheet, so I keep it in this JSON-like dict format.
This is the end of STEP3.
Others have carefully summarized the acquisition methods, so if you want to try data other than heart rate, please refer to here.
The last step is to write to the sheet.
Build a list of the values you want to record and send them one row at a time. `append_row()` writes to the row below the last filled cell, so the positioning can be left to it.
```python
headers = ['value', 'datetime']  # keys to write, in column order

def set_data_to_sheet(datas):
    for data in datas:
        row = []
        for header in headers:
            row.append(data[header])
        worksheet.append_row(row)
```
After completing the above preparations, create a job () function for periodic execution.
```python
import datetime  # needed for today's date

def get_latest_data(data):
    # Keep only the most recent 30 records
    return data[-30:]

def job():
    now = datetime.datetime.now()
    date = now.strftime('%Y-%m-%d')
    data = get_heartrate_data(date)
    latest_data = get_latest_data(data)
    set_data_to_sheet(latest_data)
```
`job()` simply fetches today's data, extracts the latest 30 records with `get_latest_data`, and writes them to the sheet.
Then, to realize **real-time** acquisition, this is executed periodically (by cron or similar) once every 30 seconds. Since the Fitbit API limit is 150 requests per hour, this leaves a bit of margin.
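Note that standard cron only fires at one-minute granularity, so a 30-second cycle needs either two offset crontab entries (one prefixed with `sleep 30`) or a small scheduler loop. A minimal sketch of such a loop follows; `run_periodically` and the iteration cap are my own illustrative additions, and in real use you would pass `job` with a much larger (or unbounded) count:

```python
import time

def run_periodically(task, interval_sec, iterations):
    # Call task every interval_sec seconds, compensating for
    # however long task itself takes to run.
    for _ in range(iterations):
        start = time.monotonic()
        task()
        elapsed = time.monotonic() - start
        time.sleep(max(0.0, interval_sec - elapsed))

# Demonstration with a trivial task and a tiny interval:
calls = []
run_periodically(lambda: calls.append(1), 0.01, 3)
print(len(calls))  # 3
```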
**Real-time heart rate acquired!!**
…is what I wanted to say, but it didn't go well; the result was as follows. (excerpt of one run)
I expected updates once every 30 seconds, but the latest data lagged by about 15 minutes. What matters here is the specification for data upload: a Fitbit device cannot connect to the cloud directly, so it has to communicate via your smartphone, and the device and phone only sync at roughly 15-minute intervals. That appears to be the cause of this problem. (All-day sync exists, but it is a trade-off against battery life.)
Real-time acquisition at 30-second intervals via the API therefore seems difficult for now, so I changed the approach as follows.
To organize the situation:

- The latest data only arrives about once every 15 minutes
- Looking at `heart_sec`, the heart rate is recorded about once every 5 seconds (the setting requests every 1 second, but the actual frequency is what it is)
- That is, about 12 records per minute

So by the time the latest data is uploaded, about 12 × 15 = 180 records have accumulated.
Then just run cron every 15 minutes and write the 180 records!
…I thought, but here I hit the **100 requests per 100 seconds** limit.
Writing 180 rows in one burst ran straight into the limit (of course). So I got past it with the brute-force technique of sleeping for 1 second after each write. (It takes a long time.) (Throwing the data into a database would be overwhelmingly smarter.)
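The throttled writer can be sketched as follows. The `append_row` callable is injected as a parameter here purely so the logic can be shown without a live worksheet; in practice you would pass `worksheet.append_row` and keep `delay_sec` at 1.0:

```python
import time

HEADERS = ['value', 'datetime']  # same column order as before

def set_data_to_sheet_throttled(datas, append_row, delay_sec=1.0):
    # One row per record, sleeping between writes to stay under
    # the 100-requests-per-100-seconds quota.
    for data in datas:
        append_row([data[h] for h in HEADERS])
        time.sleep(delay_sec)

# Stub usage (no API calls; delay disabled for the demo):
written = []
set_data_to_sheet_throttled(
    [{'value': 62, 'datetime': '2020-01-01 07:00:05'}],
    written.append,
    delay_sec=0.0,
)
print(written)  # [[62, '2020-01-01 07:00:05']]
```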
In the end, instant fetch-and-write wasn't possible, and the system became the rather clunky "once every 15 minutes, write the 180 not-yet-uploaded records to the sheet over about 180 seconds."
Still, since the data accumulates on the sheet by itself, it should be usable for some kind of analysis.
Once the data is on the sheet, it can be visualized immediately. In fact, you can plot it even more easily by converting it to a DataFrame without writing it out at all, so I'll omit the details here.
As an example, the following is a plot of one day's heart rate over 24 hours. You can see that the heart rate is low from midnight to early morning, when sleep is deep, and gradually rises after waking.
With proper management in a database, cooler and more dynamic graphs would be possible; that is a task for the future.
That completes the whole series of steps: setting up the various APIs, acquiring the Fitbit data, writing it to a sheet, and visualizing it.
It isn't exactly real time, but I'd like to build a smarter visualization system and some interesting apps that can live with 15-minute intervals.
Besides heart rate, you can also record calories, steps, sleep, and more, so whether you're already a Fitbit user or not, please give it a try!