I automatically collected my Qiita post data with a Raspberry Pi and sent regular notifications to Slack

Introduction

I've posted a few articles recently and wanted to track data such as the total number of views, so I set up an environment on a Raspberry Pi. The goal this time is to collect the data regularly on the Raspberry Pi, generate a graph once a day, and send it to Slack automatically.

Environment

The Mac I mainly develop on is as follows.

$sw_vers
ProductName:	Mac OS X
ProductVersion:	10.13.6
BuildVersion:	17G9016

The Raspberry Pi used is a Raspberry Pi 3 Model B. The OS is as follows.

$ cat /etc/os-release 
PRETTY_NAME="Raspbian GNU/Linux 10 (buster)"
NAME="Raspbian GNU/Linux"
VERSION_ID="10"
VERSION="10 (buster)"
VERSION_CODENAME=buster
ID=raspbian
ID_LIKE=debian
HOME_URL="http://www.raspbian.org/"
SUPPORT_URL="http://www.raspbian.org/RaspbianForums"
BUG_REPORT_URL="http://www.raspbian.org/RaspbianBugs"

Build

I hadn't used the Raspberry Pi for a while, and when I tried it again I couldn't log in; when I then tried to reset the password, the OS stopped booting, so I started by reinstalling the OS. After that, I install the necessary libraries and tools and set up MySQL (MariaDB) to store the Qiita data. Finally, I create a process that periodically sends the results to Slack.

Raspberry Pi OS reinstallation

I reinstalled the OS by following the steps described here [^1], after which it booted normally. The procedure is exactly as written there, so I will omit it.

There was a problem where `apt update` failed with an error. After some trial and error, as described here [^2], I ran

sudo dhclient eth0

and this solved the problem. After that, execute the following commands.

sudo apt-get update
sudo apt-get upgrade

DB setup

Next, build the database that stores data such as the Qiita view counts. This time I will use MySQL. However, when I tried to install MySQL:

$ sudo apt-get install mysql-server
Reading package lists... Done
Building dependency tree
Reading state information... Done
Package mysql-server is not available, but is referred to by another package.
This may mean that the package is missing, has been obsoleted, or
is only available from another source
However the following packages replace it:
  mariadb-server-10.0

E: Package 'mysql-server' has no installation candidate

The installation failed with the above error. However, the message says that mariadb-server replaces mysql-server, and indeed I was able to install it with the following command.

sudo apt-get install mariadb-client mariadb-server

Since I plan to store Japanese text in the data, I set the character encoding. Referring to this [^3], I modified /etc/mysql/conf.d/mysql.cnf as follows.

/etc/mysql/conf.d/mysql.cnf


[mysql]
default-character-set=utf8mb4
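
Note that the [mysql] section only configures the command-line client. If you also want the server itself to default to utf8mb4 (an optional addition, not something I strictly needed here), a [mysqld] section along the following lines should work:

[mysqld]
character-set-server=utf8mb4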

Restart MySQL and configure the user and database settings.

$ sudo service mysql restart
$ sudo mysql -u root -p
Enter password: 
Welcome to the MariaDB monitor.  Commands end with ; or \g.
Your MariaDB connection id is 60
Server version: 10.3.17-MariaDB-0+deb10u1 Raspbian 10

Copyright (c) 2000, 2018, Oracle, MariaDB Corporation Ab and others.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

MariaDB [(none)]> CREATE DATABASE myqiita;
MariaDB [(none)]> CREATE USER 'pi'@'localhost' IDENTIFIED BY '<password>';
MariaDB [(none)]> GRANT ALL PRIVILEGES ON myqiita.* to 'pi'@'localhost';
MariaDB [(none)]> FLUSH PRIVILEGES;
MariaDB [(none)]> show databases;
+--------------------+
| Database           |
+--------------------+
| information_schema |
| myqiita            |
+--------------------+
2 rows in set (0.002 sec)

MariaDB [(none)]> quit
Bye

You are now ready to log in as user pi and store your data in the database myqiita.
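
To double-check that the character-set setting took effect, you can log in as pi and inspect the session variables:

$ mysql -u pi -p myqiita
MariaDB [myqiita]> SHOW VARIABLES LIKE 'character_set%';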

Creating the data acquisition scripts

First, write the code that handles connections to the database set up above.

myconnection.py


import MySQLdb
import pandas.io.sql as psql

class myDB():
    """Thin wrapper around a MySQLdb connection to the myqiita database."""

    def __init__(self):
        self.connection = MySQLdb.connect(
            host='localhost',
            user='pi',
            passwd='<password>',
            charset='utf8mb4',  # match the utf8mb4 setting configured above
            db='myqiita')
        self.cursor = self.connection.cursor()

    def exe(self, sql, params=None):
        # Execute a statement; pass params to use a parameterized query.
        self.cursor.execute(sql, params)
        return self.cursor.fetchall()

    def exedf(self, sql):
        # Run a query and return the result as a pandas DataFrame.
        return psql.read_sql(sql, self.connection)

    def commit(self):
        self.connection.commit()

    def close(self):
        self.connection.close()

The part written as <password> is the password that was set when creating the user pi in MySQL.
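
Hardcoding the password works for a private setup, but as a sketch of an alternative, the credentials could be read from a separate INI file instead (the file name my_db.ini and its layout here are hypothetical):

import configparser

def load_db_config(path='my_db.ini'):
    # Expected (hypothetical) file contents:
    # [mysql]
    # user = pi
    # passwd = <password>
    # db = myqiita
    config = configparser.ConfigParser()
    config.read(path)
    return dict(config['mysql'])

The returned dictionary could then be unpacked into MySQLdb.connect(**...) together with the host and charset arguments.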

Run the following program to create the table mytable that stores the data. Admittedly, the type specifications for each column are somewhat rough.

create.py


import MySQLdb
from myconnection import myDB

if __name__ == '__main__':

    mydb = myDB()

    # One row per run: the totals plus the top article by views and by
    # likes at that point in time.
    sql = """
    create table mytable(
    id int(11) AUTO_INCREMENT NOT NULL,
    create_datetime datetime not null default current_timestamp,
    tot_num_articles int(11) not null,
    tot_num_views int(11) not null,
    tot_num_likes int(11) not null,
    max_num_views int(11) not null,
    max_num_views_title varchar(100) not null,
    max_num_likes int(11) not null,
    max_num_likes_title varchar(100) not null,
    PRIMARY KEY (id))
    """
    print( sql )
    mydb.exe( sql )
    mydb.close()
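
To confirm the table was created as intended, it can be inspected from the MariaDB client:

$ mysql -u pi -p myqiita
MariaDB [myqiita]> DESC mytable;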

Next, create the code to get the data from Qiita and store it in the database.

insert.py


import requests
import datetime
import MySQLdb
from myconnection import myDB

def collection():

    dt_now = datetime.datetime.now()

    # Qiita API v2 expects a Bearer token in the Authorization header.
    url = 'https://qiita.com/api/v2/authenticated_user/items'
    headers = {"content-type": "application/json",
               "Authorization": "Bearer <access token>"}
    # The list endpoint is paginated (20 items by default); per_page
    # raises that to 100. With more articles, loop over pages.
    params = {'page': 1, 'per_page': 100}

    res = requests.get(url, headers=headers, params=params)
    article_list = res.json()

    tot_num_articles = len(article_list)
    tot_num_views = 0
    tot_num_likes = 0
    max_num_views = 0
    max_num_likes = 0
    max_num_views_title = ''
    max_num_likes_title = ''

    # Fetch each article individually to get its page_views_count.
    for item in article_list:
        item_id = item['id']
        title = item['title']
        likes_count = item['likes_count']

        url = 'https://qiita.com/api/v2/items/' + item_id
        res = requests.get(url, headers=headers)
        item_detail = res.json()
        page_views_count = item_detail['page_views_count']

        tot_num_views += page_views_count
        tot_num_likes += likes_count

        if max_num_views < page_views_count:
            max_num_views = page_views_count
            max_num_views_title = title

        if max_num_likes < likes_count:
            max_num_likes = likes_count
            max_num_likes_title = title

    data = {'dt': dt_now,
            'tot_num_articles': tot_num_articles,
            'tot_num_views': tot_num_views,
            'tot_num_likes': tot_num_likes,
            'max_num_views': max_num_views,
            'max_num_views_title': max_num_views_title,
            'max_num_likes': max_num_likes,
            'max_num_likes_title': max_num_likes_title}
    return data


if __name__ == '__main__':

    mydb = myDB()

    data = collection()

    # Parameterized query: titles containing quotes cannot break the SQL.
    sql = """
    insert into mytable
    (tot_num_articles,tot_num_views,tot_num_likes,max_num_views,max_num_views_title,max_num_likes,max_num_likes_title)
    values(%s,%s,%s,%s,%s,%s,%s)
    """
    print(sql)
    mydb.exe(sql, (data['tot_num_articles'],
                   data['tot_num_views'],
                   data['tot_num_likes'],
                   data['max_num_views'],
                   data['max_num_views_title'],
                   data['max_num_likes'],
                   data['max_num_likes_title']))
    mydb.commit()
    mydb.close()

The data is obtained by hitting Qiita's API in collection(). For how to obtain an access token and retrieve the data, refer to here [^4].
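
As a quick sanity check, the same endpoint can also be called directly, for example with curl (replace <access token> with your own token):

curl -H "Authorization: Bearer <access token>" https://qiita.com/api/v2/authenticated_user/items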

In the main block, the data obtained by collection() is inserted into the table mytable and committed.
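
Before registering the script with cron, it is worth running it once by hand and confirming that a row actually arrives:

$ python3 insert.py
$ mysql -u pi -p myqiita -e "SELECT id, create_datetime, tot_num_articles FROM mytable;"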

Periodic execution with cron

Register the insert.py created earlier in cron for regular execution.

sudo vim /etc/cron.d/cron_test 

By creating a file called cron_test and writing the following, insert.py will be executed at minute 00 of every hour.

00 *	* * *	pi	/usr/bin/python3 /home/pi/work/insert.py   

Restart cron.

sudo service cron restart

With the above, we have reached the point where we can acquire data on a regular basis.
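
If no rows show up, cron's log is the first place to look; on Raspbian, cron jobs are logged to syslog:

grep CRON /var/log/syslog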

Sending to Slack

Finally, set things up so that the data is graphed once a day and sent to Slack. The program is as follows.

slack_report.py


from slacker import Slacker
import monitor

def main():
    # Create the graph; monitor.report() saves it as out_monitor.png.
    monitor.report()

    # Upload the PNG to the Slack channel using the API token.
    slack = Slacker("<token>")
    slack.files.upload(file_='out_monitor.png', channels='myqiita')

if __name__ == "__main__":
    main()

The call to monitor.report() runs a separate program, monitor.py, which reads the data from the database, creates a graph, and saves it as out_monitor.png. This PNG file is then sent to my channel in Slack. For the sending part, I referred to here [^5].

The program contents of monitor.py are as follows.

monitor.py


import MySQLdb
from myconnection import myDB
import pandas as pd
import matplotlib.pyplot as plt

def report():
    mydb = myDB()

    sql = """
    select * from mytable
    """
    df = mydb.exedf( sql )

    # Preprocess
    df['create_datetime'] = pd.to_datetime(df['create_datetime'])
    df = df.set_index('create_datetime')

    # Plot
    fig = plt.figure()
    ax1 = fig.add_subplot(311)
    df['tot_num_articles'].plot(ax=ax1)
    ax1.set_xticklabels('')
    ax1.set_xlabel('')
    ax1.set_ylabel('tot_num_articles')
    ax1.grid()
    ax2 = fig.add_subplot(312)
    df['tot_num_views'].plot(ax=ax2)
    ax2.set_xticklabels('')
    ax2.set_xlabel('')
    ax2.set_ylabel('tot_num_views')
    ax2.grid()
    ax3 = fig.add_subplot(313)
    df['tot_num_likes'].plot(ax=ax3)
    ax3.set_ylabel('tot_num_likes')
    ax3.grid()
    fig.savefig('out_monitor.png')

if __name__ == '__main__':
    report()

All the data stored in the table mytable is retrieved, converted into a pandas DataFrame, and then graphed. Refer to here [^6] for the conversion to a DataFrame. The graphed items are the number of articles, the total number of views, and the total number of likes.
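
Since monitor.py has its own entry point, the graphing can also be tested in isolation before involving Slack; the resulting out_monitor.png can then be checked by eye:

$ python3 monitor.py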

Register the program in cron so that slack_report.py also runs regularly. The full contents of cron_test are shown below.

/etc/cron.d/cron_test


# /etc/crontab: system-wide crontab
# Unlike any other crontab you don't have to run the `crontab'
# command to install the new version when you edit this file
# and files in /etc/cron.d. These files also have username fields,
# that none of the other crontabs do.

SHELL=/bin/sh
PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin

# Example of job definition:
# .---------------- minute (0 - 59)
# |  .------------- hour (0 - 23)
# |  |  .---------- day of month (1 - 31)
# |  |  |  .------- month (1 - 12) OR jan,feb,mar,apr ...
# |  |  |  |  .---- day of week (0 - 6) (Sunday=0 or 7) OR sun,mon,tue,wed,thu,fri,sat
# |  |  |  |  |
# *  *  *  *  * user-name command to be executed
17 *	* * *	root    cd / && run-parts --report /etc/cron.hourly
25 6	* * *	root	test -x /usr/sbin/anacron || ( cd / && run-parts --report /etc/cron.daily )
47 6	* * 7	root	test -x /usr/sbin/anacron || ( cd / && run-parts --report /etc/cron.weekly )
52 6	1 * *	root	test -x /usr/sbin/anacron || ( cd / && run-parts --report /etc/cron.monthly )
00 *	* * *	pi	/usr/bin/python3 /home/pi/work/insert.py   
10 8	* * *	pi	/usr/bin/python3 /home/pi/work/slack_report.py 

Finally, restart cron.

sudo service cron restart

Result

Here is a screenshot of the notification as it arrived in Slack on my smartphone.

[Screenshot: the daily graph posted to the Slack channel]

Graphs are now sent every morning at 8:10. Alright.

In closing

I was able to send my Qiita view data to Slack on a regular basis with a Raspberry Pi. Once you can actually see the data, it is interesting to notice that how fast the view count grows differs depending on the time of day. The view and like counts are still small, though...
