I want to use PyTorch to generate something like the lyrics of Japari Park

Introduction

The other day, development of the deep learning library "Chainer", provided by Preferred Networks (PFN), came to an end. My laboratory is split into a TensorFlow camp and a Chainer camp that were constantly one-upping each other, but the Chainer camp was wiped out the moment development ended (so it goes).

I'm sad to see Chainer go, since I used it constantly, and at the same time I'm full of gratitude for such an easy-to-use framework (I was a believer).

Well, that same PFN is apparently moving over to PyTorch development. Moreover, I hear PyTorch itself resembles Chainer in many ways. Also, my research has mainly used CNNs and GANs, and I never touched RNNs...

** "This is a good opportunity to learn how to use PyTroch and how RNNs work ???" **

So I ~~swiped~~ borrowed the book "Deep Learning from Scratch ②" from a labmate, read the RNN chapters, and implemented them as-is in PyTorch. This article is a memo of that.

Well, to be honest, I wrote this article in the spirit of **"I don't understand RNNs"**, **"I don't know how to use PyTorch"**, and **"wise people, please teach me"**.

The source code is posted on GitHub. ~~By the way, I have never seen "Kemono Friends".~~ I watched exactly one episode.

Content of this article

- Learn the lyrics of Japari Park using PyTorch's LSTM
- Automatically generate Japari Park-like lyrics
- Implement Truncated BPTT in PyTorch

What is RNN or LSTM?

RNN is an abbreviation for Recurrent Neural Network, a neural network mainly for time-series data. Unlike an ordinary layer, an RNN layer has a loop that feeds its own output back in as input, which lets it retain hidden information for each time step. LSTM is a derivative of the RNN that adds the concept of gates to this loop structure.
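To make the loop concrete, here is a minimal sketch (my own, using PyTorch's built-in RNNCell with made-up sizes) of how the hidden state is fed back in at every step:

import torch
import torch.nn as nn

input_size, hidden_size, seq_len = 8, 4, 5
# RNNCell computes h' = tanh(W_ih x + b_ih + W_hh h + b_hh)
cell = nn.RNNCell(input_size, hidden_size)

x = torch.randn(seq_len, 1, input_size)  # (time, batch, features)
h = torch.zeros(1, hidden_size)          # initial hidden state
for t in range(seq_len):
    h = cell(x[t], h)                    # the output loops back as the next step's state
print(h.shape)  # torch.Size([1, 4])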

For a more detailed explanation, see here or here (links in the original) rather than my explanation; they are much easier to understand.

Creating a dataset

A list of the words appearing in text data, written out so that no word is duplicated, is called a "corpus". The words in the corpus are given IDs so the text can be handled as a list of IDs. This time, though, splitting the text into words was too much trouble (I didn't feel like using ~~MeCab~~), so I assigned an ID to each character that appears in the lyrics instead, adding '^' to mark the beginning of the lyrics and '_' to mark the end.

Below is the ID list and the corresponding characters.

{'^': 0, 'W': 1, 'e': 2, 'l': 3, 'c': 4, 'o': 5, 'm': 6, ' ': 7, 't': 8, 'Yo': 9, 'U': 10, 'This': 11, 'So': 12, 'The': 13, 'Turbocharger': 14, 'Pacific League': 15, 'Li': 16, '-': 17, 'Ku': 18, '!': 19, 'now': 20, 'Day': 21, 'Also': 22, 'Do': 23, 'Tsu': 24, 'Ta': 25, 'Down': 26, 'Ba': 27, 'Big': 28, 'Noisy': 29, 'Gi': 30, 'But': 31, 'Mm': 32, 'High': 33, 'Et al.': 34, 'Or': 35, 'To': 36, 'Lol': 37, 'I': 38, 'e': 39, 'If': 40, 'Fu': 41, 'Re': 42, 'Zu': 43, 'Bustle': 44, '嘩': 45, 'Shi': 46, 'hand': 47, 'Su': 48, 'Tsu': 49, 'Chi': 50, 'Ya': 51, 'Me': 52, 'Naka': 53, 'Good': 54, 'Ke': 55, 'of': 56, 'Is': 57, 'Stay': 58, 'Nana': 59, 'Book': 60, 'This': 61, 'Love': 62, 'Ah': 63, 'Ru': 64, 'Ho': 65, 'You': 66, 'hand': 67, 'To': 68, 'Tsu': 69, 'so': 70, 'Opening': 71, 'Rugged': 72, '(': 73, 'Wow': 74, '・': 75, 'Tsu': 76, 'Su': 77, ')': 78, 'Appearance': 79, 'Ta': 80, 'Ten': 81, 'Man': 82, 'color': 83, 'Is': 84, 'Fascination': 85, 'Re': 86, 'Go': 87, 'evening': 88, 'Living': 89, 'Sky': 90, 'finger': 91, 'When': 92, 'Heavy': 93, 'Ne': 94, 'Ji': 95, 'Ma': 96, 'Knowledge': 97, 'Ri': 98, 'Shake': 99, 'Mukai': 100, '\u3000': 101, 'To': 102, 'La': 103, 'Bu': 104, 'Le': 105, 'Hmm': 106, 'Pu': 107, 'Eye': 108, 'You see': 109, 'Mi': 110, 'Self': 111, 'Yu': 112, 'Raw': 113, 'Ki': 114, 'Decoration': 115, 'Ku': 116, 'length': 117, 'husband': 118, 'Do': 119, 'I'm sorry': 120, '♡': 121, 'N': 122, 'i': 123, 'y': 124, 'u': 125, 'Ro': 126, 'Yu': 127, 'face': 128, 'Wait': 129, 'Open': 130, 'door': 131, 'Ge': 132, 'dream': 133, 'Pa': 134, 'word': 135, 'Continued': 136, 'Gu': 137, 'I': 138, 'D': 139, 'I': 140, 'Service': 141, 'Ki': 142, 'Stubborn': 143, 'Zhang': 144, 'A': 145, 'Sa': 146, 'O': 147, 'Se': 148, 'line': 149, 'Wow': 150, 'Up': 151, 'one': 152, 'Suspended': 153, 'life': 154, 'B': 155, 'Ze': 156, 'Eh': 157, '、': 158, 'Turn': 159, '?': 160, 'O': 161, 'h': 162, 'east': 163, 'What': 164, 'Barking': 165, 'West': 166, 'world': 167, 'Kingdom': 168, 'During ~': 169, 'sound': 170, 'A': 171, 'Me': 172, 'De': 173, 'I': 174, ',': 175, 'Collection': 176, 'friend': 177, 'Tatsu': 178, 'Elementary': 179, 'enemy': 180, 'Journey': 181, 'Standing': 182, '_': 183}

For some reason the full-width space is registered as '\u3000', but I'll ignore that for now.
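For reference, a minimal sketch (not the author's exact preprocessing script; the lyrics string here is a placeholder) of how such a character-level corpus could be built:

import numpy as np

lyrics = "^Welcome to ようこそジャパリパーク_"  # placeholder text, '^' = start, '_' = end

# Assign a new ID to each character the first time it appears
char_to_id = {}
for ch in lyrics:
    if ch not in char_to_id:
        char_to_id[ch] = len(char_to_id)
id_to_char = {i: ch for ch, i in char_to_id.items()}

# The corpus is then the lyrics as a sequence of IDs
corpus = np.array([char_to_id[ch] for ch in lyrics])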

Network configuration

This is the source code of the network class implemented in PyTorch.

import torch.nn as nn

class LSTM_net(nn.Module):
    def __init__(self, corpus_max):
        super(LSTM_net, self).__init__()
        # ID -> dense vector lookup
        self.embed = nn.Embedding(corpus_max, int(corpus_max/3))
        # batch_first=True: input/output shaped (batch, seq_len, features)
        self.lstm = nn.LSTM(input_size=int(corpus_max/3), hidden_size=int(corpus_max/6), batch_first=True)
        self.out = nn.Linear(int(corpus_max/6), corpus_max)
        self.hidden_cell = None  # (hidden, cell) carried across calls

    def init_hidden_cell(self):
        self.hidden_cell = None

    def repackage_hidden(self, hidden_cell):
        # Keep the values but cut the computation graph (for Truncated BPTT)
        self.hidden_cell = (hidden_cell[0].detach(), hidden_cell[1].detach())

    def forward(self, x, t_bptt=False):
        h = self.embed(x)
        all_time_hidden, hidden_cell = self.lstm(h, self.hidden_cell)
        if t_bptt:
            self.repackage_hidden(hidden_cell)
        # Use only the hidden state at the last time step
        out = self.out(all_time_hidden[:, -1, :])

        return out

Written out like this, the style really does feel like Chainer. A little.

Embedding layer

If my understanding is correct, the Embedding layer simply extracts the row of its weight matrix corresponding to an ID. This means there is no need to convert the ID to a one-hot vector and multiply it by the weights of a fully connected layer, which greatly reduces the amount of computation...

In PyTorch, the Embedding layer can be created like this (maybe):

import torch.nn as nn
nn.Embedding(total number of IDs, dimension of the extracted vectors)
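As a small sanity check (my own, not from the original article), an Embedding lookup gives the same result as multiplying a one-hot vector by the weight matrix:

import torch
import torch.nn as nn

emb = nn.Embedding(10, 3)          # 10 IDs, 3-dimensional vectors
ids = torch.tensor([2, 7])

one_hot = torch.eye(10)[ids]       # (2, 10) one-hot rows
by_matmul = one_hot @ emb.weight   # (2, 3) via full matrix multiplication
by_lookup = emb(ids)               # (2, 3), just picks rows of emb.weight

print(torch.allclose(by_matmul, by_lookup))  # True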

About PyTorch's LSTM

According to here (the docs), the input tensor to PyTorch's LSTM is three-dimensional: (length of the input sequence, batch size, vector size). Passing batch_first=True when creating the instance changes this to (batch size, sequence length, vector size). As return values, it gives the outputs for every time step of the input sequence, plus a tuple of the states at the final time step.

I can't say it well in words, but it looks like the figure below.

(Figure: foward_rnn.png)

Therefore, `all_time_hidden[:, -1, :]` in the program and the hidden state inside `hidden_cell` are effectively the same thing.
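A small sketch (with made-up sizes) confirming the shapes and the claim above:

import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=6, hidden_size=4, batch_first=True)
x = torch.randn(2, 5, 6)      # (batch, seq_len, features)
all_time_hidden, (h_n, c_n) = lstm(x)

print(all_time_hidden.shape)  # torch.Size([2, 5, 4]): output at every time step
print(h_n.shape)              # torch.Size([1, 2, 4]): (layers, batch, hidden)
# The last time step of the full output equals the final hidden state
print(torch.allclose(all_time_hidden[:, -1, :], h_n[0]))  # True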

Truncated BPTT

In RNN backpropagation, the network is unrolled along the time axis, and backprop follows that unrolled computation graph just as in an ordinary neural network.

This is called Back Propagation Through Time (BPTT). (Figure: rnn.png)

But there is a problem here: the network gets deeper in proportion to the length of the time series. In deep learning, the deeper the network, the more it suffers from vanishing and exploding gradients, and the more computational resources it demands.

Therefore, when dealing with long time-series data, the idea is to truncate the backpropagation connections at an appropriate length; this is Truncated BPTT. (Figure: truncate.png) One thing to note: the forward-propagation connections are preserved. That is, when the connection is cut between time t and t+1, the output obtained at time t during forward propagation must still be retained as the input state at time t+1.

In the source code, this corresponds to the following part.

    def repackage_hidden(self, hidden_cell):
        self.hidden_cell = (hidden_cell[0].detach(), hidden_cell[1].detach())

The obtained states are re-created as new tensors, and the computation graph is cut at the moment this function is called. .detach() plays the same role as Chainer's .data, referring only to the values (probably).

Also, since the LSTM has two states, hidden and cell, they are packed into a tuple with each as an element.
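A toy illustration (mine, not the author's) of what .detach() does to the graph:

import torch

x = torch.tensor([2.0], requires_grad=True)
h = x * 3
h_detached = h.detach()          # same values, but cut off from the graph
print(h_detached.requires_grad)  # False

y = h_detached * 5 + x           # x still connects only through the "+ x" term
y.backward()
print(x.grad)  # tensor([1.]): the path through h_detached contributes nothing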

Learning

For now, here is the portion of the source code that does the training.

# Imports assumed for this listing
import sys
import random
import numpy as np
import torch
import torch.nn as nn
import torch.optim as optim

class Util():
    def make_batch(self, corpus, seq_len, batchsize=5):
        train_data = []
        label_data = []
        for i in range(batchsize):
            # Pick a random window of the corpus; the label is the same
            # window shifted one character ahead (next-character targets)
            start = random.randint(0, len(corpus)-seq_len-1)
            train_batch = corpus[start:start+seq_len]
            label_batch = corpus[start+1:start+seq_len+1]

            train_data.append(train_batch)
            label_data.append(label_batch)

        train_data = np.array(train_data)
        label_data = np.array(label_data)
        return train_data, label_data
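For reference, a hypothetical call (assuming the corpus array built earlier):

util = Util()
train_data, label_data = util.make_batch(corpus, seq_len=8, batchsize=2)
print(train_data.shape)  # (2, 8): two random 8-character windows
print(label_data.shape)  # (2, 8): the same windows shifted by one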

class Loss_function(nn.Module):
    def __init__(self):
        super(Loss_function, self).__init__()
        self.softmax_cross_entropy = nn.CrossEntropyLoss()
        self.mean_squared_error = nn.MSELoss()
        self.softmax = nn.Softmax()

def main():
    model = LSTM_net(corpus_max=corpus.max()+1)
    opt = optim.Adam(model.parameters())
    loss_function = Loss_function()
    util = Util()

    epoch = 0
    max_seq_len = 16  # length at which backpropagation is truncated
    batch_size = 32

    while True:
        seq_len = 256  # length of the window sampled from the lyrics
        train_data, label_data = util.make_batch(corpus, seq_len, batch_size)
        train_data = torch.tensor(train_data, dtype=torch.int64)
        label_data = torch.tensor(label_data, dtype=torch.int64)

        loss_epoch = 0
        count = 0
        for t in range(0, seq_len, max_seq_len):
            # Process one truncated block; the hidden state carries over between blocks
            train_seq_batch = train_data[:, t:t+max_seq_len]
            label_seq_batch = label_data[:, t:t+max_seq_len]
            out = model(x=train_seq_batch, t_bptt=True)
            loss = loss_function.softmax_cross_entropy(out, label_seq_batch[:, -1])
            opt.zero_grad()
            loss.backward()
            opt.step()
            loss_epoch += loss.data.numpy()
            count += 1
        loss_epoch /= count
        epoch += 1
        sys.stdout.write("\r|| epoch : " + str(epoch) + " || loss : " + str(loss_epoch) + " ||")

        # Reset the hidden state before sampling a new window
        model.init_hidden_cell()

Training policy

I want the model to learn to predict the next character from an input string of arbitrary length. Here I pick a suitable position in the lyrics data, extract 256 characters from it, and run Truncated BPTT with a block length of 16.

Forward propagation is preserved until model.init_hidden_cell() is called in the source code; backpropagation is performed sequentially within each block of length 16.

The loss is taken as the softmax cross-entropy so that the output for the last input character matches the next character (the ID that should follow it).
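One note (my addition): PyTorch's nn.CrossEntropyLoss applies log-softmax internally and takes the target as a class ID, not a one-hot vector, which is why label_seq_batch[:, -1] can be passed in directly:

import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()
logits = torch.randn(4, 184)             # (batch, corpus size): raw scores from self.out
targets = torch.tensor([3, 17, 0, 42])   # next-character IDs, not one-hot
loss = criterion(logits, targets)
print(loss.item())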

By the way, there is no real reason for Loss_function to be a class (sorry).

Result

For now, the source code that generates the lyrics looks like this.

# Assumes model, corpus, and the convert (char/ID mapping) object from the code above
import random
import numpy as np
import torch
import torch.nn.functional as F

index = random.randint(0, corpus.max()-1)  # random starting character ID
gen_sentence = [index]
print(convert.id2char[index], end="")
for c in range(700):
    now_input = np.array(gen_sentence)
    now_input = torch.tensor(now_input[np.newaxis], dtype=torch.int64)
    # t_bptt=True keeps the hidden state, so context carries across iterations
    out = F.softmax(model(now_input, t_bptt=True), dim=1).data.numpy()[0]
    next_index = int(np.random.choice(len(out), size=1, p=out)[0])  # random sampling
    # next_index = int(out.argmax())  # greedy (max-probability) sampling
    print(convert.id2char[next_index], end="")
    if (c+2) % 100 == 0:
        print("")
    gen_sentence = [next_index]  # feed only the new character; the LSTM state remembers the rest

First, give the trained model() the ID of a suitable starting character. From the output that predicts the next character, feed that character back in and predict the one after it, and so on; repeating this 700 times generates the lyrics. Since the output of model() can be read as a probability for each ID, the ID with the highest probability could simply be taken as the next character, but I thought randomness would be more fun, so the next character's ID is sampled from the probability distribution instead.
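If you want to control that randomness, a common variation (not in the original code) is to rescale the distribution with a temperature before sampling; T < 1 pushes it toward argmax, T > 1 makes it more random:

import numpy as np

def sample_with_temperature(probs, temperature=1.0):
    # Rescale log-probabilities, then renormalize into a distribution
    logp = np.log(probs + 1e-9) / temperature
    p = np.exp(logp - logp.max())
    p /= p.sum()
    return int(np.random.choice(len(p), p=p))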

Loss graph

(Figure: loss.png) The loss goes down nicely. Somehow, I have a bad feeling about this.

Generated lyrics

Below are the generated lyrics.

Yuu te Nishi Yu? I'm going to brilliant h! Come on, take care of me, live e Iko o Kuji echo ct finger e om Kimi Daito book is a fake papad noisy top One of the fingers is t or this is not a continuation o There is no wa hangs over ten tedenrima ♡ Nikala It's a big picture Petitze I'm a kid y High Ash Royaeru Sora Fute Husband e Ya Para h is there! Ritsu ♡ Tomo Kimi lol Ippsou Kisoi Sora Hama-ya l Wait! C La e Kuji Igle Day! E Imoku Ima e Yoiko Rakeya Kumo Repa today lc Lava is empty Tar length (Ru line Chikaoi) Isu Innocent Suguba Het ri t i standing u tsu ba koru puyatsu h ii utta supa te line w o c me no la de c soka hero hand yes et la sa o to banra line or day Talking friend I (Kekun blasphemy eo day 1 c tekoyaen west ipu so te te te te te te te te te te te te te te te te te te te te te te te te te te te Tatsukamu Koro o Pa h Yuken La La La La, Papu h no Pusa! Ta t Higashi Chii u Soshi blame good do wait W stubborn m see love Supaman heavy or decorate I) I'm alive to good I m yohasu! E Makakari) ・ Wow, I'm going to do it, and my fingers are in my hands. Ba-kun is the one I am! Chika (Teda no Miru Ba o Chi W decoration t Hibiki to ni e Good luck student and others! Laigu! It's ten years old, but it's a lara, and it's a line, it's an eye, it's a face N, it's a day, it's a hell, it's barking, I'm laughing, a) I'm sorry, I'm sorry, I'm sorry, I'm sorry, I'm sorry, I'm sorry, I'm sorry, I'm sorry, I'm sorry To-kun Ro-ta-ra-kun to Guive (Nen! Yeah, even with a face, it's a laro

Eh, you're also an enemy, and you're fascinated by the door (gate). You can also decorate it and laugh. Yes, it's okay to laugh (Idozo ♡) Nice to Welcome to Japari Park! Because it's 10 people and 10 colors, if you gently put your fingers on the dusk sky that is fascinated by each other, nice to meet you, I want to know more! Gao! La la la la la la la la Oh, Welcomet yo Welcome to Japari Park! Hey, everyone is free to come to life. You're also decorated. It's a mess. It's a mess. It's a mess. It's a big deal. Everyone lives freely. Monishi I'm a serval of Dobarcat! I'll do my best! I'll leave it to Arai-san! Everyone will go. I'll do my best if I don't aim at the top. Go to rock. Hey, it's already steep (Nice to meet you when you gather, I want to know more about you! Gao! Continue here here welcome to to yo Japari Park! Thank you from today, always be kind to me Smile The open door (gate) that was waiting for you If you talk a lot of dreams, it will continue forever Great Journey Komukai! Haji self-finger Soja barking Wooney Here is Japari Park I'm a serval of Sabal Cat When is it! Lalala Gather and Lupar! From today, I'm looking forward to seeing you, but always a gentle smile The open door that was waiting for you (If you talk a lot about Gate Yumeko) Do here e So de big adventure (I'm Chinrama-kun, I'm serval, Arai, who's dusk `

Wow! Mitoro Sightseeing Rock, eh, is it real? Oh east howling melcome to the Japari Park! Lalalala Color lc Weelcome to Ohe Japari Park I'm a serval cat serval! -! It's good for turning around-I'll do my best-I'll leave it to Arai-san! Everyone will go I'll do my best if I don't aim higher I'm going to rock, is it the real thing? Oh Higashi Howl to the west Howl to the west Resonate all over the world Safari Melody Welcome to Welcome to Japari Park! Welcome to Welcome to Japari Park! Welcome to Welcome to Japari Park! Nice to meet you from today Always a gentle smile The open door (gate) that was always waiting for you Here and there, Soko Niso Japari Park! Today too, the turmoil of Dottanbattan was piled up, and the gate of your soul was opened.) La Howling to the west The world echoes the world Safari Melody Welcome to Welcome to Japari Park! Nice to meet you, always a gentle smile The open door (gate) that was waiting for you If you talk a lot of dreams, it will continue forever Great Journey This is Japari Park I'm a serval of a serval cat! I'll do my best, I'll leave it to Arai-san! Everyone will go, I'll do my best if I don't aim higher, I'll go to rock, is it real? Oh, bark to the east, bark to the west. Resonate all over the world Safari Melody Welcome to Welcome to Japari Park!

If you turn around, you'll have troubles here and there. It's a mess. It's a mess. It's a big deal. Everyone seems to be living freely. You don't have to decorate it. Nice to meet. you Jyapari Park! Nice to meet you from today. Always a gentle smile. The open door (gate) that was waiting for you. If you talk a lot of dreams, it will continue forever. Great Journey This is Japari Park I'm a serval of serval cat. I'll do my best, I'll leave it to Arai-san! Everyone will go, I'll do my best if I don't aim higher, I'll go to rock, eh, howl to the east, to the west Howl, echo all over the world Safari Melody Welcome to Welcome to Japari Park! Even today, Dottanbattan's fuss is ten people and ten colors, so if you gently put your fingers on the fascinating dusk sky, nice to meet you, I want to know more! If you turn around, you'll have troubles here and there. It's a mess, it's a mess, it's a mess, it's a mess, it's a big deal, everyone lives freely. You don't have to decorate it either. Nice to meet you Japari Park! Nice to meet you from today. Always a gentle smile The open door (gate) that was waiting for you If you talk a lot of dreams, it will continue forever Great Journey This is Japari Park I'm a serval of a serval cat! I'll do my best! I'll leave it to Arai-san! Everyone will go. I'll do my best if I don't aim higher.

Wow, it's overfitting. Fun (straight face).

Finally

This was my first time posting an article, so many parts are clumsy or hard to follow. But I don't have the energy to rewrite it, so I'm posting it as-is. I fully expect kind volunteers to tear it apart.

I'd welcome any advice or corrections of my mistakes.

kibounoasa

Bonus

This is the result of automatically generating lyrics after training on various songs in the dataset. Can you tell which songs are in there?

I want to know more about you when I gently put my fingers on the dusk sky, which is fascinated by each other! It seems that everyone is living freely. You don't have to decorate it either. Nice to meet you Japari Park! Lalalala Oh, Welcome to the Japari Park! Lalalala Lalala Lala Welcome to Japari Park! It seems that everyone is living freely. You don't have to decorate it. Nice to meet you Japari Park! Nice to meet you from today. Always a gentle smile. Gate: If you talk a lot about your dreams, it will continue forever Great Journey Oh Howl to the east Howl to the west Resonate all over the world Safari Melody-Welcome to Welcome to Japari Park! Welcome to Welcome to Japari Park! Welcome to Welcome to Japari Park! Welcome to Welcome to Japari Park! Welcome to Welcome to Japari Park! Welcome to Welcome to Japari Park! Today's Dottan, Fuss! Gao! Gorogoro Dane Waku Waku Uzuru Slapstick Noronoro Ah? Urouro Ateku Teku Kosokoso Roughness Sunsun Oh? Peropero Oops Mogumogu Kirakira Guine Pika Pika Picone Mofumofu Nadade Niko! Fluffy, fluffy, fluffy, you call your name, fluffy, fluffy, you're laughing, God who makes you smile, thank you, because I'm happy to meet you even with a mischief of fate, that's no good. You see, my heart is evolving. At this same moment, if you have the feeling that you are sharing, Yamato Nadeshiko! For short? Dust and Nadoko Yamato! If you look up at you, it's too bright and you'll be dazzling. I'm thinking of you. Super-Pa-Pa-Don, O-Tan-Don, Do-Oh? Don't Don't Do ?! Dopapa Donnu! Don't Don Don't Do? Don't Don't Do ?! Do Pad Don Do Don Shakin Den Tan s Gorogoro Tan Do Jan! Aha, one light (Yeah) shines, chooses a word, and even when it's hard to spin or when I'm lonely, the story that continues because I'm connected by the bond of the stars Even if the hearts of people who are constantly changing are complicated and mysterious, are you really saying that? Times, times, times, times, times, times, times, times, times, times, times, times, times, times, times, times, times, times Times! Times to sprinkle petals brilliantly Times Times Times Times Times Times Times Times Times The day before yesterday, yesterday, today, tomorrow, the day after tomorrow Tonight is Yuzuki Hana Hoi! Iyohhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhh! If you chase after Chiaki, the day before yesterday is always changing. Even if the hearts of people are complicated and mysterious, do you really say such a thing? The time to scatter the petals brilliantly the time to disperse the petals brilliantly the time to disperse the petals brilliantly the time to disperse the petals brilliantly The dance that the day before yesterday, yesterday, today, tomorrow, the day after tomorrow, and the day after tomorrow, with the hair disturbed. What day is it tonight? Wednesday, Thursday, Friday, Saturday, Sunday, Monday, and Tuesday? Ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha! I'll do it. If I put it into words more, I thought I should erase the words if it's a relationship that disappears. I was afraid, but that? Somehow it's hard like a stone from the tip of the sword. If you have the will, Yamato Nadeshiko? No, I feel like dying without ""! 
Fluffy fluffy You call your name Fluffy fluffy You're laughing Just because you're laughing God who makes you smile Thank you Koisuru Kisetsuha Yokubari fluffy fluffy You're happy to meet you even with a mischief of fate Just call the name, it floats in the air, fluffy, fluffy, you're laughing, God who makes you smile, thank you. That's right. Don't Don't Do ?! Doha
