Try implementing XOR with PyTorch

Introduction

With Keras, as soon as you want to do anything non-standard you end up dropping down to TensorFlow anyway, so maybe PyTorch is the better choice? To find out, I started by implementing XOR.

Environment

Source

import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim


class Net(nn.Module):

    def __init__(self):
        super(Net, self).__init__()
        self.fc1 = nn.Linear(2, 8)
        self.fc2 = nn.Linear(8, 8)
        self.fc3 = nn.Linear(8, 1)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        x = self.fc3(x)
        x = self.sigmoid(x)
        return x


def main():

    x = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
    y = np.array([[0], [1], [1], [0]])

    num_epochs = 10000

    # convert numpy array to tensor
    x_tensor = torch.from_numpy(x).float()
    y_tensor = torch.from_numpy(y).float()

    # create an instance of the model
    net = Net()

    # set training mode
    net.train()

    # set up the optimizer and the loss function
    optimizer = optim.SGD(net.parameters(), lr=0.01)
    criterion = nn.MSELoss()

    # start to train
    epoch_loss = []
    for epoch in range(num_epochs):
        print(epoch)
        # forward
        outputs = net(x_tensor)

        # calculate loss
        loss = criterion(outputs, y_tensor)

        # update weights
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

        # save loss of this epoch
        epoch_loss.append(loss.item())

    print(net(torch.from_numpy(np.array([[0, 0]])).float()))
    print(net(torch.from_numpy(np.array([[1, 0]])).float()))
    print(net(torch.from_numpy(np.array([[0, 1]])).float()))
    print(net(torch.from_numpy(np.array([[1, 1]])).float()))

if __name__ == "__main__":
    main()
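By the way, the loop above stores each epoch's loss in epoch_loss but never does anything with it. As a quick sanity check, here is a minimal sketch of a helper for plotting the training curve (plot_loss is a name I'm introducing here, and it assumes matplotlib is installed). Calling plot_loss(epoch_loss) at the end of main() would show how the MSE loss decreases over the 10000 epochs.

def plot_loss(epoch_loss):
    """Plot the per-epoch MSE loss collected during training."""
    import matplotlib.pyplot as plt  # assumption: matplotlib is installed
    plt.plot(epoch_loss)
    plt.xlabel("epoch")
    plt.ylabel("MSE loss")
    plt.show()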

Result

tensor([[0.0511]], grad_fn=<SigmoidBackward>)
tensor([[0.9363]], grad_fn=<SigmoidBackward>)
tensor([[0.9498]], grad_fn=<SigmoidBackward>)
tensor([[0.0666]], grad_fn=<SigmoidBackward>)

Nice, the outputs are close to the XOR truth table.
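Thresholding the sigmoid outputs at 0.5 turns these values into the XOR truth table itself. Here is a minimal sketch, not part of the code above, that could be appended to the end of main() (it reuses net and x_tensor):

    # binarize the outputs: anything above 0.5 counts as 1
    with torch.no_grad():
        preds = (net(x_tensor) > 0.5).int()
    print(preds)  # expected to print tensor([[0], [1], [1], [0]], dtype=torch.int32)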

Impressions

I have only scratched the surface, but compared to Keras and TensorFlow it doesn't feel like a black box; it feels like something you can use seamlessly from plain Python. For example, if you put a print statement inside the model, its output simply appears as usual, so inspecting things while the network runs looks very easy.
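As a concrete illustration (a sketch, not something from the original code), a version of Net.forward with a debug print would look like this; the intermediate tensor is printed on every forward pass, with no special logging API needed:

    def forward(self, x):
        x = F.relu(self.fc1(x))
        print("fc1 activations:", x)  # plain Python print inside the model
        x = F.relu(self.fc2(x))
        x = self.fc3(x)
        x = self.sigmoid(x)
        return x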
