Collection and automation of erotic images using deep learning

Introduction

Erotic images are the best.

With erotic images, anyone can enjoy themselves as they please. You can get aroused all on your own, even without a girlfriend. You can feel genuine satisfaction and immerse yourself in happiness. Every taste is catered for.

So for us humans, it is no exaggeration to call collecting erotic images a habit of the species, much like dung beetles rolling balls of dung.

But we are the lords of the primates. We shouldn't go about it the way the dung beetle has rolled dung, unchanged, for tens of thousands of years. Mankind must collect erotic images more efficiently and more passionately.

And yet, even so, collecting erotic images is hard work. You have to visit site after site, inspect everything carefully, and then gather and organize the good finds according to some scheme. Many people even need different ones depending on the day.

By the way, deep learning has become popular lately.

Everywhere you look, it's deep learning this and deep learning that.

"We've hit a wall on a tough project; can deep learning do anything for us?" Consultations like that have been on the rise. So have the salespeople who figure that on a deal with slim odds of winning the order, they can at least grab the customer's attention by waving deep learning around, since it's trendy.

It's the same when I meet engineers from other companies: more and more often I hear things like "Deep learning makes that easy, right?" or "I bet that could be solved with deep learning." Absolutely everyone is on about deep learning.

Well, if it's that popular, I have no choice but to take up the challenge.

So I tried collecting erotic images with deep learning.

Implementation

I used Chainer for the implementation.

It is a safe and secure domestic product.

The model looks like this:

import math

import chainer
import chainer.functions as F
import chainer.links as L


class NIN(chainer.Chain):

  def __init__(self, output_dim, gpu=-1):
    w = math.sqrt(2)  # He-style weight initialization scale
    super(NIN, self).__init__(
      mlpconv1=L.MLPConvolution2D(3, (16, 16, 16), 11, stride=4, wscale=w),
      mlpconv2=L.MLPConvolution2D(16, (32, 32, 32), 5, pad=2, wscale=w),
      mlpconv3=L.MLPConvolution2D(32, (64, 64, 64), 3, pad=1, wscale=w),
      mlpconv4=L.MLPConvolution2D(64, (32, 32, output_dim), 3, pad=1, wscale=w),
    )
    self.output_dim = output_dim
    self.train = True
    self.gpu = gpu

  def __call__(self, x, t):
    h = F.max_pooling_2d(F.relu(self.mlpconv1(x)), 3, stride=2)
    h = F.max_pooling_2d(F.relu(self.mlpconv2(h)), 3, stride=2)
    h = F.max_pooling_2d(F.relu(self.mlpconv3(h)), 3, stride=2)
    h = self.mlpconv4(F.dropout(h, train=self.train))
    h = F.average_pooling_2d(h, 10)  # global average pooling down to 1x1

    h = F.reshape(h, (x.data.shape[0], self.output_dim))

    loss = F.softmax_cross_entropy(h, t)
    print(loss.data)
    return loss

As you can see, it is just a light tweak of the NIN sample bundled with Chainer. And the reason I only tweaked it is that the development VPS had so little memory that I couldn't train a more complex model. Even this one uses about 2GB during training.

For training data I used illustrations published on the Internet. Of 2,000 images in total, the 1,000 that my son reacted to were labeled rare and the 1,000 that did nothing for him were labeled common, and the model classifies between those two.
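Preparing those two piles for training might look like the sketch below, which builds the kind of "path label" list that Chainer's imagenet-style samples consume. The directory names `rare_dir` and `common_dir` and the helper names are assumptions for illustration, not the actual preprocessing script.

```python
import os

def build_label_list(rare_dir, common_dir):
    """Return (path, label) pairs: 1 for rare ("hit") images, 0 for common."""
    pairs = [(os.path.join(rare_dir, f), 1) for f in sorted(os.listdir(rare_dir))]
    pairs += [(os.path.join(common_dir, f), 0) for f in sorted(os.listdir(common_dir))]
    return pairs

def write_label_file(pairs, out_path):
    """Write one 'path label' line per image, the format the training script reads."""
    with open(out_path, "w") as fh:
        for path, label in pairs:
            fh.write("%s %d\n" % (path, label))
```

With 1,000 files in each directory this yields the 2,000-line training list described above.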

Verification

The following three images were used to check the results.

91vw8GTN37L.jpg

無題.png

aaa.jpg

The first is the cover of my own copy of "Tanaka: The Wizard Whose Age Equals His Years Without a Girlfriend". The second is my waifu, a commissioned render from 3D Custom Girl. And the third is a shot of her and me enjoying groper play on the train.

We scored these three images using the above model.

As a result, the following scores were obtained.

First image

{
  "result": [
    [
      0.9290218353271484, 
      "1,rare"
    ], 
    [
      0.07097823172807693, 
      "0,common"
    ]
  ]
}

Second image

{
  "result": [
    [
      0.6085503101348877, 
      "0,common"
    ], 
    [
      0.3914496898651123, 
      "1,rare"
    ]
  ]
}

Third image

{
  "result": [
    [
      0.5935600399971008, 
      "1,rare"
    ], 
    [
      0.40644001960754395, 
      "0,common"
    ]
  ]
}

As with the imagenet sample bundled with Chainer, the higher the value for a label, the closer the image is judged to be to that label. In other words, the higher the rare value, the happier my son is.
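Those paired scores come from softmax over the model's two-way output, which is why each pair above sums to 1. A minimal sketch of that final scoring step (the logits and the `rank` helper are made up for illustration):

```python
import math

def softmax(logits):
    """Turn raw model outputs into probabilities that sum to 1."""
    m = max(logits)                       # subtract max for numerical stability
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

def rank(logits, labels=("0,common", "1,rare")):
    """Pair each probability with its label, highest score first."""
    probs = softmax(logits)
    return sorted(zip(probs, labels), reverse=True)
```

Feeding in logits where the rare unit dominates reproduces the shape of the JSON above: the rare entry first, with its probability, then common with the remainder.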

At this point I felt I was onto something.

So, to collect erotic images in earnest, I wrapped the model above in an API, crawled Twitter for images in real time, fed them to the model, stored the results in a database, and built a web application that ranks them.

The configuration is as follows.

version: '2'

volumes:
  db-data:
    driver: local
  object-data:
    driver: local

services:
  db:
    container_name: db
    image: mysql
    volumes:
      - db-data:/var/lib/mysql
      # - ./dockerfiles/staging/mysql:/docker-entrypoint-initdb.d
    environment:
      MYSQL_ROOT_PASSWORD: $APP_DATABASE_PASSWORD
      MYSQL_DATABASE: app_staging
      MYSQL_USER: xxxx
      MYSQL_PASSWORD: $APP_DATABASE_PASSWORD
    expose:
      - "3306"
    restart: always

  app:
    build:
      context: ../../
      dockerfile: dockerfiles/staging/app/Dockerfile
    environment:
      RAILS_LOG_TO_STDOUT: 'true'
      RAILS_SERVE_STATIC_FILES: 'true'
      RAILS_ENV: 'staging'
      DISABLE_DATABASE_ENVIRONMENT_CHECK: $DISABLE_DATABASE_ENVIRONMENT_CHECK
      APP_SECRET_KEY_BASE: $APP_SECRET_KEY_BASE
      APP_DATABASE_USERNAME: xxxx
      APP_DATABASE_PASSWORD: $APP_DATABASE_PASSWORD
      APP_DATABASE_HOST: db
      TW_CONSUMER_KEY: $TW_CONSUMER_KEY
      TW_CONSUMER_SECRET: $TW_CONSUMER_SECRET
      TW_ACCESS_TOKEN: $TW_ACCESS_TOKEN
      TW_ACCESS_TOKEN_SECRET: $TW_ACCESS_TOKEN_SECRET

  minio:
    build:
      context: ../../
      dockerfile: dockerfiles/staging/minio/Dockerfile
    volumes:
      - object-data:/var/lib/minio
    environment:
      MINIO_ACCESS_KEY: 'xxx'
      MINIO_SECRET_KEY: 'xxx'
    ports:
      - '0.0.0.0:9000:9000'
    command: [server, /export]

  calc:
    container_name: calc
    build:
      context: ../../
      dockerfile: dockerfiles/staging/calc/Dockerfile
    command: python web.py
    expose:
      - "5000"

  web:
    container_name: web
    extends:
      service: app
    command: bin/wait_for_it db:3306 -- pumactl -F config/puma.rb start
    expose:
      - "3000"
    depends_on:
      - db

  crawler:
    container_name: crawler
    extends:
      service: app
    command: bin/wait_for_it db:3306 -- rails runner bin/crawler.rb
    depends_on:
      - calc
      - db

  nginx:
    container_name: nginx
    build:
      context: ../../
      dockerfile: dockerfiles/staging/nginx/Dockerfile
    ports:
      - '80:80'
      - '443:443'
    depends_on:
      - web

I ran this against Twitter for a while to collect some "side dishes".

The results are as follows.

ranking.png

These are excerpts from the ranking after roughly 50 or 60 images had been collected.

The number shown to the right of each image is its rare score.

I'm surprised results like these came just from handing the model illustrations sorted into two piles, A and B, with no extra work. It left me convinced that with slightly stricter labeling it would be possible to collect exactly the images my son likes. And combining multiple models would likely push accuracy even higher.
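"Combining multiple models" could be as simple as averaging each model's softmax output per label. A minimal ensemble sketch (the probability values in the usage note are made-up numbers for illustration):

```python
def ensemble(prob_lists):
    """Average per-label probabilities across several models' outputs.

    prob_lists: one [p_common, p_rare]-style list per model, all over the
    same labels in the same order.
    """
    n = len(prob_lists)
    return [sum(ps) / n for ps in zip(*prob_lists)]
```

For example, averaging one model's [0.9, 0.1] with another's [0.7, 0.3] gives [0.8, 0.2], which still sums to 1 and damps out any single model's overconfidence.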

Conclusion

Perhaps the day when each of us has a personal erotic-image sommelier, optimized just for ourselves, is right around the corner.
