The source of PaintsChainer is published (see the original article at http://qiita.com/taizan/items/7119e16064cc11500f32), and there are plenty of articles and blog posts about running PaintsChainer with the pre-trained model, but I couldn't find anyone who had tried training it themselves. So I gave it a try.
Addendum 2017-04-04: The electricity cost calculation was off by a factor of ten (too high). It wasn't that expensive after all.
First of all, you have to prepare the training data. It is hard to collect a large number of images, and I couldn't use Pixiv the way the original article did, so I used video from [a certain anime](https://ja.wikipedia.org/wiki/%E6%B6%BC%E5%AE%AE%E3%83%8F%E3%83%AB%E3%83%92%E3%82%B7%E3%83%AA%E3%83%BC%E3%82%BA) instead. I converted the MP4 videos I had into one image per frame. For now, I used episodes 1 through 6 of the original series. Every frame was extracted, with no distinction between opening, ending, and regular scenes: 172,729 images in total.
I used ffmpeg for the frame extraction and placed the generated images in cgi-bin/paint_x2_unet/images/original inside my PaintsChainer checkout.
$ cd cgi-bin/paint_x2_unet
$ mkdir -p images/original
$ ffmpeg -i movies/movie_01.mp4 -f image2 images/original/movie_01_%d.jpg
Do this for all six episodes, and put every episode's frames in images/original/ in a way that keeps the file names from colliding.
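If you'd rather not type the ffmpeg command six times, a small loop does it. A minimal sketch, assuming the six videos are named movie_01.mp4 through movie_06.mp4 as in the command above:

```python
import subprocess

# Extract every frame of episodes 1-6 into images/original/.
# The episode number in the output pattern keeps file names from colliding.
for n in range(1, 7):
    subprocess.run(
        ["ffmpeg", "-i", f"movies/movie_{n:02d}.mp4",
         "-f", "image2", f"images/original/movie_{n:02d}_%d.jpg"],
        check=True)
```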
PaintsChainer needs both 128x128 and 512x512 images, so I wrote scripts to resize the frames accordingly (a rough sketch of what they do follows the links).
128x128 https://github.com/ikeyasu/PaintsChainer/blob/ikeyasu_mod/cgi-bin/paint_x2_unet/tools/resize.py
512x512 https://github.com/ikeyasu/PaintsChainer/blob/ikeyasu_mod/cgi-bin/paint_x2_unet/tools/resizex2.py
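The resize itself is nothing special; the core of each script boils down to a few lines like this (a sketch of the idea, not the exact contents of resize.py / resizex2.py):

```python
import cv2

def resize_to(src, dst, size):
    """Resize one frame to size x size and write it out."""
    img = cv2.imread(src)
    img = cv2.resize(img, (size, size), interpolation=cv2.INTER_AREA)
    cv2.imwrite(dst, img)

# 128x128 for the first-stage model, 512x512 for the second stage:
# resize_to("images/original/movie_01_1.jpg", "images/color/movie_01_1.jpg", 128)
# resize_to("images/original/movie_01_1.jpg", "images/colorx2/movie_01_1.jpg", 512)
```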
For the line-art extraction, I referred to k3nt0's blog (the basic idea is sketched below):
https://github.com/ikeyasu/PaintsChainer/blob/ikeyasu_mod/cgi-bin/paint_x2_unet/tools/image2line.py
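A common trick for generating pseudo line art from a color frame is to dilate a grayscale copy and take the difference against the original, so that only the edges survive. Roughly like this (a sketch of the idea; see image2line.py above for what I actually ran):

```python
import cv2
import numpy as np

def image_to_line(src, dst):
    """Pseudo line art: dilate a grayscale copy, diff it against the
    original so the edges pop out, then invert to dark-on-white lines."""
    gray = cv2.imread(src, cv2.IMREAD_GRAYSCALE)
    dilated = cv2.dilate(gray, np.ones((3, 3), np.uint8), iterations=1)
    diff = cv2.absdiff(dilated, gray)  # bright where the edges are
    cv2.imwrite(dst, 255 - diff)       # invert: dark lines on white
```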
Running the scripts above over all the extracted frames takes quite a while, so I parallelized the work with GNU Parallel, using the following script.
cgi-bin/paint_x2_unet/tools/run.sh:
#!/bin/bash
ls -v1 ../images/original/ | parallel -j 8 'echo {}; python resize.py -i {} -o ../images/color/{}'
ls -v1 ../images/original/ | parallel -j 8 'echo {}; python image2line.py -i {} -o ../images/line/{}'
ls -v1 ../images/original/ | parallel -j 8 'echo {}; python resizex2.py -i {} -o ../images/colorx2/{}'
ls -v1 ../images/original/ | parallel -j 8 'echo {}; python image2line.py -i {} -o ../images/linex2/{}'
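Before running it, the four output directories have to exist (assuming the resize/line scripts don't create them on their own). From cgi-bin/paint_x2_unet, for example:

```python
import os

# Output directories that run.sh writes into.
for d in ("color", "line", "colorx2", "linex2"):
    os.makedirs(os.path.join("images", d), exist_ok=True)
```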
$ cd cgi-bin/paint_x2_unet/tools
$ chmod +x run.sh
$ ./run.sh
Also, store the list of training images in dat/images_color_train.dat:
$ pwd
~/PaintsChainer/cgi-bin/paint_x2_unet/tools
$ cd ../images/original
$ ls -v1 > ../../dat/images_color_train.dat
All that's left is the training itself. I tinkered with the original code a bit (it's based on a slightly old version of the PaintsChainer code from when I started training):
https://github.com/ikeyasu/PaintsChainer/commit/8e30ee6933c747580efe25c9c4d5165f55823966
$ pwd
~/PaintsChainer/cgi-bin/paint_x2_unet/images/original
$ cd ../../
$ python train_128.py -g 0 --dataset images/ -e 20 -o result1
$ cp result1/model_final models/model_cnn_128
$ python train_x2.py -g 0 -o result2/ --dataset images/ --snapshot_interval 5000 -e 20
The models to load are specified in cgi-bin/paint_x2_unet/cgi_exe.py, in these two calls:
serializers.load_npz(
    "./cgi-bin/paint_x2_unet/models/unet_128_standard", self.cnn_128)

and

serializers.load_npz(
    "./cgi-bin/paint_x2_unet/models/unet_512_standard", self.cnn)

Copy the trained models to match those file names:
$ pwd
~/PaintsChainer/cgi-bin/paint_x2_unet
$ cp result1/model_final models/unet_128_standard
$ cp result2/model_final models/unet_512_standard
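Before starting the server, it doesn't hurt to check that the copied models actually load. A minimal sketch, run from cgi-bin/paint_x2_unet and assuming the UNET class in unet.py from the PaintsChainer repo:

```python
from chainer import serializers
import unet  # cgi-bin/paint_x2_unet/unet.py in the PaintsChainer repo

# Instantiate the two networks and load the trained weights into them.
cnn_128 = unet.UNET()
serializers.load_npz("models/unet_128_standard", cnn_128)

cnn_512 = unet.UNET()
serializers.load_npz("models/unet_512_standard", cnn_512)
print("both models loaded")
```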
Then run server.py
$ pwd
~/PaintsChainer/cgi-bin/paint_x2_unet
$ cd ../../
$ python server.py
You can use PaintsChainer by opening http://localhost:8000 in your browser. If you want to access it from another PC, pass the IP of the host it runs on, e.g. python server.py --host 192.168.1.3.
A work generated by an AI may have no copyright of its own, but since this is colorization, the underlying line art is still copyrighted, so I can't post the results here as-is. I'll only quote small parts of the screen.
The hair and eye colors are painted beautifully. (Image quoted from 涼宮ハルヒの憂鬱 I, Episode 5)
The school uniform works, too. (Image quoted from 涼宮ハルヒの憂鬱 I, Episode 5)
So do male characters. (Image quoted from 涼宮ハルヒの憂鬱 I, Episode 5)
I've only shown the cases that went well; when I feed in hand-drawn line art from fan illustrations, it doesn't work at all. This may be because my line-art extraction method wasn't good. An AI has to be given good teaching materials, after all...
http://d.hatena.ne.jp/zuruo/20080528/1212143328 (the ribbon on the head)
Giving color hints, which is a signature feature of PaintsChainer, doesn't work very well either. Is the training data to blame after all?
I've written this up as if it were something you could try quickly, but getting this far took 294 hours (12 days and 6 hours!) of compute. And the original article says this is only the first stage of training...
Assuming the PC draws 200 W, the electricity cost works out to

200 W × (51 + 243) h × 0.026 yen/Wh = 1,528.8 yen

where 51 and 243 are the hours the two training runs took. The factor 0.026 comes from TEPCO's rate of 26 yen per kWh, divided by 1,000 to convert watts to kilowatts: 0.2 kW × 294 h = 58.8 kWh, and 58.8 kWh × 26 yen/kWh = 1,528.8 yen.
For reference, the PC is a self-built machine with a GTX 1080, which cost about 170,000 yen.
Reference: Assemble a cube-type PC with GTX 1080