Operating environment
- GeForce GTX 1070 (8GB)
- ASRock Z170M Pro4S [Intel Z170 chipset]
- Ubuntu 14.04 LTS desktop amd64
- TensorFlow v0.11
- cuDNN v5.1 for Linux
- CUDA v8.0
- Python 2.7.6
- IPython 5.1.0 -- An enhanced Interactive Python.
- gcc (Ubuntu 4.8.4-2ubuntu1~14.04.3) 4.8.4
- GNU bash, version 4.3.8(1)-release (x86_64-pc-linux-gnu)
I'm trying to save the values of the input batch, output batch, and prediction after training with TensorFlow to a file, and then use them from another Python script.
It seems this can be done by exporting the data as a binary file with numpy.
Related: [numpy > File read/write > np.save()/np.load()/np.savetxt()/np.loadtxt() > binary read/write / csv read/write](http://qiita.com/7of9/items/c730990479687ec2e959)
I implemented the following code.
test_python_170324a.py
```python
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# on Python 2.7.6
import numpy as np

# 1. save
# three batches of shape (3, 2), standing in for data produced during training
inpbt1 = np.array([[3, 1], [4, 1], [5, 9]], dtype='f')
inpbt2 = np.array([[2, 7], [1, 8], [2, 7]], dtype='f')
inpbt3 = np.array([[6, 0], [2, 2], [10, 23]], dtype='f')

alist = list([inpbt1, inpbt2, inpbt3])
print(alist)

FILENAME = 'test_res_170328a.npy'
np.save(FILENAME, alist)

# 2. load
rddat = np.load(FILENAME)
for idx, elem in enumerate(rddat):
    print('%d:%s' % (idx, elem))
```
Result:

```
$ python test_python_170324a.py
[array([[ 3., 1.],
[ 4., 1.],
[ 5., 9.]], dtype=float32), array([[ 2., 7.],
[ 1., 8.],
[ 2., 7.]], dtype=float32), array([[ 6., 0.],
[ 2., 2.],
[ 10., 23.]], dtype=float32)]
0:[[ 3. 1.]
[ 4. 1.]
[ 5. 9.]]
1:[[ 2. 7.]
[ 1. 8.]
[ 2. 7.]]
2:[[ 6. 0.]
[ 2. 2.]
[ 10. 23.]]
```
In the output above, the first block (from print(alist)) has the same data format as what print() shows after training with TensorFlow; this is an example with batch_size = 3 and 3 iterations. The indexed lines below it are what was read back from the saved file and printed.
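For the original goal of using the saved values from another Python script, a minimal loader sketch could look like the following (it assumes the file written above; the script name is hypothetical):

```python
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# test_load_170328a.py (hypothetical name): reads the arrays saved above
import numpy as np

FILENAME = 'test_res_170328a.npy'

rddat = np.load(FILENAME)  # the list of (3, 2) arrays comes back as one ndarray
for idx, elem in enumerate(rddat):
    print('%d:%s' % (idx, elem))
```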
Reference: https://hydrocul.github.io/wiki/numpy/ndarray-io.html

> np.save() and np.load() write an ndarray to a file and read it back from a file. The file format is binary, and .npy is commonly used as the file name extension. One file holds one ndarray.
I am considering saving the input batch, output batch, and prediction in separate files, along the lines of the sketch below.
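A minimal sketch of that idea (the file names are hypothetical; in the real script inpbt, outbt, and pred would be the arrays collected during training):

```python
# a sketch with hypothetical file names; inpbt, outbt and pred stand in
# for the arrays collected during training
import numpy as np

inpbt = np.zeros((3, 2), dtype='f')   # placeholder for the input batch
outbt = np.zeros((3, 1), dtype='f')   # placeholder for the output batch
pred = np.zeros((3, 1), dtype='f')    # placeholder for the prediction

np.save('test_inp_170328a.npy', inpbt)
np.save('test_out_170328a.npy', outbt)
np.save('test_pred_170328a.npy', pred)
```

Alternatively, np.savez() can bundle several arrays into a single .npz file, e.g. np.savez('test_res_170328a.npz', inp=inpbt, out=outbt, pred=pred); np.load() then returns a dictionary-like object keyed by those names.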
However, training is likely to keep progressing while the predictions are being collected, so it may be better to save the weights and biases and rebuild the network from them instead (related: http://qiita.com/7of9/items/ce58e66b040a0795b2ae ).
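As a rough sketch of that alternative (assuming TensorFlow v0.11; the variable and checkpoint names are hypothetical), tf.train.Saver() can write the weights and biases to a checkpoint and restore them later from another script:

```python
# a sketch assuming TensorFlow v0.11; variable and file names are hypothetical
import tensorflow as tf

weight = tf.Variable(tf.zeros([2, 1]), name='weight')
bias = tf.Variable(tf.zeros([1]), name='bias')

saver = tf.train.Saver()  # saves all variables by default

with tf.Session() as sess:
    sess.run(tf.initialize_all_variables())
    # ... training would run here ...
    saver.save(sess, 'model_170328a.ckpt')

# in another script, rebuild the same graph and then restore the values:
#   saver.restore(sess, 'model_170328a.ckpt')
```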