Fixing the random seed behaves a little differently in TensorFlow and NumPy

A very minor note. TensorFlow's tf.set_random_seed() seems to behave slightly differently from NumPy's np.random.seed(). NumPy resets its internal state every time np.random.seed() is called and generates the same pseudo-random sequence again, whereas TensorFlow does not.

My wording may not be the clearest, but running the following code should make the point clear.

import numpy as np
import tensorflow as tf

print('===== Numpy =====')
np.random.seed(0)
print(np.random.uniform(size=5))
print(np.random.uniform(size=5))

print('Numpy can reset seed')
np.random.seed(0)
print(np.random.uniform(size=5))
print(np.random.uniform(size=5))

print('===== TensorFlow =====')
tf.set_random_seed(0)
with tf.Session() as sess:
    print(sess.run(tf.random_uniform([5])))
    print(sess.run(tf.random_uniform([5])))

print('TensorFlow does not reset seed')
tf.set_random_seed(0)
with tf.Session() as sess:
    print(sess.run(tf.random_uniform([5])))
    print(sess.run(tf.random_uniform([5])))
===== Numpy =====
[ 0.5488135   0.71518937  0.60276338  0.54488318  0.4236548 ]
[ 0.64589411  0.43758721  0.891773    0.96366276  0.38344152]
Numpy can reset seed
[ 0.5488135   0.71518937  0.60276338  0.54488318  0.4236548 ]
[ 0.64589411  0.43758721  0.891773    0.96366276  0.38344152]

===== TensorFlow =====
I tensorflow/core/common_runtime/local_device.cc:25] Local device intra op parallelism threads: 4
I tensorflow/core/common_runtime/local_session.cc:45] Local session inter op parallelism threads: 4
[ 0.32064009  0.69209957  0.7421422   0.86931682  0.95991254]
[ 0.70880806  0.3939954   0.67383504  0.34078181  0.98877013]
TensorFlow does not reset seed
[ 0.30350876  0.06209636  0.98059976  0.51447523  0.15706789]
[ 0.48785222  0.40416086  0.97456396  0.57969069  0.09107506]

If you re-execute the script from scratch, the same pseudo-random sequence is generated again, so there is no problem for experiment reproducibility. However, if you keep doing % run hoge.py in IPython, for example, the seed is not reset, which is how I noticed this. (Apparently tf.set_random_seed() sets a graph-level seed that is combined with an op-level identifier when each op is created, so new random ops added to the same graph draw from a different stream even after re-seeding.)
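As an aside, one way to sidestep this kind of hidden global state on the NumPy side is to carry the seed in an explicit generator object instead of calling np.random.seed(). This is just a minimal sketch going beyond the original post; the variable names are my own:

```python
import numpy as np

# Two independent generators built from the same seed: their state lives
# in the objects themselves, not in the global np.random module, so
# repeated %run's or third-party code cannot disturb them.
rng1 = np.random.RandomState(0)
rng2 = np.random.RandomState(0)

a = rng1.uniform(size=5)
b = rng2.uniform(size=5)

# Same seed, same stream: the two draws are identical.
print(np.allclose(a, b))  # True
```

The same idea exists in newer NumPy as np.random.default_rng(seed), which returns a Generator object with the same explicit-state benefit.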

By the way, R behaves the same way as NumPy.

> set.seed(0)
> runif(5)
[1] 0.8966972 0.2655087 0.3721239 0.5728534 0.9082078
> set.seed(0)
> runif(5)
[1] 0.8966972 0.2655087 0.3721239 0.5728534 0.9082078
