[API] TensorBoard: Usage
5 steps of using TensorBoard
1. From TF graph, decide which tensors you want to log
with tf.variable_scope('layer1') as scope:
    tf.summary.image('input', x_image, 3)
    tf.summary.histogram("layer", L1)
    tf.summary.scalar("loss", cost)
2. Merge all summaries
summary = tf.summary.merge_all()
3. Create writer and add graph
# Create summary writer
writer = tf.summary.FileWriter(TB_SUMMARY_DIR)
writer.add_graph(sess.graph)
4. Run summary merge and add_summary
s, _ = sess.run([summary, optimizer], feed_dict=feed_dict)
writer.add_summary(s, global_step=global_step)
5. Launch TensorBoard
tensorboard --logdir=/tmp/mnist_logs
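The five steps above fit together as one minimal runnable sketch. The model below (a single dense layer with a dummy mean-squared loss) and the temporary log directory are placeholders for illustration, not the post's MNIST model; on TF 2.x the same graph-mode API lives under `tf.compat.v1`:

```python
import os
import tempfile

import numpy as np
import tensorflow as tf

# TF 2.x moved the graph-mode summary API under tf.compat.v1.
if not tf.__version__.startswith("1."):
    tf = tf.compat.v1
    tf.disable_eager_execution()

TB_SUMMARY_DIR = os.path.join(tempfile.mkdtemp(), "mnist_logs")

# Step 1: decide which tensors to log (dummy one-layer model for the sketch).
X = tf.placeholder(tf.float32, [None, 784], name="X")
W = tf.get_variable("W", shape=[784, 10])
b = tf.Variable(tf.zeros([10]))
logits = tf.matmul(X, W) + b
cost = tf.reduce_mean(tf.square(logits))  # dummy loss, stands in for cross-entropy
tf.summary.histogram("weights", W)
tf.summary.scalar("loss", cost)
optimizer = tf.train.GradientDescentOptimizer(0.1).minimize(cost)

# Step 2: merge all summaries into a single op.
summary = tf.summary.merge_all()

# Step 3: create the writer and add the graph.
sess = tf.Session()
sess.run(tf.global_variables_initializer())
writer = tf.summary.FileWriter(TB_SUMMARY_DIR)
writer.add_graph(sess.graph)

# Step 4: run the merged summary op alongside training and write it each step.
for global_step in range(3):
    feed_dict = {X: np.random.rand(8, 784).astype(np.float32)}
    s, _ = sess.run([summary, optimizer], feed_dict=feed_dict)
    writer.add_summary(s, global_step=global_step)
writer.close()

# Step 5 happens on the command line: tensorboard --logdir=<TB_SUMMARY_DIR>
```

After this runs, the log directory contains an `events.out.tfevents.*` file that TensorBoard reads.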
The Visualizations
1. From TF graph, decide which tensors you want to log
Image Input
# Image input
x_image = tf.reshape(X, [-1, 28, 28, 1])
tf.summary.image('input', x_image, 3)
Histogram (multi-dimensional tensors)
with tf.variable_scope('layer1') as scope:
    W1 = tf.get_variable("W", shape=[784, 512])
    b1 = tf.Variable(tf.random_normal([512]))
    L1 = tf.nn.relu(tf.matmul(X, W1) + b1)
    L1 = tf.nn.dropout(L1, keep_prob=keep_prob)

    tf.summary.histogram("X", X)
    tf.summary.histogram("weights", W1)
    tf.summary.histogram("bias", b1)
    tf.summary.histogram("layer", L1)
Scalar tensors
tf.summary.scalar("loss", cost)
Add scope for better hierarchy
with tf.variable_scope('layer1') as scope:
    W1 = tf.get_variable("W", shape=[784, 512], ...)
    b1 = tf.Variable(tf.random_normal([512]))
    L1 = tf.nn.relu(tf.matmul(X, W1) + b1)
    L1 = tf.nn.dropout(L1, keep_prob=keep_prob)

    tf.summary.histogram("X", X)
    tf.summary.histogram("weights", W1)
    tf.summary.histogram("bias", b1)
    tf.summary.histogram("layer", L1)

with tf.variable_scope('layer2') as scope:
    ...
with tf.variable_scope('layer3') as scope:
    ...
with tf.variable_scope('layer4') as scope:
    ...
with tf.variable_scope('layer5') as scope:
    ...
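Since layer2 through layer5 repeat the same pattern, the per-layer code can be factored into a helper that opens the scope, builds the layer, and attaches the histograms in one place. A sketch under TF 1.x graph-mode assumptions (`dense_layer` is a hypothetical helper name, not from the post):

```python
import tensorflow as tf

# TF 2.x moved the graph-mode summary API under tf.compat.v1.
if not tf.__version__.startswith("1."):
    tf = tf.compat.v1
    tf.disable_eager_execution()

def dense_layer(inputs, in_dim, out_dim, scope_name, keep_prob):
    """Build one ReLU + dropout layer and log its tensors under scope_name."""
    with tf.variable_scope(scope_name):
        W = tf.get_variable("W", shape=[in_dim, out_dim])
        b = tf.Variable(tf.random_normal([out_dim]), name="b")
        layer = tf.nn.relu(tf.matmul(inputs, W) + b)
        layer = tf.nn.dropout(layer, keep_prob=keep_prob)
        tf.summary.histogram("weights", W)
        tf.summary.histogram("bias", b)
        tf.summary.histogram("layer", layer)
    return layer

X = tf.placeholder(tf.float32, [None, 784])
keep_prob = tf.placeholder(tf.float32)

# Each iteration creates one layerN scope, so the graph view groups them.
net = X
dims = [784, 512, 512, 10]
for i in range(len(dims) - 1):
    net = dense_layer(net, dims[i], dims[i + 1], "layer%d" % (i + 1), keep_prob)
```

Every op and summary created inside the helper inherits the `layerN/` prefix, which is what produces the collapsed per-layer boxes in TensorBoard's graph view.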
2. Merge all summaries and 3. Create the writer (after creating the session)
# Summary
summary = tf.summary.merge_all()
# initialize
sess = tf.Session()
sess.run(tf.global_variables_initializer())
# Create summary writer
writer = tf.summary.FileWriter(TB_SUMMARY_DIR)
writer.add_graph(sess.graph)
4. Run merged summary and write (add summary)
s, _ = sess.run([summary, optimizer], feed_dict=feed_dict)
writer.add_summary(s, global_step=global_step)
global_step += 1
5. Launch tensorboard (local)
$ tensorboard --logdir=/tmp/mnist_logs
Starting TensorBoard b'41' on port 6006
(You can navigate to http://127.0.0.1:6006)
6. Multiple runs
writer = tf.summary.FileWriter("/tmp/mnist_logs/run1")
tensorboard --logdir=/tmp/mnist_logs/run1

writer = tf.summary.FileWriter("/tmp/mnist_logs/run2")
tensorboard --logdir=/tmp/mnist_logs/run2

tensorboard --logdir=/tmp/mnist_logs
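TensorBoard identifies runs by sub-directory: give each `FileWriter` its own run directory, then launch TensorBoard on the common parent to see the runs side by side. A minimal sketch of that layout in plain Python (using a temporary directory in place of `/tmp/mnist_logs`):

```python
import tempfile
from pathlib import Path

# Parent log directory; each training run gets its own sub-directory.
log_root = Path(tempfile.mkdtemp()) / "mnist_logs"
run_dirs = [log_root / name for name in ("run1", "run2")]
for run_dir in run_dirs:
    run_dir.mkdir(parents=True)  # a FileWriter per run would write events here

# `tensorboard --logdir=<log_root>` then lists run1 and run2 for comparison.
runs = sorted(p.name for p in log_root.iterdir())
```

Pointing TensorBoard at a single run directory shows only that run; pointing it at the parent overlays all runs on the same charts.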
This post is a study note on TensorBoard from the lecture series "모두를 위한 머신러닝/딥러닝 강의" (Machine Learning / Deep Learning for Everyone), written as preparation for development with the open-source machine learning / deep learning framework TensorFlow.