[API] TensorBoard: Usage


Five steps for using TensorBoard

1.   From TF graph, decide which tensors you want to log

with tf.variable_scope('layer1') as scope:

tf.summary.image('input', x_image, 3)

tf.summary.histogram("layer", L1)

tf.summary.scalar("loss", cost)

2.     Merge all summaries

summary = tf.summary.merge_all()

3.     Create writer and add graph

# Create summary writer

writer = tf.summary.FileWriter(TB_SUMMARY_DIR)

writer.add_graph(sess.graph)

4.     Run summary merge and add_summary

s, _ = sess.run([summary, optimizer], feed_dict=feed_dict)

writer.add_summary(s, global_step=global_step)

5.     Launch TensorBoard

tensorboard --logdir=/tmp/mnist_logs
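The five steps above can be tied together into one minimal runnable sketch, assuming the TF 1.x API (via `tf.compat.v1` on newer installs). The tiny graph, its shapes, and the log directory name are illustrative stand-ins, not the MNIST model used later in this post.

```python
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

TB_SUMMARY_DIR = "/tmp/mnist_logs_demo"  # hypothetical log directory

# A tiny stand-in graph; shapes are illustrative.
X = tf.placeholder(tf.float32, [None, 784])
W = tf.get_variable("W", shape=[784, 10])
b = tf.Variable(tf.zeros([10]))
cost = tf.reduce_mean(tf.square(tf.matmul(X, W) + b))

# Step 1: decide which tensors to log
tf.summary.histogram("weights", W)
tf.summary.scalar("loss", cost)

# Step 2: merge all summaries into a single op
summary = tf.summary.merge_all()

# Step 3: create writer and add the graph
sess = tf.Session()
sess.run(tf.global_variables_initializer())
writer = tf.summary.FileWriter(TB_SUMMARY_DIR)
writer.add_graph(sess.graph)

# Step 4: run the merged summary op and write the result
s = sess.run(summary, feed_dict={X: np.zeros((2, 784), np.float32)})
writer.add_summary(s, global_step=0)
writer.close()

# Step 5 (from the shell): tensorboard --logdir=/tmp/mnist_logs_demo
```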

 

The Visualizations

1.     From TF graph, decide which tensors you want to log

Image Input

# Image input

x_image = tf.reshape(X, [-1, 28, 28, 1])

tf.summary.image('input', x_image, 3)

 



Histogram (multi-dimensional tensors) 

with tf.variable_scope('layer1') as scope:

   W1 = tf.get_variable("W", shape=[784, 512])

   b1 = tf.Variable(tf.random_normal([512]))

   L1 = tf.nn.relu(tf.matmul(X, W1) + b1)

   L1 = tf.nn.dropout(L1, keep_prob=keep_prob)

   tf.summary.histogram("X", X)

   tf.summary.histogram("weights", W1)

   tf.summary.histogram("bias", b1)

   tf.summary.histogram("layer", L1)

 

(Figure: histogram panels, e.g. layer2/weights, layer2/layer)

Scalar tensors

tf.summary.scalar("loss", cost)


 

Add scope for better hierarchy

with tf.variable_scope('layer1') as scope:

  W1 = tf.get_variable("W", shape=[784, 512],...

  b1 = tf.Variable(tf.random_normal([512]))

  L1 = tf.nn.relu(tf.matmul(X, W1) + b1)

  L1 = tf.nn.dropout(L1, keep_prob=keep_prob)

  tf.summary.histogram("X", X)

  tf.summary.histogram("weights", W1)

  tf.summary.histogram("bias", b1)

  tf.summary.histogram("layer", L1)

with tf.variable_scope('layer2') as scope:

   ...

with tf.variable_scope('layer3') as scope:

   ...

with tf.variable_scope('layer4') as scope:

   ...

with tf.variable_scope('layer5') as scope:

   ...


(Figure: graph view grouped by the layer1 … layer5 scopes)
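The per-scope pattern above can be factored into a small helper so each layer's summaries land under their own scope in the TensorBoard hierarchy. This is a sketch assuming the TF 1.x API via `tf.compat.v1`; `dense_layer` and its shapes are hypothetical, not part of the original code.

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

def dense_layer(name, x, n_in, n_out, keep_prob):
    """Hypothetical helper: one variable scope per layer, so its
    summaries show up under name/... in TensorBoard."""
    with tf.variable_scope(name):
        W = tf.get_variable("W", shape=[n_in, n_out])
        b = tf.Variable(tf.random_normal([n_out]))
        L = tf.nn.relu(tf.matmul(x, W) + b)
        L = tf.nn.dropout(L, keep_prob=keep_prob)
        tf.summary.histogram("weights", W)
        tf.summary.histogram("bias", b)
        tf.summary.histogram("layer", L)
        return L

X = tf.placeholder(tf.float32, [None, 784])
keep_prob = tf.placeholder(tf.float32)
L1 = dense_layer("layer1", X, 784, 512, keep_prob)
L2 = dense_layer("layer2", L1, 512, 512, keep_prob)
```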

 

2.     Merge all summaries, and

3.     Create writer after creating session

# Summary

summary = tf.summary.merge_all()

# initialize

sess = tf.Session()

sess.run(tf.global_variables_initializer())

# Create summary writer

writer = tf.summary.FileWriter(TB_SUMMARY_DIR)

writer.add_graph(sess.graph)

 

4.     Run merged summary and write (add summary)

s, _ = sess.run([summary, optimizer], feed_dict=feed_dict)

writer.add_summary(s, global_step=global_step)

global_step += 1

 

5.     Launch tensorboard (local)

$ tensorboard --logdir=/tmp/mnist_logs

Starting TensorBoard b'41' on port 6006

(You can navigate to http://127.0.0.1:6006)

 

6.     Multiple runs

tensorboard --logdir=/tmp/mnist_logs/run1

writer = tf.summary.FileWriter('/tmp/mnist_logs/run1')

tensorboard --logdir=/tmp/mnist_logs/run2

writer = tf.summary.FileWriter('/tmp/mnist_logs/run2')

tensorboard --logdir=/tmp/mnist_logs
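The multiple-run setup can be sketched as one writer per subdirectory; pointing TensorBoard at the parent directory then overlays both runs for comparison. A minimal sketch assuming the TF 1.x API via `tf.compat.v1`, with a trivial stand-in summary:

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

tf.summary.scalar("loss", tf.constant(0.5))  # stand-in scalar summary
summary = tf.summary.merge_all()

# One writer (and one subdirectory) per run; launching
#   tensorboard --logdir=/tmp/mnist_logs
# on the parent directory shows run1 and run2 side by side.
with tf.Session() as sess:
    for run in ("run1", "run2"):
        writer = tf.summary.FileWriter("/tmp/mnist_logs/" + run)
        writer.add_summary(sess.run(summary), global_step=0)
        writer.close()
```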

 

 

This post is a study note on the TensorBoard part of the lecture series "모두를 위한 머신러닝/딥러닝 강의" (Machine Learning/Deep Learning for Everyone), written as preparatory study for development with the open-source machine learning/deep learning framework TensorFlow.

Posted by 이성윤