Stop going crazy over plots and charts: TensorBoard is here to help!

Imagine yourself struggling with your Python code, trying to figure out whether what you wrote really does what you intended. In the past, you would have had to rely on external libraries such as matplotlib. Now you can visualize every variable and operation you’ve written with TensorBoard, just by tweaking your code a bit. As you can imagine, this can help you find bottlenecks and errors, or notice high-bias/high-variance problems. All you have to do is write some of your program’s statistics (e.g. the cost function) to a log directory and then run TensorBoard. Note that you’ll need a different log directory each time you run your program, otherwise everything will be merged together and messed up; an easy way to guarantee this is to add a timestamp to the directory’s name.
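A minimal sketch of such a per-run directory (the base directory name `tf_logs` is just an illustrative choice):

```python
import os
from datetime import datetime

# One sub-directory per run, named with a timestamp so that
# successive runs never overwrite each other's logs.
root_logdir = "tf_logs"  # hypothetical base directory
run_id = datetime.now().strftime("run_%Y%m%d_%H%M%S")
logdir = os.path.join(root_logdir, run_id)
print(logdir)  # e.g. tf_logs/run_20200101_120000
```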

Notice that at the moment we’ve created nothing more than a string. Now it’s time to take advantage of ‘summaries’. A summary is a special binary log format understood by TensorFlow. For the sake of simplicity, let’s suppose you want to store info about the cost function. All you have to do is create a summary of that variable (in this case a scalar) like this:
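A sketch using the TF 1.x-style graph API (exposed as `tf.compat.v1` in TensorFlow 2; the placeholder standing in for a real cost op is an assumption here):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # graph mode, as in TF 1.x

# 'cost' is a placeholder here; in a real model it would be your cost op.
cost = tf.compat.v1.placeholder(tf.float32, name="cost")
loss_summary = tf.compat.v1.summary.scalar("loss", cost)
```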

The first argument is a string representing the name you’ll see in TensorBoard, and the second is the variable used to compute the loss. There’s one more thing you need in order to visualize something: a FileWriter object. This component lets you write summaries into the log file, creating it if it doesn’t exist. Create an instance of a FileWriter (NB: one for the training process, another for validation, and so on) and write into the log file by evaluating the summary op with its eval method and passing the result to the writer. It’s good practice not to write to the log file at every iteration, since that slows the program down.
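Putting the pieces together, a minimal sketch might look like this (the log directory name and the loss values are illustrative stand-ins):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

cost = tf.compat.v1.placeholder(tf.float32, name="cost")
loss_summary = tf.compat.v1.summary.scalar("loss", cost)

# One writer per run; "tf_logs/run_0" is a hypothetical directory.
# Passing the graph makes it show up in TensorBoard's Graph tab.
file_writer = tf.compat.v1.summary.FileWriter(
    "tf_logs/run_0", tf.compat.v1.get_default_graph())

with tf.compat.v1.Session():
    for step in range(100):
        current_loss = 1.0 / (step + 1)  # stand-in for a real loss value
        if step % 10 == 0:               # don't log at every iteration
            summary_str = loss_summary.eval(feed_dict={cost: current_loss})
            file_writer.add_summary(summary_str, step)
file_writer.close()
```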

Now that everything’s ready, it’s time to launch TensorBoard! Open a terminal and type:

Once you’ve launched TensorBoard, open up a browser such as Google Chrome, navigate to http://localhost:6006 (6006 is ‘goog’ written upside down) and enjoy the view.


Just to make sure that every concept is clear, here’s a simple example that trains a linear model with TensorBoard integration.
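A self-contained sketch along those lines, using synthetic linearly separable data in place of a real dataset (the data, directory name, and hyperparameters are all illustrative assumptions):

```python
import numpy as np
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

# Synthetic stand-in data: 500 random 2-D points, two linearly separable classes.
rng = np.random.RandomState(0)
X_data = rng.randn(500, 2).astype(np.float32)
y_data = (X_data[:, 0] + X_data[:, 1] > 0).astype(np.int64)

x = tf.compat.v1.placeholder(tf.float32, [None, 2], name="x")
y = tf.compat.v1.placeholder(tf.int64, [None], name="y")

# Linear model: logits = xW + b
W = tf.Variable(tf.zeros([2, 2]), name="weights")
b = tf.Variable(tf.zeros([2]), name="bias")
logits = tf.matmul(x, W) + b

cross_entropy = tf.reduce_mean(
    tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=logits))
accuracy = tf.reduce_mean(
    tf.cast(tf.equal(tf.argmax(logits, 1), y), tf.float32))
train_step = tf.compat.v1.train.GradientDescentOptimizer(0.5).minimize(cross_entropy)

# One scalar summary per quantity, merged into a single op.
tf.compat.v1.summary.scalar("cross_entropy", cross_entropy)
tf.compat.v1.summary.scalar("accuracy", accuracy)
merged = tf.compat.v1.summary.merge_all()

writer = tf.compat.v1.summary.FileWriter(
    "tf_logs/linear_model", tf.compat.v1.get_default_graph())

with tf.compat.v1.Session() as sess:
    sess.run(tf.compat.v1.global_variables_initializer())
    for step in range(200):
        _, summ = sess.run([train_step, merged],
                           feed_dict={x: X_data, y: y_data})
        if step % 20 == 0:             # log every 20 steps, not every step
            writer.add_summary(summ, step)
    final_acc = sess.run(accuracy, feed_dict={x: X_data, y: y_data})
writer.close()
print("final accuracy:", final_acc)
```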

Now, if you launch TensorBoard on the right directory and navigate to localhost:6006, you will be able to see the accuracy and cross-entropy trends, together with the graph you’ve built.

If you design complex models, the graph can become really hard to read in TensorBoard. You can group related nodes with name scopes. For example, let’s suppose you want to define a name scope for the computation of the loss. All you have to do is:
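A sketch of such a scope (the placeholder shapes are illustrative):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

logits = tf.compat.v1.placeholder(tf.float32, [None, 10])
labels = tf.compat.v1.placeholder(tf.int64, [None])

# Group all loss-related ops under one collapsible node in the Graph tab.
with tf.compat.v1.name_scope("cross_entropy"):
    xentropy = tf.nn.sparse_softmax_cross_entropy_with_logits(
        labels=labels, logits=logits)
    loss = tf.reduce_mean(xentropy, name="loss")

print(loss.op.name)  # cross_entropy/loss
```

Every op created inside the `with` block gets the `cross_entropy/` prefix, which is what TensorBoard uses to collapse them into a single rectangle.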

In the Graph tab in TensorBoard, you’ll notice a “plus” symbol inside the “cross_entropy” rectangle. If you click on it, you’ll see all the operations it is hiding.


Test what you’ve learned with our software on GitHub.
