We will cover the following tasks in 53 minutes:
Normally, when you use TensorFlow to create and train machine learning models, you first need to build the computational graphs required for training, and then run those graphs later to actually perform the computations. This approach is not very easy or intuitive to use. In a production setting this may not be a big problem, but if you're just researching and experimenting with potential models, this traditional approach can slow things down. This is where eager execution comes in.
TensorFlow’s eager execution provides an imperative programming environment that evaluates operations immediately, instead of first building computational graphs to run later. Let’s see how to enable this mode and also check whether we are working with eager execution.
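A minimal sketch of checking for eager execution, assuming TensorFlow 2.x (where eager execution is enabled by default):

```python
import tensorflow as tf

# In TensorFlow 2.x, eager execution is on by default;
# tf.executing_eagerly() reports whether it is active.
print(tf.executing_eagerly())  # True in TF 2.x

# Operations run immediately and return concrete values,
# with no separate graph-building and session-running steps.
x = tf.constant([[2.0]])
y = tf.matmul(x, x)
print(y)  # a 1x1 tensor holding 4.0
```

In TensorFlow 1.x, by contrast, eager mode had to be switched on explicitly at program start (via `tf.enable_eager_execution()`).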
Tensors are multi-dimensional arrays: a tensor object has a data type and a shape that describe the data it holds. TensorFlow offers a comprehensive library of tensor operations, such as addition, matrix multiplication and so on, and these operations automatically convert native Python types into tensors. Let’s look at a few examples in this chapter.
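A short sketch of a few such operations (assuming TensorFlow 2.x), showing the automatic conversion of native Python values into tensors:

```python
import tensorflow as tf

# Native Python numbers and lists are converted to tensors automatically.
a = tf.add(1, 2)                   # scalar int32 tensor with value 3
b = tf.constant([[1, 2], [3, 4]])  # 2x2 int32 tensor

# Matrix multiplication: [[1*1+2*3, 1*2+2*4], [3*1+4*3, 3*2+4*4]]
c = tf.matmul(b, b)

# Every tensor carries a dtype and a shape.
print(a.dtype, a.shape)
print(c)  # [[7, 10], [15, 22]]
```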
Because both the data type and the shape are inherent properties of a tensor, it is not possible to mutate a tensor; an operation always produces a new one. While NumPy arrays are mutable, they can be easily converted to tensors and vice versa. Of course, when we convert a tensor to a NumPy array, we are only converting the resulting value of the tensor.
By default, TensorFlow operations automatically convert NumPy arrays to tensors, and NumPy operations automatically convert tensors to NumPy arrays. A tensor can also be explicitly converted to a NumPy array by calling its numpy() method. Let’s look at some of these examples in this chapter.
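A minimal sketch of these conversions in both directions, assuming TensorFlow 2.x:

```python
import numpy as np
import tensorflow as tf

nd = np.ones((2, 2))

# A TensorFlow op converts the NumPy array to a tensor automatically.
t = tf.multiply(nd, 3)   # 2x2 tensor of 3.0s

# A NumPy op converts the tensor back to an ndarray automatically.
arr = np.add(t, 1)       # 2x2 ndarray of 4.0s

# Explicit conversion via the numpy() method.
print(t.numpy())
```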
TensorFlow operations can run on either the GPU or the CPU, and TensorFlow automatically decides which device to use for each operation. Tensors produced by an operation are typically backed by the memory of the device on which the operation was executed.
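A small sketch of inspecting and controlling device placement (assuming TensorFlow 2.x; the exact device string will vary by machine):

```python
import tensorflow as tf

x = tf.random.uniform([3, 3])

# Which accelerators are visible, and where did this tensor land?
print(tf.config.list_physical_devices("GPU"))
print(x.device)  # e.g. '/job:localhost/replica:0/task:0/device:CPU:0'

# Placement can also be forced explicitly with a device context.
with tf.device("CPU:0"):
    y = tf.matmul(x, x)
print("CPU:0" in y.device)
```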
Dataset from Tensors
We can use TensorFlow’s Dataset API to build pipelines that feed data to our models’ training and evaluation loops. In eager execution mode, we don’t need to construct a TensorFlow iterator; instead, we can simply use Python iteration over the Dataset objects. There are also several methods for creating Dataset objects: you can create them from tensors, or by reading text or CSV files.
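A minimal sketch of building a Dataset from tensors and iterating over it with a plain Python loop, assuming TensorFlow 2.x:

```python
import tensorflow as tf

# Build a dataset directly from a tensor of values...
ds = tf.data.Dataset.from_tensor_slices([1, 2, 3, 4])

# ...transform it with the usual pipeline methods...
ds = ds.map(tf.square).shuffle(4).batch(2)

# ...and, in eager mode, iterate with an ordinary Python for-loop.
for batch in ds:
    print(batch)  # batches of squared, shuffled values
```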
Automatic differentiation is a way to programmatically compute the derivative of a function by repeatedly applying the chain rule to simpler operations. TensorFlow records all operations executed inside a gradient tape context. To compute the gradient of a recorded value, the tape works backwards through the recorded operations, applying automatic differentiation at each step. Python’s control flow is handled naturally by gradient tapes. Let’s take a look at it in this chapter.
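A small sketch of tf.GradientTape, including ordinary Python control flow inside the taped computation (assuming TensorFlow 2.x):

```python
import tensorflow as tf

x = tf.constant(3.0)

with tf.GradientTape() as tape:
    tape.watch(x)      # watch a non-variable tensor explicitly
    # Ordinary Python control flow is recorded naturally:
    # only the branch that actually runs is taped.
    if x > 0:
        y = x * x
    else:
        y = -x

# dy/dx = 2x = 6.0 at x = 3
dy_dx = tape.gradient(y, x)
print(dy_dx)
```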
Higher Order Gradients
With gradient tapes, higher-order gradients are actually quite easy to compute. If gradients are computed within a gradient tape context, those gradient computations are recorded by that tape as well, so they can themselves be differentiated. This way, higher-order gradients can be computed. Let’s take a look!
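A minimal sketch of nesting gradient tapes to get a second derivative, assuming TensorFlow 2.x:

```python
import tensorflow as tf

x = tf.Variable(3.0)  # variables are watched automatically

with tf.GradientTape() as outer:
    with tf.GradientTape() as inner:
        y = x ** 3                  # y = x^3
    # This gradient call runs inside the outer tape's context,
    # so the computation of dy/dx is itself recorded.
    dy_dx = inner.gradient(y, x)    # 3x^2 = 27 at x = 3

d2y_dx2 = outer.gradient(dy_dx, x)  # 6x = 18 at x = 3
print(dy_dx, d2y_dx2)
```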
About the Host (Amit)
I have been writing code since 1993, when I was 11, and my first passion project was a database management software that I wrote for a local hospital. More recently, I wrote an award-winning education chatbot for a multi-billion-revenue company. I solved a recurring problem for my client, who wanted to make basic cyber safety and privacy education accessible to their users. This bot enabled my client to reach their customers with personalised, real-time education. Over the last year, I’ve continued my interest in this field by constantly learning and growing in Machine Learning, NLP and Deep Learning. I'm very excited to share my variety of experience and learnings with you with the help of Rhyme.com.