
Summary
In this chapter, we saw a quick overview of PyTorch functionality and features. We talked about fundamental pieces such as tensors and gradients, saw how an NN can be built from basic building blocks, and learned how to implement those blocks ourselves. We discussed loss functions and optimizers, as well as the monitoring of training dynamics. The goal of the chapter was to give a very quick introduction to PyTorch, which will be used later in the book.
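To tie those pieces together, here is a minimal sketch (not taken from the book, with toy data assumed for illustration) that combines tensors, a small NN, a loss function, an optimizer, and basic monitoring in one short training loop:

```python
import torch
import torch.nn as nn
import torch.optim as optim

# Toy data (assumed for illustration): learn y = 2*x + 1 from random samples.
x = torch.randn(64, 1)
y = 2 * x + 1

# A small NN assembled from standard building blocks.
net = nn.Sequential(
    nn.Linear(1, 16),
    nn.ReLU(),
    nn.Linear(16, 1),
)

loss_fn = nn.MSELoss()
optimizer = optim.SGD(net.parameters(), lr=0.1)

for step in range(100):
    optimizer.zero_grad()        # clear gradients from the previous step
    loss = loss_fn(net(x), y)    # forward pass and loss computation
    loss.backward()              # backpropagate gradients
    optimizer.step()             # update the network parameters
    if step % 20 == 0:
        print(f"step {step}: loss={loss.item():.4f}")  # simple training monitoring
```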
In the next chapter, we'll start dealing with the main subject of this book: RL methods.