Notes on the Deep Learning book from Ian Goodfellow, Yoshua Bengio and Aaron Courville (2016)

Overview

Cover of the deep learning book by Goodfellow, Bengio and Courville

The Deep Learning Book - Goodfellow, I., Bengio, Y., and Courville, A. (2016)

This content is part of a series following Chapter 2 on linear algebra of the Deep Learning Book by Goodfellow, I., Bengio, Y., and Courville, A. (2016). It aims to provide intuitions, drawings and Python code on these mathematical theories and is built from my own understanding of the concepts.

Boost your data science skills. Learn linear algebra.

I'd like to introduce a series of blog posts and their corresponding Python notebooks gathering notes on the Deep Learning Book from Ian Goodfellow, Yoshua Bengio, and Aaron Courville (2016). The aim of these notebooks is to help beginners and advanced beginners grasp the linear algebra concepts underlying deep learning and machine learning. Acquiring these skills can boost your ability to understand and apply various data science algorithms. In my opinion, it is one of the bedrocks of machine learning, deep learning and data science.

These notes cover Chapter 2 on linear algebra. I liked this chapter because it gives a sense of what is most used in the domain of machine learning and deep learning. It is thus a great syllabus for anyone who wants to dive into deep learning and acquire the linear algebra concepts needed to better understand deep learning algorithms.

You can find all the articles here.

Getting started with linear algebra

The goal of this series is to provide content for beginners who want to understand enough linear algebra to be comfortable with machine learning and deep learning. However, I think that the chapter on linear algebra from the Deep Learning book is a bit tough for beginners. So I decided to produce code, examples and drawings for each part of this chapter in order to add steps that may not be obvious for beginners. I also think that you can convey as much information and knowledge through examples as through general definitions. The illustrations are a way to see the big picture of an idea. Finally, I think that coding is a great tool to experiment with these abstract mathematical notions. Along with pen and paper, it adds another layer of things you can try in order to push your understanding further.

Graphical representation is also very helpful to understand linear algebra. I tried to bind the concepts with plots (and the code to produce them). The representation I liked most while doing this series is the idea that you can see any matrix as a linear transformation of the space. In several chapters we will extend this idea and see how it can be useful to understand eigendecomposition, Singular Value Decomposition (SVD) or Principal Components Analysis (PCA).
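As a small numeric preview of that idea (a sketch of mine, with made-up values, not code from the book), applying a 2x2 matrix to the basis vectors already shows how it reshapes the space:

```python
import numpy as np

# A 2x2 matrix seen as a linear transformation of the plane
A = np.array([[2, 0],
              [0, 0.5]])

# The standard basis vectors
e1 = np.array([1, 0])
e2 = np.array([0, 1])

# Applying the matrix sends each basis vector to the corresponding column of A
print(A @ e1)  # [2. 0.]  -> the x-axis is stretched by a factor 2
print(A @ e2)  # [0. 0.5] -> the y-axis is shrunk by a factor 0.5
```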

The use of Python/Numpy

In addition, I noticed that creating and reading examples is really helpful for understanding the theory. That is why I built Python notebooks. The goal is twofold:

  1. To provide a starting point for using Python/Numpy to apply linear algebra concepts. And since the final goal is to use linear algebra concepts for data science, it seems natural to continuously go back and forth between theory and code. All you will need is a working Python installation with the major mathematical libraries like Numpy/Scipy/Matplotlib (a minimal check of such a setup is sketched right after this list).

  2. To give a more concrete vision of the underlying concepts. I found it hugely useful to play and experiment with these notebooks in order to build my understanding of somewhat complicated theoretical concepts or notations. I hope that reading them will be just as useful.
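For instance, here is a minimal sketch (not taken from the notebooks) that simply checks that the three libraries are installed by printing their versions:

```python
import numpy as np
import scipy
import matplotlib

# Print the versions of the main libraries used throughout the notebooks
print("Numpy:", np.__version__)
print("Scipy:", scipy.__version__)
print("Matplotlib:", matplotlib.__version__)
```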

Syllabus

The syllabus follows the Deep Learning Book exactly, so you can refer to the book for more details if a specific point is unclear while you are reading. Here is a short description of the content (a few minimal Numpy sketches illustrating some of these chapters follow the syllabus):

  1. Scalars, Vectors, Matrices and Tensors

    An example of a scalar, a vector, a matrix and a tensor

    Difference between a scalar, a vector, a matrix and a tensor

    A light introduction to vectors, matrices, the transpose and basic operations (addition of vectors and matrices). It also introduces Numpy functions and ends with a word on broadcasting.

  2. Multiplying Matrices and Vectors

    An example of how to calculate the dot product

    The dot product explained

    This chapter is mainly about the dot product (vector and/or matrix multiplication). We will also see some of its properties. Then, we will see how to write a system of linear equations in matrix notation. This is a major building block for the following chapters.

  3. Identity and Inverse Matrices

    Example of an identity matrix

    An identity matrix

    We will see two important matrices: the identity matrix and the inverse matrix. We will see why they are important in linear algebra and how to use them with Numpy. Finally, we will see an example of how to solve a system of linear equations with the inverse matrix.

  4. Linear Dependence and Span

    Examples of systems of equations with 0, 1 and an infinite number of solutions

    A system of equations has no solution, 1 solution or an infinite number of solutions

    In this chapter we will continue to study systems of linear equations. We will see that such a system has either no solution, exactly one solution, or an infinite number of solutions (nothing in between). We will see the intuition, the graphical representation and the proof behind this statement. Then we will go back to the matrix form of the system and consider what Gilbert Strang calls the row picture (looking at the rows, that is to say at multiple equations) and the column picture (looking at the columns, that is to say at a linear combination of the columns of coefficients). We will also see what a linear combination is. Finally, we will see examples of overdetermined and underdetermined systems of equations.

  5. Norms

    Representation of the squared L2 norm in 3 dimensions

    Shape of a squared L2 norm in 3 dimensions

    The norm of a vector is a function that takes a vector as input and outputs a non-negative value. It can be thought of as the length of the vector. It is for example used to evaluate the distance between the prediction of a model and the actual value. We will see different kinds of norms ($L^0$, $L^1$, $L^2$...) with examples.

  6. Special Kinds of Matrices and Vectors

    Example of a diagonal matrix and of a symmetric matrix

    A diagonal (left) and a symmetric matrix (right)

    We saw in 2.3 some special matrices that are very interesting. We will see other types of vectors and matrices in this chapter. It is not a big chapter, but it is important for understanding the next ones.

  7. Eigendecomposition

    We will see some major concepts of linear algebra in this chapter. We will start by getting some ideas on eigenvectors and eigenvalues. We will see that a matrix can be seen as a linear transformation and that applying a matrix to its eigenvectors gives new vectors with the same direction. Then we will see how to express quadratic equations in matrix form. We will see that the eigendecomposition of the matrix corresponding to the quadratic equation can be used to find its minimum and maximum. As a bonus, we will also see how to visualize linear transformations in Python!

  8. Singular Value Decomposition

    We will see another way to decompose matrices: the Singular Value Decomposition or SVD. Since the beginning of this series I have emphasized that you can see matrices as linear transformations of the space. With the SVD, you decompose a matrix into three other matrices. We will see that we can look at these new matrices as sub-transformations of the space. Instead of doing the transformation in one movement, we decompose it into three movements. As a bonus, we will apply the SVD to image processing. We will see the effect of the SVD on an example image of Lucy the goose. So keep on reading!

  9. The Moore-Penrose Pseudoinverse

    We saw that not all matrices have an inverse. It is unfortunate because the inverse is used to solve systems of equations. In some cases, a system of equations has no solution, and thus the inverse doesn't exist. However, it can be useful to find a value that is almost a solution (in terms of minimizing the error). This can be done with the pseudoinverse! We will see for instance how we can find the best-fit line of a set of data points with the pseudoinverse.

  10. The Trace Operator

    Calculating the trace of a matrix

    The trace of a matrix

    We will see what the trace of a matrix is. It will be needed for the last chapter on Principal Components Analysis (PCA).

  11. The Determinant

    Comparison of positive and negative determinant

    Link between the determinant of a matrix and the transformation associated with it

    This chapter is about the determinant of a matrix. This special number can tell us a lot of things about our matrix!

  12. Example: Principal Components Analysis

    Mechanism of the gradient descent algorithm

    Gradient descent

    This is the last chapter of this series on linear algebra! It is about Principal Components Analysis (PCA). We will use some of the knowledge acquired in the preceding chapters to understand this important data analysis tool!
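To make the syllabus more concrete, here are a few minimal Numpy sketches, one per chapter where it makes sense. They are my own illustrative snippets with made-up values, not code taken from the book or the notebooks. First, chapter 1: the transpose, addition and broadcasting:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4],
              [5, 6]])
b = np.array([10, 20])

print(A.T)    # transpose: the (3, 2) matrix becomes (2, 3)
print(A + A)  # element-wise addition of two matrices of the same shape
print(A + b)  # broadcasting: b is added to every row of A
```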
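For chapter 2, the dot product and a system of linear equations written as Ax = b:

```python
import numpy as np

# The system  2x + 1y = 5
#             1x + 3y = 10  written in matrix form as Ax = b
A = np.array([[2, 1],
              [1, 3]])
b = np.array([5, 10])

x = np.array([1, 3])    # a candidate solution
print(A @ x)            # matrix-vector product: [5 10], so x solves the system
print(np.dot(A[0], x))  # dot product of the first row of A with x: 5
```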
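For chapter 3, the identity matrix and the inverse, used to solve the same system (assuming A is invertible):

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 3.]])
b = np.array([5., 10.])

I = np.eye(2)             # the identity matrix: A @ I is A
A_inv = np.linalg.inv(A)  # the inverse of A
x = A_inv @ b             # the solution of Ax = b
print(x)                  # [1. 3.]
print(A_inv @ A)          # approximately the identity matrix
```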
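For chapter 5, the L1 and L2 norms of a vector, for instance an error vector between predictions and actual values:

```python
import numpy as np

error = np.array([3., -4.])

print(np.linalg.norm(error, 1))  # L1 norm: |3| + |-4| = 7
print(np.linalg.norm(error))     # L2 norm: sqrt(3**2 + 4**2) = 5
print(np.sum(error ** 2))        # squared L2 norm: 25
```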
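For chapter 7, applying a matrix to one of its eigenvectors only rescales it by the corresponding eigenvalue:

```python
import numpy as np

A = np.array([[4., 1.],
              [2., 3.]])

eigenvalues, eigenvectors = np.linalg.eig(A)
v = eigenvectors[:, 0]  # the first eigenvector (a column of the matrix)
lam = eigenvalues[0]    # the corresponding eigenvalue

print(A @ v)    # same direction as v...
print(lam * v)  # ...just rescaled by the eigenvalue
```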
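For chapter 8, the SVD decomposes a matrix into three matrices, and multiplying them back recovers the original transformation:

```python
import numpy as np

A = np.array([[3., 0.],
              [4., 5.]])

U, s, Vt = np.linalg.svd(A)  # A = U @ diag(s) @ Vt
print(s)                     # singular values, in decreasing order
print(U @ np.diag(s) @ Vt)   # recombining the three sub-transformations gives A back
```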
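For chapter 9, a tiny least-squares example: finding the best-fit line through a few made-up points with the pseudoinverse:

```python
import numpy as np

# A few points that roughly follow y = 2x + 1 (with some noise)
x = np.array([0., 1., 2., 3.])
y = np.array([1.1, 2.9, 5.2, 6.8])

X = np.vstack([x, np.ones_like(x)]).T     # design matrix with a column of ones
slope, intercept = np.linalg.pinv(X) @ y  # the pseudoinverse gives the best fit
print(slope, intercept)                   # close to 2 and 1
```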
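For chapter 10, the trace is simply the sum of the diagonal elements:

```python
import numpy as np

A = np.array([[2, 9, 8],
              [4, 7, 1],
              [8, 2, 5]])

print(np.trace(A))         # 2 + 7 + 5 = 14
print(np.sum(np.diag(A)))  # the same thing, written explicitly
```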
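For chapter 11, the determinant as the factor by which a transformation scales areas, with a negative sign when the orientation is flipped:

```python
import numpy as np

A = np.array([[2., 0.],
              [0., 3.]])  # stretches x by 2 and y by 3: areas are multiplied by 6
B = np.array([[0., 1.],
              [1., 0.]])  # swaps the two axes: orientation is flipped

print(np.linalg.det(A))   # 6.0
print(np.linalg.det(B))   # -1.0
```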
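Finally, for chapter 12, a bare-bones PCA on random 2-D data using the eigendecomposition of the covariance matrix (one common way to do it; the chapter goes through the reasoning step by step):

```python
import numpy as np

np.random.seed(0)
# Correlated 2-D data, made up for the example
X = np.random.randn(200, 2) @ np.array([[3., 1.],
                                        [1., 1.]])
X = X - X.mean(axis=0)                 # center the data

cov = (X.T @ X) / (len(X) - 1)         # covariance matrix of the centered data
eigenvalues, eigenvectors = np.linalg.eigh(cov)
order = np.argsort(eigenvalues)[::-1]  # sort the components by explained variance
components = eigenvectors[:, order]

X_projected = X @ components[:, :1]    # projection on the first principal component
print(components[:, 0])                # direction of maximal variance
print(eigenvalues[order][0])           # variance along that direction
```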

Requirements

This content is aimed at beginners, but it helps to have at least some experience with mathematics.

Enjoy

I hope that you will find something interesting in this series. I tried to be as accurate as I could. If you find errors, misunderstandings or typos, please report them! You can send me an email or open issues and pull requests on the notebooks' GitHub.

References

Goodfellow, I., Bengio, Y., & Courville, A. (2016). Deep learning. MIT press.
