# Mathematics of Machine Learning early access

### Machine learning theory made easy.

So, you want to master machine learning. Even though you have experience in the field, sometimes you still feel that something is missing: a look behind the curtain.

Have you ever felt that the learning curve was so steep that it was difficult even to start? That the theory was so dry and seemingly irrelevant that you couldn't get beyond the basics?

If so, I am building something for you. I am working to create the best resource out there for studying the mathematics of machine learning.

Join the early access and be a part of the journey!

#### Math explained, as simply as possible.

Every concept is explained step by step, from elementary to advanced. No fancy tricks or mathematical magic. Intuition and motivation first, technical explanations second.

#### Open up the black boxes.

Machine learning is full of mysterious black boxes. Looking inside them allows you to be a master of your field and never be in the dark when things go wrong.

#### Be a part of the process.

This book is being written in public. With early access, you'll get each chapter as I finish it, along with a personal hotline to me. Is something not explained properly? Is a concept not motivated with applications? Let me know, and I'll get right on it!

### Preview chapter!

If you would like to see a sample, I've got you covered! Check out the first two chapters, about vectors in theory and in practice!

https://tivadardanka.com/mathematics-of-machine-learning-preview/

### What you'll get

• The latest version of the book, in an interactive Jupyter Book format + PDF.

• Exclusive access to a new sub-chapter each week as I finish it. (See my planned roadmap below.)

• A personal hotline where you can share your feedback, helping me build the best learning resource for you.

### What I'll get

Writing a book is a long and challenging project. I want to do this the right way, so I decided to dedicate 100% of my time and energy to it. **However, I can't do this without your support.** I created the Early Access Program for those wishing to join me on this journey. When you sign up for the Early Access Program, I'll get

• your financial support so I can work on this project full time,

• and your continual feedback, which is essential for me to write the best book on the subject for you.

### Refund policy

If you find that the Early Access Program is not for you, no worries! Let me know within 30 days of your purchase, and I'll refund you immediately, no questions asked.

### Preliminary table of contents

**Part 1.** Linear algebra

• Vectors in theory (✓ published!)

• Vectors in practice (✓ published!)

• The geometric structure of vector spaces: measuring distances (✓ published!)

• Inner products, angles, and lots of reasons to care about them (✓ published!)

• The first steps in computational linear algebra (✓ published!)

• Linear transformations (✓ published!)

• Matrices, the workhorses of machine learning (✓ published!)

• Determinants, or how linear transformations affect volume and magnitude (✓ published!)

• Linear equations (✓ published!)

• The LU decomposition (✓ published!)

• Determinants in practice (✓ published!)

• Eigenvalues and eigenvectors (✓ published!)

• Special transformations and matrix decompositions (✓ published!)

• Computing eigenvalues (✓ published!)

**Part 2.** Functions

• Functions in theory (✓ published!)

• Functions in practice (✓ published!)

• Numbers (✓ published!)

• Sequences (✓ published!)

• The topology of real numbers (✓ published!)

• Limits and continuity (✓ published!)

• Differentiation in theory (✓ published!)

• Differentiation in practice (✓ published!)

• Minima and maxima (✓ published!)

• The basics of gradient descent (✓ published!)

• Integration in theory (✓ published!)

• Integration in practice (✓ published!)

• Why gradient descent works (✓ published!)

**Part 3.** Multivariable calculus

• Multivariable functions (✓ published!)

• Partial derivatives and gradients

• Minima and maxima in multiple dimensions

• Gradient descent in its full form

• Constrained optimization

• Integration in multiple dimensions

**Part 4.** Probability theory

• The mathematical concept of probability

• Distributions and densities

• Random variables

• Conditional probability

• Expected value

• Information theory and entropy

• Multidimensional distributions

**Part 5.** Statistics

• Fundamentals of parameter estimation

• Maximum likelihood estimation

• The Bayesian viewpoint of statistics

• Bias and variance

• Measuring predictive performance of statistical models

• Multivariate methods

**Part 6.** Machine learning

• The taxonomy of machine learning tasks

• Linear and logistic regression

• Fundamentals of clustering

• Principal Component Analysis

• Most common loss functions and what’s behind them

• Regularization of machine learning models

• t-distributed stochastic neighbor embedding

**Part 7.** Neural networks

• Logistic regression revisited

• Activation functions

• Computational graphs

• Backpropagation

• Loss functions, from a neural network perspective

• Weight initialization

**Part 8.** Advanced optimization

• Stochastic gradient descent

• Adaptive methods

• Accelerated schemes

• The Lookahead optimizer

• Ranger

**Part 9.** Convolutional networks

• The convolutional layer, in-depth

• Dropout and BatchNorm

• Fundamental tasks of computer vision

• AlexNet and ResNet

• Autoencoders

• Generative Adversarial Networks

### Planned roadmap

**2021 Q3 - 2022 Q2:** Core math chapters

• Calculus and multivariate calculus

• Linear algebra

• Probability theory

• Mathematical statistics

**2022 Q3 - 2022 Q4:** Machine learning chapters

• Classical machine learning

• Neural networks and convolutional networks

• Advanced topics

**2022 Q4:** Editing and finishing touches

• Finalizing and prettifying figures

• Editing the text and improving the style

Early Access eBook

##### Format

Digital (PDF + interactive book in HTML)
