Mathematical Engineering of Deep Learning

A book by Benoit Liquet, Sarat Moka and Yoni Nazarathy.
Appears in the Chapman & Hall/CRC Data Science Series.

Synopsis: This book provides a complete and concise overview of the mathematical engineering of deep learning. In addition to covering deep learning foundations, the treatment includes convolutional neural networks, recurrent neural networks, transformers, generative adversarial networks, diffusion models, reinforcement learning, graph neural networks, and multiple tricks of the trade. The focus is on the basic mathematical description of deep learning models, algorithms, and methods. The presentation is mostly agnostic to computer code, neuroscientific relationships, historical perspectives, and theoretical research. The benefit of such an approach is that a mathematically equipped reader can quickly grasp the essence of modern deep learning algorithms, models, and techniques without having to look at computer code, neuroscience, or the historical progression.

Deep learning is readily described in the language of mathematics, at a level accessible to many professionals. Readers from the fields of engineering, signal processing, statistics, physics, pure mathematics, econometrics, operations research, quantitative management, applied machine learning, or applied deep learning will quickly gain insights into the key mathematical engineering components of the field.


Get your copy on Amazon. A free online HTML version will appear soon.


Draft book (Feb 28, 2024):

If you spot a typo, please let us know using this simple form.

Mathematical background:

Source code:

Cite:
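A BibTeX entry along the following lines can be adapted; the citation key and year are assumptions based on this page and should be checked against the published volume:

  @book{LiquetMokaNazarathy2024,
    author    = {Liquet, Benoit and Moka, Sarat and Nazarathy, Yoni},
    title     = {Mathematical Engineering of Deep Learning},
    publisher = {Chapman \& Hall/CRC},
    series    = {Chapman \& Hall/CRC Data Science Series},
    year      = {2024}
  }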

Related courses and workshops: