Data-Driven Science and Engineering: Machine Learning, Dynamical Systems, and Control
Steven L. Brunton, J. Nathan Kutz

#Data-Driven
#Machine_Learning
#Python
#MATLAB
#PCA
#DFT
#FFT
#RPCA
#SVD
Data-driven discovery is revolutionizing how we model, predict, and control complex systems. Now with Python and MATLAB®, this textbook trains mathematical scientists and engineers for the next generation of scientific discovery by offering a broad overview of the growing intersection of data-driven methods, machine learning, applied optimization, and the classical fields of engineering mathematics and mathematical physics. With a focus on integrating dynamical systems modeling and control with modern methods in applied machine learning, the text includes methods chosen for their relevance, simplicity, and generality. Topics range from introductory to research-level material, making the book accessible to advanced undergraduate and beginning graduate students in the engineering and physical sciences. The second edition features new chapters on reinforcement learning and physics-informed machine learning, significant new sections throughout, and chapter exercises. Online supplementary material – including lecture videos for each section, homework, data, and code in MATLAB®, Python, Julia, and R – is available on databookuw.com.
Part I: Dimensionality Reduction and Transforms
1) Singular Value Decomposition (SVD)
1.1 Overview
1.2 Matrix Approximation
1.3 Mathematical Properties and Manipulations
1.4 Pseudo-Inverse, Least-Squares, and Regression
1.5 Principal Component Analysis (PCA)
1.6 Eigenfaces Example
1.7 Truncation and Alignment
1.8 Randomized Singular Value Decomposition
1.9 Tensor Decompositions and N-Way Data Arrays
2) Fourier and Wavelet Transforms
2.1 Fourier Series and Fourier Transforms
2.2 Discrete Fourier Transform (DFT) and Fast Fourier Transform (FFT)
2.3 Transforming Partial Differential Equations
2.4 Gabor Transform and the Spectrogram
2.5 Laplace Transform
2.6 Wavelets and Multi-Resolution Analysis
2.7 Two-Dimensional Transforms and Image Processing
3) Sparsity and Compressed Sensing
3.1 Sparsity and Compression
3.2 Compressed Sensing
3.3 Compressed Sensing Examples
3.4 The Geometry of Compression
3.5 Sparse Regression
3.6 Sparse Representation
3.7 Robust Principal Component Analysis (RPCA)
3.8 Sparse Sensor Placement
Part II: Machine Learning and Data Analysis
4) Regression and Model Selection
4.1 Classic Curve Fitting
4.2 Nonlinear Regression and Gradient Descent
4.3 Regression and Ax = b: Over- and Under-Determined Systems
4.4 Optimization as the Cornerstone of Regression
4.5 The Pareto Front and Lex Parsimoniae
4.6 Model Selection: Cross-Validation
4.7 Model Selection: Information Criteria
5) Clustering and Classification
5.1 Feature Selection and Data Mining
5.2 Supervised versus Unsupervised Learning
5.3 Unsupervised Learning: k-Means Clustering
5.4 Unsupervised Hierarchical Clustering: Dendrogram
5.5 Mixture Models and the Expectation-Maximization Algorithm
5.6 Supervised Learning and Linear Discriminants
5.7 Support Vector Machines (SVM)
5.8 Classification Trees and Random Forest
5.9 Top 10 Algorithms of Data Mining circa 2008 (Before the Deep Learning Revolution)
6) Neural Networks and Deep Learning
6.1 Neural Networks: Single-Layer Networks
6.2 Multi-Layer Networks and Activation Functions
6.3 The Backpropagation Algorithm
6.4 The Stochastic Gradient Descent Algorithm
6.5 Deep Convolutional Neural Networks
6.6 Neural Networks for Dynamical Systems
6.7 Recurrent Neural Networks
6.8 Autoencoders
6.9 Generative Adversarial Networks (GANs)
6.10 The Diversity of Neural Networks
Part III: Dynamics and Control
7) Data-Driven Dynamical Systems
7.1 Overview, Motivations, and Challenges
7.2 Dynamic Mode Decomposition (DMD)
7.3 Sparse Identification of Nonlinear Dynamics (SINDy)
7.4 Koopman Operator Theory
7.5 Data-Driven Koopman Analysis
8) Linear Control Theory
8.1 Closed-Loop Feedback Control
8.2 Linear Time-Invariant Systems
8.3 Controllability and Observability
8.4 Optimal Full-State Control: Linear–Quadratic Regulator (LQR)
8.5 Optimal Full-State Estimation: the Kalman Filter
8.6 Optimal Sensor-Based Control: Linear–Quadratic Gaussian (LQG)
8.7 Case Study: Inverted Pendulum on a Cart
8.8 Robust Control and Frequency-Domain Techniques
9) Balanced Models for Control
9.1 Model Reduction and System Identification
9.2 Balanced Model Reduction
9.3 System Identification
Part IV: Advanced Data-Driven Modeling and Control
10) Data-Driven Control
10.1 Model Predictive Control (MPC)
10.2 Nonlinear System Identification for Control
10.3 Machine Learning Control
10.4 Adaptive Extremum-Seeking Control
11) Reinforcement Learning
11.1 Overview and Mathematical Formulation
11.2 Model-Based Optimization and Control
11.3 Model-Free Reinforcement Learning and Q-Learning
11.4 Deep Reinforcement Learning
11.5 Applications and Environments
11.6 Optimal Nonlinear Control
12) Reduced-Order Models (ROMs)
12.1 Proper Orthogonal Decomposition (POD) for Partial Differential Equations
12.2 Optimal Basis Elements: the POD Expansion
12.3 POD and Soliton Dynamics
12.4 Continuous Formulation of POD
12.5 POD with Symmetries: Rotations and Translations
12.6 Neural Networks for Time-Stepping with POD
12.7 Leveraging DMD and SINDy for Galerkin–POD
13) Interpolation for Parametric Reduced-Order Models
13.1 Gappy POD
13.2 Error and Convergence of Gappy POD
13.3 Gappy Measurements: Minimize Condition Number
13.4 Gappy Measurements: Maximal Variance
13.5 POD and the Discrete Empirical Interpolation Method (DEIM)
13.6 DEIM Algorithm Implementation
13.7 Decoder Networks for Interpolation
13.8 Randomization and Compression for ROMs
13.9 Machine Learning ROMs
14) Physics-Informed Machine Learning
14.1 Mathematical Foundations
14.2 SINDy Autoencoder: Coordinates and Dynamics
14.3 Koopman Forecasting
14.4 Learning Nonlinear Operators
14.5 Physics-Informed Neural Networks (PINNs)
14.6 Learning Coarse-Graining for PDEs
14.7 Deep Learning and Boundary Value Problems
'Finally, a book that introduces data science in a context that will make any mechanical engineer feel comfortable. Data science is the new calculus, and no engineer should graduate without a thorough understanding of the topic.' Hod Lipson, Columbia University
'This book is a must-have for anyone interested in data-driven modeling and simulations. Readers as diverse as undergraduate STEM students and seasoned researchers would find it useful as a guide to this rapidly evolving field. Topics covered by the monograph include dimension reduction, machine learning, and robust control of dynamical systems with uncertain/random inputs. Every chapter contains codes and homework problems, which make this treatise ideal for the classroom setting. The book is supplemented with online lectures, which are not only educational but also entertaining to watch.' Daniel M. Tartakovsky, Stanford University
'Engineering principles will always be based on physics, and the models that underpin engineering will be derived from these physical laws. But in the future models based on relationships in large datasets will be as important and, when used alongside physics-based models, will lead to new insights and designs. Brunton and Kutz will equip students and practitioners with the tools they will need for this exciting future.' Greg Hyslop, Boeing
'Brunton and Kutz's book is fast becoming an indispensable resource for machine learning and data-driven learning in science and engineering. The second edition adds several timely topics in this lively field, including reinforcement learning and physics-informed machine learning. The text balances theoretical foundations and concrete examples with code, making it accessible and practical for students and practitioners alike.' Tim Colonius, California Institute of Technology
'This is a must read for those who are interested in understanding what machine learning can do for dynamical systems! Steve and Nathan have done an excellent job in bringing everyone up to speed to the modern application of machine learning on these complex dynamical systems.' Shirley Ho, Flatiron Institute/New York University
About the Authors
Steven L. Brunton is the James B. Morrison Professor of Mechanical Engineering at the University of Washington. He is also Adjunct Professor of Applied Mathematics and Computer Science, and a Data Science Fellow at the eScience Institute. Steve received his B.S. in mathematics from Caltech in 2006 and his Ph.D. in mechanical and aerospace engineering from Princeton in 2012. His research combines machine learning with dynamical systems to model and control systems in fluid dynamics, biolocomotion, optics, energy systems, and manufacturing. He received the Army and Air Force Young Investigator Program (YIP) awards and the Presidential Early Career Award for Scientists and Engineers (PECASE). Steve is also passionate about teaching math to engineers, as co-author of three textbooks and through his popular YouTube channel, under the moniker “eigensteve” (youtube.com/c/eigensteve).
J. Nathan Kutz is the Robert Bolles and Yasuko Endo Professor of Applied Mathematics and Electrical and Computer Engineering at the University of Washington and Director of the NSF AI Institute in Dynamic Systems. He is also Adjunct Professor of Mechanical Engineering and Senior Data-Science Fellow at the eScience Institute. His research interests lie at the intersection of dynamical systems and machine learning.