Information Theory: From Coding to Learning
Yury Polyanskiy, Yihong Wu

#coding
#computer_science
#statistical_learning
This enthusiastic introduction to the fundamentals of information theory builds from classical Shannon theory through to modern applications in statistical learning, equipping students with a uniquely well-rounded and rigorous foundation for further study.
The book introduces core topics such as data compression, channel coding, and rate-distortion theory using a unique finite-blocklength approach. Over 210 end-of-part exercises and numerous examples introduce students to contemporary applications in statistics, machine learning, and modern communication theory.
This textbook presents information-theoretic methods with applications in statistical learning and computer science, such as f-divergences, PAC-Bayes and variational principles, Kolmogorov's metric entropy, strong data processing inequalities, and entropic upper bounds for statistical estimation.
Accompanied by a solutions manual for instructors and additional standalone chapters on more specialized topics in information theory, this is the ideal introductory textbook for senior undergraduate and graduate students in electrical engineering, statistics, and computer science.
An enthusiastic introduction to the fundamentals of information theory, from classical Shannon theory to modern statistical learning.
Part I Information measures
1-Entropy
2-Divergence
3-Mutual information
4-Variational characterizations and continuity of information measures
5-Extremization of mutual information: capacity saddle point
6-Tensorization and information rates
7-f-divergences
8-Entropy method in combinatorics and geometry
9-Random number generators
Exercises for Part I
Part II Lossless data compression
10-Variable-length compression
11-Fixed-length compression and Slepian-Wolf theorem
12-Entropy of ergodic processes
13-Universal compression
Exercises for Part II
Part III Hypothesis testing and large deviations
14-Neyman-Pearson lemma
15-Information projection and large deviations
16-Hypothesis testing: error exponents
Exercises for Part III
Part V Rate-distortion theory and metric entropy
17-Rate-distortion theory
18-Rate distortion: achievability bounds
19-Evaluating rate-distortion function. Lossy source-channel separation
20-Metric entropy
Exercises for Part V
Part VI Statistical applications
21-Basics of statistical decision theory
22-Classical large-sample asymptotics
23-Mutual information method
24-Lower bounds via reduction to hypothesis testing
25-Entropic bounds for statistical estimation
26-Strong data processing inequality
Exercises for Part VI
About the Authors
Yury Polyanskiy is a Professor of Electrical Engineering and Computer Science at the Massachusetts Institute of Technology, with a focus on information theory, statistical machine learning, error-correcting codes, wireless communication, and fault tolerance. He is the recipient of the 2020 IEEE Information Theory Society James Massey Award for outstanding achievement in research and teaching in Information Theory.
Yihong Wu is a Professor of Statistics and Data Science at Yale University, focusing on the theoretical and algorithmic aspects of high-dimensional statistics, information theory, and optimization. He is a recipient of a 2018 Sloan Research Fellowship in Mathematics.