Information-theoretic Lower Bounds in Data Science


Stanford School of Engineering



Ideas and techniques for information-theoretic lower bounds, with examples in machine learning, statistics, information theory, theoretical computer science, optimization, online learning and bandits, operations research, and more. Topics include: deficiency and Le Cam's distance; classical asymptotics; information measures and joint range; Le Cam, Assouad, and Fano methods; the Ingster-Suslina method; the method of moments; strong converses; the constrained risk inequality; compression arguments; privacy-constrained estimation; sequential experimental design; and statistical/computational tradeoffs.
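To give a flavor of the Fano method mentioned above, a typical minimax lower bound reduces estimation to testing over a finite, well-separated family of parameters (the notation below is a standard sketch, not drawn from the course materials):

```latex
% Fano method sketch: given parameters \theta_1,\dots,\theta_M with
% pairwise separation d(\theta_j,\theta_k) \ge 2\delta, and V uniform
% on \{1,\dots,M\} with data X \sim P_{\theta_V},
\inf_{\hat\theta}\ \sup_{\theta}\ \mathbb{E}_\theta\bigl[d(\hat\theta, \theta)\bigr]
\;\ge\; \delta \left(1 - \frac{I(V; X) + \log 2}{\log M}\right),
% where I(V;X) is the mutual information between the index and the data.
```

Bounding the mutual information (e.g., via KL divergences between the $P_{\theta_j}$) then yields explicit rates.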


Prerequisites: EE278, CS229T, STATS300A, or equivalent, or instructor's permission.
