Information Theory (Now EE276)

EE376A

Stanford School of Engineering



Description

Due to the rapid growth of information and data over the last decade, a new mindset has emerged: information is seen not merely as an abstract idea, but as a precise mathematical quantity. This mindset led to the development of information theory, the study of the quantification, storage, and communication of information, and the theory from which the internet was developed.

This course covers the basic concepts of information theory, including entropy, data compression, mutual information, and channel capacity, before turning to applications in statistics and machine learning.
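For orientation, the two central quantities named above have the following standard definitions (textbook notation, not specific to this offering). For discrete random variables X and Y with joint distribution p(x, y):

    H(X) = -\sum_{x} p(x)\,\log_2 p(x),
    \qquad
    I(X;Y) = \sum_{x,y} p(x,y)\,\log_2 \frac{p(x,y)}{p(x)\,p(y)} = H(X) - H(X \mid Y).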

NOTE: This course is now offered as EE276 (as of the Winter quarter of 2019-20).

Prerequisites

EE178 or STATS116 or equivalent.

Topics include

  • Entropy and mutual information
  • Source coding theorem and Huffman code
  • Universal compression and distribution estimation
  • Channel capacity and noisy channel coding theorem
  • Polar codes, Gaussian channels and continuous random variables
  • Maximum entropy principle and applications to hypothesis testing
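As a small taste of the first two topics above, the sketch below computes the entropy of a source and builds a Huffman code for it in Python. The distribution is hypothetical and the code is an illustration, not part of the course materials.

    import heapq
    from math import log2

    # Hypothetical source distribution (illustrative only).
    probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

    # Shannon entropy: a lower bound on the expected code length per symbol.
    entropy = -sum(p * log2(p) for p in probs.values())

    # Huffman coding: repeatedly merge the two least probable nodes,
    # prefixing "0" to one subtree's codewords and "1" to the other's.
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)  # tie-breaker so tuples never compare the dicts
    while len(heap) > 1:
        p1, _, codes1 = heapq.heappop(heap)
        p2, _, codes2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in codes1.items()}
        merged.update({s: "1" + c for s, c in codes2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    codes = heap[0][2]

    avg_len = sum(probs[s] * len(c) for s, c in codes.items())
    print(f"entropy         = {entropy:.3f} bits")   # 1.750
    print(f"expected length = {avg_len:.3f} bits")   # 1.750
    print(f"codes: {codes}")

Because the probabilities here are powers of 1/2, the Huffman code meets the entropy bound exactly; in general, the source coding theorem guarantees an expected length within one bit of the entropy.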

Note on Course Availability

This course is typically offered Winter quarter.

The course schedule is displayed for planning purposes; courses may be modified, rescheduled, or cancelled. Course availability will be considered finalized on the first day of open enrollment. For quarterly enrollment dates, please refer to our graduate certificate homepage.
