Synopsis for B11a: Communication Theory


Number of lectures: 16 MT
Lecturer(s): Dr Stirzaker

Course Description

Level: H-level
Method of Assessment: Written examination.
Weight: Half-unit (OSS paper code 2650).

Recommended Prerequisites:

Part A Probability would be helpful, but not essential.

Overview

The aim of the course is to investigate methods for the communication of information from a sender, along a channel of some kind, to a receiver. If errors are not a concern we are interested in codes that yield fast communication; if the channel is noisy we are interested in achieving both speed and reliability. A key concept is that of information as reduction in uncertainty. The highlight of the course is Shannon's Noisy Coding Theorem.

Learning Outcomes

  1. Know the various forms of entropy and be able to manipulate them.
  2. Know what data compression and source coding are, and be able to construct compact codes.
  3. Know what channel coding and channel capacity are, and be able to apply them.

Synopsis

Uncertainty (entropy); conditional uncertainty; information. Chain rules; relative entropy; Gibbs' inequality; asymptotic equipartition and typical sequences. Instantaneous and uniquely decipherable codes; the noiseless coding theorem for discrete memoryless sources; constructing compact codes.
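As a small illustration of the noiseless coding theorem, the sketch below (not part of the official course material) computes the entropy of a discrete memoryless source and the expected length of a binary Huffman code for it; the theorem guarantees H(X) <= L < H(X) + 1, and for dyadic probabilities the bound is met exactly.

```python
import heapq
from math import log2

def entropy(probs):
    """Shannon entropy H(X) in bits: -sum p log2 p."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman (compact) code.

    Repeatedly merge the two least likely symbols; each merge
    adds one bit to the length of every symbol inside it.
    """
    heap = [(p, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, s1 = heapq.heappop(heap)
        p2, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, s1 + s2))
    return lengths

# Example source with dyadic probabilities (0.5, 0.25, 0.125, 0.125):
p = [0.5, 0.25, 0.125, 0.125]
H = entropy(p)                                            # 1.75 bits
L = sum(pi * li for pi, li in zip(p, huffman_lengths(p))) # 1.75 bits
# Noiseless coding theorem: H <= L < H + 1; here L = H exactly.
```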

The discrete memoryless channel; decoding rules; the capacity of a channel. The noisy coding theorem for discrete memoryless channels and binary symmetric channels.

Extensions to more general sources and channels.

Reading List

  1. D. J. A. Welsh, Codes and Cryptography (Oxford University Press, 1988), Chapters 1–3, 5.
  2. T. Cover and J. Thomas, Elements of Information Theory (Wiley, 1991), Chapters 1–5, 8.

Further Reading

  1. R. B. Ash, Information Theory (Dover, 1990).
  2. D. MacKay, Information Theory, Inference, and Learning Algorithms (Cambridge, 2003). [Can be seen at: http://www.inference.phy.cam.ac.uk/mackay/itila. Do not infringe the copyright!]
  3. G. Jones and J. M. Jones, Information and Coding Theory (Springer, 2000), Chapters 1–5.