Sunday, 24/09/2017

Information Theory


Course Code:  22Α702
Level:  -
Semester:  7th



A course on the theoretical limits of data compression and transmission, and on practical source and channel coding schemes that attempt to approach those limits.

Course description: Elements of probability theory and principles of combinatorics. Introduction to information theory: entropy, mutual information, relative entropy, and their properties. Discrete information sources with memory; entropy rate.

Data compression: fixed-length coding and the source coding theorem. Variable-length coding: classes of codes, the Kraft inequality, Shannon and Fano codes. Optimal codes: Huffman coding, adaptive Huffman codes, arithmetic coding.

Discrete channels and capacity. The channel coding theorem for discrete memoryless channels. The source-channel separation theorem.

Information measures for continuous random variables: differential entropy. Discrete-time continuous-alphabet channels and the capacity of the Gaussian channel. Continuous-time channels and the capacity of the bandlimited Gaussian channel. Parallel Gaussian channels and waterfilling.

Error detection and correction: introduction to channel codes. Linear codes: generator matrix and parity-check matrix, coset decoding, syndrome decoding, Hamming codes, dual codes, perfect codes. Cyclic codes: generator polynomial, generator and parity-check matrices, systematic encoding, syndromes, decoding of cyclic codes. Brief overview of convolutional codes, trellis codes, turbo codes, and LDPC codes.
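As a small illustration of the entropy concept listed in the syllabus, the following Python sketch (ours, not part of the course materials) computes the Shannon entropy H(X) = -Σ p(x) log2 p(x) of a discrete distribution:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H(X) = -sum_x p(x) * log2 p(x).
    Terms with p(x) = 0 contribute nothing (0 * log 0 := 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is the most unpredictable binary source: 1 bit per toss.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin carries less information per toss.
print(entropy([0.9, 0.1]))   # about 0.47 bits
```

The same function also gives the capacity of a binary symmetric channel with crossover probability p, since C = 1 - H(p) = 1 - entropy([p, 1 - p]).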
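Huffman coding, listed under optimal codes, admits a compact sketch: repeatedly merge the two least-probable nodes, prepending a 0 to every codeword in one subtree and a 1 in the other. The heap-based toy implementation below is our own illustration, not the course's reference code:

```python
import heapq

def huffman_code(freqs):
    """Build a binary prefix code from {symbol: probability}.
    Heap entries are [weight, tiebreak_id, {symbol: partial codeword}]."""
    heap = [[w, i, {sym: ""}] for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)   # least probable subtree
        hi = heapq.heappop(heap)   # second least probable subtree
        for sym in lo[2]:
            lo[2][sym] = "0" + lo[2][sym]
        for sym in hi[2]:
            hi[2][sym] = "1" + hi[2][sym]
        heapq.heappush(heap, [lo[0] + hi[0], count, {**lo[2], **hi[2]}])
        count += 1
    return heap[0][2]

codes = huffman_code({"a": 0.5, "b": 0.25, "c": 0.25})
print(codes)  # e.g. {'a': '0', 'b': '10', 'c': '11'} up to bit relabeling
```

For this dyadic distribution the code lengths equal -log2 p exactly, and the Kraft sum Σ 2^(-l) comes out to 1, matching the Kraft inequality with equality for a complete code tree.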
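The linear-code topics (generator matrix, parity-check matrix, syndrome decoding, Hamming codes) can be tied together with the classic (7,4) Hamming code. The systematic matrices below are one standard choice, assumed for illustration and not taken from the course notes:

```python
# Systematic (7,4) Hamming code over GF(2): codeword = [m0 m1 m2 m3 | p0 p1 p2].
G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]
# Parity-check matrix H = [P^T | I3], so G @ H^T = 0 (mod 2).
H = [
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
]

def encode(m):
    """c = m G (mod 2) for a 4-bit message m."""
    return [sum(m[i] * G[i][j] for i in range(4)) % 2 for j in range(7)]

def syndrome(r):
    """s = r H^T (mod 2); zero iff r is a codeword."""
    return [sum(r[j] * H[k][j] for j in range(7)) % 2 for k in range(3)]

def decode(r):
    """Single-error syndrome decoding: a nonzero syndrome equals the
    column of H at the flipped position, so flip that bit back."""
    s = syndrome(r)
    if any(s):
        cols = [[H[k][j] for k in range(3)] for j in range(7)]
        r = r[:]
        r[cols.index(s)] ^= 1
    return r[:4]  # systematic code: the first four bits are the message
```

Because every nonzero 3-bit syndrome appears exactly once as a column of H, the code corrects any single bit error, which is the defining property of a perfect single-error-correcting (Hamming) code.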




Denazis Spyros

Birbas Michael