Course Overview

1- Introduction:
  . What is information theory?
  . What is the purpose of information theory?
  . Why do we need to study information theory?
  . The communication model
2- Sets, operations on sets
3- Trial, probability space, events, random variables, probability distributions, expected values
4- Conditional probability, Bayes' theorem, independence of random variables
5- Memoryless channels, stationarity, the law of large numbers
6- Stochastic processes, Markov processes, ergodic processes
7- Information sources
8- Source coding and decoding: instantaneous codes, Kraft's inequality, uniquely decodable codes, McMillan's inequality
9- Coding strategies: Huffman code, run-length encoding, Fano code, Shannon code, two-pass Huffman code, Ziv-Lempel coding
10- Information and entropy: logarithms, Kolmogorov complexity, entropy and expected value, equiprobable distributions, properties of entropy, four-state random variables (see the short example after this outline)
11- Conditional entropy, joint distributions, joint entropy, the chain rule
12- Mutual information, the relationship between entropy and mutual information
13- Communication channels, channel capacity, the binary symmetric channel, the noisy typewriter, symmetric channels
14- Error detection and correction: detection, the Hamming method, Hamming error correction
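As a small preview of topics 8 and 10, the sketch below computes the Shannon entropy H(X) = -Σ p(x) log2 p(x) of a distribution and evaluates the Kraft sum Σ 2^(-l_i) for a set of binary codeword lengths. Python is an assumption here (the syllabus prescribes no language), and the four-state distribution and code lengths are illustrative values chosen for this example, not material from the course.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H(X) = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def kraft_sum(lengths):
    """Kraft sum for binary codeword lengths; a value <= 1 is required for a uniquely decodable code."""
    return sum(2.0 ** -l for l in lengths)

# Illustrative four-state random variable (example values, not from the course).
probs = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]  # codeword lengths of a matching instantaneous (prefix-free) code

print(f"H(X)      = {entropy(probs):.3f} bits")  # 1.750
print(f"Kraft sum = {kraft_sum(lengths):.3f}")   # 1.000, so these lengths are feasible
```

For this dyadic distribution the expected codeword length (0.5·1 + 0.25·2 + 0.125·3 + 0.125·3 = 1.75 bits) equals the entropy exactly, which previews the link between source coding and entropy developed in topics 8-10.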