Entropy and Information Theory
This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, and information and distortion measures and their properties. New...
Main Author:
Corporate Author:
Format: Electronic eBook
Language: English
Published: Boston, MA : Springer US, 2011.
Subjects:
Online Access: Full Text via HEAL-Link
Table of Contents:
- Preface
- Introduction
- Information Sources
- Pair Processes: Channels, Codes, and Couplings
- Entropy
- The Entropy Ergodic Theorem
- Distortion and Approximation
- Distortion and Entropy
- Relative Entropy
- Information Rates
- Distortion vs. Rate
- Relative Entropy Rates
- Ergodic Theorems for Densities
- Source Coding Theorems
- Coding for Noisy Channels
- Bibliography
- References
- Index.