The latest edition of this classic is updated with new problem sets and material
The Second Edition of this fundamental textbook maintains the book's tradition of clear, thought-provoking instruction. Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory.
All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers. The historical notes that follow each chapter recap the main points.
The Second Edition features:
* Chapters reorganized to improve teaching
* 200 new problems
* New material on source coding, portfolio theory, and feedback capacity
* Updated references
Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications.
Bad news: you don't have much of a choice when it comes to authoritative textbooks on information theory. Good news: Cover is not a bad book to be stuck with.
Every chapter, theorem, idea, and problem is well-motivated. Landau-lovers of terse textbooks might balk at the equation-less pages preceding some results, but I greatly appreciated the intuitive prefaces to mathematical formalism.
This book also has one of the finest problem sets I've seen. Each problem is designed to convey a new idea or point out a subtlety of one mentioned in the text. Striking just the right balance of guidance and openness, doing Cover problems feels more like being in a sandbox than a forced labor camp.
I laughed, I cried, I learned.
Two criticisms, though. First, Cover has a tendency to mention vocabulary and use notation that he has not yet introduced, which can make for some frustrating battles of interpretation. Second, some of the problems require a good amount of knowledge not found in this book. I took a course based on this book with zero background in electrical engineering (I get my kicks from physics and maths), and at times I felt like I was trying to crack the Rosetta Stone.
It's a classic textbook in information theory. It does not require many mathematical preliminaries. It's thick and worth rereading. Its only disadvantage is that if you want to learn some deeper mathematical formalisms related to information theory (such as ergodic theory or algorithmic randomness), you need to seek out other books. But that is quite a rare need, and most students will be satisfied with this book.
I've only read a handful of chapters and done only a few dozen exercises. It's difficult, and it is a pity that solutions are not provided (though they can be found online!); but it is suitable for self-study and extremely rich in insights.
This didn't have much of the life and joy I associate with information theory (the life found in MacKay's YouTube lectures). It does the job of introducing information theory, but I can't say it makes it thrilling.
The first two chapters are quite good, but the notation takes a nosedive afterwards. Everywhere there is stuff like summing over arbitrary i's and j's instead of proper matrix/vector notation, terrible choices of variable labels (mu for what is really a probability), and i used for everything in time-series processes with nested lists of variables (instead of, e.g., i and t).
I definitely recommend the first two chapters if you want to find out what entropy means; I haven't seen anything better in years of hearing and reading about entropy from physics classes, CS classes, online articles, etc.
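For readers who haven't yet met the term: the entropy those chapters develop is H(X) = -Σ p(x) log₂ p(x), the average number of bits of surprise in a draw from X. A minimal sketch of that formula (my own illustration, not code from the book):

```python
import numpy as np

def entropy(pmf):
    """Shannon entropy in bits of a discrete pmf; zero-probability outcomes drop out."""
    p = np.asarray(pmf, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

print(entropy([0.5, 0.5]))    # fair coin: 1.0 bit per toss
print(entropy([0.9, 0.1]))    # biased coin: ~0.47 bits, less uncertain
print(entropy([1.0]))         # certain outcome: 0 bits
```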
I feel like the second sentence of the preface encapsulates the authors' effort to unfold and break down the concepts: “Everything should be made as simple as possible, but no simpler.” The descriptive force lingers below the entropy, though; this might be a good book if you already have an intuition for information theory.
When I was in graduate school, we were required to take a course in information theory, with Cover and Thomas' book as the required text. At the time, the course was so challenging for me that I came away with a sour view of information theory altogether. However, a few years ago I picked up the 2nd edition of the text and began my own study of information theory, using only the text itself. I greatly appreciated their development of asymptotic typicality and rate distortion. What I found (once I was able to focus purely on the authors' development of the theorems, their examples, and problems) was an extremely readable book, with very good (not perfect, but good) explanations of the theorems and concepts within information theory.
Cover and Thomas is THE classic information theory textbook. Here, the authors took on the ambitious task of making a comprehensive survey of (the still evolving) information theory. Admittedly, I got lost in the proofs about halfway through the text. However, the textual descriptions between proofs were extremely helpful for my overall understanding. The text is unapproachable for those without a solid math or computer science background, but probably one of the best textbooks I've ever worked through on my own.
This book is essentially the bible of information theory. It is a must-read for anybody interested in learning more about this constantly evolving discipline. Information theory provides nice tools such as mutual information, which is often a better metric than correlation due to its additivity (see the sketch below); a good paper on the problems with correlation can be found online.
The text is dense and the proofs can take a while to fully grasp, but it was more than worth the read.
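A quick sketch of the mutual-information point (a toy example of my own; the joint distribution and helper name are illustrative, not from the book or any particular paper). Mutual information computed from a joint pmf picks up the fully deterministic dependence Y = X² even though the Pearson correlation is exactly zero; the additivity the reviewer mentions is the related fact that I((X1,X2);(Y1,Y2)) = I(X1;Y1) + I(X2;Y2) when the two pairs are independent.

```python
import numpy as np

def mutual_information(joint):
    """I(X;Y) in bits from a joint pmf given as a matrix p[x, y]."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)   # marginal p(x)
    py = joint.sum(axis=0, keepdims=True)   # marginal p(y)
    mask = joint > 0                        # 0 log 0 terms contribute nothing
    return float((joint[mask] * np.log2(joint[mask] / (px @ py)[mask])).sum())

# Toy joint pmf: X uniform on {-1, 0, 1}, Y = X^2 (deterministic given X).
xs, ys = np.array([-1.0, 0.0, 1.0]), np.array([0.0, 1.0])
joint = np.array([[0, 1/3],     # x = -1  ->  y = 1
                  [1/3, 0],     # x =  0  ->  y = 0
                  [0, 1/3]])    # x =  1  ->  y = 1

# Covariance (hence correlation) is exactly zero under this pmf...
Ex  = (xs[:, None] * joint).sum()
Ey  = (ys[None, :] * joint).sum()
Exy = (np.outer(xs, ys) * joint).sum()
print("cov(X, Y) =", Exy - Ex * Ey)                # 0.0
# ...but Y is a function of X, so I(X;Y) = H(Y) = H(1/3, 2/3) > 0.
print("I(X;Y)    =", mutual_information(joint))    # ~0.918 bits
```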
I really like this book; it is so lucid and starts from the basics. But when it comes to the multiple-access and broadcast channels (MAC and BC), I prefer Network Information Theory by Abbas El Gamal.