

Elements of Information Theory 2nd Edition (Wiley Series in Telecommunications and Signal Processing)
A**R
An Excellent Introduction to Information Theory
I am writing this review in response to some confusion and unfairness I see in other reviews. Cover and Thomas have written a unique and ambitious introduction to a fascinating and complex subject; their book must be judged fairly and not compared to other books that have entirely different goals.

Claude Shannon provided a working definition of "information" in his seminal 1948 paper, A Mathematical Theory of Communication. Shannon's interest in that and subsequent papers was the attainment of reliable communication in noisy channels. The definition of information that Shannon gave was perfectly fitted to this task; indeed, it is easily shown that in the context studied by Shannon, the only meaningful measure of information content that will apply to random variables with known distribution must be (up to a multiplicative constant) of the now-familiar form h(p) = log(1/p).

However, Shannon freely admitted that his definition of information was limited in scope and was never envisioned as being universal. Shannon deliberately avoided the "murkier" aspects of human communication in framing his definitions; problematic themes such as knowledge, semantics, motivations and intentions of the sender and/or receiver, etc., were avoided altogether.

For several decades, Information Theory continued to exist as a subset of the theory of reliable communication. Some classical and highly regarded texts on the subject are Gallager, Ash, Viterbi and Omura, and McEliece. For those whose interest in Information Theory is motivated largely by questions from the field of digital communications, these texts remain unrivalled standards; Gallager, in particular, is so highly regarded by those who learned from it that it is still described as superior to many of its more recent, up-to-date successors.

In recent decades, Information Theory has been applied to problems from across a wide array of academic disciplines. Physicists have been forced to clarify the extent to which information is conserved in order to completely understand black hole dynamics; biologists have found extensive use of information-theoretic concepts in understanding the human genome; computer scientists have applied Information Theory to complex issues in computational vs. descriptive complexity (the Mandelbrot set, which has been called the most complex set in all of mathematics, is actually extremely simple from the point of view of Kolmogorov complexity); and John von Neumann's brilliant creation, game theory, which has been called "a universal language for the unification of the behavioral sciences," is intimately coupled to Information Theory, perhaps in ways that have not yet been fully appreciated or explored.

Cover and Thomas' book "Elements of Information Theory" is written for the reader who is interested in these eclectic and exciting applications of Information Theory. This book does NOT treat Information Theory as a subset of reliable communication theory; therefore, the book is NOT written as a competitor for Gallager's classic text.
Critics who ask for a more thorough treatment of rate distortion theory or convolutional codes are criticizing the authors for failing to include topics that are not even central to their goals for the text!

A very selective list of some of the more interesting topics that Cover and Thomas study includes: (1) the Asymptotic Equipartition Property and its consequences for data compression; (2) Information Theory and gambling; (3) Kolmogorov complexity and Chaitin's Omega; (4) Information Theory and statistics; and (5) Information Theory and the stock market. Item (4) on this list is only briefly introduced in Cover and Thomas's book, and appropriately so; however, readers who wish to pursue the fascinating subject of Fisher Information further should consider B. Roy Frieden's book Physics from Fisher Information: A Unification. Frieden identifies a principle of "extreme physical information" as a unifying theme across all of physics, deriving such classic equations as the Klein-Gordon equation, Maxwell's equations, and Einstein's field equations for general relativity from this information-theoretic principle.

This last point is quite typical of Cover and Thomas's book. I participated in a faculty seminar on Information Theory at my university a few years ago, in which we studied Cover and Thomas as our primary source. We were a diverse group, drawn from five different academic disciplines, and we all found that Cover and Thomas repeatedly introduced us to exciting and unexpected applications of Information Theory, always sending us to the journals for further, more in-depth study.

Cover and Thomas' book has become an established favorite in university courses on information theory. In truth, the book has few competitors. Interested readers looking for additional references might also consider David MacKay's book Information Theory, Inference, and Learning Algorithms, which has as a primary goal the use of information theory in the study of neural networks and learning algorithms. George Klir's book Uncertainty and Information considers many alternative measures of information/uncertainty, moving far beyond the classical log(1/p) measure of Shannon and the context in which it arose. Jan Kahre's iconoclastic book The Mathematical Theory of Information is an intriguing alternative in which the so-called Law of Diminishing Information is elevated to primary axiomatic status in deriving measures of information content. I alluded to some of the "murkier" issues of human communication earlier; readers who wish to study some of those issues will find Yehoshua Bar-Hillel's book Language and Information a useful source.

In conclusion, I highly recommend Cover and Thomas' book on Information Theory. It is currently unrivalled as a rigorous introduction to applications of Information Theory across the curriculum. As a person who used to work in the general area of signals analysis, I resist all comparisons of Cover and Thomas' book with the classic text of Gallager; the books have vastly different goals and very little overlap.
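To make the log(1/p) measure and the entropy it leads to a little more concrete, here is a minimal Python sketch of my own (not taken from the book); the four-symbol distribution is invented purely for illustration. The entropy it prints is the bits-per-symbol limit that the Asymptotic Equipartition Property ties to lossless data compression.

    import math

    def self_information(p):
        # Self-information h(p) = log2(1/p), measured in bits.
        return math.log2(1.0 / p)

    def entropy(dist):
        # Shannon entropy H(X) = sum over x of p(x) * log2(1/p(x)), in bits.
        return sum(p * self_information(p) for p in dist.values() if p > 0)

    # Hypothetical four-symbol source, chosen only for this example.
    source = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

    print(self_information(0.5))  # 1.0 bit: a fair coin flip carries one bit
    print(entropy(source))        # 1.75 bits/symbol: the compression limit the AEP points to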
A**H
Very solid introductory book on information theory
I give this book five stars for its outstanding clarity, thoroughness, and choice of topics. The writing is excellent, and most topics are easy to understand, although I have a few isolated quibbles about how certain topics are presented. I feel that the chapters on continuous channels are much tougher to understand and less intuitive than the chapters on discrete channels.

The exercises are very useful, but in my opinion a bit too easy. There are lots of exercises at the end of each chapter, but very few that require deep thinking or creative insight. Most of the exercises are fairly routine; a few more involved ones would be welcome.

The thing most lacking in this book is examples. The bulk of the text is made up of exposition of new ideas and proofs of theorems. While the exercises provide plenty of examples, I still feel that something is missing, especially in the chapters on continuous channels.

As a supplement, I would recommend "Information Theory, Inference, and Learning Algorithms" by MacKay. The two books are very different from each other and have less overlap than one might expect; I think everyone would do well to study both. MacKay's book is much more suitable for self-study, has more concrete examples, and is in my opinion more fun and interesting (which says a lot, because this book is itself quite fun and interesting). It also has some more involved exercises, covers coding theory in more depth than this book (something one might not realize from its name), and integrates a Bayesian perspective more deeply.
J**X
I'm glad to have bought it and read it thoroughly
This book has given me the groundwork for my physics PhD. It is a fertile ground for new ideas! I'm glad to have bought it and read it thoroughly. It is also a very well written edition, and useful in many different fields.
A**D
An Academic Reference Book Deserves Better
I recently purchased the hardcover of Elements of Information Theory, 2nd edition, from Amazon for over $90. While I have no complaints about the content of the book (it has an excellent reputation), I am thoroughly disappointed with its physical quality.

The book arrived with the front cover torn at the top of the spine, which makes me question why Amazon would ship a book in that condition; it raises real concerns about their quality control.

Even if the cover had been intact, the binding itself is poor. For a book priced at over $90 and intended to serve as an academic reference, the glued spine (and a badly done one at that) is a major letdown. For a publisher like Wiley, this level of production quality is simply shameful. Many readers, myself included, still prefer physical books, especially for serious academic works like this one. A book of this caliber deserves a binding that reflects its value and purpose and that will withstand years of use.
V**D
Exceeds Expectations
I recommend this product to anyone studying Information Theory. It is very clear and uses modern nomenclature. It also has many exercises to test your understanding of the material.
M**T
Very good book but...
Very good book with some minor issues. The authors do a great job of making most of the material accessible to a reader with an understanding of basic probability. In my humble opinion, the chapters on the Gaussian Channel (Ch. 9) and Network Information Theory (Ch. 15) need more exposition; the other chapters are very well explained. Occasionally, deep statements are made without much explanation or amplification, and it is up to the reader to work out the justification for them. Some of the problems are repeated, and most are easy; as another reviewer pointed out, the book might benefit from the addition of some more thought-provoking problems. Still, it is a great book for learning information theory.
J**E
Great Book
One of the most readable info theory books out there.