Mathematical Theory of Entropy
Author: Nathaniel F. G. Martin
Publisher: Cambridge University Press
ISBN: 9780521177382
Category : Computers
Languages : en
Pages : 292
Book Description
This excellent 1981 treatment of the mathematical theory of entropy gives an accessible exposition of its application to other fields.
The Mathematical Theory of Communication
Author: Claude E Shannon
Publisher: University of Illinois Press
ISBN: 025209803X
Category : Language Arts & Disciplines
Languages : en
Pages : 141
Book Description
Scientific knowledge grows at a phenomenal pace, but few books have had as lasting an impact or played as important a role in our modern world as The Mathematical Theory of Communication, published originally as a paper on communication theory more than fifty years ago. Republished in book form shortly thereafter, it has since gone through four hardcover and sixteen paperback printings. It is a revolutionary work, astounding in its foresight and contemporaneity. The University of Illinois Press is pleased and honored to issue this commemorative reprinting of a classic.
Entropy and Diversity
Author: Tom Leinster
Publisher: Cambridge University Press
ISBN: 1108832709
Category : Language Arts & Disciplines
Languages : en
Pages : 457
Book Description
Discover the mathematical riches of 'what is diversity?' in a book that adds mathematical rigour to a vital ecological debate.
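To make the book's central question concrete: the diversity measures this literature builds on are the Hill numbers, D_q(p) = (sum_i p_i^q)^(1/(1-q)), whose q -> 1 limit is the exponential of Shannon entropy. A minimal Python sketch, with a made-up community distribution for illustration:

import numpy as np

def hill_number(p, q):
    # Diversity of order q: D_q = (sum_i p_i^q)^(1/(1-q));
    # as q -> 1 this tends to exp(Shannon entropy).
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                    # ignore absent species
    if np.isclose(q, 1.0):
        return np.exp(-np.sum(p * np.log(p)))
    return np.sum(p ** q) ** (1.0 / (1.0 - q))

# Made-up community: one dominant species, three rare ones.
community = [0.7, 0.1, 0.1, 0.1]
for q in (0, 1, 2):
    print(f"q={q}: diversity {hill_number(community, q):.3f}")

At q = 0 the Hill number simply counts species (here 4.0); larger q weights common species more heavily, so diversity falls toward the reciprocal of the dominant species' share.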
Mathematical Theory of Nonequilibrium Steady States
Author: Da-Quan Jiang
Publisher: Springer Science & Business Media
ISBN: 9783540206118
Category : Markov processes
Languages : en
Pages : 296
Book Description
Entropy and Information Theory
Author: Robert M. Gray
Publisher: Springer Science & Business Media
ISBN: 1475739826
Category : Computers
Languages : en
Pages : 346
Book Description
This book is devoted to the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The eventual goal is a general development of Shannon's mathematical theory of communication, but much of the space is devoted to the tools and methods required to prove the Shannon coding theorems. These tools form an area common to ergodic theory and information theory and comprise several quantitative notions of the information in random variables, random processes, and dynamical systems. Examples are entropy, mutual information, conditional entropy, conditional information, and discrimination or relative entropy, along with the limiting normalized versions of these quantities such as entropy rate and information rate. Much of the book is concerned with their properties, especially the long-term asymptotic behavior of sample information and expected information. This is the only up-to-date treatment of traditional information theory emphasizing ergodic theory.
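For readers who want the basic objects in hand, here is a minimal Python sketch (not from the book) of the finite-alphabet versions of the quantities listed above, computed from a small made-up joint distribution; it also checks the identity I(X;Y) = D(p(x,y) || p(x)p(y)).

import numpy as np

def H(p):
    # Shannon entropy in bits of a probability array.
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Made-up joint distribution p(x, y) on a 2 x 2 alphabet.
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
px, py = pxy.sum(axis=1), pxy.sum(axis=0)

H_XY = H(pxy)
cond = H_XY - H(px)                                 # conditional entropy H(Y|X), via the chain rule
mi = H(px) + H(py) - H_XY                           # mutual information I(X;Y)
kl = np.sum(pxy * np.log2(pxy / np.outer(px, py)))  # relative entropy (discrimination)

print(f"H(X)={H(px):.3f}  H(Y|X)={cond:.3f}  I(X;Y)={mi:.3f}  D={kl:.3f}")

For stationary processes the entropy rate is the limit of H(X_1,...,X_n)/n; per-letter quantities like these are the building blocks of the coding theorems.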
The Mathematical Theory of Information
Author: Jan Kåhre
Publisher: Springer Science & Business Media
ISBN: 9781402070648
Category : Technology & Engineering
Languages : en
Pages : 528
Book Description
The general concept of information is here, for the first time, defined mathematically by adding a single axiom to probability theory. This Mathematical Theory of Information is explored in fourteen chapters:
1. Information can be measured in different units, in anything from bits to dollars. We argue that any measure is acceptable if it does not violate the Law of Diminishing Information. This law is supported by two independent arguments: one derived from the Bar-Hillel ideal receiver, the other based on Shannon's noisy channel. The entropy of 'classical information theory' is one of the measures conforming to the Law of Diminishing Information, but it has properties, such as being symmetric, that make it unsuitable for some applications. The measure reliability is found to be a universal information measure.
2. For discrete and finite signals, the Law of Diminishing Information is defined mathematically, using probability theory and matrix algebra.
3. The Law of Diminishing Information is used as an axiom to derive essential properties of information. Byron's law: there is more information in a lie than in gibberish. Preservation: no information is lost in a reversible channel. And so on. The Mathematical Theory of Information supports colligation, i.e. the property of binding facts together, making 'two plus two greater than four'. Colligation is a must when the information carries knowledge or is a basis for decisions; in such cases, reliability is always a useful information measure. Entropy does not allow colligation.
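The core claim can be illustrated with the closely related data-processing inequality of classical information theory: along a Markov chain X -> Y -> Z of channels, I(X;Z) <= I(X;Y), so further processing cannot increase information about the source. A minimal Python sketch with made-up channel matrices (an illustration of the spirit of the law, not Kåhre's formalism):

import numpy as np

def mutual_info(pxy):
    # Mutual information in bits from a joint distribution matrix.
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return np.sum(pxy[mask] * np.log2((pxy / (px @ py))[mask]))

px = np.array([0.5, 0.5])           # source distribution (made up)
A = np.array([[0.9, 0.1],           # channel X -> Y: row i is p(y | x=i)
              [0.2, 0.8]])
B = np.array([[0.7, 0.3],           # channel Y -> Z: row j is p(z | y=j)
              [0.3, 0.7]])

pxy = px[:, None] * A               # joint p(x, y)
pxz = px[:, None] * (A @ B)         # joint p(x, z) after processing Y through B

print(f"I(X;Y) = {mutual_info(pxy):.4f} bits")
print(f"I(X;Z) = {mutual_info(pxz):.4f} bits")   # never larger: information diminishes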
Entropy in Dynamical Systems
Author: Tomasz Downarowicz
Publisher: Cambridge University Press
ISBN: 1139500872
Category : Mathematics
Languages : en
Pages : 405
Book Description
This comprehensive text on entropy covers three major types of dynamics: measure preserving transformations; continuous maps on compact spaces; and operators on function spaces. Part I contains proofs of the Shannon–McMillan–Breiman Theorem, the Ornstein–Weiss Return Time Theorem, the Krieger Generator Theorem and, among the newest developments, the ergodic law of series. In Part II, after an expanded exposition of classical topological entropy, the book addresses symbolic extension entropy. It offers deep insight into the theory of entropy structure and explains the role of zero-dimensional dynamics as a bridge between measurable and topological dynamics. Part III explains how both measure-theoretic and topological entropy can be extended to operators on relevant function spaces. Intuitive explanations, examples, exercises and open problems make this an ideal text for a graduate course on entropy theory. More experienced researchers can also find inspiration for further research.
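One concrete anchor for Part I: for a Markov shift, the Kolmogorov-Sinai (measure-theoretic) entropy has the closed form h = sum_i pi_i H(P_i), where pi is the stationary distribution and P_i is the i-th row of the transition matrix, and the variational principle bounds it by the topological entropy of the ambient full shift. A minimal Python sketch under these standard facts (not code from the book):

import numpy as np

P = np.array([[0.9, 0.1],           # transition matrix: rows are p(next | current)
              [0.4, 0.6]])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmin(np.abs(evals - 1))])
pi /= pi.sum()

def H_bits(row):
    row = row[row > 0]
    return -np.sum(row * np.log2(row))

h = sum(pi[i] * H_bits(P[i]) for i in range(len(pi)))
print(f"Kolmogorov-Sinai entropy: {h:.4f} bits/symbol")
# Topological entropy of the full 2-shift is log2(2) = 1, an upper bound here.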
Mathematical Foundations of Information Theory
Author: Aleksandr Yakovlevich Khinchin
Publisher: Courier Corporation
ISBN: 0486604349
Category : Mathematics
Languages : en
Pages : 130
Book Description
First comprehensive introduction to information theory explores the work of Shannon, McMillan, Feinstein, and Khinchin. Topics include the entropy concept in probability theory, fundamental theorems, and other subjects. 1957 edition.
Entropy Optimization and Mathematical Programming
Author: Shu-Cherng Fang
Publisher: Springer Science & Business Media
ISBN: 1461561310
Category : Business & Economics
Languages : en
Pages : 350
Book Description
Entropy optimization is a useful combination of classical engineering theory (entropy) with mathematical optimization. The resulting entropy optimization models have proved their usefulness with successful applications in areas such as image reconstruction, pattern recognition, statistical inference, queuing theory, spectral analysis, statistical mechanics, transportation planning, urban and regional planning, input-output analysis, portfolio investment, information analysis, and linear and nonlinear programming. While entropy optimization has been used in different fields, a good number of applicable solution methods have been loosely constructed without sufficient mathematical treatment. A systematic presentation with proper mathematical treatment of this material is needed by practitioners and researchers alike in all application areas. The purpose of this book is to meet this need. Entropy Optimization and Mathematical Programming offers perspectives that meet the needs of diverse user communities so that the users can apply entropy optimization techniques with complete comfort and ease. With this consideration, the authors focus on the entropy optimization problems in finite dimensional Euclidean space such that only some basic familiarity with optimization is required of the reader.
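A flavor of the simplest model in this family: maximize Shannon entropy over distributions on a finite set subject to a prescribed mean (Jaynes's loaded-die problem). The optimum has the Gibbs form p_i proportional to exp(lambda * i), and the single multiplier can be found by bisection, since the resulting mean is monotone in lambda. A minimal Python sketch with a made-up target mean (an illustration, not the authors' algorithms):

import numpy as np

values = np.arange(1, 7)            # die faces 1..6
target_mean = 4.5                   # prescribed mean (made up)

def gibbs(lam):
    # Max-entropy distribution under a mean constraint: p_i proportional to exp(lam * i).
    w = np.exp(lam * values)
    return w / w.sum()

lo, hi = -10.0, 10.0                # bisection on the Lagrange multiplier
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if gibbs(mid) @ values < target_mean:
        lo = mid
    else:
        hi = mid

p = gibbs(0.5 * (lo + hi))
print("max-entropy p:", np.round(p, 4), " mean:", round(float(p @ values), 4))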
New Foundations for Information Theory
Author: David Ellerman
Publisher: Springer Nature
ISBN: 3030865525
Category : Philosophy
Languages : en
Pages : 121
Book Description
This monograph offers a new foundation for information theory based on the notion of information-as-distinctions, measured directly by logical entropy, and on its re-quantification as Shannon entropy, the fundamental concept for the theory of coding and communications. Information is based on distinctions, differences, distinguishability, and diversity. Information sets are defined that express the distinctions made by a partition, e.g., the inverse-image partition of a random variable, so they represent the pre-probability notion of information. Logical entropy is then a probability measure on the information sets: the probability that, on two independent trials, a distinction or "dit" of the partition will be obtained. The formula for logical entropy is a new derivation of an old formula that goes back to the early twentieth century and has been re-derived many times in different contexts. As a probability measure, all the compound notions of joint, conditional, and mutual logical entropy are immediate.
The Shannon entropy (which is not defined as a measure in the sense of measure theory) and its compound notions are then derived from a non-linear dit-to-bit transform that re-quantifies the distinctions of a random variable in terms of bits, so the Shannon entropy is the average number of binary distinctions, or bits, necessary to make all the distinctions of the random variable. Using a linearization method, all the set concepts in this logical information theory extend naturally to vector spaces in general, and to Hilbert spaces in particular, giving a quantum logical information theory that provides the natural measure of the distinctions made in quantum measurement.
Relatively short but dense in content, this work can serve as a reference for researchers and graduate students investigating information theory, maximum entropy methods in physics, engineering, and statistics, and for all those with a special interest in a new approach to quantum information theory.
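The two central quantities are easy to compute side by side. A minimal Python sketch consistent with the definitions above (an illustration, not the author's code): logical entropy h(p) = 1 - sum_i p_i^2 is the probability that two independent draws yield a distinction, while the dit-to-bit transform, replacing each block's 1 - p_i by log2(1/p_i), yields Shannon entropy.

import numpy as np

def logical_entropy(p):
    # h(p) = 1 - sum_i p_i^2: probability that two independent draws
    # fall in different blocks, i.e. produce a distinction ("dit").
    p = np.asarray(p, dtype=float)
    return 1.0 - np.sum(p ** 2)

def shannon_entropy(p):
    # H(p) = sum_i p_i * log2(1/p_i): average number of binary
    # distinctions (bits) needed to make all the distinctions.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p = [0.5, 0.25, 0.25]               # made-up distribution
print(f"logical entropy h(p) = {logical_entropy(p):.4f}")   # 0.6250
print(f"Shannon entropy H(p) = {shannon_entropy(p):.4f}")   # 1.5000 bits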