Logical Information Theory: New Foundations for Information Theory

There is a new theory of information based on logic. The definition of Shannon entropy, as well as the notions of joint, conditional, and mutual entropy as defined by Shannon, can all be derived by a uniform transformation from the corresponding formulas of logical information theory.
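To make the claimed transformation concrete, here is a minimal sketch for a finite probability distribution p = (p_1, ..., p_n), the per-block probabilities of a partition; the term-by-term substitution shown is a reading of the transform, which the paper develops precisely later.

```latex
% Logical entropy of a finite distribution p = (p_1, ..., p_n):
h(p) = \sum_{i=1}^{n} p_i \,(1 - p_i)
% Shannon entropy of the same distribution:
H(p) = \sum_{i=1}^{n} p_i \log_2\!\left(\tfrac{1}{p_i}\right)
% The uniform transformation replaces, term by term,
%   (1 - p_i) \;\mapsto\; \log_2(1/p_i),
% carrying each logical-entropy formula to its Shannon counterpart.
```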

Information as distinctions

This paper is subtitled “New Foundations for Information Theory” since it is based on the logical notion of entropy from the logic of partitions. The basic logical idea is that of “distinctions.” Logical entropy is the normalized counting measure of the set of distinctions of a partition, and Shannon entropy is the average number of binary partitions needed to make the same distinctions as the partition.
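As a concrete illustration (a minimal sketch, not taken from the paper; the function names and the small example partition are hypothetical), the following Python computes logical entropy both ways: as the normalized count of distinctions, and via the equivalent closed form 1 - sum of p_B^2 over the blocks B.

```python
from itertools import product

def logical_entropy(partition):
    """Logical entropy of a partition of a finite set U:
    the normalized count of 'distinctions', i.e. ordered pairs
    of elements lying in distinct blocks: |dit(pi)| / |U x U|."""
    universe = [u for block in partition for u in block]
    n = len(universe)
    # Map each element to the index of its block.
    block_of = {u: i for i, block in enumerate(partition) for u in block}
    # Count ordered pairs (u, v) whose elements lie in different blocks.
    dits = sum(1 for u, v in product(universe, repeat=2)
               if block_of[u] != block_of[v])
    return dits / (n * n)

def logical_entropy_from_probs(partition):
    """Equivalent closed form: h(pi) = 1 - sum of p_B^2,
    with p_B = |B|/|U| the Laplacian block probabilities."""
    n = sum(len(block) for block in partition)
    return 1 - sum((len(block) / n) ** 2 for block in partition)

# Example: the partition {{a,b},{c},{d}} of a 4-element universe.
pi = [["a", "b"], ["c"], ["d"]]
assert abs(logical_entropy(pi) - logical_entropy_from_probs(pi)) < 1e-12
print(logical_entropy(pi))  # 0.625 = 1 - ((1/2)^2 + (1/4)^2 + (1/4)^2)
```

The two computations agree because the non-distinctions are exactly the same-block pairs, of which there are sum_B |B|^2, so |dit(pi)| = |U|^2 - sum_B |B|^2.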

From Partition Logic to Information Theory

A new logic of partitions has been developed that is dual to ordinary logic when the latter is interpreted as the logic of subsets rather than the logic of propositions. For a finite universe, the logic of subsets gave rise to finite probability theory by assigning to each subset its relative cardinality as a Laplacian probability. The analogous development for the dual logic of partitions gives rise to a notion of logical entropy that is related in a precise manner to Claude Shannon’s entropy.
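In symbols (a sketch restating the analogy for a finite universe U, where dit(π) denotes the set of distinctions of π, i.e. the ordered pairs of elements lying in distinct blocks):

```latex
% Subset logic  -->  finite probability theory:
\Pr(S) = \frac{|S|}{|U|} \quad \text{for a subset } S \subseteq U
% Partition logic  -->  logical information theory:
h(\pi) = \frac{|\mathrm{dit}(\pi)|}{|U \times U|}
       = 1 - \sum_{B \in \pi} \left(\frac{|B|}{|U|}\right)^{2}
```

Read probabilistically, h(π) is the chance that two independent equiprobable draws from U are distinguished by π, just as Pr(S) is the chance that a single draw lands in S.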