There is a new theory of information based on logic. The definition of Shannon entropy, as well as the notions of joint, conditional, and mutual entropy as defined by Shannon, can all be derived by a uniform transformation from the corresponding formulas of logical information theory.

## The Existence-Information Duality

The development of the logic of partitions (dual to the usual Boolean logic of subsets) and of logical information theory brings out a fundamental duality between existence (e.g., elements of a subset) and information (e.g., distinctions of a partition). This leads, in a more metaphysical vein, to two different conceptions of reality, one of which provides the realistic interpretation of quantum mechanics.

## Introduction to Partition Logic

## Introduction to Logical Entropy

This paper, a reprint from the International Journal of Semantic Computing, introduces the logical notion of entropy based on the newly developed logic of partitions that is mathematically dual to the usual Boolean logic of subsets (aka “propositional logic”), and compares it to the usual Shannon entropy.

## Information as distinctions

This paper is sub-titled “New Foundations for Information Theory” since it is based on the logical notion of entropy from the logic of partitions. The basic logical idea is that of “distinctions.” Logical entropy is the normalized counting measure of the set of distinctions of a partition, and Shannon entropy is the number of binary partitions needed, on average, to make the same distinctions as the partition.
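The contrast between the two notions can be made concrete with the standard closed-form expressions in terms of block probabilities: logical entropy is 1 − Σ pᵢ², while Shannon entropy is −Σ pᵢ log₂ pᵢ. The sketch below (function names are illustrative, not from the paper) computes both for the same probability distribution:

```python
import math

def logical_entropy(probs):
    # h(p) = 1 - sum(p_i^2): probability that two independent draws
    # land in distinct blocks, i.e. form a distinction
    return 1 - sum(p * p for p in probs)

def shannon_entropy(probs):
    # H(p) = -sum(p_i * log2(p_i)): average number of binary
    # partitions (bits) needed to make the same distinctions
    return -sum(p * math.log2(p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.25]
print(logical_entropy(probs))   # 0.625
print(shannon_entropy(probs))   # 1.5
```

For the distribution (1/2, 1/4, 1/4), the logical entropy is 0.625 (the chance a random ordered pair of draws is a distinction) while the Shannon entropy is 1.5 bits.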

## Counting Distinctions

## Seminar in Quantum Information Theory II

## Seminar in Quantum Information Theory I

## The Objective Indefiniteness Interpretation of Quantum Mechanics

## History of the Logical Entropy Formula

The logical entropy formula: Given a partition π on a finite universe set U, the set of distinctions or dits, dit(π), is the set of ordered pairs of elements in distinct blocks of the partition. The logical entropy of the partition is the normalized cardinality of the dit set: h(π) = |dit(π)| / |U × U|. The logical entropy can be interpreted probabilistically […]
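The dit-set definition can be computed directly by enumeration. A minimal sketch (the partition and universe here are illustrative examples, not from the text): for each ordered pair of elements, check whether the two elements lie in distinct blocks, then normalize the count by |U × U|.

```python
from itertools import product

def logical_entropy(partition, universe):
    # Map each element to the index of its block in the partition
    block_of = {u: i for i, block in enumerate(partition) for u in block}
    # dit(pi): ordered pairs (u, v) whose elements lie in distinct blocks
    dits = {(u, v) for u, v in product(universe, universe)
            if block_of[u] != block_of[v]}
    # h(pi) = |dit(pi)| / |U x U|
    return len(dits) / (len(universe) ** 2)

U = [1, 2, 3, 4]
pi = [{1, 2}, {3}, {4}]
print(logical_entropy(pi, U))  # 10/16 = 0.625
```

With blocks of sizes 2, 1, and 1 on a four-element universe, there are 4² − (2² + 1² + 1²) = 10 distinctions, so h(π) = 10/16 = 0.625.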