This is a set of slides from a talk on introducing the Hamming distance into classical logical information theory and then developing the quantum logical notion of Hamming distance, which turns out to equal a standard distance measure in quantum information theory, the Hilbert-Schmidt distance.
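As a brief mathematical sketch of the two quantities being connected, here are the standard formulas for quantum logical entropy and the (squared) Hilbert-Schmidt distance; the identification of the latter with the quantum logical Hamming distance is the talk's result and is not derived here:

```latex
% Quantum logical entropy of a density matrix \rho:
h(\rho) = 1 - \operatorname{tr}\!\left[\rho^{2}\right]

% Squared Hilbert--Schmidt distance between density matrices \rho and \sigma:
d_{HS}(\rho,\sigma) = \operatorname{tr}\!\left[(\rho-\sigma)^{2}\right]
```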

## Talk: New Foundations for Quantum Information Theory

## Talk: New Foundations for Information Theory

## Logical Information Theory: New Foundations for Information Theory

## New Logical Foundations for Quantum Information Theory

Logical information theory is the quantitative version of the logic of partitions, just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences, and distinguishability, and is formalized as the set of distinctions of a partition (a distinction being a pair of points distinguished by the partition). This paper is an introduction to the quantum version of logical information theory.

## The Existence-Information Duality

The development of the logic of partitions (dual to the usual Boolean logic of subsets) and of logical information theory brings out a fundamental duality between existence (e.g., the elements of a subset) and information (e.g., the distinctions of a partition). This leads, in a more metaphysical vein, to two different conceptions of reality, one of which provides the realistic interpretation of quantum mechanics.

## Introduction to Partition Logic

## Introduction to Logical Entropy

This paper, a reprint from the International Journal of Semantic Computing, introduces the logical notion of entropy based on the newly developed logic of partitions that is mathematically dual to the usual Boolean logic of subsets (aka “propositional logic”), and compares it to the usual Shannon entropy.

## Information as distinctions

This paper is subtitled “New Foundations for Information Theory” since it is based on the logical notion of entropy from the logic of partitions. The basic logical idea is that of a “distinction.” Logical entropy is the normalized counting measure of the set of distinctions of a partition, while Shannon entropy is the average number of binary partitions needed to make the same distinctions.
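The two entropy notions above can be sketched numerically. The universe set and partition below are illustrative assumptions, not taken from the paper; the sketch checks that the normalized count of distinctions equals the familiar formula 1 − Σ pᵢ², and computes the Shannon entropy of the same block probabilities for comparison:

```python
from math import log2

# Illustrative universe set U = {0,...,5} and a partition of it into blocks.
U = list(range(6))
partition = [{0, 1, 2}, {3, 4}, {5}]

def dits(blocks):
    """Set of distinctions: ordered pairs (u, v) lying in different blocks."""
    block_of = {u: i for i, b in enumerate(blocks) for u in b}
    return {(u, v) for u in block_of for v in block_of
            if block_of[u] != block_of[v]}

n = len(U)

# Logical entropy: normalized counting measure of the distinction set.
h_logical = len(dits(partition)) / n**2

# It equals 1 minus the sum of squared block probabilities.
assert abs(h_logical - (1 - sum((len(b) / n) ** 2 for b in partition))) < 1e-12

# Shannon entropy of the same block probabilities, for comparison.
H_shannon = -sum((len(b) / n) * log2(len(b) / n) for b in partition)

print(h_logical, H_shannon)
```

For the partition {{0,1,2},{3,4},{5}} there are 22 distinctions out of 36 ordered pairs, so the logical entropy is 22/36 ≈ 0.611, while the Shannon entropy of the block probabilities (1/2, 1/3, 1/6) is about 1.459 bits.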