These are the slides for a number of talks on logical information theory as providing new foundations for information theory.

## Logical Information Theory: New Foundations for Information Theory

There is a new theory of information based on logic. The definition of Shannon entropy, as well as the notions of joint, conditional, and mutual entropy as defined by Shannon, can all be derived by a uniform transformation from the corresponding formulas of logical information theory.
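As a minimal numerical sketch (not taken from the slides), the two base formulas can be compared directly: logical entropy is h(p) = Σᵢ pᵢ(1 − pᵢ) and Shannon entropy is H(p) = Σᵢ pᵢ log₂(1/pᵢ), so the uniform transformation amounts to replacing each term (1 − pᵢ) with log₂(1/pᵢ). The function names below are illustrative choices, not from the source.

```python
import math

def logical_entropy(p):
    """Logical entropy h(p) = sum_i p_i * (1 - p_i) = 1 - sum_i p_i**2."""
    return sum(pi * (1 - pi) for pi in p)

def shannon_entropy(p):
    """Shannon entropy H(p) = sum_i p_i * log2(1 / p_i)."""
    return sum(pi * math.log2(1 / pi) for pi in p if pi > 0)

# The same distribution under both measures; each (1 - p_i) term in
# logical_entropy corresponds to a log2(1/p_i) term in shannon_entropy.
p = [0.5, 0.25, 0.25]
print(logical_entropy(p))   # 0.625
print(shannon_entropy(p))   # 1.5
```

The transformation is term-by-term: both sums average a "measure of surprise" per outcome, differing only in which measure is averaged.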

## Logical Entropy: Introduction to Classical and Quantum Logical Information Theory

Logical information theory is the quantitative version of the logic of partitions, just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences, and distinguishability, and is formalized as the distinctions of a partition (the pairs of points distinguished by the partition, i.e., lying in different blocks). This paper is an introduction to the quantum version of logical information theory.
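For a rough sense of the quantum version, the quantum logical entropy of a density matrix ρ is commonly taken as h(ρ) = 1 − tr(ρ²) (the quantity also known as linear entropy); the sketch below assumes that formula and is not code from the paper.

```python
import numpy as np

def quantum_logical_entropy(rho):
    """Quantum logical entropy h(rho) = 1 - tr(rho^2).

    Zero for a pure state; maximal for the maximally mixed state.
    """
    return 1.0 - np.trace(rho @ rho).real

mixed = np.eye(2) / 2                            # maximally mixed qubit
pure = np.array([[1.0, 0.0], [0.0, 0.0]])        # pure state |0><0|
print(quantum_logical_entropy(mixed))            # 0.5
print(quantum_logical_entropy(pure))             # 0.0
```

A pure state makes no "distinctions" (h = 0), while mixing creates them, mirroring the classical case where the discrete partition maximizes the distinction count.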

## The Existence-Information Duality

The development of the logic of partitions (dual to the usual Boolean logic of subsets) and of logical information theory brings out a fundamental duality between existence (e.g., elements of a subset) and information (e.g., distinctions of a partition). This leads, in a more metaphysical vein, to two different conceptions of reality, one of which provides the realistic interpretation of quantum mechanics.

## Introduction to Partition Logic

This is an introductory treatment of partition logic which also shows the extension to logical information theory and the possible killer application to quantum mechanics.

## Introduction to Logical Entropy

This paper, a reprint from the International Journal of Semantic Computing, introduces the logical notion of entropy based on the newly developed logic of partitions that is mathematically dual to the usual Boolean logic of subsets (aka “propositional logic”), and compares it to the usual Shannon entropy.

## Information as distinctions

This paper is sub-titled “New Foundations for Information Theory” since it is based on the logical notion of entropy from the logic of partitions. The basic logical idea is that of “distinctions.” Logical entropy is the normalized counting measure of the set of distinctions of a partition, and Shannon entropy is the number of binary partitions needed, on average, to make the same distinctions as the partition.
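The "normalized counting measure" reading can be sketched concretely: count the ordered pairs of elements that a partition puts in different blocks, then divide by the total number of ordered pairs. The helper name `dit_count` and the example partition below are illustrative, not from the paper.

```python
from itertools import product

def dit_count(partition, universe):
    """Count distinctions: ordered pairs (u, v) lying in different blocks."""
    block_of = {u: i for i, block in enumerate(partition) for u in block}
    return sum(1 for u, v in product(universe, repeat=2)
               if block_of[u] != block_of[v])

U = ['a', 'b', 'c', 'd', 'e']
pi = [['a', 'b', 'c'], ['d'], ['e']]        # a partition of U into 3 blocks

# Normalized counting measure of the distinction set: |dit(pi)| / |U|^2
h = dit_count(pi, U) / len(U) ** 2
print(dit_count(pi, U))   # 14
print(h)                  # 0.56
```

This agrees with the block-probability formula: with p = (3/5, 1/5, 1/5), 1 − Σᵢ pᵢ² = 1 − (9 + 1 + 1)/25 = 14/25 = 0.56.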

## Counting Distinctions

This paper gives the logical theory of information that is developed out of partition logic in exactly the same way that Boole developed logical probability theory out of his subset logic.

## Seminar in Quantum Information Theory II

These are the slides from a seminar in quantum information theory and related topics in the Computer Science Department of UC/Riverside during the Spring quarter 2012.

## Seminar in Quantum Information Theory I

These are the slides from a seminar in quantum information theory taught in the Computer Science Department of UC/Riverside in the Winter quarter 2012.