Abstract

Can we decompose the information of a composite system into terms arising from its parts and their interactions?
For a bipartite system (X,Y), the joint entropy can be written as an algebraic sum of three terms, H(X,Y) = H(X) + H(Y) - I(X;Y): the entropy of X alone, the entropy of Y alone, and the mutual information of X and Y, which enters with a negative sign. This suggests a set-theoretical analogy: mutual information is a sort of "intersection", and joint entropy is a sort of "union".
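
To make the decomposition concrete, here is a minimal Python sketch, assuming a small, arbitrarily chosen joint distribution (the numbers and helper names are illustrative and not taken from the talk):

    # Minimal numerical sketch (not from the talk): check the bipartite identity
    #   H(X,Y) = H(X) + H(Y) - I(X;Y)
    # on a small, arbitrarily chosen joint distribution p(x, y).
    import numpy as np

    def entropy(p):
        """Shannon entropy (in bits) of a probability array, ignoring zero entries."""
        p = np.asarray(p, dtype=float).ravel()
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    # Hypothetical joint distribution over two binary variables X and Y.
    p_xy = np.array([[0.4, 0.1],
                     [0.1, 0.4]])
    p_x = p_xy.sum(axis=1)   # marginal of X
    p_y = p_xy.sum(axis=0)   # marginal of Y

    H_xy = entropy(p_xy)     # joint entropy H(X,Y)
    H_x = entropy(p_x)       # H(X)
    H_y = entropy(p_y)       # H(Y)

    # Mutual information from its definition as a relative entropy
    # (all probabilities are positive here, so log2 is safe).
    I_xy = float(np.sum(p_xy * np.log2(p_xy / np.outer(p_x, p_y))))

    # Inclusion-exclusion holds: the "union" equals the parts minus the "intersection".
    assert np.isclose(H_xy, H_x + H_y - I_xy)
    print(f"H(X)={H_x:.3f}  H(Y)={H_y:.3f}  I(X;Y)={I_xy:.3f}  H(X,Y)={H_xy:.3f}")
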
The same picture cannot be generalized to three or more parts in a straightforward way, and the problem is still considered open. Is there a deep reason why the set-theoretical analogy fails?
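
One standard way to see the obstruction (an illustration added here, not specific to this talk) is the XOR example: for two independent uniform bits X, Y and Z = X XOR Y, the inclusion-exclusion candidate for a triple "intersection" comes out negative, which a nonnegative measure on sets cannot reproduce. A minimal sketch:

    # Sketch of the standard XOR counterexample (illustrative, not from the abstract):
    # the natural "triple intersection" given by inclusion-exclusion over entropies,
    #   I(X;Y;Z) = H(X)+H(Y)+H(Z) - H(X,Y)-H(X,Z)-H(Y,Z) + H(X,Y,Z),
    # is negative here, so it cannot be the measure of any genuine set.
    import numpy as np
    from itertools import product

    def entropy(p):
        p = np.asarray(p, dtype=float).ravel()
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    # Joint distribution p(x, y, z) with x, y uniform and independent, z = x XOR y.
    p = np.zeros((2, 2, 2))
    for x, y in product((0, 1), repeat=2):
        p[x, y, x ^ y] = 0.25

    def H(keep):
        """Entropy of the marginal on the given variables (0 = X, 1 = Y, 2 = Z)."""
        drop = tuple(i for i in range(3) if i not in keep)
        return entropy(p.sum(axis=drop))

    I_xyz = (H((0,)) + H((1,)) + H((2,))
             - H((0, 1)) - H((0, 2)) - H((1, 2))
             + H((0, 1, 2)))

    print(I_xyz)  # -1.0 bit: the would-be triple "intersection" has negative "size"
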
Category theory can give an alternative, conceptual point of view on the problem. As Shannon already noted, information appears to be related to symmetry. This suggests a natural lattice structure for information, which is compatible with a set-theoretical picture only for bipartite systems.
The categorical approach favors structured objects, rather than mere numbers, to describe information quantities. We hope that this can clarify the mathematical structure underlying information theory and leave it open to wider generalizations.

Details

Talk Number: PIRSA:16110030
Speaker: Paolo Perrone
Collection: Quantum Foundations