Max Planck Institute for Mathematics in the Sciences
Talks by Paolo Perrone
Can we decompose the information of a composite system into terms arising from its parts and their interactions?
For a bipartite system (X,Y), the joint entropy can be written as an algebraic sum of three terms: the entropy of X alone, the entropy of Y alone, and the mutual information of X and Y, which enters with an opposite sign: H(X,Y) = H(X) + H(Y) - I(X;Y). This suggests a set-theoretical analogy: mutual information is a sort of "intersection", and joint entropy is a sort of "union".
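The identity above can be checked numerically. Below is a minimal sketch, assuming a toy joint distribution over two binary variables (the distribution itself is an illustrative choice, not from the talk): it computes the two marginal entropies, the joint entropy, and the mutual information from its definition, then verifies that H(X,Y) = H(X) + H(Y) - I(X;Y).

```python
import math

# Toy joint distribution p(x, y) over two binary variables (assumed example).
p = {
    (0, 0): 0.4,
    (0, 1): 0.1,
    (1, 0): 0.1,
    (1, 1): 0.4,
}

def entropy(dist):
    """Shannon entropy (in bits) of a dictionary of probabilities."""
    return -sum(q * math.log2(q) for q in dist.values() if q > 0)

# Marginal distributions of X and Y.
px, py = {}, {}
for (x, y), q in p.items():
    px[x] = px.get(x, 0.0) + q
    py[y] = py.get(y, 0.0) + q

H_x = entropy(px)    # H(X)
H_y = entropy(py)    # H(Y)
H_xy = entropy(p)    # joint entropy H(X,Y)

# Mutual information from its definition as a relative entropy:
# I(X;Y) = sum_{x,y} p(x,y) log2( p(x,y) / (p(x) p(y)) )
I_xy = sum(
    q * math.log2(q / (px[x] * py[y]))
    for (x, y), q in p.items()
    if q > 0
)

# Inclusion-exclusion identity: H(X,Y) = H(X) + H(Y) - I(X;Y).
assert abs(H_xy - (H_x + H_y - I_xy)) < 1e-12
```

The assertion at the end is exactly the "union = sum of parts minus intersection" reading described in the abstract: the mutual information is counted once in each marginal entropy, so it must be subtracted once to recover the joint entropy.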