SIT = Shannon Information Theory
Let $H(X_1, X_2, \ldots, X_n)$ denote the classical entropy of a probability distribution $P(x_1, x_2, \ldots, x_n)$ of the random variables $X_1, X_2, \ldots, X_n$.
Consider a Hilbert space $\mathcal{H} = \mathcal{H}_{X_1} \otimes \mathcal{H}_{X_2} \otimes \cdots \otimes \mathcal{H}_{X_n}$. For any density matrix $\rho$ that acts on $\mathcal{H}$, let $S(\rho)$ be its von Neumann entropy.
Suppose $|\psi\rangle$ is a ket (pure quantum state) with density matrix given by $\rho = |\psi\rangle\langle\psi|$. Then it is well known (a consequence of the Schmidt decomposition or of the Araki-Lieb inequality) that for this pure state

$$S(\rho_A) = S(\rho_B),$$

where $A$ and $B$ are disjoint sets whose union equals $\{X_1, X_2, \ldots, X_n\}$, and where $\rho_A = \operatorname{tr}_B\, \rho$ and $\rho_B = \operatorname{tr}_A\, \rho$. For example, if $n = 3$, then we have

$$S(\rho_{X_1}) = S(\rho_{X_2 X_3}), \quad S(\rho_{X_2}) = S(\rho_{X_1 X_3}), \quad S(\rho_{X_3}) = S(\rho_{X_1 X_2}).$$
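This equality of the two reduced-state entropies is easy to check numerically. Below is a minimal sketch using NumPy; the `entropy` and `partial_trace` helpers and the random three-qubit state are my own illustration, not notation from the post:

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy S(rho) = -tr(rho log2 rho), in bits."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]               # discard numerically zero eigenvalues
    return float(-np.sum(ev * np.log2(ev)))

def partial_trace(rho, keep, dims):
    """Reduced density matrix on the subsystems listed in `keep`."""
    n = len(dims)
    rho = rho.reshape(dims + dims)
    # trace out the unwanted subsystems, highest index first so that
    # lower indices are not shifted by earlier traces
    for i in sorted(set(range(n)) - set(keep), reverse=True):
        k = rho.ndim // 2             # subsystems currently remaining
        rho = np.trace(rho, axis1=i, axis2=i + k)
    d = int(np.prod([dims[i] for i in keep]))
    return rho.reshape(d, d)

# random pure state of n = 3 qubits
rng = np.random.default_rng(0)
psi = rng.normal(size=8) + 1j * rng.normal(size=8)
psi /= np.linalg.norm(psi)
rho = np.outer(psi, psi.conj())

dims = (2, 2, 2)
S_A = entropy(partial_trace(rho, [0], dims))      # A = {X1}
S_B = entropy(partial_trace(rho, [1, 2], dims))   # B = {X2, X3}
print(S_A, S_B)                                   # the two values agree
```

For a mixed state $\rho$ the two values would generally differ; purity is what forces them to match.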
Pure quantum states thus have a very high degree of symmetry. So much so that we can make a model of their entropy as follows. I’ll call it the RUM (Roots of Unity Model) of pure states. In the RUM, we model

$$S(\rho_A) = \left| \sum_{j \in A} \omega_j \right|$$

(note that the entropy contributions are summed coherently rather than incoherently; the latter is common for macroscopic classical systems), where we now interpret the $\omega_j$ as the $n$th roots of unity. In other words, we are now setting

$$\omega_j = e^{i 2\pi j / n} \qquad \text{for } j = 1, 2, \ldots, n.$$
The model works because

$$\sum_{j=1}^{n} \omega_j = 0,$$

so that for any disjoint sets $A$ and $B$ whose union is the full set,

$$\left| \sum_{j \in A} \omega_j \right| = \left| -\sum_{j \in B} \omega_j \right| = \left| \sum_{j \in B} \omega_j \right|.$$
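Here is a tiny numerical illustration of that identity (my own sketch; the choice $n = 5$ and the subset $A$ are arbitrary):

```python
import cmath

n = 5
omega = [cmath.exp(2j * cmath.pi * j / n) for j in range(1, n + 1)]

# the n-th roots of unity sum to zero...
print(abs(sum(omega)))                        # ~ 0

# ...so coherent sums over complementary index sets have equal magnitude
A = [0, 2]                                    # arbitrary subset of indices
B = [j for j in range(n) if j not in A]       # its complement
S_A = abs(sum(omega[j] for j in A))
S_B = abs(sum(omega[j] for j in B))
print(S_A, S_B)                               # equal magnitudes
```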
Henceforth we will abbreviate

$$S_A = S(\rho_A).$$
I bet a lot of people have discovered RUM before, but I myself discovered it on my own yesterday. I find RUM very helpful because (besides being good for numbing pain and inducing forgetfulness 🙂) it helps me visualize better the entropy of a pure quantum state. For example, it “explains” to me why quantum conditional entropies can be negative. Indeed, suppose $A$ and $B$ are two disjoint subsets of $\{X_1, X_2, \ldots, X_n\}$. Then

$$S_{A|B} = S_{AB} - S_B = \left| \sum_{j \in A \cup B} \omega_j \right| - \left| \sum_{j \in B} \omega_j \right|,$$

which can clearly be negative, since adding vectors coherently can shrink the magnitude of the sum.
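For a concrete instance (my own example, not from the post): take $|\psi\rangle$ to be a Bell state on two qubits $A$ and $B$. Then $S_{AB} = 0$ (the joint state is pure) while $S_B = 1$ bit, so $S_{A|B} = -1$:

```python
import numpy as np

# Bell state |Φ+> = (|00> + |11>)/√2 on qubits A and B
psi = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho_AB = np.outer(psi, psi.conj())

# reduced state of B: trace out qubit A
rho_B = np.trace(rho_AB.reshape(2, 2, 2, 2), axis1=0, axis2=2)

def entropy(rho):
    """Von Neumann entropy in bits."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

cond = entropy(rho_AB) - entropy(rho_B)
print(cond)   # S(A|B) ≈ -1
```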
Also, the Araki-Lieb and subadditivity inequalities

$$|S_A - S_B| \leq S_{AB} \leq S_A + S_B$$

become the well known triangle inequalities

$$\Big|\, |\vec{a}| - |\vec{b}| \,\Big| \leq |\vec{a} + \vec{b}| \leq |\vec{a}| + |\vec{b}|,$$

where $\vec{a} = \sum_{j \in A} \omega_j$ and $\vec{b} = \sum_{j \in B} \omega_j$.
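As a sanity check on this correspondence, the triangle inequalities can be verified by brute force over every pair of disjoint index sets (my own sketch, with $n = 6$ chosen arbitrarily):

```python
import cmath
import itertools

n = 6
omega = [cmath.exp(2j * cmath.pi * j / n) for j in range(1, n + 1)]

ok = True
for r in range(1, n):
    for A in itertools.combinations(range(n), r):
        rest = [j for j in range(n) if j not in A]
        for s in range(1, len(rest) + 1):
            for B in itertools.combinations(rest, s):
                a = sum(omega[j] for j in A)   # vector for set A
                b = sum(omega[j] for j in B)   # vector for set B
                lower = abs(abs(a) - abs(b))   # plays the role of |S_A - S_B|
                mid = abs(a + b)               # plays the role of S_AB
                upper = abs(a) + abs(b)        # plays the role of S_A + S_B
                ok = ok and lower <= mid + 1e-12 and mid <= upper + 1e-12
print(ok)   # True
```

Of course this is just the ordinary triangle inequality for vectors in the plane, which is exactly the point of the model.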