fidelity and relative entropy (KL divergence) inequalities for post-measurement distributions
#question #math #physics
fidelity relative-entropy quantum information
- Based on the following observations:
- From 8.372 (Quantum Information 3), homework 2: the fidelity between the pre- and post-measurement states is lower bounded by the square root of the probability of the measurement outcome, with equality attained when the measurement operator $M$ is a projection matrix (see the fidelity sketch after this list).
- KL divergence of the Bayesian posterior: if you take the KL divergence of the posterior distribution of $X$ after observing $B$ against the prior, and the observation $B$ is a deterministic function of $X$, then the KL divergence is exactly $-\log p(b)$, the negative log probability of the outcome of $B$ (i.e. the surprise), and the expected KL divergence is exactly the entropy of the observation $B$ (see the KL sketch after this list).
- The latter fact can also be derived via mutual information: since $B$ is a deterministic function of $X$, $H(B|X)=0$, so $I(X,B)=H(B)-H(B|X)=H(B)$; and also $I(X,B)=D(p(x,b)\|p(x)p(b))=\E_B\lsrs{D(p(x|b)\|p(x))}$, the expected KL divergence.
- Is there some deeper connection here? Are both of these sets of observations specific cases of some more general statement about entropies and measurements?
- Do the KL divergence things apply in the quantum setting as well?
- Is this "fidelity mutual information" studied/used anywhere?
One can also define a "mutual information like" quantity: $F(p(x,b),p(x)p(b))$. As in the KL divergence setting, this can also be written as: $$F(p(x,b),p(x)p(b))=\E_B\lsrs{F(p(x|b),p(x))}= \E_X\lsrs{F(p(b|x),p(b))}$$
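A quick numerical sanity check of this identity on an arbitrary random joint distribution, using the classical fidelity $F(p,q)=\sum_i\sqrt{p_i q_i}$ (the Bhattacharyya coefficient):

```python
import numpy as np

rng = np.random.default_rng(2)

def fid(p, q):
    """Classical fidelity (Bhattacharyya coefficient) between two distributions."""
    return np.sum(np.sqrt(p * q))

# random joint distribution p(x, b) over |X| = 5, |B| = 3
p_xb = rng.dirichlet(np.ones(15)).reshape(5, 3)
p_x = p_xb.sum(axis=1)
p_b = p_xb.sum(axis=0)

lhs = fid(p_xb.ravel(), np.outer(p_x, p_b).ravel())                      # F(p(x,b), p(x)p(b))
via_b = sum(p_b[b] * fid(p_xb[:, b] / p_b[b], p_x) for b in range(3))    # E_B[F(p(x|b), p(x))]
via_x = sum(p_x[x] * fid(p_xb[x, :] / p_x[x], p_b) for x in range(5))    # E_X[F(p(b|x), p(b))]

print(np.isclose(lhs, via_b), np.isclose(lhs, via_x))
```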
In the case where $B$ is a deterministic function of $X$, we have $F(p(x|b),p(x))=\sqrt{p(b)}$ for each $b$, so $$F(p(x,b),p(x)p(b)) = \E_B\lsrs{F(p(x|b),p(x))}=\sum_b p(b)\sqrt{p(b)}=\sum_b p(b)^{\frac{3}{2}}$$
The analogous computation in the KL divergence setting returns the entropy of $B$. And here, too, this seems to give an "entropy-like" quantity, in that it captures how spread out the distribution over $B$ is (indeed, $\sum_b p(b)^{3/2}$ is a monotone function of the Rényi entropy of order $3/2$).
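And a check of the deterministic case, with the same kind of arbitrary prior and deterministic map $f$ as before:

```python
import numpy as np

rng = np.random.default_rng(3)

# prior over X and a deterministic observation b = f(x)
p_x = rng.dirichlet(np.ones(6))
f = np.array([0, 0, 1, 1, 1, 2])
p_b = np.array([p_x[f == b].sum() for b in range(3)])

# joint p(x, b) concentrated on the graph of f
p_xb = np.zeros((6, 3))
p_xb[np.arange(6), f] = p_x

fidelity_mi = np.sum(np.sqrt(p_xb * np.outer(p_x, p_b)))    # F(p(x,b), p(x)p(b))
print(np.isclose(fidelity_mi, np.sum(p_b ** 1.5)))          # equals sum_b p(b)^{3/2}
```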
Q2: Is this an interesting quantity? Is it used anywhere?
(see the Piazza post for a potential answer)