fidelity and relative entropy (KL divergence) inequalities for post-measurement distributions

#question #math #physics
fidelity relative-entropy quantum information

One can also define a "mutual-information-like" quantity: $F(p(x,b),p(x)p(b))$. As in the KL divergence setting, we can show that this can also be written as: $$F(p(x,b),p(x)p(b))=\E_B\lsrs{F(p(x|b),p(x))}= \E_X\lsrs{F(p(b|x),p(b))}$$
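A quick derivation of the first equality (assuming the classical Bhattacharyya form of the fidelity for distributions, $F(p,q)=\sum_i\sqrt{p_iq_i}$, as used below): $$F(p(x,b),p(x)p(b))=\sum_{x,b}\sqrt{p(x,b)\,p(x)p(b)}=\sum_b p(b)\sum_x\sqrt{p(x|b)\,p(x)}=\E_B\lsrs{F(p(x|b),p(x))}$$ The second equality follows from the same computation with the roles of $X$ and $B$ swapped.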
In the case where $B$ is a deterministic function of $X$, you get that $$F(p(x,b),p(x)p(b)) = \E_B\lsrs{F(p(x|b),p(x))}=\sum_bp(b)\sqrt{p(b)}=\sum_bp(b)^{\frac{3}{2}}$$
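The middle equality here holds because, writing $B=f(X)$ for the deterministic map ($f$ is just notation introduced for this step), $p(x|b)=p(x)/p(b)$ for $x$ with $f(x)=b$ and $0$ otherwise, so $$F(p(x|b),p(x))=\sum_{x:f(x)=b}\sqrt{\frac{p(x)}{p(b)}\,p(x)}=\frac{1}{\sqrt{p(b)}}\sum_{x:f(x)=b}p(x)=\sqrt{p(b)}$$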
The analogous computation in the KL divergence setting returns the entropy of $B$ (i.e. $I(X;B)=H(B)$ when $B$ is a deterministic function of $X$). And here, this does seem to give an "entropy-like" quantity, in that it captures how spread out your distribution over $B$ is.
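As a sanity check on that claim: a point mass on a single value of $b$ gives $\sum_bp(b)^{3/2}=1$, while the uniform distribution on $n$ values gives $n\cdot n^{-3/2}=1/\sqrt{n}$, so the quantity decreases as $B$ spreads out, consistent with the fidelity between the joint and the product shrinking as the correlation grows. One way to make the "entropy-like" reading precise: it is a monotone function of the Rényi entropy of order $3/2$, since $H_{3/2}(B)=\frac{1}{1-3/2}\log\sum_bp(b)^{3/2}$ gives $$\sum_bp(b)^{3/2}=e^{-\frac{1}{2}H_{3/2}(B)}$$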

Q2: Is this an interesting quantity? Is it used anywhere?

(see Piazza post for a possible answer)