1. F. Bavaud, “Information theory, relative entropy and statistics,” in Formal Theories of Information, Lecture Notes in Computer Science Vol. 5363, edited by G. Sommaruga (Springer, Berlin, 2009), pp. 54–78.
2. R. Bhatia, Matrix Analysis (Springer, 1997).
3. E. Lieb, “Convex trace functions and the Wigner-Yanase-Dyson conjecture,” Adv. Math. 11, 267–288 (1973). http://dx.doi.org/10.1016/0001-8708(73)90011-X
4. F. Nielsen, “A family of statistical symmetric divergences based on Jensen's inequality,” preprint arXiv:1009.4004v2 (2010).
5. M. Ohya and D. Petz, Quantum Entropy and Its Use (Springer, 1993).
6. Named after the technique Fiedler used to prove a well-known result in matrix analysis; see, e.g., Theorem VI.7.1 in Ref. 2.