1303.4467 (Alexey E. Rastegin)
Alexey E. Rastegin
We formulate novel uncertainty relations for mutually unbiased bases and symmetric informationally complete measurements in terms of the R\'{e}nyi and Tsallis entropies. For an arbitrary number of mutually unbiased bases in a finite-dimensional Hilbert space, a family of Tsallis $\alpha$-entropic bounds is derived for $\alpha\in(0;2]$. In terms of R\'{e}nyi entropies, lower bounds are given for $\alpha\in[2;\infty)$. Both state-dependent and state-independent forms of these bounds are given. We also obtain lower bounds in terms of the so-called symmetrized entropies. The presented results for mutually unbiased bases are one-parameter extensions of entropic bounds previously derived in the literature. Entropic relations for symmetric informationally complete measurements are examined as well. For a symmetric informationally complete measurement, we obtain state-independent lower bounds on its Tsallis $\alpha$-entropy for $\alpha\in(0;2]$ as well as on its R\'{e}nyi $\alpha$-entropy for $\alpha\in[2;\infty)$. These bounds rest essentially on the fact that the corresponding normalized states form a spherical 2-design. For a pair of symmetric informationally complete measurements, we further obtain an entropic bound of the Maassen-Uffink type. Both the state-dependent and state-independent formulations are briefly discussed.
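As a rough illustration of the quantities the abstract refers to (not code from the paper), the sketch below computes the standard Tsallis and R\'{e}nyi $\alpha$-entropies of the outcome distributions obtained by measuring a qubit state in the three mutually unbiased bases of dimension 2, i.e. the Pauli eigenbases. The test state and the chosen values of $\alpha$ are arbitrary; the paper's actual lower bounds are not reproduced here.

```python
import numpy as np

def tsallis_entropy(p, alpha):
    """Tsallis alpha-entropy: (sum_i p_i^alpha - 1) / (1 - alpha); Shannon limit as alpha -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return -np.sum(p * np.log(p))
    return (np.sum(p**alpha) - 1.0) / (1.0 - alpha)

def renyi_entropy(p, alpha):
    """Renyi alpha-entropy: log(sum_i p_i^alpha) / (1 - alpha); Shannon limit as alpha -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p**alpha)) / (1.0 - alpha)

# Three mutually unbiased bases in dimension 2: the Pauli eigenbases (columns are basis vectors).
mub_z = np.array([[1, 0], [0, 1]], dtype=complex)
mub_x = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
mub_y = np.array([[1, 1], [1j, -1j]], dtype=complex) / np.sqrt(2)
bases = [mub_z, mub_x, mub_y]

# An arbitrary pure test state and its outcome probabilities in each basis.
psi = np.array([np.cos(0.3), np.sin(0.3) * np.exp(0.7j)], dtype=complex)
probs = [np.abs(basis.conj().T @ psi) ** 2 for basis in bases]

for alpha in (0.5, 1.0, 2.0, 5.0):
    total_tsallis = sum(tsallis_entropy(p, alpha) for p in probs)
    total_renyi = sum(renyi_entropy(p, alpha) for p in probs)
    print(f"alpha={alpha}:  sum of Tsallis entropies = {total_tsallis:.4f},  "
          f"sum of Renyi entropies = {total_renyi:.4f}")
```

The entropy definitions and the qubit MUBs above are textbook material; the printed entropy sums are the state-dependent quantities that the paper's bounds constrain from below.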
View original:
http://arxiv.org/abs/1303.4467