The quantum Fisher information is a central quantity in quantum metrology and is the quantum analogue of the classical Fisher information.[1][2][3][4][5] The quantum Fisher information [math]\displaystyle{ F_{\rm Q}[\varrho,A] }[/math] of a state [math]\displaystyle{ \varrho }[/math] with respect to the observable [math]\displaystyle{ A }[/math] is defined as
[math]\displaystyle{ F_{\rm Q}[\varrho,A]=2\sum_{k,l}\frac{(\lambda_k-\lambda_l)^2}{\lambda_k+\lambda_l}\vert\langle k\vert A\vert l\rangle\vert^2, }[/math]
where [math]\displaystyle{ \lambda_k }[/math] and [math]\displaystyle{ \vert k \rangle }[/math] are the eigenvalues and eigenvectors of the density matrix [math]\displaystyle{ \varrho, }[/math] respectively, and the summation goes over all [math]\displaystyle{ k }[/math] and [math]\displaystyle{ l }[/math] such that [math]\displaystyle{ \lambda_k+\lambda_l\gt 0 }[/math].
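For concreteness, this definition can be evaluated numerically from the eigendecomposition of the density matrix. The following is a minimal Python/NumPy sketch; the helper name qfi and the example qubit state are illustrative choices, not a standard API.

```python
import numpy as np

def qfi(rho, A, tol=1e-12):
    """Quantum Fisher information F_Q[rho, A] from the eigendecomposition of rho."""
    lam, vecs = np.linalg.eigh(rho)          # eigenvalues lambda_k and eigenvectors |k>
    A_kl = vecs.conj().T @ A @ vecs          # matrix elements <k|A|l>
    F = 0.0
    for k in range(len(lam)):
        for l in range(len(lam)):
            if lam[k] + lam[l] > tol:        # skip terms with lambda_k + lambda_l = 0
                F += 2 * (lam[k] - lam[l])**2 / (lam[k] + lam[l]) * abs(A_kl[k, l])**2
    return F

# Illustrative example: a diagonal qubit state and A = sigma_x / 2
p = 0.8
rho = np.diag([p, 1 - p]).astype(complex)
A = np.array([[0, 1], [1, 0]], dtype=complex) / 2
print(qfi(rho, A))                           # (2p - 1)^2 = 0.36 for this state
```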
When the observable generates a unitary transformation of the system with a parameter [math]\displaystyle{ \theta }[/math] from initial state [math]\displaystyle{ \varrho_0 }[/math],
[math]\displaystyle{ \varrho(\theta)=\exp(-iA\theta)\varrho_0\exp(+iA\theta), }[/math]
the quantum Fisher information constrains the achievable precision in statistical estimation of the parameter [math]\displaystyle{ \theta }[/math] via the quantum Cramér–Rao bound as
[math]\displaystyle{ (\Delta\theta)^2 \ge \frac{1}{m F_{\rm Q}[\varrho,A]}, }[/math]
where [math]\displaystyle{ m }[/math] is the number of independent repetitions.
It is often desirable to estimate the magnitude of an unknown parameter [math]\displaystyle{ \alpha }[/math] that controls the strength of a system's Hamiltonian [math]\displaystyle{ H = \alpha A }[/math] with respect to a known observable [math]\displaystyle{ A }[/math] during a known dynamical time [math]\displaystyle{ t }[/math]. In this case, defining [math]\displaystyle{ \theta = \alpha t }[/math], so that [math]\displaystyle{ \theta A = t H }[/math], means estimates of [math]\displaystyle{ \theta }[/math] can be directly translated into estimates of [math]\displaystyle{ \alpha }[/math].
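As a brief numerical illustration of how the bound is used, the sketch below converts a quantum Cramér–Rao bound on [math]\displaystyle{ \theta }[/math] into a bound on [math]\displaystyle{ \alpha }[/math]; the numerical values are arbitrary assumptions, not taken from the text.

```python
import numpy as np

# Illustrative numbers: F_Q of the probe state, m repetitions, evolution time t
F_Q = 4.0        # e.g. a qubit in an equal superposition probed with A = sigma_z
m = 1000         # independent repetitions
t = 2.0          # known evolution time, so theta = alpha * t

var_theta = 1 / (m * F_Q)            # quantum Cramer-Rao bound on Var(theta)
var_alpha = var_theta / t**2         # since alpha = theta / t
print(np.sqrt(var_theta), np.sqrt(var_alpha))
```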
The classical Fisher information of measuring observable [math]\displaystyle{ B }[/math] on density matrix [math]\displaystyle{ \varrho(\theta) }[/math] is defined as [math]\displaystyle{ F[B,\theta]=\sum_b\frac{1}{p(b|\theta)}\left(\frac{\partial p(b|\theta)}{\partial \theta}\right)^2 }[/math], where [math]\displaystyle{ p(b|\theta)=\langle b\vert \varrho(\theta)\vert b \rangle }[/math] is the probability of obtaining outcome [math]\displaystyle{ b }[/math] when measuring observable [math]\displaystyle{ B }[/math] on the transformed density matrix [math]\displaystyle{ \varrho(\theta) }[/math], and [math]\displaystyle{ b }[/math] is the eigenvalue corresponding to the eigenvector [math]\displaystyle{ \vert b \rangle }[/math] of observable [math]\displaystyle{ B }[/math].
The quantum Fisher information is the supremum of the classical Fisher information over all such observables,[6]
[math]\displaystyle{ F_{\rm Q}[\varrho,A]=\sup_{B} F[B,\theta]. }[/math]
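This relation can be checked numerically on a simple qubit probe: measuring [math]\displaystyle{ \sigma_x }[/math] on an equatorial state attains the quantum Fisher information, while measuring [math]\displaystyle{ \sigma_z }[/math] yields no information about [math]\displaystyle{ \theta }[/math]. The sketch below (NumPy only; the function names and the probe state are illustrative assumptions) uses central finite differences for the derivatives of the outcome probabilities.

```python
import numpy as np

sz = np.array([[1, 0], [0, -1]], dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)

def evolve(rho0, A, theta):
    """rho(theta) = exp(-i A theta) rho0 exp(+i A theta), with exp via the spectrum of A."""
    w, v = np.linalg.eigh(A)
    U = v @ np.diag(np.exp(-1j * w * theta)) @ v.conj().T
    return U @ rho0 @ U.conj().T

def classical_fisher(rho0, A, B, theta, eps=1e-6):
    """F[B, theta] from the outcome probabilities p(b|theta) of measuring B on rho(theta)."""
    _, basis = np.linalg.eigh(B)                       # eigenvectors |b> of B
    def probs(th):
        r = evolve(rho0, A, th)
        return np.real(np.einsum('ib,ij,jb->b', basis.conj(), r, basis))
    dp = (probs(theta + eps) - probs(theta - eps)) / (2 * eps)
    p = probs(theta)
    keep = p > 1e-12
    return np.sum(dp[keep]**2 / p[keep])

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho0 = np.outer(plus, plus.conj())
# The QFI of this pure probe with generator sz/2 is 4 * variance = 1
print(classical_fisher(rho0, sz / 2, sx, theta=0.3))   # ~1: this measurement is optimal
print(classical_fisher(rho0, sz / 2, sz, theta=0.3))   # ~0: sigma_z outcomes do not depend on theta
```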
The quantum Fisher information equals the expectation value of [math]\displaystyle{ L_{\varrho}^2 }[/math], that is, [math]\displaystyle{ F_{\rm Q}[\varrho,A]={\rm Tr}(\varrho L_{\varrho}^2) }[/math], where [math]\displaystyle{ L_{\varrho} }[/math] is the symmetric logarithmic derivative, defined implicitly by
[math]\displaystyle{ \partial_{\theta}\varrho(\theta)=\tfrac{1}{2}\left(L_{\varrho}\varrho+\varrho L_{\varrho}\right). }[/math]
For a unitary encoding operation [math]\displaystyle{ \varrho(\theta)=\exp(-iA\theta)\varrho_0\exp(+iA\theta) }[/math], the quantum Fisher information can be computed as an integral,[7]
[math]\displaystyle{ F_{\rm Q}[\varrho,A]=2\int_0^{+\infty}{\rm Tr}\left(e^{-\varrho t}[\varrho,A]\,e^{-\varrho t}[A,\varrho]\right){\rm d}t, }[/math]
where [math]\displaystyle{ [\ ,\ ] }[/math] on the right-hand side denotes the commutator. It can also be expressed in terms of the Kronecker product and vectorization as[8]
[math]\displaystyle{ F_{\rm Q}[\varrho,A]=2\,{\rm vec}\left([\varrho,A]\right)^{\dagger}\left(\varrho^{*}\otimes \mathbb{1}+\mathbb{1}\otimes\varrho\right)^{-1}{\rm vec}\left([\varrho,A]\right), }[/math]
where [math]\displaystyle{ ^* }[/math] denotes the complex conjugate, and [math]\displaystyle{ ^\dagger }[/math] denotes the conjugate transpose. This formula holds for invertible density matrices. For non-invertible density matrices, the inverse above is replaced by the Moore–Penrose pseudoinverse. Alternatively, one can compute the quantum Fisher information for the invertible state [math]\displaystyle{ \rho_\nu=(1-\nu)\rho_0+\nu\pi }[/math], where [math]\displaystyle{ \pi }[/math] is an arbitrary full-rank density matrix, and then take the limit [math]\displaystyle{ \nu \rightarrow 0^+ }[/math] to obtain the quantum Fisher information for [math]\displaystyle{ \rho_0 }[/math]. The density matrix [math]\displaystyle{ \pi }[/math] can be, for example, [math]\displaystyle{ {\rm Identity}/\dim{\mathcal{H}} }[/math] in a finite-dimensional system, or a thermal state in infinite-dimensional systems.
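A sketch of the vectorization formula with the regularization described above (column-stacking vectorization, NumPy only; the helper name qfi_vec, the small value of ν, and the example state are illustrative assumptions):

```python
import numpy as np

def qfi_vec(rho, A, nu=1e-9):
    """QFI via 2 vec([rho,A])^dag (rho^* (x) 1 + 1 (x) rho)^(-1) vec([rho,A]),
    evaluated on the full-rank state rho_nu = (1 - nu) rho + nu * Identity/d
    as a stand-in for the limit nu -> 0+."""
    d = rho.shape[0]
    rho_nu = (1 - nu) * rho + nu * np.eye(d) / d
    comm = rho_nu @ A - A @ rho_nu                    # [rho, A]
    v = comm.reshape(-1, 1, order='F')                # column-stacking vectorization
    M = np.kron(rho_nu.conj(), np.eye(d)) + np.kron(np.eye(d), rho_nu)
    return (2 * v.conj().T @ np.linalg.solve(M, v)).real.item()

# Agrees with the eigendecomposition formula, e.g. 0.36 for the earlier qubit example
rho = np.diag([0.8, 0.2]).astype(complex)
A = np.array([[0, 1], [1, 0]], dtype=complex) / 2
print(qfi_vec(rho, A))
```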
For any differentiable parametrization of the density matrix [math]\displaystyle{ \varrho(\boldsymbol{\theta}) }[/math] by a vector of parameters [math]\displaystyle{ \boldsymbol{\theta}=(\theta_1,\dots,\theta_n) }[/math], the quantum Fisher information matrix is defined as
[math]\displaystyle{ F_{ij}(\boldsymbol{\theta})=\sum_{k,l}\frac{2\,{\rm Re}\left(\langle k\vert \partial_i \varrho\vert l\rangle\langle l\vert \partial_j \varrho\vert k\rangle\right)}{\lambda_k+\lambda_l}, }[/math]
where [math]\displaystyle{ \partial_i }[/math] denotes partial derivative with respect to parameter [math]\displaystyle{ \theta_i }[/math]. The formula also holds without taking the real part [math]\displaystyle{ \operatorname{Re} }[/math], because the imaginary part leads to an antisymmetric contribution that disappears under the sum. Note that all eigenvalues [math]\displaystyle{ \lambda_k }[/math] and eigenvectors [math]\displaystyle{ \vert k\rangle }[/math] of the density matrix potentially depend on the vector of parameters [math]\displaystyle{ \boldsymbol{\theta} }[/math].
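A numerical sketch of this definition, using central finite differences for the derivatives of [math]\displaystyle{ \varrho(\boldsymbol{\theta}) }[/math] (the helper names and the single-parameter qubit example are illustrative assumptions):

```python
import numpy as np

def qfi_matrix(rho_of_theta, theta, eps=1e-6, tol=1e-12):
    """Quantum Fisher information matrix of a parametrized density matrix rho(theta),
    with the partial derivatives approximated by central finite differences."""
    theta = np.asarray(theta, dtype=float)
    n = len(theta)
    lam, vecs = np.linalg.eigh(rho_of_theta(theta))
    d_rho = []                                       # <k| d_i rho |l> in the eigenbasis of rho
    for i in range(n):
        shift = np.zeros(n); shift[i] = eps
        d = (rho_of_theta(theta + shift) - rho_of_theta(theta - shift)) / (2 * eps)
        d_rho.append(vecs.conj().T @ d @ vecs)
    F = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            for k in range(len(lam)):
                for l in range(len(lam)):
                    if lam[k] + lam[l] > tol:
                        F[i, j] += 2 * np.real(d_rho[i][k, l] * d_rho[j][l, k]) / (lam[k] + lam[l])
    return F

# Single-parameter example: a pure qubit rotated about z; F reduces to the scalar QFI (about 1 here)
plus = np.full((2, 2), 0.5, dtype=complex)           # |+><+|
def rho_plus(t):
    U = np.diag([np.exp(-1j * t[0] / 2), np.exp(1j * t[0] / 2)])   # exp(-i sigma_z/2 * theta)
    return U @ plus @ U.conj().T
print(qfi_matrix(rho_plus, [0.3]))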
This definition is identical to four times the Bures metric, up to singular points where the rank of the density matrix changes (these are the points at which [math]\displaystyle{ \lambda_k+\lambda_l }[/math] suddenly becomes zero). Through this relation, it also connects with the quantum fidelity [math]\displaystyle{ F(\varrho,\sigma)=\left(\mathrm{tr}\left[\sqrt{\sqrt{\varrho}\sigma\sqrt{\varrho}}\right]\right)^2 }[/math] of two infinitesimally close states,[9]
where the inner sum goes over all [math]\displaystyle{ k }[/math] at which eigenvalues [math]\displaystyle{ \lambda_k(\boldsymbol{\theta})=0 }[/math]. The extra term (which is however zero in most applications) can be avoided by taking a symmetric expansion of fidelity,[10]
For [math]\displaystyle{ n=1 }[/math] and unitary encoding, the quantum Fisher information matrix reduces to the original definition.
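Away from the singular points mentioned above, the connection with fidelity can be verified numerically: for a single parameter, [math]\displaystyle{ F_{\rm Q}\approx 8\left(1-\sqrt{F(\varrho(\theta),\varrho(\theta+\delta))}\right)/\delta^2 }[/math] for small [math]\displaystyle{ \delta }[/math], up to higher-order corrections. A sketch assuming NumPy and an illustrative full-rank qubit family:

```python
import numpy as np

def psd_sqrt(m):
    """Matrix square root of a positive semidefinite matrix via its spectrum."""
    w, v = np.linalg.eigh(m)
    return v @ np.diag(np.sqrt(np.clip(w, 0, None))) @ v.conj().T

def fidelity(rho, sigma):
    """F(rho, sigma) = (Tr sqrt(sqrt(rho) sigma sqrt(rho)))^2."""
    s = psd_sqrt(rho)
    return np.real(np.trace(psd_sqrt(s @ sigma @ s)))**2

rho0 = np.array([[0.6, 0.25], [0.25, 0.4]], dtype=complex)    # a full-rank qubit state
def rho(theta):
    U = np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])   # exp(-i sigma_z/2 * theta)
    return U @ rho0 @ U.conj().T

theta, delta = 0.2, 1e-3
print(8 * (1 - np.sqrt(fidelity(rho(theta), rho(theta + delta)))) / delta**2)
# ~0.25 for this example, matching F_Q[rho0, sigma_z/2] from the eigendecomposition formula
```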
The quantum Fisher information matrix is part of a wider family of quantum statistical distances.[11]
Assuming that [math]\displaystyle{ \vert \psi_0(\theta)\rangle }[/math] is a ground state of a parameter-dependent non-degenerate Hamiltonian [math]\displaystyle{ H(\theta) }[/math], the quantum Fisher information of this state, divided by four, is called the fidelity susceptibility, and is denoted[12]
[math]\displaystyle{ \chi_F=\langle \partial_\theta \psi_0 \vert \partial_\theta \psi_0\rangle-\vert\langle \psi_0 \vert \partial_\theta \psi_0\rangle\vert^2. }[/math]
Fidelity susceptibility measures the sensitivity of the ground state to the parameter, and its divergence indicates a quantum phase transition. This is because of the aforementioned connection with fidelity: a diverging quantum Fisher information means that [math]\displaystyle{ \vert\psi_0(\theta)\rangle }[/math] and [math]\displaystyle{ \vert\psi_0(\theta+d\theta)\rangle }[/math] become orthogonal to each other for any infinitesimal change in the parameter [math]\displaystyle{ d\theta }[/math], and thus the system is said to undergo a phase transition at the point [math]\displaystyle{ \theta }[/math].
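A small numerical sketch of this idea, using an assumed two-level model Hamiltonian [math]\displaystyle{ H(\theta)=\theta\sigma_z+\sigma_x }[/math] (which has an avoided crossing rather than a true phase transition, so the susceptibility peaks at [math]\displaystyle{ \theta=0 }[/math] instead of diverging); one common finite-difference convention, [math]\displaystyle{ \chi_F\approx(1-\vert\langle\psi_0(\theta)\vert\psi_0(\theta+\delta)\rangle\vert^2)/\delta^2 }[/math], is used:

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def ground_state(theta):
    """Ground state of the illustrative Hamiltonian H(theta) = theta * sz + sx."""
    _, v = np.linalg.eigh(theta * sz + sx)
    return v[:, 0]                                    # eigh sorts eigenvalues in ascending order

def fidelity_susceptibility(theta, delta=1e-4):
    overlap = np.vdot(ground_state(theta), ground_state(theta + delta))
    return (1 - abs(overlap)**2) / delta**2

print(fidelity_susceptibility(0.0))    # ~0.25: maximal sensitivity at the avoided crossing
print(fidelity_susceptibility(2.0))    # ~0.01: the ground state barely changes far from it
```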
The quantum Fisher information equals four times the variance for pure states:
[math]\displaystyle{ F_{\rm Q}[\vert \Psi\rangle,A]=4\left(\langle \Psi\vert A^2\vert \Psi\rangle-\langle \Psi\vert A\vert \Psi\rangle^2\right). }[/math]
For mixed states, when the probabilities are parameter independent, i.e., when [math]\displaystyle{ p(\theta)=p }[/math], the quantum Fisher information is convex:
[math]\displaystyle{ F_{\rm Q}[p\varrho_1+(1-p)\varrho_2,A]\le p F_{\rm Q}[\varrho_1,A]+(1-p)F_{\rm Q}[\varrho_2,A]. }[/math]
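A quick numerical check of the convexity property for a mixture of two pure qubit states (NumPy only; the qfi helper repeats the eigendecomposition formula given earlier, and the states are illustrative choices):

```python
import numpy as np

def qfi(rho, A, tol=1e-12):
    lam, vecs = np.linalg.eigh(rho)
    Akl = vecs.conj().T @ A @ vecs
    num = (lam[:, None] - lam[None, :])**2
    den = lam[:, None] + lam[None, :]
    keep = den > tol
    return float(np.sum(2 * num[keep] / den[keep] * np.abs(Akl[keep])**2))

sz = np.array([[1, 0], [0, -1]], dtype=complex) / 2
psi1 = np.array([1, 1], dtype=complex) / np.sqrt(2)            # |+>
psi2 = np.array([1, 1j], dtype=complex) / np.sqrt(2)           # |+i>
rho1, rho2 = np.outer(psi1, psi1.conj()), np.outer(psi2, psi2.conj())
p = 0.3
lhs = qfi(p * rho1 + (1 - p) * rho2, sz)                       # ~0.58
rhs = p * qfi(rho1, sz) + (1 - p) * qfi(rho2, sz)              # = 1
print(lhs <= rhs)
```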
The quantum Fisher information is the largest function that is convex and that equals four times the variance for pure states. That is, it equals four times the convex roof of the variance,[13][14]
[math]\displaystyle{ F_{\rm Q}[\varrho,A]=4\inf_{\{p_k,\vert\Psi_k\rangle\}}\sum_k p_k (\Delta A)^2_{\Psi_k}, }[/math]
where the infimum is over all decompositions of the density matrix
[math]\displaystyle{ \varrho=\sum_k p_k \vert \Psi_k\rangle\langle \Psi_k\vert. }[/math]
Note that [math]\displaystyle{ \vert \Psi_k\rangle }[/math] are not necessarily orthogonal to each other. The above optimization can be rewritten as an optimization over the two-copy space as [15]
such that [math]\displaystyle{ \varrho_{12} }[/math] is a symmetric separable state and
Later, the above statement was proved even for the case of a minimization over general (not necessarily symmetric) separable states.[16]
When the probabilities are [math]\displaystyle{ \theta }[/math]-dependent, an extended-convexity relation has been proved:[17]
[math]\displaystyle{ F_{\rm Q}\left[\sum_i p_i(\theta)\varrho_i(\theta)\right]\le \sum_i p_i(\theta)F_{\rm Q}[\varrho_i(\theta)]+F_{\rm C}[\{p_i(\theta)\}], }[/math]
where [math]\displaystyle{ F_{\rm C}[\{p_i(\theta)\}]=\sum_i \frac{\left(\partial_{\theta} p_i(\theta)\right)^2}{p_i(\theta)} }[/math] is the classical Fisher information associated with the probabilities contributing to the convex decomposition. The first term on the right-hand side of the above inequality can be considered as the average quantum Fisher information of the density matrices in the convex decomposition.
We need to understand the behavior of the quantum Fisher information in composite systems in order to study the quantum metrology of many-particle systems.[18] For product states,
[math]\displaystyle{ F_{\rm Q}[\varrho_1\otimes\varrho_2, A_1\otimes\mathbb{1}+\mathbb{1}\otimes A_2]=F_{\rm Q}[\varrho_1,A_1]+F_{\rm Q}[\varrho_2,A_2] }[/math]
holds.
For the reduced state, we have
[math]\displaystyle{ F_{\rm Q}[\varrho_{12}, A_1\otimes\mathbb{1}]\ge F_{\rm Q}[\varrho_{1},A_1], }[/math]
where [math]\displaystyle{ \varrho_{1}={\rm Tr}_2(\varrho_{12}) }[/math].
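Both properties can be checked numerically on small examples (NumPy only; the states below are arbitrary illustrative choices, and the qfi helper is the eigendecomposition formula from above):

```python
import numpy as np

def qfi(rho, A, tol=1e-12):
    lam, vecs = np.linalg.eigh(rho)
    Akl = vecs.conj().T @ A @ vecs
    num = (lam[:, None] - lam[None, :])**2
    den = lam[:, None] + lam[None, :]
    keep = den > tol
    return float(np.sum(2 * num[keep] / den[keep] * np.abs(Akl[keep])**2))

sz = np.array([[1, 0], [0, -1]], dtype=complex) / 2
I2 = np.eye(2, dtype=complex)
rho1 = np.array([[0.7, 0.2], [0.2, 0.3]], dtype=complex)
rho2 = np.array([[0.6, 0.1j], [-0.1j, 0.4]], dtype=complex)

# Additivity for product states: F_Q[rho1 x rho2, A x 1 + 1 x A] = F_Q[rho1, A] + F_Q[rho2, A]
A12 = np.kron(sz, I2) + np.kron(I2, sz)
print(qfi(np.kron(rho1, rho2), A12), qfi(rho1, sz) + qfi(rho2, sz))      # both ~0.20

# Monotonicity for the reduced state: F_Q[rho12, A x 1] >= F_Q[rho1, A]
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
rho12 = np.outer(bell, bell.conj())
rho_red = rho12.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)              # partial trace over subsystem 2
print(qfi(rho12, np.kron(sz, I2)), qfi(rho_red, sz))                     # 1.0 >= 0.0
```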
There are strong links between quantum metrology and quantum information science. For a multiparticle system of [math]\displaystyle{ N }[/math] spin-1/2 particles,[19] the inequality
[math]\displaystyle{ F_{\rm Q}[\varrho, J_z]\le N }[/math]
holds for separable states, where
[math]\displaystyle{ J_z=\sum_{n=1}^N j_z^{(n)}, }[/math]
and [math]\displaystyle{ j_z^{(n)} }[/math] is a single-particle angular momentum component. The maximum for general quantum states is given by
[math]\displaystyle{ F_{\rm Q}[\varrho, J_z]\le N^2. }[/math]
Moreover, for quantum states with an entanglement depth [math]\displaystyle{ k }[/math], the bound
[math]\displaystyle{ F_{\rm Q}[\varrho, J_z]\le sk^2+r^2 }[/math]
holds, where [math]\displaystyle{ s=\lfloor N/k \rfloor }[/math] is the largest integer smaller than or equal to [math]\displaystyle{ N/k, }[/math] and [math]\displaystyle{ r=N-sk }[/math] is the remainder from dividing [math]\displaystyle{ N }[/math] by [math]\displaystyle{ k }[/math]. Hence, higher and higher levels of multipartite entanglement are needed to achieve better and better accuracy in parameter estimation.[20][21] It is possible to obtain a weaker but simpler bound,[22]
[math]\displaystyle{ F_{\rm Q}[\varrho, J_z]\le Nk. }[/math]
Hence, a lower bound on the entanglement depth is obtained as
[math]\displaystyle{ \frac{F_{\rm Q}[\varrho, J_z]}{N}\le k. }[/math]
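These bounds can be illustrated numerically for a small number of qubits; the sketch below compares a product state ([math]\displaystyle{ F_{\rm Q}=N }[/math]) with a Greenberger–Horne–Zeilinger state ([math]\displaystyle{ F_{\rm Q}=N^2 }[/math]) and evaluates the simpler entanglement-depth bound (NumPy only; [math]\displaystyle{ N=4 }[/math] and the helper names are arbitrary choices):

```python
import numpy as np
from functools import reduce

def qfi(rho, A, tol=1e-12):
    lam, vecs = np.linalg.eigh(rho)
    Akl = vecs.conj().T @ A @ vecs
    num = (lam[:, None] - lam[None, :])**2
    den = lam[:, None] + lam[None, :]
    keep = den > tol
    return float(np.sum(2 * num[keep] / den[keep] * np.abs(Akl[keep])**2))

N = 4
sz = np.array([[1, 0], [0, -1]], dtype=complex) / 2
I2 = np.eye(2, dtype=complex)

# Collective operator J_z = sum_n j_z^(n)
Jz = sum(reduce(np.kron, [sz if m == n else I2 for m in range(N)]) for n in range(N))

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
product = reduce(np.kron, [np.outer(plus, plus.conj())] * N)       # separable: F_Q = N
ghz = np.zeros(2**N, dtype=complex); ghz[0] = ghz[-1] = 1 / np.sqrt(2)
ghz_state = np.outer(ghz, ghz.conj())                              # GHZ: F_Q = N^2

for name, rho in [("product", product), ("GHZ", ghz_state)]:
    F = qfi(rho, Jz)
    depth = int(np.ceil(F / N - 1e-9))                             # from F_Q <= k N
    print(name, round(F, 6), "entanglement depth at least", depth)
```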
The Wigner–Yanase skew information is defined as[23]
[math]\displaystyle{ I(\varrho,H)={\rm Tr}(\varrho H^2)-{\rm Tr}(\sqrt{\varrho}H\sqrt{\varrho}H). }[/math]
It follows that [math]\displaystyle{ I(\varrho,H) }[/math] is convex in [math]\displaystyle{ \varrho. }[/math]
For the quantum Fisher information and the Wigner–Yanase skew information, the inequality
[math]\displaystyle{ I(\varrho,H)\le \frac{F_{\rm Q}[\varrho,H]}{4} }[/math]
holds, with equality for pure states.
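A numerical check of this inequality for an illustrative mixed qubit state (NumPy only; psd_sqrt computes the matrix square root through the spectrum, and the helper names are not a standard API):

```python
import numpy as np

def qfi(rho, A, tol=1e-12):
    lam, vecs = np.linalg.eigh(rho)
    Akl = vecs.conj().T @ A @ vecs
    num = (lam[:, None] - lam[None, :])**2
    den = lam[:, None] + lam[None, :]
    keep = den > tol
    return float(np.sum(2 * num[keep] / den[keep] * np.abs(Akl[keep])**2))

def psd_sqrt(m):
    w, v = np.linalg.eigh(m)
    return v @ np.diag(np.sqrt(np.clip(w, 0, None))) @ v.conj().T

def skew_information(rho, H):
    """I(rho, H) = Tr(rho H^2) - Tr(sqrt(rho) H sqrt(rho) H)."""
    s = psd_sqrt(rho)
    return float(np.real(np.trace(rho @ H @ H) - np.trace(s @ H @ s @ H)))

sz = np.array([[1, 0], [0, -1]], dtype=complex) / 2
rho = np.array([[0.7, 0.2], [0.2, 0.3]], dtype=complex)
print(skew_information(rho, sz), "<=", qfi(rho, sz) / 4)    # the gap closes for pure states
```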
For any decomposition of the density matrix given by [math]\displaystyle{ p_k }[/math] and [math]\displaystyle{ \vert \Psi_k\rangle, }[/math] the relation[13]
[math]\displaystyle{ (\Delta A)^2\ge\sum_k p_k (\Delta A)^2_{\Psi_k}\ge\frac{F_{\rm Q}[\varrho,A]}{4} }[/math]
holds, where both inequalities are tight. That is, there is a decomposition for which the second inequality is saturated, which is the same as stating that the quantum Fisher information divided by four is the convex roof of the variance, as discussed above. There is also a decomposition for which the first inequality is saturated, which means that the variance is its own concave roof,[13]
[math]\displaystyle{ (\Delta A)^2=\sup_{\{p_k,\vert\Psi_k\rangle\}}\sum_k p_k (\Delta A)^2_{\Psi_k}. }[/math]
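The chain of inequalities can be checked for any particular decomposition, for instance the eigendecomposition of [math]\displaystyle{ \varrho }[/math] (NumPy only; the state is an illustrative choice):

```python
import numpy as np

def qfi(rho, A, tol=1e-12):
    lam, vecs = np.linalg.eigh(rho)
    Akl = vecs.conj().T @ A @ vecs
    num = (lam[:, None] - lam[None, :])**2
    den = lam[:, None] + lam[None, :]
    keep = den > tol
    return float(np.sum(2 * num[keep] / den[keep] * np.abs(Akl[keep])**2))

sz = np.array([[1, 0], [0, -1]], dtype=complex) / 2
rho = np.array([[0.7, 0.2], [0.2, 0.3]], dtype=complex)

# One particular decomposition: the eigendecomposition rho = sum_k lam_k |k><k|
lam, vecs = np.linalg.eigh(rho)
avg_var = sum(
    lam[k] * np.real(np.vdot(vecs[:, k], sz @ sz @ vecs[:, k])
                     - np.vdot(vecs[:, k], sz @ vecs[:, k])**2)
    for k in range(len(lam))
)
var_rho = np.real(np.trace(rho @ sz @ sz) - np.trace(rho @ sz)**2)
print(var_rho, ">=", avg_var, ">=", qfi(rho, sz) / 4)       # 0.21 >= 0.125 >= 0.04
```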
Knowing that the quantum Fisher information is four times the convex roof of the variance, we obtain the relation [24] [math]\displaystyle{ (\Delta A)^2 F_Q[\varrho,B] \geq \vert \langle i[A,B]\rangle\vert^2, }[/math] which is stronger than the Heisenberg uncertainty relation. For a particle of spin [math]\displaystyle{ j, }[/math] the following uncertainty relation holds: [math]\displaystyle{ (\Delta J_x)^2+(\Delta J_y)^2+(\Delta J_z)^2\ge j, }[/math] where [math]\displaystyle{ J_l }[/math] are the angular momentum components. The relation can be strengthened as [25][26] [math]\displaystyle{ (\Delta J_x)^2+(\Delta J_y)^2+F_Q[\varrho,J_z]/4\ge j. }[/math]
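Both relations above can be verified numerically for a spin-1/2 ([math]\displaystyle{ j=1/2 }[/math]) example (NumPy only; the state is an arbitrary illustrative choice):

```python
import numpy as np

def qfi(rho, A, tol=1e-12):
    lam, vecs = np.linalg.eigh(rho)
    Akl = vecs.conj().T @ A @ vecs
    num = (lam[:, None] - lam[None, :])**2
    den = lam[:, None] + lam[None, :]
    keep = den > tol
    return float(np.sum(2 * num[keep] / den[keep] * np.abs(Akl[keep])**2))

def variance(rho, A):
    return float(np.real(np.trace(rho @ A @ A) - np.trace(rho @ A)**2))

Jx = np.array([[0, 1], [1, 0]], dtype=complex) / 2
Jy = np.array([[0, -1j], [1j, 0]], dtype=complex) / 2
Jz = np.array([[1, 0], [0, -1]], dtype=complex) / 2
rho = np.array([[0.65, 0.2 - 0.1j], [0.2 + 0.1j, 0.35]], dtype=complex)

# (Delta A)^2 F_Q[rho, B] >= |<i[A, B]>|^2 with A = J_x, B = J_z
comm = 1j * (Jx @ Jz - Jz @ Jx)
print(variance(rho, Jx) * qfi(rho, Jz), ">=", abs(np.trace(rho @ comm))**2)

# (Delta J_x)^2 + (Delta J_y)^2 + F_Q[rho, J_z]/4 >= j  (saturated for a single spin-1/2)
print(variance(rho, Jx) + variance(rho, Jy) + qfi(rho, Jz) / 4, ">=", 0.5)
```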
Original source: https://en.wikipedia.org/wiki/Quantum_Fisher_information