Projection filters are a set of algorithms based on stochastic analysis and information geometry, or the differential geometric approach to statistics, used to find approximate solutions for filtering problems for nonlinear state-space systems.[1][2][3] The filtering problem consists of estimating the unobserved signal of a random dynamical system from partial noisy observations of the signal. The objective is to compute the probability distribution of the signal conditional on the history of the noise-perturbed observations. This distribution allows for calculation of all statistics of the signal given the history of observations. If this distribution has a density, the density satisfies specific stochastic partial differential equations (SPDEs) called the Kushner-Stratonovich equation or the Zakai equation. It is known that the nonlinear filter density evolves in an infinite dimensional function space.[4][5]
One can choose a finite dimensional family of probability densities, for example Gaussian densities, Gaussian mixtures, or exponential families, with which to approximate the infinite-dimensional filter density. The basic idea of the projection filter is to use a geometric structure in the chosen space of densities to project the infinite dimensional SPDE of the optimal filter onto the chosen finite dimensional family, obtaining a finite dimensional stochastic differential equation (SDE) for the parameter of the density in the finite dimensional family that approximates the full filter evolution.[3] To do this, the chosen finite dimensional family is equipped with a manifold structure as in information geometry. The projection filter was tested against the optimal filter for the cubic sensor problem, where it could effectively track the bimodal densities of the optimal filter, which would have been difficult to approximate with standard algorithms like the extended Kalman filter.[2][6] Projection filters are well suited to on-line estimation, as they replace the full filter evolution with a finite dimensional SDE for the parameter that is quick to implement and runs efficiently in time.[2] Projection filters are also flexible, as they allow fine tuning of the precision of the approximation by choosing richer approximating families, and some exponential families make the correction step in the projection filtering algorithm exact.[3] Some formulations coincide with heuristic-based assumed density filters[3] or with Galerkin methods.[6] Projection filters can also approximate the full infinite-dimensional filter in an optimal way, beyond the optimal approximation of the SPDE coefficients alone, according to precise criteria such as mean square minimization.[7] Projection filters have been studied by the Swedish Defense Research Agency[1] and have also been successfully applied to a variety of fields including navigation, ocean dynamics, quantum optics and quantum systems, estimation of fiber diameters, estimation of chaotic time series, change point detection and other areas.[8]
The term "projection filter" was first coined in 1987 by Bernard Hanzon,[9] and the related theory and numerical examples were fully developed, expanded and made rigorous during the Ph.D. work of Damiano Brigo, in collaboration with Bernard Hanzon and Francois LeGland.[10][2][3] These works dealt with the projection filters in Hellinger distance and Fisher information metric, that were used to project the optimal filter infinite-dimensional SPDE on a chosen exponential family. The exponential family can be chosen so as to make the prediction step of the filtering algorithm exact.[2] A different type of projection filters, based on an alternative projection metric, the direct [math]\displaystyle{ L^2 }[/math] metric, was introduced in Armstrong and Brigo (2016).[6] With this metric, the projection filters on families of mixture distributions coincide with filters based on Galerkin methods. Later on, Armstrong, Brigo and Rossi Ferrucci (2021)[7] derive optimal projection filters that satisfy specific optimality criteria in approximating the infinite dimensional optimal filter. Indeed, the Stratonovich-based projection filters optimized the approximations of the SPDE separate coefficients on the chosen manifold but not the SPDE solution as a whole. This has been dealt with by introducing the optimal projection filters. The innovation here is to work directly with Ito calculus, instead of resorting to the Stratonovich calculus version of the filter equation. This is based on research on the geometry of Ito Stochastic differential equations on manifolds based on the jet bundle, the so-called 2-jet interpretation of Ito stochastic differential equations on manifolds.[11]
Here the derivation of the different projection filters is sketched.
The derivation covers both the initial filter in the Hellinger/Fisher metric, sketched by Hanzon[9] and fully developed by Brigo, Hanzon and LeGland,[10][2] and the later projection filter in the direct [math]\displaystyle{ L^2 }[/math] metric by Armstrong and Brigo (2016).[6]
It is assumed that the unobserved random signal [math]\displaystyle{ X_t \in \R^m }[/math] is modelled by the Ito stochastic differential equation [math]\displaystyle{ d X_t = f(X_t,t)\, dt + \sigma(X_t,t)\, dW_t\ , }[/math]
where f and [math]\displaystyle{ \sigma\, dW }[/math] are [math]\displaystyle{ \R^m }[/math] valued and [math]\displaystyle{ W_t }[/math] is a Brownian motion. Validity of all regularity conditions necessary for the results to hold will be assumed, with details given in the references. The associated noisy observation process [math]\displaystyle{ Y_t \in \R^d }[/math] is modelled by [math]\displaystyle{ d Y_t = b(X_t,t)\, dt + dV_t\ , }[/math]
where [math]\displaystyle{ b }[/math] is [math]\displaystyle{ \R^d }[/math] valued and [math]\displaystyle{ V_t }[/math] is a Brownian motion independent of [math]\displaystyle{ W_t }[/math]. As hinted above, the full filter is the conditional distribution of [math]\displaystyle{ X_t }[/math] given a prior for [math]\displaystyle{ X_0 }[/math] and the history of [math]\displaystyle{ Y }[/math] up to time [math]\displaystyle{ t }[/math]. If this distribution has a density described informally as [math]\displaystyle{ p_t(x)\, dx = P\left(X_t \in dx \mid \sigma(Y_s,\ s\leq t)\right), }[/math]
where [math]\displaystyle{ \sigma(Y_s, s\leq t) }[/math] is the sigma-field generated by the history of noisy observations [math]\displaystyle{ Y }[/math] up to time [math]\displaystyle{ t }[/math], under suitable technical conditions the density [math]\displaystyle{ p_t }[/math] satisfies the Kushner-Stratonovich SPDE [math]\displaystyle{ d p_t = {\cal L}^*_t\, p_t\, dt + p_t\, \left[ b(\cdot,t) - E_{p_t}[b(\cdot,t)] \right]^T \left[ dY_t - E_{p_t}[b(\cdot,t)]\, dt \right]\ , }[/math]
where [math]\displaystyle{ E_p }[/math] is the expectation [math]\displaystyle{ E_p[h] = \int h(x) p(x) dx, }[/math] and the forward diffusion operator [math]\displaystyle{ {\cal L}^*_t }[/math] is [math]\displaystyle{ {\cal L}^*_t\, p(x) = -\sum_{i=1}^m \frac{\partial}{\partial x_i}\left[ f_i(x,t)\, p(x) \right] + \frac{1}{2} \sum_{i,j=1}^m \frac{\partial^2}{\partial x_i \partial x_j}\left[ a_{ij}(x,t)\, p(x) \right]\ , }[/math]
where [math]\displaystyle{ a=\sigma \sigma^T }[/math] and [math]\displaystyle{ T }[/math] denotes transposition. To derive the first version of the projection filters, one needs to put the [math]\displaystyle{ p_t }[/math] SPDE in Stratonovich form. One obtains [math]\displaystyle{ d p_t = {\cal L}^*_t\, p_t\, dt - \frac{1}{2}\, p_t\, \left[ \vert b(\cdot,t) \vert^2 - E_{p_t}[\vert b(\cdot,t)\vert^2] \right] dt + p_t\, \left[ b(\cdot,t) - E_{p_t}[b(\cdot,t)] \right]^T \circ dY_t\ . }[/math]
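As an illustration of this setting, the following is a minimal Python sketch, not taken from the references, of how a sample path of the signal and the corresponding observation increments could be generated for a toy scalar cubic sensor model; the drift f(x) = -x, the diffusion coefficient sigma = 0.5, the observation function b(x) = x^3 and all variable names are illustrative assumptions.

```python
import numpy as np

# Toy scalar cubic sensor (illustrative assumptions, not from the references):
#   signal:       dX_t = f(X_t) dt + sigma dW_t,  with f(x) = -x, sigma = 0.5
#   observations: dY_t = b(X_t) dt + dV_t,        with b(x) = x**3
# simulated with a simple Euler-Maruyama scheme.

rng = np.random.default_rng(0)

def simulate_cubic_sensor(T=1.0, dt=1e-3, x0=0.0):
    f = lambda x: -x           # assumed signal drift
    sigma = 0.5                # assumed signal diffusion coefficient
    b = lambda x: x ** 3       # cubic observation function
    n_steps = int(T / dt)
    x = x0
    X, dY = [], []
    for _ in range(n_steps):
        dW = rng.normal(scale=np.sqrt(dt))   # Brownian increment of W
        dV = rng.normal(scale=np.sqrt(dt))   # Brownian increment of V
        x = x + f(x) * dt + sigma * dW       # Euler-Maruyama step for the signal
        X.append(x)
        dY.append(b(x) * dt + dV)            # observation increment dY_t
    return np.array(X), np.array(dY)

X_path, dY_increments = simulate_cubic_sensor()
```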
Through the chain rule, it is immediate to derive the SPDE for [math]\displaystyle{ d \sqrt{p_t} }[/math]. To shorten notation one may rewrite this last SPDE as [math]\displaystyle{ dp = F(p) \,dt + G^T(p) \circ dY\ , }[/math]
where the operators [math]\displaystyle{ F(p) }[/math] and [math]\displaystyle{ G^T(p) }[/math] are defined as [math]\displaystyle{ F(p) = {\cal L}^*_t\, p - \frac{1}{2}\, p\, \left[ \vert b(\cdot,t) \vert^2 - E_{p}[\vert b(\cdot,t)\vert^2] \right], \qquad G_k(p) = p\, \left[ b_k(\cdot,t) - E_{p}[b_k(\cdot,t)] \right], \quad k=1,\ldots,d\ . }[/math]
The square root version is [math]\displaystyle{ d \sqrt{p} = \frac{1}{2 \sqrt{p}}[ F(p) \,dt + G^T(p) \circ dY]\ . }[/math]
These are Stratonovich SPDEs whose solutions evolve in infinite dimensional function spaces. For example, [math]\displaystyle{ p_t }[/math] may evolve in [math]\displaystyle{ L^2 }[/math] (direct metric [math]\displaystyle{ d_2 }[/math]) [math]\displaystyle{ d_2(p_1,p_2) = \Vert p_1 - p_2 \Vert, }[/math]
or [math]\displaystyle{ \sqrt{p_t} }[/math] may evolve in [math]\displaystyle{ L^2 }[/math] (Hellinger metric [math]\displaystyle{ d_H }[/math]) [math]\displaystyle{ d_H(p_1,p_2) = \Vert \sqrt{p_1} - \sqrt{p_2} \Vert, }[/math]
where [math]\displaystyle{ \Vert\cdot\Vert }[/math] is the norm of the Hilbert space [math]\displaystyle{ L^2 }[/math]. In any case, [math]\displaystyle{ p_t }[/math] (or [math]\displaystyle{ \sqrt{p_t} }[/math]) will not evolve inside any finite dimensional family of densities [math]\displaystyle{ S_\Theta = \{p(\cdot,\theta),\ \theta \in \Theta \subset \R^n\}\ . }[/math]
The projection filter idea is approximating [math]\displaystyle{ p_t(x) }[/math] (or [math]\displaystyle{ \sqrt{p_t(x)} }[/math]) via a finite dimensional density [math]\displaystyle{ p(x,\theta_t) }[/math] (or [math]\displaystyle{ \sqrt{p(x,\theta_t)} }[/math]).
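For example, [math]\displaystyle{ S_\Theta }[/math] may be a Gaussian family parametrized by mean and variance, an exponential family [math]\displaystyle{ p(x,\theta) = \exp\left( \theta_1 c_1(x) + \cdots + \theta_n c_n(x) - \psi(\theta) \right) }[/math] with chosen sufficient statistics [math]\displaystyle{ c_1,\ldots,c_n }[/math] and normalizing function [math]\displaystyle{ \psi(\theta) }[/math], or a mixture family [math]\displaystyle{ p(x,\theta) = \theta_1 q_1(x) + \cdots + \theta_n q_n(x) + (1-\theta_1-\cdots-\theta_n)\, q_{n+1}(x) }[/math] with fixed component densities [math]\displaystyle{ q_1, \ldots, q_{n+1} }[/math].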
The fact that the filter SPDE is in Stratonovich form allows for the following. As Stratonovich SPDEs satisfy the chain rule, [math]\displaystyle{ F }[/math] and [math]\displaystyle{ G }[/math] behave as vector fields. Thus, the equation is characterized by a [math]\displaystyle{ dt }[/math] vector field [math]\displaystyle{ F }[/math] and a [math]\displaystyle{ dY_t }[/math] vector field [math]\displaystyle{ G }[/math]. For this version of the projection filter one is content to deal with the two vector fields separately. One may project [math]\displaystyle{ F }[/math] and [math]\displaystyle{ G }[/math] onto the tangent space of the densities in [math]\displaystyle{ S_\Theta }[/math] (direct metric) or of their square roots (Hellinger metric). The direct metric case yields [math]\displaystyle{ d p(\cdot,\theta_t) = \Pi_{p(\cdot,\theta_t)}[F(p(\cdot,\theta_t))]\, dt + \Pi_{p(\cdot,\theta_t)}[G^T(p(\cdot,\theta_t))] \circ dY_t\ , }[/math]
where [math]\displaystyle{ \Pi_{p(\cdot,\theta)} }[/math] is the tangent space projection at the point [math]\displaystyle{ p(\cdot,\theta) }[/math] for the manifold [math]\displaystyle{ S_\Theta }[/math]; when applied to a vector such as [math]\displaystyle{ G^T }[/math], it is assumed to act component-wise by projecting each of [math]\displaystyle{ G^T }[/math]'s components. A basis of this tangent space is [math]\displaystyle{ \frac{\partial p(\cdot,\theta)}{\partial \theta_1}, \ldots, \frac{\partial p(\cdot,\theta)}{\partial \theta_n}\ . }[/math]
By denoting the inner product of [math]\displaystyle{ L^2 }[/math] with [math]\displaystyle{ \langle \cdot, \cdot \rangle }[/math], one defines the metric [math]\displaystyle{ \gamma_{ij}(\theta) = \left\langle \frac{\partial p(\cdot,\theta)}{\partial \theta_i}, \frac{\partial p(\cdot,\theta)}{\partial \theta_j} \right\rangle, }[/math]
and the projection is thus [math]\displaystyle{ \Pi_{p(\cdot,\theta)}[v] = \sum_{i=1}^n \left[ \sum_{j=1}^n \gamma^{ij}(\theta) \left\langle v, \frac{\partial p(\cdot,\theta)}{\partial \theta_j} \right\rangle \right] \frac{\partial p(\cdot,\theta)}{\partial \theta_i}\ , }[/math]
where [math]\displaystyle{ \gamma^{ij} }[/math] is the inverse of [math]\displaystyle{ \gamma_{ij} }[/math]. The projected equation thus reads [math]\displaystyle{ d p(\cdot,\theta_t) = \sum_{i=1}^n \left[\sum_{j=1}^n \gamma^{ij}(\theta_t) \left\langle F(p(\cdot,\theta_t)), \frac{\partial p(\cdot,\theta_t)}{\partial \theta_j}\right\rangle\right] \frac{\partial p(\cdot,\theta_t)}{\partial \theta_i}\, dt + \sum_{i=1}^n \left[\sum_{j=1}^n \gamma^{ij}(\theta_t) \left\langle G^T(p(\cdot,\theta_t)), \frac{\partial p(\cdot,\theta_t)}{\partial \theta_j}\right\rangle\right] \frac{\partial p(\cdot,\theta_t)}{\partial \theta_i} \circ dY_t\ , }[/math]
which, expanding the left-hand side with the chain rule as [math]\displaystyle{ d p(\cdot,\theta_t) = \sum_{i=1}^n \frac{\partial p(\cdot,\theta_t)}{\partial \theta_i} \circ d\theta_i(t) }[/math], can be written as an equation for the parameter [math]\displaystyle{ \theta_t }[/math] alone,
where it has been crucial that Stratonovich calculus obeys the chain rule. From the above equation, the final projection filter SDE is [math]\displaystyle{ d \theta_i = \left[\sum_{j=1}^n \gamma^{ij}(\theta_t)\; \int F(p(x, \theta_t)) \; \frac{\partial p(x,\theta_t)}{\partial \theta_j} dx \right] dt + \sum_{k=1}^d\; \left[ \sum_{j=1}^n \gamma^{ij}(\theta_t)\; \int G_k(p(x, \theta_t)) \; \frac{\partial p(x,\theta_t)}{\partial \theta_j}\; d x \right] \circ dY_k }[/math]
with initial condition a chosen [math]\displaystyle{ \theta_0 }[/math].
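In practice, [math]\displaystyle{ \theta_0 }[/math] can be chosen so that [math]\displaystyle{ p(\cdot,\theta_0) }[/math] approximates the initial density of [math]\displaystyle{ X_0 }[/math] well, for example by moment matching or by projecting the initial density onto [math]\displaystyle{ S_\Theta }[/math] according to the chosen metric.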
By substituting the definitions of the operators F and G, we obtain the fully explicit projection filter equation in direct metric: [math]\displaystyle{ d \theta_i(t) = \left[\sum_{j=1}^n \gamma^{ij}(\theta_t)\; \int {\cal L}^*_t\, p(x, \theta_t) \; \frac{\partial p(x,\theta_t)}{\partial \theta_j}\, dx - \sum_{j=1}^n \gamma^{ij}(\theta_t)\; \int \frac{1}{2} \left[ \vert b(x,t)\vert^2 - \int \vert b(z,t)\vert^2 p(z,\theta_t)\, dz \right] p(x,\theta_t)\; \frac{\partial p(x,\theta_t)}{\partial \theta_j}\, dx \right] dt }[/math]
[math]\displaystyle{ + \sum_{k=1}^d\; \left[ \sum_{j=1}^n \gamma^{ij}(\theta_t)\; \int \left[ b_k(x,t) - \int b_k(z,t) p(z,\theta_t) dz \right] \; p(x,\theta_t) \; \frac{\partial p(x,\theta_t)}{\partial \theta_j}\; d x \right] \circ dY_t^k\ . }[/math]
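The following is a minimal Python sketch, under assumptions not taken from the references, of a single time step of the explicit direct-metric projection filter above: the approximating family is a Gaussian family parametrized by mean and log-variance, the model is the same toy scalar cubic sensor as in the earlier sketch (f(x) = -x, a = sigma^2 = 0.25, b(x) = x^3), integrals are approximated by Riemann sums on a grid, tangent vectors are obtained by finite differences, and the Stratonovich SDE for the parameter is advanced by a plain Euler step, which is adequate only as an illustration.

```python
import numpy as np

# One Euler step of the direct-metric projection filter on a Gaussian family
# (illustrative sketch; model, family, grid and discretisation are assumptions).

x = np.linspace(-6.0, 6.0, 2001)
dx = x[1] - x[0]

f = lambda s: -s                       # assumed signal drift
a = lambda s: 0.25 * np.ones_like(s)   # a = sigma**2 with sigma = 0.5
b = lambda s: s ** 3                   # cubic observation function

def p(theta):
    """Gaussian density on the grid, with theta = (mean, log variance)."""
    mu, logv = theta
    v = np.exp(logv)
    return np.exp(-(x - mu) ** 2 / (2.0 * v)) / np.sqrt(2.0 * np.pi * v)

def dp_dtheta(theta, eps=1e-5):
    """Tangent vectors d p(., theta)/d theta_j by central finite differences."""
    rows = []
    for j in range(len(theta)):
        e = np.zeros(len(theta)); e[j] = eps
        rows.append((p(theta + e) - p(theta - e)) / (2.0 * eps))
    return np.array(rows)                          # shape (n_params, n_grid)

def L_star(dens):
    """Forward diffusion operator -d/dx(f p) + 0.5 d^2/dx^2(a p) on the grid."""
    return (-np.gradient(f(x) * dens, dx)
            + 0.5 * np.gradient(np.gradient(a(x) * dens, dx), dx))

def projection_filter_step(theta, dY, dt):
    dens = p(theta)
    tang = dp_dtheta(theta)
    gamma = tang @ tang.T * dx                     # metric gamma_ij(theta)
    Eb = np.sum(b(x) * dens) * dx                  # E_p[b]
    Eb2 = np.sum(b(x) ** 2 * dens) * dx            # E_p[b^2]
    F = L_star(dens) - 0.5 * dens * (b(x) ** 2 - Eb2)   # F(p)
    G = dens * (b(x) - Eb)                               # G(p), scalar observation
    drift = np.linalg.solve(gamma, tang @ F * dx)  # gamma^{-1} <F, dp/dtheta>
    gain = np.linalg.solve(gamma, tang @ G * dx)   # gamma^{-1} <G, dp/dtheta>
    return theta + drift * dt + gain * dY          # Euler step for d theta

theta = np.array([0.0, np.log(0.25)])              # a chosen initial theta_0
theta = projection_filter_step(theta, dY=0.01, dt=1e-3)
```

Iterating this step over the observation increments produced, for instance, by the simulation sketch given earlier yields an approximate trajectory of [math]\displaystyle{ \theta_t }[/math]; richer or differently parametrized families only require changing the function p and its parametrization.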
If one uses the Hellinger distance instead, square roots of densities are needed. The tangent space basis is then [math]\displaystyle{ \frac{\partial \sqrt{p(\cdot,\theta)}}{\partial \theta_1}, \ldots, \frac{\partial \sqrt{p(\cdot,\theta)}}{\partial \theta_n}\ , }[/math]
and one defines the metric [math]\displaystyle{ \frac{g_{ij}(\theta)}{4} = \left\langle \frac{\partial \sqrt{p(\cdot,\theta)}}{\partial \theta_i}, \frac{\partial \sqrt{p(\cdot,\theta)}}{\partial \theta_j} \right\rangle = \frac{1}{4} \int \frac{1}{p(x,\theta)}\, \frac{\partial p(x,\theta)}{\partial \theta_i}\, \frac{\partial p(x,\theta)}{\partial \theta_j}\, dx\ . }[/math]
The metric [math]\displaystyle{ g }[/math] is the Fisher information metric. One follows steps completely analogous to the direct metric case and the filter equation in Hellinger/Fisher metric is [math]\displaystyle{ d \theta_i(t) = \left[\sum_{j=1}^n g^{ij}(\theta_t)\; \int \frac{F(p(x, \theta_t))}{p(x,\theta_t)} \; \frac{\partial p(x,\theta_t)}{\partial \theta_j}\, dx \right] dt + \sum_{k=1}^d\; \left[ \sum_{j=1}^n g^{ij}(\theta_t)\; \int \frac{G_k(p(x, \theta_t))}{p(x,\theta_t)} \; \frac{\partial p(x,\theta_t)}{\partial \theta_j}\, dx \right] \circ dY_t^k\ , }[/math]
again with initial condition a chosen [math]\displaystyle{ \theta_0 }[/math].
Substituting F and G one obtains [math]\displaystyle{ d \theta_i(t) = \left[ \sum_{j=1}^n g^{ij}(\theta_t)\; \int \frac{{\cal L}_t^\ast\, p(x,\theta_t)}{p(x,\theta_t)}\; \frac{\partial p(x,\theta_t)}{\partial \theta_j}\; dx - \sum_{j=1}^n g^{ij}(\theta_t) \int \frac{1}{2} \vert b(x,t) \vert^2 \frac{\partial p(x,\theta_t)}{\partial \theta_j} dx \right] dt }[/math]
[math]\displaystyle{ + \sum_{k=1}^d\; \left[ \sum_{j=1}^n g^{ij}(\theta_t)\; \int b_k(x,t)\; \frac{\partial p(x,\theta_t)}{\partial \theta_j}\; dx \right] \circ dY_t^k\ . }[/math]
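In the above equation, the Fisher metric can often be computed in closed form. For instance, when [math]\displaystyle{ S_\Theta }[/math] is an exponential family in natural parameters, [math]\displaystyle{ p(x,\theta)=\exp(\theta^T c(x) - \psi(\theta)) }[/math], one has [math]\displaystyle{ g_{ij}(\theta) = \frac{\partial^2 \psi(\theta)}{\partial \theta_i \partial \theta_j} }[/math], the Hessian of the normalizing function [math]\displaystyle{ \psi }[/math], which makes the Hellinger/Fisher projection filter particularly convenient on exponential families.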
The projection filter in direct metric, when implemented on a manifold [math]\displaystyle{ S_\Theta }[/math] of mixture families, leads to equivalence with a Galerkin method.[6]
The projection filter in Hellinger/Fisher metric when implemented on a manifold [math]\displaystyle{ S_\Theta^{1/2} }[/math] of square roots of an exponential family of densities is equivalent to the assumed density filters.[3]
One should note that it is also possible to project the simpler Zakai equation for an unnormalized version of the density p. This would result in the same Hellinger projection filter but in a different direct metric projection filter.[6]
Finally, if in the exponential family case one includes among the sufficient statistics of the exponential family the functions appearing in the observation dynamics [math]\displaystyle{ dY_t }[/math], namely the components of [math]\displaystyle{ b(x) }[/math] and [math]\displaystyle{ |b(x)|^2 }[/math], then the correction step in the filtering algorithm becomes exact. In other terms, the projection of the vector field [math]\displaystyle{ G }[/math] is exact, resulting in [math]\displaystyle{ G }[/math] itself. Writing the filtering algorithm in a setting with continuous state [math]\displaystyle{ X }[/math] and discrete time observations [math]\displaystyle{ Y }[/math], one can see that the correction step at each new observation is exact, as the related Bayes formula entails no approximation.[3]
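For instance, for the scalar cubic sensor, where [math]\displaystyle{ b(x)=x^3 }[/math], it suffices to include [math]\displaystyle{ x^3 }[/math] and [math]\displaystyle{ x^6 }[/math] among the sufficient statistics of the exponential family to obtain an exact correction step.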
Now rather than considering the exact filter SPDE in Stratonovich calculus form, one keeps it in Ito calculus form [math]\displaystyle{ d p_t = {\cal L}^*_t\, p_t\, dt + p_t\, \left[ b(\cdot,t) - E_{p_t}[b(\cdot,t)] \right]^T \left[ dY_t - E_{p_t}[b(\cdot,t)]\, dt \right]\ . }[/math]
In the Stratonovich projection filters above, the vector fields [math]\displaystyle{ F }[/math] and [math]\displaystyle{ G }[/math] were projected separately. By definition, the projection is the optimal approximation for [math]\displaystyle{ F }[/math] and [math]\displaystyle{ G }[/math] separately, although this does not imply it provides the best approximation for the filter SPDE solution as a whole. Indeed, the Stratonovich projection, acting on the two terms [math]\displaystyle{ F }[/math] and [math]\displaystyle{ G }[/math] separately, does not guarantee optimality of the solution [math]\displaystyle{ p(\cdot,\theta_{0+\delta t}) }[/math] as an approximation of the exact [math]\displaystyle{ p_{0+\delta t} }[/math] for small [math]\displaystyle{ \delta t }[/math], say. One may look for a norm [math]\displaystyle{ \| \cdot \| }[/math] to be applied to the solution, for which [math]\displaystyle{ \theta_{0+\delta t} \approx \mbox{argmin}_\theta\ \| p_{0+\delta t} - p(\cdot,\theta) \|\ . }[/math]
The Ito-vector projection is obtained as follows. Let us choose a norm for the space of densities, [math]\displaystyle{ \|\cdot \| }[/math], which might be associated with the direct metric or the Hellinger metric.
One chooses the diffusion term in the approximating Ito equation for [math]\displaystyle{ \theta_t }[/math] by minimizing (but not zeroing) the [math]\displaystyle{ \delta t }[/math] term of the Taylor expansion of the mean square error [math]\displaystyle{ E\left[ \left\| p_{0+\delta t} - p(\cdot,\theta_{0+\delta t}) \right\|^2 \right], }[/math] and one then chooses the drift term in the approximating Ito equation by minimizing the [math]\displaystyle{ (\delta t)^2 }[/math] term of the same expansion. Here the [math]\displaystyle{ \delta t }[/math] order term is minimized, not zeroed, so one never attains [math]\displaystyle{ (\delta t)^2 }[/math] convergence, only [math]\displaystyle{ \delta t }[/math] convergence.
A further benefit of the Ito vector projection is that it minimizes the order 1 Taylor expansion in [math]\displaystyle{ \delta t }[/math] of [math]\displaystyle{ E\left[ \left\| p_{0+\delta t} - p(\cdot,\theta_{0+\delta t}) \right\| \right]\ . }[/math]
To achieve [math]\displaystyle{ (\delta t)^2 }[/math] convergence, rather than [math]\displaystyle{ \delta t }[/math] convergence, the Ito-jet projection is introduced. It is based on the notion of metric projection.
The metric projection of a density [math]\displaystyle{ p \in L^2 }[/math] (or [math]\displaystyle{ \sqrt{p} \in L^2 }[/math]) onto the manifold [math]\displaystyle{ S_\Theta }[/math] (or [math]\displaystyle{ S_\Theta^{1/2} }[/math]) is the closest point on [math]\displaystyle{ S_\Theta }[/math] (or [math]\displaystyle{ S_\Theta^{1/2} }[/math]) to [math]\displaystyle{ p }[/math] (or [math]\displaystyle{ \sqrt{p} }[/math]). Denote it by [math]\displaystyle{ \pi(p) }[/math]. The metric projection is, by definition, the best approximation of [math]\displaystyle{ p }[/math] in [math]\displaystyle{ S_\Theta }[/math] according to the chosen metric. Thus the idea is to find a projection filter that comes as close as possible to the metric projection. In other terms, one considers the criterion [math]\displaystyle{ \theta_{0+\delta t} \approx \mbox{argmin}_\theta\ \| \pi(p_{0+\delta t})- p(\cdot,\theta) \|. }[/math]
The detailed calculations are lengthy and laborious,[7] but the resulting approximation achieves [math]\displaystyle{ (\delta t)^2 }[/math] convergence. Indeed, the Ito jet projection attains the following optimality criterion. It zeroes the [math]\displaystyle{ \delta t }[/math] order term and it minimizes the [math]\displaystyle{ (\delta t)^2 }[/math] order term of the Taylor expansion of the mean square distance in [math]\displaystyle{ L^2 }[/math] between [math]\displaystyle{ \pi(p_{0+\delta t}) }[/math] and [math]\displaystyle{ p(\cdot,\theta_{0+\delta t}) }[/math].
Both the Ito vector and the Ito jet projection result in final SDEs, driven by the observations [math]\displaystyle{ dY }[/math], for the parameter [math]\displaystyle{ \theta_t }[/math] that best approximates the exact filter evolution for small times.[7]
Jones and Soatto (2011) mention projection filters as possible algorithms for on-line estimation in visual-inertial navigation,[12] mapping and localization, while again on navigation Azimi-Sadjadi and Krishnaprasad (2005)[13] use projection filter algorithms. The projection filter has also been considered for applications in ocean dynamics by Lermusiaux 2006.[14] Kutschireiter, Rast, and Drugowitsch (2022)[15] refer to the projection filter in the context of continuous time circular filtering. For quantum systems applications, see for example van Handel and Mabuchi (2005),[16] who applied the quantum projection filter to quantum optics, studying a quantum model of optical phase bistability of a strongly coupled two-level atom in an optical cavity. Further applications to quantum systems are considered in Gao, Zhang and Petersen (2019).[17] Ma, Zhao, Chen and Chang (2015) refer to projection filters in the context of hazard position estimation, while Vellekoop and Clark (2006)[18] generalize the projection filter theory to deal with changepoint detection. Harel, Meir and Opper (2015)[19] apply the projection filters in assumed density form to optimal point process filtering, with applications to neural encoding. Broecker and Parlitz (2000)[20] study projection filter methods for noise reduction in chaotic time series. Zhang, Wang, Wu and Xu (2014)[21] apply the Gaussian projection filter as part of their estimation technique to deal with measurements of fiber diameters in melt-blown nonwovens.
Original source: https://en.wikipedia.org/wiki/Projection filters.