Kalman filtered Compressed Sensing

@article{Vaswani2008KalmanFC,
  title={Kalman filtered Compressed Sensing},
  author={Namrata Vaswani},
  journal={2008 15th IEEE International Conference on Image Processing},
  year={2008},
  pages={893-896}
}
  • N. Vaswani
  • Published 4 April 2008
  • Computer Science
  • 2008 15th IEEE International Conference on Image Processing
We consider the problem of reconstructing time sequences of spatially sparse signals (with unknown and time-varying sparsity patterns) from a limited number of linear "incoherent" measurements, in real-time. The signals are sparse in some transform domain referred to as the sparsity basis. For a single spatial signal, the solution is provided by Compressed Sensing (CS). The question that we address is, for a sequence of sparse signals, can we do better than CS, if (a) the sparsity pattern of… 
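The abstract is cut off above. As a rough illustration of the approach it describes (run a Kalman filter restricted to the currently estimated support, and use the measurement residual to detect changes in that support), here is a minimal sketch. The function name kfcs_step, the correlation-threshold addition rule, and all constants are hypothetical simplifications; this is not the paper's exact algorithm.

```python
# Minimal KF-CS-style sketch (hypothetical, simplified): track a slowly changing
# sparse signal by Kalman filtering on the current support estimate and adding /
# deleting support entries via simple thresholds.
import numpy as np

def kfcs_step(x_hat, P, T, y, A, sig_obs2=0.0025, sig_sys2=0.01,
              add_thresh=1.0, del_thresh=0.05):
    n, m = x_hat.size, A.shape[0]
    # Prediction: random-walk dynamics with process noise only on the support T,
    # so coefficients outside T keep zero mean and zero variance.
    Q = np.zeros((n, n))
    Q[T, T] = sig_sys2
    P_pred = P + Q
    # Standard Kalman measurement update.
    S = A @ P_pred @ A.T + sig_obs2 * np.eye(m)
    K = P_pred @ A.T @ np.linalg.solve(S, np.eye(m))
    x_hat = x_hat + K @ (y - A @ x_hat)
    P = (np.eye(n) - K @ A) @ P_pred
    # Support deletion: drop coefficients whose estimate has shrunk to (near) zero.
    dead = T & (np.abs(x_hat) < del_thresh)
    T = T & ~dead
    x_hat[dead] = 0.0
    P[dead, :] = 0.0
    P[:, dead] = 0.0
    # Support addition: look for unexplained structure in the measurement residual.
    corr = np.abs(A.T @ (y - A @ x_hat))
    new = (corr > add_thresh) & ~T
    T = T | new
    P[new, new] = 1.0  # large prior variance so the filter can estimate new entries
    return x_hat, P, T

# Tiny usage example: a (static, for simplicity) 2-sparse signal in R^50,
# observed through 20 noisy random measurements per time step.
rng = np.random.default_rng(0)
n, m, steps = 50, 20, 30
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[[3, 17]] = [2.0, -1.5]
x_hat, P, T = np.zeros(n), np.zeros((n, n)), np.zeros(n, dtype=bool)
for _ in range(steps):
    y = A @ x_true + 0.05 * rng.standard_normal(m)
    x_hat, P, T = kfcs_step(x_hat, P, T, y, A)
print("estimated support:", np.flatnonzero(T))
```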

Analyzing Least Squares and Kalman Filtered Compressed Sensing
  • N. Vaswani
  • Mathematics, Computer Science
    2009 IEEE International Conference on Acoustics, Speech and Signal Processing
  • 2009
TLDR
A solution, called Kalman Filtered Compressed Sensing (KF-CS), is proposed to the problem of causally reconstructing time sequences of spatially sparse signals, with unknown and slowly time-varying sparsity patterns, from a limited number of linear “incoherent” measurements.
Compressed sensing of time-varying signals
TLDR
This paper develops CS algorithms for time-varying signals, based on the least absolute shrinkage and selection operator (Lasso), which has been popular for sparse regression problems, and proposes two algorithms: the Group-Fused Lasso and the Dynamic Lasso.
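A hedged sketch of what a per-frame "dynamic Lasso"-style update could look like, assuming a formulation that penalizes both the sparsity of x_t and the sparsity of the change x_t - x_{t-1}; the exact objectives of the Group-Fused Lasso and Dynamic Lasso in the cited paper may differ, and the function name and regularization weights below are illustrative.

```python
# Dynamic-Lasso-style update for one time instant (illustrative formulation).
import cvxpy as cp

def dynamic_lasso_step(y_t, A, x_prev, lam_sparse=0.1, lam_track=0.1):
    x = cp.Variable(A.shape[1])
    obj = (0.5 * cp.sum_squares(y_t - A @ x)
           + lam_sparse * cp.norm1(x)            # x_t itself is sparse
           + lam_track * cp.norm1(x - x_prev))   # x_t differs from x_{t-1} in few entries
    cp.Problem(cp.Minimize(obj)).solve()
    return x.value
```

Dropping the last term recovers the ordinary per-frame Lasso; the coupling term is what lets the estimate exploit temporal correlation across frames.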
A Simple Method for Sparse Signal Recovery from Noisy Observations Using Kalman Filtering
TLDR
The algorithm is a Kalman filter that utilizes a so-called pseudo-measurement technique for solving the convex minimization problem that follows from the theory of compressed sensing, and is exclusively based on the KF formulation.
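A hedged sketch of the pseudo-measurement idea referred to here: after the ordinary Kalman measurement update, a fictitious scalar measurement "0 = sign(x)^T x + noise" is applied repeatedly, which shrinks the l1 norm of the estimate and promotes sparsity. The function name, the number of inner iterations, and the pseudo-noise variance R_eps are illustrative, not the paper's values.

```python
# Repeated pseudo-measurement updates that softly enforce ||x||_1 ≈ 0.
import numpy as np

def pseudo_measurement_update(x_hat, P, n_iter=20, R_eps=1.0):
    for _ in range(n_iter):
        h = np.sign(x_hat)                    # linearization of ||x||_1 around x_hat
        s = float(h @ P @ h) + R_eps          # innovation variance (scalar measurement)
        k = (P @ h) / s                       # Kalman gain for the pseudo-measurement
        x_hat = x_hat - k * float(h @ x_hat)  # update toward pseudo-measurement value 0
        P = P - np.outer(k, h @ P)            # standard covariance update
    return x_hat, P
```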
LS-CS-Residual (LS-CS): Compressive Sensing on Least Squares Residual
  • N. Vaswani
  • Computer Science
    IEEE Transactions on Signal Processing
  • 2010
TLDR
This work bounds the CS-residual error and shows “stability” of LS-CS over time for a signal model that allows support additions and removals, and that allows coefficients to gradually increase (decrease) until they reach a constant value (become zero).
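A hedged sketch of the "CS on the least squares residual" idea: solve least squares on the current support estimate, run a sparse recovery step on the measurement residual to catch support changes (here plain ISTA for the Lasso, a stand-in rather than necessarily the CS solver analyzed in the paper), and re-estimate by least squares on the updated support. Thresholds, step sizes, and function names are illustrative.

```python
import numpy as np

def soft(v, t):
    # Soft-thresholding operator used by ISTA.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ls_cs_step(y, A, support, lam=0.05, n_ista=200, keep_thresh=0.1):
    n = A.shape[1]
    # Least squares restricted to the current support estimate.
    x_ls = np.zeros(n)
    if support.any():
        x_ls[support] = np.linalg.lstsq(A[:, support], y, rcond=None)[0]
    resid = y - A @ x_ls
    # Sparse recovery on the residual (ISTA iterations for the Lasso).
    beta = np.zeros(n)
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    for _ in range(n_ista):
        beta = soft(beta + step * (A.T @ (resid - A @ beta)), step * lam)
    # Updated support: old support plus entries with large residual corrections.
    support = support | (np.abs(beta) > keep_thresh)
    x_hat = np.zeros(n)
    if support.any():
        x_hat[support] = np.linalg.lstsq(A[:, support], y, rcond=None)[0]
    return x_hat, support
```

On the first frame the support can be all-False, in which case the step reduces to ordinary sparse recovery on y followed by least squares on the detected support.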
Online Recovery of Temporally Correlated Sparse Signals Using Multiple Measurement Vectors
TLDR
This work proposes two online SBL algorithms that operate on the observations in a serial fashion, and illustrates that the mean square error and support recovery performance of the proposed algorithms is very close to that of the offline Kalman SBL algorithm.
Stability of Modified-CS and LS-CS for Recursive Reconstruction of Sparse Signal Sequences
TLDR
This work obtains sufficient conditions for the "stability" of the recently proposed algorithms, Least Squares Compressive Sensing residual (LS-CS) and modified-CS, for recursively reconstructing sparse signal sequences from noisy measurements, and shows that, for a signal model with fixed signal power and support set size; support set changes allowed at every time; and gradual coefficient magnitude increase/decrease, "stability" holds under mild assumptions.
Tracking dynamic sparse signals using Hierarchical Bayesian Kalman filters
TLDR
This work considers the problem of reconstructing time-varying signals for which the support is assumed to be sparse; a hierarchical Bayesian model is used in the tracking process, which succeeds in modelling sparsity.

References

SHOWING 1-10 OF 20 REFERENCES
Bayesian Compressive Sensing
TLDR
The underlying theory, an associated algorithm, example results, and comparisons to other compressive-sensing inversion algorithms in the literature are presented.
Compressed Sensing Image Reconstruction Via Recursive Spatially Adaptive Filtering
TLDR
Overall, the conventional parametric modeling used in CS is replaced by a nonparametric one, and it is shown that the algorithm achieves exact reconstruction of synthetic phantom data even from a very small number of projections.
Exploiting Prior Knowledge in The Recovery of Signals from Noisy Random Projections
TLDR
It is shown that it is possible to exploit prior knowledge (e.g., if the signal is a realization of a stochastic process) to significantly improve reconstruction performance, and this is done in a fashion resembling standard joint source-channel coding of digital sources.
Compressed sensing in dynamic MRI
TLDR
Given sufficient data sparsity and base signal‐to‐noise ratio (SNR), CS is demonstrated to result in improved temporal fidelity compared to k‐t BLAST reconstructions for the example data sets used in this work.
Sparse MRI: The application of compressed sensing for rapid MR imaging
TLDR
Practical incoherent undersampling schemes are developed and analyzed by means of their aliasing interference and demonstrate improved spatial resolution and accelerated acquisition for multislice fast spin‐echo brain imaging and 3D contrast enhanced angiography.
Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
TLDR
It is shown how one can reconstruct a piecewise constant object from incomplete frequency samples - provided that the number of jumps (discontinuities) obeys the condition above - by minimizing other convex functionals such as the total variation of f.
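For orientation, the total-variation recovery program this summary refers to is, in its standard form (restated here, with Ω the set of observed frequencies and ĝ the discrete Fourier transform of g):

```latex
g^{\star} = \arg\min_{g} \ \operatorname{TV}(g)
\quad \text{subject to} \quad \hat{g}(\omega) = \hat{f}(\omega) \ \text{for all } \omega \in \Omega .
```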
Particle Filters for Infinite (or Large) Dimensional State Spaces-Part 2
TLDR
It is proposed to use a simple modification of the asymptotically stable adaptive particle filter to handle errors in estimating the basis dimension, and the implications of weaker assumptions are studied.
Sparse Solution Of Underdetermined Linear Equations By Stagewise Orthogonal Matching Pursuit
TLDR
It is shown that, for systems with ‘typical’/‘random’ Φ, a good approximation to the sparsest solution is obtained by applying a fixed number of standard operations from linear algebra, and a conditioned Gaussian distribution is rigorously derived for the matched filtering coefficients at each stage of the procedure.
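A hedged sketch of a StOMP-style recovery loop: a fixed number of stages, each applying a matched filter to the current residual, thresholding the resulting coefficients, and solving least squares on the accumulated support. The threshold rule below (a multiple of the formal noise level ||r||/sqrt(m)) is one common choice; the exact rule and constants in the paper may differ.

```python
import numpy as np

def stomp(y, Phi, n_stages=10, t=2.5):
    m, n = Phi.shape
    support = np.zeros(n, dtype=bool)
    x = np.zeros(n)
    r = y.copy()
    for _ in range(n_stages):
        c = Phi.T @ r                           # matched filter applied to the residual
        sigma = np.linalg.norm(r) / np.sqrt(m)  # formal noise level of the residual
        support |= np.abs(c) > t * sigma        # keep coefficients above the threshold
        if not support.any():
            break
        x = np.zeros(n)
        x[support] = np.linalg.lstsq(Phi[:, support], y, rcond=None)[0]
        r = y - Phi @ x                         # update the residual and iterate
    return x
```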
An Architecture for Compressive Imaging
TLDR
This paper proposes algorithms and hardware to support a new theory of compressive imaging based on a new digital image/video camera that directly acquires random projections of the signal without first collecting the pixels/voxels.
The Dantzig selector: Statistical estimation when P is much larger than n
In many important statistical applications, the number of variables or parameters p is much larger than the number of observations n. Suppose then that we have observations y = Xβ + z, where β ∈ R^p is a
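The abstract snippet above is truncated. For orientation, the convex program that defines the Dantzig selector (standard formulation, with λ_p typically of order sqrt(2 log p) and σ the noise level) is:

```latex
\hat{\beta} = \arg\min_{\beta \in \mathbb{R}^{p}} \ \|\beta\|_{1}
\quad \text{subject to} \quad \|X^{\top}(y - X\beta)\|_{\infty} \le \lambda_{p}\,\sigma .
```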