Mark Chiew1
1University of Oxford, Oxford, United Kingdom
Synopsis
This talk will provide some intuition behind low-rank methods and an overview of the mechanics involved in reconstruction. We will first present some background on low-rank matrices, then cover general low-rank methods, and finally we will discuss structured low-rank methods.
Target Audience
Physicists and engineers looking for an introduction to low-rank and structured low-rank methods for image reconstruction.
Outline
This talk will focus largely on providing some intuition for low-rank methods, and an overview of the mechanics involved in low-rank and structured low-rank reconstruction approaches.
This will include:
- some background on the SVD and properties of low-rank matrices
- an overview of low-rank image reconstruction approaches
- an introduction to structured low-rank reconstruction methods
Background
The singular value decomposition (SVD) decomposes any matrix $$$X$$$ as:
$$X=U\Sigma{}V^{H}$$
where $$$U$$$ is the matrix of left singular vectors, $$$V$$$ is the matrix of right singular vectors, and $$$\Sigma{}$$$ is a diagonal matrix of scaling factors called singular values. Here, $$$V^H$$$ denotes the conjugate transpose of $$$V$$$.
The SVD provides direct access to the rank of a matrix, as the number of non-zero singular values. When a matrix has low rank, it has fewer degrees of freedom (DOF) than the number of matrix entries suggests. In fact, a rank $$$r$$$ matrix of dimension $$$m\times{}n$$$ has $$$DOF=r(m+n-r)$$$, which means that it is possible to fully characterize the matrix with fewer independent parameters; in other words, the matrix is compressible. Another way of thinking about low-rank matrices is that their rows and columns are linearly dependent, so there is redundancy across entries. Furthermore, since the left and right singular vectors form bases for the column and row spaces of the matrix respectively, and the dimensionality of these subspaces equals the rank, a low-rank matrix has low-dimensional column and row subspaces.
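As a minimal illustration (a Python/NumPy sketch; the matrix sizes and random factors are arbitrary choices, not from the talk), the rank can be read directly off the singular values, and truncating the SVD to rank $$$r$$$ gives the best rank-$$$r$$$ approximation (exact here, since the matrix is constructed to have rank 3):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build an exactly rank-3 matrix of size 100 x 50 as a product of
# two thin factors: X = A @ B, with A (100 x 3) and B (3 x 50).
m, n, r = 100, 50, 3
X = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))

# The SVD exposes the rank as the number of non-zero singular values.
U, s, Vh = np.linalg.svd(X, full_matrices=False)
print(np.sum(s > 1e-10 * s[0]))   # -> 3

# Degrees of freedom: r(m + n - r) = 3(100 + 50 - 3) = 441,
# far fewer than the m*n = 5000 matrix entries.
print(r * (m + n - r))

# Truncating the SVD to rank r reconstructs X exactly here; for a
# general matrix it gives the best rank-r approximation.
X_r = U[:, :r] @ np.diag(s[:r]) @ Vh[:r, :]
print(np.allclose(X, X_r))        # -> True
```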
We can think of low-rank methods as somewhat analogous to sparse methods, in that the vector of singular values along the diagonal of $$$\Sigma{}$$$ is sparse for a low-rank matrix. However, one important difference is that, unlike conventional compressed sensing, we only need to know that a low-rank representation exists; we do not need to know anything about the column or row subspaces a priori.
Low-Rank Methods
The basic steps involved in low-rank reconstruction are:
- Identify the low-rank matrix data. This can include (space x time)1, (space x spectra)2, (space x coils)3, (space x TE)4, or (space x diffusion encoding)5, just to name a few. Sometimes the data matrix is referred to as the Casorati matrix.
- Construct the forward measurement model, which maps the image to the measured k-space. Typically this involves coil sensitivity encoding, a Fourier transform, and a sampling mask (or a non-Cartesian encoding model). Although the forward encoding operator is not usually explicitly constructed, it is useful to determine its adjoint or conjugate transpose, which is often used in calculating the gradient of the data consistency term.
- Choose a suitable cost function with a low-rank constraint. Options include the convex nuclear norm or a non-convex rank constraint, enforced either globally or locally (locally low-rank).
- Minimize the cost function with an appropriate algorithm, usually an iterative first-order method (using gradient information only). Two commonly used approaches are an SVD-free matrix factorization, with factor dimensions chosen to enforce the low-rank constraint by construction, and an SVD-based singular value soft-thresholding approach, which solves the nuclear norm minimization problem (a sketch of the latter follows this list).
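To make these steps concrete, here is a minimal sketch in Python/NumPy of the SVD-based approach, under simplifying assumptions not from the talk: a single-coil Cartesian model with a 1D orthonormal DFT along the pixel dimension standing in for the full spatial encoding, and placeholder regularization and iteration parameters. It alternates a gradient step on the data consistency term with singular value soft-thresholding, the proximal operator of the nuclear norm:

```python
import numpy as np

def svt(X, tau):
    """Singular value soft-thresholding: the proximal operator
    of tau * (nuclear norm)."""
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vh

def low_rank_recon(y, mask, lam=0.05, n_iter=100):
    """Recover a (pixels x frames) Casorati matrix X from undersampled
    k-t data y (zeros where unsampled), by solving
        min_X 0.5 * ||M F X - y||_F^2 + lam * ||X||_*
    with proximal gradient descent. F is a 1D orthonormal DFT along
    the pixel dimension (a stand-in for the full spatial encoding),
    and M is the binary sampling mask (same shape as y)."""
    A  = lambda X: mask * np.fft.fft(X, axis=0, norm="ortho")
    Ah = lambda Y: np.fft.ifft(mask * Y, axis=0, norm="ortho")

    X = Ah(y)                   # zero-filled initial estimate
    for _ in range(n_iter):
        X = X - Ah(A(X) - y)    # gradient step (unit step is safe:
                                # the operator norm of MF is <= 1 here)
        X = svt(X, lam)         # proximal step on the nuclear norm
    return X
```

The SVD-free alternative instead factorizes $$$X=LR^{H}$$$ with a fixed inner dimension $$$r$$$ and alternates updates of the two factors, so the rank constraint holds by construction and no per-iteration SVD is required.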
Structured Low-Rank Methods
These are similar to general low-rank methods, except that it is not the data matrix itself that is low-rank, but rather a transformation of the input data. One advantage of these approaches is that they do not require multi-dimensional data: a structured low-rank matrix can be generated from the k-space of a single image. Typically, these matrices are formed by sliding a kernel window across k-space, unravelling the points in each kernel into a vector, and stacking these vectors across kernel locations, which produces a Hankel structure. For 2D k-space, this results in a block-Hankel structure, as sketched below.
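A sketch of this construction in Python/NumPy (the kernel size and test image are illustrative assumptions): each kernel location contributes one vectorized window of the 2D k-space as a row of the structured matrix. A sparse image, whose k-space is a sum of a few complex exponentials, yields a matrix whose rank is bounded by the number of sources:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def block_hankel(kspace, kx, ky):
    """Form a block-Hankel matrix from 2D k-space by stacking
    vectorized kx-by-ky sliding windows, one row per kernel
    location. Shape: ((Nx-kx+1)*(Ny-ky+1), kx*ky)."""
    windows = sliding_window_view(kspace, (kx, ky))
    return windows.reshape(-1, kx * ky)

# Example: an image with 4 point sources. Its k-space is a sum of
# 4 complex exponentials, and each exponential contributes a rank-1
# component, so the block-Hankel matrix has rank 4.
x = np.zeros((64, 64))
x[[10, 20, 40, 50], [12, 33, 7, 44]] = 1.0
k = np.fft.fft2(x, norm="ortho")
H = block_hankel(k, 6, 6)
s = np.linalg.svd(H, compute_uv=False)
print(H.shape, int((s > 1e-8 * s[0]).sum()))  # -> (3481, 36) 4
```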
The low-rank nature of structured matrices in MR reconstruction problems is less intuitive than in general low-rank methods. However, it can be shown that constraints such as limited signal support6, smooth phase6,7,8, coil sensitivities9,10,11,12, and transform sparsity13,14 give rise to low-rank Hankel-structured matrices formed from k-space signals.
From a reconstruction perspective, once the structured low-rank representation is identified, the mechanics of rank-constrained optimization are similar to the general low-rank case, except that the low-rank constraint is imposed on the Hankel-transformed matrix. In this case, the linear Hankel operator and its conjugate transpose (or pseudoinverse) are needed, as sketched below.
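Matching the hypothetical `block_hankel` function above, the adjoint scatters each row back into its window position and sums the overlapping contributions; normalizing by each entry's multiplicity turns this into the pseudoinverse used to map a rank-thresholded matrix back to k-space:

```python
import numpy as np

def block_hankel_adjoint(H, shape, kx, ky):
    """Adjoint of block_hankel: scatter each row of H back into its
    kx-by-ky window position and sum overlapping contributions."""
    Nx, Ny = shape
    k = np.zeros(shape, dtype=H.dtype)
    rows = H.reshape(Nx - kx + 1, Ny - ky + 1, kx, ky)
    for p in range(Nx - kx + 1):
        for q in range(Ny - ky + 1):
            k[p:p + kx, q:q + ky] += rows[p, q]
    return k

def block_hankel_pinv(H, shape, kx, ky):
    """Pseudoinverse: average, rather than sum, the overlapping
    window contributions (divide by each entry's multiplicity)."""
    counts = block_hankel_adjoint(np.ones_like(H, dtype=float),
                                  shape, kx, ky)
    return block_hankel_adjoint(H, shape, kx, ky) / counts
```

A typical structured low-rank iteration then alternates forming the Hankel matrix, thresholding its singular values, mapping back via the pseudoinverse, and re-imposing data consistency on the sampled k-space entries.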
Further Reading
For further reading, the referenced articles15,16,17 are excellent reviews of the topics covered in this talk.
References
1. Liang, Zhi-Pei. ‘Spatiotemporal Imaging with Partially Separable Functions’. IEEE International Symposium on Biomedical Imaging (ISBI), 2007, 988–91.
2. Lam, Fan, and Zhi-Pei Liang. ‘A Subspace Approach to High-Resolution Spectroscopic Imaging’. Magnetic Resonance in Medicine 71, no. 4 (2014): 1349–57.
3. Trzasko, J. D., and A. Manduca. ‘Calibrationless Parallel MRI Using CLEAR’. 2011 Conference Record of the Forty Fifth Asilomar Conference on Signals, Systems and Computers (ASILOMAR), 75–79.
4. Zhang, Tao, John M. Pauly, and Ives R. Levesque. ‘Accelerating Parameter Mapping with a Locally Low Rank Constraint’. Magnetic Resonance in Medicine 73, no. 2 (2015): 655–61.
5. Gao, Hao, Longchuan Li, Kai Zhang, Weifeng Zhou, and Xiaoping Hu. ‘PCLR: Phase-Constrained Low-Rank Model for Compressive Diffusion-Weighted MRI’. Magnetic Resonance in Medicine 72, no. 5 (2014): 1330–41.
6. Haldar, Justin P. ‘Low-Rank Modeling of Local k-Space Neighborhoods (LORAKS) for Constrained MRI.’ IEEE Transactions on Medical Imaging 33, no. 3 (2014): 668–81.
7. Lee, Juyoung, Kyong Hwan Jin, and Jong Chul Ye. ‘Reference-Free Single-Pass EPI Nyquist Ghost Correction Using Annihilating Filter-Based Low Rank Hankel Matrix (ALOHA)’. Magnetic Resonance in Medicine 76, no. 6 (2016): 1775–89.
8. Mani, Merry, Mathews Jacob, Douglas Kelley, and Vincent Magnotta. ‘Multi-Shot Sensitivity-Encoded Diffusion Data Recovery Using Structured Low-Rank Matrix Completion (MUSSELS)’. Magnetic Resonance in Medicine 78, no. 2 (2017): 494–507.
9. Zhang, Jian, Chunlei Liu, and Michael E Moseley. ‘Parallel Reconstruction Using Null Operations.’ Magnetic Resonance in Medicine 66, no. 5 (2011): 1241–53.
10. Uecker, M, P Lai, M J Murphy, P Virtue, M Elad, John M Pauly, S S Vasanawala, and M Lustig. ‘ESPIRiT - An Eigenvalue Approach to Autocalibrating Parallel MRI: Where SENSE Meets GRAPPA’. Magnetic Resonance in Medicine 71, no. 3 (2014): 990–1001.
11. Shin, Peter J, Peder E Z Larson, Michael A Ohliger, Michael Elad, John M Pauly, Daniel B Vigneron, and Michael Lustig. ‘Calibrationless Parallel Imaging Reconstruction Based on Structured Low-Rank Matrix Completion.’ Magnetic Resonance in Medicine 72, no. 4 (2014): 959–70.
12. Haldar, Justin P, and Jingwei Zhuo. ‘P-LORAKS: Low-Rank Modeling of Local k-Space Neighborhoods with Parallel Imaging Data.’ Magnetic Resonance in Medicine 75, no. 4 (2016): 1499–1514.
13. Ongie, Greg, and Mathews Jacob. ‘Off-the-Grid Recovery of Piecewise Constant Images from Few Fourier Samples’. SIAM Journal on Imaging Sciences 9, no. 3 (2016): 1004–41.
14. Jin, K. H., D. Lee, and Jong Chul Ye. ‘A General Framework for Compressed Sensing and Parallel MRI Using Annihilating Filter Based Low-Rank Hankel Matrix’. IEEE Transactions on Computational Imaging 2, no. 4 (2016): 480–95.
15. Christodoulou, A. G., and S. G. Lingala. ‘Accelerated Dynamic Magnetic Resonance Imaging Using Learned Representations: A New Frontier in Biomedical Imaging’. IEEE Signal Processing Magazine 37, no. 1 (2020): 83–93.
16. Haldar, J. P., and K. Setsompop. ‘Linear Predictability in Magnetic Resonance Imaging Reconstruction: Leveraging Shift-Invariant Fourier Structure for Faster and Better Imaging’. IEEE Signal Processing Magazine 37, no. 1 (2020): 69–82.
17. Jacob, M., M. P. Mani, and J. C. Ye. ‘Structured Low-Rank Algorithms: Theory, Magnetic Resonance Applications, and Links to Machine Learning’. IEEE Signal Processing Magazine 37, no. 1 (2020): 54–68.