Sensors attached to the skin can ‘spy’ on clinical ultrasound scanners, allowing the position and orientation of the ultrasound probe to be inferred. These MR-compatible sensors can emit and/or receive ultrasound waves; they are small (about 3×3×1 cm) and easily fixed to the torso. In transmit-receive mode they can monitor physiological motion, and in receive-only mode they can monitor the motion of an ultrasound probe. The sensors can accompany subjects to serial MRI and ultrasound imaging exams and act as a common denominator between the two, allowing ultrasound images to be placed in the spatial context of MRI.
The 3D orientation of clinical 2D ultrasound images is not typically known with any certainty, as the imaging plane is determined by the moving hand of the technologist holding the probe. For this reason, placing ultrasound images in the spatial context of pre-acquired 3D MRI can be challenging. A number of solutions have been introduced in the literature [1-8], and some have evolved into commercial products such as BrainLab, Veran Medical, SurgiVision or ClearGuide. Although impressive, these technologies tend to rely on optical tracking, electromagnetic tracking, inertial sensors and/or cameras, each with its own strengths and limitations, such as line-of-sight requirements and/or limited MR compatibility. The present approach can also track ultrasound imaging probes in 3D, and thus enable ultrasound-MRI fusion, but it does so in a very different manner: it uses the ultrasound signal from the ultrasound scanner itself to locate the imaging probe, in a way reminiscent of passive sonar [9].
Small MR-compatible sensors able to emit and/or receive ultrasound waves were developed that can easily be attached to the skin. These sensors are similar to transmit-receive devices previously used for physiological motion monitoring [10], but a receive-only mode of operation is used here instead, essentially to ‘spy’ on ultrasound scanners. These sensors are meant to accompany human subjects to serial MRI and ultrasound exams, and to act as a common denominator between the two.
Small sensors, about 3×3×1 cm in size, were created using single-element MR-compatible transducers, 3D-printed capsules, ultrasound gel, special membranes to contain the gel, and tape for adhesion onto the skin (Fig. 1a,b). A switching circuit was developed (Fig. 1c) that allows up to four sensors to be used at a time and provides a choice between ‘transmit-receive’ and ‘receive-only’ modes. As shown in Fig. 1d, a further device was designed, 3D-printed, cabled and clipped to the imaging probe; every time the probe transmitted a beam of energy into the imaged object, this device detected the associated electromagnetic activity. As shown in Fig. 2, different beams sent in different directions involve the various elements on the face of the probe in different ways, so that the distance d between source and listening device varies with beam number. ‘Delay vs. beam#’ matrices were obtained from the ‘spying’ device at the same rate as the imaging, about 20 fps. Signals in this ‘delay vs. beam#’ space proved highly sensitive to the probe position/orientation, and were fed to a convolutional neural network (CNN) that generated predictions for the probe position/orientation (Fig. 3).
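As a concrete illustration of this processing chain, the minimal Python sketch below builds one ‘delay vs. beam#’ frame from per-beam waveforms recorded by a skin sensor (each trace aligned to the transmit trigger detected by the clip-on device) and regresses a 6-DoF probe pose with a small CNN. The beam count, delay range, array shapes, network architecture and all function names are illustrative assumptions, not the actual implementation used in this work.

# Illustrative sketch only: shapes, sizes and architecture are assumptions,
# not the implementation described in the text.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import hilbert

N_BEAMS = 128    # assumed number of transmit beams per imaging frame
N_DELAYS = 512   # assumed number of delay samples kept per beam

def delay_beam_matrix(waveforms: np.ndarray) -> np.ndarray:
    """waveforms: (N_BEAMS, n_samples) traces from one skin sensor, each
    aligned to the transmit trigger detected by the clip-on device.
    Returns a normalized (N_BEAMS, N_DELAYS) 'delay vs. beam#' image."""
    env = np.abs(hilbert(waveforms, axis=1))   # signal envelope per beam
    env = env[:, :N_DELAYS]                    # keep the early delays
    return env / (env.max() + 1e-12)           # normalize per frame

class PoseCNN(nn.Module):
    """Maps a (1, N_BEAMS, N_DELAYS) frame to 6 pose parameters
    (3 translations + 3 rotation angles); purely illustrative."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        self.head = nn.Linear(64 * 4 * 4, 6)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

# Stand-in data: one frame of random traces, then a pose prediction.
frame = delay_beam_matrix(np.random.randn(N_BEAMS, 2048))
pose = PoseCNN()(torch.from_numpy(frame).float()[None, None])  # shape (1, 6)

In practice such a network would be trained against ground-truth poses from an external tracker, for instance the optical system used for validation below.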
Several experiments were performed in gel phantoms. Furthermore, following informed consent, one healthy volunteer with three sensors attached to the skin was scanned by MRI (Siemens Verio) and then by ultrasound (Profocus Ultraview, BK Medical). The sensors were visible under MRI; more specifically, the water-based ultrasound gel they contain was visible. Optical tracking of the probe during ultrasound imaging was also performed for validation purposes (Polaris Vicra, NDI).
Results from phantom experiments are summarized in Fig. 4. The black rectangular cuboid in Fig. 4a represents the gel phantom, and the green disk attached to it represents the ultrasound-based sensor. The ‘spied’ signals in ‘delay vs. beam#’ space were converted, through the CNN, into predictions for the position/orientation of the imaging probe.
Two imaging probes are represented in Fig. 4a: one displayed in green, the other in yellow. The position/orientation of the green one is as predicted by the proposed sensor-based approach, while the yellow one is shown at the position/orientation measured by optical tracking. A few more time points are displayed in Fig. 4b. When the two probe representations mostly overlapped, they appeared more or less as a single probe in the display, with mixed green-yellow coloring, indicating close agreement between the two tracking methods. A first human dataset is shown in Fig. 5, in which spoiled gradient-echo images were acquired with three ultrasound-based sensors in place, followed by ultrasound imaging with the sensors still in place.
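One way to quantify the green/yellow agreement seen in Fig. 4 is to compute the translation and rotation discrepancy between the sensor-predicted and optically tracked probe poses. The short sketch below does this with standard rigid-transform tools; the function name and example values are illustrative, not results from this work.

# Illustrative agreement metric between two probe poses; not the
# actual evaluation code from this work.
import numpy as np
from scipy.spatial.transform import Rotation

def pose_error(t_pred, R_pred, t_opt, R_opt):
    """t_*: (3,) probe positions in mm; R_*: (3, 3) rotation matrices.
    Returns (translation error in mm, rotation error in degrees)."""
    dt = np.linalg.norm(t_pred - t_opt)          # Euclidean distance
    dR = Rotation.from_matrix(R_pred.T @ R_opt)  # relative rotation
    return dt, np.degrees(dR.magnitude())        # geodesic angle

# Example: a 2 mm offset combined with a 5-degree rotation about one axis.
R_opt = Rotation.from_euler("z", 5, degrees=True).as_matrix()
print(pose_error(np.array([0.0, 0.0, 2.0]), np.eye(3), np.zeros(3), R_opt))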
1. Park S, Jang J, Kim J, Kim YS, Kim C. Real-time Triple-modal Photoacoustic, Ultrasound, and Magnetic Resonance Fusion Imaging of Humans. IEEE Trans Med Imaging 2017;36(9):1912-1921.
2. Yaniv Z, Wilson E, Lindisch D, Cleary K. Electromagnetic tracking in the clinical environment. Med Phys 2009;36(3):876-892.
3. Sun SY, Gilbertson M, Anthony BW. Probe localization for freehand 3D ultrasound by tracking skin features. Med Image Comput Comput Assist Interv 2014;17(Pt 2):365-372.
4. Horvath S, Galeotti J, Wang B, Perich M, Wang J, Siegel M, Vescovi P, Stetten G. Towards an Ultrasound Probe with Vision: Structured Light to Determine Surface Orientation. Workshop on Augmented Environment for Computer-Assisted Interventions, 2011:58-64.
5. Sun S-Y, Gilbertson M, Anthony B. 6-DOF probe tracking via skin mapping for freehand 3D ultrasound. IEEE Intl Symp on Biomedical Imaging. San Francisco, CA, USA, 2013.
6. Goldsmith A, Pedersen P, Szabo T. An inertial-optical tracking system for portable, quantitative, 3D ultrasound. IEEE Ultrasonics Symposium, 2008.
7. Stolka P, Kang H-J, Choti M, Boctor E. Multi-DoF probe trajectory reconstruction with local sensors for 2D-to-3D ultrasound. IEEE Intl Symp on Biomedical Imaging, 2010.
8. Lasso A, Heffter T, Rankin A, Pinter C, Ungi T, Fichtinger G. PLUS: open-source toolkit for ultrasound-guided intervention systems. IEEE Trans Biomed Eng 2014;61(10):2527-2537.
9. Maranda B. Passive sonar. In: Handbook of Signal Processing in Acoustics. Springer; 2008. p. 1757-1781.
10. Preiswerk F, Toews M, Cheng CC, Chiou JG, Mei CS, Schaefer LF, Hoge WS, Schwartz BM, Panych LP, Madore B. Hybrid MRI-Ultrasound acquisitions, and scannerless real-time imaging. Magn Reson Med 2017;78(3):897-908.