0968

Combining MR and ultrasound imaging, through sensor-based probe tracking
Bruno Madore1, Cheng-Chieh Cheng1, and Frank Preiswerk1

1Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, MA, United States

Synopsis

Sensors attached to the skin can ‘spy’ on commercial ultrasound scanners, allowing the position and orientation of the ultrasound probe to be inferred. These MR-compatible sensors can emit and/or receive ultrasound waves; they are small (about 3×3×1 cm) and easily fixed to the torso. In transmit-receive mode they can monitor physiological motion, and in receive-only mode they can monitor the motion of an ultrasound probe. The sensors can accompany subjects to serial MRI and ultrasound imaging exams and act as a common denominator between the two, allowing ultrasound images to be placed in the spatial context of MRI.

Introduction

The 3D orientation of clinical 2D ultrasound images is not typically known with any certainty, as the imaging plane is determined by the moving hand of the technologist holding the probe. For this reason, placing ultrasound images in the spatial context of pre-acquired 3D MRI can be challenging. A number of solutions have been introduced in the literature1-8 and some have evolved into commercial products such as BrainLab, Veran Medical, SurgiVision or ClearGuide. Although impressive, these technologies tend to rely on optical tracking, electromagnetic tracking, inertial sensors and/or cameras, which bring their own sets of strengths and limitations, such as line-of-sight requirements and/or limited MR compatibility. The present approach can also track ultrasound imaging probes in 3D, and thus enable ultrasound-MRI fusion, but it does so in a very different manner: it uses the ultrasound signal from the ultrasound scanner itself to locate the imaging probe, in a way reminiscent of passive sonar9.

Small MR-compatible sensors able to emit and/or receive ultrasound waves were developed that can easily be attached to the skin. These sensors are similar to transmit-receive devices previously used for physiological motion monitoring10, but a receive-only mode of operation is used here instead to essentially ‘spy’ on ultrasound scanners. These sensors are meant to accompany human subjects to serial MRI and ultrasound exams, and to act as a common denominator between the two.

Methods

Small sensors, about 3×3×1 cm in size, were created using single-element MR-compatible transducers, 3D-printed capsules, ultrasound gel, special membranes to contain the gel, and tape for adhesion onto the skin (Fig. 1a,b). A switching circuit was developed (Fig. 1c) allowing up to 4 sensors to be used at a time, and providing a choice between ‘transmit-receive’ and ‘receive-only’ modes. As shown in Fig. 1d, a new device was designed, 3D-printed, cabled and clipped to the imaging probe; every time the probe transmitted a beam of energy into the imaged object, this device detected the associated electromagnetic activity. As shown in Fig. 2, different beams sent in different directions involve the various elements on the face of the probe in different ways, so that the distance d between source and listening device varies with beam number. ‘Delay vs. beam#’ matrices were obtained from the ‘spying’ device at the same rate as the imaging, about 20 fps. Signals in ‘delay vs. beam#’ space proved very sensitive to the probe position/orientation, and were fed to a convolutional neural network (CNN) to generate predictions for that position/orientation (Fig. 3).
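The geometric intuition behind the ‘delay vs. beam#’ signature can be sketched in code. The toy model below is our own illustration, not the authors’ implementation: it assumes each beam fires from a patch whose center sweeps linearly across the probe face, with the skin sensor hearing each firing after a time-of-flight delay d/c at 1540 m/s; the function name and geometry parameters (aperture size, in-plane probe orientation) are hypothetical.

```python
import numpy as np

C = 1540.0  # m/s, approximate speed of sound in soft tissue

def delay_signature(probe_pos, probe_dir, sensor_pos,
                    n_beams=200, aperture_m=0.04):
    """Toy model: one arrival delay per beam number.

    Each beam is assumed to fire from a patch whose center sweeps
    linearly across the probe aperture; the sensor hears the firing
    after a delay d/C, where d is the patch-to-sensor distance.
    """
    # patch centers swept across the probe face, one per beam
    lateral = np.linspace(-aperture_m / 2, aperture_m / 2, n_beams)
    # in-plane axis perpendicular to the probe direction (assumes the
    # imaging plane is horizontal, purely for illustration)
    perp = np.array([-probe_dir[1], probe_dir[0], 0.0])
    perp /= np.linalg.norm(perp)
    patch_centers = probe_pos + lateral[:, None] * perp
    d = np.linalg.norm(patch_centers - sensor_pos, axis=1)  # meters
    return d / C  # seconds: the 'delay vs. beam#' signature

# Example: probe at the origin, sensor a few centimeters away
delays = delay_signature(np.array([0.0, 0.0, 0.0]),
                         np.array([1.0, 0.0, 0.0]),
                         np.array([0.05, 0.02, 0.0]))
```

Because the signature changes whenever the probe moves relative to the sensor, a trained CNN can attempt the inverse mapping, from measured signatures back to probe position/orientation.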

Several experiments were performed in gel phantoms. Furthermore, one healthy volunteer was scanned following informed consent, with three sensors attached to the skin, first by MRI (Verio, Siemens) and then by ultrasound (Profocus Ultraview, BK Medical). The sensors were visible under MRI; more specifically, the water-based ultrasound gel they contain was visible. Optical tracking of the probe during ultrasound imaging was also performed for validation purposes (Polaris Vicra, NDI).

Results

Results from phantom experiments are summarized in Fig. 4. The black rectangular cuboid in Fig. 4a represents the gel phantom, and the green disk attached to it represents the ultrasound-based sensor. The ‘spied’ signals in ‘delay vs. beam#’ space were converted, through a CNN, into predictions for the position/orientation of the imaging probe.

Two imaging probes are represented in Fig. 4a: one is displayed in green, the other in yellow. The position/orientation of the green one is as predicted by our proposed sensor-based approach, while the yellow one is shown at the position/orientation measured by optical tracking. A few more time points are displayed in Fig. 4b. When the two probe representations mostly overlapped, they appeared more or less as a single probe in the display, with a mixed green-yellow coloring pattern. A first human dataset is shown in Fig. 5, in which spoiled gradient-echo images were acquired with three ultrasound-based sensors in place, followed by ultrasound imaging with the sensors still in place.

Discussion

Results in Fig. 4 were considered very promising, in the sense that ultrasound signals ‘spied’ by an ultrasound-based sensor attached to the skin became a surrogate for a bulkier, less convenient and more expensive optical tracking system. Figure 4 suggests that these sensors can provide the information needed to locate/orient 2D US images in the spatial context of pre-acquired MRI volumes, a proposition to be further tested through datasets such as that introduced in Fig. 5.

Conclusion

Small MR-compatible ultrasound-based sensors originally developed to monitor physiological motion can be used to monitor the motion of ultrasound-imaging probes, potentially enabling a convenient approach for MRI-ultrasound image fusion.

Acknowledgements

Financial support from grants NIH P41EB015898 and R03EB025546 and GPU donation from NVIDIA are acknowledged.

References

1. Park S, Jang J, Kim J, Kim YS, Kim C. Real-time Triple-modal Photoacoustic, Ultrasound, and Magnetic Resonance Fusion Imaging of Humans. IEEE Trans Med Imaging 2017;36(9):1912-1921.

2. Yaniv Z, Wilson E, Lindisch D, Cleary K. Electromagnetic tracking in the clinical environment. Med Phys 2009;36(3):876-892.

3. Sun SY, Gilbertson M, Anthony BW. Probe localization for freehand 3D ultrasound by tracking skin features. Med Image Comput Comput Assist Interv 2014;17(Pt 2):365-372.

4. Horvath S, Galeotti J, Wang B, Perich M, Wang J, Siegel M, Vescovi P, Stetten G. Towards an Ultrasound Probe with Vision: Structured Light to Determine Surface Orientation. Workshop on Augmented Environment for Computer-Assisted Interventions, 2011:58-64.

5. Sun S-Y, Gilbertson M, Anthony B. 6-DOF probe tracking via skin mapping for freehand 3D ultrasound. IEEE Intl Symp on Biomedical Imaging. San Francisco, CA, USA, 2013.

6. Goldsmith A, Pedersen P, Szabo T. An inertial-optical tracking system for portable, quantitative, 3D ultrasound. IEEE Ultrasonics Symposium, 2008.

7. Stolka P, Kang H-J, Choti M, Boctor E. Multi-DoF probe trajectory reconstruction with local sensors for 2D-to-3D ultrasound. IEEE Intl Symp on Biomed Imag, 2010.

8. Lasso A, Heffter T, Rankin A, Pinter C, Ungi T, Fichtinger G. PLUS: open-source toolkit for ultrasound-guided intervention systems. IEEE Trans Biomed Eng 2014;61(10):2527-2537.

9. Maranda B. Passive Sonar. Handbook of Signal Processing in Acoustics: Springer; 2008. p 1757-1781.

10. Preiswerk F, Toews M, Cheng CC, Chiou JG, Mei CS, Schaefer LF, Hoge WS, Schwartz BM, Panych LP, Madore B. Hybrid MRI-Ultrasound acquisitions, and scannerless real-time imaging. Magn Reson Med 2017;78(3):897-908.

Figures

Fig. 1: a) A sensor was created which includes an MR-compatible transducer, a 3D-printed capsule, ultrasound gel, a membrane to hold the gel, and adhesive tape. b) The sensor is attached to the skin by exposing the tape, pressing the sensor onto the skin, and turning the lid to press the transducer onto the skin for improved acoustic coupling. c) A switching circuit can accommodate up to four sensors and toggle between ‘transmit-receive’ and ‘receive-only’ modes. d) A device was designed, 3D-printed, cabled and clipped to the imaging probe to detect the electromagnetic activity associated with beam firings.

Fig. 2: Acquiring an ultrasound image typically involves firing about 200 ultrasound beams into the imaged object, in different directions. Returning signals are typically received for roughly 0.25 ms after each beam, which means that 200 beams are acquired in roughly 50 ms, allowing ultrasound imaging to operate at roughly 20 fps. The face of the probe consists of many separate transducer elements, which are involved to varying degrees depending on the direction of the transmitted beam. As a result, the distance d between the most active patch on the face of the probe and the sensor varies with beam number.
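The timing arithmetic in this caption can be checked directly; all numbers below are taken from the caption itself:

```python
n_beams = 200                 # beams fired per image frame
receive_window_s = 0.25e-3    # seconds of listening after each firing

frame_time_s = n_beams * receive_window_s  # 0.05 s per frame
frame_rate_fps = 1.0 / frame_time_s        # 20 frames per second
```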

Fig. 3: The device from Fig. 1d captured the exact timing of ultrasound firings. Ultrasound signals travel at about 1540 m/s in tissue, and as such there is a delay between transducer firing and the arrival of signals at the sensor. Because the distance d to be traversed varies with beam number (Fig. 2), the travel time also varies with beam number, and the distribution of signals in ‘delay vs. beam#’ space is exquisitely sensitive to the position/orientation of the imaging probe. These signals are input into a trained CNN to generate predictions of probe position and orientation.
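For a sense of scale, the time-of-flight delay at the caption’s 1540 m/s can be computed for a probe-to-sensor distance of 5 cm (the distance itself is our hypothetical example, not a value from the abstract):

```python
c = 1540.0   # m/s, approximate speed of sound in soft tissue
d = 0.05     # m, hypothetical probe-to-sensor distance (5 cm)

delay_s = d / c        # time of flight: about 32.5 microseconds
per_mm_s = 0.001 / c   # each extra millimeter adds about 0.65 microseconds
```

Delays of tens of microseconds, shifting by fractions of a microsecond per millimeter of probe motion, are what make the ‘delay vs. beam#’ distribution so sensitive to probe position/orientation.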

Fig. 4: a,b) The main results obtained here involved tracking the position/orientation of the imaging probe based on ‘spied’ sensor signals, in phantom experiments, as compared to optical tracking. The location/orientation measured with optical tracking is shown in yellow and the sensor-based prediction in green. Whenever sensor-based predictions and optical tracking agreed, the green and yellow probes mostly overlapped and visually combined into a single probe with a mixed yellow-green pattern. Although a movie format may better suit these results, the few frames shown in (b) capture the fact that the two probes mostly overlapped, most of the time.

Fig. 5: A main goal for the method from Fig. 4 is to allow 2D ultrasound images to be placed in the spatial context of pre-acquired 3D MRI, using convenient and inexpensive hardware. The present sensors, originally developed to monitor physiological motion, can alternate between physiological monitoring (transmit-receive mode) and probe tracking (receive-only mode). IRB approval was obtained to gather MRI and ultrasound imaging data in volunteers, with sensors in place for both exams. Such data are needed for training a new CNN, and for moving from gel-phantom applications (as in Fig. 4) to in vivo applications.

Proc. Intl. Soc. Mag. Reson. Med. 27 (2019)