3101

A Non-Intrusive Eye Tracking Based MRI Compatible VR System
Kun Qian1, Tomoki Arichi1, Jonathan Eden2, Sofia Dall'Orso2, Rui Pedro A G Teixeira1, Kawal Rhode1, Mark Neil3, Etienne Burdet2, A David Edwards1, and Jo V Hajnal1

1School of Biomedical Engineering and Imaging Sciences, King's College London, London, United Kingdom, 2Department of Bioengineering, Imperial College London, London, United Kingdom, 3Department of Physics, Imperial College London, London, United Kingdom

Synopsis

Achieving compatibility of VR systems with MRI scanners is challenging, and for applications such as fMRI it is highly desirable to avoid local distortions of the static magnetic field. We have developed a non-intrusive MR compatible VR system which avoids disturbing the magnetic environment and uses eye tracking as the main interface. Our system demonstrates a capability to bring the VR world into MRI systems, including dynamic interaction with VR content based on gaze, with performance competitive with a current leading commercial gaming eye tracker.

Introduction

Virtual reality (VR) technology can provide an immersive interactive simulated environment which has huge potential in reducing anxiety during MRI scans and may even allow examinations that are currently infeasible. Although the VR industry is booming, devices for medical use remain relatively crude. Achieving compatibility with MRI scanners is challenging, and for applications such as fMRI it is highly desirable to avoid local distortions of the static magnetic field. In many VR systems the sense of immersion relies on head tracking to create active control of the visual scene; this is clearly undesirable for MRI. We have developed a non-intrusive MR compatible VR system which avoids disturbing the magnetic environment and uses eye tracking as the main interface. A challenge in achieving robust gaze control is correction for head movements. In commercial systems this requires unobstructed views of the full face, which is not feasible within standard head receiver coils.

Methods

To avoid disturbing the imaging fields, an optical projection system was developed as shown in figure 1. A desktop computer and digital projector (Aaxa Technologies, HD Pico) are placed outside the screened room to allow rapid prototyping of stimulus presentation without causing electrical interference. The native projector lens is replaced with a Kodak Ektapro Select 87-200mm zoom lens projecting through an open waveguide. Two front silvered mirrors mounted on non-magnetic stands steer the projector beam to the magnet center. A 3D printed plastic device that mates precisely with a Philips 32 channel head coil holds a diffuser screen viewed in transmission and a clear acrylic reflector. Eye tracking is achieved by inferring the current gaze direction from live video from two on-board MRC 12M-I IR-LED cameras mounted on an adjustable holder. The VR system converts gaze data into control signals for interacting with the virtual world. The system is developed using the Unity game engine and the tracking system is mainly based on OpenCV and deep learning libraries (Dlib and Tensorflow).
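The hand-off from the tracking process to the Unity scene can be kept loosely coupled by streaming one gaze sample per video frame as a small network message. The sketch below is a hypothetical illustration of that step, not the abstract's actual implementation: the UDP port and JSON schema are assumptions introduced here.

```python
import json
import socket

# Hypothetical endpoint for the Unity scene; port and schema are
# illustrative assumptions, not taken from the abstract.
UNITY_ADDR = ("127.0.0.1", 9050)

def gaze_message(x, y, valid):
    """Pack one normalised gaze sample (0..1 screen coordinates) as JSON bytes."""
    return json.dumps({"gx": round(float(x), 4),
                       "gy": round(float(y), 4),
                       "valid": bool(valid)}).encode("utf-8")

def send_gaze(sock, x, y, valid=True):
    """Send one gaze sample to the VR application over UDP."""
    sock.sendto(gaze_message(x, y, valid), UNITY_ADDR)

# Example: one sample per captured camera frame.
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# send_gaze(sock, 0.52, 0.41)
```

UDP is a natural fit here because a dropped gaze sample is harmless (the next frame supersedes it) and low latency matters more than delivery guarantees.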

Gaze estimation is achieved by pupil tracking combined with deformable eye shape tracking based on a 6-landmark shape descriptor for each eye to achieve head pose compensation1. The landmarks guide the application of an adaptive density-based pupil tracking algorithm. Pupil-eye-corner feature vectors are regressed onto a gaze point on the screen after a screen space calibration procedure. Changes in head pose are estimated from displacements of the two eye corners and used to provide motion compensation. Some initial immersive content was generated with an integrated calibration procedure and gaze control, and tested on volunteers.
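The regression step above can be sketched as follows. This is a minimal illustration under assumptions not stated in the abstract: pupil-relative-to-corner features and a linear (affine) least-squares map from feature vector to screen point; the abstract does not specify the regression model, and the function names are hypothetical.

```python
import numpy as np

def feature_vector(pupil, inner_corner, outer_corner):
    """Pupil position expressed relative to the two eye corners (one eye).

    Anchoring the pupil to the corners gives a degree of head-pose
    invariance: a rigid shift of the eye region moves pupil and corners
    together, so the relative vectors largely cancel the shift.
    """
    pupil = np.asarray(pupil, float)
    inner = np.asarray(inner_corner, float)
    outer = np.asarray(outer_corner, float)
    return np.concatenate([pupil - inner, pupil - outer])  # shape (4,)

def fit_gaze_map(features, screen_points):
    """Least-squares affine map from feature vectors to screen coordinates.

    features:      (N, 4) feature vectors from calibration frames
    screen_points: (N, 2) known on-screen target positions
    """
    X = np.hstack([features, np.ones((len(features), 1))])  # append bias term
    W, *_ = np.linalg.lstsq(X, screen_points, rcond=None)
    return W  # (5, 2) weight matrix

def predict_gaze(W, feature):
    """Map one feature vector to a predicted screen-space gaze point."""
    return np.append(feature, 1.0) @ W
```

During calibration the user fixates known targets, giving the (feature, screen point) pairs that `fit_gaze_map` consumes; at run time each frame's feature vector is passed to `predict_gaze`.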

The system was tested for MRI compatibility on a 3T Philips Achieva system by imaging a spherical phantom and a normal volunteer using field echo EPI with parameters taken from a typical fMRI protocol and checking for changes in SNR and geometric distortion. VR performance was tested on adults and children, with gaze measurement compared to a Tobii 4C gaming eye tracker system using Tobii's published metrics2. Matched calibration and testing conditions were used for both systems. For calibration the user looks at on-screen targets and the corresponding pupil positions are recorded. Precision and accuracy testing involves the subject fixating on a succession of 8 target markers while the detected gaze location is recorded for 10 seconds per target.

Results

There was no detectable change in SNR or geometric distortion with and without the complete system in place. Typical eye images with key landmarks and calibration data overlaid are shown in figure 1. Gaze accuracy and precision data for a single subject are shown in figure 2. The proposed system had comparable performance to the reference system and also showed less drift in these measures over time. The system provided a strong immersive visual experience that could be controlled interactively by the subject.

Discussion

These preliminary results demonstrate a capability to bring the VR world into MRI systems, including dynamic interaction with VR content based on gaze, with performance competitive with a current leading commercial gaming eye tracker. The completely non-intrusive and contactless design does not require any preparation work before the scan (such as sticking markers to the user’s face). The proposed system has clinical applications for subjects who find MRI stressful (such as children or those with claustrophobia) as well as applications in neuroscience3, with (as yet untested) potential for prospective motion correction.

Acknowledgements

This work was supported by ERC grant agreement no. 319456 (dHCP project), the Wellcome EPSRC Centre for Medical Engineering at King's College London (WT 203148/Z/16/Z) and by the National Institute for Health Research (NIHR) Biomedical Research Centre based at Guy’s and St Thomas’ NHS Foundation Trust and King’s College London. The views expressed are those of the authors and not necessarily those of the NHS, the NIHR or the Department of Health.

References

[1] Kazemi, Vahid, and Josephine Sullivan. "One millisecond face alignment with an ensemble of regression trees." Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2014.

[2] Tobii Technology (2015), "Tobii Accuracy and Precision Test Method for Remote Eye Trackers," https://stemedhub.org/resources/3310.

[3] Bohil, Corey J., Bradly Alicea, and Frank A. Biocca. "Virtual reality in neuroscience research and therapy." Nature Reviews Neuroscience 12.12 (2011): 752.

Figures

Figure 1: Overview of the proposed MRI compatible VR system including example eye tracking calibration data overlaid on a captured video image and example visual scenes showing a magical world (far right – top) and gaze controlled selection of a dancing figure (far right – bottom)

Figure 2: Performance comparison of the proposed within head coil gaze tracker with a Tobii 4C commercial gaming system using the metrics proposed by Tobii2. Note that all distances are expressed as fractions of the on-screen target perimeter circle radius to remove any effects of differential screen size.

Proc. Intl. Soc. Mag. Reson. Med. 27 (2019)