Achieving compatibility of VR systems with MRI scanners is challenging, and for applications such as fMRI it is highly desirable to avoid local distortion of the static magnetic field. We have developed a non-intrusive, MR-compatible VR system that avoids disturbing the magnetic environment and uses eye tracking as its main interface. The system brings the VR world into the MRI scanner, including dynamic, gaze-based interaction with VR content, with performance competitive with a current leading commercial gaming eye tracker.
To avoid disturbing the imaging fields, an optical projection system was developed, as shown in Figure 1. A desktop computer and digital projector (Aaxa Technologies, HD Pico) are placed outside the screened room, allowing rapid prototyping of stimulus presentation without causing electrical interference. The native projector lens is replaced with a Kodak Ektapro Select 87-200 mm zoom lens projecting through an open waveguide. Two front-silvered mirrors mounted on non-magnetic stands steer the projector beam to the magnet center. A 3D-printed plastic mount that mates precisely with a Philips 32-channel head coil holds a diffuser screen, viewed in transmission, and a clear acrylic reflector. Gaze direction is inferred from live video captured by two on-board MRC 12M-I IR-LED cameras mounted on an adjustable holder. The VR system converts gaze data into control signals for interacting with the virtual world. The system is built with the Unity game engine, and the tracking pipeline is based mainly on OpenCV and deep learning libraries (Dlib and TensorFlow).
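The abstract does not specify how gaze estimates reach the Unity scene; the minimal sketch below assumes a local UDP/JSON bridge from the Python tracking process to a Unity-side listener. The port number, `send_gaze` helper, and 60 Hz rate are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a gaze-to-Unity bridge (the transport is an assumption;
# the abstract does not state how gaze data reaches the Unity scene).
# The Python tracker streams normalised screen coordinates as JSON over a
# local UDP socket that a Unity-side listener would consume.

import json
import socket
import time

GAZE_PORT = 5555          # hypothetical port for the Unity listener
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_gaze(x_norm: float, y_norm: float) -> None:
    """Send one gaze sample (screen coordinates in [0, 1]) to Unity."""
    packet = json.dumps({"t": time.time(), "x": x_norm, "y": y_norm})
    sock.sendto(packet.encode("utf-8"), ("127.0.0.1", GAZE_PORT))

# Example: forward a short stream of placeholder gaze estimates.
if __name__ == "__main__":
    for x, y in [(0.50, 0.50), (0.52, 0.49), (0.55, 0.47)]:
        send_gaze(x, y)
        time.sleep(1 / 60)  # nominal 60 Hz update rate (assumed)
```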
Gaze estimation combines pupil tracking with deformable eye-shape tracking, using a 6-landmark shape descriptor for each eye to provide head-pose compensation [1]. The landmarks guide an adaptive density-based pupil tracking algorithm. Pupil-eye-corner feature vectors are regressed onto a gaze point on the screen following a screen-space calibration procedure. Changes in head pose are estimated from displacements of the two eye corners and used for motion compensation. Initial immersive content with an integrated calibration procedure and gaze control was generated and tested on volunteers.
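The sketch below illustrates the calibration-time mapping described above: per-eye pupil-minus-corner feature vectors are regressed onto screen coordinates, with a simple correction for eye-corner displacement standing in for the head-pose compensation. The quadratic polynomial form and least-squares solver are assumptions; the density-based pupil detector and the trained 6-landmark model [1] are not reproduced here.

```python
# Hedged sketch of the pupil-corner-to-screen regression. A second-order
# polynomial fit via least squares stands in for the authors' regression;
# the pupil detector and 6-landmark eye model are not shown.

import numpy as np

def design_matrix(features: np.ndarray) -> np.ndarray:
    """Expand (N, 2) pupil-corner offsets into a quadratic design matrix."""
    x, y = features[:, 0], features[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

def fit_gaze_map(features: np.ndarray, screen_xy: np.ndarray) -> np.ndarray:
    """Fit regression coefficients from calibration features to screen points."""
    A = design_matrix(features)
    coeffs, *_ = np.linalg.lstsq(A, screen_xy, rcond=None)
    return coeffs                      # shape (6, 2)

def predict_gaze(coeffs: np.ndarray, feature: np.ndarray,
                 corner_shift: np.ndarray = np.zeros(2)) -> np.ndarray:
    """Map one feature vector to a screen point, compensating head motion by
    subtracting the eye-corner displacement since calibration (a simplified
    stand-in for the compensation described in the text)."""
    compensated = feature - corner_shift
    return design_matrix(compensated[None, :])[0] @ coeffs

# Example: 9-point calibration grid with synthetic features.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    screen = np.array([[x, y] for y in (0.1, 0.5, 0.9) for x in (0.1, 0.5, 0.9)])
    feats = screen * 0.2 + rng.normal(0, 0.002, screen.shape)  # toy pupil offsets
    C = fit_gaze_map(feats, screen)
    print(predict_gaze(C, feats[4]))   # should land near the centre target (0.5, 0.5)
```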
The system was tested for MRI compatibility on a 3T Philips Achieva scanner by imaging a spherical phantom and a normal volunteer using field-echo EPI with parameters taken from a typical fMRI protocol, and checking for changes in SNR and geometric distortion. VR performance was tested on adults and children, with gaze measurements compared to a Tobii 4C gaming eye tracker using the Tobii test metrics [2]; matched calibration and testing conditions were used for both systems. For calibration, the user looks at on-screen targets and the corresponding pupil positions are recorded. For precision and accuracy testing, the subject fixates a succession of 8 target markers, and the detected gaze location is recorded for 10 seconds per target.
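The sketch below shows how the comparison metrics could be computed from the 8-target fixation recordings, following the definitions in the Tobii test method [2]: accuracy as the mean angular offset of the gaze samples from the target, and precision as the RMS of the angle between successive samples. The screen dimensions, viewing distance, and sampling rate are placeholder assumptions, not the study's values.

```python
# Hedged sketch of accuracy/precision computation per the Tobii test method [2].
# Accuracy  = mean angular offset from the fixation target.
# Precision = RMS of the angular differences between successive gaze samples.
# Screen size and viewing distance below are placeholders, not the study's values.

import numpy as np

SCREEN_W_CM, SCREEN_H_CM = 34.0, 19.0   # assumed display size
VIEW_DIST_CM = 60.0                      # assumed viewing distance

def to_cm(pts_norm: np.ndarray) -> np.ndarray:
    """Convert normalised screen coordinates to centimetres on the display."""
    return pts_norm * np.array([SCREEN_W_CM, SCREEN_H_CM])

def visual_angle_deg(p_cm: np.ndarray, q_cm: np.ndarray) -> np.ndarray:
    """Approximate visual angle (degrees) subtended by two on-screen points."""
    d = np.linalg.norm(p_cm - q_cm, axis=-1)
    return np.degrees(2 * np.arctan2(d / 2, VIEW_DIST_CM))

def accuracy_deg(gaze_norm: np.ndarray, target_norm: np.ndarray) -> float:
    """Mean angular offset of all samples from the fixation target."""
    return float(np.mean(visual_angle_deg(to_cm(gaze_norm), to_cm(target_norm))))

def precision_rms_deg(gaze_norm: np.ndarray) -> float:
    """RMS of the angular differences between successive gaze samples."""
    g = to_cm(gaze_norm)
    step_angles = visual_angle_deg(g[1:], g[:-1])
    return float(np.sqrt(np.mean(step_angles**2)))

# Example with synthetic samples around one of the 8 targets.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    target = np.array([0.25, 0.75])
    samples = target + rng.normal(0, 0.004, size=(600, 2))  # ~10 s at an assumed 60 Hz
    print(f"accuracy  = {accuracy_deg(samples, target):.2f} deg")
    print(f"precision = {precision_rms_deg(samples):.2f} deg")
```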
[1] Kazemi, Vahid, and Josephine Sullivan. "One millisecond face alignment with an ensemble of regression trees." Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2014.
[2] Tobii Technology (2015), "Tobii Accuracy and Precision Test Method for Remote Eye Trackers," https://stemedhub.org/resources/3310.
[3] Bohil, Corey J., Bradly Alicea, and Frank A. Biocca. "Virtual reality in neuroscience research and therapy." Nature Reviews Neuroscience 12.12 (2011): 752.