0538

Transforming The Experience of Having MRI Using Virtual Reality
Kun Qian1, Tomoki Arichi1, Jonathan Eden2, Sofia Dall'Orso2, Rui Pedro A G Teixeira3, Kawal Rhode3, Mark Neil4, Etienne Burdet2, A David Edwards1, and Jo V Hajnal1
1Centre for the Developing Brain, School of Biomedical Engineering and Imaging Sciences, King's College London, London, United Kingdom, 2Department of Bioengineering, Imperial College London, London, United Kingdom, 3School of Biomedical Engineering and Imaging Sciences, King's College London, London, United Kingdom, 4Department of Physics, Imperial College London, London, United Kingdom

Synopsis

Patients undergoing MRI often experience anxiety, and sometimes distress, prior to and during scanning. We have developed a non-intrusive, MR compatible Virtual Reality (VR) system providing a tailored immersive experience that the user can interact with and control using gaze tracking. Dedicated VR content has been created and tested on adults and children. A key feature is congruency between the VR world and physical sensations during MRI, including VR features corresponding to table motion and scanner noise/vibration. The results suggest the approach has substantial clinical potential, and it could also provide a platform for conducting a new generation of “natural” fMRI experiments.

Introduction

The MRI scanner environment is noisy and claustrophobic, making scanning challenging for many subjects and impeding imaging of vulnerable populations. Traditional interventions to alleviate anxiety (patient education/preparation, simple audio and video entertainment, etc.) provide only limited relief. We hypothesized that completely replacing the visual scene with appropriate engaging content could substantially mitigate claustrophobia, and that making the visual experience congruent with physical sensations (noise, vibration, table movement) could further reduce stress by achieving an integrated experience. Virtual Reality (VR) technology could provide a means to do this; however, in typical VR applications immersion is strongly enhanced by letting the user control the perceived environment through head motion, which is clearly undesirable in MRI. To test our hypotheses, we developed a fully MRI compatible VR system for brain imaging applications that uses gaze tracking for interactive control.

Methods

The VR system (figure 1) comprises an MR compatible projector (Avotec, Florida, USA) and a 3D-printed, coil-mounted VR headset incorporating two infra-red eye tracking cameras (MRC Systems, Heidelberg, Germany) and active noise-cancelling headphones (Optoacoustics, Moshav Mazor, Israel), all integrated into a Philips Achieva 32-channel head coil. Subjects are immersed in the visual environment as soon as they are placed in the head coil, and visual presentation is continuous until they are removed from the coil at the end of the examination. Real-time eye tracking and gaze estimation use feature-based methods [1] that achieve state-of-the-art performance [2]. A third camera within the MRI suite monitors the scanner bed position. A fourth camera, equipped with a microphone, is positioned outside the MRI suite to provide an interactive video feed showing the patient’s carer(s). Dedicated VR content has been developed using the Unity game engine (www.unity.com) and iteratively tested on adults and children.
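To make the gaze pipeline concrete, the Python sketch below shows two standard building blocks of video-based gaze estimation: localizing the pupil in an IR eye-camera image, and a least-squares polynomial calibration mapping from pupil position to screen coordinates. This is a minimal illustrative sketch, not the system's implementation: the published pipeline uses a regression-tree landmark model [1], whereas the sketch substitutes a simple intensity-threshold pupil detector, and all function names and thresholds are assumptions.

```python
# Illustrative sketch only: pupil localization plus a second-order
# polynomial calibration map, two standard components of video-based
# gaze estimation. Thresholds and names are assumptions, and the
# published system uses a regression-tree landmark model [1] rather
# than this simple intensity-threshold detector.
import cv2
import numpy as np

def pupil_center(eye_frame, threshold=40):
    """Estimate the pupil centre in an IR eye-camera frame."""
    gray = cv2.cvtColor(eye_frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (7, 7), 0)
    # Under IR illumination the pupil is the darkest large blob.
    _, mask = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    (x, y), _ = cv2.minEnclosingCircle(max(contours, key=cv2.contourArea))
    return np.array([x, y])

def fit_calibration(pupil_pts, screen_pts):
    """Least-squares fit of a 2nd-order polynomial map from pupil
    coordinates to known on-screen calibration target positions."""
    p = np.asarray(pupil_pts)   # N x 2 pupil centres
    s = np.asarray(screen_pts)  # N x 2 target positions
    X = np.column_stack([np.ones(len(p)), p[:, 0], p[:, 1],
                         p[:, 0] * p[:, 1], p[:, 0] ** 2, p[:, 1] ** 2])
    coeffs, *_ = np.linalg.lstsq(X, s, rcond=None)
    return coeffs  # 6 x 2 matrix mapping features -> screen (u, v)

def gaze_point(pupil, coeffs):
    """Map a pupil centre to an estimated on-screen gaze point."""
    x, y = pupil
    return np.array([1.0, x, y, x * y, x * x, y * y]) @ coeffs
```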

Results

The VR headset removes all peripheral visual cues relating to the scanner environment; subjects report that this is important in avoiding reminders of their actual context. Figure 2 shows the full timeline of the VR content. The initial perspective is a face-up supine view with scene elements that the subject moves past during patient table movement (figure 3). This produces an exceptionally strong immersive effect in which the scan setup procedure feels physically natural. A calibration step, needed to achieve accurate gaze control, then follows; this deliberately has no spatial cues. The visual perspective then changes to an upright pose so that the patient can navigate autonomously through a virtual space. A busy street scene is used to establish the new spatial perspective and to provide gaze control training for: 1. view control/navigation (figure 4); 2. UI interaction, in which actions are triggered by gaze fixation on virtual buttons/objects/characters for 3 seconds, with feedback from a circular progress bar (figure 5; sketched below); 3. game interaction (figure 4).
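The dwell-time interaction above reduces to a small per-button state machine: a timer accumulates while the gaze point stays on a target, drives the circular progress bar, fires the action at 3 seconds, and resets when the gaze leaves. A minimal sketch follows (in Python for illustration; the actual content runs in Unity, and all names here are assumptions).

```python
# Illustrative sketch of dwell-time gaze selection: fixating a virtual
# button for 3 s triggers it, with a circular progress bar as feedback.
# Names are assumptions; the real content runs in the Unity engine.
DWELL_TIME = 3.0  # seconds of fixation required to trigger

class GazeButton:
    def __init__(self, x, y, w, h, on_trigger):
        self.bounds = (x, y, w, h)
        self.on_trigger = on_trigger  # callback fired on selection
        self.dwell = 0.0              # accumulated fixation time

    def contains(self, gaze):
        x, y, w, h = self.bounds
        return x <= gaze[0] <= x + w and y <= gaze[1] <= y + h

    def update(self, gaze, dt):
        """Call once per rendered frame with the current gaze point."""
        if gaze is not None and self.contains(gaze):
            self.dwell += dt
            if self.dwell >= DWELL_TIME:
                self.dwell = 0.0
                self.on_trigger()
        else:
            self.dwell = 0.0  # gaze left the button: reset progress

    def progress(self):
        """Fill fraction (0..1) for the circular progress bar."""
        return min(self.dwell / DWELL_TIME, 1.0)
```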

The street scene leads to a lobby offering a choice of gaze-controlled games or videos, with specific content selected using gaze control (figure 5). The subject can also call in their carer, either by gaze control (figures 4 and 5) or by a verbal request for a carer-initiated session. The carer then appears in a pop-up window with two-way audio. The carer has their own display replicating what the patient sees, plus a live feed showing the patient’s eyes, which can help indicate the patient’s anxiety level.

To create congruence with scanner acquisition noise, characters such as construction workers (with gaze-activated interaction) have been integrated into various scenes. The VR system initially creates their work noise, which is then blended into the native scanner sounds. System usability and immersion have been tested on 22 adult volunteers, including assessment of navigation using gaze control. All subjects were able to finish the test task; those with normal eyesight (16) succeeded at their first attempt, while those who normally use corrective lenses (6) needed several attempts to master the system. An optional dynamic marker showing the current gaze location was tested, but was reported as unhelpful by 19/22 participants and tended to cause eye strain. All participants gave verbal feedback that they were impressed, and even shocked, by the unique immersive experience achieved. Importantly, they were not aware of their location relative to the scanner bore whilst using the system. The system was also tested, in part or in full, by 6 children (aged 7 to 11), who gave positive reports.
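In signal terms, this noise congruence can be achieved by fading the synthetic work noise out as acquisition begins, so that the real scanner sound is perceived as a continuation of the virtual sound source rather than as a new stimulus. The sketch below shows a cosine fade-out of this kind; it is illustrative only, since exactly how the system blends its audio is not detailed here, and the names and fade length are assumptions.

```python
# Illustrative sketch: fade the synthetic "construction work" audio out
# as scanning starts, so the native scanner noise is perceived as its
# continuation. Fade length and names are assumptions.
import numpy as np

def blend_into_scan(work_noise, sample_rate, fade_seconds=2.0):
    """Apply a cosine fade-out (the synthetic half of an equal-power
    cross-fade) to the end of a mono work-noise track."""
    fade_n = int(fade_seconds * sample_rate)
    out = work_noise.astype(float)  # astype returns a new array
    out[-fade_n:] *= np.cos(np.linspace(0.0, np.pi / 2, fade_n))
    return out
```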


Discussion and conclusion

This custom-designed VR system for enhancing MRI acceptability has received positive feedback from all participants, which encourages development towards clinical practice. A further opportunity is as a platform for a new generation of “natural” fMRI experiments, for example those studying fundamental but hitherto poorly understood cognitive processes such as social communication. Development is still at an early stage and larger-scale testing is required. So far all features appear valuable, but objective, unbiased testing is a challenge, and we are exploring how best to achieve this. Gaze control appears intuitive and effective, but eye strain is a clear potential issue. Additional options, such as hand tracking, could be considered to enrich the interaction experience.

Acknowledgements

This work was supported by ERC grant agreement no. 319456 (dHCP project), an MRC Clinician Scientist Fellowship [MR/P008712/1], the Wellcome EPSRC Centre for Medical Engineering at King’s College London (WT 203148/Z/16/Z), and by the National Institute for Health Research (NIHR) Biomedical Research Centre based at Guy’s and St Thomas’ NHS Foundation Trust and King’s College London. The views expressed are those of the authors and not necessarily those of the NHS, the NIHR or the Department of Health.

References

[1] Kazemi V, Sullivan J. One millisecond face alignment with an ensemble of regression trees. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2014.

[2] Qian K, Arichi T, Eden J, Dall'Orso S, Teixeira RPAG, Rhode K, Neil M, Burdet E, Edwards AD, Hajnal JV. A non-intrusive eye tracking based MRI compatible VR system. Proceedings of the 27th Annual Meeting of ISMRM, 2019.

Figures

Figure 1: System schematic overview (left), hardware setup (top right) and eye tracking system (bottom right). The projector is an SV-8000 MR-Mini™ LCD projection system from Avotec. The MR compatible cameras are 12M-I IR-LED cameras from MRC Systems.


Figure 2: System timeline and content overview. The table motion sequence is played while the subject is moved into the scanner. Once the patient is inside the bore, eye tracking calibration begins. The subject is then guided into a VR lobby to select the entertainment content they like, using gaze tracking to effect choices.


Figure 3: Patient table driven VR content. An upward-facing supine view is used to match the patient’s initial proprioception. When the patient bed moves, the VR content plays a synchronized motion scene that matches the physical movement of the table. This fosters integrated sensory perception and produces a strong immersive experience.


Figure 4: Time sequence for gaze control in the VR system. Each sub-figure shows the eye video stream with tracked features overlaid, above the VR content with the current gaze point superimposed (yellow dot). The gaze marker is not shown in normal operation, as users report that it damages the immersive experience and it appears to lead to eye strain.


Figure 5: System feature analysis, illustrated by the movie selection scene in the inner lobby. The scene includes five progress-bar-triggered buttons (movie, exit, back, next, carer), an interactive character presenting a plausible source of scanner noise, and the carer interaction system.

