Virtual reality system compatible with MRI scanner


This technology, "Virtual reality system compatible with MRI scanner", was devised by J. Hajnal, Tomoki Arichi, and Kun Qian on 2020-04-24. Abstract: Aspects and embodiments provide a virtual reality system compatible with an MRI scanner, the system comprising: a user device located within a bore of an MRI scanner, the user device configured to provide an immersive virtual environment to a subject; the system further comprising: at least one sensor configured to track eye movement of the subject; wherein the subject's interaction with the immersive virtual environment is controlled by eye movement tracking. The aspects and embodiments recognize that VR technology, which typically relies on dynamic motion by the VR user, can be used to help keep a subject placed within the bore of an MRI scanner as still as possible. Embodiments may be implemented such that the subject's level of calm increases and their awareness of the physical environment diminishes, allowing more successful MRI image acquisition while seeking to minimize the distress, boredom and/or frustration experienced by the subject under study.

1. A virtual reality system compatible with an MRI scanner, the system comprising:

a user device located within a bore of an MRI scanner, the user device configured to provide an immersive virtual environment to a subject;

the system further comprising:

at least one sensor configured to track eye movement of the subject;

wherein the subject's interaction with the immersive virtual environment is controlled by the tracked eye movements.

2. The system of claim 1, wherein the elements of the user device are constructed of a material that does not affect a magnetic field within a bore of the MRI scanner.

3. The system of claim 1 or 2, wherein a magnetically or electrically disruptive element of the virtual reality system is located outside of the bore of the scanner.

4. The system of any one of the preceding claims, wherein the visual input for providing the immersive virtual environment is transmitted to the subject from outside the bore of the scanner through an optical system comprising a projector and one or more mirrors.

5. The system of any preceding claim, wherein the at least one sensor configured to track eye movement of the subject comprises at least one camera forming part of the user device, and wherein the camera is located on the user device at a distance from the subject placed in the user device, the distance being selected to reduce electromagnetic interference in MRI images captured within the bore of the scanner.

6. The system of any preceding claim, wherein the processing equipment required to create the virtual reality environment is located outside of the bore of the scanner.

7. The system of any preceding claim, wherein the user device is sized to fit within the bore of the scanner.

8. The system of any of the preceding claims, wherein one or more elements of the virtual reality system are removably separable from the MRI scanner.

9. The system of any one of the preceding claims, wherein the subject's interaction with the immersive virtual environment is primarily controlled by tracked eye movements.

10. The system of any preceding claim, wherein the user device comprises one or more limiters located within the user device for limiting head movement of the subject.

11. The system according to any of the preceding claims, wherein the tracked eye movement comprises a gaze estimation by pupil tracking.

12. The system of any preceding claim, wherein the tracked eye movement comprises deformable eye shape tracking.

13. The system of any preceding claim, wherein the tracked eye motion comprises pupil tracking with head pose compensation.

14. The system of any preceding claim, wherein the tracked eye movement comprises determining head movement of the subject from images obtained by the sensor, and wherein the system is configured to use the determined head movement to correct or compensate for MRI images obtained by the MRI scanner.

15. The system of any one of the preceding claims, wherein a change in the subject's head pose is estimated based on an angular eye displacement determined from one or more images of the subject's eyes captured by a sensor, and wherein the change is used to provide motion compensation.

16. The system of any preceding claim, wherein the system is configured to provide an interactive gaze target to a subject, thereby providing feedback to the subject and improving overall engagement with the target.

17. The system of claim 16, wherein the interactive gaze target comprises an icon that changes or evolves when the subject's gaze is determined to remain in contact with the gaze target.

18. The system of any one of the preceding claims, wherein the immersive virtual environment is consistent with a physical environment experienced by a subject.

19. The system of any preceding claim, wherein the user device comprises a sound sensor configured to provide input to the system such that an auditory landscape forming part of the virtual environment incorporates the dominant elements of the auditory landscape within the bore of the scanner.

20. The system of any preceding claim, wherein the user device comprises one or more motion sensors configured to provide input to the system such that a visual landscape forming part of the virtual environment coincides with motion experienced by a subject within the bore of the scanner.

21. The system of any one of the preceding claims, wherein the system comprises a sound and image sensor configured to provide information to the system about a party located outside the bore of the scanner and to add the party to the virtual environment to interact with the subject.

Technical Field

The present invention relates to a Virtual Reality (VR) system compatible with a Magnetic Resonance Imaging (MRI) scanner, and more particularly to a virtual reality system that facilitates successful image acquisition for subjects who would otherwise find undergoing an MRI scan challenging or difficult, and that addresses the limitations an MRI scan imposes on such subjects.

Background

Magnetic Resonance Imaging (MRI) is a clinical imaging technique that allows images to be captured of anatomical structures and physiological processes occurring, for example, inside the human body. MRI scanners use strong magnetic fields, magnetic field gradients, and radio frequency magnetic fields to generate images of the anatomy and of processes occurring within the body.

A typical MRI scanner has a relatively small and narrow bore. The bore is the region in which a strong magnetic field is generated to study a subject placed in the MRI scanner.

A subject placed in an MRI scanner may develop anxiety. In the case of adults, this anxiety may result from claustrophobia and/or confusion or dementia, for example. Children may also feel anxiety before and during scanning.

In order to obtain clear and useful images of a subject placed within an MRI scanner, the subject must remain substantially stationary during the scan. It is not uncommon for infants to be scanned under general anesthesia.

It would be desirable to provide a system for use in an MRI scanner that can help alleviate or mitigate some of the problems set forth above.

Disclosure of Invention

Accordingly, a first aspect provides a virtual reality system compatible with an MRI scanner, the system comprising: a user device located within a coil of an MRI scanner, the user device configured to provide an immersive virtual environment to a subject; the system further comprising: at least one sensor configured to track eye movement of the subject; wherein the subject's interaction with the immersive virtual environment is controlled by the tracked eye movements.

In some embodiments, the system is configured to provide the subject with an immersive virtual environment that includes visual and audio inputs selected to reduce, minimize, and/or prevent interaction of the subject with the virtual environment through coarse physical motion.

In some embodiments, the system is configured to provide the subject with an immersive virtual environment that includes visual and audio inputs selected to encourage the subject to interact with the virtual environment through eye movements only.

In some embodiments, elements of the user device are constructed of materials that do not affect the magnetic field within the coils of the MRI scanner.

In some embodiments, the magnetically or electrically disruptive elements of the VR system are located outside of the coils of the scanner. In some embodiments, a presentation element of the VR system remains with the subject as the subject is positioned within the bore of the scanner. The presentation element moves with the subject, allowing the immersive VR experience created by the system to begin outside the scanner and to continue undisturbed throughout the examination until the subject is moved away from the MRI scanner.

In some embodiments, the visual input providing the immersive virtual environment is transmitted from outside the coil of the scanner to the subject through an optical system that includes a projector and one or more mirrors.

In some embodiments, the at least one sensor configured to track eye movement of the subject comprises at least one camera forming part of the user device, and wherein the camera is located on the user device at a distance from the subject placed in the user device, the distance selected to reduce electromagnetic interference in MRI images captured using the coils of the scanner.

In some embodiments, the processing devices required to create the virtual reality environment are located outside of the coils of the scanner or the bore of the scanner.

In some embodiments, the user device is sized to fit within a coil of the scanner or a bore of the scanner.

In some embodiments, one or more elements of the virtual reality system are removably separable from the MRI scanner.

In some embodiments, the subject's interaction with the immersive virtual environment is primarily controlled by eye motion tracking.

In some embodiments, the user device includes one or more limiters located within the user device for limiting head movement of the subject.

In some embodiments, eye motion tracking comprises: gaze estimation by pupil tracking.

In some embodiments, eye motion tracking comprises: deformable eye shape tracking.

In some embodiments, eye motion tracking comprises: pupil tracking including head pose compensation.

In some embodiments, eye motion tracking comprises determining subject head motion from images obtained by the sensor, and wherein the system is configured to use the determined head motion to correct or compensate for MRI images obtained by the MRI scanner.

In some embodiments, a change in the pose of the subject's head is estimated based on the angular eye displacement determined from one or more images of the subject's eyes captured by the sensor, and wherein the change is used to provide motion compensation.

In some embodiments, the system is configured to provide interactive gaze targets to a subject, thereby providing feedback to the subject and improving overall engagement with the targets.

In some embodiments, the interactive gaze target comprises: an icon that changes or evolves when the subject's line of sight is determined to remain in contact with the gaze target.

In some embodiments, the immersive virtual environment is consistent with the physical environment experienced by the subject.

In some embodiments, the user device includes a sound sensor configured to provide input to the system such that an auditory (aural) landscape forming part of the virtual environment incorporates the dominant elements of the auditory landscape within the coil of the scanner.

In some embodiments, the user device includes one or more motion sensors configured to provide input to the system such that the visual landscape forming part of the virtual environment coincides with the motion experienced by the subject within the coil of the scanner.

In some embodiments, the system includes a sound and image sensor configured to provide information to the system about a party located outside of the coils of the scanner, and to add an image of the party to the virtual environment to interact with the subject.

It will be appreciated that another aspect of the invention relates to a method of using the apparatus of the first aspect. The method comprises: providing a virtual reality system compatible with an MRI scanner by positioning a user device within a coil of an MRI scanner, the user device being configured to provide an immersive virtual environment to a subject; the method further comprising: configuring at least one sensor to track eye movement of the subject; and providing an arrangement in which the subject's interaction with the immersive virtual environment is controlled by tracking the subject's eye movement.

Method steps corresponding to the device features described in the first aspect may be provided.

Further particular and preferred aspects are set out in the accompanying independent and dependent claims. Features from the dependent claims may be combined with those of the independent claims as appropriate, or with features other than those expressly recited in a claim.

Where a device feature is described as being operable to provide a function, it will be understood that this includes providing the function or a device feature adapted to or configured to provide the function.

Drawings

Embodiments of the invention will now be further described with reference to the accompanying drawings, in which:

FIG. 1 shows an optical projection system designed for one possible arrangement;

FIG. 2 shows a user device forming part of a VR system, the user device being positioned within a bore of a scanner;

fig. 3 schematically shows the main components of the calibration process of gaze tracking;

FIG. 4 illustrates a typical eye image with key landmarks and calibration data superimposed;

FIG. 5 shows a subject being placed in a bore of an MRI scanner;

FIG. 6 shows a screenshot of visual elements of a virtual environment provided to a subject in a bore of an MRI scanner according to one arrangement;

figure 7 shows an alternative user device forming part of a VR system in the form of a head device located within a bore of an MRI scanner; and

fig. 8 is an isometric view of some of the major components forming the internal structure of the user device shown in fig. 7.

Detailed Description

As mentioned above, many adults find being placed and scanned in an MRI scanner an anxiety-inducing experience. This is particularly noticeable if the adult suffers from claustrophobia, confusion, or dementia. Many children also experience anxiety before and during MRI scanning, and it is not uncommon for general anesthetics or sedatives to be used so that such subjects can be scanned. It will be appreciated that the use of anesthesia has associated risks and costs. Conventional approaches to address these challenges vary by age: up to around two years of age, it is feasible to scan children during natural sleep. As infants get older, imaging must therefore move to the evening, eventually taking place late at night to extend the age range over which natural sleep imaging remains possible. Once a child is too old, natural sleep is no longer a reliable route to successful MRI imaging, and options are limited until the child is mature enough to tolerate extended scans. From around 5 years of age, a movie or other distracting program may be shown for a reasonable examination time, although this is often only partially successful.

Similarly, systems are known in which calming images are displayed while an adult is within the bore of a scanner, in order to soothe the subject and help minimize their movement within the MRI scanner. Such systems typically do not remove peripheral visual cues, which may remind the subject of where they are and can cause distress.

An arrangement may be implemented that provides a fully MRI compatible, fully immersive virtual reality system to a subject within the bore of an MRI scanner. Such a system may include various components and features, including, for example: an MR-safe visual display system; and eye tracking, which allows a subject to interact with a virtual environment without moving the head at all, thereby keeping the subject substantially still within the bore of the scanner and eliminating any need for head movement within the limited space of the bore. Arrangements may be implemented in which a sustained gaze from the subject is used to calibrate and control the selection of options within the VR environment, and the ability to directly control games and perform other tasks using eye movement may be provided by eye tracking and appropriate eye tracking algorithms. It will also be appreciated that eye tracking may provide a useful neuroscience/clinical assessment tool, and may be used to provide predictive tracking of a subject's head for MRI motion correction. These additional features are described in more detail below. Some arrangements may provide a direct video injection capability, in which a second party located outside the MRI suite may interact with the subject in the scanner from within the immersive virtual environment provided to the subject. Similarly, some arrangements may provide two-way audio communication to allow the subject to communicate with a second party and/or with an operator of the MRI scanner. Some arrangements may provide optional hand tracking, allowing motion input to the VR world, enhancing the subject's sense of immersion in the virtual environment, and possibly allowing low-motion tasks to be performed for neuroscience and/or clinical experiments. An arrangement may be implemented that provides a fully immersive MRI compatible VR system into which a subject may be placed before entering the MRI scanner or the bore of the scanner. An arrangement may be implemented that enables a second party (e.g., a parent) to join, as an avatar or image, the virtual environment provided to the subject while the subject is positioned within the bore of the MRI scanner.

In general, arrangements may be implemented that recognize that VR techniques, which typically rely on dynamic motion by the VR user, can be used to help maintain minimal motion of a subject placed within the bore of an MRI scanner. Such arrangements may be implemented so that the subject's calmness within the bore of the MRI scanner increases and their awareness of the physical environment (the relatively small bore of the scanner) diminishes, allowing more successful image acquisition while seeking to minimize the distress, boredom, and/or frustration experienced by the subject.

An arrangement may be implemented that provides a fully immersive and interactive experience for the subject by providing an MRI compatible VR system. The arrangement may provide control through eye movement, may provide the ability to introduce a third-party avatar, and may be implemented so that the subject to be scanned can use the system before entering the bore of the scanner, thereby distracting them from any preparatory work for image acquisition.

Before describing particular features in more detail, a general overview of methods and possible arrangements is provided herein.

Achieving compatibility of VR systems with MRI scanners is challenging. For fMRI and similar applications, it is highly desirable to avoid local distortion of the static magnetic field. The arrangements described here allow the development of a non-invasive, MR compatible VR system that avoids interference with the magnetic environment within the bore of the scanner and uses eye tracking as the primary interface between the scanned subject and the VR environment. The approach brings the VR world into the MRI system, including dynamic, gaze-based interaction with VR content.

VR in MRI scanner environment

Virtual Reality (VR) technology may provide an immersive, interactive, simulated environment that can reduce anxiety experienced by a scanned subject, for example, during a scan that may last for an hour or more. While the VR gaming industry is developing explosively, devices for use in clinical settings remain relatively immature.

Various challenges arise in using VR technology in the MRI scanner environment. These include, for example, placing electronics in a strong magnetic field while ensuring that neither the equipment nor the imaging is compromised. Achieving compatibility with MRI scanners is challenging, and for fMRI and similar applications it is highly desirable to avoid local distortion of the static magnetic field. The desire for the subject to interact with the VR environment presents further challenges. In many VR systems, immersion relies on the subject's motion, e.g., head motion and head motion tracking, to give the subject active control of the presented visual scene. Encouraging subjects to move within the relatively small bore of a scanner is clearly undesirable for MRI applications: the resulting images would lack clarity, and large physical movements of the subject within the bore are simply not possible. While eye control of a VR system appears to be a viable alternative in limited-motion scenarios, one challenge in achieving stable eye control is the need to correct for head motion, which is typically achieved by obtaining an unobstructed view of the subject's full face. It is generally not feasible to obtain such a view within the head receiver coil of a standard MRI scanner.

The arrangements described here seek to provide a non-invasive MR compatible VR system that avoids interference with the magnetic environment within the scanner and uses eye tracking as the primary interface, minimizing subject movement while still allowing control of the VR environment.

Provision is made for providing the VR system to a subject such that the subject does not use head or body movements to control the VR environment provided to them. While commercially available gaming systems can use eye movements and eye tracking, these eye movements are typically used to determine the direction in which the user is looking, rather than to drive coarse movements or coarse interaction with the VR environment. In other words, most VR systems track gross-motor actions, such as hand swings, head movement, and gestures, to drive the user's primary interaction with the virtual environment, with eye tracking used only to refine that interaction rather than as the primary or sole means of interaction. The present arrangements may provide a VR system in which eye movement and eye tracking alone are used to control the subject's interaction with the virtual environment. Thus, in situations where physical movement of the subject's body is constrained, for example because the subject has been placed in the bore of an MRI scanner, the arrangement provides a mechanism to control and interact with the VR environment.

MRI scanner compatibility

To substantially avoid interference with the imaging field within the MRI scanner, various techniques may be employed. For example, VR system components located within the bore of the scanner may be constructed from materials that do not disrupt the field, and/or any magnetically or electrically disruptive components may be shielded or positioned within the bore so as to minimize magnetic or electrical disruption. In particular, instead of providing an active display device, such as an LCD screen, within the bore, visual input may be provided to the subject through a projector and mirrors. Similarly, one or more sensors for tracking eye movement, such as a video camera, may be located sufficiently far from the subject to reduce electromagnetic interference in the MRI images to be captured. Such cameras may also be MRI compatible cameras that are shielded to avoid electromagnetic interference.

The processing equipment (computers and similar devices) required to create and maintain the virtual reality environment may be located outside the bore of the scanner. The system components necessary to deliver the experience to the subject are sized to fit within the bore of the scanner. The system elements required to deliver the subject experience can be separated from the MRI scanner, so that the subject can be placed on the scanner table and immersed in the VR environment while still outside the bore of the scanner. In some embodiments, the user device is sized to fit within or around a head coil of a scanner, an RF coil, or the bore of a scanner. Such an arrangement may be implemented so that the system can be used while scanning with another coil (e.g., a coil for cardiac examinations). The arrangements may allow the user device to provide an immersive field of view wherever the subject is placed within the MRI scanner. That is, an immersive environment may be provided regardless of whether the head or another part of the subject's body is being imaged by the MRI scanner. For head scanning, the arrangement is implemented such that the user device and other components do not interfere with imaging; that is, they are provided so that the VR system does not disturb the magnetic field used for imaging. When imaging parts of the subject's body away from the head (e.g., the torso or lower extremities), ensuring that the VR system disposed around the subject's head does not interfere with the magnetic fields essential for imaging may be simpler than in the case of head imaging.

Fig. 1 shows an optical projection system designed for one possible arrangement. A desktop computer and digital projector (in the example shown, an Aaxa Technologies HD Pico) are located outside the room in which the MRI scanner is located. This arrangement allows rapid prototyping of stimulus presentation without causing electrical interference. The standard projector lens has been replaced by a Kodak Ektapro Select 87-200 mm zoom lens, which is arranged to project through an open waveguide. Two front-silvered mirrors mounted on a non-magnetic support are configured to direct the projector beam into the bore of the MRI scanner. A 3D-printed plastic device was designed to exactly match a Philips 32-channel head coil and includes a mount for a diffuser screen viewed in transmission and a clear acrylic reflector. In the illustrated arrangement, eye tracking is achieved using real-time video from two on-board MRC 12M-I IR-LED cameras mounted on an adjustable stand. Images from the real-time video may be evaluated and the gaze direction of the subject derived from them. The VR system processor may be configured to convert the gaze data into control signals with which the subject interacts with the virtual environment. The system shown in FIG. 1 was developed using the Unity game engine, with a tracking system based on OpenCV and deep-learning libraries (Dlib and TensorFlow).
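By way of illustration only, the following Python sketch shows one way a pupil centre could be extracted from the IR eye-camera video using OpenCV, as a first stage before gaze estimation. The camera index, threshold value, and frame count are illustrative assumptions and not details of the system described above.

```python
import cv2

def detect_pupil(gray_eye_frame, threshold=40):
    """Estimate the pupil centre (x, y) in a grayscale IR eye image.

    Under IR illumination the pupil is typically the darkest, roughly
    circular blob, so inverse thresholding followed by a largest-contour
    search gives a simple first-pass estimate.
    """
    blurred = cv2.GaussianBlur(gray_eye_frame, (7, 7), 0)
    _, mask = cv2.threshold(blurred, threshold, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)
    m = cv2.moments(pupil)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

cap = cv2.VideoCapture(0)          # index of the eye-camera feed (assumed)
for _ in range(300):               # process a short stretch of video, then stop
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    centre = detect_pupil(gray)
    if centre is not None:
        print("pupil centre:", centre)   # handed on to the gaze-mapping stage
cap.release()
```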

The MRI compatibility of the system in FIG. 1 was tested on a 3T Philips Achieva system by imaging a spherical phantom and a normal volunteer using field echo Echo Planar Imaging (EPI), with parameters taken from a typical fMRI protocol, and examining changes in SNR and geometric distortion. With and without the complete system shown in FIG. 1 in place, there was no detectable change in SNR or geometric distortion.

Fig. 2 shows a user device 200 forming part of a VR system, the user device being positioned within a bore of a scanner. As shown in FIG. 2, the user device 200 may be placed on a head coil of a typical MRI scanner. An arrangement may be provided in which the user device 200 is integrally formed with the head device of the MRI scanner. The user device 200 may be removably separable from the rest of the MRI scanner, allowing the VR environment to be provided to the subject before they enter the MRI scanner.

Eye tracking

To provide a mechanism for a subject located within the bore of the scanner to interact with the virtual environment while avoiding gross movement, eye tracking techniques that allow gaze control may be implemented, so that gaze can serve as the primary input. It will be appreciated that the arrangement may provide cameras as part of a user device that is positionable within the bore of the scanner. Those cameras may be positioned such that their entire field of view is occupied by an image of the subject's eyes. The cameras may be located outside the head coil of the scanner and placed so as not to compromise imaging performance. It will be appreciated that physical objects such as lenses, if placed in close proximity to the imaged anatomy, can cause signal loss and/or distortion in fMRI and diffusion imaging.

Since the subject's head motion is relatively limited within the head coil of an MRI scanner, the eye and gaze tracking techniques described here are primarily concerned with eye motion, without needing to account for or correct gross movement of the subject's head. Some arrangements may, however, use the images of the subject's eyes to determine head movement. This head movement can then be used to correct or compensate the acquired MRI images.

Some arrangements may provide additional elements, such as manually operated buttons, to augment control of and interaction with the virtual environment beyond the primary eye tracking. Such controllers may include, for example, buttons, joysticks, and/or small-gesture tracking.

Fig. 3 schematically shows the main components of the calibration process for gaze tracking. According to some arrangements, gaze estimation is implemented by pupil tracking combined with deformable eye shape tracking, which provides head pose compensation based on a 6-landmark shape descriptor per eye [1]. The landmarks shown in FIG. 3 are used to guide the application of an adaptive density-based pupil tracking algorithm. After a screen-space calibration procedure, pupil-eye-corner feature vectors are mapped to the point of gaze on the screen. Changes in head pose may be estimated from the displacement of the eye corners and used to provide motion compensation. Some arrangements provide interactive gaze targets, thereby providing feedback to the subject and improving overall engagement with the target. For example, an interactive gaze target may comprise an icon that changes or evolves while the subject's gaze remains in contact with the target. This arrangement, which provides continuous and immediate visible feedback to the subject in response to a sustained gaze, facilitates the use of a VR system in which gaze control is the primary means by which the subject interacts with the virtual environment.
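A minimal sketch of how the screen-space calibration could map pupil-eye-corner feature vectors to on-screen gaze points is given below, assuming a quadratic polynomial fitted by least squares; the feature values, polynomial order, and target layout are illustrative assumptions rather than the exact procedure used.

```python
import numpy as np

def design_matrix(features):
    """Quadratic polynomial expansion of pupil-minus-eye-corner features (dx, dy)."""
    dx, dy = features[:, 0], features[:, 1]
    return np.column_stack([np.ones_like(dx), dx, dy, dx * dy, dx**2, dy**2])

def fit_gaze_map(features, screen_points):
    """Least-squares fit from feature vectors to screen-space gaze points."""
    A = design_matrix(features)
    coeffs, *_ = np.linalg.lstsq(A, screen_points, rcond=None)
    return coeffs   # shape (6, 2): one column of coefficients per screen axis

def gaze_point(coeffs, feature):
    return design_matrix(np.atleast_2d(feature)) @ coeffs

# Calibration: subject fixates a handful of known on-screen targets
# (feature values below are purely illustrative).
calib_features = np.array([[-0.12, 0.05], [0.00, 0.06], [0.11, 0.04],
                           [-0.13, -0.02], [0.01, -0.01], [0.12, -0.03]])
calib_targets = np.array([[0.1, 0.2], [0.5, 0.2], [0.9, 0.2],
                          [0.1, 0.8], [0.5, 0.8], [0.9, 0.8]])
coeffs = fit_gaze_map(calib_features, calib_targets)
print(gaze_point(coeffs, [-0.05, 0.02]))   # estimated normalised screen gaze
```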

FIG. 4 illustrates a typical eye image with key landmarks and calibration data superimposed. Gaze accuracy and precision data for a single subject are summarized in Table 1 below. Immersive content incorporating the calibration procedure and subsequent gaze control was generated and tested on volunteers. The system provides a strongly immersive visual experience that can be interactively controlled by the subject.

The arrangement of FIG. 3 has been found to perform comparably to a typical reference system, and to drift less from its performance metrics over time. In particular, the gaze-controlled VR system outlined in FIG. 3 was tested on adults and children, and the gaze measurements achieved were compared with those of the Tobii 4C gaming eye tracker using the metrics proposed by Tobii [2]. Both systems used matching calibration and test conditions. For calibration, the subject gazed at screen targets and the corresponding pupil positions were recorded. The accuracy and precision test involved the subject fixing their gaze on 8 consecutive target markers, with the detected gaze location for each target recorded for 10 seconds. The test was repeated after a delay of 2 minutes to check for any drift in performance.

TABLE 1

Table 1 compares the performance of a gaze tracker operating in accordance with the methods set out with respect to FIGS. 3 and 4, implemented in user equipment such as that shown in FIG. 2, i.e. with the camera and eye tracking arranged around a subject placed in an MRI head coil. Performance was compared with the Tobii 4C commercial gaming system using the metrics proposed by Tobii. Note that all distances are expressed as a fraction of the radius of the target circle on the screen, to eliminate any effect of differences in screen size.
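Accuracy and precision figures of the kind reported in Table 1 might be computed along the following lines, with accuracy taken as the mean offset from the target and precision as the RMS sample-to-sample dispersion, both normalised by the target-circle radius; the exact definitions in the Tobii test method may differ in detail.

```python
import numpy as np

def gaze_metrics(samples, target, target_radius):
    """Accuracy and precision of gaze samples relative to one target.

    samples: (N, 2) detected gaze points; target: (2,) target centre;
    target_radius: radius of the on-screen target circle, used to
    normalise both metrics so that screen size drops out of the comparison.
    """
    offsets = samples - target
    accuracy = np.linalg.norm(offsets.mean(axis=0)) / target_radius
    # Precision as RMS of successive sample-to-sample differences.
    diffs = np.diff(samples, axis=0)
    precision = np.sqrt((np.linalg.norm(diffs, axis=1) ** 2).mean()) / target_radius
    return accuracy, precision

samples = np.array([[0.51, 0.49], [0.52, 0.50], [0.50, 0.51], [0.53, 0.50]])
print(gaze_metrics(samples, target=np.array([0.5, 0.5]), target_radius=0.05))
```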

Immersion

It may be beneficial to provide a configuration that presents the subject with an immersive virtual environment. In other words, one or more of the subject's senses may be deceived into accepting that the virtual environment has replaced the physical environment surrounding the subject (i.e., the MRI scanner). In this regard, various sensory issues may be addressed by appropriate configuration of the virtual environment provided to the subject. The senses principally engaged in MRI scanner applications are vision, hearing, and touch. Various methods may be provided to ensure consistency of the subject's experience between the virtual environment and any aspect of the real physical environment that may be apparent to the subject.

In some arrangements, the subject under study may be immersed in the virtual environment while still physically away from the scanner, so that they need not confront the anxiety-provoking, claustrophobic environment directly.

In some arrangements, the entire visual experience of the subject is provided by the virtual environment. That is, the subject's entire field of view may be filled by the virtual environment. Any areas not belonging to the virtual environment are obscured or blocked so that the subject cannot see the physical environment around them. For example, a head device may be provided that presents the virtual scene while blocking, with a screen or other obstruction, any areas that do not belong to the created virtual scene, thereby ensuring that the subject does not see the actual physical environment. In some arrangements, the complete visual stimulus is provided by the system, including ensuring that the subject's peripheral vision does not allow them to see and engage with the actual physical environment in which they are located. That is, the arrangements may provide a barrier to the subject's peripheral vision, and/or supply peripheral visual information as part of the virtual environment created for the subject.

Sensors, for example motion sensors, which may be gyroscope and/or accelerometer type sensors, may be provided on the head unit or on the table on which the subject is positioned, so that any physical motion experienced by the subject in the physical environment around them can be accommodated and made consistent with the motion apparent in the virtual environment. For example, movement of the subject into the bore of the scanner may be detected, and the virtual environment may provide a similar "move/slide in" visual experience to the subject within the virtual environment.

A noise sensor, such as a microphone, may be provided so that any noise present in the physical environment surrounding the subject can be accommodated and "explained" by elements of the created virtual environment. For example, an MRI scanner can be loud, and the noise can disturb the subject. Providing an "explanation" for the MRI scanner's noise within the virtual environment may help the subject forget their real physical environment. For example, the noise of the MRI scanner may be "accounted for" by noisy events in the virtual environment, such as a turning windmill, work at a construction site, or the rumble of traffic.

According to some arrangements, consistency between the "real" environment and the virtual environment is achieved by creating visual features corresponding to the externally perceived "real" sound environment. Although MRI subjects typically wear earmuffs, it is often not possible to completely eliminate external sounds. Accordingly, the arrangements may seek to place recognizable visual features in the virtual environment provided to the subject so that the perceived sounds can be explained. For example, a virtual road digger with a pneumatic drill may be added to the virtual environment to "explain" the noise emitted by the MRI scanner while it is running.
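Purely as an illustration of this idea, the sketch below watches the microphone signal level and switches a matching "explanation" on or off in the virtual scene; the frame length, RMS threshold, and the scene interface (start_pneumatic_drill / stop_pneumatic_drill) are hypothetical stand-ins for calls into the game engine.

```python
import numpy as np

FRAME_LEN = 2048          # audio samples per analysis frame (assumed)
NOISE_THRESHOLD = 0.2     # RMS level treated as "scanner running" (assumed)

def rms(frame):
    return float(np.sqrt(np.mean(np.square(frame))))

def update_soundscape(frame, scene):
    """Map real-world scanner noise onto an in-scene 'explanation'."""
    if rms(frame) > NOISE_THRESHOLD:
        scene.start_pneumatic_drill()   # hypothetical VR-engine call
    else:
        scene.stop_pneumatic_drill()

class DemoScene:
    # Stand-in for the VR engine interface; a real integration would go
    # through the game engine (e.g. a Unity script receiving messages).
    def start_pneumatic_drill(self): print("drill on")
    def stop_pneumatic_drill(self): print("drill off")

scene = DemoScene()
quiet = np.zeros(FRAME_LEN)
loud = 0.5 * np.random.randn(FRAME_LEN)
update_soundscape(quiet, scene)   # -> drill off
update_soundscape(loud, scene)    # -> drill on (high RMS)
```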

Some arrangements provide a mechanism for controlling the complete visual and auditory scene experienced by the subject, together with sensors providing information about how the subject interacts with the virtual environment provided by the system (both active control and passive observation of the subject's gaze location or other physiological sensors). Such an arrangement has particular utility in providing information for fMRI (functional MRI) studies. For example, the system may be configured to expose the subject to a complex scene and, through eye tracking, determine which portion of the complex scene the subject is attending to and when, and correlate this information with the acquired images.
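For such fMRI use, gaze-dwell events could be turned into a per-volume regressor roughly as sketched below; the event format and repetition time are illustrative assumptions.

```python
import numpy as np

def attention_regressor(events, n_volumes, tr):
    """Build a per-volume 0/1 regressor from gaze-dwell events.

    events: list of (onset_s, duration_s) periods when the subject's gaze
    was on the region of interest; n_volumes and tr are the length and
    repetition time of the fMRI acquisition.
    """
    regressor = np.zeros(n_volumes)
    volume_times = np.arange(n_volumes) * tr
    for onset, duration in events:
        regressor[(volume_times >= onset) & (volume_times < onset + duration)] = 1.0
    return regressor

# Example: the subject attended the target region twice during a run with a 2 s TR.
events = [(10.0, 6.0), (42.5, 3.0)]
print(attention_regressor(events, n_volumes=40, tr=2.0))
```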

Subject reassurance

The arrangements recognize that placement within the bore of an MRI scanner can be challenging for the subject, and that providing a distracting alternative reality with fewer spatial limitations and/or one that is less frightening can help in obtaining clear and useful MRI images. Some arrangements are configured to further improve the comfort and reassurance of the subject by providing a known "third person" within the virtual environment. For example, some arrangements allow an avatar or video image of a third person (e.g., a caregiver or parent) to appear within the virtual environment provided to the subject. The person may be presented to the subject both visually and aurally. A VR system according to the present arrangement may include a camera, a microphone, and a "green screen" outside the bore of the scanner, so that a real image of the third person can be provided within the virtual environment. The third person can see the virtual environment provided to the subject and can interact conversationally, through the virtual environment, with the subject inside the bore of the scanner. Thus, a child may feel reassured by the visual and auditory presence of a parent in the virtual environment. Elderly subjects, or those lacking capacity, may similarly feel reassured by the presence of a caregiver in the virtual environment.

Discussion

The described arrangements demonstrate that an immersive VR world can be successfully brought into an MRI system, with dynamic, gaze-based interaction with the VR content. Gaze tracking according to the present arrangement can compete with the best current commercial gaming eye trackers. The non-invasive, non-contact design of the user device located within the bore of the scanner removes the need for any preparation before scanning (e.g., affixing markers to the subject's face), allows the subject to control their interaction with the virtual environment, and allows any movement of the subject's head to be inferred from the motion of the eye corners. The present arrangement has applications in two clinical areas: work with subjects who find MRI stressful (e.g., subjects with claustrophobia, or children), and neuroscience research, where it can provide motion correction for the collected image data [3].

Although illustrative embodiments of the present invention have been disclosed in detail herein with reference to the accompanying drawings, it is to be understood that the invention is not limited to those precise embodiments, and that various changes and modifications can be effected therein by one skilled in the art without departing from the scope of the invention as defined by the appended claims and their equivalents.

VR System usage

FIG. 5 shows a subject being placed in a bore of an MRI scanner. Some arrangements allow the subject to enter the VR system and move onto the MRI scanner table while still outside the room containing the MRI scanner. The immersion provided by the VR system can thereby be used to reduce any anxiety or discomfort the subject may experience on entering the small physical space provided by the bore of the MRI scanner.

Fig. 6 shows a screenshot of the visual elements of a virtual environment provided to a subject in the bore of an MRI scanner according to one arrangement. In the example shown, several elements are displayed to the subject, with which the subject can interact through appropriate eye movements.
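A dwell-based selection rule of the kind such on-screen elements could use is sketched below: a selection fires only once the estimated gaze point has stayed within an element for a continuous dwell period. The dwell time and element geometry are assumptions for illustration, not values from the described arrangement.

```python
import time

DWELL_SECONDS = 1.5   # continuous gaze required to trigger a selection (assumed)

class GazeTarget:
    def __init__(self, name, x, y, radius):
        self.name, self.x, self.y, self.radius = name, x, y, radius
        self._entered_at = None

    def update(self, gaze_x, gaze_y, now=None):
        """Return True once the gaze has dwelt on this target long enough."""
        now = time.monotonic() if now is None else now
        inside = (gaze_x - self.x) ** 2 + (gaze_y - self.y) ** 2 <= self.radius ** 2
        if not inside:
            self._entered_at = None        # gaze left: reset the dwell timer
            return False
        if self._entered_at is None:
            self._entered_at = now         # gaze entered: start the dwell timer
        return (now - self._entered_at) >= DWELL_SECONDS

target = GazeTarget("play", x=0.5, y=0.5, radius=0.1)
print(target.update(0.52, 0.49, now=0.0))   # gaze enters -> not yet selected
print(target.update(0.51, 0.50, now=2.0))   # dwell exceeded -> selected
```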

Further review of VR in the MRI scanner environment

As noted above, the present aspects recognize that various challenges arise when using VR technology in an MRI scanner environment. In particular, the present arrangement seeks to provide a VR system to a subject in which the subject does not use significant head or body movements to control the VR environment provided. Furthermore, by providing visual and auditory stimuli that are consistent with the sensations and stimuli the subject receives from the real world, the present arrangement seeks to limit the extent to which the subject is inclined to make large movements. Such stimuli include, for example, sensations associated with vision, body movement, and sound.

Some arrangements recognize that placing a subject on an MRI table requires them to lie in a supine position. The sensation of lying supine is noticeable to the subject, and providing initial visual VR input that takes the supine position into account may help the subject relax and feel more comfortable. Similarly, when the table is moved into the bore of the scanner, the subject is likely to be aware from bodily sensation that they are moving. Providing visual VR input that takes into account the motion and/or vibration produced by the MRI scanner may help the subject relax, feel more comfortable, and begin to feel fully immersed in the virtual environment.

In this case, a motion sensor may be mounted on the subject's cradle, the motion sensor providing one or more signals to the VR system so that the visual stimuli in the virtual environment can be matched to the overall motion of the MRI scanner assembly that the subject is experiencing in the real world. In some arrangements, analysis of a real-time video stream of the subject within the scanner room or the bore of the scanner may be used to provide an indication of the stimuli the subject is likely experiencing.
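One way the cradle-mounted motion sensor signal might be folded into the visual scene is sketched below: the acceleration along the bore is low-pass filtered and integrated into a slow forward translation applied to the virtual camera. The filter constant and sample interval are assumptions, and a practical implementation would need drift correction and an interface to the game engine.

```python
class TableMotionFollower:
    """Turn table accelerometer samples into a virtual-camera translation."""

    def __init__(self, dt, smoothing=0.9):
        self.dt = dt                  # sample interval in seconds (assumed)
        self.smoothing = smoothing    # exponential low-pass factor (assumed)
        self.velocity = 0.0
        self.position = 0.0
        self._accel = 0.0

    def update(self, accel_along_bore):
        # Low-pass filter the raw accelerometer reading, then integrate twice;
        # a real system would also correct the drift this integration accumulates.
        self._accel = (self.smoothing * self._accel
                       + (1.0 - self.smoothing) * accel_along_bore)
        self.velocity += self._accel * self.dt
        self.position += self.velocity * self.dt
        return self.position          # fed to a hypothetical camera-offset call

follower = TableMotionFollower(dt=0.01)
for accel in [0.0] * 10 + [0.2] * 100 + [0.0] * 50:   # illustrative table push
    camera_offset = follower.update(accel)
print(round(camera_offset, 3))   # forward offset applied to the virtual camera
```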

Similarly, the audio in the virtual environment provided to the subject may take into account audio stimuli occurring in the real environment surrounding the subject. The VR system may therefore include one or more audio sensors or microphones arranged to capture the audio signals the subject experiences from the surrounding environment. The VR system may be configured to match the audio and/or visual signals delivered to the subject through the VR output to the real-world motion and/or sound.

The system is configured to provide a virtual environment to the subject that encourages the subject to remain substantially stationary, thereby preventing or reducing movement of the subject while the MRI images are captured. Preventing gross movement of the subject's body or limbs facilitates the capture of useful images from the MRI scanner. The system may be configured to provide a virtual environment in which the subject naturally moves their eyes, rather than their whole head or body, in order to interact with the virtual environment.

Further review of MRI scanner compatibility

As described above, to maximize assurance that the field within the bore of the scanner is not disrupted, a system according to the present arrangement may include various mitigating features and methods. For example, VR system components located within the bore of the scanner may be constructed from materials that do not disrupt the field, and/or any magnetically or electrically disruptive components may be shielded or positioned within the bore so as to minimize magnetic or electrical disruption. In particular, a shielded active display device, such as a liquid crystal screen, may be used in conjunction with the patient's head device, making it possible to place the subject in the virtual environment while they are being prepared for the MRI scanner and to use the system to disguise or mask the subject's entry into the narrow bore of the MRI scanner. With respect to components of the VR system placed close to the subject's head, the optics of the system may be arranged, positioned, or configured so as not to be too close to the head, thereby avoiding signal loss and distortion in head imaging. As described above, one or more sensors for tracking eye movement, such as a camera, may be located sufficiently far from the subject to reduce electromagnetic interference in the MRI images to be captured. Such cameras may also be shielded MRI compatible cameras to avoid electromagnetic interference. Fixing or providing the elements of the VR system so that they remain substantially stationary around the subject may allow the subject to remain immersed, undisturbed, in the virtual environment.

Of course, the processing equipment (computers and similar devices) required to create and maintain the virtual reality environment may remain outside the bore of the scanner, while the system components necessary to deliver the experience to the subject are sized to fit within the bore of the scanner.

Further review on functional MRI

As previously mentioned, the subject's head motion is relatively limited when in the head coil of an MRI scanner. Indeed, limiting head movement is necessary to ensure that the images obtained from the system are sharp and useful. Monitoring the motion of the subject's pupil or another identifiable eye feature (e.g., the position of the eye corners) may allow the system to estimate the likely gross motion of the subject's head, and the inferred motion can then be provided to the MRI image capture system so that appropriate corrections can be made to the resulting captured MRI images.
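A rough sketch of how an eye-feature displacement could be converted into a head-motion estimate for the image-correction side is given below; the camera focal length, pixel pitch, and the simple small-angle model are illustrative assumptions, and the resulting record would in practice be passed to the scanner or reconstruction pipeline.

```python
import math
import time

def angular_displacement(pixel_shift, focal_length_mm, pixel_pitch_mm):
    """Approximate rotation angle (radians) implied by a feature shift in pixels."""
    return math.atan2(pixel_shift * pixel_pitch_mm, focal_length_mm)

def report_head_motion(reference_px, current_px, focal_length_mm=8.0,
                       pixel_pitch_mm=0.006):
    """Estimate small head rotations about two axes from eye-corner motion."""
    dx = current_px[0] - reference_px[0]
    dy = current_px[1] - reference_px[1]
    yaw = angular_displacement(dx, focal_length_mm, pixel_pitch_mm)
    pitch = angular_displacement(dy, focal_length_mm, pixel_pitch_mm)
    # In a real system this record would be sent to the scanner/reconstruction
    # pipeline for prospective or retrospective motion correction.
    return {"t": time.time(), "yaw_rad": yaw, "pitch_rad": pitch}

print(report_head_motion(reference_px=(312, 208), current_px=(318, 205)))
```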

In any case, within the VR system, monitoring the movement of the subject's eyes allows the visual material provided to the subject to be adjusted while the system is in use. Further, the movement of the subject's eyes can be used as the primary mode of interaction with the VR environment, so that the choices made by the subject in the VR environment are driven by their eye movements. The present arrangement may provide information about the virtual environment and the VR system, for example information about the options or selections made by the subject, to the MRI system. Studies of functional systems within the subject's brain may thus be facilitated.

Further, the VR system may be configured to allow the subject to interact with the system additionally through speech control and/or through minor hand movements (e.g., pressing buttons, or via tracking of hand or finger movements). It should be appreciated that choices made by the subject in the VR environment may be affected by such verbal or manual interaction, and some arrangements may therefore provide the MRI system with information about the virtual environment and the VR system, as well as, for example, information about the options or choices made by the subject. Studies of functional systems within the subject's brain, for example those engaged by the need to make a selection, produce speech, or move a limb, may thus be facilitated.

Further review on eye tracking

As previously described, embodiments provide a mechanism for a subject located within the bore of a scanner to interact with the virtual environment while discouraging gross motor actions. The eye tracking technique according to the embodiments allows the subject to exercise gaze control, such that gaze serves as the primary input. Embodiments may also allow visual flow tracking. The system may be configured to enable progressive, adaptive calibration of the subject's gaze. Some embodiments therefore operate such that, when a subject makes a selection using gaze control and/or provides an input to the VR system, the detected pupil and/or head position is correlated with a known "target" position, i.e., the feature in the virtual environment with which the subject is interacting. The correlation between the detected pupil position and the target can be used in a system calibration step, allowing the system to update the eye tracking model used to convert the subject's pupil position to a point of gaze. The monitoring of eye, pupil, and/or head position and its correlation with target positions may be repeated throughout the subject's interaction with the VR environment. This continuous calibration helps to achieve a robust and stable gaze tracking system.
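The continuous calibration described above might be realised along the lines of the sketch below: whenever a gaze-driven selection confirms that the subject was looking at a known target, the (pupil feature, target position) pair is added to a rolling buffer and a simple affine mapping is refitted. The affine model and buffer size are assumptions for illustration, not the exact model of the described system.

```python
import numpy as np
from collections import deque

class AdaptiveGazeCalibration:
    """Refit a simple affine pupil-to-screen mapping as new evidence arrives."""

    def __init__(self, max_pairs=50):
        self.features = deque(maxlen=max_pairs)   # pupil-corner feature vectors
        self.targets = deque(maxlen=max_pairs)    # known on-screen positions
        self.coeffs = None                        # (3, 2) affine coefficients

    def add_confirmed_fixation(self, feature, target_position):
        """Called when a gaze-driven selection confirms the point of regard."""
        self.features.append(feature)
        self.targets.append(target_position)
        if len(self.features) >= 4:               # enough points to refit
            A = np.column_stack([np.ones(len(self.features)),
                                 np.asarray(self.features)])
            self.coeffs, *_ = np.linalg.lstsq(A, np.asarray(self.targets),
                                              rcond=None)

    def estimate(self, feature):
        if self.coeffs is None:
            return None
        return np.concatenate(([1.0], feature)) @ self.coeffs

calib = AdaptiveGazeCalibration()
for f, t in [((-0.1, 0.05), (0.1, 0.2)), ((0.1, 0.05), (0.9, 0.2)),
             ((-0.1, -0.05), (0.1, 0.8)), ((0.1, -0.05), (0.9, 0.8))]:
    calib.add_confirmed_fixation(f, t)
print(calib.estimate((0.0, 0.0)))   # roughly the centre of the screen
```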

Immersion and configuration of the virtual environment

It may be beneficial to provide a configuration that presents the subject with an immersive virtual environment. In other words, one or more of the subject's senses may be deceived into accepting that the virtual environment has replaced the physical environment surrounding the subject (i.e., the MRI scanner). In this regard, the system may allow various sensory issues to be addressed by appropriate configuration of the virtual environment provided to the subject. The senses principally engaged in MRI scanner applications are vision, hearing, and touch. Various methods may be provided to ensure consistency of the subject's experience between the virtual environment and any aspect of the real physical environment that may be apparent to the subject.

Visual input

In some arrangements, the entire visual experience of the subject is provided by the virtual environment. That is, the subject's entire field of view may be filled by the virtual environment. Any areas not belonging to the virtual environment are obscured or blocked so that the subject cannot see the physical environment around them. In this regard, the system may reduce or prevent inadvertent exposure to peripheral visual cues that could make the subject aware of the surrounding real-world environment.

The present arrangement recognizes that various physical difficulties arise in providing a suitable VR environment for a subject located within the bore of an MRI scanner. In particular, typical VR headsets sit very close to, and surround, the user's head. Many systems, even those that use a mobile phone as the screen, are mounted directly to the user's head or body, so the user's movements can easily be tracked using sensors (e.g., accelerometers) in the head device or phone. Placing the screen close to the user's eyes ensures that a suitable three-dimensional VR image is provided and that substantially the entire visual environment presented to the user can be controlled and accounted for. In a cinema or home theater scenario, by contrast, the screen providing a 3D image is typically located at a distance from the user, who wears special glasses, such as actively shuttered glasses or glasses with colored lenses, to perceive the three-dimensional image. Neither option can be used directly within the bore of an MRI scanner, because these items may cause local interference with the magnetic field and/or may cause interference or discomfort for a subject being scanned for a long period: it is undesirable to have a screen close to the subject's eyes as with a VR headset, and suitable glasses cannot be provided.

Some arrangements are configured to provide stereoscopic images to a subject positioned within an MRI scanner. The system may include a stereoscopic color filter positioned to allow the subject to view the stereoscopic image provided by the visual display device. The stereoscopic filter is spaced from the subject's eyes in the scanner, ensuring that the local static magnetic field is not disturbed and minimizing any disturbance caused to the subject by the system.

Audio input

One or more microphones may be provided to detect the sound the user is experiencing. The system may be configured to recognize noises typically experienced within the MRI environment and to provide a response within the virtual environment consistent with the sound experienced in the MRI scanner. The system may include one or more speakers to augment the subject's sound experience. Thus, the system may be configured to "overlay" virtual environment sound on sounds occurring in the real-world environment. In some arrangements, the microphones and speakers may be arranged so that real-world noise cancellation is performed within the virtual environment.

Motion

Sensors, for example motion sensors, which may be gyroscope and/or accelerometer type sensors, may be provided on the head unit or on the table on which the subject is positioned, so that any physical motion experienced by the subject in the surrounding physical environment can be accommodated and made consistent with the audio and video material provided to the subject. For example, vibrations experienced by the subject in the real world may cause a corresponding distortion or "vibration" of the visual and/or audio material provided to the subject in the virtual environment.

Further review of subject reassurance

The system may be configured to provide an indication or copy of the virtual environment being experienced by the subject within the bore of the MRI scanner to a user or viewer outside the bore. Providing this view of the virtual environment to, for example, a caregiver, supervisor, MRI scanner operator, or parent located outside the scanner allows audio interaction through the virtual environment between that person and the subject located within the bore of the scanner. Such audio interaction may help to reassure the subject within the scanner.

Head unit adaptation

Fig. 7 shows a user device 700 forming part of a VR system, which forms part of a head device or can be retrofitted to an existing head device of an MRI scanner, and which is dimensioned to be locatable within the bore of the MRI scanner. The user device may be composed of components that are compatible with MRI scanning and that minimize interference with the MRI field. As shown in FIG. 7, the user device 700 may be positioned on a head coil of a typical MRI scanner. An arrangement may be provided in which the user device 700 is integrally formed with the head device of the MRI scanner. The user device 700 may be removably separable from the rest of the MRI scanner, allowing the VR environment to be provided to the subject before (e.g., in preparation for) entering the MRI scanner.

Fig. 8 is an isometric view of some of the major components forming the internal structure of the user device shown in FIG. 7. The user device 800 is located on the MRI scanner's head device 860. The user device is positioned over the subject's eye opening 850 so that the visual input provided to the subject can be controlled. In the example shown, the VR system is configured to project an image through the diffuser screen 810 onto a reflector 820, which directs light toward a subject located within the head device 860. The user device 800 includes a barrier 870 that separates the view of each of the subject's eyes and allows the visual input from the system to each eye to be controlled. This eye-by-eye visual control may be particularly useful when using anaglyph stereo techniques, as filters and/or lenses appropriate for each eye can be placed in the stand 830. The user device 800 also includes a stand 840 in which a camera or other sensor may be placed to enable monitoring of the subject's eye movements through the eye opening 850.

References

[1] Kazemi, Vahid, and Josephine Sullivan. "One millisecond face alignment with an ensemble of regression trees." Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2014.

[2] Tobii Technology (2015). "Tobii Accuracy and Precision Test Method for Remote Eye Trackers." https://stemedhub.org/resources/3310.

[3] Bohil, Corey J., Bradly Alicea, and Frank A. Biocca. "Virtual reality in neuroscience research and therapy." Nature Reviews Neuroscience 12.12 (2011): 752.
