System and method for controlling operating room display using augmented reality headset

Document No.: 1805823 · Publication date: 2021-11-09

Note: This technology, "System and method for controlling operating room display using augmented reality headset," was created by J·W·库普 on 2021-05-08. Its main content includes: Augmented Reality (AR) systems and methods relating to an interactive Head Mounted Device (HMD), an external display, and a medical image computer in communication with the HMD and the external display. The external display displays one or more planes of a medical image or 3D model provided by the medical image computer. A user wearing the HMD may manipulate the medical image or 3D model displayed on the external display by focusing the user's gaze on a control object and/or a portion of the medical image or 3D model displayed on a display of the interactive HMD.

1. A method, comprising:

displaying the scan information on an external display;

displaying, on a display of an Augmented Reality (AR) Head Mounted Device (HMD), at least one plane of scan information based on the scan information displayed on the external display;

monitoring a location of a focus of a user's gaze; and

updating the scan information on the external display based on the monitored movement of the location of the focus of the user's gaze.

2. The method of claim 1, further comprising:

displaying a control object outside the scan information displayed on the AR HMD;

detecting that the user's gaze is focused on the control object;

in response to detecting that the user's gaze is focused on the control object, determining that a characteristic associated with the user's gaze satisfies a predetermined condition; and

in response to determining that a characteristic associated with the user's gaze satisfies a predetermined condition, recording a selection of the control object.

3. The method of claim 2, wherein determining that a characteristic associated with the user's gaze satisfies a predetermined condition comprises determining that the user's gaze remains focused on the control object for greater than a predetermined period of time.

4. The method of claim 2, wherein determining that a characteristic associated with the user's gaze satisfies a predetermined condition comprises determining that the user blinks for greater than a predetermined period of time.

5. The method of claim 4, further comprising, in response to recording the selection of the control object, displaying at least one plane of scan information in place of the control object displayed on the display of the AR HMD.

6. The method of claim 1, further comprising continuously displaying at least one plane of scan information on the display of the AR HMD.

7. The method of claim 6, further comprising determining a number of planes of scan information currently displayed on the external display,

wherein the at least one plane of scan information is displayed on the display of the AR HMD based on the determined number of planes of scan information.

8. The method of claim 6, further comprising determining a number of planes of scan information currently available for display,

wherein the at least one plane of scan information is displayed on the display of the AR HMD based on the determined number of planes of scan information.

9. The method of claim 1, wherein the scan information comprises Computed Tomography (CT) scan information, Magnetic Resonance Imaging (MRI) scan information, Positron Emission Tomography (PET) scan information, or any combination thereof.

10. The method of claim 2, wherein the control object is a virtual joystick, a scroll bar, a button, or a selectable icon.

11. The method of claim 1, further comprising:

detecting that the user's gaze is focused on a scan information plane; and

in response to detecting that the user's gaze is focused on the scan information plane, highlighting the scan information plane and displaying a control object associated with the highlighted scan information plane.

12. The method of claim 11, wherein highlighting the scan information plane comprises changing a fill color, a fill pattern, a border color, or a border pattern of the scan information plane.

13. The method of claim 11, wherein displaying the control object associated with the highlighted scan information plane comprises displaying movement direction control buttons on opposite sides of the highlighted scan information plane, and

wherein the movement direction control buttons include up and down control buttons or forward and backward control buttons.

14. The method of claim 11, further comprising:

detecting that the user's gaze is focused on a control object; and

in response to detecting that the user's gaze is focused on the control object, determining a selection of the control object and moving the highlighted scan information plane in a direction associated with the selected control object.

15. The method of claim 11, wherein the highlighted scan information plane moves at a predetermined rate, the method further comprising:

determining a length of time that the user's gaze is detected to be focused on a control object; and

increasing a rate of movement of the highlighted scan information plane based on the determined length of time,

wherein increasing the movement rate comprises increasing the movement rate linearly or in predetermined steps.

16. The method of claim 1, wherein displaying scan information comprises displaying a three-dimensional (3D) model constructed based on a medical scan, the method further comprising:

detecting that the user's gaze is focused on a portion of the 3D model; and

in response to detecting that the user's gaze is focused on the portion of the 3D model, displaying a scan information plane corresponding to the portion of the 3D model.

17. The method of claim 2, wherein the control object is a slider of a scroll bar, further comprising:

detecting a movement of the user's gaze to another location on the scroll bar; and

displaying a scan information plane corresponding to the other location on the scroll bar.

18. A method, comprising:

receiving scan information from a medical image computer in communication with an external display;

displaying at least one plane of the scan information;

displaying at least one control object associated with the at least one plane of scan information;

detecting that a user's gaze is focused on the at least one control object;

in response to detecting that the user's gaze is focused on the control object, sending a first control message to the medical image computer to change the display of the scan information on the external display;

receiving a second control message from the medical image computer to change the display of the at least one plane of the scan information; and

changing the display of the at least one plane of scan information based on the second control message.

19. The method of claim 18, wherein the control message comprises an instruction to change a position, orientation, or size of the scan information, or an instruction to hide or show at least a portion of the scan information.

20. A system, comprising:

an operating room monitor configured to display scan information; and

an interactive head-mounted device in communication with the operating room monitor, the interactive head-mounted device comprising:

an optical assembly configured to display a medical image and to enable viewing of at least a portion of a surrounding environment;

an eye tracking device configured to track a location of a focal point of a user's gaze; and

a processor in communication with the optical assembly and the eye tracking device; and

a memory configured to store an application program that, when executed by the processor, causes the processor to:

display scan information on the operating room monitor;

display at least one plane of scan information on the interactive head-mounted device based on the scan information displayed on the operating room monitor;

monitor a location of a focus of a user's gaze;

update the scan information on the operating room monitor based on the monitored movement of the location of the focus of the user's gaze;

determine that the scan information is changed on the operating room monitor; and

in response to determining that the scan information is changed on the operating room monitor, update the at least one plane of scan information displayed on the interactive head-mounted device based on the scan information changed on the operating room monitor.

Technical Field

The present disclosure relates to the field of operating room devices, and more particularly to controlling an operating room display using augmented reality headsets.

Background

During surgery, a patient scan (e.g., a Computed Tomography (CT) scan, a Magnetic Resonance Imaging (MRI) scan, or a Positron Emission Tomography (PET) scan) is typically displayed on a monitor in an Operating Room (OR) to guide a surgeon through the surgery, e.g., to guide a catheter tip to a target identified in the patient scan. Often, the surgeon needs to manipulate the patient scan shown on the OR monitor, for example, to move between slices of the patient scan or to center a target in the display in order to better view the relevant portion of the patient's anatomy. When a surgeon is participating in a surgery, it is difficult for the surgeon to manipulate the patient scan on the OR monitor to focus on the relevant portion of the patient volume during the surgery. Accordingly, there is a need for methods and systems that allow a surgeon to manipulate a patient scan on a display without having to perform excessive steps beyond the surgical steps.

Disclosure of Invention

The technology of the present disclosure relates generally to controlling an operating room display using augmented reality headsets so that a surgeon participating in an operation can easily, accurately, and aseptically manipulate patient scan information displayed on the operating room display.

In one general aspect, the disclosure features a method that includes: displaying the scan information on an external display; displaying at least one plane of scan information on a display of an Augmented Reality (AR) Head Mounted Device (HMD) based on the scan information displayed on the external display; monitoring a location of a focus of a user's gaze; and updating the scan information on the external display based on the monitored movement of the location of the focus of the user's gaze. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.

Implementations may include one or more of the following features. The method may include displaying a control object in addition to the scan information displayed on the AR HMD; detecting that a user's gaze is focused on the control object; in response to detecting that the user's gaze is focused on the control object, determining that a characteristic associated with the user's gaze satisfies a predetermined condition; and in response to determining that the characteristic associated with the user's gaze satisfies the predetermined condition, recording a selection of the control object. Determining that the characteristic associated with the user's gaze satisfies the predetermined condition may include determining that the user's gaze remains focused on the control object for greater than a predetermined period of time. Determining that the characteristic associated with the user's gaze satisfies the predetermined condition may include determining that the user blinks for greater than a predetermined period of time.

The method may include, in response to recording the selection of the control object, displaying at least one plane of scan information in place of the control object displayed on the display of the AR HMD. The control object may be a virtual joystick, a scroll bar, a button, or a selectable icon. The control object may be a slider of a scroll bar. The method may include detecting a movement of the user's gaze to another location on the scroll bar and displaying a scan information plane corresponding to the other location on the scroll bar. The method may include continuously displaying at least one plane of scan information on the display of the AR HMD. The method may include determining a number of planes of scan information currently displayed on the external display, or a number of planes of scan information currently available for display, and displaying the at least one plane of scan information on the display of the AR HMD based on the determined number of planes.

The scan information may include Computed Tomography (CT) scan information, Magnetic Resonance Imaging (MRI) scan information, Positron Emission Tomography (PET) scan information, or any combination thereof. The method may include detecting that a user's gaze is focused on a scan information plane, and in response to detecting that the user's gaze is focused on the scan information plane, highlighting the scan information plane and displaying a control object associated with the highlighted scan information plane. Highlighting the scan information plane may include changing a fill color, a fill pattern, a border color, or a border pattern of the scan information plane. Displaying the control object associated with the highlighted scan information plane may include displaying movement direction control buttons on opposite sides of the highlighted scan information plane. The movement direction control buttons may include up and down control buttons or forward and backward control buttons.

The method may include detecting that a user's gaze is focused on a control object, and in response to detecting that the user's gaze is focused on the control object, determining a selection of the control object and moving the highlighted scan information plane in a direction associated with the selected control object. The highlighted scan information plane may be moved at a predetermined rate. The method may include determining a length of time that the user's gaze is detected to be focused on the control object, and increasing a rate of movement of the highlighted scan information plane based on the determined length of time. Increasing the movement rate may include increasing the movement rate linearly or in predetermined steps. Displaying the scan information may include displaying a three-dimensional (3D) model constructed based on the medical scan. The method may include detecting that a user's gaze is focused on a portion of the 3D model, and in response to detecting that the user's gaze is focused on the portion of the 3D model, displaying a scan information plane corresponding to the portion of the 3D model.

In another general aspect, the disclosure features a method that includes receiving scan information from a medical image computer in communication with an external display; displaying at least one plane of the scan information; displaying at least one control object associated with the at least one plane of scan information; detecting that a user's gaze is focused on the at least one control object; and in response to detecting that the user's gaze is focused on the control object, sending a first control message to the medical image computer to change the display of the scan information on the external display. The method also includes receiving a second control message from the medical image computer to change the display of the at least one plane of scan information, and changing the display of the at least one plane of scan information based on the second control message. Other embodiments of the general aspect include corresponding computer systems, apparatus, and computer programs, recorded on one or more computer storage devices, each configured to perform the actions of the methods.

Implementations may include the following features. The control message may include instructions to change the position, orientation, or size of the scan information, or to hide or show at least a portion of the scan information.

In another general aspect, the disclosure features a system that includes an operating room monitor that displays scan information and an interactive head-mounted device that communicates with the operating room monitor. The interactive head-mounted device includes an optical assembly that displays a medical image and enables viewing of at least a portion of a surrounding environment, an eye tracking device that tracks a position of a focal point of a user's gaze, a processor in communication with the optical assembly and the eye tracking device, and a memory storing an application program that, when executed by the processor, causes the processor to: display the scan information on the operating room monitor; display at least one plane of scan information on the interactive head-mounted device based on the scan information displayed on the operating room monitor; monitor a location of a focus of the user's gaze; update the scan information on the operating room monitor based on the monitored movement of the location of the focus of the user's gaze; determine that the scan information is changed on the operating room monitor; and in response to determining that the scan information is changed on the operating room monitor, update the at least one plane of scan information displayed on the interactive head-mounted device based on the changed scan information on the operating room monitor.

The details of one or more aspects of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the techniques described in this disclosure will be apparent from the description and drawings, and from the claims.

Drawings

Various exemplary aspects are depicted in the drawings. It will be appreciated that for simplicity and clarity of illustration, elements illustrated in the figures referenced below have not necessarily been drawn to scale. Further, where considered appropriate, reference numerals may be repeated among the figures to indicate like, corresponding or analogous elements. The figures are listed below.

FIG. 1 is a schematic block diagram of a system configured for use with the methods of the present disclosure;

FIGS. 2A-2C are schematic diagrams of displayed scan information, according to aspects of the present disclosure;

FIG. 3 is a block diagram of an augmented reality system according to aspects of the present disclosure;

FIG. 4 is a perspective view of a head mounted see-through display device according to aspects of the present disclosure;

FIG. 5 is a block diagram of an operating room display system according to aspects of the present disclosure; and

FIG. 6 is a flow chart of a method of controlling medical images displayed on an external display by using an AR HMD according to an aspect of the present disclosure.

Detailed Description

During surgery, a patient scan (e.g., a Computed Tomography (CT) scan, a Magnetic Resonance Imaging (MRI) scan, or a Positron Emission Tomography (PET) scan) or a three-dimensional (3D) model constructed based on the patient scan is typically displayed on a display (e.g., external monitor 140 of fig. 1) in an Operating Room (OR). Depending on the method used to display the patient scan or 3D model, one plane (e.g., the axial (A) plane 202 illustrated in fig. 2A) may be shown, or additional planes (e.g., the coronal (C) plane 204 and the sagittal (S) plane 206 illustrated in fig. 2A) may be shown.

Once the surgeon is participating in the procedure, the surgeon may not be able to easily, accurately, and aseptically manipulate the patient scan information displayed on the OR display. For example, the OR display may be covered with a sterile bag, which may affect the touchscreen capabilities and the clarity of the OR display as seen through the sterile bag, making it difficult for a surgeon to accurately or properly manipulate the touchscreen of the OR display to focus on, among other things, the relevant portion of the patient volume. The surgeon may have three options: use a sterilizable pointing device (e.g., a trackball or joystick), remove the surgical gown and manipulate the OR display directly, or instruct a non-sterile member of the surgical team to adjust the OR display. However, these options have the following disadvantages: sterilizable interfaces are not always available; the surgeon must return to the procedure after taking off the surgical gown and manipulating the OR display, which extends the procedure time; and providing verbal instructions to non-sterile members of the surgical team can be difficult and frustrating for the surgeon, depending on the knowledge and experience of those team members.

As shown in fig. 1, one aspect of the present disclosure allows a clinician 102, such as a surgeon, to directly control, hands-free, how one or more planes of a patient scan 145 are displayed on an external monitor 140 in an operating room. The clinician 102 wears an Augmented Reality (AR) headset 110 or any other suitable Head Mounted Device (HMD) with a frame 112 that incorporates eye tracking hardware (e.g., eye tracking sensors) and supports a lens 124 that includes a see-through headset display 116 for displaying a projection 118 of a patient scan 115. The patient scan 115 may be a modified version (e.g., a lower resolution version) of the patient scan 145 such that the patient scan 115 can be displayed as the projection 118 on the see-through headset display 116 of the AR headset 110.

A headset computer (e.g., headset computer 310 of fig. 3) integrated into the AR headset 110 monitors the current location or movement of the surgeon's gaze focus, updates the patient scan 115 on the headset display 116 based on the current location or movement of the surgeon's gaze focus, and provides scan display information (including display parameters associated with the updated patient scan 115) to the communication interface 135 of the external medical image computer 130. The processor 131 then executes instructions stored in the memory 132 of the external medical image computer 130 to cause the medical image computer 130 to change the displayed patient scan 145 based on the scan display information received from the AR headset 110.

The headset computer may employ any suitable method to place the projection 118 of the scan information (e.g., patient scan 115) so that it does not obstruct the surgical field of view or otherwise interfere with the clinician's ability to accurately view the surgical field of view. In one example approach, the projection 118 of the patient scan 115 is continuously displayed in a fixed position and/or orientation relative to the AR headset 110. For example, if the projection 118 of the patient scan 115 is displayed in the upper right corner of the see-through headset display 116, the projection 118 of the patient scan 115 is continuously displayed in the upper right corner of the see-through headset display 116 regardless of the position and/or orientation of the AR headset 110.

In another example approach, the projection 118 is anchored to a fixed location such that a user wearing the AR headset 110 must look in the general direction of the fixed location to see the projection 118. The AR headset 110 may include one or more sensors for determining the orientation of the AR headset 110. Thus, when a user wearing the AR headset 110 turns their head away from the fixed location, the projection 118 moves out of the user's field of view through the AR headset 110. When the user orients their head toward the fixed location, the projection 118 returns to the user's field of view through the AR headset 110. In one aspect, an icon pointing in the direction of the projection 118 may be displayed at the edge of the user's field of view to help the user relocate the projection 118 when it is outside the user's current field of view.
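
As a rough illustration of this anchoring behavior, the following sketch (in Python, with hypothetical names and an assumed cone-shaped field of view; none of this is specified in the disclosure) checks whether a world-anchored projection lies within the wearer's current field of view:

```python
import math

def projection_visible(head_pos, head_forward, anchor_pos, half_fov_deg=30.0):
    """Return True when a world-anchored projection lies within the wearer's
    field of view, approximated as a cone around the current look direction.

    head_forward is assumed to be a unit vector; the 30-degree half-angle is
    illustrative and would normally come from the display's calibrated FOV.
    """
    to_anchor = [a - h for a, h in zip(anchor_pos, head_pos)]
    norm = math.sqrt(sum(c * c for c in to_anchor)) or 1e-9
    to_anchor = [c / norm for c in to_anchor]
    dot = max(-1.0, min(1.0, sum(f * t for f, t in zip(head_forward, to_anchor))))
    return math.degrees(math.acos(dot)) <= half_fov_deg

# Example: looking straight ahead (-z) with the anchor slightly to the right.
# projection_visible([0, 0, 0], [0, 0, -1], [0.3, 0, -2.0])  -> True
```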

In aspects of the present disclosure, the headset display 116 may show one or more icons or buttons for controlling the display of scan information on the headset display 116 and the external monitor 140. One or more icons may be shown toward the edge of the display 116, outside of the main field of view, to avoid obscuring the surgeon's view of the surgical site. In one example display method, the one or more icons include an icon or other symbol, e.g., a thumbnail, representing the complete display shown in fig. 2A. Then, when the eye tracker 322 detects that the surgeon's gaze is focused on the icon, the icon is replaced by the full display of fig. 2A. In some aspects, the eye tracker 322 detects user selection of an icon by detecting a long pause of the user's gaze focused on the icon or a long blink of the user's eyes. Alternatively, the display of fig. 2A may be shown continuously without displaying an icon or other symbol representing it. The number of planes of the patient scan 115 displayed on the headset display 116 may be based on information from the medical image computer 130 to reflect those planes currently displayed on the OR monitor or available but not currently displayed. In aspects, the headset display 116 may display one or more hide/show icons that, when selected by the user's gaze, cause one or more planes or 3D models of the patient scan 115 to be shown or hidden.
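
The dwell-or-long-blink selection behavior described here might be implemented along the following lines; this is a minimal sketch with assumed sample fields and thresholds, not the disclosed eye tracker's actual API:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GazeSample:
    timestamp: float            # seconds
    target_id: Optional[str]    # icon/plane currently under the gaze focus
    eyes_closed: bool           # True while a blink is in progress

class DwellSelector:
    """Records a selection of a control object when the gaze dwells on it,
    or the user blinks, for longer than a threshold (both thresholds assumed)."""

    def __init__(self, dwell_s: float = 2.0, blink_s: float = 0.7):
        self.dwell_s = dwell_s
        self.blink_s = blink_s
        self._target = None         # object the gaze last rested on
        self._since = None          # time the gaze first landed on it
        self._blink_start = None

    def update(self, s: GazeSample) -> Optional[str]:
        if s.eyes_closed:
            # Long blink while a control object was under the gaze.
            self._blink_start = self._blink_start or s.timestamp
            if self._target and s.timestamp - self._blink_start >= self.blink_s:
                return self._fire()
            return None
        self._blink_start = None
        if s.target_id != self._target:
            self._target, self._since = s.target_id, s.timestamp
            return None
        if self._target and s.timestamp - self._since >= self.dwell_s:
            return self._fire()
        return None

    def _fire(self) -> Optional[str]:
        # Fire the selection once, then re-arm for the next target.
        selected, self._target, self._since = self._target, None, None
        return selected
```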

After the patient scan 200 shown in fig. 2A is displayed on the display of the AR headset 110, the surgeon selects a plane (e.g., plane 202, 204, or 206) for adjusting the patient scan 200. As further shown in fig. 2B and 2C, when the surgeon's field of view or gaze 212 is focused on a point 210 on the axial plane 202 for a predetermined period of time, the axial plane 202 is highlighted 224, indicating that the axial plane 202 has been selected by the surgeon. In other aspects, the plane may be selected by detecting a long pause in the movement of the surgeon's gaze 212 focus or by detecting a blink of a predetermined duration. Highlighting may include changing the fill color or pattern of the selected plane, or changing the border color or pattern (e.g., dots, dashes, or hashes) of the selected plane.

After a plane is selected, a forward control button 221 and a backward control button 222 are displayed in the headset display 116, as shown in fig. 2C. In some aspects, the selected plane may remain highlighted while the forward control button 221 and the backward control button 222 are displayed. Additionally or alternatively, up and down control buttons may be displayed in the headset display 116. In various aspects, the control buttons 221, 222 may operate as auto-repeat buttons that, when continuously selected by the surgeon's gaze, move the selected plane until the surgeon's gaze focus moves away from the control button 221, 222. When the surgeon's gaze 212 focus is on the forward control button 221 for a predetermined period of time or longer, the selected plane 202 moves in a forward direction, as indicated by the arrow displayed on the forward control button 221. Similarly, when the surgeon's gaze focus is on the backward control button 222 for a predetermined period of time or longer, the selected plane moves in the backward direction, as indicated by the arrow displayed on the backward control button 222. In some aspects, the forward control button 221 and the backward control button 222 may have different positions and/or orientations than those shown in fig. 2C. For example, the forward control button 221 and the backward control button 222 may be positioned adjacent to each other below the patient scan 115 or may be oriented parallel to the coronal plane 204.

The forward control button 221 and the backward control button 222 shown in fig. 2C are intended to be illustrative examples of control objects that may be controlled by the user's eyes via an eye tracker to manipulate the patient scan 115 or model displayed by the AR headset 110. For example, the control object may include a virtual joystick, selectable buttons or icons, and a scroll bar, which may be positioned and/or oriented in a manner suitable for manipulating the patient scan 115 or model displayed by the AR headset 110.

In some aspects, as the surgeon's gaze focus is continuously maintained on one of the control buttons 221, 222, the selected plane may first move at a slow rate and then accelerate up to a predetermined maximum rate. The acceleration may be linear or may occur in predetermined steps (e.g., 1x, 2x, 5x, 10x), where the focus time required for each step may be a set amount (e.g., each five seconds of continued focus increases the rate by one step). The selected plane in fig. 2C may be moved accordingly to provide visual feedback to the surgeon regarding, for example, the direction of movement and the current position within the patient's body.
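
A small sketch of such stepped acceleration follows, using assumed step multipliers, base rate, and interval (the disclosure only gives 1x/2x/5x/10x and "five seconds" as examples):

```python
def scroll_rate(focus_s, base_rate=1.0, steps=(1, 2, 5, 10), step_interval_s=5.0):
    """Slices-per-second rate while the gaze remains on a direction button:
    step up one multiplier for every step_interval_s of continued focus."""
    idx = min(int(focus_s // step_interval_s), len(steps) - 1)
    return base_rate * steps[idx]

def advance_plane(position, direction, focus_s, dt, num_slices):
    """Move the selected plane forward/backward by the current rate over dt
    seconds, clamped to the scan volume (position kept as a float slice index)."""
    sign = 1.0 if direction == "forward" else -1.0
    new_pos = position + sign * scroll_rate(focus_s) * dt
    return max(0.0, min(float(num_slices - 1), new_pos))

# Example: after 12 s of continued focus the rate is 5 slices/s.
# advance_plane(40.0, "forward", focus_s=12.0, dt=0.1, num_slices=300) -> 40.5
```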

Referring now to fig. 3, because of the low resolution of the headset display 326, it may not be practical for the headset computer 310 to display medical images on the headset display 326. Additionally, the contrast of the headset display 326 may be insufficient to block background interference (e.g., background light from outside the headset 305 behind the headset display 326) and to overcome glare such as from operating room lights. To address these issues, when the system is initially started up, the medical image computer 330 notifies the headset computer 310 of the scan location information (e.g., available planes, displayed planes, and the current location of each plane) currently displayed on the OR display or other external display. The headset computer 310 then constructs a partial view of the scan information currently displayed on the external monitor 340, which includes at least a representation of the scan information currently displayed on the external monitor 340.

Once the partial view is constructed, the headset computer 310 may transmit control messages or commands to the medical image computer 330 based on user input or commands provided by the user's eyes via the eye tracker 322. For example, the headset computer 310 may transmit commands to the medical image computer 330 to scroll through the displayed scan information and add or delete planes of the displayed scan information. The medical image computer 330 then updates the external monitor 340 accordingly. As an external user (e.g., a user not wearing AR headset 305) scrolls the displayed scan information using medical image computer 330, the new scan information may be sent to headset computer 310 in real-time to update the corresponding scan information displayed by headset display 326.
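
The exchange of control messages between the two computers could be sketched as follows; the JSON fields, port, and framing are assumptions for illustration, since the disclosure does not define a wire format:

```python
import json
import socket

def make_control_message(action, plane=None, delta=None):
    """First control message: ask the medical image computer to change what
    the external display shows (e.g., scroll a plane, add or hide a plane)."""
    return {"type": "control", "action": action, "plane": plane, "delta": delta}

def send_control_message(msg, host="127.0.0.1", port=5555):
    """Send the message and return the reply (the 'second control message')
    telling the headset how to update its own copy of the scan information."""
    with socket.create_connection((host, port), timeout=2.0) as sock:
        sock.sendall((json.dumps(msg) + "\n").encode("utf-8"))
        reply = sock.makefile("r", encoding="utf-8").readline()
    return json.loads(reply) if reply else None

# Example: scroll the axial plane forward by one slice on the external display.
# update = send_control_message(make_control_message("scroll", plane="axial", delta=1))
```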

In aspects, the user input or command includes gazing at a button for switching between views, e.g., switching between a view including the 3D model and a view including one or more medical scans (e.g., CT or fluoroscopic scans). A view may be a window. If two or more views are displayed simultaneously, e.g., a 3D model in one view and one or more 2D medical images in one or more other views, the user may select a view to manipulate, e.g., rotate or zoom, by gazing at the view for a predetermined period of time. Additionally or alternatively, the user input may include moving the user's gaze in a particular direction, which, when detected by the eye tracker 322, may cause the view to change.

In aspects, the user's gaze and/or movement of the user's gaze is used to manipulate, e.g., scroll, zoom, or rotate, a 3D model or one or more medical scans. For example, the user may gaze at a slider of a scroll bar or zoom bar displayed to the user through the AR headset 305 for a predetermined period of time to grab the slider, and then the user may move the user's gaze to a desired location on the scroll bar or zoom bar to achieve a desired amount of scrolling or zoom. To release the slider, the user may quickly move their gaze away from the slider in a particular direction, e.g., in a direction substantially perpendicular to the orientation of the scroll bar or zoom bar.
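
A state-machine sketch of this grab/scroll/release interaction, expressed in normalized slider coordinates (the axis convention, dwell time, and release threshold are all assumptions):

```python
class GazeSlider:
    """Scroll or zoom slider driven by gaze: dwell on the slider to grab it,
    follow the gaze along the bar, release on a quick move off the bar in the
    perpendicular direction."""

    def __init__(self, grab_dwell_s=1.5, release_offset=0.15):
        self.grab_dwell_s = grab_dwell_s      # dwell needed to grab the slider
        self.release_offset = release_offset  # perpendicular distance that releases it
        self.grabbed = False
        self.value = 0.0                      # normalized position along the bar, 0..1
        self._dwell_start = None

    def update(self, along, across, on_slider, timestamp):
        """along/across: gaze position parallel/perpendicular to the bar,
        normalized so the bar spans 0..1 along and is centered at 0 across."""
        if not self.grabbed:
            if on_slider:
                self._dwell_start = self._dwell_start or timestamp
                if timestamp - self._dwell_start >= self.grab_dwell_s:
                    self.grabbed = True       # slider now follows the gaze
            else:
                self._dwell_start = None
            return self.value
        if abs(across) > self.release_offset:  # quick move off the bar releases it
            self.grabbed = False
            self._dwell_start = None
            return self.value
        self.value = max(0.0, min(1.0, along))  # slider follows the gaze
        return self.value
```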

As another example, the user may gaze at a location on the AR view of the 3D model (e.g., the 3D model of the lung) for a predetermined period of time to grasp the location of the AR view of the 3D model. The user may then move the user's gaze in a desired direction to rotate the AR view of the 3D model in that direction. In some aspects, a user may gaze at a location in an AR view of a 3D model for a predetermined period of time, such that a 2D medical image corresponding to the location is displayed in the same and/or another AR view.

A manipulation mode for manipulating the medical image or view, for example, zooming or rotating the medical image or view, may be enabled when the user gazes at a button for a predetermined period of time, or when the user moves the user's gaze in a particular manner, for example, moving the gaze in one direction by greater than a predetermined distance and then moving the gaze in another direction by greater than the predetermined distance. When the manipulation mode is enabled, the AR headset 305 may display a menu of items, e.g., buttons and/or a slider bar.

In some aspects, the AR headset 305 may display a virtual controller interface with one or more joysticks and/or buttons (e.g., directional buttons) that may be controlled by the user's gaze to control navigation of the robotic medical device (e.g., flexible robotic endoscope). In other aspects, the user may control the navigation of the robotic medical device by gazing at a particular destination in the 3D model displayed by the AR headset 305. The user may enable such a navigation mode by gazing at a button displayed by the AR headset 305.

The systems and methods of the present disclosure may include display and control of other medical information. For example, pre-operative plans, surgical checklists, vital signs, medical device status (e.g., RF generator settings), insufflation status, or any combination of this information may be displayed on the external monitor 340.

FIG. 4 shows an exemplary Head Mounted Display (HMD) 400 in the form of wearable glasses with a see-through display 402. For example, HMD 400 may be a non-limiting example of the AR headset 110 of fig. 1 and/or the AR headset 305 of fig. 3. HMD 400 may take any other suitable form in which a transparent, translucent, and/or non-transparent display is supported in front of one or both eyes of a viewer. Further, embodiments described herein may be used with any other suitable computing device, including but not limited to mobile computing devices, laptop computers, desktop computers, tablet computers, other wearable computers, and the like.

The HMD 400 includes a see-through display 402, a controller 404, and a memory 412 connected to the controller 404. The controller 404 may be configured to perform various operations related to eye gaze detection or tracking, user input recognition, visual presentation of augmented reality medical images on the see-through display 402, and other operations described herein.

The see-through display 402 may enable images, such as augmented reality images (also referred to as augmented images or holograms), to be delivered to the eyes of the wearer of the HMD 400. The see-through display 402 may be configured to visually enhance the appearance of the real-world physical environment to a wearer viewing the physical environment through the see-through display 402. Any suitable mechanism may be used to display the image via the see-through display 402. For example, see-through display 402 may include an image-producing element (e.g., a see-through Organic Light Emitting Diode (OLED) display) located within lens 406. As another example, see-through display 402 may include a display device (e.g., a Liquid Crystal On Silicon (LCOS) device or an OLED microdisplay) located within the frame of HMD 400. In this example, the lens 406 may function as or otherwise include an optical waveguide for transferring light from the display device to the eye of the wearer. Such optical waveguides may enable a wearer to perceive a 3D holographic medical image located within a physical environment (e.g., an operating room) that the wearer is viewing, while also allowing the wearer to directly view physical objects in the physical environment, creating a mixed reality environment. Additionally or alternatively, the see-through display 402 may present the left-eye and right-eye augmented reality images via respective left-eye and right-eye displays.

The HMD 400 may also include various sensors and related systems to provide information to the controller 404. Such sensors may include, but are not limited to, one or more inward facing image sensors 408A and 408B, and one or more outward facing image sensors 410A and 410B. One or more inward facing image sensors 408A, 408B may be configured to acquire image data from the wearer's eyes in the form of gaze tracking data (e.g., sensor 408A may acquire image data of one eye of the wearer and sensor 408B may acquire image data of the other eye of the wearer). The controller 404 of the HMD 400 may be configured to determine a gaze direction of each eye of the wearer in any suitable manner based on information received from the image sensors 408A, 408B. For example, one or more light sources 418A, 418B, such as infrared light sources, may be configured to reflect glints from the cornea of each eye of the wearer. The one or more image sensors 408A, 408B may then be configured to capture an image of the wearer's eyes.

The controller 404 may determine the optical axis of each eye using images of the glints and pupils determined from the image data collected by the image sensors 408A, 408B. Using this information, the controller 404 may be configured to determine the direction in which the wearer gazes (also referred to as the gaze vector). The controller 404 may be configured to additionally determine the identity of the physical and/or virtual object at which the wearer is gazing by projecting the user's gaze vector onto a 3D model of the surrounding environment. The one or more light sources 418A, 418B, the one or more inward facing image sensors 408A, 408B, and the controller 404 may collectively represent a gaze detector configured to determine a gaze vector of an eye of a wearer of the HMD 400.
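
In simplified geometric terms, an optical-axis gaze vector and its projection onto a plane of the surrounding 3D model could be computed as below, assuming the cornea and pupil centers have already been reconstructed in a common 3D frame (the reconstruction from glint and pupil images is not shown):

```python
import numpy as np

def gaze_vector(cornea_center, pupil_center):
    """Optical-axis approximation: unit vector from the cornea center through
    the pupil center, both expressed in the same 3D coordinate frame."""
    v = np.asarray(pupil_center, dtype=float) - np.asarray(cornea_center, dtype=float)
    return v / np.linalg.norm(v)

def gaze_hit_on_plane(eye_pos, gaze_dir, plane_point, plane_normal):
    """Point where the gaze ray intersects a plane of the 3D environment model,
    or None when the ray is parallel to the plane or points away from it."""
    denom = float(np.dot(gaze_dir, plane_normal))
    if abs(denom) < 1e-9:
        return None
    t = float(np.dot(np.asarray(plane_point, dtype=float) - np.asarray(eye_pos, dtype=float),
                     plane_normal)) / denom
    return None if t < 0 else np.asarray(eye_pos, dtype=float) + t * np.asarray(gaze_dir, dtype=float)
```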

In other implementations, different types of gaze detectors/sensors may be employed in the HMD 400 to measure one or more gaze parameters of the user's eyes. Examples of gaze parameters measured by the one or more gaze sensors that may be used by the controller 404 to determine the eye gaze samples may include eye gaze direction, head orientation, eye gaze velocity, eye gaze acceleration, angular change in eye gaze direction, and/or any other suitable tracking information. In some implementations, eye gaze tracking may be recorded independently for both eyes of a wearer of the HMD 400. In one aspect, the controller 404 may determine which of the user's left eye and right eye is the dominant eye. Such user-specific properties relating to the user's eyes can be used to improve the robustness and accuracy of eye tracking. For example, eye tracking may place more weight on eye tracking information obtained from the dominant eye.
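
For instance, per-eye gaze vectors might be combined with extra weight on the dominant eye, as in the following sketch (the weighting scheme and values are assumptions, not taken from the disclosure):

```python
def combined_gaze(left_vec, right_vec, dominant="left", dominant_weight=0.7):
    """Weighted average of the two per-eye gaze vectors, re-normalized,
    giving more influence to the wearer's dominant eye."""
    w_left = dominant_weight if dominant == "left" else 1.0 - dominant_weight
    w_right = 1.0 - w_left
    combined = [w_left * l + w_right * r for l, r in zip(left_vec, right_vec)]
    norm = sum(c * c for c in combined) ** 0.5 or 1e-9
    return [c / norm for c in combined]
```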

Reference is now made to fig. 5, which is a block diagram of an operating room display system configured for use with the methods of the present disclosure, including the method of fig. 6. The operating room display system includes an Augmented Reality (AR) Head Mounted Device (HMD) 510, a medical image computer 530 (also referred to as a workstation or operating room computer), and an Operating Room (OR) display 540. The medical image computer 530 may be a stationary computing device such as a personal computer or a portable computing device such as a tablet computer.

The HMD 510 includes control circuitry 505 that may communicate with power management circuitry (not shown) to manage power distribution and control of the components of the HMD 510. The control circuit 505 includes a processor 511, a memory controller 518 in communication with a memory 528 (e.g., D-RAM), a camera interface 512, a camera buffer 513, a light driver 514 for driving an eye tracking light 524, and a display driver 516 for driving an HMD display 526. In one aspect, all components of control circuit 505 communicate with each other via dedicated lines of one or more buses or using a shared bus. In another aspect, each of the components of the control circuit 505 is in communication with the processor 511.

The medical image computer 530 includes a processor 531, a memory 532, a display driver 533 for driving the operation of the OR display 540, a medical image database 534, and a communication interface 535 to enable communication of medical images to the control circuit 505 of the HMD 510. The medical image computer 530 may optionally be connected to an imaging device, such as a Computed Tomography (CT) scanner, a Magnetic Resonance Imaging (MRI) scanner, a Positron Emission Tomography (PET) scanner, or a fluoroscope. The imaging device may be connected to the medical image computer 530 directly or indirectly, for example, by wireless communication. The processor 531 may include one or more processors. The memory 532 may store one or more applications, and the medical image database 534 may store medical image data. The one or more applications may include instructions executable by the processor 531 for performing the methods of the present disclosure, including a portion of the steps of the method of fig. 6.

In some aspects, the medical image computer 530 may include an input device (not shown), which may be any device through which a user may interact with the medical image computer 530, such as a mouse, keyboard, foot pedal, touch screen, and/or voice interface. The medical image computer 530 may also include an output module (not shown). The output module may include any connection port or bus, such as a parallel port, serial port, Universal Serial Bus (USB), or any other similar connection port known to those skilled in the art.

The communication interfaces 515, 535 of the HMD 510 and medical image computer 530 may be configured to connect to a network, such as a Local Area Network (LAN), a Wide Area Network (WAN), a wireless mobile network, a bluetooth network, a cellular network, and/or the internet, comprised of wired and/or wireless networks. The communication interfaces 515, 535 may be used to establish a connection between the HMD 510 and the medical image computer 530. The communication interfaces 515, 535 may also be used to receive scan information including medical image data from the imaging device.

It will be appreciated by those skilled in the art that the memory 528, 532 may be any medium accessible by the processor 511, 531. That is, the media may include non-transitory, volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. For example, the memory 532 may include RAM, ROM, EPROM, EEPROM, flash memory or other solid state memory technology, CD-ROM, DVD, Blu-ray or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the processor 531.

Eye tracking camera 522 may be used to detect eye elements such as the corneal center, the center of eyeball rotation, and the pupil center of each eye. Based on such information, and/or other information obtained using eye-tracking camera 522, the positions of the user's left and right eyes, including the interpupillary distance between the left and right eyes, may be determined. In addition, the vertical position of the left and right eyes relative to HMD 510 and to each other may be determined. Processor 511 may determine (e.g., calculate) the positions of the user's left and right eyes based on the images and/or other information obtained by eye-tracking camera 522.
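
These per-eye measurements reduce to simple geometry once both eye positions are known in the headset frame; a minimal sketch, with the coordinate conventions assumed:

```python
import math

def interpupillary_distance(left_eye_pos, right_eye_pos):
    """Euclidean distance between the estimated left and right eye positions
    (e.g., pupil centers) expressed in the headset's coordinate frame."""
    return math.dist(left_eye_pos, right_eye_pos)

def vertical_offset(left_eye_pos, right_eye_pos, up_axis=1):
    """Vertical position of the left eye relative to the right eye along the
    assumed up axis of the headset frame."""
    return left_eye_pos[up_axis] - right_eye_pos[up_axis]
```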

Camera interface 512 provides an interface to eye tracking camera 522, which may include one or two outward facing cameras, and in one aspect, an IR camera. The camera interface 512 stores the respective images received from the eye tracking camera 522 in the camera buffer 513. The display driver 516 may drive an HMD display 526, e.g., a microdisplay device or a see-through microdisplay device. The control circuitry 505 may include a display formatter (not shown) that may provide information about virtual images displayed on the HMD display 526 to one or more processors of one or more computer systems (e.g., processor 531 of medical image computer 530) to perform processing of the augmented reality system.

Fig. 6 is a flow diagram of a method 600 of controlling at least one plane of a medical image displayed on an external display by using an eye tracker and a display of an HMD, according to an aspect of the present disclosure. At block 602, at least one plane of a medical image stored in a medical image computer is displayed on an external display, which may be an operating room monitor. The external display may be an LED display, an OLED display, an LCD display, or any other display suitable for viewing by a clinician in an operating room setting. The at least one plane of the medical image may include one or more of an axial plane, a coronal plane, and a sagittal plane.

At block 604, the at least one plane of the medical image is processed such that the processed at least one plane of the medical image may be displayed on a display of the AR HMD. The processing may include image processing to reduce a resolution of the at least one plane of the medical image such that a display of the AR HMD is capable of displaying the at least one plane of the medical image. The processing may include other image processing to enable a clinician wearing the AR HMD to easily view at least one plane of the medical image while also enabling the clinician to clearly see the surgical field of view through the AR HMD.

At block 606, the processed at least one plane of the medical image is displayed on a display of the AR HMD, and at block 608, a control icon associated with the processed at least one plane of the medical image is also displayed on the display of the AR HMD. At block 610, the focus of the user's gaze is monitored, for example, by an eye tracker incorporated into the AR HMD. At block 612, the method 600 involves detecting whether the user's gaze is focused on a plane of the processed at least one plane of the medical image for a period of time. The time period may be a predetermined time period adapted to detect that the user desires to select a plane. For example, the predetermined period of time may be between 3 seconds and 7 seconds. The predetermined time period may also be adjustable by the user. If the user's gaze is focused on a plane within the time period, the plane is highlighted on the display of the AR HMD. This indicates to the user the plane that the user desires to control (e.g., move or scroll).

At block 614, after highlighting the plane on the display of the AR HMD, a new focus of the user's gaze is monitored. At block 616, the method 600 involves detecting whether the user's gaze is focused on a control icon displayed on the display of the AR HMD. If the user's gaze is focused on a control icon displayed on the display of the AR HMD, the display of the plane of the medical image corresponding to the highlighted plane is changed on the external display based on a change instruction associated with the control icon. The change instruction may comprise an instruction to enlarge or rotate the plane of the medical image on the external display.
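
Putting blocks 610-616 together, the control loop might look like the following sketch; eye_tracker, hmd_display, and image_computer are hypothetical stand-ins for the interfaces of fig. 5, the plane names are illustrative, and the dwell time is set within the 3-7 second range mentioned above:

```python
import time

DWELL_S = 5.0  # within the 3-7 second range mentioned above; user-adjustable

def run_control_loop(eye_tracker, hmd_display, image_computer):
    """Illustrative event loop for method 600; the three collaborators are
    hypothetical objects standing in for the eye tracker, HMD renderer, and
    medical image computer interfaces."""
    highlighted_plane = None
    target, since = None, None
    while True:
        focus = eye_tracker.current_target()        # plane id, control icon id, or None
        now = time.monotonic()
        if focus != target:
            target, since = focus, now              # gaze moved to a new target
        elif target is not None and now - since >= DWELL_S:
            if target in ("axial", "coronal", "sagittal"):
                highlighted_plane = target          # block 612: highlight the plane
                hmd_display.highlight(target)
            elif highlighted_plane is not None:
                # block 616: gaze on a control icon -> change the external display
                image_computer.apply_change(plane=highlighted_plane, icon=target)
            target, since = None, None              # re-arm after acting
        time.sleep(0.02)
```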

While several aspects of the disclosure have been illustrated in the accompanying drawings, it is not intended that the disclosure be limited thereto, as it is intended that the disclosure be as broad in scope as the art will allow and that the specification be read likewise. Therefore, the above description should not be construed as limiting, but merely as exemplifications of particular aspects.

It should be understood that the various aspects disclosed herein may be combined in different combinations than those specifically set forth in the description and drawings. It will also be understood that certain acts or events of any process or method described herein can be performed in a different order, may be added, merged, or omitted entirely, according to examples (e.g., all described acts or events may not be necessary to perform the techniques). Additionally, while certain aspects of the disclosure are described as being performed by a single module or unit for clarity, it should be understood that the techniques of the disclosure may be performed by a combination of units or modules associated with, for example, a medical device.

In one or more instances, the techniques described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. The computer-readable medium may include a non-transitory computer-readable medium corresponding to a tangible medium such as a data storage medium (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).

The instructions may be executed by one or more processors, such as one or more Digital Signal Processors (DSPs), general purpose microprocessors, Application Specific Integrated Circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Thus, the term "processor," as used herein, may refer to any of the foregoing structure or any other physical structure suitable for implementing the described techniques. In addition, these techniques may be fully implemented in one or more circuits or logic elements.
