Usage and technical analysis of surgeon/personnel performance relative to baseline to optimize device utilization and performance for both current and future procedures
Reading note: This technology, "Usage and technical analysis of surgeon/personnel performance relative to baseline to optimize device utilization and performance for both current and future procedures," was created by F. E. Shelton IV, J. L. Harris, and T. W. Aronhalt on 2018-11-14. Its main content is as follows: Various systems and methods for evaluating surgical personnel are disclosed. A computer system, such as a surgical hub, may be configured to be communicably coupled to a surgical device and a camera. The computer system can be programmed to determine contextual information relating to the surgical procedure based at least in part on perioperative data received from the surgical device during the surgical procedure. Further, the computer system may visually determine a physical characteristic of the surgical personnel via the camera and compare the physical characteristic to a baseline to evaluate the surgical personnel.
1. A computer system configured to be communicably couplable to a surgical device and a camera, the computer system comprising:
a processor; and
a memory coupled to the processor, the memory storing instructions that, when executed by the processor, cause the computer system to:
receive perioperative data from the surgical device;
determine a surgical context based at least in part on the perioperative data;
receive, via the camera, an image of an individual;
determine a physical characteristic of the individual from the image;
retrieve a baseline physical characteristic corresponding to the surgical context; and
determine whether the physical characteristic of the individual deviates from the baseline physical characteristic.
2. The computer system of claim 1, wherein the physical characteristic comprises a posture of the individual.
3. The computer system of claim 2, wherein the posture of the individual corresponds to a deviation between at least one body part position and a reference position.
4. The computer system of claim 1, wherein the physical characteristic comprises a wrist orientation of the individual.
5. The computer system of claim 4, wherein the wrist orientation of the individual corresponds to an angle between the individual's wrist and a surgical instrument held by the individual.
6. The computer system of claim 1, wherein the baseline physical characteristic comprises a previously recorded instance of the physical characteristic of the individual.
7. The computer system of claim 1, wherein the memory further stores instructions that, when executed by the processor, cause the computer system to provide a notification based on whether the physical characteristic deviates from the baseline physical characteristic.
8. The computer system of claim 7, wherein the computer system provides the notification during a surgical procedure in which the perioperative data is received.
9. A computer-implemented method for tracking physical characteristics of an individual, the method comprising:
receiving, by a computer system, perioperative data from a surgical device;
determining, by the computer system, a surgical context based at least in part on the perioperative data;
receiving, by the computer system, an image of the individual via a camera communicatively coupled to the computer system;
determining, by the computer system, a physical characteristic of the individual from the image;
retrieving, by the computer system, a baseline physical characteristic corresponding to the surgical context; and
determining, by the computer system, whether the physical characteristic of the individual deviates from the baseline physical characteristic.
10. The computer-implemented method of claim 9, wherein the physical characteristic comprises a posture of the individual.
11. The computer-implemented method of claim 10, wherein the posture of the individual corresponds to a deviation between at least one body part position and a reference position.
12. The computer-implemented method of claim 9, wherein the physical characteristic comprises a wrist orientation of the individual.
13. The computer-implemented method of claim 12, wherein the wrist orientation of the individual corresponds to an angle between the individual's wrist and a surgical instrument held by the individual.
14. The computer-implemented method of claim 9, wherein the baseline physical characteristic comprises a previously recorded instance of the physical characteristic of the individual.
15. The computer-implemented method of claim 9, further comprising providing, by the computer system, a notification on a display according to whether the physical characteristic deviates from the baseline physical characteristic.
16. A computer system configured to be communicably couplable to a surgical device and a camera, the computer system comprising:
a processor; and
a memory coupled to the processor, the memory storing instructions that, when executed by the processor, cause the computer system to:
receive perioperative data from the surgical device;
determine a surgical context based at least in part on the perioperative data;
receive, via the camera, an image of an individual;
determine a physical characteristic of the individual from the image;
transmit data identifying the physical characteristic and the surgical context to a remote computer system, wherein the remote computer system determines a baseline physical characteristic corresponding to the surgical context and the physical characteristic from data aggregated from a plurality of computer systems connected to the remote computer system; and
receive, from the remote computer system, an indication of whether the physical characteristic of the individual deviates from the baseline physical characteristic.
17. The computer system of claim 16, wherein the remote computer system comprises a cloud computing system.
18. The computer system of claim 16, wherein the physical characteristic comprises a posture of the individual.
19. The computer system of claim 18, wherein the posture of the individual corresponds to a deviation between at least one body part position and a reference position.
20. The computer system of claim 16, wherein the physical characteristic comprises a wrist orientation of the individual.
21. The computer system of claim 20, wherein the wrist orientation of the individual corresponds to an angle between the individual's wrist and a surgical instrument held by the individual.
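The wrist orientation recited in claims 5, 13, and 21 (an angle between the individual's wrist and a held surgical instrument) can be illustrated with a minimal sketch. This is a hypothetical implementation, not part of the disclosure: the keypoint names and the choice of forearm axis as the wrist reference are illustrative assumptions.

```python
import math

def wrist_instrument_angle(elbow, wrist, instrument_tip):
    """Angle in degrees between the forearm axis (elbow -> wrist) and the
    instrument axis (wrist -> tip), from 2D image keypoints.

    All three arguments are (x, y) tuples; the keypoints would come from a
    pose-estimation step applied to the camera image (assumed, not specified
    in the disclosure)."""
    # Forearm direction vector
    fx, fy = wrist[0] - elbow[0], wrist[1] - elbow[1]
    # Instrument direction vector
    ix, iy = instrument_tip[0] - wrist[0], instrument_tip[1] - wrist[1]
    dot = fx * ix + fy * iy
    norm = math.hypot(fx, fy) * math.hypot(ix, iy)
    # Clamp to guard against floating-point drift outside [-1, 1]
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
```

An instrument held in line with the forearm yields 0 degrees; one held perpendicular to the forearm yields 90 degrees, which could then be compared against a baseline angle for the surgical context.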
Background
The present disclosure relates to various surgical systems. Surgical procedures are often performed in operating theaters or operating rooms of medical facilities, such as, for example, hospitals. A sterile field is typically created around the patient. The sterile field may include properly scrubbed team members, as well as all equipment and fixtures in the field. Various surgical devices and systems are utilized in performing surgical procedures.
Disclosure of Invention
In one general aspect, the present disclosure is directed to a computer system configured to be communicably coupleable to a surgical device and a camera. The computer system includes a processor and a memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the computer system to: receive perioperative data from the surgical device; determine a surgical context based at least in part on the perioperative data; receive, via the camera, an image of an individual; determine a physical characteristic of the individual from the image; retrieve a baseline physical characteristic corresponding to the surgical context; and determine whether the physical characteristic of the individual deviates from the baseline physical characteristic.
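The aspect above can be sketched as a minimal pipeline. This is a toy illustration under stated assumptions: the context inference, the baseline table, and the tolerance are all hypothetical stand-ins, not values from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Evaluation:
    context: str
    characteristic: float
    baseline: float
    deviates: bool

# Hypothetical baseline physical characteristics keyed by surgical context,
# e.g., degrees of trunk flexion expected during a given procedural step.
BASELINES = {"thoracic_dissection": 12.0, "idle": 0.0}
TOLERANCE = 5.0  # illustrative deviation threshold, in the same units

def infer_context(perioperative_data):
    """Stand-in for situational awareness: map device telemetry to a step."""
    return "thoracic_dissection" if perioperative_data.get("energy_on") else "idle"

def evaluate(perioperative_data, measured_characteristic):
    """Determine context, retrieve its baseline, and flag any deviation."""
    context = infer_context(perioperative_data)
    baseline = BASELINES.get(context, 0.0)
    deviates = abs(measured_characteristic - baseline) > TOLERANCE
    return Evaluation(context, measured_characteristic, baseline, deviates)
```

In use, `measured_characteristic` would be derived from the camera image (e.g., a posture angle), and a `True` deviation flag could drive the notification described in claims 7, 8, and 15.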
In another general aspect, a computer-implemented method for tracking physical characteristics of an individual is provided. The method comprises: receiving, by a computer system, perioperative data from a surgical device; determining, by the computer system, a surgical context based at least in part on the perioperative data; receiving, by the computer system, an image of the individual via a camera communicatively coupled to the computer system; determining, by the computer system, a physical characteristic of the individual from the image; retrieving, by the computer system, a baseline physical characteristic corresponding to the surgical context; and determining, by the computer system, whether the physical characteristic of the individual deviates from the baseline physical characteristic.
In yet another general aspect, the present disclosure is directed to a computer system configured to be communicably coupleable to a surgical device and a camera. The computer system includes a processor and a memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the computer system to: receive perioperative data from the surgical device; determine a surgical context based at least in part on the perioperative data; receive, via the camera, an image of an individual; determine a physical characteristic of the individual from the image; transmit data identifying the physical characteristic and the surgical context to a remote computer system, wherein the remote computer system determines a baseline physical characteristic corresponding to the surgical context and the physical characteristic from data aggregated from a plurality of computer systems connected to the remote computer system; and receive, from the remote computer system, an indication of whether the physical characteristic of the individual deviates from the baseline physical characteristic.
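The remote-aggregation aspect can be illustrated with a toy model of the cloud side. This is a hypothetical sketch: the class name, the per-context sample store, and the two-standard-deviation rule are illustrative choices, not the disclosed analytics.

```python
import statistics

class RemoteAnalytics:
    """Toy stand-in for the remote (cloud) system: aggregates characteristic
    samples reported by many connected hubs, per surgical context, and derives
    a baseline from the pooled data."""

    def __init__(self):
        self.samples = {}  # context -> list of reported characteristic values

    def report(self, context, value):
        """Called by a hub transmitting a measured characteristic."""
        self.samples.setdefault(context, []).append(value)

    def deviates(self, context, value, k=2.0):
        """Flag a value more than k standard deviations from the pooled mean."""
        data = self.samples.get(context, [])
        if len(data) < 2:
            return False  # not enough aggregated data to form a baseline
        mean = statistics.fmean(data)
        stdev = statistics.stdev(data)
        return abs(value - mean) > k * stdev
```

Each local hub would call `report` with its measurements, and the boolean returned by `deviates` corresponds to the indication that the hub receives back from the remote system.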
Drawings
The various aspects described herein, together with further objects and advantages thereof, may best be understood by reference to the following description taken in conjunction with the accompanying drawings.
Fig. 1 is a block diagram of a computer-implemented interactive surgical system in accordance with at least one aspect of the present disclosure.
Fig. 2 is a surgical system for performing a surgical procedure in an operating room according to at least one aspect of the present disclosure.
Fig. 3 is a surgical hub paired with a visualization system, a robotic system, and a smart instrument according to at least one aspect of the present disclosure.
Fig. 4 is a partial perspective view of a surgical hub housing and a composite generator module slidably received in a drawer of the surgical hub housing according to at least one aspect of the present disclosure.
Fig. 5 is a perspective view of a combined generator module having bipolar, ultrasonic and monopolar contacts and a smoke evacuation component according to at least one aspect of the present disclosure.
Fig. 6 illustrates an individual power bus attachment for a plurality of lateral docking ports of a lateral modular housing configured to be capable of receiving a plurality of modules in accordance with at least one aspect of the present disclosure.
Fig. 7 illustrates a vertical modular housing configured to be capable of receiving a plurality of modules in accordance with at least one aspect of the present disclosure.
Fig. 8 illustrates a surgical data network including a modular communication hub configured to connect modular devices located in one or more operating rooms of a medical facility or any room in the medical facility dedicated to surgical procedures to a cloud in accordance with at least one aspect of the present disclosure.
Fig. 9 is a computer-implemented interactive surgical system in accordance with at least one aspect of the present disclosure.
Fig. 10 illustrates a surgical hub including a plurality of modules coupled to a modular control tower according to at least one aspect of the present disclosure.
Fig. 11 illustrates one aspect of a Universal Serial Bus (USB) hub device in accordance with at least one aspect of the present disclosure.
Fig. 12 is a block diagram of a cloud computing system including a plurality of smart surgical instruments coupled to a surgical hub connectable to cloud components of the cloud computing system in accordance with at least one aspect of the present disclosure.
Fig. 13 is a functional module architecture of a cloud computing system according to at least one aspect of the present disclosure.
Fig. 14 illustrates a diagram of a situation-aware surgical system in accordance with at least one aspect of the present disclosure.
Fig. 15 is a timeline depicting situational awareness of a surgical hub, according to at least one aspect of the present disclosure.
Fig. 16 is a diagram of an exemplary Operating Room (OR) setup, according to at least one aspect of the present disclosure.
Fig. 17 is a logic flow diagram of a process for visually evaluating surgical personnel according to at least one aspect of the present disclosure.
Fig. 18 is a diagram illustrating a series of models of surgical personnel during the course of a surgical procedure in accordance with at least one aspect of the present disclosure.
Fig. 19 is a diagram depicting measured poses of the surgical personnel shown in fig. 18 over time, in accordance with at least one aspect of the present disclosure.
Fig. 20 is a depiction of a surgeon holding a surgical instrument according to at least one aspect of the present disclosure.
Fig. 21 is a scatter plot of wrist angle versus surgical outcome in accordance with at least one aspect of the present disclosure.
Detailed Description
The applicant of the present patent application owns the following U.S. patent applications filed November 6, 2018, the disclosures of each of which are incorporated herein by reference in their entirety:
U.S. patent application Ser. No. 16/182,224 entitled "SURGICAL NETWORK, INSTRUMENT, AND CLOUD RESPONSES BASED ON VALIDATION OF RECEIVED DATASET AND AUTHENTICATION OF ITS SOURCE AND INTEGRITY";
U.S. patent application Ser. No. 16/182,230 entitled "SURGICAL SYSTEM FOR PRESENTING INFORMATION INTERPRETED FROM EXTERNAL DATA";
U.S. patent application Ser. No. 16/182,233 entitled "MODIFICATION OF SURGICAL SYSTEMS CONTROL PROGRAMS BASED ON MACHINE LEARNING";
U.S. patent application Ser. No. 16/182,239 entitled "ADJUSTMENT OF DEVICE CONTROL PROGRAMS BASED ON STRATIFIED CONTEXTUAL DATA IN ADDITION TO THE DATA";
U.S. patent application Ser. No. 16/182,243 entitled "SURGICAL HUB AND MODULAR DEVICE RESPONSE ADJUSTMENT BASED ON SITUATIONAL AWARENESS";
U.S. patent application Ser. No. 16/182,248 entitled "DETECTION AND ESCALATION OF SECURITY RESPONSES OF SURGICAL INSTRUMENTS TO INCREASING SEVERITY THREATS";
U.S. patent application Ser. No. 16/182,251, entitled "INTERACTIVE SURGICAL SYSTEM";
U.S. patent application Ser. No. 16/182,260 entitled "AUTOMATED DATA SCALING, ALIGNMENT, AND ORGANIZING BASED ON PREDEFINED PARAMETERS WITHIN SURGICAL NETWORKS";
U.S. patent application Ser. No. 16/182,267 entitled "SENSING THE PATIENT POSITION AND CONTACT UTILIZING THE MONO-POLAR RETURN PAD ELECTRODE TO PROVIDE SITUATIONAL AWARENESS TO A SURGICAL NETWORK";
U.S. patent application Ser. No. 16/182,249 entitled "POWERED SURGICAL TOOL WITH PREDEFINED ADJUSTABLE CONTROL ALGORITHM FOR CONTROLLING END EFFECTOR PARAMETER";
U.S. patent application Ser. No. 16/182,246 entitled "ADJUSTMENTS BASED ON AIRBORNE PARTICLE PROPERTIES";
U.S. patent application Ser. No. 16/182,256 entitled "ADJUSTMENT OF A SURGICAL DEVICE FUNCTION BASED ON SITUATIONAL AWARENESS";
U.S. patent application Ser. No. 16/182,242 entitled "REAL-TIME ANALYSIS OF COMPREHENSIVE COST OF ALL INSTRUMENTATION USED IN SURGERY UTILIZING DATA FLUIDITY TO TRACK INSTRUMENTS THROUGH STOCKING AND IN-HOUSE PROCESSES";
U.S. patent application Ser. No. 16/182,269 entitled "IMAGE CAPTURING OF THE AREAS OUTSIDE THE ABDOMEN TO IMPROVE PLACEMENT AND CONTROL OF A SURGICAL DEVICE IN USE";
U.S. patent application Ser. No. 16/182,278 entitled "COMMUNICATION OF DATA WHERE A SURGICAL NETWORK IS USING CONTEXT OF THE DATA AND REQUIREMENTS OF A RECEIVING SYSTEM/USER TO INFLUENCE INCLUSION OR LINKAGE OF DATA AND METADATA TO ESTABLISH CONTINUITY";
U.S. patent application Ser. No. 16/182,290 entitled "SURGICAL NETWORK RECOMMENDATIONS FROM REAL TIME ANALYSIS OF PROCEDURE VARIABLES AGAINST A BASELINE HIGHLIGHTING DIFFERENCES FROM THE OPTIMAL SOLUTION";
U.S. patent application Ser. No. 16/182,232 entitled "CONTROL OF A SURGICAL SYSTEM THROUGH A SURGICAL BARRIER";
U.S. patent application Ser. No. 16/182,227 entitled "SURGICAL NETWORK DETERMINATION OF PRIORITIZATION OF COMMUNICATION, INTERACTION, OR PROCESSING BASED ON SYSTEM OR DEVICE NEEDS";
U.S. patent application Ser. No. 16/182,231 entitled "WIRELESS PAIRING OF A SURGICAL DEVICE WITH ANOTHER DEVICE WITHIN A STERILE SURGICAL FIELD BASED ON THE USAGE AND SITUATIONAL AWARENESS OF DEVICES";
U.S. patent application Ser. No. 16/182,229 entitled "ADJUSTMENT OF STAPLE HEIGHT OF AT LEAST ONE ROW OF STAPLES BASED ON THE SENSED TISSUE THICKNESS OR FORCE IN CLOSING";
U.S. patent application Ser. No. 16/182,234 entitled "STAPLING DEVICE WITH BOTH COMPULSORY AND DISCRETIONARY LOCKOUTS BASED ON SENSED PARAMETERS";
U.S. patent application Ser. No. 16/182,240 entitled "POWERED STAPLING DEVICE CONFIGURED TO ADJUST FORCE, ADVANCEMENT SPEED, AND OVERALL STROKE OF CUTTING MEMBER BASED ON SENSED PARAMETER OF FIRING OR CLAMPING";
U.S. patent application Ser. No. 16/182,235 entitled "VARIATION OF RADIO FREQUENCY AND ULTRASONIC POWER LEVEL IN COOPERATION WITH VARYING CLAMP ARM PRESSURE TO ACHIEVE PREDEFINED HEAT FLUX OR POWER APPLIED TO TISSUE"; and
U.S. patent application Ser. No. 16/182,238 entitled "ULTRASONIC ENERGY DEVICE WHICH VARIES PRESSURE APPLIED BY CLAMP ARM TO PROVIDE THRESHOLD CONTROL AT A CUT PROGRESSION LOCATION".
The applicant of the present patent application owns the following U.S. patent applications filed September 10, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:
U.S. provisional patent application No. 62/729,183 entitled "A CONTROL FOR A SURGICAL NETWORK OR SURGICAL NETWORK CONNECTED DEVICE THAT ADJUSTS ITS FUNCTION BASED ON A SENSED SITUATION OR USAGE";
U.S. provisional patent application No. 62/729,177 entitled "AUTOMATED DATA SCALING, ALIGNMENT, AND ORGANIZING BASED ON PREDEFINED PARAMETERS WITHIN A SURGICAL NETWORK BEFORE TRANSMISSION";
U.S. provisional patent application No. 62/729,176 entitled "INDIRECT COMMAND CONTROL OF A FIRST OPERATING ROOM SYSTEM THROUGH THE USE OF A SECOND OPERATING ROOM SYSTEM WITHIN A STERILE FIELD WHERE THE SECOND OPERATING ROOM SYSTEM HAS PRIMARY AND SECONDARY OPERATING MODES";
U.S. provisional patent application No. 62/729,185 entitled "POWERED STAPLING DEVICE THAT IS CAPABLE OF ADJUSTING FORCE, ADVANCEMENT SPEED, AND OVERALL STROKE OF CUTTING MEMBER OF THE DEVICE BASED ON SENSED PARAMETER OF FIRING OR CLAMPING";
U.S. provisional patent application No. 62/729,184 entitled "POWERED SURGICAL TOOL WITH A PREDEFINED ADJUSTABLE CONTROL ALGORITHM FOR CONTROLLING AT LEAST ONE END EFFECTOR PARAMETER AND A MEANS FOR LIMITING THE ADJUSTMENT";
U.S. provisional patent application No. 62/729,182 entitled "SENSING THE PATIENT POSITION AND CONTACT UTILIZING THE MONO-POLAR RETURN PAD ELECTRODE TO PROVIDE SITUATIONAL AWARENESS TO THE HUB";
U.S. provisional patent application No. 62/729,191 entitled "SURGICAL NETWORK RECOMMENDATIONS FROM REAL TIME ANALYSIS OF PROCEDURE VARIABLES AGAINST A BASELINE HIGHLIGHTING DIFFERENCES FROM THE OPTIMAL SOLUTION";
U.S. provisional patent application No. 62/729,195 entitled "ULTRASONIC ENERGY DEVICE WHICH VARIES PRESSURE APPLIED BY CLAMP ARM TO PROVIDE THRESHOLD CONTROL AT A CUT PROGRESSION LOCATION"; and
U.S. provisional patent application No. 62/729,186 entitled "WIRELESS PAIRING OF A SURGICAL DEVICE WITH ANOTHER DEVICE WITHIN A STERILE SURGICAL FIELD BASED ON THE USAGE AND SITUATIONAL AWARENESS OF DEVICES".
The applicant of the present patent application owns the following U.S. patent applications filed August 28, 2018, the disclosures of each of which are incorporated herein by reference in their entirety:
U.S. patent application Ser. No. 16/115,214 entitled "ESTIMATING STATE OF ULTRASONIC END EFFECTOR AND CONTROL SYSTEM THEREFOR";
U.S. patent application Ser. No. 16/115,205 entitled "TEMPERATURE CONTROL OF ULTRASONIC END EFFECTOR AND CONTROL SYSTEM THEREFOR";
U.S. patent application Ser. No. 16/115,233 entitled "RADIO FREQUENCY ENERGY DEVICE FOR DELIVERING COMBINED ELECTRICAL SIGNALS";
U.S. patent application Ser. No. 16/115,208 entitled "CONTROLLING AN ULTRASONIC SURGICAL INSTRUMENT ACCORDING TO TISSUE LOCATION";
U.S. patent application Ser. No. 16/115,220 entitled "CONTROLLING ACTIVATION OF AN ULTRASONIC SURGICAL INSTRUMENT ACCORDING TO THE PRESENCE OF TISSUE";
U.S. patent application Ser. No. 16/115,232, entitled "DETERMINING TISSUE COMPOSITION VIA AN ULTRASONIC SYSTEM";
U.S. patent application Ser. No. 16/115,239 entitled "DETERMINING THE STATE OF AN ULTRASONIC ELECTROMECHANICAL SYSTEM ACCORDING TO FREQUENCY SHIFT";
U.S. patent application Ser. No. 16/115,247 entitled "DETERMINING THE STATE OF AN ULTRASONIC END EFFECTOR";
U.S. patent application Ser. No. 16/115,211 entitled "SITUATIONAL AWARENESS OF ELECTROSURGICAL SYSTEMS";
U.S. patent application Ser. No. 16/115,226, entitled "MECHANISMS FOR CONTROLLING DIFFERENT ELECTROMECHANICAL SYSTEMS OF AN ELECTROSURGICAL INSTRUMENT";
U.S. patent application Ser. No. 16/115,240 entitled "DETECTION OF END EFFECTOR IMMERSION IN LIQUID";
U.S. patent application Ser. No. 16/115,249 entitled "INTERRUPTION OF ENERGY DUE TO INADVERTENT CAPACITIVE COUPLING";
U.S. patent application Ser. No. 16/115,256 entitled "INCREASING RADIO FREQUENCY TO CREATE PAD-LESS MONOPOLAR LOOP";
U.S. patent application Ser. No. 16/115,223 entitled "BIPOLAR COMBINATION DEVICE THAT AUTOMATICALLY ADJUSTS PRESSURE BASED ON ENERGY MODALITY"; and
U.S. patent application Ser. No. 16/115,238 entitled "ACTIVATION OF ENERGY DEVICES".
The applicant of the present patent application owns the following U.S. patent applications filed August 23, 2018, the disclosures of each of which are incorporated herein by reference in their entirety:
U.S. provisional patent application No. 62/721,995 entitled "CONTROLLING AN ULTRASONIC SURGICAL INSTRUMENT ACCORDING TO TISSUE LOCATION";
U.S. provisional patent application No. 62/721,998, entitled "SITUATIONAL AWARENESS OF ELECTROSURGICAL SYSTEMS";
U.S. provisional patent application No. 62/721,999, entitled "INTERRUPTION OF ENERGY DUE TO INADVERTENT CAPACITIVE COUPLING";
U.S. provisional patent application No. 62/721,994 entitled "BIPOLAR COMBINATION DEVICE THAT AUTOMATICALLY ADJUSTS PRESSURE BASED ON ENERGY MODALITY"; and
U.S. provisional patent application No. 62/721,996 entitled "RADIO FREQUENCY ENERGY DEVICE FOR DELIVERING COMBINED ELECTRICAL SIGNALS".
The applicant of the present patent application owns the following U.S. patent applications filed June 30, 2018, the disclosures of each of which are incorporated herein by reference in their entirety:
U.S. provisional patent application No. 62/692,747 entitled "SMART ACTIVATION OF AN ENERGY DEVICE BY ANOTHER DEVICE";
U.S. provisional patent application No. 62/692,748, entitled "SMART ENERGY ARCHITECTURE"; and
U.S. provisional patent application No. 62/692,768 entitled "SMART ENERGY DEVICES".
The applicant of the present patent application owns the following U.S. patent applications filed June 29, 2018, the disclosures of each of which are incorporated herein by reference in their entirety:
U.S. patent application Ser. No. 16/024,090 entitled "CAPACITIVE COUPLED RETURN PATH PAD WITH SEPARABLE ARRAY ELEMENTS";
U.S. patent application Ser. No. 16/024,057 entitled "CONTROLLING A SURGICAL INSTRUMENT ACCORDING TO SENSED CLOSURE PARAMETERS";
U.S. patent application Ser. No. 16/024,067 entitled "SYSTEMS FOR ADJUSTING END EFFECTOR PARAMETERS BASED ON PERIOPERATIVE INFORMATION";
U.S. patent application Ser. No. 16/024,075 entitled "SAFETY SYSTEMS FOR SMART POWERED SURGICAL STAPLING";
U.S. patent application Ser. No. 16/024,083, entitled "SAFETY SYSTEMS FOR SMART POWERED SURGICAL STAPLING";
U.S. patent application Ser. No. 16/024,094 entitled "SURGICAL SYSTEMS FOR DETECTING END EFFECTOR TISSUE DISTRIBUTION IRREGULARITIES";
U.S. patent application Ser. No. 16/024,138 entitled "SYSTEM FOR DETECTING PROXIMITY OF SURGICAL END EFFECTOR TO CANCEROUS TISSUE";
U.S. patent application Ser. No. 16/024,150 entitled "SURGICAL INSTRUMENT CARTRIDGE SENSOR ASSEMBLIES";
U.S. patent application Ser. No. 16/024,160 entitled "VARIABLE OUTPUT CARTRIDGE SENSOR ASSEMBLY";
U.S. patent application Ser. No. 16/024,124 entitled "SURGICAL INSTRUMENT HAVING A FLEXIBLE ELECTRODE";
U.S. patent application Ser. No. 16/024,132 entitled "SURGICAL INSTRUMENT HAVING A FLEXIBLE CIRCUIT";
U.S. patent application Ser. No. 16/024,141 entitled "SURGICAL INSTRUMENT WITH A TISSUE MARKING ASSEMBLY";
U.S. patent application Ser. No. 16/024,162 entitled "SURGICAL SYSTEMS WITH PRIORITIZED DATA TRANSMISSION CAPABILITIES";
U.S. patent application Ser. No. 16/024,066 entitled "SURGICAL EVACUATION SENSING AND MOTOR CONTROL";
U.S. patent application Ser. No. 16/024,096 entitled "SURGICAL EVACUATION SENSOR ARRANGEMENTS";
U.S. patent application Ser. No. 16/024,116 entitled "SURGICAL EVACUATION FLOW PATHS";
U.S. patent application Ser. No. 16/024,149 entitled "SURGICAL EVACUATION SENSING AND GENERATOR CONTROL";
U.S. patent application Ser. No. 16/024,180 entitled "SURGICAL EVACUATION SENSING AND DISPLAY";
U.S. patent application Ser. No. 16/024,245 entitled "COMMUNICATION OF SMOKE EVACUATION SYSTEM PARAMETERS TO HUB OR CLOUD IN SMOKE EVACUATION MODULE FOR INTERACTIVE SURGICAL PLATFORM";
U.S. patent application Ser. No. 16/024,258 entitled "SMOKE EVACUATION SYSTEM INCLUDING A SEGMENTED CONTROL CIRCUIT FOR INTERACTIVE SURGICAL PLATFORM";
U.S. patent application Ser. No. 16/024,265 entitled "SURGICAL EVACUATION SYSTEM WITH A COMMUNICATION CIRCUIT FOR COMMUNICATION BETWEEN A FILTER AND A SMOKE EVACUATION DEVICE"; and
U.S. patent application Ser. No. 16/024,273 entitled "DUAL IN-SERIES LARGE AND SMALL DROPLET FILTERS".
The applicant of the present patent application owns the following U.S. provisional patent applications filed June 28, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:
U.S. provisional patent application Ser. No. 62/691,228 entitled "A METHOD OF USING REINFORCED FLEXIBLE CIRCUITS WITH MULTIPLE SENSORS WITH ELECTROSURGICAL DEVICES";
U.S. provisional patent application Ser. No. 62/691,227 entitled "CONTROLLING A SURGICAL INSTRUMENT ACCORDING TO SENSED CLOSURE PARAMETERS";
U.S. provisional patent application Ser. No. 62/691,230 entitled "SURGICAL INSTRUMENT HAVING A FLEXIBLE ELECTRODE";
U.S. provisional patent application Ser. No. 62/691,219 entitled "SURGICAL EVACUATION SENSING AND MOTOR CONTROL";
U.S. provisional patent application Ser. No. 62/691,257 entitled "COMMUNICATION OF SMOKE EVACUATION SYSTEM PARAMETERS TO HUB OR CLOUD IN SMOKE EVACUATION MODULE FOR INTERACTIVE SURGICAL PLATFORM";
U.S. provisional patent application Ser. No. 62/691,262 entitled "SURGICAL EVACUATION SYSTEM WITH A COMMUNICATION CIRCUIT FOR COMMUNICATION BETWEEN A FILTER AND A SMOKE EVACUATION DEVICE"; and
U.S. provisional patent application Ser. No. 62/691,251, entitled "DUAL IN-SERIES LARGE AND SMALL DROPLET FILTERS".
The applicant of the present patent application owns the following U.S. provisional patent application filed April 19, 2018, the disclosure of which is incorporated herein by reference in its entirety:
U.S. provisional patent application Ser. No. 62/659,900 entitled "METHOD OF HUB COMMUNICATION".
The applicant of the present patent application owns the following U.S. provisional patent applications filed March 30, 2018, the disclosures of each of which are incorporated herein by reference in their entirety:
U.S. provisional patent application No. 62/650,898 entitled "CAPACITIVE COUPLED RETURN PATH PAD WITH SEPARABLE ARRAY ELEMENTS";
U.S. provisional patent application Ser. No. 62/650,887 entitled "SURGICAL SYSTEMS WITH OPTIMIZED SENSING CAPABILITIES";
U.S. provisional patent application Ser. No. 62/650,882 entitled "SMOKE EVACUATION MODULE FOR INTERACTIVE SURGICAL PLATFORM"; and
U.S. provisional patent application Ser. No. 62/650,877 entitled "SURGICAL SMOKE EVACUATION SENSING AND CONTROL".
The applicant of the present patent application owns the following U.S. patent applications filed March 29, 2018, the disclosures of each of which are incorporated herein by reference in their entirety:
U.S. patent application Ser. No. 15/940,641, entitled "INTERACTIVE SURGICAL SYSTEMS WITH ENCRYPTED COMMUNICATION CAPABILITIES";
U.S. patent application Ser. No. 15/940,648, entitled "INTERACTIVE SURGICAL SYSTEMS WITH CONDITION HANDLING OF DEVICES AND DATA CAPABILITIES";
U.S. patent application Ser. No. 15/940,656 entitled "SURGICAL HUB COORDINATION OF CONTROL AND COMMUNICATION OF OPERATING ROOM DEVICES";
U.S. patent application Ser. No. 15/940,666 entitled "SPATIAL AWARENESS OF SURGICAL HUBS IN OPERATING ROOMS";
U.S. patent application Ser. No. 15/940,670 entitled "COOPERATIVE UTILIZATION OF DATA DERIVED FROM SECONDARY SOURCES BY INTELLIGENT SURGICAL HUBS";
U.S. patent application Ser. No. 15/940,677 entitled "SURGICAL HUB CONTROL ARRANGEMENTS";
U.S. patent application Ser. No. 15/940,632, entitled "DATA STRIPPING METHOD TO INTERROGATE PATIENT RECORDS AND CREATE ANONYMIZED RECORD";
U.S. patent application Ser. No. 15/940,640 entitled "COMMUNICATION HUB AND STORAGE DEVICE FOR STORING PARAMETERS AND STATUS OF A SURGICAL DEVICE TO BE SHARED WITH CLOUD BASED ANALYTICS SYSTEMS";
U.S. patent application Ser. No. 15/940,645, entitled "SELF DESCRIBING DATA PACKETS GENERATED AT AN ISSUING INSTRUMENT";
U.S. patent application Ser. No. 15/940,649 entitled "DATA PAIRING TO INTERCONNECT A DEVICE MEASURED PARAMETER WITH AN OUTCOME";
U.S. patent application Ser. No. 15/940,654 entitled "SURGICAL HUB SITUATIONAL AWARENESS";
U.S. patent application Ser. No. 15/940,663 entitled "SURGICAL SYSTEM DISTRIBUTED PROCESSING";
U.S. patent application Ser. No. 15/940,668 entitled "AGGREGATION AND REPORTING OF SURGICAL HUB DATA";
U.S. patent application Ser. No. 15/940,671, entitled "SURGICAL HUB SPATIAL AWARENESS TO DETERMINE DEVICES IN OPERATING THEATER";
U.S. patent application Ser. No. 15/940,686 entitled "DISPLAY OF ALIGNMENT OF STAPLE CARTRIDGE TO PRIOR LINEAR STAPLE LINE";
U.S. patent application Ser. No. 15/940,700, entitled "STERILE FIELD INTERACTIVE CONTROL DISPLAYS";
U.S. patent application Ser. No. 15/940,629, entitled "COMPUTER IMPLEMENTED INTERACTIVE SURGICAL SYSTEMS";
U.S. patent application Ser. No. 15/940,704, entitled "USE OF LASER LIGHT AND RED-GREEN-BLUE COLORATION TO DETERMINE PROPERTIES OF BACK SCATTERED LIGHT";
U.S. patent application Ser. No. 15/940,722 entitled "CHARACTERIZATION OF TISSUE IRREGULARITIES THROUGH THE USE OF MONO-CHROMATIC LIGHT REFRACTIVITY";
U.S. patent application Ser. No. 15/940,742 entitled "DUAL CMOS ARRAY IMAGING";
U.S. patent application Ser. No. 15/940,636 entitled "ADAPTIVE CONTROL PROGRAM UPDATES FOR SURGICAL DEVICES";
U.S. patent application Ser. No. 15/940,653, entitled "ADAPTIVE CONTROL PROGRAM UPDATES FOR SURGICAL HUBS";
U.S. patent application Ser. No. 15/940,660 entitled "CLOUD-BASED MEDICAL ANALYTICS FOR CUSTOMIZATION AND RECOMMENDATIONS TO A USER";
U.S. patent application Ser. No. 15/940,679 entitled "CLOUD-BASED MEDICAL ANALYTICS FOR LINKING OF LOCAL USAGE TRENDS WITH THE RESOURCE ACQUISITION BEHAVIORS OF LARGER DATA SET";
U.S. patent application Ser. No. 15/940,694 entitled "CLOUD-BASED MEDICAL ANALYTICS FOR MEDICAL FACILITY SEGMENTED INDIVIDUALIZATION OF INSTRUMENT FUNCTION";
U.S. patent application Ser. No. 15/940,634, entitled "CLOUD-BASED MEDICAL ANALYTICS FOR SECURITY AND AUTHENTICATION TRENDS AND REACTIVE MEASURES";
U.S. patent application Ser. No. 15/940,706 entitled "DATA HANDLING AND PRIORITIZATION IN A CLOUD ANALYTICS NETWORK";
U.S. patent application Ser. No. 15/940,675 entitled "CLOUD INTERFACE FOR COUPLED SURGICAL DEVICES";
U.S. patent application Ser. No. 15/940,627 entitled "DRIVE ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS";
U.S. patent application Ser. No. 15/940,637 entitled "COMMUNICATION ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS";
U.S. patent application Ser. No. 15/940,642 entitled "CONTROLS FOR ROBOT-ASSISTED SURGICAL PLATFORMS";
U.S. patent application Ser. No. 15/940,676 entitled "AUTOMATIC TOOL ADJUSTMENT FOR ROBOT-ASSISTED SURGICAL PLATFORMS";
U.S. patent application Ser. No. 15/940,680 entitled "CONTROLLERS FOR ROBOT-ASSISTED SURGICAL PLATFORMS";
U.S. patent application Ser. No. 15/940,683 entitled "COOPERATIVE SURGICAL ACTIONS FOR ROBOT-ASSISTED SURGICAL PLATFORMS";
U.S. patent application Ser. No. 15/940,690 entitled "DISPLAY ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS"; and
U.S. patent application Ser. No. 15/940,711 entitled "SENSING ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS".
The applicant of the present patent application owns the following U.S. provisional patent applications, filed on March 28, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:
U.S. provisional patent application Ser. No. 62/649,302 entitled "INTERACTIVE SURGICAL SYSTEMS WITH ENCRYPTED COMMUNICATIONCAPABILITIES";
U.S. provisional patent application Ser. No. 62/649,294 entitled "DATA STRIPPING METHOD TO INTERROGATE PATIENT RECORDS AND CREATE ANONYMIZED RECORD";
U.S. provisional patent application Ser. No. 62/649,300 entitled "SURGICAL HUB SITUATIONAL AWARENESS";
U.S. provisional patent application Ser. No. 62/649,309 entitled "SURGICAL HUB SPATIAL AWARENESS TO DETERMINE DEVICES IN OPERATING THEATER";
U.S. provisional patent application Ser. No. 62/649,310, entitled "COMPUTER IMPLEMENTED INTERACTIVE SURGICAL SYSTEMS";
U.S. provisional patent application Ser. No. 62/649,291 entitled "USE OF LASER LIGHT AND RED-GREEN-BLUE COLORATION TO DETERMINE PROPERTIES OF BACK SCATTERED LIGHT";
U.S. provisional patent application Ser. No. 62/649,296, entitled "ADAPTIVE CONTROL PROGRAM UPDATES FOR SURGICAL DEVICES";
U.S. provisional patent application Ser. No. 62/649,333 entitled "CLOUD-BASED MEDICAL ANALYTICS FOR CUSTOMIZATION AND RECOMMENDATIONS TO A USER";
U.S. provisional patent application Ser. No. 62/649,327 entitled "CLOUD-BASED MEDICAL ANALYTICS FOR SECURITY AND AUTHENTICATION TRENDS AND REACTIVE MEASURES";
U.S. provisional patent application Ser. No. 62/649,315 entitled "DATA HANDLING AND PRIORITIZATION IN A CLOUD ANALYTICS NETWORK";
U.S. provisional patent application Ser. No. 62/649,313 entitled "CLOUD INTERFACE FOR COUPLED SURGICAL DEVICES";
U.S. provisional patent application Ser. No. 62/649,320 entitled "DRIVE ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS";
U.S. provisional patent application Ser. No. 62/649,307 entitled "AUTOMATIC TOOL ADJUSTMENT FOR ROBOT-ASSISTED SURGICAL PLATFORMS"; and
U.S. provisional patent application Ser. No. 62/649,323, entitled "SENSING ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS".
The applicant of the present patent application owns the following U.S. provisional patent applications, filed on March 8, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:
U.S. provisional patent application Ser. No. 62/640,417 entitled "TEMPERATURE CONTROL IN ULTRASONIC DEVICE AND CONTROL SYSTEM THEREFOR"; and
U.S. provisional patent application Ser. No. 62/640,415 entitled "ESTIMATING STATE OF ULTRASONIC END EFFECTOR AND CONTROL SYSTEMS THEREFOR".
The applicant of the present patent application owns the following U.S. provisional patent applications, filed on December 28, 2017, the disclosure of each of which is incorporated herein by reference in its entirety:
U.S. provisional patent application Ser. No. 62/611,341 entitled "INTERACTIVE SURGICAL PLATFORM";
U.S. provisional patent application Ser. No. 62/611,340, entitled "CLOUD-BASED MEDICAL ANALYTICS"; and
U.S. provisional patent application Ser. No. 62/611,339, entitled "ROBOT ASSISTED SURGICAL PLATFORM".
Before explaining various aspects of the surgical device and generator in detail, it should be noted that the illustrative examples of application and use are not limited to the details of construction and arrangement of parts shown in the accompanying drawings and description. The illustrative examples may be implemented alone or in combination with other aspects, variations, and modifications, and may be practiced or carried out in various ways. Furthermore, unless otherwise indicated, the terms and expressions employed herein have been chosen to describe the illustrative examples for the convenience of the reader, and not to limit the invention. Moreover, it is to be understood that one or more of the aspects and/or examples described below may be combined with any one or more of the other aspects and/or examples described below.
Surgical hub
Referring to fig. 1, a computer-implemented interactive surgical system 100 includes one or more
Fig. 2 shows an example of a
Other types of robotic systems may be readily adapted for use with the
Various examples of cloud-based analytics performed by the cloud 104 and suitable for use with the present disclosure are described in U.S. provisional patent application Ser. No. 62/611,340 entitled "CLOUD-BASED MEDICAL ANALYTICS," filed December 28, 2017, the disclosure of which is incorporated herein by reference in its entirety.
In various aspects, the
The optical components of the
The one or more illumination sources may be configured to radiate electromagnetic energy in the visible spectrum as well as the invisible spectrum. The visible spectrum (sometimes referred to as the optical spectrum or the luminous spectrum) is the portion of the electromagnetic spectrum that is visible to (i.e., detectable by) the human eye, and may be referred to as visible light or simply light. A typical human eye responds to wavelengths in air from about 380 nm to about 750 nm.
The invisible spectrum (i.e., the non-luminous spectrum) is the portion of the electromagnetic spectrum that lies below and above the visible spectrum (i.e., wavelengths below about 380 nm and above about 750 nm). The human eye cannot detect the invisible spectrum. Wavelengths greater than about 750 nm are longer than the red end of the visible spectrum, and they constitute invisible infrared (IR), microwave, and radio electromagnetic radiation. Wavelengths less than about 380 nm are shorter than the violet end of the spectrum, and they constitute invisible ultraviolet, x-ray, and gamma-ray electromagnetic radiation.
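The band limits above lend themselves to a one-line classifier. The sketch below is purely illustrative (the function name is invented here) and uses the approximate 380 nm and 750 nm cutoffs stated in this disclosure:

```python
def classify_wavelength(wavelength_nm: float) -> str:
    """Classify electromagnetic radiation by wavelength in air, using the
    approximate visible-band limits (about 380-750 nm) described above."""
    if wavelength_nm < 380:
        # Shorter than violet: the ultraviolet / x-ray / gamma-ray region
        return "invisible (ultraviolet side)"
    if wavelength_nm > 750:
        # Longer than red: the infrared / microwave / radio region
        return "invisible (infrared side)"
    return "visible"
```

For example, a 532 nm source would classify as visible, while a 1064 nm source would fall on the infrared side of the invisible spectrum.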
In various aspects, the
In one aspect, the imaging device employs multispectral monitoring to distinguish topography from underlying structures. A multispectral image is an image that captures image data across a particular range of wavelengths of the electromagnetic spectrum. The wavelengths may be separated by filters or by using instruments that are sensitive to specific wavelengths, including light from frequencies outside the visible range, such as IR and ultraviolet. Spectral imaging may allow extraction of additional information that the human eye fails to capture with its red, green, and blue receptors. The use of multispectral imaging is described in more detail under the heading "Advanced Imaging Acquisition Module" of U.S. provisional patent application Ser. No. 62/611,341 entitled "INTERACTIVE SURGICAL PLATFORM," filed December 28, 2017, the disclosure of which is incorporated herein by reference in its entirety. Multispectral monitoring may be a useful tool for repositioning the surgical site after completion of a surgical task to perform one or more of the previously described tests on the treated tissue.
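A multispectral image can be pictured as a stack of per-wavelength intensity planes. The toy sketch below (the data layout, band centers, and pixel values are hypothetical, not this system's actual format) selects the stored band nearest a requested wavelength, including a near-IR band outside the visible range:

```python
def nearest_band(cube, wavelength_nm):
    """Return (band center, intensity plane) for the stored band whose
    center wavelength is closest to the requested wavelength."""
    center = min(cube, key=lambda band: abs(band - wavelength_nm))
    return center, cube[center]

# Hypothetical 2x2-pixel multispectral cube: band center (nm) -> plane.
cube = {
    450.0: [[10, 12], [11, 13]],   # blue band (visible)
    550.0: [[20, 22], [21, 23]],   # green band (visible)
    850.0: [[90, 91], [92, 93]],   # near-infrared band (invisible spectrum)
}
```

Requesting 800 nm, for instance, would select the 850 nm near-IR plane, information the red, green, and blue receptors of the eye cannot capture.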
It is self-evident that strict sterilization of the operating room and surgical equipment is required during any surgical procedure. The stringent hygiene and sterilization conditions required in a "surgical room" (i.e., an operating room or treatment room) demand the highest possible sterility of all medical devices and equipment. Part of this sterilization process extends to any substance that needs to be sterilized, including the
In various aspects, the
As shown in fig. 2, a
In one aspect,
Referring to fig. 2, a
Referring now to fig. 3,
During surgery, the application of energy to tissue for sealing and/or cutting is typically associated with smoke evacuation, aspiration of excess fluid, and/or irrigation of the tissue. Fluid lines, power lines, and/or data lines from different sources are often tangled during surgery, and valuable time may be lost addressing the problem mid-procedure. Untangling the lines may require disconnecting them from their respective modules, which may require resetting the modules. The hub
Aspects of the present disclosure provide a surgical hub for use in a surgical procedure involving application of energy to tissue at a surgical site. The surgical hub includes a hub housing and a combined generator module slidably received in a docking station of the hub housing. The docking station includes data contacts and power contacts. The combined generator module includes two or more of an ultrasonic energy generator component, a bipolar RF energy generator component, and a monopolar RF energy generator component seated in a single unit. In one aspect, the combined generator module further comprises a smoke evacuation component, at least one energy delivery cable for connecting the combined generator module to a surgical instrument, at least one smoke evacuation component configured to evacuate smoke, fluid, and/or particulates generated by the application of therapeutic energy to tissue, and a fluid line extending from the remote surgical site to the smoke evacuation component.
In one aspect, the fluid line is a first fluid line and the second fluid line extends from the remote surgical site to a suction and irrigation module slidably received in the hub housing. In one aspect, the hub housing includes a fluid interface.
Certain surgical procedures may require more than one energy type to be applied to tissue. One energy type may be more advantageous for cutting tissue, while a different energy type may be more advantageous for sealing tissue. For example, a bipolar generator may be used to seal tissue, while an ultrasonic generator may be used to cut the sealed tissue. Aspects of the present disclosure provide a solution in which the hub
Aspects of the present disclosure provide a modular surgical housing for use in a surgical procedure involving the application of energy to tissue. The modular surgical housing includes a first energy generator module configured to generate a first energy for application to tissue, and a first docking station including a first docking port with first data and power contacts, wherein the first energy generator module is slidably movable into electrical engagement with the first power and data contacts and slidably movable out of electrical engagement with the first power and data contacts.
Further to the above, the modular surgical housing also includes a second energy generator module configured to generate a second energy, different from the first energy, for application to tissue, and a second docking station including a second docking port with second data and power contacts, wherein the second energy generator module is slidably movable into electrical engagement with the second power and data contacts and slidably movable out of electrical engagement with the second power and data contacts.
In addition, the modular surgical housing further includes a communication bus between the first docking port and the second docking port configured to facilitate communication between the first energy generator module and the second energy generator module.
Referring to fig. 3-7, aspects of the present disclosure are presented as a hub
In one aspect, the hub
In one aspect, the hub
In various aspects, the
In various aspects, the suction/
In one aspect, a surgical tool includes a shaft having an end effector at a distal end thereof, at least one energy treatment element associated with the end effector, a suction tube, and an irrigation tube. The suction tube may have an inlet at a distal end thereof, and the suction tube extends through the shaft. Similarly, an irrigation tube may extend through the shaft and may have an inlet adjacent the energy delivery tool. The energy delivery tool is configured to deliver ultrasonic and/or RF energy to the surgical site and is coupled to the
The irrigation tube may be in fluid communication with a fluid source, and the aspiration tube may be in fluid communication with a vacuum source. The fluid source and/or vacuum source may be seated in the suction/
In one aspect, the
In some aspects, the
In addition, the contacts of a particular module may be keyed to engage the contacts of a particular drawer to avoid inserting the module into a drawer having unmatched contacts.
As shown in fig. 4, the
Fig. 6 illustrates a separate power bus attachment for a plurality of lateral docking ports of a lateral
Fig. 7 illustrates a vertical modular housing 164 configured to receive a plurality of modules 165 of
In various aspects, the
During a surgical procedure, it may be inefficient to remove a surgical device from a surgical site and replace the surgical device with another surgical device that includes a different camera or a different light source. Temporary loss of vision at the surgical site can lead to undesirable consequences. The modular imaging apparatus of the present disclosure is configured to enable midstream replacement of a light source module or a camera module during a surgical procedure without having to remove the imaging apparatus from the surgical site.
In one aspect, an imaging device includes a tubular housing including a plurality of channels. The first channel is configured to slidably receive a camera module that may be configured for snap-fit engagement with the first channel. The second channel is configured to slidably receive a light source module that may be configured for snap-fit engagement with the second channel. In another example, the camera module and/or the light source module may be rotated within their respective channels to a final position. Threaded engagement may be used instead of snap-fit engagement.
In various examples, multiple imaging devices are placed at different locations in a surgical field to provide multiple views. The
Various image processors and imaging devices suitable for use in the present disclosure are described in U.S. Patent No. 7,995,045, entitled "COMBINED SBI AND CONVENTIONAL IMAGE PROCESSOR," issued August 9, 2011, which is incorporated herein by reference in its entirety. Further, U.S. Patent No. 7,982,776, entitled "SBI MOTION ARTIFACT REMOVAL APPARATUS AND METHOD," issued July 19, 2011, which is incorporated herein by reference in its entirety, describes various systems for removing motion artifacts from image data. Such a system may be integrated with the
Fig. 8 illustrates a surgical data network 201 including a modular communication hub 203, the modular communication hub 203 configured to enable connection of modular devices located in one or more operating rooms of a medical facility or any room in the medical facility specifically equipped for surgical operations to a cloud-based system (e.g., a cloud 204 that may include a remote server 213 coupled to a storage device 205). In one aspect, modular communication hub 203 includes a network hub 207 and/or a network switch 209 that communicate with network routers. Modular communication hub 203 may also be coupled to local computer system 210 to provide local computer processing and data manipulation. The surgical data network 201 may be configured to be passive, intelligent, or switched. The passive surgical data network acts as a conduit for data, enabling it to be transferred from one device (or segment) to another device (or segment) as well as cloud computing resources. The intelligent surgical data network includes additional features to enable monitoring of traffic through the surgical data network and to configure each port in the hub 207 or network switch 209. The intelligent surgical data network may be referred to as a manageable hub or switch. The switching hub reads the destination address of each packet and then forwards the packet to the correct port.
Modular devices 1a-1n located in an operating room may be coupled to a modular communication hub 203. Network hub 207 and/or network switch 209 may be coupled to network router 211 to connect devices 1a-1n to cloud 204 or local computer system 210. Data associated with the devices 1a-1n may be transmitted via the router to the cloud-based computer for remote data processing and manipulation. Data associated with the devices 1a-1n may also be transmitted to the local computer system 210 for local data processing and manipulation. Modular devices 2a-2m located in the same operating room may also be coupled to network switch 209. Network switch 209 may be coupled to network hub 207 and/or network router 211 to connect devices 2a-2m to cloud 204. Data associated with the devices 2a-2n may be transmitted via the network router 211 to the cloud 204 for data processing and manipulation. Data associated with the devices 2a-2m may also be transmitted to the local computer system 210 for local data processing and manipulation.
It should be understood that surgical data network 201 may be expanded by interconnecting multiple hubs 207 and/or multiple network switches 209 with multiple network routers 211. The modular communication hub 203 may be contained in a modular control tower configured to be capable of receiving a plurality of devices 1a-1n/2a-2 m. Local computer system 210 may also be contained in a modular control tower. The modular communication hub 203 is connected to a display 212 to display images obtained by some of the devices 1a-1n/2a-2m, for example, during surgery. In various aspects, the devices 1a-1n/2a-2m may include, for example, various modules such as non-contact sensor modules in an
In one aspect, the surgical data network 201 may include a combination of network hub(s), network switch(es), and network router(s) that connect the devices 1a-1n/2a-2m to the cloud. Any or all of the devices 1a-1n/2a-2m coupled to the hub or network switch may collect data in real time and transmit the data into the cloud computer for data processing and manipulation. It should be appreciated that cloud computing relies on shared computing resources rather than using local servers or personal devices to process software applications. The term "cloud" may be used as a metaphor for "the Internet," although the term is not so limited. Accordingly, the term "cloud computing" may be used herein to refer to a "type of Internet-based computing" in which different services (such as servers, memory, and applications) are delivered over the Internet to the modular communication hub 203 and/or computer system 210 located in a surgical room (e.g., a fixed, mobile, temporary, or field operating room or space) and to devices connected to the modular communication hub 203 and/or computer system 210. The cloud infrastructure may be maintained by a cloud service provider. In this case, the cloud service provider may be an entity that coordinates the use and control of the devices 1a-1n/2a-2m located in one or more operating rooms. Cloud computing services can perform a large amount of computing based on data collected by smart surgical instruments, robots, and other computerized devices located in the operating room. The hub hardware enables multiple devices or connections to connect to a computer in communication with the cloud computing resources and memory.
Applying cloud computer data processing techniques to the data collected by the devices 1a-1n/2a-2m, the surgical data network provides improved surgical results, reduced costs and improved patient satisfaction. At least some of the devices 1a-1n/2a-2m may be employed to observe tissue conditions to assess leakage or perfusion of sealed tissue following tissue sealing and cutting procedures. At least some of the devices 1a-1n/2a-2m may be employed to identify pathologies, such as the effects of disease, using cloud-based computing to examine data including images of body tissue samples for diagnostic purposes. This includes localization and edge confirmation of tissues and phenotypes. At least some of the devices 1a-1n/2a-2m may be employed to identify anatomical structures of the body using various sensors integrated with imaging devices and techniques, such as overlaying images captured by multiple imaging devices. The data (including image data) collected by the devices 1a-1n/2a-2m may be transmitted to the cloud 204 or the local computer system 210 or both for data processing and manipulation, including image processing and manipulation. Such data analysis may further employ outcome analysis processing, and use of standardized methods may provide beneficial feedback to confirm or suggest modification of the behavior of the surgical treatment and surgeon.
In one implementation, the operating room devices 1a-1n may be connected to the modular communication hub 203 through a wired channel or a wireless channel, depending on the configuration of the devices 1a-1n to the network hub. In one aspect, the hub 207 may be implemented as a local network broadcaster operating at the physical layer of the Open Systems Interconnection (OSI) model. The hub provides connectivity to the devices 1a-1n located in the same operating room network. The hub 207 collects data in the form of packets and transmits them to the router in half-duplex mode. The hub 207 does not store any media access control/internet protocol (MAC/IP) addresses for transmitting device data. Only one of the devices 1a-1n can transmit data through the hub 207 at a time. The hub 207 has no routing tables or intelligence about where to send information; it broadcasts all network data over each connection, as well as to the remote server 213 (fig. 9) through the cloud 204. The hub 207 can detect basic network errors such as collisions, but broadcasting all information to multiple ports can present a security risk and lead to bottlenecks.
In another implementation, the operating room devices 2a-2m may be connected to the network switch 209 via a wired channel or a wireless channel. Network switch 209 operates in the data link layer of the OSI model. The network switch 209 is a multicast device for connecting devices 2a-2m located in the same operating room to the network. Network switch 209 sends data in frames to network router 211 and operates in full duplex mode. Multiple devices 2a-2m may transmit data simultaneously through the network switch 209. The network switch 209 stores and uses the MAC addresses of the devices 2a-2m to transmit data.
Network hub 207 and/or network switch 209 are coupled to network router 211 to connect to cloud 204. Network router 211 operates in the network layer of the OSI model. Network router 211 creates a route for transmitting data packets received from network hub 207 and/or network switch 209 to the cloud-based computer resources for further processing and manipulation of data collected by any or all of devices 1a-1n/2a-2m. Network router 211 may be employed to connect two or more different networks located at different locations, such as, for example, different operating rooms of the same medical facility or different networks located in different operating rooms of different medical facilities. Network router 211 sends data in packets to cloud 204 and operates in full duplex mode. Multiple devices may transmit data simultaneously. The network router 211 transmits data using the IP address.
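The division of labor among hub, switch, and router described above can be summarized in a short sketch (illustrative only; class names and data structures are invented for this example, not the hub's actual firmware): the hub floods every frame, the switch learns MAC-to-port mappings from source addresses, and the router matches destination IP prefixes.

```python
class NetworkHub:
    """Physical-layer repeater: broadcasts every frame to all other ports."""
    def __init__(self, n_ports):
        self.n_ports = n_ports

    def forward(self, frame, in_port):
        # No MAC/IP state: flood to every port except the ingress port.
        return [p for p in range(self.n_ports) if p != in_port]

class NetworkSwitch:
    """Data-link-layer device: learns source MACs, forwards by destination MAC."""
    def __init__(self, n_ports):
        self.n_ports = n_ports
        self.mac_table = {}  # MAC address -> port

    def forward(self, frame, in_port):
        self.mac_table[frame["src_mac"]] = in_port     # learn the sender
        dst_port = self.mac_table.get(frame["dst_mac"])
        if dst_port is None:
            # Unknown destination: flood, like a hub, until it is learned.
            return [p for p in range(self.n_ports) if p != in_port]
        return [dst_port]

class NetworkRouter:
    """Network-layer device: routes packets by destination IP prefix."""
    def __init__(self, routes):
        self.routes = routes  # IP prefix string -> next hop name

    def forward(self, packet):
        for prefix, next_hop in self.routes.items():
            if packet["dst_ip"].startswith(prefix):
                return next_hop
        return None
```

Note how the switch only floods until it has learned a destination; afterwards it delivers frames to a single port, which is why multiple devices can transmit through it simultaneously in full-duplex mode.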
In one example, hub 207 may be implemented as a USB hub, which allows multiple USB devices to be connected to a host. A USB hub may extend a single USB port to multiple tiers so that more ports are available for connecting devices to a host system computer. Hub 207 may include wired or wireless capabilities for receiving information over a wired channel or a wireless channel. In one aspect, a wireless USB short-range, high bandwidth wireless radio communication protocol may be used for communication between devices 1a-1n and devices 2a-2m located in an operating room.
In other examples, the operating room devices 1a-1n/2a-2m may communicate with the modular communication hub 203 via the Bluetooth wireless technology standard for exchanging data over short distances from fixed and mobile devices (using short-wavelength UHF radio waves of 2.4 to 2.485 GHz in the ISM band) and building personal area networks (PANs). In other aspects, the operating room devices 1a-1n/2a-2m may communicate with the modular communication hub 203 via a variety of wireless or wired communication standards or protocols, including but not limited to Wi-Fi (IEEE 802.11 family), WiMAX (IEEE 802.16 family), IEEE 802.20, Long Term Evolution (LTE) and Ev-DO, HSPA+, HSDPA+, HSUPA+, EDGE, GSM, GPRS, CDMA, TDMA, DECT, and Ethernet derivatives thereof, as well as any other wireless and wired protocols designated 3G, 4G, 5G, and beyond. The computing module may include a plurality of communication modules. For example, a first communication module may be dedicated to shorter-range wireless communications such as Wi-Fi and Bluetooth, and a second communication module may be dedicated to longer-range wireless communications such as GPS, EDGE, GPRS, CDMA, WiMAX, LTE, Ev-DO, and the like.
The modular communication hub 203 may serve as a central connection for one or all of the operating room devices 1a-1n/2a-2m and handles a data type known as frames. The frames carry data generated by the devices 1a-1n/2a-2m. When the modular communication hub 203 receives a frame, the frame is amplified and transmitted to the network router 211, which transmits the data to the cloud computing resources using any of a number of wireless or wired communication standards or protocols, as described herein.
Modular communication hub 203 may be used as a stand-alone device or connected to compatible network hubs and network switches to form a larger network. The modular communication hub 203 is generally easy to install, configure, and maintain, making it a good option for networking the operating room devices 1a-1n/2a-2m.
Fig. 9 shows a computer-implemented interactive surgical system 200. The computer-implemented interactive surgical system 200 is similar in many respects to the computer-implemented interactive surgical system 100. For example, the computer-implemented interactive surgical system 200 includes one or more surgical systems 202 that are similar in many respects to the
Fig. 10 shows the surgical hub 206 including a plurality of modules coupled to a modular control tower 236. The modular control tower 236 includes a modular communication hub 203 (e.g., a network connectivity device) and a computer system 210 to provide, for example, local processing, visualization, and imaging. As shown in fig. 10, the modular communication hub 203 may be connected in a hierarchical configuration to expand the number of modules (e.g., devices) connectable to the modular communication hub 203 and transmit data associated with the modules to the computer system 210, cloud computing resources, or both. As shown in fig. 10, each of the network hubs/switches in modular communication hub 203 includes three downstream ports and one upstream port. The upstream hub/switch is connected to the processor to provide a communication connection with the cloud computing resources and the local display 217. Communication with the cloud 204 may be through a wired or wireless communication channel.
The surgical hub 206 employs the non-contact sensor module 242 to measure the dimensions of the operating room and uses ultrasonic or laser-type non-contact measurement devices to generate a map of the operating room. An ultrasound-based non-contact sensor module scans the operating room by emitting a burst of ultrasound and receiving the echo as it bounces off the perimeter walls of the operating room, as described under the heading "Surgical Hub Spatial Awareness Within an Operating Room" in U.S. provisional patent application Ser. No. 62/611,341, entitled "INTERACTIVE SURGICAL PLATFORM," filed December 28, 2017, which is incorporated herein by reference in its entirety, wherein the sensor module is configured to determine the size of the operating room and adjust the Bluetooth pairing distance limit. A laser-based non-contact sensor module scans the operating room by emitting laser pulses, receiving laser pulses bouncing off the perimeter walls of the operating room, and comparing the phase of the emitted pulses to that of the received pulses to determine the size of the operating room and adjust the Bluetooth pairing distance limit.
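Both sensor types reduce to a time-of-flight computation: distance is the propagation speed multiplied by the round-trip time, halved. The minimal sketch below uses assumed constants (the function and constant names are invented for illustration, not the sensor module's actual firmware):

```python
# Approximate propagation speeds for the two sensor types described above.
SPEED_OF_SOUND_AIR = 343.0      # m/s, ultrasound in air at room temperature
SPEED_OF_LIGHT = 299_792_458.0  # m/s, laser pulse in vacuum/air (approx.)

def echo_distance(round_trip_s: float, speed: float) -> float:
    """Distance to the reflecting surface (e.g., an operating room wall)
    given the measured round-trip time of an emitted pulse and its echo."""
    return speed * round_trip_s / 2.0

# An ultrasonic echo returning after 35 ms implies a wall about 6 m away.
wall_distance_m = echo_distance(0.035, SPEED_OF_SOUND_AIR)
```

A room mapped this way (one distance per scan direction) could then feed the Bluetooth pairing distance limit adjustment described above.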
Computer system 210 includes a processor 244 and a network interface 245. The processor 244 is coupled via a system bus to the communication module 247, storage 248, memory 249, non-volatile memory 250, and input/output interface 251. The system bus can be any of several types of bus structure(s) including the memory bus or memory controller, a peripheral bus or external bus, and/or a local bus using any variety of available bus architectures including, but not limited to, 9-bit bus, Industrial Standard Architecture (ISA), Micro Channel Architecture (MSA), Extended ISA (EISA), Intelligent Drive Electronics (IDE), VESA Local Bus (VLB), Peripheral Component Interconnect (PCI), USB, Advanced Graphics Port (AGP), Personal Computer Memory Card International Association bus (PCMCIA), Small Computer System Interface (SCSI), or any other peripheral bus.
The processor 244 may be any single-core or multi-core processor, such as those known under the trade name ARM Cortex, available from Texas Instruments. In one aspect, the processor may be, for example, an LM4F230H5QR ARM Cortex-M4F processor core, available from Texas Instruments, which includes 256 KB of single-cycle flash or other non-volatile on-chip memory (up to 40 MHz), a prefetch buffer to improve performance above 40 MHz, 32 KB of single-cycle sequential random access memory (SRAM), internal read-only memory (ROM) loaded with StellarisWare® software, 2 KB of electrically erasable programmable read-only memory (EEPROM), one or more pulse width modulation (PWM) modules, one or more quadrature encoder input (QEI) analogs, and one or more 12-bit analog-to-digital converters (ADCs) with 12 analog input channels, details of which are available in the product data sheet. In one aspect, the processor 244 may comprise a safety controller comprising two controller-based families, such as TMS570 and RM4x, known under the trade name Hercules ARM Cortex R4, also produced by Texas Instruments. The safety controller may be configured specifically for IEC 61508 and ISO 26262 safety-critical applications, among others, to provide advanced integrated safety features while delivering scalable performance, connectivity, and memory options.
The system memory includes volatile memory and non-volatile memory. The basic input/output system (BIOS), containing the basic routines to transfer information between elements within the computer system, such as during start-up, is stored in non-volatile memory. For example, non-volatile memory can include ROM, programmable ROM (PROM), electrically programmable ROM (EPROM), EEPROM, or flash memory. Volatile memory includes random access memory (RAM), which acts as external cache memory. Further, RAM may be available in a variety of forms, such as SRAM, dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and direct Rambus RAM (DRRAM).
The computer system 210 also includes removable/non-removable, volatile/nonvolatile computer storage media such as, for example, disk storage. Disk storage includes, but is not limited to, devices such as a magnetic disk drive, floppy disk drive, tape drive, Jaz drive, Zip drive, LS-60 drive, flash memory card, or memory stick. In addition, disk storage can include storage media separately or in combination with other storage media including, but not limited to, an optical disk drive such as a compact disk ROM device (CD-ROM), compact disk recordable drive (CD-R drive), compact disk rewritable drive (CD-RW drive) or a digital versatile disk ROM drive (DVD-ROM). To facilitate connection of the disk storage devices to the system bus, a removable or non-removable interface may be used.
It is to be appreciated that the computer system 210 includes software that acts as an intermediary between users and the basic computer resources described in suitable operating environments. Such software includes an operating system. An operating system, which may be stored on disk storage, is used to control and allocate resources of the computer system. System applications utilize the operating system to manage resources through program modules and program data stored in system memory or on disk storage. It is to be appreciated that the various components described herein can be implemented with various operating systems or combinations of operating systems.
A user enters commands or information into the computer system 210 through input device(s) coupled to the I/O interface 251. Input devices include, but are not limited to, a pointing device such as a mouse, trackball, stylus, touch pad, keyboard, microphone, joystick, game pad, satellite dish, scanner, TV tuner card, digital camera, digital video camera, web camera, and the like. These and other input devices are connected to the processor through the system bus via interface port(s). The interface port(s) include, for example, a serial port, a parallel port, a game port, and a USB. The output device(s) use the same type of port as the input device(s). Thus, for example, a USB port may be used to provide input to a computer system and to output information from the computer system to an output device. Output adapters are provided to illustrate that there are some output devices (such as monitors, displays, speakers, and printers) that require special adapters among other output devices.
The computer system 210 may operate in a networked environment using logical connections to one or more remote computers, such as cloud computer(s) or a local computer. The remote cloud computer(s) can be a personal computer, server, router, network PC, workstation, microprocessor-based appliance, peer device, or other common network node, and typically includes many or all of the elements described relative to the computer system. For purposes of brevity, only a memory storage device is illustrated with the remote computer(s). The remote computer(s) is logically connected to the computer system through a network interface and then physically connected via a communication connection. Network interfaces encompass communication networks such as Local Area Networks (LANs) and Wide Area Networks (WANs). LAN technologies include Fiber Distributed Data Interface (FDDI), Copper Distributed Data Interface (CDDI), Ethernet/IEEE 802.3, Token Ring/IEEE 802.5, and the like. WAN technologies include, but are not limited to, point-to-point links, circuit-switching networks such as Integrated Services Digital Networks (ISDN) and variations thereon, packet-switching networks, and Digital Subscriber Lines (DSL).
In various aspects, the computer system 210, imaging module 238, and/or visualization system 208 of fig. 10, and/or the processor module 232 of fig. 9-10 may include an image processor, an image processing engine, a media processor, or any dedicated Digital Signal Processor (DSP) for processing digital images. The image processor may employ parallel computing with single instruction, multiple data (SIMD) or multiple instruction, multiple data (MIMD) techniques to increase speed and efficiency. The digital image processing engine may perform a series of tasks. The image processor may be a system on a chip having a multi-core processor architecture.
Communication connection(s) refers to the hardware/software used to interface the network to the bus. While a communication connection is shown for exemplary clarity within the computer system, it can also be external to computer system 210. The hardware/software necessary for connection to the network interface includes, for exemplary purposes only, internal and external technologies such as, modems including regular telephone grade modems, cable modems and DSL modems, ISDN adapters, and Ethernet cards.
Fig. 11 illustrates a functional block diagram of one aspect of a
The
The
In various aspects, the
Additional details regarding the structure and function of the surgical hub and/or surgical hub network can be found in U.S. provisional patent application No. 62/659,900, entitled "METHOD OF HUB COMMUNICATION", filed April 19, 2018, which is incorporated herein by reference in its entirety.
Cloud system hardware and functional module
Fig. 12 is a block diagram of a computer-implemented interactive surgical system in accordance with at least one aspect of the present disclosure. In one aspect, the computer-implemented interactive surgical system is configured to monitor and analyze data related to the operation of various surgical systems, including surgical hubs, surgical instruments, robotic devices, and operating rooms or medical facilities. The computer-implemented interactive surgical system includes a cloud-based analysis system. While the cloud-based analysis system is described as a surgical system, it is not necessarily so limited and may generally be a cloud-based medical system. As shown in fig. 12, the cloud-based analysis system includes a plurality of surgical instruments 7012 (which may be the same as or similar to instrument 112), a plurality of surgical hubs 7006 (which may be the same as or similar to hub 106), and a surgical data network 7001 (which may be the same as or similar to network 201) to couple the surgical hubs 7006 to the cloud 7004.
In addition, the surgical instrument 7012 can include a transceiver for transmitting data to and from its corresponding surgical hub 7006 (which can also include a transceiver). The combination of the surgical instrument 7012 and the
Based on the connections to the various
The particular cloud computing system configurations described in this disclosure are specifically designed to address various issues arising in the context of medical procedures and procedures performed using medical devices (such as the surgical instruments 7012, 112). In particular, the surgical instrument 7012 can be a digital surgical device configured to interact with the cloud 7004 for implementing techniques that improve performance of a surgical procedure. Various surgical instruments 7012 and/or the
Fig. 13 is a block diagram illustrating a functional architecture of a computer-implemented interactive surgical system in accordance with at least one aspect of the present disclosure. The cloud-based analysis system includes a plurality of
For example, the data collection and
The patient
The control
The cloud-based analysis system may include security features implemented by the cloud 7004. These security features may be managed by the authorization and
Further, for security purposes, the cloud may maintain a database of
The surgical instrument 7012 may use the wireless transceiver to transmit a wireless signal, which may represent, for example, authorization credentials for accessing the corresponding
The cloud-based analysis system may allow monitoring of multiple medical facilities (e.g., medical facilities such as hospitals) to determine improved practices and suggest changes accordingly (e.g., via suggestion module 2030). Thus, the processor 7008 of the cloud 7004 may analyze data associated with an individual medical facility to identify the facility and aggregate the data with other data associated with other medical facilities in the group. For example, groups may be defined based on similar operational practices or geographic locations. In this way, the cloud 7004 can provide analysis and recommendations across a group of medical facilities. Cloud-based analytics systems may also be used to enhance situational awareness. For example, the processor 7008 may predictively model the impact of the recommendation on the cost and effectiveness of a particular facility (relative to the overall operation and/or various medical procedures). The cost and effectiveness associated with that particular facility may also be compared to corresponding local areas of other facilities or any other comparable facility.
Data classification and
Additional details regarding the cloud analysis system can be found in U.S. provisional patent application No. 62/659,900, entitled "METHOD OF HUB COMMUNICATION", filed April 19, 2018, which is incorporated herein by reference in its entirety.
Situation awareness
While a "smart" device that includes a control algorithm responsive to sensed data may be an improvement over a "dumb" device that operates without regard to sensed data, some sensed data can be incomplete or inconclusive when considered in isolation, i.e., without the context of the type of surgical procedure being performed or the type of tissue being operated on. Without knowing the surgical context (e.g., the type of tissue being operated on or the type of procedure being performed), the control algorithm may control the modular device incorrectly or sub-optimally given the particular context-free sensed data. For example, the optimal manner in which a control algorithm controls a surgical instrument in response to a particular sensed parameter can vary according to the particular tissue type being operated on, because different tissue types have different properties (e.g., resistance to tearing) and thus respond differently to actions taken by a surgical instrument. Therefore, it may be desirable for a surgical instrument to take different actions even when the same measurement for a particular parameter is sensed. As one specific example, the optimal manner in which to control a surgical stapling and severing instrument in response to the instrument sensing an unexpectedly high force to close its end effector will vary depending upon whether the tissue type is susceptible or resistant to tearing. For tissue that is prone to tearing (such as lung tissue), the instrument's control algorithm would optimally ramp down the motor speed in response to an unexpectedly high force to close, thereby avoiding tearing the tissue. For tissue that is resistant to tearing (such as stomach tissue), the instrument's control algorithm would optimally ramp up the motor speed in response to an unexpectedly high force to close, thereby ensuring that the end effector is properly clamped on the tissue. Without knowing whether lung tissue or stomach tissue has been clamped, the control algorithm may make a suboptimal decision.
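The tissue-dependent branching described above can be sketched as a small control routine. This is a minimal illustration only; the tissue categories, speed multipliers, and force limit are assumptions for demonstration, not values from the source.

```python
# Hypothetical sketch: tissue-aware response to an unexpectedly high closure force.
# Tissue names, multipliers, and the force limit are illustrative assumptions.
TEAR_PRONE = {"lung"}         # ramp motor down to avoid tearing fragile tissue
TEAR_RESISTANT = {"stomach"}  # ramp motor up to ensure a firm clamp

def adjust_motor_speed(current_speed: float, closure_force: float,
                       force_limit: float, tissue_type: str) -> float:
    """Return an adjusted motor speed when the closure force exceeds the limit."""
    if closure_force <= force_limit:
        return current_speed          # nominal operation: no change
    if tissue_type in TEAR_PRONE:
        return current_speed * 0.5    # slow down on tear-prone tissue
    if tissue_type in TEAR_RESISTANT:
        return current_speed * 1.5    # speed up to seat the clamp
    return current_speed * 0.75       # unknown tissue: conservative default
```

A context-free algorithm would have to pick one of these branches blindly, which is exactly the suboptimal decision the situational awareness system is meant to avoid.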
One solution utilizes a surgical hub that includes a system configured to derive information about the surgical procedure being performed based on data received from various data sources, and then control the paired modular devices accordingly. In other words, the surgical hub is configured to infer information about the surgical procedure from the received data and then control the modular devices paired with the surgical hub based on the inferred context of the surgical procedure. Fig. 14 illustrates a diagram of a situation-aware
The surgical hub 5104 (which may be similar in many respects to the hub 106) may be configured to be capable of deriving background information related to the surgical procedure from the data, e.g., based on a particular combination of the received data or a particular order in which the data is received from the
The situational awareness system of the
The
As another example, the type of tissue being operated on may affect the adjustment of the compressibility and loading thresholds of the surgical stapling and severing instrument for a particular tissue gap measurement. The situational aware
As yet another example, the type of body cavity that is manipulated during an insufflation procedure may affect the function of the smoke extractor. The situational awareness
As yet another example, the type of procedure being performed may affect the optimal energy level at which an ultrasonic surgical instrument or a Radio Frequency (RF) electrosurgical instrument operates. For example, arthroscopic surgery requires higher energy levels because the end effector of an ultrasonic surgical instrument or RF electrosurgical instrument is immersed in fluid. The situation aware
As yet another example, data can be extracted from the
Another benefit includes actively and automatically controlling the paired
As another example, the situation aware
As yet another example, the situation aware
Another benefit includes checking for errors during setup of the surgical procedure or during the course of the surgical procedure. For example, the situational awareness
As another example, the situational awareness
In general, the situational awareness system for the
Referring now to fig. 15, a timeline 5200 depicting situational awareness of a hub, such as the
The situation aware
As a first step 5202 in this exemplary procedure, the hospital staff retrieves the patient's EMR from the hospital's EMR database. Based on the selected patient data in the EMR, the
In a second step 5204, the staff scans the incoming medical supplies for the procedure. The
In a third step 5206, medical personnel scan the patient belt via a scanner communicatively connected to the
In a fourth step 5208, the medical staff turns on the ancillary equipment. The ancillary equipment utilized may vary depending on the type of surgery and the technique to be used by the surgeon, but in this exemplary case it includes a smoke ejector, an insufflator, and a medical imaging device. When activated, the ancillary equipment, as modular devices, may be automatically paired with a
In a fifth step 5210, the practitioner attaches EKG electrodes and other patient monitoring devices to the patient. EKG electrodes and other patient monitoring devices can be paired with the
In a sixth step 5212, the medical personnel induce anesthesia in the patient. The
In a seventh step 5214, the patient's lungs being operated on are collapsed (while ventilation is switched to the contralateral lungs). For example, the
In an eighth step 5216, a medical imaging device (e.g., an endoscope) is inserted and video from the medical imaging device is initiated. The
In a ninth step 5218, the surgical team begins the dissection step of the procedure. The
In a tenth step 5220, the surgical team continues with the ligation step of the procedure. The
In an eleventh step 5222, the segmental resection portion of the procedure is performed. The
In a twelfth step 5224, the node dissection step is performed. The
In a thirteenth step 5226, the patient's anesthesia is reversed. For example, the
Finally, in a fourteenth step 5228, the medical personnel remove the various patient monitoring devices from the patient. Thus, when the hub loses EKG, BP, and other data from the patient monitoring devices, the
Situational awareness is further described in U.S. provisional patent application serial No. 62/659,900, entitled "METHOD OF HUB COMMUNICATION", filed April 19, 2018, which is incorporated herein by reference in its entirety. In certain instances, operation of the robotic surgical system (including, for example, the various robotic surgical systems disclosed herein) may be controlled by the
Surgical assessment
In some aspects, the computer system described herein is programmed to evaluate surgical personnel (e.g., how the surgical personnel are using a surgical instrument) during a surgical procedure and to offer suggestions to improve the surgical personnel's technique or actions. In one aspect, a computer system described herein, such as
Fig. 16 is a diagram of an exemplary operating room setup in accordance with at least one aspect of the present disclosure. In various implementations, the surgical hub 211801 may be connected via a communication protocol (e.g., Bluetooth) to one or more cameras 211802, surgical instruments 211810, displays 211806, and other surgical devices within the OR 211800, as described above under the heading "Surgical Hub". The camera 211802 may be oriented to capture images and/or video of the surgical personnel 211803 during the course of a surgical procedure. Thus, the surgical hub 211801 may receive captured image and/or video data from the camera 211802 to visually analyze the technique or physical characteristics of the surgical personnel 211803 during the surgical procedure.
Fig. 17 is a logic flow diagram of a process 211000 for visually evaluating surgical personnel in accordance with at least one aspect of the present disclosure. In the following description of process 211000, reference should also be made to fig. 10 and 16. The process 211000 may be performed by a processor or control circuitry of a computer system, such as the processor 244 of the surgical hub 206 shown in fig. 10. Thus, the process 211000 may be embodied as a set of computer-executable instructions stored in the memory 249 that, when executed by the processor 244, cause a computer system (e.g., the surgical hub 211801) to perform the described steps.
As described above under the heading "Surgical Hub", a computer system, such as the surgical hub 211801, may be connected to or paired with a variety of surgical devices, such as surgical instruments, generators, smoke ejectors, displays, and so on. Through its connections to these surgical devices, the surgical hub 211801 may receive an array of perioperative data from these paired surgical devices as they are used during a surgical procedure. Further, as described above under the heading "Situational Awareness", the surgical hub 211801 may determine the context of the surgical procedure being performed (e.g., the type of procedure or the surgical step being performed) based at least in part on the perioperative data received from these connected surgical devices. Accordingly, the processor 244 executing the process 211000 receives 211002 perioperative data from one or more surgical devices connected to or paired with the surgical hub 211801 and, utilizing situational awareness, determines 211004 a surgical context based at least in part on the received perioperative data. The surgical context determined by the surgical hub 211801 through situational awareness may be used to inform the assessment of the surgical personnel performing the surgical procedure.
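The contextual-inference step can be sketched as a rule over the combination of device events received by the hub. The device names and rule table below are invented for illustration; a real situational awareness system would use richer perioperative data and a learned or much larger rule base.

```python
# Illustrative sketch of inferring a surgical context from perioperative
# device events. Device names and the rule table are hypothetical.
def infer_surgical_context(events: list) -> str:
    """Guess the procedure context from the combination of device events.

    Each event is a dict with at least a "device" key naming the source device.
    """
    seen = {e["device"] for e in events}
    if "insufflator" in seen and "endoscope" in seen:
        return "laparoscopic_access"   # insufflation + scope imply lap access
    if "rf_generator" in seen:
        return "dissection"            # energy device activity implies dissection
    if "stapler" in seen:
        return "resection"             # stapler firings imply a resection step
    return "unknown"                   # insufficient data to infer a context
```

The returned context string could then key the baseline lookup used later in the process.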
Thus, the processor 244 captures 211006 one or more images of the surgical personnel performing the surgical procedure via, for example, the camera 211802 positioned within the OR 211800. The captured image or images may comprise still images or moving images (i.e., video). Images of the surgical personnel may be captured at a variety of angles and magnifications, using different filters, and so on. In one implementation, the cameras 211802 are arranged within the OR 211800 so that they can collectively visualize each surgical person performing the procedure.
Accordingly, the processor 244 determines 211008 the physical characteristics of the one or more surgical personnel from the captured one or more images. For example, the physical characteristics may include a pose (as discussed in connection with figs. 18-19) or a wrist angle (as discussed in connection with figs. 20-21). In other implementations, the physical characteristic may include the position, orientation, angle, or rotation of the individual's head, shoulders, torso, elbows, legs, hips, and so on. The physical characteristics may be determined 211008 using a variety of machine vision, image processing, object recognition, and optical tracking techniques. In one aspect, the physical characteristic may be determined 211008 by processing the captured image to detect the edges of objects in the image and comparing the detected edges to a template of the body part being evaluated. Once the evaluated body part has been identified, its position, orientation, and other characteristics may be tracked by comparing the motion of the tracked body part relative to the known position of the camera 211802. In another aspect, a marker-based optical system may be utilized to determine 211008 a physical characteristic (e.g., active markers embedded in the uniform of a surgical person that emit electromagnetic radiation or other signals receivable by the camera 211802 or other sensors connected to the surgical hub 211801). By tracking the movement of the markers relative to the camera 211802, the processor 244 may thus determine the corresponding position and orientation of the body part.
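Once marker (or body-part) positions are tracked in the camera frame, a joint angle such as a wrist angle can be derived from three tracked points. The sketch below is a hypothetical illustration using 2D camera-frame coordinates; a production system would work in 3D with calibrated cameras.

```python
import math

# Hypothetical sketch: derive a joint angle (e.g., a wrist angle) from three
# tracked marker positions a-b-c, where b is the joint vertex. Points are
# simplified 2D camera-frame coordinates.
def joint_angle(a, b, c) -> float:
    """Return the angle at vertex b, in degrees, formed by points a-b-c."""
    v1 = (a[0] - b[0], a[1] - b[1])   # vector from joint to first marker
    v2 = (c[0] - b[0], c[1] - b[1])   # vector from joint to second marker
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(v1[0], v1[1])
    n2 = math.hypot(v2[0], v2[1])
    return math.degrees(math.acos(dot / (n1 * n2)))
```

For example, forearm, wrist, and hand markers at right angles yield a 90-degree wrist angle.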
Thus, the processor 244 evaluates 211010 the determined physical characteristics of the surgical personnel based on the baseline. In one aspect, the baseline may correspond to a surgical context determined via situational awareness. The processor 244 may retrieve the baselines for the various physical characteristics from a memory (e.g., memory 249 shown in fig. 10), for example, according to a given surgical context. The baseline may include a value or range of values for a particular physical characteristic to be tracked during a particular surgical context. The type of physical characteristic evaluated in the different surgical contexts may be the same or unique for each particular surgical context.
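The baseline lookup and deviation test described above can be sketched as a table keyed by surgical context. The contexts, characteristic names, and acceptable ranges below are assumptions for illustration only.

```python
# Sketch of baseline retrieval and deviation testing keyed by surgical
# context. The table contents and ranges are hypothetical.
BASELINES = {
    ("dissection", "wrist_angle_deg"): (0.0, 45.0),   # (low, high) acceptable range
    ("resection", "wrist_angle_deg"): (10.0, 60.0),
}

def deviates_from_baseline(context: str, characteristic: str, value: float) -> bool:
    """True if the measured characteristic falls outside the baseline range
    retrieved for the given surgical context."""
    low, high = BASELINES[(context, characteristic)]
    return not (low <= value <= high)
```

Because the table is keyed by context, the same characteristic can be judged against different ranges in different surgical steps, matching the text's point that baselines may be unique to each surgical context.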
In one aspect, the processor 244 may provide feedback to the surgical personnel in real time during the surgical procedure. The real-time feedback may include graphical notifications or recommendations displayed on the display 211806 within the OR 211800, audio feedback emitted by the surgical hub 211801 or the surgical instrument 211810, and so on. Such feedback may include recommendations to offset trocar port placement, to move a surgical instrument from one trocar port to another, to adjust the positioning of the patient being operated on (e.g., positioning or rolling the patient at an increased surgical table angle), and other such suggestions to improve access to the surgical site and minimize non-ideal surgical techniques exhibited by the surgical personnel. In another aspect, the processor 244 may provide post-operative feedback to the surgical personnel. Post-operative feedback may include a graphical overlay or notification displayed on the captured video of the procedure, which the surgical personnel can review for learning purposes; a post-operative report indicating when the surgical personnel deviated from baseline or from a particular surgical step; and the like. Any visually identifiable physical characteristic (or combination of physical characteristics) may be used as a basis for advising the surgical personnel on improving the technique exhibited.
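The escalation from no feedback to an on-screen recommendation to a prominent warning can be sketched as a simple zone classification over the magnitude of a monitored deviation. The zone boundaries below are hypothetical stand-ins, not values from the source.

```python
# Hypothetical mapping from the magnitude of a monitored deviation to a
# feedback action; the zone boundaries are illustrative assumptions.
def feedback_action(deviation: float, notify_threshold: float = 10.0,
                    warn_threshold: float = 25.0) -> str:
    """Return 'none', 'notify', or 'warn' for a measured deviation."""
    if deviation <= notify_threshold:
        return "none"    # within tolerance: no feedback
    if deviation <= warn_threshold:
        return "notify"  # mild deviation: on-screen recommendation
    return "warn"        # large deviation: prominent audio/visual warning
```

The "notify" and "warn" outcomes would drive the display and audio feedback channels described above.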
In one aspect, one or more of the steps of process 211000 may be performed by a second computer system or a remote computer system, such as a CLOUD computing system described under the heading "CLOUD SYSTEM HARDWARE AND FUNCTIONAL MODULES". For example, the surgical hub 211801 may receive 211002 perioperative data from a connected surgical device, determine 211004 a surgical background based at least in part on the perioperative data, capture 211006 or receive images of surgical personnel 211803 via the camera 211802, and determine 211008 physical characteristics of the surgical personnel 211803, as described above. However, in this aspect, rather than performing the evaluation on the surgical hub 211801, the surgical hub 211801 may instead transmit data regarding the physical characteristics and the determined surgical context to a second computer system, such as a cloud computing system. The cloud computing system may then perform an evaluation by determining whether the determined physical characteristic deviates from a baseline physical characteristic corresponding to the surgical context. In some aspects, the baseline physical characteristics may be determined or calculated from data aggregated from all surgical hubs 211801 communicatively connected to the cloud computing system, which allows the cloud computing system to compare the technology of surgical personnel 211803 across multiple medical facilities. Thus, the cloud computing system may transmit a comparison between the physical characteristics determined by the surgical hub 211801 and the corresponding baselines stored on or determined by the cloud computing system. Upon receiving the results, the surgical hub 211801 may then take appropriate action (e.g., display a notification if the skill of the surgical personnel 211803 deviates from the baseline, as described above). 
In other aspects, one or more additional or different steps of process 211000 can be performed by other computing systems communicatively coupled to the first computing system. In some aspects, such connected computer systems may be embodied as distributed computing systems.
18-19 illustrate a hypothetical implementation of the process 211000 illustrated in FIG. 17, wherein the physical characteristic being evaluated is the pose of the surgical personnel. Fig. 18 is a diagram illustrating a series of models 211050a, 211050b, 211050c, 211050d of a surgical person 211052 during a course of a surgical procedure, according to at least one aspect of the present disclosure. Correspondingly, fig. 19 is a
Referring to fig. 19,
In one aspect, the pose of an individual evaluated by the computer system may be quantified as a measure of the deviation of one or more positions on the individual's body from corresponding initial or threshold positions. For example, fig. 18 shows the changes over time in the head position 211054, shoulder position 211056, and hip position 211058 of a modeled individual as a first line 211055, a second line 211057, and a third line 211059, respectively. In aspects utilizing a marker-based optical system, a surgeon's uniform may, for example, have markers located at one or more of these positions, which can be tracked by the optical system. In aspects utilizing a markerless optical system, the optical system can be configured to identify a surgical person and optically track the position and movement of one or more body parts or body positions of the identified surgical person. Further, the head, shoulder, and hip positions 211054, 211056, 211058 may be compared to a baseline head position 211060, a baseline shoulder position 211062, and a baseline hip position 211064, respectively. The baseline positions 211060, 211062, 211064 may correspond to the initial positions of the respective body parts (i.e., the positions of the body parts at time t0 in fig. 19) or may be predetermined thresholds against which the positions of the body parts are compared. In one aspect, the posture metric (as shown by
In one aspect, the surgical hub 211801 performing process 211000 may compare the calculated pose metric to one or more thresholds and then take various actions accordingly. In the depicted implementation, the surgical hub 211801 compares the pose metric to a
20-21 illustrate a hypothetical implementation of the process 211000 illustrated in FIG. 17, wherein the physical characteristic being evaluated is the wrist angle of the surgical personnel. Fig. 20 is a depiction of a surgeon holding a
In this implementation, the angle of the individual's
In one aspect, the surgical hub 211801 performing process 211000 can compare the calculated wrist angle a to one or more thresholds and then take various actions accordingly. In the depicted implementation, the surgical hub 211801 determines whether the surgeon's wrist angle a falls within a first zone defined by the
In some aspects, the various thresholds or baselines to which the monitored physical characteristics are compared may be empirically determined. The surgical hub 211801 and/or the cloud computing system described above under the heading "CLOUD SYSTEM HARDWARE AND FUNCTIONAL MODULES" may capture data relating to the various physical characteristics of surgical personnel from a sample population of surgical procedures for analysis. In one aspect, the computer system may correlate those physical characteristics with various surgical outcomes, and then set a threshold or baseline according to the particular physical characteristics of the surgeons or other surgical personnel associated with the highest degree of positive surgical outcomes. Thus, the surgical hub 211801 performing the process 211000 may provide a notification or warning when the surgical personnel deviate from best practice. In another aspect, the computer system may set a threshold or baseline according to the physical characteristics most commonly exhibited within the sample population. Thus, the surgical hub 211801 performing the process 211000 may provide a notification or warning when the surgical personnel deviate from the most common practice. For example, in fig. 21, the
In one aspect, the physical characteristics tracked by the surgical hub 211801 may be differentiated according to product type. Thus, the surgical hub 211801 may be configured to notify surgical personnel when the particular physical characteristic being tracked corresponds to a different product type. For example, the surgical hub 211801 may be configured to notify the surgeon when the surgeon's arm and/or wrist pose deviates from the baseline for the particular surgical instrument currently being utilized, and thus indicate that a different surgical instrument will be more appropriate.
In one aspect, the surgical hub 211801 can be configured to compare an external orientation of the surgical instrument 211810 to an internal access orientation of its end effector. The external orientation of the surgical instrument 211810 may be determined via the camera 211802 and optical system described above. The internal orientation of the end effector of the surgical instrument 211810 may be determined via an endoscope or another scope for visualizing the surgical site. By comparing the external orientation and the internal orientation of the surgical instrument 211810, the surgical hub 211801 can then determine whether a different type of surgical instrument 211810 will be more appropriate. For example, if the external orientation of the surgical instrument 211810 deviates from the internal orientation of the end effector of the surgical instrument 211810 by more than a threshold degree, the surgical hub 211801 may be configured to provide a notification to the surgical personnel.
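The external-versus-internal orientation check described above can be sketched as a shortest-angular-distance comparison against a threshold. The 15-degree threshold below is an assumed placeholder, not a value from the source.

```python
# Sketch comparing the externally observed instrument orientation with the
# endoscopically observed end-effector orientation. The threshold is an
# illustrative assumption.
def orientation_mismatch(external_deg: float, internal_deg: float,
                         threshold_deg: float = 15.0) -> bool:
    """True if the two orientation estimates disagree by more than the threshold."""
    diff = abs(external_deg - internal_deg) % 360.0
    diff = min(diff, 360.0 - diff)   # shortest angular distance on a circle
    return diff > threshold_deg
```

A True result would trigger the notification suggesting that a different surgical instrument type may be more appropriate.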
In summary, a computer system such as the surgical hub 211801 may be configured to provide recommendations to surgical personnel (e.g., a surgeon) when their technique begins to deviate from best or common practice. In some aspects, the computer system may be configured to provide notification or feedback only when an individual repeatedly exhibits suboptimal behavior during the course of a given surgical procedure. The notification provided by the computer system may suggest, for example, that the surgical personnel adjust their technique to conform to the best technique for the type of procedure, use a more appropriate instrument, or the like.
In one aspect, the computer system (e.g., the surgical hub 211801) may be configured to allow surgical personnel to compare their technique against themselves, rather than against a baseline established from a sample population or preprogrammed into the computer system. In other words, the baseline against which the computer system compares the surgical personnel may be the personnel's own prior performance in previous instances of a particular surgical procedure type or with a particular type of surgical instrument. Such aspects may be useful for allowing surgeons to track improvement in their surgical technique or to document trial periods for new surgical products. Thus, the surgical hub 211801 may be configured to evaluate a product during a trial period and provide highlights of the product's use over a given time period. In one aspect, the surgical hub 211801 may be programmed to be particularly sensitive to deviations between the performance of the surgical personnel and the corresponding baseline, so that the surgical hub 211801 can reinforce the proper technique for using the surgical device while the trial period is ongoing. In one aspect, the surgical hub 211801 may be configured to record usage of a new surgical product and compare and contrast the new product with a previous baseline product's usage. When two different products are utilized, the surgical hub 211801 may also provide a post-analysis view to highlight the similarities and differences noted between the tracked physical characteristics of the surgeon. In addition, the surgical hub 211801 may allow surgeons to compare populations of procedures between new and old surgical products. The recommendations provided by the surgical hub 211801 may include, for example, comparison videos showing the use of the new product.
In one aspect, the computer system (e.g., surgical hub 211801) may be configured to allow the surgeon to compare their technique directly to other surgeons, rather than to a baseline established by the sampling population or preprogrammed into the computer system.
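The empirically derived baselines discussed above, built from a sample population and its surgical outcomes, can be sketched as taking the characteristic range observed among the best-outcome procedures. The data shape and the 25% default are illustrative assumptions.

```python
# Sketch of deriving a baseline empirically: use the range of a physical
# characteristic observed among the top-scoring fraction of procedures.
# The sample format and top_fraction default are hypothetical.
def empirical_baseline(samples, top_fraction: float = 0.25):
    """samples: list of (characteristic_value, outcome_score) pairs.
    Returns the (low, high) range of the characteristic among the
    best-outcome fraction of the sample population."""
    ranked = sorted(samples, key=lambda s: s[1], reverse=True)  # best outcomes first
    k = max(1, int(len(ranked) * top_fraction))                 # size of the top group
    best = [value for value, _score in ranked[:k]]
    return (min(best), max(best))
```

The resulting range could be stored per surgical context and used as the baseline in the deviation checks performed by the process 211000.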
In one aspect, the computer system (e.g., the surgical hub 211801) may be configured to analyze trends in surgical device usage as surgeons become more experienced in performing particular surgical procedures (or surgical procedures generally) or in using new surgical instruments. For example, the computer system may identify movements, behaviors, and other physical characteristics that change significantly as a surgeon becomes more experienced. Thus, the computer system may recognize when a surgeon exhibits a suboptimal technique early in the surgeon's learning curve and may provide recommendations for the optimal approach before the suboptimal technique becomes ingrained.