Usage and technical analysis of surgeon/personnel performance relative to baseline to optimize device utilization and performance for both current and future procedures

Document No.: 1078414    Publication date: 2020-10-16

Reading note: This technology, "Usage and technical analysis of surgeon/personnel performance relative to baseline to optimize device utilization and performance for both current and future procedures," was created by F. E. Shelton IV, J. L. Harris, and T. W. Aronhalt on 2018-11-14. Abstract: Various systems and methods for evaluating surgical personnel are disclosed. A computer system, such as a surgical hub, may be configured to be communicably coupled to a surgical device and a camera. The computer system can be programmed to determine contextual information relating to the surgical procedure based at least in part on perioperative data received from the surgical device during the surgical procedure. Further, the computer system may visually determine a physical characteristic of the surgical personnel via the camera and compare the physical characteristic to a baseline to evaluate the surgical personnel.

1. A computer system configured to be communicably couplable to a surgical device and a camera, the computer system comprising:

a processor; and

a memory coupled to the processor, the memory storing instructions that, when executed by the processor, cause the computer system to:

receive perioperative data from the surgical device;

determine a surgical context based at least in part on the perioperative data;

receive, via the camera, an image of an individual;

determine a physical characteristic of the individual from the image;

retrieve a baseline physical characteristic corresponding to the surgical context; and

determine whether the physical characteristic of the individual deviates from the baseline physical characteristic.

2. The computer system of claim 1, wherein the physical characteristic comprises a posture of the individual.

3. The computer system of claim 2, wherein the posture of the individual corresponds to a deviation between at least one body part position and a reference position.

4. The computer system of claim 1, wherein the physical characteristic comprises a wrist orientation of the individual.

5. The computer system of claim 4, wherein the wrist orientation of the individual corresponds to an angle between the individual's wrist and a surgical instrument held by the individual.

6. The computer system of claim 1, wherein the baseline physical characteristic comprises a previously recorded instance of the physical characteristic of the individual.

7. The computer system of claim 1, wherein the memory further stores instructions that, when executed by the processor, cause the computer system to provide a notification based on whether the physical characteristic deviates from the baseline physical characteristic.

8. The computer system of claim 7, wherein the computer system provides the notification during a surgical procedure in which the perioperative data is received.

9. A computer-implemented method for tracking physical characteristics of an individual, the method comprising:

receiving, by a computer system, perioperative data from a surgical device;

determining, by the computer system, a surgical context based at least in part on the perioperative data;

receiving, by the computer system, an image of the individual via a camera communicatively coupled to the computer system;

determining, by the computer system, a physical characteristic of the individual from the image;

retrieving, by the computer system, a baseline physical characteristic corresponding to the surgical context; and

determining, by the computer system, whether the physical characteristic of the individual deviates from the baseline physical characteristic.

10. The computer-implemented method of claim 9, wherein the physical characteristic comprises a posture of the individual.

11. The computer-implemented method of claim 10, wherein the posture of the individual corresponds to a deviation between at least one body part position and a reference position.

12. The computer-implemented method of claim 9, wherein the physical characteristic comprises a wrist orientation of the individual.

13. The computer-implemented method of claim 12, wherein the wrist orientation of the individual corresponds to an angle between the individual's wrist and a surgical instrument held by the individual.

14. The computer-implemented method of claim 9, wherein the baseline physical characteristic comprises a previously recorded instance of the physical characteristic of the individual.

15. The computer-implemented method of claim 9, further comprising providing, by the computer system, a notification on a display according to whether the physical characteristic deviates from the baseline physical characteristic.

16. A computer system configured to be communicably couplable to a surgical device and a camera, the computer system comprising:

a processor; and

a memory coupled to the processor, the memory storing instructions that, when executed by the processor, cause the computer system to:

receive perioperative data from the surgical device;

determine a surgical context based at least in part on the perioperative data;

receive, via the camera, an image of an individual;

determine a physical characteristic of the individual from the image;

transmit data identifying the physical characteristic and the surgical context to a remote computer system,

wherein the remote computer system determines a baseline physical characteristic corresponding to the surgical context and the physical characteristic from data aggregated from a plurality of computer systems connected to the remote computer system; and

receive, from the remote computer system, an indication of whether the physical characteristic of the individual deviates from the baseline physical characteristic.

17. The computer system of claim 16, wherein the remote computer system comprises a cloud computing system.

18. The computer system of claim 16, wherein the physical characteristic comprises a posture of the individual.

19. The computer system of claim 18, wherein the posture of the individual corresponds to a deviation between at least one body part position and a reference position.

20. The computer system of claim 16, wherein the physical characteristic comprises a wrist orientation of the individual.

21. The computer system of claim 20, wherein the wrist orientation of the individual corresponds to an angle between the individual's wrist and a surgical instrument held by the individual.

Background

The present disclosure relates to various surgical systems. Surgical procedures are often performed in an operating theater or room of a medical facility, such as a hospital. A sterile field is typically created around the patient. The sterile field may include appropriately attired members of the surgical team, as well as all equipment and fixtures in the field. Various surgical devices and systems are utilized in performing surgical procedures.

Disclosure of Invention

In one general aspect, the present disclosure is directed to a computer system configured to be communicably coupleable to a surgical device and a camera. The computer system includes a processor and a memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the computer system to: receive perioperative data from the surgical device; determine a surgical context based at least in part on the perioperative data; receive, via the camera, an image of an individual; determine a physical characteristic of the individual from the image; retrieve a baseline physical characteristic corresponding to the surgical context; and determine whether the physical characteristic of the individual deviates from the baseline physical characteristic.

In another general aspect, a computer-implemented method for tracking physical characteristics of an individual is provided. The method comprises: receiving, by a computer system, perioperative data from a surgical device; determining, by the computer system, a surgical context based at least in part on the perioperative data; receiving, by the computer system, an image of the individual via a camera communicatively coupled to the computer system; determining, by the computer system, a physical characteristic of the individual from the image; retrieving, by the computer system, a baseline physical characteristic corresponding to the surgical context; and determining, by the computer system, whether the physical characteristic of the individual deviates from the baseline physical characteristic.

In yet another general aspect, the present disclosure is directed to a computer system configured to be communicably coupleable to a surgical device and a camera. The computer system includes a processor and a memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the computer system to: receive perioperative data from the surgical device; determine a surgical context based at least in part on the perioperative data; receive, via the camera, an image of an individual; determine a physical characteristic of the individual from the image; transmit data identifying the physical characteristic and the surgical context to a remote computer system, wherein the remote computer system determines a baseline physical characteristic corresponding to the surgical context and the physical characteristic from data aggregated from a plurality of computer systems connected to the remote computer system; and receive, from the remote computer system, an indication of whether the physical characteristic of the individual deviates from the baseline physical characteristic.
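The local evaluation loop described in the first general aspect can be sketched as follows. This is a minimal illustration under assumed inputs, not the claimed implementation: all names, events, and thresholds (`PerioperativeData`, `infer_context`, the posture scores in `BASELINES`) are hypothetical.

```python
# Minimal sketch of the local evaluation loop described above.
# All names, events, and thresholds are hypothetical illustrations.

from dataclasses import dataclass


@dataclass
class PerioperativeData:
    device_id: str
    event: str  # e.g. "stapler_fired", "insufflation_started"


def infer_context(data: PerioperativeData) -> str:
    """Map a device event to a coarse surgical context (illustrative)."""
    context_map = {
        "insufflation_started": "laparoscopic_access",
        "stapler_fired": "tissue_transection",
    }
    return context_map.get(data.event, "unknown")


# Hypothetical baselines: expected posture score per surgical context.
BASELINES = {"laparoscopic_access": 0.90, "tissue_transection": 0.85}


def evaluate(data: PerioperativeData, measured_characteristic: float,
             tolerance: float = 0.10) -> bool:
    """Return True if the measured characteristic deviates from baseline."""
    context = infer_context(data)
    baseline = BASELINES.get(context)
    if baseline is None:
        return False  # no baseline for this context; nothing to flag
    return abs(measured_characteristic - baseline) > tolerance


# Example: a posture score of 0.70 measured during tissue transection
deviates = evaluate(PerioperativeData("stapler-01", "stapler_fired"), 0.70)
print(deviates)  # True: 0.70 is more than 0.10 below the 0.85 baseline
```

In the cloud-aggregated variant of the third general aspect, `BASELINES` would instead be computed remotely from data pooled across many connected systems, with only the deviation result returned to the hub.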

Drawings

The various aspects described herein, both as to organization and methods of operation, together with further objects and advantages thereof, may best be understood by reference to the following description, taken in conjunction with the accompanying drawings.

Fig. 1 is a block diagram of a computer-implemented interactive surgical system in accordance with at least one aspect of the present disclosure.

Fig. 2 is a surgical system for performing a surgical procedure in an operating room according to at least one aspect of the present disclosure.

Fig. 3 is a surgical hub paired with a visualization system, a robotic system, and a smart instrument according to at least one aspect of the present disclosure.

Fig. 4 is a partial perspective view of a surgical hub housing and a composite generator module slidably received in a drawer of the surgical hub housing according to at least one aspect of the present disclosure.

Fig. 5 is a perspective view of a combined generator module having bipolar, ultrasonic and monopolar contacts and a smoke evacuation component according to at least one aspect of the present disclosure.

Fig. 6 illustrates an individual power bus attachment for a plurality of lateral docking ports of a lateral modular housing configured to be capable of receiving a plurality of modules in accordance with at least one aspect of the present disclosure.

Fig. 7 illustrates a vertical modular housing configured to be capable of receiving a plurality of modules in accordance with at least one aspect of the present disclosure.

Fig. 8 illustrates a surgical data network including a modular communication hub configured to connect modular devices located in one or more operating rooms of a medical facility or any room in the medical facility dedicated to surgical procedures to a cloud in accordance with at least one aspect of the present disclosure.

Fig. 9 is a computer-implemented interactive surgical system in accordance with at least one aspect of the present disclosure.

Fig. 10 illustrates a surgical hub including a plurality of modules coupled to a modular control tower according to at least one aspect of the present disclosure.

Fig. 11 illustrates one aspect of a Universal Serial Bus (USB) hub device in accordance with at least one aspect of the present disclosure.

Fig. 12 is a block diagram of a cloud computing system including a plurality of smart surgical instruments coupled to a surgical hub connectable to cloud components of the cloud computing system in accordance with at least one aspect of the present disclosure.

Fig. 13 is a functional module architecture of a cloud computing system according to at least one aspect of the present disclosure.

Fig. 14 illustrates a diagram of a situation-aware surgical system in accordance with at least one aspect of the present disclosure.

Fig. 15 is a timeline depicting situational awareness of a surgical hub, according to at least one aspect of the present disclosure.

Fig. 16 is a diagram of an exemplary Operating Room (OR) setup, according to at least one aspect of the present disclosure.

Fig. 17 is a logic flow diagram of a process for visually evaluating surgical personnel according to at least one aspect of the present disclosure.

Fig. 18 is a diagram illustrating a series of models of a surgical staff member during the course of a surgical procedure in accordance with at least one aspect of the present disclosure.

Fig. 19 is a diagram depicting measured poses of the surgical personnel shown in fig. 18 over time, in accordance with at least one aspect of the present disclosure.

Fig. 20 is a depiction of a surgeon holding a surgical instrument according to at least one aspect of the present disclosure.

Fig. 21 is a scatter plot of wrist angle versus surgical outcome in accordance with at least one aspect of the present disclosure.
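Figs. 20 and 21 concern the angle between an individual's wrist and a held surgical instrument, as recited in claims 5, 13, and 21. One plausible way to derive such an angle from tracked 3-D keypoints is sketched below; the coordinates and function names are hypothetical, since the disclosure does not prescribe a particular computation.

```python
# Illustrative computation of the wrist-to-instrument angle referenced in
# Figs. 20-21. Keypoint coordinates are hypothetical; a real system would
# obtain them from camera-based pose tracking.
import math


def angle_deg(u, v):
    """Angle in degrees between 3-D vectors u and v."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(dot / (nu * nv)))


def wrist_instrument_angle(elbow, wrist, instrument_tip):
    """Angle between the forearm (elbow -> wrist) and the instrument shaft
    (wrist -> tip), one possible proxy for the wrist orientation above."""
    forearm = tuple(w - e for w, e in zip(wrist, elbow))
    shaft = tuple(t - w for t, w in zip(instrument_tip, wrist))
    return angle_deg(forearm, shaft)


# Example with made-up coordinates (meters):
theta = wrist_instrument_angle((0.0, 0.0, 0.0), (0.3, 0.0, 0.0), (0.5, 0.2, 0.0))
print(round(theta, 1))  # 45.0: the shaft bends 45 degrees off the forearm line
```

A measured angle could then be compared against a baseline per surgical context, as in the claimed deviation check, or plotted against outcomes as suggested by the scatter plot of Fig. 21.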

Detailed Description

The applicant of the present patent application owns the following U.S. patent applications, filed November 6, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:

U.S. patent application Ser. No. 16/182,224 entitled "SURGICAL NETWORK, INSTRUMENT, AND CLOUD RESPONSES BASED ON VALIDATION OF RECEIVED DATASET AND AUTHENTICATION OF ITS SOURCE AND INTEGRITY";

U.S. patent application Ser. No. 16/182,230 entitled "SURGICAL SYSTEM FOR PRESENTING INFORMATION INTERPRETED FROM EXTERNAL DATA";

U.S. patent application Ser. No. 16/182,233 entitled "MODIFICATION OF SURGICAL SYSTEMS CONTROL PROGRAMS BASED ON MACHINE LEARNING";

U.S. patent application Ser. No. 16/182,239 entitled "ADJUSTMENT OF DEVICE CONTROL PROGRAMS BASED ON STRATIFIED CONTEXTUAL DATA IN ADDITION TO THE DATA";

U.S. patent application Ser. No. 16/182,243 entitled "SURGICAL HUB AND MODULAR DEVICE RESPONSE ADJUSTMENT BASED ON SITUATIONAL AWARENESS";

U.S. patent application Ser. No. 16/182,248 entitled "DETECTION AND ESCALATION OF SECURITY RESPONSES OF SURGICAL INSTRUMENTS TO INCREASING SEVERITY THREATS";

U.S. patent application Ser. No. 16/182,251 entitled "INTERACTIVE SURGICAL SYSTEM";

U.S. patent application Ser. No. 16/182,260 entitled "AUTOMATED DATA SCALING, ALIGNMENT, AND ORGANIZING BASED ON PREDEFINED PARAMETERS WITHIN SURGICAL NETWORKS";

U.S. patent application Ser. No. 16/182,267 entitled "SENSING THE PATIENT POSITION AND CONTACT UTILIZING THE MONO-POLAR RETURN PAD ELECTRODE TO PROVIDE SITUATIONAL AWARENESS TO A SURGICAL NETWORK";

U.S. patent application Ser. No. 16/182,249 entitled "POWERED SURGICAL TOOL WITH PREDEFINED ADJUSTABLE CONTROL ALGORITHM FOR CONTROLLING END EFFECTOR PARAMETER";

U.S. patent application Ser. No. 16/182,246 entitled "ADJUSTMENTS BASED ON AIRBORNE PARTICLE PROPERTIES";

U.S. patent application Ser. No. 16/182,256 entitled "ADJUSTMENT OF A SURGICAL DEVICE FUNCTION BASED ON SITUATIONAL AWARENESS";

U.S. patent application Ser. No. 16/182,242 entitled "REAL-TIME ANALYSIS OF COMPREHENSIVE COST OF ALL INSTRUMENTATION USED IN SURGERY UTILIZING DATA FLUIDITY TO TRACK INSTRUMENTS THROUGH STOCKING AND IN-HOUSE PROCESSES";

U.S. patent application Ser. No. 16/182,269 entitled "IMAGE CAPTURING OF THE AREAS OUTSIDE THE ABDOMEN TO IMPROVE PLACEMENT AND CONTROL OF A SURGICAL DEVICE IN USE";

U.S. patent application Ser. No. 16/182,278 entitled "COMMUNICATION OF DATA WHERE A SURGICAL NETWORK IS USING CONTEXT OF THE DATA AND REQUIREMENTS OF A RECEIVING SYSTEM/USER TO INFLUENCE INCLUSION OR LINKAGE OF DATA AND METADATA TO ESTABLISH CONTINUITY";

U.S. patent application Ser. No. 16/182,290 entitled "SURGICAL NETWORK RECOMMENDATIONS FROM REAL TIME ANALYSIS OF PROCEDURE VARIABLES AGAINST A BASELINE HIGHLIGHTING DIFFERENCES FROM THE OPTIMAL SOLUTION";

U.S. patent application Ser. No. 16/182,232 entitled "CONTROL OF A SURGICAL SYSTEM THROUGH A SURGICAL BARRIER";

U.S. patent application Ser. No. 16/182,227 entitled "SURGICAL NETWORK DETERMINATION OF PRIORITIZATION OF COMMUNICATION, INTERACTION, OR PROCESSING BASED ON SYSTEM OR DEVICE NEEDS";

U.S. patent application Ser. No. 16/182,231 entitled "WIRELESS PAIRING OF A SURGICAL DEVICE WITH ANOTHER DEVICE WITHIN A STERILE SURGICAL FIELD BASED ON THE USAGE AND SITUATIONAL AWARENESS OF DEVICES";

U.S. patent application Ser. No. 16/182,229 entitled "ADJUSTMENT OF STAPLE HEIGHT OF AT LEAST ONE ROW OF STAPLES BASED ON THE SENSED TISSUE THICKNESS OR FORCE IN CLOSING";

U.S. patent application Ser. No. 16/182,234 entitled "STAPLING DEVICE WITH BOTH COMPULSORY AND DISCRETIONARY LOCKOUTS BASED ON SENSED PARAMETERS";

U.S. patent application Ser. No. 16/182,240 entitled "POWERED STAPLING DEVICE CONFIGURED TO ADJUST FORCE, ADVANCEMENT SPEED, AND OVERALL STROKE OF CUTTING MEMBER BASED ON SENSED PARAMETER OF FIRING OR CLAMPING";

U.S. patent application Ser. No. 16/182,235 entitled "VARIATION OF RADIO FREQUENCY AND ULTRASONIC POWER LEVEL IN COOPERATION WITH VARYING CLAMP ARM PRESSURE TO ACHIEVE PREDEFINED HEAT FLUX OR POWER APPLIED TO TISSUE"; and

U.S. patent application Ser. No. 16/182,238 entitled "ULTRASONIC ENERGY DEVICE WHICH VARIES PRESSURE APPLIED BY CLAMP ARM TO PROVIDE THRESHOLD CONTROL AT A CUT PROGRESSION LOCATION".

The applicant of the present patent application owns the following U.S. provisional patent applications, filed September 10, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:

U.S. provisional patent application No. 62/729,183 entitled "A CONTROL FOR A SURGICAL NETWORK OR SURGICAL NETWORK CONNECTED DEVICE THAT ADJUSTS ITS FUNCTION BASED ON A SENSED SITUATION OR USAGE";

U.S. provisional patent application No. 62/729,177 entitled "AUTOMATED DATA SCALING, ALIGNMENT, AND ORGANIZING BASED ON PREDEFINED PARAMETERS WITHIN A SURGICAL NETWORK BEFORE TRANSMISSION";

U.S. provisional patent application No. 62/729,176 entitled "INDIRECT COMMAND CONTROL OF A FIRST OPERATING ROOM SYSTEM THROUGH THE USE OF A SECOND OPERATING ROOM SYSTEM WITHIN A STERILE FIELD WHERE THE SECOND OPERATING ROOM SYSTEM HAS PRIMARY AND SECONDARY OPERATING MODES";

U.S. provisional patent application No. 62/729,185 entitled "POWERED STAPLING DEVICE THAT IS CAPABLE OF ADJUSTING FORCE, ADVANCEMENT SPEED, AND OVERALL STROKE OF CUTTING MEMBER OF THE DEVICE BASED ON SENSED PARAMETER OF FIRING OR CLAMPING";

U.S. provisional patent application No. 62/729,184 entitled "POWERED SURGICAL TOOL WITH A PREDEFINED ADJUSTABLE CONTROL ALGORITHM FOR CONTROLLING AT LEAST ONE END EFFECTOR PARAMETER AND A MEANS FOR LIMITING THE ADJUSTMENT";

U.S. provisional patent application No. 62/729,182 entitled "SENSING THE PATIENT POSITION AND CONTACT UTILIZING THE MONOPOLAR RETURN PAD ELECTRODE TO PROVIDE SITUATIONAL AWARENESS TO THE HUB";

U.S. provisional patent application No. 62/729,191 entitled "SURGICAL NETWORK RECOMMENDATIONS FROM REAL TIME ANALYSIS OF PROCEDURE VARIABLES AGAINST A BASELINE HIGHLIGHTING DIFFERENCES FROM THE OPTIMAL SOLUTION";

U.S. provisional patent application No. 62/729,195 entitled "ULTRASONIC ENERGY DEVICE WHICH VARIES PRESSURE APPLIED BY CLAMP ARM TO PROVIDE THRESHOLD CONTROL AT A CUT PROGRESSION LOCATION"; and

U.S. provisional patent application No. 62/729,186 entitled "WIRELESS PAIRING OF A SURGICAL DEVICE WITH ANOTHER DEVICE WITHIN A STERILE SURGICAL FIELD BASED ON THE USAGE AND SITUATIONAL AWARENESS OF DEVICES".

The applicant of the present patent application owns the following U.S. patent applications, filed August 28, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:

U.S. patent application Ser. No. 16/115,214 entitled "ESTIMATING STATE OF ULTRASONIC END EFFECTOR AND CONTROL SYSTEM THEREFOR";

U.S. patent application Ser. No. 16/115,205 entitled "TEMPERATURE CONTROL OF ULTRASONIC END EFFECTOR AND CONTROL SYSTEM THEREFOR";

U.S. patent application Ser. No. 16/115,233 entitled "RADIO FREQUENCY ENERGY DEVICE FOR DELIVERING COMBINED ELECTRICAL SIGNALS";

U.S. patent application Ser. No. 16/115,208 entitled "CONTROLLING AN ULTRASONIC SURGICAL INSTRUMENT ACCORDING TO TISSUE LOCATION";

U.S. patent application Ser. No. 16/115,220 entitled "CONTROLLING ACTIVATION OF AN ULTRASONIC SURGICAL INSTRUMENT ACCORDING TO THE PRESENCE OF TISSUE";

U.S. patent application Ser. No. 16/115,232 entitled "DETERMINING TISSUE COMPOSITION VIA AN ULTRASONIC SYSTEM";

U.S. patent application Ser. No. 16/115,239 entitled "DETERMINING THE STATE OF AN ULTRASONIC ELECTROMECHANICAL SYSTEM ACCORDING TO FREQUENCY SHIFT";

U.S. patent application Ser. No. 16/115,247 entitled "DETERMINING THE STATE OF AN ULTRASONIC END EFFECTOR";

U.S. patent application Ser. No. 16/115,211 entitled "SITUATIONAL AWARENESS OF ELECTROSURGICAL SYSTEMS";

U.S. patent application Ser. No. 16/115,226 entitled "MECHANISMS FOR CONTROLLING DIFFERENT ELECTROMECHANICAL SYSTEMS OF AN ELECTROSURGICAL INSTRUMENT";

U.S. patent application Ser. No. 16/115,240 entitled "DETECTION OF END EFFECTOR EMERSION IN LIQUID";

U.S. patent application Ser. No. 16/115,249 entitled "INTERRUPTION OF ENERGY DUE TO INADVERTENT CAPACITIVE COUPLING";

U.S. patent application Ser. No. 16/115,256 entitled "INCREASING RADIO FREQUENCY TO CREATE PAD-LESS MONOPOLAR LOOP";

U.S. patent application Ser. No. 16/115,223 entitled "BIPOLAR COMBINATION DEVICE THAT AUTOMATICALLY ADJUSTS PRESSURE BASED ON ENERGY MODALITY"; and

U.S. patent application Ser. No. 16/115,238 entitled "ACTIVATION OF ENERGY DEVICES".

The applicant of the present patent application owns the following U.S. provisional patent applications, filed August 23, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:

U.S. provisional patent application No. 62/721,995 entitled "CONTROLLING AN ULTRASONIC SURGICAL INSTRUMENT ACCORDING TO TISSUE LOCATION";

U.S. provisional patent application No. 62/721,998 entitled "SITUATIONAL AWARENESS OF ELECTROSURGICAL SYSTEMS";

U.S. provisional patent application No. 62/721,999 entitled "INTERRUPTION OF ENERGY DUE TO INADVERTENT CAPACITIVE COUPLING";

U.S. provisional patent application No. 62/721,994 entitled "BIPOLAR COMBINATION DEVICE THAT AUTOMATICALLY ADJUSTS PRESSURE BASED ON ENERGY MODALITY"; and

U.S. provisional patent application No. 62/721,996 entitled "RADIO FREQUENCY ENERGY DEVICE FOR DELIVERING COMBINED ELECTRICAL SIGNALS".

The applicant of the present patent application owns the following U.S. provisional patent applications, filed June 30, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:

U.S. provisional patent application No. 62/692,747 entitled "SMART ACTIVATION OF AN ENERGY DEVICE BY ANOTHER DEVICE";

U.S. provisional patent application No. 62/692,748, entitled "SMART ENERGY ARCHITECTURE"; and

U.S. provisional patent application No. 62/692,768 entitled "SMART ENERGY DEVICES".

The applicant of the present patent application owns the following U.S. patent applications, filed June 29, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:

U.S. patent application Ser. No. 16/024,090 entitled "CAPACITIVE COUPLED RETURN PATH PAD WITH SEPARABLE ARRAY ELEMENTS";

U.S. patent application Ser. No. 16/024,057 entitled "CONTROLLING A SURGICAL INSTRUMENT ACCORDING TO SENSED CLOSURE PARAMETERS";

U.S. patent application Ser. No. 16/024,067 entitled "SYSTEMS FOR ADJUSTING END EFFECTOR PARAMETERS BASED ON PERIOPERATIVE INFORMATION";

U.S. patent application Ser. No. 16/024,075 entitled "SAFETY SYSTEMS FOR SMART POWERED SURGICAL STAPLING";

U.S. patent application Ser. No. 16/024,083 entitled "SAFETY SYSTEMS FOR SMART POWERED SURGICAL STAPLING";

U.S. patent application Ser. No. 16/024,094 entitled "SURGICAL SYSTEMS FOR DETECTING END EFFECTOR TISSUE DISTRIBUTION IRREGULARITIES";

U.S. patent application Ser. No. 16/024,138 entitled "SYSTEMS FOR DETECTING PROXIMITY OF SURGICAL END EFFECTOR TO CANCEROUS TISSUE";

U.S. patent application Ser. No. 16/024,150 entitled "SURGICAL INSTRUMENT CARTRIDGE SENSOR ASSEMBLIES";

U.S. patent application Ser. No. 16/024,160 entitled "VARIABLE OUTPUT CARTRIDGE SENSOR ASSEMBLY";

U.S. patent application Ser. No. 16/024,124 entitled "SURGICAL INSTRUMENT HAVING A FLEXIBLE ELECTRODE";

U.S. patent application Ser. No. 16/024,132 entitled "SURGICAL INSTRUMENT HAVING A FLEXIBLE CIRCUIT";

U.S. patent application Ser. No. 16/024,141 entitled "SURGICAL INSTRUMENT WITH A TISSUE MARKING ASSEMBLY";

U.S. patent application Ser. No. 16/024,162 entitled "SURGICAL SYSTEMS WITH PRIORITIZED DATA TRANSMISSION CAPABILITIES";

U.S. patent application Ser. No. 16/024,066 entitled "SURGICAL EVACUATION SENSING AND MOTOR CONTROL";

U.S. patent application Ser. No. 16/024,096 entitled "SURGICAL EVACUATION SENSOR ARRANGEMENTS";

U.S. patent application Ser. No. 16/024,116 entitled "SURGICAL EVACUATION FLOW PATHS";

U.S. patent application Ser. No. 16/024,149 entitled "SURGICAL EVACUATION SENSING AND GENERATOR CONTROL";

U.S. patent application Ser. No. 16/024,180 entitled "SURGICAL EVACUATION SENSING AND DISPLAY";

U.S. patent application Ser. No. 16/024,245 entitled "COMMUNICATION OF SMOKE EVACUATION SYSTEM PARAMETERS TO HUB OR CLOUD IN SMOKE EVACUATION MODULE FOR INTERACTIVE SURGICAL PLATFORM";

U.S. patent application Ser. No. 16/024,258 entitled "SMOKE EVACUATION SYSTEM INCLUDING A SEGMENTED CONTROL CIRCUIT FOR INTERACTIVE SURGICAL PLATFORM";

U.S. patent application Ser. No. 16/024,265 entitled "SURGICAL EVACUATION SYSTEM WITH A COMMUNICATION CIRCUIT FOR COMMUNICATION BETWEEN A FILTER AND A SMOKE EVACUATION DEVICE"; and

U.S. patent application Ser. No. 16/024,273 entitled "DUAL IN-SERIES LARGE AND SMALL DROPLET FILTERS".

The applicant of the present patent application owns the following U.S. provisional patent applications, filed June 28, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:

U.S. provisional patent application Ser. No. 62/691,228 entitled "A METHOD OF USING REINFORCED FLEX CIRCUITS WITH MULTIPLE SENSORS WITH ELECTROSURGICAL DEVICES";

U.S. provisional patent application Ser. No. 62/691,227 entitled "CONTROLLING A SURGICAL INSTRUMENT ACCORDING TO SENSED CLOSURE PARAMETERS";

U.S. provisional patent application Ser. No. 62/691,230 entitled "SURGICAL INSTRUMENT HAVING A FLEXIBLE ELECTRODE";

U.S. provisional patent application Ser. No. 62/691,219 entitled "SURGICAL EVACUATION SENSING AND MOTOR CONTROL";

U.S. provisional patent application Ser. No. 62/691,257 entitled "COMMUNICATION OF SMOKE EVACUATION SYSTEM PARAMETERS TO HUB OR CLOUD IN SMOKE EVACUATION MODULE FOR INTERACTIVE SURGICAL PLATFORM";

U.S. provisional patent application Ser. No. 62/691,262 entitled "SURGICAL EVACUATION SYSTEM WITH A COMMUNICATION CIRCUIT FOR COMMUNICATION BETWEEN A FILTER AND A SMOKE EVACUATION DEVICE"; and

U.S. provisional patent application Ser. No. 62/691,251 entitled "DUAL IN-SERIES LARGE AND SMALL DROPLET FILTERS".

The applicant of the present patent application owns the following U.S. provisional patent application, filed April 19, 2018, the disclosure of which is incorporated herein by reference in its entirety:

U.S. provisional patent application Ser. No. 62/659,900 entitled "METHOD OF HUB COMMUNICATION".

The applicant of the present patent application owns the following U.S. provisional patent applications, filed March 30, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:

U.S. provisional patent application No. 62/650,898 entitled "CAPACITIVE COUPLED RETURN PATH PAD WITH SEPARABLE ARRAY ELEMENTS";

U.S. provisional patent application Ser. No. 62/650,887 entitled "SURGICAL SYSTEMS WITH OPTIMIZED SENSING CAPABILITIES";

U.S. provisional patent application Ser. No. 62/650,882 entitled "SMOKE EVACUATION MODULE FOR INTERACTIVE SURGICAL PLATFORM"; and

U.S. provisional patent application Ser. No. 62/650,877 entitled "SURGICAL SMOKE EVACUATION SENSING AND CONTROL".

The applicant of the present patent application owns the following U.S. patent applications filed on 29/3/2018, the disclosures of each of which are incorporated herein by reference in their entirety:

U.S. patent application Ser. No. 15/940,641, entitled "INTERACTIVE SURGICAL SYSTEMS WITH ENCRYPTED COMMUNICATIONCAPABILITIES";

U.S. patent application Ser. No. 15/940,648, entitled "INTERACTIVE SURGICAL SYSTEMS WITH Conditioning HANDLING OFDEVICES AND DATA CAPABILITIES";

U.S. patent application Ser. No. 15/940,656 entitled "SURGICAL HUB COORDINATION OF CONTROL AND COMMUNICATION OF OPERATION DEVICES";

U.S. patent application Ser. No. 15/940,666 entitled "SPATIAL AWARENESS OF SURGICAL HUBS IN OPERATING ROOMS";

U.S. patent application Ser. No. 15/940,670 entitled "Cooling timing OF DATA DERIVED FROM SECONDARYSOURCES BY INTELLIGENT SURGICAL HUBS";

U.S. patent application Ser. No. 15/940,677 entitled "SURGICAL HUB CONTROL ARRANGEMENTS";

U.S. patent application Ser. No. 15/940,632, entitled "DATA STRIPPING METHOD TO INTERROGATE PATIENT RECORDS AND DCREATE ANONYMIZED RECORD";

U.S. patent application Ser. No. 15/940,640 entitled "COMMUNICATION HUB AND STORAGE DEVICE FOR STORING PARAMETERSAND STATUS OF A SURGICAL DEVICE TO BE SHARED WITH CLOUD BASED ANALYTICSSYSTEMS";

U.S. patent application Ser. No. 15/940,645, entitled "SELF DESCRIPTING DATA PACKETS GENERATED AT AN ISSUING GINSTRUMENT";

U.S. patent application Ser. No. 15/940,649 entitled "DATA PAIRING TO INTERCONNECT A DEVICE MEASURED PARAMETERWITH AN OUTCOME";

U.S. patent application Ser. No. 15/940,654 entitled "SURGICAL HUB SITUATIONAL AWARENESS";

U.S. patent application Ser. No. 15/940,663 entitled "SURGICAL SYSTEM DISTRIBUTED PROCESSING";

U.S. patent application Ser. No. 15/940,668 entitled "AGGREGAGATION AND REPORTING OF SURGICAL HUB DATA";

U.S. patent application Ser. No. 15/940,671, entitled "SURGICAL HUB SPATIAL AWARENESS TO DETERMINE DEVICES INOPERATING THEEATER";

U.S. patent application Ser. No. 15/940,686 entitled "DISPLAY OF ALIGNMENT OF STAPLE CARTRIDGE TO PRIOR LINEARSTAPLE LINE";

U.S. patent application Ser. No. 15/940,700, entitled "STERILE FIELD INTERACTIVE CONTROL DISPLAYS";

U.S. patent application Ser. No. 15/940,629, entitled "COMPUTER IMPLEMENTED INTERACTIVE SURGICAL SYSTEMS";

U.S. patent application Ser. No. 15/940,704, entitled "USE OF LASER LIGHT AND RED-GREEN-BLUE COLORATION DETERMINE PROPERTIES OF BACK SCATTERED LIGHT";

U.S. patent application Ser. No. 15/940,722 entitled "CHARACTERIZATION OF TISSUE IRREGULARITIES THROUGH THE USE OFMONO-CHROMATIC LIGHT REFRACTIVITY";

U.S. patent application Ser. No. 15/940,742 entitled "DUAL CMOS ARRAY IMAGING";

U.S. patent application Ser. No. 15/940,636 entitled "ADAPTIVE CONTROL PROGRAM UPDATES FOR basic DEVICES";

U.S. patent application Ser. No. 15/940,653, entitled "ADAPTIVE CONTROL PROGRAM UPDATES FOR SURGICAL HUBS";

U.S. patent application Ser. No. 15/940,660 entitled "CLOOUD-BASED MEDICAL ANALYTICS FOR CUTOSTOMIZATION AND DRECOMMERATIONS TO A USER";

U.S. patent application Ser. No. 15/940,679 entitled "CLOOUD-BASED MEDICAL ANALYTICS FOR LINKING OF LOCAL USAGETRENDS WITH THE RESOURCE ACQUISITION BEHAVORS OF LARGER DATA SET";

U.S. patent application Ser. No. 15/940,694 entitled "CLOOUD-BASED MEDICAL ANALYTICS FOR MEDICAL FACILITY SEGMENTED DIVIDIONUALIZATION OF INSTRUMENTS FUNCTION";

U.S. patent application Ser. No. 15/940,634, entitled "CLOUD-BASED MEDICAL ANALYTICS FOR SECURITY AND AUTHENTICATION TRENDS AND REACTIVE MEASURES";

U.S. patent application Ser. No. 15/940,706 entitled "DATA HANDLING AND PRIORITIZATION IN A CLOUD ANALYTICS NETWORK";

U.S. patent application Ser. No. 15/940,675 entitled "CLOUD INTERFACE FOR COUPLED SURGICAL DEVICES";

U.S. patent application Ser. No. 15/940,627 entitled "DRIVE ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS";

U.S. patent application Ser. No. 15/940,637 entitled "COMMUNICATION ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS";

U.S. patent application Ser. No. 15/940,642 entitled "CONTROLS FOR ROBOT-ASSISTED SURGICAL PLATFORMS";

U.S. patent application Ser. No. 15/940,676 entitled "AUTOMATIC TOOL ADJUSTMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS";

U.S. patent application Ser. No. 15/940,680 entitled "CONTROLLERS FOR ROBOT-ASSISTED SURGICAL PLATFORMS";

U.S. patent application Ser. No. 15/940,683 entitled "COOPERATIVE SURGICAL ACTIONS FOR ROBOT-ASSISTED SURGICAL PLATFORMS";

U.S. patent application Ser. No. 15/940,690 entitled "DISPLAY ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS"; and

U.S. patent application Ser. No. 15/940,711 entitled "SENSING ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS".

The applicant of the present patent application owns the following U.S. provisional patent applications, filed March 28, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:

U.S. provisional patent application Ser. No. 62/649,302 entitled "INTERACTIVE SURGICAL SYSTEMS WITH ENCRYPTED COMMUNICATION CAPABILITIES";

U.S. provisional patent application Ser. No. 62/649,294 entitled "DATA STRIPPING METHOD TO INTERROGATE PATIENT RECORDS AND CREATE ANONYMIZED RECORD";

U.S. provisional patent application Ser. No. 62/649,300 entitled "SURGICAL HUB SITUATIONAL AWARENESS";

U.S. provisional patent application Ser. No. 62/649,309 entitled "SURGICAL HUB SPATIAL AWARENESS TO DETERMINE DEVICES IN OPERATING THEATER";

U.S. provisional patent application Ser. No. 62/649,310, entitled "COMPUTER IMPLEMENTED INTERACTIVE SURGICAL SYSTEMS";

U.S. provisional patent application Ser. No. 62/649,291 entitled "USE OF LASER LIGHT AND RED-GREEN-BLUE COLORATION TO DETERMINE PROPERTIES OF BACK SCATTERED LIGHT";

U.S. provisional patent application Ser. No. 62/649,296, entitled "ADAPTIVE CONTROL PROGRAM UPDATES FOR SURGICAL DEVICES";

U.S. provisional patent application Ser. No. 62/649,333 entitled "CLOUD-BASED MEDICAL ANALYTICS FOR CUSTOMIZATION AND RECOMMENDATIONS TO A USER";

U.S. provisional patent application Ser. No. 62/649,327 entitled "CLOUD-BASED MEDICAL ANALYTICS FOR SECURITY AND AUTHENTICATION TRENDS AND REACTIVE MEASURES";

U.S. provisional patent application Ser. No. 62/649,315 entitled "DATA HANDLING AND PRIORITIZATION IN A CLOUD ANALYTICS NETWORK";

U.S. provisional patent application Ser. No. 62/649,313 entitled "CLOUD INTERFACE FOR COUPLED SURGICAL DEVICES";

U.S. provisional patent application Ser. No. 62/649,320 entitled "DRIVE ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS";

U.S. provisional patent application Ser. No. 62/649,307 entitled "AUTOMATIC TOOL ADJUSTMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS"; and

U.S. provisional patent application Ser. No. 62/649,323, entitled "SENSING ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS".

The applicant of the present patent application owns the following U.S. provisional patent applications, filed March 8, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:

U.S. provisional patent application Ser. No. 62/640,417 entitled "TEMPERATURE CONTROL IN ULTRASONIC DEVICE AND CONTROL SYSTEM THEREFOR"; and

U.S. provisional patent application Ser. No. 62/640,415 entitled "ESTIMATING STATE OF ULTRASONIC END EFFECTOR AND CONTROL SYSTEMS THEREFOR".

The applicant of the present patent application owns the following U.S. provisional patent applications, filed December 28, 2017, the disclosure of each of which is incorporated herein by reference in its entirety:

U.S. provisional patent application Ser. No. 62/611,341 entitled "INTERACTIVE SURGICAL PLATFORM";

U.S. provisional patent application Ser. No. 62/611,340, entitled "CLOUD-BASED MEDICAL ANALYTICS"; and

U.S. provisional patent application Ser. No. 62/611,339, entitled "ROBOT ASSISTED SURGICAL PLATFORM".

Before explaining various aspects of the surgical device and generator in detail, it should be noted that the illustrative examples are not limited in application or use to the details of construction and arrangement of parts illustrated in the accompanying drawings and description. The illustrative examples may be implemented alone or in combination with other aspects, variations, and modifications, and may be practiced or carried out in various ways. Furthermore, unless otherwise indicated, the terms and expressions employed herein have been chosen for the purpose of describing the illustrative examples for the convenience of the reader and not for the purpose of limiting the invention. Moreover, it is to be understood that one or more of the following-described aspects and/or examples may be combined with any one or more of the other following-described aspects and/or examples.

Surgical hub

Referring to fig. 1, a computer-implemented interactive surgical system 100 includes one or more surgical systems 102 and a cloud-based system (e.g., cloud 104, which may include a remote server 113 coupled to a storage device 105). Each surgical system 102 includes at least one surgical hub 106 in communication with the cloud 104. In one example, as shown in fig. 1, the surgical system 102 includes a visualization system 108, a robotic system 110, and a handheld intelligent surgical instrument 112 configured to be able to communicate with one another and/or with the hub 106. In some aspects, a surgical system 102 may include M number of hubs 106, N number of visualization systems 108, O number of robotic systems 110, and P number of handheld intelligent surgical instruments 112, where M, N, O, and P are integers greater than or equal to one.

Fig. 2 shows an example of a surgical system 102 for performing a surgical procedure on a patient lying on an operating table 114 in a surgical room 116. The robotic system 110 is used as part of the surgical system 102 during surgery. The robotic system 110 includes a surgeon's console 118, a patient side cart 120 (surgical robot), and a surgical robot hub 122. The patient side cart 120 can manipulate at least one removably coupled surgical tool 117 through a minimally invasive incision in the patient's body while the surgeon views the surgical site through the surgeon's console 118. An image of the surgical site may be obtained by the medical imaging device 124, which may be manipulated by the patient side cart 120 to orient the imaging device 124. The robot hub 122 may be used to process images of the surgical site for subsequent display to the surgeon via the surgeon's console 118.

Other types of robotic systems may be readily adapted for use with the surgical system 102. Various examples of robotic systems and surgical tools suitable for use in the present disclosure are described in U.S. provisional patent application Ser. No. 62/611,339, entitled "ROBOT ASSISTED SURGICAL PLATFORM", filed December 28, 2017, the disclosure of which is incorporated herein by reference in its entirety.

Various examples of cloud-based analytics performed by the cloud 104 and suitable for use with the present disclosure are described in U.S. provisional patent application Ser. No. 62/611,340, entitled "CLOUD-BASED MEDICAL ANALYTICS", filed December 28, 2017, the disclosure of which is incorporated herein by reference in its entirety.

In various aspects, the imaging device 124 includes at least one image sensor and one or more optical components. Suitable image sensors include, but are not limited to, Charge Coupled Device (CCD) sensors and Complementary Metal Oxide Semiconductor (CMOS) sensors.

The optical components of the imaging device 124 may include one or more illumination sources and/or one or more lenses. One or more illumination sources may be directed to illuminate portions of the surgical field. The one or more image sensors may receive light reflected or refracted from the surgical field, including light reflected or refracted from tissue and/or surgical instruments.

The one or more illumination sources may be configured to radiate electromagnetic energy in the visible spectrum as well as the invisible spectrum. The visible spectrum (sometimes referred to as the optical spectrum or the luminous spectrum) is that portion of the electromagnetic spectrum that is visible to (i.e., detectable by) the human eye, and may be referred to as visible light or simply light. A typical human eye responds to wavelengths in air from about 380 nm to about 750 nm.

The invisible spectrum (i.e., the non-luminous spectrum) is the portion of the electromagnetic spectrum that lies below and above the visible spectrum (i.e., wavelengths below about 380 nm and above about 750 nm). The human eye cannot detect the invisible spectrum. Wavelengths greater than about 750 nm are longer than the red visible spectrum, and they become invisible infrared (IR), microwave, and radio electromagnetic radiation. Wavelengths less than about 380 nm are shorter than the violet spectrum, and they become invisible ultraviolet, x-ray, and gamma-ray electromagnetic radiation.
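The band boundaries described above can be expressed as a small classifier; this is a minimal sketch, and the function name and return labels are illustrative rather than part of the disclosure:

```python
def classify_wavelength(nm: float) -> str:
    """Classify electromagnetic radiation by its wavelength in air (nm),
    using the approximate visible-band limits given above (~380-750 nm)."""
    if nm < 380.0:
        # Shorter than violet: ultraviolet, x-ray, gamma-ray radiation.
        return "invisible (below visible band)"
    if nm > 750.0:
        # Longer than red: infrared, microwave, radio radiation.
        return "invisible (above visible band)"
    return "visible"
```

For example, a 550 nm source would be classified as visible, while an 850 nm near-infrared source would fall above the visible band.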

In various aspects, the imaging device 124 is configured for use in minimally invasive surgery. Examples of imaging devices suitable for use in the present disclosure include, but are not limited to, arthroscopes, angioscopes, bronchoscopes, cholangioscopes, colonoscopes, cystoscopes, duodenoscopes, enteroscopes, esophagogastro-duodenoscopes (gastroscopes), endoscopes, laryngoscopes, nasopharyngoscopes, nephroscopes, sigmoidoscopes, thoracoscopes, and hysteroscopes.

In one aspect, the imaging device employs multispectral monitoring to distinguish topography from underlying structures. A multispectral image is one that captures image data within specific wavelength ranges across the electromagnetic spectrum. The wavelengths may be separated by filters or by the use of instruments that are sensitive to particular wavelengths, including light from frequencies beyond the visible light range, such as IR and ultraviolet. Spectral imaging may allow extraction of additional information that the human eye fails to capture with its receptors for red, green, and blue. The use of multispectral imaging is described in greater detail under the heading "Advanced Imaging Acquisition Module" of U.S. provisional patent application Ser. No. 62/611,341, entitled "INTERACTIVE SURGICAL PLATFORM", filed December 28, 2017, the disclosure of which is incorporated herein by reference in its entirety. Multispectral monitoring may be a useful tool for relocating a surgical site after a surgical task is completed, in order to perform one or more of the previously described tests on the treated tissue.
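One minimal way to model a multispectral capture of the kind described above is as a stack of per-band 2-D image planes indexed by wavelength band. The band names and wavelengths below are hypothetical examples for illustration only, not values taken from the disclosure:

```python
import numpy as np

# Hypothetical band plan: three visible bands plus one near-infrared band (nm).
BANDS = {"blue": 475, "green": 530, "red": 650, "near_ir": 850}

def make_cube(height: int, width: int) -> np.ndarray:
    """Allocate an empty multispectral 'image cube': one 2-D plane per band."""
    return np.zeros((len(BANDS), height, width))

def band_image(cube: np.ndarray, name: str) -> np.ndarray:
    """Extract the 2-D image plane recorded for one named wavelength band."""
    return cube[list(BANDS).index(name)]
```

A processing step that needs only the near-infrared information, for instance, would operate on `band_image(cube, "near_ir")` rather than on the full cube.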

It is self-evident that strict sterilization of the operating room and surgical equipment is required during any surgical procedure. The strict hygiene and sterilization conditions required in a "surgical theater" (i.e., an operating or treatment room) necessitate the highest possible sterility of all medical devices and equipment. Part of that sterilization process is the need to sterilize anything that comes in contact with the patient or penetrates the sterile field, including the imaging device 124 and its attachments and components. It should be understood that the sterile field may be considered a specified area that is considered free of microorganisms, such as within a tray or within a sterile towel, or the sterile field may be considered an area immediately around the patient that has been prepared for a surgical procedure. The sterile field may include the scrubbed team members, who are properly attired, as well as all furniture and fixtures in the area.

In various aspects, the visualization system 108 includes one or more imaging sensors, one or more image-processing units, one or more storage arrays, and one or more displays that are strategically arranged with respect to the sterile field, as illustrated in fig. 2. In one aspect, the visualization system 108 includes interfaces for HL7, PACS, and EMR. Various components of the visualization system 108 are described under the heading "Advanced Imaging Acquisition Module" of U.S. provisional patent application Ser. No. 62/611,341, entitled "INTERACTIVE SURGICAL PLATFORM", filed December 28, 2017, the disclosure of which is incorporated herein by reference in its entirety.

As shown in fig. 2, a main display 119 is positioned in the sterile field to be visible to an operator at the operating table 114. In addition, a visualization tower 111 is positioned outside the sterile field. The visualization tower 111 includes a first non-sterile display 107 and a second non-sterile display 109, which face away from each other. The visualization system 108, guided by the hub 106, is configured to be able to coordinate the flow of information to operators inside and outside the sterile field using the displays 107, 109, and 119. For example, the hub 106 may cause the visualization system 108 to display a snapshot of the surgical site, as recorded by the imaging device 124, on a non-sterile display 107 or 109, while maintaining a real-time feed of the surgical site on the main display 119. The snapshot on the non-sterile display 107 or 109 may allow a non-sterile operator to perform, for example, a diagnostic step relevant to the surgical procedure.

In one aspect, the hub 106 is further configured to be able to route diagnostic input or feedback entered by a non-sterile operator at the visualization tower 111 to the main display 119 within the sterile field, where it can be viewed by a sterile operator at the operating table. In one example, the input may be in the form of a modification to the snapshot displayed on the non-sterile display 107 or 109, which can be routed through the hub 106 to the main display 119.
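The hub-directed information flow described above (a live feed kept on the sterile-field display, with snapshots and operator feedback routed between displays) can be sketched roughly as follows. The class and method names are hypothetical, and frames are represented as plain strings purely for illustration:

```python
class DisplayRouterSketch:
    """Rough model of hub 106 coordinating main display 119 (sterile field)
    and non-sterile displays 107/109 on the visualization tower 111."""

    def __init__(self):
        self.main_display = []   # main display 119, inside the sterile field
        self.non_sterile = []    # non-sterile displays 107/109 on the tower

    def show_live(self, frame):
        # The real-time surgical-site feed stays on the main display.
        self.main_display.append(frame)

    def show_snapshot(self, frame):
        # Snapshots go to a non-sterile display for diagnostic review.
        self.non_sterile.append(frame)

    def route_feedback(self, annotated_frame):
        # Feedback entered at the visualization tower is routed back to the
        # main display, where the sterile operator can view it.
        self.main_display.append(annotated_frame)
```

The key design point is that non-sterile input never replaces the live feed; it is appended alongside it on the sterile-field display.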

Referring to fig. 2, a surgical instrument 112 is used in the surgical procedure as part of the surgical system 102. The hub 106 is further configured to coordinate the flow of information to a display of the surgical instrument 112. Such a coordinated information flow is further described in U.S. provisional patent application Ser. No. 62/611,341, entitled "INTERACTIVE SURGICAL PLATFORM", filed December 28, 2017, the disclosure of which is incorporated herein by reference in its entirety. Diagnostic input or feedback entered by a non-sterile operator at the visualization tower 111 may be routed by the hub 106 to a surgical instrument display 115 within the sterile field, where it may be viewed by the operator of the surgical instrument 112. Exemplary surgical instruments that are suitable for use with the surgical system 102 are described under the heading "Surgical Instrument Hardware" in U.S. provisional patent application Ser. No. 62/611,341, entitled "INTERACTIVE SURGICAL PLATFORM", filed December 28, 2017, the disclosure of which is incorporated herein by reference in its entirety.

Referring now to fig. 3, the hub 106 is depicted in communication with the visualization system 108, the robotic system 110, and the handheld intelligent surgical instrument 112. The hub 106 includes a hub display 135, an imaging module 138, a generator module 140 (which may include a monopolar generator 142, a bipolar generator 144, and/or an ultrasonic generator 143), a communication module 130, a processor module 132, and a memory array 134. In certain aspects, as shown in fig. 3, the hub 106 further includes a smoke evacuation module 126, a suction/irrigation module 128, and/or an operating-room mapping module 133.

During a surgical procedure, the application of energy to tissue for sealing and/or cutting is typically associated with smoke evacuation, suction of excess fluid, and/or irrigation of the tissue. Fluid lines, power lines, and/or data lines from different sources are often entangled during the procedure, and valuable time may be lost addressing this issue during surgery. Untangling the lines may necessitate disconnecting them from their respective modules, which may require resetting the modules. The hub modular housing 136 offers a unified environment for managing the power, data, and fluid lines, which reduces the frequency of entanglement among such lines.

Aspects of the present disclosure provide a surgical hub for use in a surgical procedure that involves the application of energy to tissue at a surgical site. The surgical hub includes a hub housing and a combined generator module slidably receivable in a docking station of the hub housing. The docking station includes data contacts and power contacts. The combined generator module includes two or more of an ultrasonic energy generator component, a bipolar RF energy generator component, and a monopolar RF energy generator component that are housed in a single unit. In one aspect, the combined generator module further comprises at least one energy delivery cable for connecting the combined generator module to a surgical instrument, at least one smoke evacuation component configured to evacuate smoke, fluids, and/or particulates generated by the application of therapeutic energy to the tissue, and a fluid line extending from the remote surgical site to the smoke evacuation component.

In one aspect, the fluid line is a first fluid line and the second fluid line extends from the remote surgical site to a suction and irrigation module slidably received in the hub housing. In one aspect, the hub housing includes a fluid interface.

Certain surgical procedures may require more than one energy type to be applied to tissue. One energy type may be more advantageous for cutting tissue, while a different energy type may be more advantageous for sealing tissue. For example, a bipolar generator may be used to seal tissue, while an ultrasonic generator may be used to cut the sealed tissue. Aspects of the present disclosure provide a solution in which the hub modular housing 136 is configured to accommodate different generators and facilitate interactive communication therebetween. One of the advantages of the hub modular housing 136 is the ability to quickly remove and/or replace various modules.

Aspects of the present disclosure provide a modular surgical housing for use in a surgical procedure that involves the application of energy to tissue. The modular surgical housing includes a first energy generator module configured to generate a first energy for application to the tissue, and a first docking station comprising a first docking port that includes first data and power contacts, wherein the first energy generator module is slidably movable into electrical engagement with the first power and data contacts, and wherein the first energy generator module is slidably movable out of electrical engagement with the first power and data contacts.

Further to the above, the modular surgical housing also includes a second energy generator module configured to generate a second energy, different from the first energy, for application to the tissue, and a second docking station comprising a second docking port that includes second data and power contacts, wherein the second energy generator module is slidably movable into electrical engagement with the second power and data contacts, and wherein the second energy generator module is slidably movable out of electrical engagement with the second power and data contacts.

In addition, the modular surgical housing further includes a communication bus between the first docking port and the second docking port configured to facilitate communication between the first energy generator module and the second energy generator module.

Referring to fig. 3-7, aspects of the present disclosure are presented as a hub modular housing 136 that allows the modular integration of the generator module 140, the smoke evacuation module 126, and the suction/irrigation module 128. The hub modular housing 136 further facilitates interactive communication between the modules 140, 126, 128. As shown in fig. 5, the generator module 140 may be a generator module with integrated monopolar, bipolar, and ultrasonic components supported in a single housing unit 139 that is slidably insertable into the hub modular housing 136. As shown in fig. 5, the generator module 140 may be configured to be connectable to a monopolar device 146, a bipolar device 147, and an ultrasonic device 148. Alternatively, the generator module 140 may comprise a series of monopolar generator modules, bipolar generator modules, and/or ultrasonic generator modules that interact through the hub modular housing 136. The hub modular housing 136 may be configured to facilitate the insertion of multiple generators and the interactive communication between the generators docked into the hub modular housing 136 such that the generators act as a single generator.

In one aspect, the hub modular housing 136 includes a modular power and communications backplane 149 having external and wireless communications connections to enable removable attachment of the modules 140, 126, 128 and interactive communications therebetween.

In one aspect, the hub modular housing 136 includes a docking station or drawer 151, also referred to herein as a drawer, which is configured to slidably receive the modules 140, 126, 128. Fig. 4 illustrates a partial perspective view of the surgical hub housing 136 and a combined generator module 145 slidably receivable in a docking station 151 of the surgical hub housing 136. A docking port 152 with power and data contacts on a rear side of the combined generator module 145 is configured to be able to engage a corresponding docking port 150 with power and data contacts of a corresponding docking station 151 of the hub modular housing 136 as the combined generator module 145 is slid into position within the corresponding docking station 151 of the hub modular housing 136. In one aspect, the combined generator module 145 includes a bipolar, an ultrasonic, and a monopolar module integrated together into a single housing unit 139, as illustrated in fig. 5.

In various aspects, the smoke evacuation module 126 includes a fluid line 154 that conveys captured/collected smoke and/or fluid away from the surgical site to, for example, the smoke evacuation module 126. Vacuum suction originating from the smoke evacuation module 126 can draw the smoke into an opening of a utility conduit at the surgical site. The utility conduit, coupled to the fluid line, may be in the form of a flexible tube terminating at the smoke evacuation module 126. The utility conduit and the fluid line define a fluid path extending toward the smoke evacuation module 126 received in the hub housing 136.

In various aspects, the suction/irrigation module 128 is coupled to a surgical tool comprising an irrigation fluid line and a suction fluid line. In one example, the irrigation and suction fluid lines are in the form of flexible tubes extending from the surgical site toward the suction/irrigation module 128. One or more drive systems may be configured to enable irrigation of fluids to, and aspiration of fluids from, the surgical site.

In one aspect, the surgical tool includes a shaft having an end effector at a distal end thereof, at least one energy treatment device associated with the end effector, a suction tube, and an irrigation tube. The suction tube may have an inlet port at a distal end thereof, and the suction tube extends through the shaft. Similarly, the irrigation tube may extend through the shaft and may have an inlet port in proximity to the energy delivery tool. The energy delivery tool is configured to deliver ultrasonic and/or RF energy to the surgical site and is coupled to the generator module 140 by a cable extending initially through the shaft.

The irrigation tube may be in fluid communication with a fluid source, and the aspiration tube may be in fluid communication with a vacuum source. The fluid source and/or vacuum source may be seated in the suction/irrigation module 128. In one example, the fluid source and/or vacuum source may be seated in the hub housing 136 independently of the suction/irrigation module 128. In such examples, the fluid interface can connect the suction/irrigation module 128 to a fluid source and/or a vacuum source.

In one aspect, the modules 140, 126, 128 and/or their corresponding docking stations on the hub modular housing 136 may include alignment features configured to align the docking ports of the modules into engagement with their counterparts in the docking stations of the hub modular housing 136. For example, as illustrated in fig. 4, the combined generator module 145 includes side brackets 155 that are configured to be slidably engageable with corresponding brackets 156 of the corresponding docking station 151 of the hub modular housing 136. The brackets cooperate to guide the docking port contacts of the combined generator module 145 into electrical engagement with the docking port contacts of the hub modular housing 136.

In some aspects, the drawers 151 of the hub modular housing 136 are the same or substantially the same size, and the modules are sized to be received in the drawers 151. For example, the side brackets 155 and/or 156 may be larger or smaller depending on the size of the module. In other aspects, the drawers 151 are sized differently and are each designed to accommodate a particular module.

In addition, the contacts of a particular module may be keyed to engage the contacts of a particular drawer to avoid inserting the module into a drawer having unmatched contacts.
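The keying scheme described above amounts to a compatibility check before a module's contacts can engage a drawer's contacts. A toy sketch follows; the drawer names and key values are invented for illustration and do not appear in the disclosure:

```python
# Hypothetical contact keys assigned to each drawer of the modular housing.
DRAWER_KEYS = {"drawer_a": "GEN", "drawer_b": "SMOKE", "drawer_c": "SUCTION"}

def can_dock(module_key: str, drawer: str) -> bool:
    """A module engages only a drawer whose contact key matches its own,
    preventing insertion of the module into a drawer with unmatched contacts."""
    return DRAWER_KEYS.get(drawer) == module_key
```

In hardware the "key" would be a mechanical or electrical feature of the contacts themselves; the lookup here simply models the accept/reject decision.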

As shown in fig. 4, the docking port 150 of one drawer 151 may be coupled to the docking port 150 of another drawer 151 by a communication link 157 to facilitate interactive communication between modules seated in the hub modular housing 136. Alternatively or additionally, the docking port 150 of the hub modular housing 136 can facilitate wireless interactive communication between modules seated in the hub modular housing 136. Any suitable wireless communication may be employed, such as, for example, Air Titan-Bluetooth.

Fig. 6 illustrates individual power bus attachments for a plurality of lateral docking ports of a lateral modular housing 160, which is configured to receive a plurality of modules of a surgical hub 206. The lateral modular housing 160 is configured to laterally receive and interconnect the modules 161. The modules 161 are slidably inserted into docking stations 162 of the lateral modular housing 160, which includes a backplane for interconnecting the modules 161. As illustrated in fig. 6, the modules 161 are arranged laterally in the lateral modular housing 160. Alternatively, the modules 161 may be arranged vertically in a lateral modular housing.

Fig. 7 illustrates a vertical modular housing 164 configured to receive a plurality of modules 165 of the surgical hub 106. The modules 165 are slidably inserted into docking stations or drawers 167 of the vertical modular housing 164, which includes a backplane for interconnecting the modules 165. Although the drawers 167 of the vertical modular housing 164 are arranged vertically, the vertical modular housing 164 may, in certain instances, include laterally arranged drawers. Furthermore, the modules 165 may interact with one another through the docking ports of the vertical modular housing 164. In the example of fig. 7, a display 177 is provided for displaying data relevant to the operation of the modules 165. In addition, the vertical modular housing 164 includes a master module 178 housing a plurality of sub-modules that are slidably received in the master module 178.

In various aspects, the imaging module 138 comprises an integrated video processor and a modular light source, and is adapted for use with various imaging devices. In one aspect, the imaging device is comprised of a modular housing that can be assembled with a light source module and a camera module. The housing may be a disposable housing. In at least one example, the disposable housing is removably coupled to a reusable controller, a light source module, and a camera module. The light source module and/or the camera module may be selected depending on the type of surgical procedure. In one aspect, the camera module comprises a CCD sensor. In another aspect, the camera module comprises a CMOS sensor. In another aspect, the camera module is configured for scanned-beam imaging. Likewise, the light source module may be configured to be able to deliver white light or a different light, depending on the surgical procedure.

During a surgical procedure, it may be inefficient to remove a surgical device from a surgical site and replace the surgical device with another surgical device that includes a different camera or a different light source. Temporary loss of vision at the surgical site can lead to undesirable consequences. The modular imaging apparatus of the present disclosure is configured to enable midstream replacement of a light source module or a camera module during a surgical procedure without having to remove the imaging apparatus from the surgical site.

In one aspect, an imaging device includes a tubular housing including a plurality of channels. The first channel is configured to slidably receive a camera module that may be configured for snap-fit engagement with the first channel. The second channel is configured to slidably receive a light source module that may be configured for snap-fit engagement with the second channel. In another example, the camera module and/or the light source module may be rotated within their respective channels to a final position. Threaded engagement may be used instead of snap-fit engagement.

In various examples, multiple imaging devices are placed at different locations in a surgical field to provide multiple views. The imaging module 138 may be configured to be able to switch between imaging devices to provide an optimal view. In various aspects, the imaging module 138 may be configured to be able to integrate images from different imaging devices.

Various image processors and imaging devices suitable for use with the present disclosure are described in U.S. patent No. 7,995,045, entitled "COMBINED SBI AND CONVENTIONAL IMAGE PROCESSOR", issued August 9, 2011, which is incorporated herein by reference in its entirety. In addition, U.S. patent No. 7,982,776, entitled "SBI MOTION ARTIFACT REMOVAL APPARATUS AND METHOD", issued July 19, 2011, which is incorporated herein by reference in its entirety, describes various systems for removing motion artifacts from image data; such systems may be integrated with the imaging module 138. Furthermore, the disclosures of U.S. patent application publication No. 2011/0306840, entitled "CONTROLLABLE MAGNETIC SOURCE TO FIXTURE INTRACORPOREAL APPARATUS", published December 15, 2011, and U.S. patent application publication No. 2014/0243597, entitled "SYSTEM FOR PERFORMING A MINIMALLY INVASIVE SURGICAL PROCEDURE", published August 28, 2014, are each incorporated herein by reference in their entirety.

Fig. 8 illustrates a surgical data network 201 comprising a modular communication hub 203 configured to enable connection of modular devices located in one or more operating rooms of a medical facility, or in any room of a medical facility specially equipped for surgical operations, to a cloud-based system (e.g., a cloud 204 that may include a remote server 213 coupled to a storage device 205). In one aspect, the modular communication hub 203 comprises a network hub 207 and/or a network switch 209 in communication with a network router. The modular communication hub 203 may also be coupled to a local computer system 210 to provide local computer processing and data manipulation. The surgical data network 201 may be configured as passive, intelligent, or switching. A passive surgical data network serves as a conduit for the data, enabling it to be transferred from one device (or segment) to another device (or segment), as well as to cloud computing resources. An intelligent surgical data network includes additional features to enable the traffic passing through the surgical data network to be monitored and to configure each port in the network hub 207 or the network switch 209. An intelligent surgical data network may be referred to as a manageable hub or switch. A switching hub reads the destination address of each packet and then forwards the packet to the correct port.
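The network modes described above differ chiefly in how a packet reaches its destination: a passive hub simply repeats each packet on every port, while a switching hub reads the destination address and forwards the packet only to the matching port. A minimal sketch, with invented class names and a simplified packet representation:

```python
class PassiveHubSketch:
    """A passive hub acts as a conduit: every packet goes out on every port."""

    def __init__(self, ports):
        self.ports = list(ports)

    def forward(self, packet):
        # Repeat the packet on all ports, regardless of destination.
        return {port: packet for port in self.ports}


class SwitchingHubSketch:
    """A switching hub keeps a destination-address-to-port table and forwards
    each packet only to the correct port."""

    def __init__(self):
        self.table = {}  # destination address -> port

    def learn(self, address, port):
        # Associate a device address with the port it is reachable on.
        self.table[address] = port

    def forward(self, packet):
        # packet is modeled as a (destination_address, payload) pair;
        # returns the matching port, or None if the address is unknown.
        destination, _payload = packet
        return self.table.get(destination)
```

The manageability attributed above to the intelligent network would sit on top of such a table, e.g. monitoring traffic counts per port or disabling a port.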

Modular devices 1a-1n located in an operating room may be coupled to the modular communication hub 203. The network hub 207 and/or the network switch 209 may be coupled to a network router 211 to connect the devices 1a-1n to the cloud 204 or the local computer system 210. Data associated with the devices 1a-1n may be transmitted via the router to cloud-based computers for remote data processing and manipulation. Data associated with the devices 1a-1n may also be transmitted to the local computer system 210 for local data processing and manipulation. Modular devices 2a-2m located in the same operating room may also be coupled to the network switch 209. The network switch 209 may be coupled to the network hub 207 and/or the network router 211 to connect the devices 2a-2m to the cloud 204. Data associated with the devices 2a-2m may be transmitted via the network router 211 to the cloud 204 for data processing and manipulation. Data associated with the devices 2a-2m may also be transmitted to the local computer system 210 for local data processing and manipulation.

It should be understood that surgical data network 201 may be expanded by interconnecting multiple hubs 207 and/or multiple network switches 209 with multiple network routers 211. The modular communication hub 203 may be contained in a modular control tower configured to be capable of receiving a plurality of devices 1a-1n/2a-2 m. Local computer system 210 may also be contained in a modular control tower. The modular communication hub 203 is connected to a display 212 to display images obtained by some of the devices 1a-1n/2a-2m, for example, during surgery. In various aspects, the devices 1a-1n/2a-2m may include, for example, various modules such as non-contact sensor modules in an imaging module 138 coupled to an endoscope, a generator module 140 coupled to an energy-based surgical device, a smoke evacuation module 126, a suction/irrigation module 128, a communication module 130, a processor module 132, a memory array 134, a surgical device connected to a display, and/or other modular devices that may be connected to a modular communication hub 203 of a surgical data network 201.

In one aspect, the surgical data network 201 may include a combination of network hub(s), network switch (es), and network router(s) that connect the devices 1a-1n/2a-2m to the cloud. Any or all of the devices 1a-1n/2a-2m coupled to the hub or network switch may collect data in real time and transmit the data into the cloud computer for data processing and manipulation. It should be appreciated that cloud computing relies on shared computing resources rather than using local servers or personal devices to process software applications. The term "cloud" may be used as a metaphor for "internet," although the term is not so limited. Accordingly, the term "cloud computing" may be used herein to refer to a "type of internet-based computing" in which different services (such as servers, memory, and applications) are delivered to modular communication hub 203 and/or computer system 210 located in a surgical room (e.g., a fixed, mobile, temporary, or live operating room or space) and devices connected to modular communication hub 203 and/or computer system 210 over the internet. The cloud infrastructure may be maintained by a cloud service provider. In this case, the cloud service provider may be an entity that coordinates the use and control of the devices 1a-1n/2a-2m located in one or more operating rooms. Cloud computing services can perform a large amount of computing based on data collected by smart surgical instruments, robots, and other computerized devices located in the operating room. The hub hardware enables multiple devices or connections to connect to a computer in communication with the cloud computing resources and memory.

By applying cloud computing data processing techniques to the data collected by the devices 1a-1n/2a-2m, the surgical data network can provide improved surgical outcomes, reduced costs, and improved patient satisfaction. At least some of the devices 1a-1n/2a-2m may be employed to view tissue states to assess leaks or perfusion of sealed tissue after a tissue sealing and cutting procedure. At least some of the devices 1a-1n/2a-2m may be employed to identify pathology, such as the effects of disease, using cloud-based computing to examine data, including images of samples of body tissue, for diagnostic purposes. This includes localization and margin confirmation of tissue and phenotypes. At least some of the devices 1a-1n/2a-2m may be employed to identify anatomical structures of the body using various sensors integrated with imaging devices and techniques, such as overlaying images captured by multiple imaging devices. The data (including image data) collected by the devices 1a-1n/2a-2m may be transmitted to the cloud 204 or the local computer system 210, or both, for data processing and manipulation, including image processing and manipulation. Such data analysis may further employ outcome analysis processing, and use of standardized approaches may provide beneficial feedback to either confirm, or suggest modifications to, surgical treatments and the behavior of the surgeon.

In one implementation, the operating room devices 1a-1n may be connected to the modular communication hub 203 over a wired channel or a wireless channel, depending on the configuration of the devices 1a-1n to the network hub. In one aspect, the hub 207 may be implemented as a local network broadcast device that operates at the physical layer of the Open Systems Interconnection (OSI) model. The hub provides connectivity to the devices 1a-1n located in the same operating room network. The hub 207 collects data in the form of packets and sends them to the router in half-duplex mode. The hub 207 does not store any media access control/internet protocol (MAC/IP) addresses used to transfer the device data. Only one of the devices 1a-1n can send data at a time through the hub 207. The hub 207 has no routing tables or intelligence regarding where to send information; it broadcasts all network data over each connection, as well as to the remote server 213 (fig. 9) through the cloud 204. The hub 207 can detect basic network errors, such as collisions, but broadcasting all information to multiple ports can be a security risk and can cause bottlenecks.
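The flood-to-every-port behavior of a physical-layer hub described above can be sketched in a few lines. This is an illustrative model only; the class and names are assumptions for exposition, not part of the disclosure:

```python
# Minimal sketch of a physical-layer hub: it keeps no MAC/IP state and
# simply repeats each frame received on one port out of every other port.

class NetworkHub:
    """Repeats frames to all ports except the ingress port (no address table)."""

    def __init__(self, num_ports):
        self.num_ports = num_ports

    def forward(self, ingress_port, frame):
        # Broadcast: every attached device sees every frame, which is why
        # a hub can become both a bottleneck and a security concern.
        return [(port, frame) for port in range(self.num_ports)
                if port != ingress_port]

hub = NetworkHub(num_ports=4)
out = hub.forward(ingress_port=1, frame=b"device-data")
assert [port for port, _ in out] == [0, 2, 3]
```

Because the hub has no forwarding intelligence, the only decision it makes is "everywhere but where the frame came from," which is exactly what the broadcast list comprehension models.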

In another implementation, the operating room devices 2a-2m may be connected to the network switch 209 via a wired channel or a wireless channel. Network switch 209 operates in the data link layer of the OSI model. The network switch 209 is a multicast device for connecting devices 2a-2m located in the same operating room to the network. Network switch 209 sends data in frames to network router 211 and operates in full duplex mode. Multiple devices 2a-2m may transmit data simultaneously through the network switch 209. The network switch 209 stores and uses the MAC addresses of the devices 2a-2m to transmit data.
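In contrast with the hub, the data-link-layer switch described above forwards by MAC address. The following is a hedged, illustrative sketch of a learning switch (names and the simplified flooding fallback are assumptions, not the patent's implementation):

```python
# Sketch of a MAC-learning switch: it records which port each source MAC
# address was seen on, then forwards later frames only to the matching port.

class NetworkSwitch:
    def __init__(self, num_ports):
        self.num_ports = num_ports
        self.mac_table = {}  # MAC address -> port

    def forward(self, ingress_port, src_mac, dst_mac, frame):
        self.mac_table[src_mac] = ingress_port  # learn the source address
        if dst_mac in self.mac_table:
            return [(self.mac_table[dst_mac], frame)]  # targeted delivery
        # Unknown destination: flood to all other ports until it is learned.
        return [(p, frame) for p in range(self.num_ports) if p != ingress_port]

sw = NetworkSwitch(num_ports=4)
sw.forward(0, src_mac="2a", dst_mac="2b", frame=b"hello")  # floods, learns "2a"
out = sw.forward(1, src_mac="2b", dst_mac="2a", frame=b"reply")
assert out == [(0, b"reply")]  # delivered only to the learned port
```

The stored MAC table is what lets multiple devices 2a-2m transmit simultaneously: frames for different destinations do not contend for a shared broadcast medium.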

The network hub 207 and/or the network switch 209 are coupled to the network router 211 to connect to the cloud 204. The network router 211 works in the network layer of the OSI model. The network router 211 creates routes for sending data packets received from the network hub 207 and/or the network switch 209 to cloud-based computer resources for further processing and manipulation of the data collected by any or all of the devices 1a-1n/2a-2m. The network router 211 may be employed to connect two or more different networks located in different locations, such as, for example, different operating rooms of the same medical facility or different networks located in different operating rooms of different medical facilities. The network router 211 sends data in packets to the cloud 204 and works in full-duplex mode. Multiple devices can send data at the same time. The network router 211 uses IP addresses to transfer data.
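The network-layer routing described above can be illustrated with a longest-prefix-match lookup. The route table, next-hop names, and addresses below are hypothetical examples, not part of the disclosure:

```python
# Sketch of IP routing: the router chooses a next hop for each packet by
# longest-prefix match on the destination address, which is how traffic from
# different operating-room networks can reach each other or a cloud uplink.
import ipaddress

ROUTES = {
    ipaddress.ip_network("10.1.0.0/16"): "operating-room-A",
    ipaddress.ip_network("10.2.0.0/16"): "operating-room-B",
    ipaddress.ip_network("0.0.0.0/0"): "cloud-uplink",  # default route
}

def next_hop(dst_ip):
    dst = ipaddress.ip_address(dst_ip)
    matches = [net for net in ROUTES if dst in net]
    # The most specific (longest) matching prefix wins.
    return ROUTES[max(matches, key=lambda net: net.prefixlen)]

assert next_hop("10.1.4.7") == "operating-room-A"
assert next_hop("203.0.113.9") == "cloud-uplink"  # no specific route: default
```

The default route (`0.0.0.0/0`) matches everything, so any destination outside the local operating-room prefixes is sent toward the cloud, mirroring the topology in fig. 8.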

In one example, hub 207 may be implemented as a USB hub, which allows multiple USB devices to be connected to a host. A USB hub may extend a single USB port to multiple tiers so that more ports are available for connecting devices to a host system computer. Hub 207 may include wired or wireless capabilities for receiving information over a wired channel or a wireless channel. In one aspect, a wireless USB short-range, high bandwidth wireless radio communication protocol may be used for communication between devices 1a-1n and devices 2a-2m located in an operating room.

In other examples, the operating room devices 1a-1n/2a-2m may communicate with the modular communication hub 203 via the Bluetooth wireless technology standard for exchanging data over short distances (using short-wavelength UHF radio waves in the ISM band from 2.4 to 2.485 GHz) from fixed and mobile devices, and for building personal area networks (PANs). In other aspects, the operating room devices 1a-1n/2a-2m may communicate with the modular communication hub 203 via a number of wireless or wired communication standards or protocols, including but not limited to Wi-Fi (IEEE 802.11 family), WiMAX (IEEE 802.16 family), IEEE 802.20, long-term evolution (LTE) and Ev-DO, HSPA+, HSDPA+, HSUPA+, EDGE, GSM, GPRS, CDMA, TDMA, DECT, and Ethernet derivatives thereof, as well as any other wireless and wired protocols that are designated 3G, 4G, 5G, and beyond. The computing module may include a plurality of communication modules. For instance, a first communication module may be dedicated to shorter-range wireless communications such as Wi-Fi and Bluetooth, and a second communication module may be dedicated to longer-range wireless communications such as GPS, EDGE, GPRS, CDMA, WiMAX, LTE, Ev-DO, and others.

The modular communication hub 203 may serve as a central connection for one or all of the operating room devices 1a-1n/2a-2m and handles a type of data called frames. Frames carry the data generated by the devices 1a-1n/2a-2m. When a frame is received by the modular communication hub 203, it is amplified and transmitted to the network router 211, which transfers the data to the cloud computing resources using a number of the wireless or wired communication standards or protocols described herein.

Modular communication hub 203 may be used as a stand-alone device or connected to a compatible network hub and network switch to form a larger network. The modular communication hub 203 is generally easy to install, construct and maintain, making it a good option to network the operating room devices 1a-1n/2a-2 m.

Fig. 9 shows a computer-implemented interactive surgical system 200. The computer-implemented interactive surgical system 200 is similar in many respects to the computer-implemented interactive surgical system 100. For example, the computer-implemented interactive surgical system 200 includes one or more surgical systems 202 that are similar in many respects to the surgical system 102. Each surgical system 202 includes at least one surgical hub 206 in communication with a cloud 204, which may include a remote server 213. In one aspect, the computer-implemented interactive surgical system 200 includes a modular control tower 236 connected to multiple operating room devices, such as, for example, intelligent surgical instruments, robots, and other computerized devices located in the operating room. As shown in fig. 10, the modular control tower 236 includes a modular communication hub 203 coupled to a computer system 210. As shown in the example of fig. 9, the modular control tower 236 is coupled to an imaging module 238 coupled to an endoscope 239, a generator module 240 coupled to an energy device 241, a smoke evacuation module 226, a suction/irrigation module 228, a communication module 230, a processor module 232, a storage array 234, a smart device/instrument 235 optionally coupled to a display 237, and a non-contact sensor module 242. The operating room devices are coupled to cloud computing resources and data storage via the modular control tower 236. A robot hub 222 may also be connected to the modular control tower 236 and to the cloud computing resources. The devices/instruments 235, visualization system 208, and so forth, may be coupled to the modular control tower 236 via wired or wireless communication standards or protocols, as described herein.
The modular control tower 236 may be coupled to the hub display 215 (e.g., monitor, screen) to display and overlay images received from the imaging module, device/instrument display, and/or other visualization system 208. The hub display may also combine the image and the overlay image to display data received from devices connected to the modular control tower.

Fig. 10 shows the surgical hub 206 including a plurality of modules coupled to a modular control tower 236. The modular control tower 236 includes a modular communication hub 203 (e.g., a network connectivity device) and a computer system 210 to provide, for example, local processing, visualization, and imaging. As shown in fig. 10, the modular communication hub 203 may be connected in a hierarchical configuration to expand the number of modules (e.g., devices) connectable to the modular communication hub 203 and transmit data associated with the modules to the computer system 210, cloud computing resources, or both. As shown in fig. 10, each of the network hubs/switches in modular communication hub 203 includes three downstream ports and one upstream port. The upstream hub/switch is connected to the processor to provide a communication connection with the cloud computing resources and the local display 217. Communication with the cloud 204 may be through a wired or wireless communication channel.

The surgical hub 206 employs the non-contact sensor module 242 to measure the dimensions of the operating room and to generate a map of the operating room using either ultrasonic or laser-type non-contact measurement devices. An ultrasound-based non-contact sensor module scans the operating room by transmitting a burst of ultrasound and receiving the echo when it bounces off the perimeter walls of the operating room, as described under the heading "Surgical Hub Spatial Awareness Within an Operating Room" in U.S. provisional patent application serial No. 62/611,341, entitled "INTERACTIVE SURGICAL PLATFORM," filed December 28, 2017, which is incorporated herein by reference in its entirety, in which the sensor module is configured to determine the size of the operating room and to adjust the Bluetooth pairing distance limit. A laser-based non-contact sensor module scans the operating room by transmitting laser pulses, receiving laser pulses that bounce off the perimeter walls of the operating room, and comparing the phase of the transmitted pulse to the received pulse to determine the size of the operating room and to adjust the Bluetooth pairing distance limit.
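The ultrasound measurement described above reduces to a time-of-flight computation: distance equals propagation speed times round-trip time, halved. A minimal sketch follows; the speed of sound constant (approximately 343 m/s in air at 20 °C) and the sample echo time are assumptions for illustration:

```python
# Time-of-flight sketch for the ultrasound-based room measurement:
# the burst travels to the wall and back, so the one-way distance is
# speed * round_trip_time / 2.

SPEED_OF_SOUND_M_S = 343.0  # assumed: dry air at ~20 degrees C

def wall_distance_m(echo_round_trip_s):
    """One-way distance to the reflecting wall, in meters."""
    return SPEED_OF_SOUND_M_S * echo_round_trip_s / 2.0

# Example: a 35 ms round trip implies a wall roughly 6 m away.
d = wall_distance_m(0.035)
assert abs(d - 6.0025) < 1e-9
```

A value like this, computed for each wall, could inform a pairing-distance limit: devices beyond the measured room boundary would not be candidates for pairing. The laser variant mentioned in the text infers the same distance from pulse phase rather than echo timing.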

The computer system 210 includes a processor 244 and a network interface 245. The processor 244 is coupled via a system bus to a communication module 247, storage 248, memory 249, non-volatile memory 250, and an input/output interface 251. The system bus can be any of several types of bus structure(s), including the memory bus or memory controller, a peripheral bus or external bus, and/or a local bus using any of a variety of available bus architectures including, but not limited to, an 8-bit bus, Industrial Standard Architecture (ISA), Micro Channel Architecture (MSA), Extended ISA (EISA), Intelligent Drive Electronics (IDE), VESA Local Bus (VLB), Peripheral Component Interconnect (PCI), USB, Advanced Graphics Port (AGP), Personal Computer Memory Card International Association bus (PCMCIA), Small Computer Systems Interface (SCSI), or any other peripheral bus.

The processor 244 may be any single-core or multicore processor, such as those known under the trade name ARM Cortex by Texas Instruments. In one aspect, the processor may be an LM4F230H5QR ARM Cortex-M4F processor core, available from Texas Instruments, for example, comprising an on-chip memory of 256 KB single-cycle flash memory, or other non-volatile memory, up to 40 MHz, a prefetch buffer to improve performance above 40 MHz, a 32 KB single-cycle static random access memory (SRAM), an internal read-only memory (ROM) loaded with StellarisWare® software, a 2 KB electrically erasable programmable read-only memory (EEPROM), one or more pulse width modulation (PWM) modules, one or more quadrature encoder inputs (QEIs), and/or one or more 12-bit analog-to-digital converters (ADCs) with 12 analog input channels, details of which are available in the product datasheet.

In one aspect, the processor 244 may comprise a safety controller comprising two controller-based families, such as TMS570 and RM4x, known under the trade name Hercules ARM Cortex R4, also by Texas Instruments. The safety controller may be configured specifically for IEC 61508 and ISO 26262 safety-critical applications, among others, to provide advanced integrated safety features while delivering scalable performance, connectivity, and memory options.

The system memory includes volatile memory and non-volatile memory. The basic input/output system (BIOS), containing the basic routines to transfer information between elements within the computer system, such as during start-up, is stored in non-volatile memory. For example, the non-volatile memory can include ROM, programmable ROM (PROM), electrically programmable ROM (EPROM), EEPROM, or flash memory. Volatile memory includes random access memory (RAM), which acts as external cache memory. Moreover, RAM is available in many forms, such as SRAM, dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), and direct Rambus RAM (DRRAM).

The computer system 210 also includes removable/non-removable, volatile/nonvolatile computer storage media such as, for example, disk storage. Disk storage includes, but is not limited to, devices such as a magnetic disk drive, floppy disk drive, tape drive, Jaz drive, Zip drive, LS-60 drive, flash memory card, or memory stick. In addition, disk storage can include storage media separately or in combination with other storage media including, but not limited to, an optical disk drive such as a compact disk ROM device (CD-ROM), compact disk recordable drive (CD-R drive), compact disk rewritable drive (CD-RW drive) or a digital versatile disk ROM drive (DVD-ROM). To facilitate connection of the disk storage devices to the system bus, a removable or non-removable interface may be used.

It is to be appreciated that the computer system 210 includes software that acts as an intermediary between users and the basic computer resources described in suitable operating environments. Such software includes an operating system. An operating system, which may be stored on disk storage, is used to control and allocate resources of the computer system. System applications utilize the operating system to manage resources through program modules and program data stored in system memory or on disk storage. It is to be appreciated that the various components described herein can be implemented with various operating systems or combinations of operating systems.

A user enters commands or information into the computer system 210 through input device(s) coupled to the I/O interface 251. Input devices include, but are not limited to, a pointing device such as a mouse, trackball, stylus, touch pad, keyboard, microphone, joystick, game pad, satellite dish, scanner, TV tuner card, digital camera, digital video camera, web camera, and the like. These and other input devices connect to the processor through the system bus via interface port(s). The interface port(s) include, for example, a serial port, a parallel port, a game port, and a USB port. The output device(s) use some of the same types of ports as the input device(s). Thus, for example, a USB port may be used to provide input to the computer system and to output information from the computer system to an output device. Output adapters are provided to illustrate that there are some output devices, such as monitors, displays, speakers, and printers, among other output devices, that require special adapters.

The computer system 210 may operate in a networked environment using logical connections to one or more remote computers, such as cloud computer(s), or local computers. The remote cloud computer(s) can be a personal computer, server, router, network PC, workstation, microprocessor-based appliance, peer device, or other common network node, and the like, and typically includes many or all of the elements described relative to the computer system. For purposes of brevity, only a memory storage device is illustrated with the remote computer(s). The remote computer(s) is logically connected to the computer system through a network interface and then physically connected via a communication connection. The network interface encompasses communication networks such as local area networks (LANs) and wide area networks (WANs). LAN technologies include Fiber Distributed Data Interface (FDDI), Copper Distributed Data Interface (CDDI), Ethernet/IEEE 802.3, Token Ring/IEEE 802.5, and the like. WAN technologies include, but are not limited to, point-to-point links, circuit-switching networks like Integrated Services Digital Networks (ISDN) and variations thereon, packet-switching networks, and Digital Subscriber Lines (DSL).

In various aspects, the computer system 210, imaging module 238, and/or visualization system 208 of fig. 10, and/or the processor module 232 of fig. 9-10 may include an image processor, an image processing engine, a media processor, or any dedicated Digital Signal Processor (DSP) for processing digital images. The image processor may employ parallel computing with single instruction, multiple data (SIMD) or multiple instruction, multiple data (MIMD) techniques to increase speed and efficiency. The digital image processing engine may perform a series of tasks. The image processor may be a system on a chip having a multi-core processor architecture.

Communication connection(s) refers to the hardware/software used to interface the network to the bus. While a communication connection is shown for exemplary clarity within the computer system, it can also be external to computer system 210. The hardware/software necessary for connection to the network interface includes, for exemplary purposes only, internal and external technologies such as, modems including regular telephone grade modems, cable modems and DSL modems, ISDN adapters, and Ethernet cards.

Fig. 11 illustrates a functional block diagram of one aspect of a USB hub 300 device in accordance with at least one aspect of the present disclosure. In the illustrated aspect, the USB hub device 300 employs a TUSB2036 integrated circuit hub by Texas Instruments. The USB hub 300 is a CMOS device that provides an upstream USB transceiver port 302 and up to three downstream USB transceiver ports 304, 306, 308 in compliance with the USB 2.0 specification. The upstream USB transceiver port 302 is a differential root data port comprising a differential data positive (DP0) input paired with a differential data negative (DM0) input. The three downstream USB transceiver ports 304, 306, 308 are differential data ports, where each port includes a differential data positive (DP1-DP3) output paired with a differential data negative (DM1-DM3) output.

The USB hub 300 device is implemented with a digital state machine rather than a microcontroller and does not require firmware programming. Fully compatible USB transceivers are integrated into the circuitry for the upstream USB transceiver port 302 and all downstream USB transceiver ports 304, 306, 308. The downstream USB transceiver ports 304, 306, 308 support both full-speed devices and low-speed devices by automatically setting the slew rate according to the speed of the device attached to the port. The USB hub 300 device may be configured in a bus-powered mode or a self-powered mode and includes hub power logic 312 for managing power.

The USB hub 300 device includes a serial interface engine (SIE) 310. The SIE 310 is the front end of the USB hub 300 hardware and handles most of the protocol described in chapter 8 of the USB specification. The SIE 310 typically comprehends signaling up to the transaction level. Its handled functions may include: packet recognition, transaction sequencing, SOP, EOP, RESET, and RESUME signal detection/generation, clock/data separation, non-return-to-zero inverted (NRZI) data encoding/decoding and bit stuffing, CRC generation and checking (token and data), packet ID (PID) generation and checking/decoding, and/or serial-parallel/parallel-serial conversion. The SIE 310 receives a clock input 314 and is coupled to suspend/resume logic and frame timer 316 circuits and hub repeater circuit 318 to control communication between the upstream USB transceiver port 302 and the downstream USB transceiver ports 304, 306, 308 through port logic circuits 320, 322, 324. The SIE 310 is coupled to a command decoder 326 via interface logic 328 to control commands from a serial EEPROM via a serial EEPROM interface 330.
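Two of the SIE front-end tasks named above, NRZI encoding and bit stuffing, are simple enough to sketch. In USB signaling, a 0 bit is encoded as a level transition and a 1 bit as no transition, and a 0 is inserted after six consecutive 1s so the receiver's clock recovery always sees transitions. The functions below are an illustrative model, not TUSB2036 firmware (the device is in fact a hardware state machine):

```python
# Sketch of bit stuffing and NRZI encoding as performed by a USB SIE.

def bit_stuff(bits):
    """Insert a 0 after every run of six consecutive 1s."""
    out, run = [], 0
    for b in bits:
        out.append(b)
        run = run + 1 if b == 1 else 0
        if run == 6:          # six 1s in a row: force a transition
            out.append(0)
            run = 0
    return out

def nrzi_encode(bits, level=1):
    """NRZI: a 0 toggles the line level, a 1 leaves it unchanged."""
    out = []
    for b in bits:
        if b == 0:
            level ^= 1        # transition encodes a 0
        out.append(level)     # no transition encodes a 1
    return out

assert bit_stuff([1] * 7) == [1, 1, 1, 1, 1, 1, 0, 1]
assert nrzi_encode([0, 1, 0], level=1) == [0, 0, 1]
```

Bit stuffing runs before NRZI encoding on the transmit path; the receive path reverses both steps (transition detection, then removal of stuffed zeros), which is the "clock/data separation" and decoding work attributed to the SIE above.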

In various aspects, the USB hub 300 can connect 127 functions configured in up to six logical layers (tiers) to a single computer. Further, the USB hub 300 can be connected to all peripherals using a standardized four-wire cable that provides both communication and power distribution. The power configurations are bus-powered mode and self-powered mode. The USB hub 300 may be configured to support four modes of power management: a bus-powered hub with either individual-port power management or ganged-port power management, and a self-powered hub with either individual-port power management or ganged-port power management. In one aspect, using a USB cable, the USB hub 300 and the upstream USB transceiver port 302 are plugged into a USB host controller, and the downstream USB transceiver ports 304, 306, 308 are exposed for connecting USB-compatible devices, and so forth.
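The capacity figures above follow from simple arithmetic: chaining hubs that each expose three downstream ports (as in fig. 10) grows the number of attachable endpoints geometrically, capped by USB's 127-address limit. The function below is a hypothetical illustration of that arithmetic, not part of the disclosure:

```python
# Capacity sketch for a tiered hub topology: each tier multiplies the number
# of available leaf ports by the per-hub downstream port count, up to the
# USB limit of 127 addressable functions per host.

def max_endpoints(ports_per_hub, tiers):
    """Leaf ports when every downstream port in every tier carries another hub."""
    capacity = ports_per_hub ** tiers
    return min(capacity, 127)  # USB address space allows at most 127 functions

assert max_endpoints(3, 1) == 3    # a single hub
assert max_endpoints(3, 4) == 81   # four tiers of three-port hubs
assert max_endpoints(3, 5) == 127  # 243 raw ports exceed the address space
```

This is why the modular communication hub 203 can be expanded hierarchically, as described for fig. 10, without changing any single device: added tiers multiply connectivity until the protocol's addressing limit binds.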

Additional details regarding the structure and function of the surgical hub and/or surgical hub network can be found in U.S. provisional patent application No. 62/659,900, entitled "METHOD OF HUB COMMUNICATION," filed April 19, 2018, which is incorporated herein by reference in its entirety.

Cloud system hardware and functional module

Fig. 12 is a block diagram of a computer-implemented interactive surgical system in accordance with at least one aspect of the present disclosure. In one aspect, a computer-implemented interactive surgical system is configured to monitor and analyze data related to the operation of various surgical systems, including surgical hubs, surgical instruments, robotic devices, and operating rooms or medical facilities. A computer-implemented interactive surgical system includes a cloud-based analysis system. While the cloud-based analysis system is described as a surgical system, it is not necessarily so limited and may generally be a cloud-based medical system. As shown in fig. 12, the cloud-based analysis system includes a plurality of surgical instruments 7012 (which may be the same as or similar to instrument 112), a plurality of surgical hubs 7006 (which may be the same as or similar to hub 106), and a surgical data network 7001 (which may be the same as or similar to network 201) to couple the surgical hubs 7006 to cloud 7004 (which may be the same as or similar to cloud 204). Each of the plurality of surgical hubs 7006 is communicatively coupled to one or more surgical instruments 7012. The hub 7006 is also communicatively coupled to a cloud 7004 of the computer-implemented interactive surgical system via a network 7001. The cloud 7004 is a remote centralized hardware and software source for storing, manipulating, and transmitting data generated based on the operation of various surgical systems. As shown in fig. 12, access to cloud 7004 is enabled via network 7001, which may be the internet or some other suitable computer network. The surgical hub 7006 coupled to the cloud 7004 may be considered a client side of a cloud computing system (i.e., a cloud-based analysis system). The surgical instrument 7012 is paired with a surgical hub 7006 for use in controlling and effecting various surgical procedures or operations as described herein.

In addition, the surgical instrument 7012 can include a transceiver for transmitting data to and from its corresponding surgical hub 7006 (which can also include a transceiver). The combination of the surgical instrument 7012 and the corresponding hub 7006 may indicate a particular location for providing a medical procedure, such as an operating room in a medical facility (e.g., a hospital). For example, the memory of the surgical hub 7006 may store location data. As shown in fig. 12, the cloud 7004 includes central servers 7013 (which may be the same as or similar to remote server 113 in fig. 1 and/or remote server 213 in fig. 9), a hub application server 7002, a data analysis module 7034, and an input/output ("I/O") interface 7007. The central servers 7013 of the cloud 7004 collectively host a cloud computing system, which includes monitoring requests of the client surgical hubs 7006 and managing the processing capacity of the cloud 7004 for executing those requests. Each of the central servers 7013 includes one or more processors 7008 coupled to a suitable memory device 7010, which may include volatile memory, such as random access memory (RAM), and non-volatile memory, such as magnetic storage. The memory device 7010 may include machine-executable instructions that, when executed, cause the processor 7008 to execute the data analysis module 7034 for cloud-based data analysis, operations, recommendations, and other operations described below. Further, the processor 7008 may execute the data analysis module 7034 independently or in conjunction with a hub application executed independently by the hub 7006. The central server 7013 also includes a database 7011 of aggregated medical data, which may reside in the memory 7010.

Based on the connections to the various surgical hubs 7006 via the network 7001, the cloud 7004 may aggregate data from the particular data generated by the various surgical instruments 7012 and their corresponding hubs 7006. Such aggregated data may be stored within the database 7011 of aggregated medical data of the cloud 7004. In particular, the cloud 7004 may advantageously perform data analysis and operations on the aggregated data to generate insights and/or perform functions that the individual hubs 7006 could not achieve on their own. To this end, as shown in fig. 12, the cloud 7004 and the surgical hubs 7006 are communicatively coupled to transmit and receive information. The I/O interface 7007 is connected to the plurality of surgical hubs 7006 via the network 7001. In this manner, the I/O interface 7007 may be configured to enable the transfer of information between the surgical hubs 7006 and the database 7011 of aggregated medical data. Accordingly, the I/O interface 7007 may facilitate read/write operations of the cloud-based analytics system. Such read/write operations may be performed in response to requests from the hubs 7006. These requests may be transmitted by the hubs 7006 through the hub applications. The I/O interface 7007 may include one or more high-speed data ports, which may include a Universal Serial Bus (USB) port, an IEEE 1394 port, and Wi-Fi and Bluetooth I/O interfaces for connecting the cloud 7004 to the hubs 7006. The hub application server 7002 of the cloud 7004 is configured to host and supply shared capabilities to the software applications (e.g., hub applications) executed by the surgical hubs 7006. For example, the hub application server 7002 may manage requests made by the hub applications through the hubs 7006, control access to the database 7011 of aggregated medical data, and perform load balancing. The data analysis module 7034 is described in more detail with reference to fig. 13.
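The aggregation path described above can be sketched minimally: each hub writes instrument records tagged with its identity, and the cloud pools them so that an analysis can span hubs. This is a hypothetical illustration; the class, field names, and record values are assumptions, not the disclosed database design:

```python
# Sketch of cross-hub aggregation: writes arrive from client surgical hubs
# via an I/O interface; reads pool records so analyses can compare hubs.
from collections import defaultdict

class AggregatedMedicalDatabase:
    def __init__(self):
        self.records = []

    def write(self, hub_id, record):
        # A write performed on behalf of a client surgical hub.
        self.records.append({"hub": hub_id, **record})

    def read_by_procedure(self, procedure):
        # Pool matching records, grouped per hub, for cross-hub analysis.
        grouped = defaultdict(list)
        for r in self.records:
            if r["procedure"] == procedure:
                grouped[r["hub"]].append(r)
        return dict(grouped)

db = AggregatedMedicalDatabase()
db.write("hub-7006-a", {"procedure": "colectomy", "staple_firings": 4})
db.write("hub-7006-b", {"procedure": "colectomy", "staple_firings": 6})
pooled = db.read_by_procedure("colectomy")
assert set(pooled) == {"hub-7006-a", "hub-7006-b"}
```

The point of the sketch is the grouping step: an individual hub only ever sees its own rows, whereas the pooled view is what enables the cross-hub insights the text attributes to the cloud.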

The particular cloud computing system configurations described in this disclosure are specifically designed to address various issues arising in the context of medical operations and procedures performed using medical devices, such as the surgical instruments 7012, 112. In particular, the surgical instruments 7012 can be digital surgical devices configured to be able to interact with the cloud 7004 for implementing techniques that improve the performance of surgical procedures. The various surgical instruments 7012 and/or surgical hubs 7006 may include touch-controlled user interfaces such that clinicians can control aspects of the interaction between the surgical instruments 7012 and the cloud 7004. Other suitable user interfaces for control, such as auditory control user interfaces, may also be used.

Fig. 13 is a block diagram illustrating the functional architecture of a computer-implemented interactive surgical system in accordance with at least one aspect of the present disclosure. The cloud-based analytics system includes a plurality of data analysis modules 7034 executable by the processors 7008 of the cloud 7004 to provide data analysis solutions to problems that arise specifically in the medical field. As shown in fig. 13, the functionality of the cloud-based data analysis modules 7034 may be facilitated via hub applications 7014 hosted by the hub application server 7002, which are accessible on the surgical hubs 7006. The cloud processors 7008 and the hub applications 7014 may operate in conjunction to execute the data analysis modules 7034. An application program interface (API) 7016 defines the set of protocols and routines corresponding to the hub applications 7014. In addition, the API 7016 manages the storage and retrieval of data into and from the aggregated medical data database 7011 for the operation of the applications 7014. The cache 7018 also stores data (e.g., temporarily) and is coupled to the API 7016 for more efficient retrieval of the data used by the applications 7014. The data analysis modules 7034 in fig. 13 include a resource optimization module 7020, a data collection and aggregation module 7022, an authorization and security module 7024, a control program update module 7026, a patient outcome analysis module 7028, a recommendation module 7030, and a data classification and prioritization module 7032. According to some aspects, the cloud 7004 can also implement other suitable data analysis modules. In one aspect, the data analysis modules are used to analyze trends, outcomes, and other data to generate specific recommendations.

For example, the data collection and aggregation module 7022 may be used to generate self-describing data (e.g., metadata), including identification of notable features or configurations (e.g., trends), management of redundant data sets, and storage of the data in paired data sets that may be grouped by surgery but not necessarily keyed to actual surgical dates and surgeons. In particular, the sets of data generated by operation of the surgical instruments 7012 may have a binary classification applied, e.g., bleeding or non-bleeding events. More generally, the binary classification may be characterized as either a desired event (e.g., a successful surgical procedure) or an undesired event (e.g., a misfired or misused surgical instrument 7012). The aggregated self-describing data may correspond to individual data received from various groups or subgroups of surgical hubs 7006. Accordingly, the data collection and aggregation module 7022 can generate aggregated metadata or other organized data based on the raw data received from the surgical hubs 7006. To this end, the processors 7008 may be operably coupled to the hub applications 7014 and the aggregated medical data database 7011 for executing the data analysis modules 7034. The data collection and aggregation module 7022 may store the aggregated organized data in the aggregated medical data database 7011.
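The grouping and binary classification described above can be sketched as follows; the event fields, procedure names, and derived metadata fields are hypothetical illustrations, not data structures from the disclosure.

```python
# Hypothetical sketch of the data collection and aggregation step: raw event
# records from multiple hubs are grouped by procedure, each event carrying a
# binary desired/undesired outcome, and self-describing metadata (counts,
# rates) is attached to each group.
from collections import defaultdict

def aggregate_events(raw_events):
    """Group raw hub events by procedure and summarize binary outcomes."""
    summary = defaultdict(lambda: {"desired": 0, "undesired": 0})
    for event in raw_events:
        # 'outcome' is the binary classification, e.g. a successful firing
        # (desired) vs. a misfire (undesired)
        summary[event["procedure"]][event["outcome"]] += 1
    return {
        proc: {
            **counts,
            "total": counts["desired"] + counts["undesired"],
            "undesired_rate": counts["undesired"]
            / (counts["desired"] + counts["undesired"]),
        }
        for proc, counts in summary.items()
    }

events = [
    {"procedure": "lobectomy", "outcome": "desired"},
    {"procedure": "lobectomy", "outcome": "undesired"},
    {"procedure": "gastrectomy", "outcome": "desired"},
]
report = aggregate_events(events)
```

The per-procedure rates are one example of aggregated metadata that the cloud could compute across hubs but that no single hub could compute alone.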

The resource optimization module 7020 may be configured to be able to analyze this aggregated data to determine an optimal usage of resources for a particular medical facility or group of medical facilities. For example, the resource optimization module 7020 may determine an optimal order point of surgical stapling instruments 7012 for a group of medical facilities based on the corresponding predicted demand for such instruments 7012. The resource optimization module 7020 may also assess the resource usage or other operational configurations of various medical facilities to determine whether resource usage can be improved. Similarly, the recommendation module 7030 may be configured to be able to analyze the aggregated organized data from the data collection and aggregation module 7022 to provide recommendations. For example, the recommendation module 7030 may recommend to a medical facility (e.g., a medical service provider, such as a hospital) that a particular surgical instrument 7012 should be upgraded to an improved version based on, for example, a higher-than-expected error rate. Additionally, the recommendation module 7030 and/or the resource optimization module 7020 may recommend better supply chain parameters, such as product reorder points, and provide recommendations of different surgical instruments 7012, uses thereof, or surgical steps to improve surgical outcomes. The medical facilities may receive such recommendations via their corresponding surgical hubs 7006. More specific recommendations regarding the parameters or configurations of various surgical instruments 7012 may also be provided. The hubs 7006 and/or the surgical instruments 7012 may also each have display screens that display data or recommendations provided by the cloud 7004.
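The order-point determination could be sketched with the classic reorder-point formula (expected demand over the resupply lead time plus a safety buffer); the usage figures below are hypothetical, and the formula is a standard inventory-management heuristic rather than one specified by the disclosure.

```python
# Illustrative reorder-point computation of the kind the resource
# optimization module might perform from predicted instrument demand.
def reorder_point(daily_usage, lead_time_days, safety_stock=0):
    """Expected demand over the resupply lead time plus a safety buffer."""
    return daily_usage * lead_time_days + safety_stock

# e.g. a facility consuming 4 staple cartridges/day, 5-day resupply lead
# time, and a 10-unit safety buffer (all invented numbers)
cartridges_rp = reorder_point(daily_usage=4, lead_time_days=5, safety_stock=10)
```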

The patient outcome analysis module 7028 may analyze surgical outcomes associated with the currently used operational parameters of the surgical instruments 7012. The patient outcome analysis module 7028 may also analyze and assess other potential operational parameters. In this connection, the recommendation module 7030 may recommend using these other potential operational parameters based on their yielding better surgical outcomes (such as better sealing or less bleeding). For example, the recommendation module 7030 may transmit recommendations to a surgical hub 7006 regarding when to use a particular cartridge with a corresponding stapling surgical instrument 7012. Thus, the cloud-based analytics system, while controlling for common variables, may be configured to be able to analyze the large collection of raw data and provide centralized recommendations for multiple medical facilities (advantageously determined based on the aggregated data). For example, the cloud-based analytics system may analyze, evaluate, and/or aggregate data based on the type of medical practice, type of patient, number of patients, geographic similarity between medical providers, which medical providers/facilities use similar types of instruments, and so forth, in ways that no individual medical facility alone could analyze independently.

The control program update module 7026 may be configured to be able to implement various recommendations for the surgical instruments 7012 when the corresponding control programs are updated. For example, the patient outcome analysis module 7028 may identify correlations linking particular control parameters to successful (or unsuccessful) outcomes. Such correlations may be addressed when updated control programs are transmitted to the surgical instruments 7012 via the control program update module 7026. Updates to the instruments 7012 transmitted via the corresponding hubs 7006 may be performed in conjunction with the aggregated data collected and analyzed by the data collection and aggregation module 7022 of the cloud 7004. Additionally, the patient outcome analysis module 7028 and the recommendation module 7030 may identify improved methods of using the instruments 7012 based on the aggregated performance data.

The cloud-based analysis system may include security features implemented by the cloud 7004. These security features may be managed by the authorization and security module 7024. Each surgical hub 7006 may have associated unique credentials, such as a username, password, and other suitable security credentials. These credentials may be stored in memory 7010 and associated with the allowed cloud access levels. For example, based on providing accurate credentials, the surgical hub 7006 may be granted access to communicate with the cloud to a predetermined degree (e.g., may only participate in transmitting or receiving certain defined types of information). To this end, the database 7011 of aggregated medical data of the cloud 7004 may include a database of authorization credentials for verifying the accuracy of the provisioned credentials. Different credentials may be associated with different levels of permission to interact with cloud 7004, such as a predetermined level of access for receiving data analytics generated by cloud 7004.

Further, for security purposes, the cloud may maintain a database of the hubs 7006, instruments 7012, and other devices, which may include a "blacklist" of prohibited devices. In particular, a blacklisted surgical hub 7006 may not be permitted to interact with the cloud, while a blacklisted surgical instrument 7012 may not have functional access to a corresponding hub 7006 and/or may be prevented from fully functioning when paired with its corresponding hub 7006. Additionally or alternatively, the cloud 7004 can flag instruments 7012 based on incompatibility or other specified criteria. In this way, counterfeit medical devices and improper reuse of such devices throughout the cloud-based analytics system can be identified and addressed.

The surgical instruments 7012 may use wireless transceivers to transmit wireless signals that may represent, for example, authorization credentials for access to the corresponding hubs 7006 and the cloud 7004. Wired transceivers may also be used to transmit signals. Such authorization credentials may be stored in the respective memory devices of the surgical instruments 7012. The authorization and security module 7024 may determine whether the authorization credentials are accurate or counterfeit. The authorization and security module 7024 may also dynamically generate authorization credentials for enhanced security. The credentials may also be encrypted, such as by using hash-based encryption. Upon transmitting the appropriate authorization credentials, the surgical instruments 7012 may transmit a signal to the corresponding hubs 7006, and ultimately to the cloud 7004, to indicate that the instruments 7012 are ready to obtain and transmit medical data. In response, the cloud 7004 may transition to a state enabled for receiving the medical data for storage into the aggregated medical data database 7011. Readiness for this data transmission could be indicated, for example, by a light indicator on the instruments 7012. The cloud 7004 can also transmit signals to the surgical instruments 7012 for updating their associated control programs. The cloud 7004 can transmit signals directed to a particular class of surgical instruments 7012 (e.g., electrosurgical instruments) so that software updates to control programs are transmitted only to the appropriate surgical instruments 7012. Moreover, the cloud 7004 can be used to implement system-wide solutions to address local or global problems based on selective data transmission and authorization credentials. For example, if a group of surgical instruments 7012 is identified as having a common manufacturing defect, the cloud 7004 may change the authorization credentials corresponding to that group to implement an operational lockout of the group.
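The credential verification step might look like the following minimal sketch, in which the disclosure's "hash-based encryption" is modeled as salted SHA-256 hashing with constant-time comparison; the salt, key values, and function names are invented for illustration.

```python
# Hypothetical sketch: the cloud stores only a salted hash of each
# instrument credential and verifies a presented credential against it.
import hashlib
import hmac

def hash_credential(secret: bytes, salt: bytes) -> bytes:
    """Salted SHA-256 digest of the credential (only this is stored)."""
    return hashlib.sha256(salt + secret).digest()

def verify_credential(presented: bytes, salt: bytes, stored_hash: bytes) -> bool:
    # constant-time comparison resists timing attacks on verification
    return hmac.compare_digest(hash_credential(presented, salt), stored_hash)

salt = b"fixed-demo-salt"  # a real system would use a random per-device salt
stored = hash_credential(b"instr-7012-key", salt)
```

A counterfeit instrument presenting a wrong key would fail `verify_credential`, matching the accurate-vs-counterfeit determination attributed to the authorization and security module.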

The cloud-based analytics system may enable monitoring of multiple medical facilities (e.g., healthcare facilities such as hospitals) to determine improved practices and recommend changes accordingly (e.g., via the recommendation module 7030). Thus, the processors 7008 of the cloud 7004 may analyze data associated with an individual medical facility to identify the facility and aggregate that data with other data associated with other medical facilities in a group. Groups may be defined, for example, based on similar operating practices or geographic location. In this way, the cloud 7004 can provide analysis and recommendations across a group of medical facilities. The cloud-based analytics system may also be used to enhance situational awareness. For example, the processors 7008 may predictively model the effects of recommendations on the costs and effectiveness of a particular facility (relative to overall operations and/or various medical procedures). The costs and effectiveness associated with that particular facility may also be compared with the corresponding local region of other facilities or with any other comparable facilities.

The data classification and prioritization module 7032 may prioritize and classify data based on criticality (e.g., the severity, unexpectedness, or suspiciousness of a medical event associated with the data). Such classification and prioritization may be used in conjunction with the functions of the other data analysis modules 7034 described above to improve the cloud-based analytics and operations described herein. For example, the data classification and prioritization module 7032 may assign priorities to the data analysis performed by the data collection and aggregation module 7022 and the patient outcome analysis module 7028. Different priority levels may trigger particular responses from the cloud 7004 (corresponding to levels of urgency), such as escalation to an expedited response, special handling, exclusion from the aggregated medical data database 7011, or other suitable responses. Further, if desired, the cloud 7004 can transmit requests (e.g., push messages) for additional data from the corresponding surgical instruments 7012 through the hub application server. A push message may cause a notification requesting supporting or additional data to be displayed on the corresponding hub 7006. Such a push message may be warranted in situations where the cloud detects a significant irregularity or anomaly and cannot determine its cause. The central servers 7013 may be programmed to trigger such a push message in certain significant circumstances, such as when data is determined to differ from an expected value by more than a predetermined threshold, or when security appears to have been compromised, for example.
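The threshold-based trigger described above (an escalated response when data deviates from an expected value by more than a predetermined threshold) can be sketched as a simple priority classifier; the tier names and thresholds are illustrative assumptions.

```python
# Hypothetical sketch of criticality classification: the deviation of an
# incoming value from its expected value selects a response tier.
def classify_priority(value, expected, warn_threshold, critical_threshold):
    """Map deviation from the expected value to a response tier."""
    deviation = abs(value - expected)
    if deviation > critical_threshold:
        return "critical"   # e.g. trigger a push message for more data
    if deviation > warn_threshold:
        return "elevated"   # e.g. special handling in the analysis queue
    return "routine"        # normal aggregation and analysis
```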

Additional details regarding the cloud analytics system can be found in U.S. provisional patent application No. 62/659,900, entitled "METHOD OF HUB COMMUNICATION," filed April 19, 2018, which is incorporated herein by reference in its entirety.

Situational awareness

While a "smart" device that includes a control algorithm responsive to sensed data may be an improvement over a "dumb" device that operates without regard to sensed data, some sensed data can be incomplete or indeterminate when considered in isolation, i.e., without the context of the type of surgical procedure being performed or the type of tissue being operated upon. Without knowing the surgical context (e.g., the type of tissue being operated on or the type of procedure being performed), the control algorithm may control the modular device incorrectly or suboptimally given the particular context-free sensed data. For example, the optimal manner in which a control algorithm controls a surgical instrument in response to a particular sensed parameter may vary depending on the particular tissue type being operated on. This is because different tissue types have different properties (e.g., resistance to tearing) and thus respond differently to actions taken by the surgical instrument. Therefore, it may be desirable for a surgical instrument to take different actions even when the same measurement for a particular parameter is sensed. As one particular example, the optimal manner in which a surgical stapling and severing instrument is controlled in response to the instrument sensing an unexpectedly high force to close its end effector will vary depending on whether the tissue type is susceptible or resistant to tearing. For tissue that tears easily (such as lung tissue), the instrument's control algorithm would optimally ramp the motor speed down in response to an unexpectedly high closure force to avoid tearing the tissue. For tissue that resists tearing (such as stomach tissue), the instrument's control algorithm would optimally ramp the motor speed up in response to an unexpectedly high closure force to ensure that the end effector is properly clamped on the tissue. Without knowing whether lung or stomach tissue has been clamped, the control algorithm may make a suboptimal decision.
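The tissue-dependent closure response described above can be sketched as follows; the tissue labels and speed factors are illustrative assumptions, not actual device parameters.

```python
# Sketch of context-dependent motor control: on an unexpectedly high
# closure force, tear-prone tissue gets a slower motor, tear-resistant
# tissue a faster one. Speed factors are invented for illustration.
def adjust_motor_speed(current_speed, tissue_type, high_closure_force):
    if not high_closure_force:
        return current_speed
    if tissue_type == "lung":       # tear-prone: ramp speed down
        return current_speed * 0.5
    if tissue_type == "stomach":    # tear-resistant: ramp speed up
        return current_speed * 1.5
    return current_speed            # unknown context: leave unchanged
```

The final branch is the point of the passage: without knowing the tissue type, the algorithm cannot choose between the two opposite responses.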

One solution utilizes a surgical hub that includes a system configured to be able to derive information about the surgical procedure being performed based on data received from various data sources and then control paired modular devices accordingly. In other words, the surgical hub is configured to infer information about the surgical procedure from the received data and then control the modular devices paired with the surgical hub based on the inferred context of the surgical procedure. Fig. 14 illustrates a diagram of a situationally aware surgical system 5100 in accordance with at least one aspect of the present disclosure. The data sources 5126 include, for example, the modular devices 5102 (which may include sensors configured to be able to detect parameters associated with the patient and/or the modular device itself), databases 5122 (e.g., an EMR database containing patient records), and patient monitoring devices 5124 (e.g., a blood pressure (BP) monitor and an electrocardiogram (EKG) monitor).

The surgical hub 5104 (which may be similar in many respects to the hub 106) may be configured to be able to derive contextual information pertaining to the surgical procedure from the data, e.g., based on the particular combination of received data or the particular order in which the data is received from the data sources 5126. The contextual information inferred from the received data may include, for example, the type of surgical procedure being performed, the particular step of the surgical procedure the surgeon is performing, the type of tissue being operated on, or the body cavity that is the subject of the procedure. This ability of some aspects of the surgical hub 5104 to derive or infer information pertaining to the surgical procedure from the received data can be referred to as "situational awareness." In one example, the surgical hub 5104 may incorporate a situational awareness system, which is the hardware and/or programming associated with the surgical hub 5104 that derives contextual information pertaining to the surgical procedure from the received data.

The situational awareness system of the surgical hub 5104 may be configured to be able to derive contextual information from the data received from the data sources 5126 in a variety of different ways. In one example, the situational awareness system includes a pattern recognition system or machine learning system (e.g., an artificial neural network) that has been trained on training data to correlate various inputs (e.g., data from the databases 5122, the patient monitoring devices 5124, and/or the modular devices 5102) to corresponding contextual information regarding a surgical procedure. In other words, a machine learning system can be trained to accurately derive contextual information regarding a surgical procedure from the provided inputs. In another example, the situational awareness system can include a lookup table storing pre-characterized contextual information regarding the surgical procedure in association with one or more inputs (or ranges of inputs) corresponding to the contextual information. In response to a query with one or more inputs, the lookup table can return the corresponding contextual information for the situational awareness system to use in controlling the modular devices 5102. In one example, the contextual information received by the situational awareness system of the surgical hub 5104 is associated with a particular control adjustment or set of control adjustments for one or more modular devices 5102. In another example, the situational awareness system includes a further machine learning system, lookup table, or other such system that generates or retrieves one or more control adjustments for the one or more modular devices 5102 when provided the contextual information as input.
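The lookup-table variant described above can be sketched as follows; the device combinations, procedure names, and adjustment labels are illustrative assumptions and not from the disclosure.

```python
# Minimal sketch of the lookup-table approach: a tuple of observed inputs
# maps to pre-characterized contextual information, including the control
# adjustments associated with that context.
CONTEXT_TABLE = {
    ("insufflator", "thoracic_imaging"): {
        "procedure": "VATS segmentectomy",
        "adjustments": {"stapler_motor": "ramp_down_on_high_force"},
    },
    ("insufflator", "abdominal_imaging"): {
        "procedure": "laparoscopic gastrectomy",
        "adjustments": {"stapler_motor": "ramp_up_on_high_force"},
    },
}

def infer_context(observed_inputs):
    """Return the pre-characterized context for the inputs, or None."""
    return CONTEXT_TABLE.get(tuple(observed_inputs))

ctx = infer_context(["insufflator", "thoracic_imaging"])
```

A trained classifier could replace the dictionary lookup while keeping the same interface: inputs in, contextual information and control adjustments out.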

The surgical hub 5104 incorporating the situational awareness system provides a number of benefits to the surgical system 5100. One benefit includes improved interpretation of sensed and collected data, which in turn improves the accuracy of processing and/or use of the data during the course of the surgical procedure. Returning to the previous example, the situationally aware surgical hub 5104 can determine what type of tissue is being operated on; accordingly, when an unexpectedly high force to close the end effector of the surgical instrument is detected, the situationally aware surgical hub 5104 can correctly ramp the surgical instrument's motor speed up or down for the tissue type.

As another example, the type of tissue being operated on may affect the adjustment of the compressibility and loading thresholds of the surgical stapling and severing instrument for a particular tissue gap measurement. The situational aware surgical hub 5104 can infer whether the surgical procedure being performed is a chest procedure or an abdominal procedure, allowing the surgical hub 5104 to determine whether the tissue held by the end effector of the surgical stapling and severing instrument is lung tissue (for chest procedures) or stomach tissue (for abdominal procedures). The surgical hub 5104 can then adjust the compression rate and load thresholds of the surgical stapling and severing instrument as appropriate for the type of tissue.

As yet another example, the type of body cavity being operated in during an insufflation procedure can affect the function of a smoke extractor. The situationally aware surgical hub 5104 can determine whether the surgical site is under pressure (by determining that the surgical procedure is utilizing insufflation) and determine the procedure type. Since a type of procedure is generally performed within a particular body cavity, the surgical hub 5104 can then control the motor speed of the smoke extractor appropriately for the body cavity being operated in. Thus, the situationally aware surgical hub 5104 can provide a consistent amount of smoke evacuation for both thoracic and abdominal procedures.

As yet another example, the type of procedure being performed can affect the optimal energy level at which an ultrasonic surgical instrument or radio frequency (RF) electrosurgical instrument operates. Arthroscopic procedures, for example, require higher energy levels because the end effector of the ultrasonic surgical instrument or RF electrosurgical instrument is immersed in fluid. The situationally aware surgical hub 5104 can determine whether the surgical procedure is an arthroscopic procedure. The surgical hub 5104 can then adjust the RF power level or the ultrasonic amplitude of the generator (i.e., the "energy level") to compensate for the fluid-filled environment. Relatedly, the type of tissue being operated on can affect the optimal energy level at which an ultrasonic surgical instrument or RF electrosurgical instrument operates. The situationally aware surgical hub 5104 can determine what type of surgical procedure is being performed and then customize the energy level of the ultrasonic surgical instrument or RF electrosurgical instrument according to the expected tissue profile for the surgical procedure. Furthermore, the situationally aware surgical hub 5104 can be configured to be able to adjust the energy level of the ultrasonic surgical instrument or RF electrosurgical instrument throughout the course of the surgical procedure, rather than just on a procedure-by-procedure basis. The situationally aware surgical hub 5104 can determine what step of the surgical procedure is being performed or will subsequently be performed and then update the control algorithms for the generator and/or the ultrasonic surgical instrument or RF electrosurgical instrument to set the energy level at a value appropriate for the expected tissue type according to the surgical procedure step.
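A minimal sketch of the procedure-dependent energy selection follows; the baseline level and the fluid-compensation factor are hypothetical values for illustration, not clinical settings.

```python
# Hypothetical sketch: the hub raises the generator energy level for
# arthroscopic procedures because the end effector is immersed in fluid.
def generator_energy(procedure_type, base_level=40):
    """Select an energy level for the inferred procedure type."""
    if procedure_type == "arthroscopic":
        # compensate for energy dissipated into the surrounding fluid
        return base_level * 1.5
    return base_level
```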

As yet another example, data from additional data sources 5126 can be used to improve the conclusions that the surgical hub 5104 draws from a single data source 5126. The situationally aware surgical hub 5104 can augment the data that it receives from the modular devices 5102 with contextual information about the surgical procedure that has been constructed from the other data sources 5126. For example, the situationally aware surgical hub 5104 may be configured to be able to determine whether hemostasis has occurred (i.e., whether bleeding at a surgical site has stopped) based on video or image data received from a medical imaging device. However, in some cases the video or image data can be inconclusive. Therefore, in one example, the surgical hub 5104 can be further configured to be able to compare a physiologic measurement (e.g., blood pressure sensed by a BP monitor communicably connected to the surgical hub 5104) with the visual or image data of hemostasis (e.g., from a medical imaging device 124 (fig. 2) communicably coupled to the surgical hub 5104) to make a determination on the integrity of the staple line or tissue weld. In other words, the situational awareness system of the surgical hub 5104 can consider the physiologic measurement data to provide additional context in analyzing the visualization data. The additional context can be useful when the visualization data may be inconclusive or incomplete on its own.
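The combination of ambiguous imaging data with a physiologic measurement could be sketched as follows; the score scale, thresholds, and stability criterion are invented for illustration and carry no clinical meaning.

```python
# Hypothetical sketch of multi-source fusion for the hemostasis question:
# a conclusive image decides alone; an ambiguous image defers to whether
# the blood pressure trend is stable.
def hemostasis_achieved(visual_score, bp_trend):
    """visual_score: 0 (no bleeding seen) .. 1 (active bleeding seen);
    bp_trend: change in systolic BP over the last interval (mmHg)."""
    if visual_score < 0.2:
        return True                  # imaging alone is conclusive
    if visual_score < 0.6 and bp_trend >= 0:
        return True                  # ambiguous image, but BP is stable
    return False                     # clear bleeding, or falling BP
```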

Another benefit includes actively and automatically controlling the paired modular devices 5102 according to the particular step of the surgical procedure being performed to reduce the number of times medical personnel need to interact with or control the surgical system 5100 during the surgical procedure. For example, if the situation-aware surgical hub 5104 determines that a subsequent step of the procedure requires the use of an RF electrosurgical instrument, it may actively activate a generator connected to the instrument. Actively activating the energy source allows the instrument to be ready for use as soon as the previous step of the procedure is completed.

As another example, the situation aware surgical hub 5104 may determine whether a different view or degree of magnification on the display is required for the current or subsequent step of the surgical procedure based on the feature(s) that the surgeon expects to need to view at the surgical site. The surgical hub 5104 may then actively change the displayed view accordingly (e.g., provided by the medical imaging device for the visualization system 108), such that the display is automatically adjusted throughout the surgical procedure.

As yet another example, the situationally aware surgical hub 5104 can determine which step of the surgical procedure is being performed or will be performed subsequently and whether particular data or comparisons between data are required for that step of the surgical procedure. The surgical hub 5104 can be configured to be able to automatically call up data screens based on the step of the surgical procedure being performed, without waiting for the surgeon to request that particular information.

Another benefit includes checking for errors during the setup of the surgical procedure or during the course of the surgical procedure. For example, the situationally aware surgical hub 5104 can determine whether the operating room is set up properly or optimally for the surgical procedure to be performed. The surgical hub 5104 can be configured to be able to determine the type of surgical procedure being performed, retrieve (e.g., from memory) the corresponding manifest, product location, or setup requirements, and then compare the current operating room layout to the standard layout that the surgical hub 5104 determines for the type of surgical procedure being performed. In one example, the surgical hub 5104 can be configured to be able to compare a list of items for the procedure (e.g., scanned by a suitable scanner) and/or a list of devices paired with the surgical hub 5104 to a recommended or expected list of items and/or devices for the given surgical procedure. If there is any discrepancy between the lists, the surgical hub 5104 can be configured to be able to provide an alert indicating that a particular modular device 5102, patient monitoring device 5124, and/or other surgical item is missing. In one example, the surgical hub 5104 can be configured to be able to determine the relative distance or position of the modular devices 5102 and patient monitoring devices 5124, e.g., via proximity sensors. The surgical hub 5104 can compare the relative positions of the devices to a recommended or expected layout for the particular surgical procedure. If there is any discrepancy between the layouts, the surgical hub 5104 can be configured to be able to provide an alert indicating that the current layout for the surgical procedure deviates from the recommended layout.
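The manifest comparison can be sketched as a simple set difference between the expected and scanned lists; the item names are hypothetical.

```python
# Sketch of the setup check: compare scanned items (and/or paired devices)
# against the expected list for the procedure and report what is missing.
def missing_items(expected_items, scanned_items):
    """Return the expected items not found among the scanned items."""
    return sorted(set(expected_items) - set(scanned_items))

expected = ["stapler", "cartridge_45mm", "smoke_extractor", "insufflator"]
scanned = ["stapler", "insufflator", "smoke_extractor"]
alerts = missing_items(expected, scanned)  # basis for a missing-item alert
```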

As another example, the situational awareness surgical hub 5104 can determine whether the surgeon (or other medical personnel) is making mistakes or otherwise deviating from the expected course of action during the surgical procedure. For example, the surgical hub 5104 may be configured to be able to determine the type of surgical procedure being performed, retrieve (e.g., from memory) a corresponding list of steps or order of device usage, and then compare the steps being performed or the devices being used during the surgical procedure to the expected steps or devices determined by the surgical hub 5104 for the type of surgical procedure being performed. In one example, the surgical hub 5104 may be configured to provide an alert indicating that an unexpected action is being performed or an unexpected device is being utilized at a particular step in the surgical procedure.
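The step-sequence check described above can be sketched as follows; the device names and the alert message format are illustrative assumptions.

```python
# Sketch of deviation detection: the hub compares the device actually used
# at each procedure step against the expected sequence retrieved from
# memory and produces an alert on a mismatch.
def check_step(expected_sequence, step_index, observed_device):
    """Return None if the observed device matches the expected one for
    this step, otherwise an alert string describing the deviation."""
    expected_device = expected_sequence[step_index]
    if observed_device == expected_device:
        return None
    return (f"Step {step_index + 1}: expected {expected_device}, "
            f"observed {observed_device}")
```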

In general, the situational awareness system for the surgical hub 5104 improves surgical results by adjusting the surgical instruments (and other modular devices 5102) for the particular context of each surgical procedure, such as for different tissue types, and verifying actions during the surgical procedure. The situational awareness system also improves the surgeon's efficiency in performing the surgical procedure by automatically suggesting next steps, providing data, and adjusting the display and other modular devices 5102 in the operating room, depending on the particular context of the procedure.

Referring now to fig. 15, a timeline 5200 depicting situational awareness of a hub, such as the surgical hub 106 or 206 (figs. 1-11), is shown. The timeline 5200 illustrates a surgical procedure and the contextual information that the surgical hub 106, 206 can derive from the data received from the data sources at each step of the surgical procedure. The timeline 5200 depicts the typical steps that nurses, surgeons, and other medical personnel would take during the course of a lung segmentectomy procedure, beginning with setting up the operating room and ending with transferring the patient to a post-operative recovery room.

The situationally aware surgical hub 106, 206 receives data from the data sources throughout the course of the surgical procedure, including data generated each time medical personnel utilize a modular device that is paired with the surgical hub 106, 206. The surgical hub 106, 206 can receive this data from the paired modular devices and other data sources and continually derive inferences (i.e., contextual information) about the ongoing procedure as new data is received, such as which step of the procedure is being performed at any given time. The situational awareness system of the surgical hub 106, 206 is able to, for example, record data pertaining to the procedure for generating reports, verify the steps being taken by the medical personnel, provide data or prompts (e.g., via a display screen) that may be pertinent to the particular procedural step, adjust modular devices based on the context (e.g., activate monitors, adjust the field of view (FOV) of a medical imaging device, or change the energy level of an ultrasonic surgical instrument or RF electrosurgical instrument), and take any other such action described above.

As a first step 5202 in this exemplary procedure, the hospital staff retrieves the patient's EMR from the hospital's EMR database. Based on the selected patient data in the EMR, the surgical hub 106, 206 determines that the procedure to be performed is a chest procedure.

In a second step 5204, the staff scans the incoming medical supplies for the procedure. The surgical hub 106, 206 cross-references the scanned supplies with a list of supplies utilized in various types of procedures and confirms that the supplied mixture corresponds to a chest procedure. In addition, the surgical hub 106, 206 may also be able to determine that the procedure is not a wedge procedure (because the incoming supplies lack some of the supplies required for a chest wedge procedure, or otherwise do not correspond to a chest wedge procedure).

In a third step 5206, medical personnel scan the patient belt via a scanner communicatively connected to the surgical hub 106, 206. The surgical hub 106, 206 may then confirm the identity of the patient based on the scanned data.

Fourth, the medical staff opens the ancillary equipment 5208. The ancillary equipment utilized may vary depending on the type of surgery and the technique to be used by the surgeon, but in this exemplary case they include smoke ejectors, insufflators, and medical imaging devices. When activated, the auxiliary device as a modular device may be automatically paired with a surgical hub 106, 206 located in a specific vicinity of the modular device as part of its initialization process. The surgical hub 106, 206 may then derive contextual information about the surgical procedure by detecting the type of modular device with which it is paired during the pre-operative or initialization phase. In this particular example, the surgical hub 106, 206 determines that the surgical procedure is a VATS procedure based on the particular combination of paired modular devices. Based on a combination of data from the patient's EMR, a list of medical supplies used in the procedure, and the type of modular device connected to the hub, the surgical hub 106, 206 can generally infer the particular procedure that the surgical team will perform. Once the surgical hub 106, 206 knows what specific procedure is being performed, the surgical hub 106, 206 may retrieve the steps of the procedure from memory or cloud and then cross-reference the data it subsequently receives from the connected data sources (e.g., modular devices and patient monitoring devices) to infer what steps of the surgical procedure are being performed by the surgical team.

In a fifth step 5210, the practitioner attaches EKG electrodes and other patient monitoring devices to the patient. EKG electrodes and other patient monitoring devices can be paired with the surgical hubs 106, 206. When the surgical hub 106, 206 begins to receive data from the patient monitoring device, the surgical hub 106, 206 thus confirms that the patient is in the operating room.

Sixth step 5212, the medical personnel induce anesthesia in the patient. The surgical hub 106, 206 may infer that the patient is under anesthesia based on data from the modular device and/or the patient monitoring device, including, for example, EKG data, blood pressure data, ventilator data, or a combination thereof. Upon completion of the sixth step 5212, the pre-operative portion of the lung segmentation resection procedure is completed and the operative portion begins.

In a seventh step 5214, the patient's lungs being operated on are collapsed (while ventilation is switched to the contralateral lungs). For example, the surgical hub 106, 206 may infer from the ventilator data that the patient's lungs have collapsed. The surgical hub 106, 206 may infer that the surgical portion of the procedure has begun because it may compare the detection of the patient's lung collapse to the expected steps of the procedure (which may have been previously visited or retrieved), thereby determining that collapsing the lungs is the first surgical step in that particular procedure.

In an eighth step 5216, a medical imaging device (e.g., an endoscope) is inserted and video from the medical imaging device is initiated. The surgical hub 106, 206 receives medical imaging device data (i.e., video or image data) through its connection to the medical imaging device. After receiving the medical imaging device data, the surgical hub 106, 206 may determine that a laparoscopic portion of the surgical procedure has begun. In addition, the surgical hub 106, 206 may determine that the particular procedure being performed is a segmental resection, rather than a leaf resection (note that wedge procedures have been excluded based on the data received by the surgical hub 106, 206 at the second step 5204 of the procedure). Data from the medical imaging device 124 (fig. 2) may be used to determine contextual information relating to the type of procedure being performed in a number of different ways, including by determining the angle of visualization orientation of the medical imaging device relative to the patient anatomy, monitoring the number of medical imaging devices utilized (i.e., activated and paired with the surgical hub 106, 206), and monitoring the type of visualization devices utilized. For example, one technique for performing a VATS lobectomy places a camera in the lower anterior corner of the patient's chest above the septum, while one technique for performing a VATS segmental resection places the camera in an anterior intercostal location relative to the segmental cleft. For example, using pattern recognition or machine learning techniques, the situational awareness system may be trained to recognize the positioning of the medical imaging device from a visualization of the patient's anatomy. As another example, one technique for performing VATS lobectomy utilizes a single medical imaging device, while another technique for performing VATS segmental resection utilizes multiple cameras. 
As another example, one technique for performing a VATS segmental resection utilizes an infrared light source (which may be communicatively coupled to a surgical hub as part of a visualization system) to visualize segmental fissures that are not used in a VATS pulmonary resection. By tracking any or all of this data from the medical imaging device, the surgical hub 106, 206 can thus determine the particular type of surgical procedure being performed and/or the technique being used for a particular type of surgical procedure.

Ninth step 5218, the surgical team begins the dissection step of the procedure. The surgical hub 106, 206 may infer that the surgeon is dissecting to mobilize the patient's lungs because it receives data from the RF generator or ultrasound generator indicating that the energy instrument is being fired. The surgical hub 106, 206 may intersect the received data with the retrieved steps of the surgical procedure to determine that the energy instrument fired at that point in the procedure (i.e., after completion of the previously discussed surgical steps) corresponds to an anatomical step. In some cases, the energy instrument may be an energy tool mounted to a robotic arm of a robotic surgical system.

In a tenth step 5220, the surgical team continues with the surgical ligation step. The surgical hub 106, 206 may infer that the surgeon is ligating arteries and veins because it receives data from the surgical stapling and severing instrument indicating that the instrument is being fired. Similar to the previous steps, the surgical hub 106, 206 may deduce the inference by cross-referencing the receipt of data from the surgical stapling and severing instrument with the retrieval steps in the procedure. In some cases, the surgical instrument may be a surgical tool mounted to a robotic arm of a robotic surgical system.

Eleventh step 5222, a segmental resection portion of the procedure is performed. The surgical hub 106, 206 may infer that the surgeon is transecting soft tissue based on data from the surgical stapling and severing instrument, including data from its cartridge. The cartridge data may correspond to, for example, the size or type of staples fired by the instrument. Since different types of staples are used for different types of tissue, the cartridge data can indicate the type of tissue being stapled and/or transected. In this case, the type of staple fired is for soft tissue (or other similar tissue type), which allows the surgical hub 106, 206 to infer that the segmental resection portion of the procedure is in progress.

In a twelfth step 5224, a node dissection step is performed. The surgical hub 106, 206 may infer that the surgical team is dissecting a node and performing a leak test based on data received from the generator indicating that the RF or ultrasonic instrument is being fired. For this particular procedure, the RF or ultrasound instruments utilized after transecting the soft tissue correspond to a nodal dissection step that allows the surgical hub 106, 206 to make such inferences. It should be noted that the surgeon periodically switches back and forth between the surgical stapling/severing instrument and the surgical energy (i.e., RF or ultrasonic) instrument depending on the particular step in the procedure, as different instruments are better suited to the particular task. Thus, the particular sequence in which the stapling/severing instrument and the surgical energy instrument are used may dictate the steps of the procedure being performed by the surgeon. Further, in some cases, robotic implements may be used for one or more steps in a surgical procedure, and/or hand-held surgical instruments may be used for one or more steps in a surgical procedure. One or more surgeons may alternate and/or may use the device simultaneously, for example, between a robotic tool and a hand-held surgical instrument. Upon completion of the twelfth step 5224, the incision is closed and the post-operative portion of the procedure is initiated.

A thirteenth step 5226, reverse anesthetizing the patient. For example, the surgical hub 106, 206 may infer that the patient is waking up from anesthesia based on, for example, ventilator data (i.e., the patient's breathing rate begins to increase).

Finally, a fourteenth step 5228 is for the medical personnel to remove various patient monitoring devices from the patient. Thus, when the hub loses EKG, BP, and other data from the patient monitoring device, the surgical hub 106, 206 may infer that the patient is being transferred to a recovery room. As can be seen from the description of this exemplary procedure, the surgical hub 106, 206 may determine or infer when each step of a given surgical procedure occurs from data received from various data sources communicatively coupled to the surgical hub 106, 206.

Situational awareness is further described in U.S. provisional patent application serial No. 62/659,900 entitled "METHOD OF humomultination" filed on 19/4/2018, which is incorporated herein by reference in its entirety. In certain instances, operation of the robotic surgical system (including, for example, the various robotic surgical systems disclosed herein) may be controlled by the hub 106, 206 based on its situational awareness and/or feedback from its components and/or based on information from the cloud 104.

Surgical assessment

In some aspects, the computer system described herein is programmed to evaluate surgical personnel (e.g., how the surgical personnel are using the surgical instrument) during a surgical procedure and to suggest suggestions to improve the surgical personnel's technique or actions. In one aspect, a computer system described herein, such as surgical hubs 106, 206 (fig. 1-11), can be programmed to analyze the surgeon and/or other surgical personnel's technical, physical characteristics and/or performance relative to a baseline. In addition, the computer system can be programmed to provide notifications or prompts indicating when surgical personnel have deviated from the baseline so that the surgical personnel can change their actions and optimize their performance or technique. In some aspects, the notification may include a warning that the surgical personnel is not utilizing the appropriate technique (which may also include recommendations regarding corrective actions that the surgical personnel may take to address their technique), suggestions for alternative surgical products, statistical information regarding the correlation between surgical variables (e.g., the time it takes to complete the procedure) and the monitored physical characteristics of the surgical personnel, comparisons between surgeons, and the like. In various aspects, the notification or recommendation may be provided in real-time (e.g., in the operating room during surgery) or in a post-operative report. Thus, the computer system may be programmed to automatically analyze and compare the skill and instrument use skills of the worker.

Fig. 16 is a diagram of an exemplary operating room setup in accordance with at least one aspect of the present disclosure. In various implementations, the surgical hub 211801 may be connected via a communication protocol (e.g., bluetooth) to the various one OR more cameras 211802, surgical instruments 211810, displays 211806, and other surgical devices within the OR211800, as described above under the heading "surgical hub. The camera 211802 may be oriented to capture images and/or video of the surgical personnel 211803 during the course of a surgical procedure. Thus, the surgical hub 211801 may receive captured image and/or video data from the camera 211802 to visually analyze technical or physical characteristics of the surgical personnel 211803 during the surgical procedure.

Fig. 17 is a logic flow diagram of a process 211000 for visually evaluating surgical personnel in accordance with at least one aspect of the present disclosure. In the following description of process 211000, reference should also be made to fig. 10 and 16. The process 211000 may be performed by a processor or control circuitry of a computer system, such as the processor 244 of the surgical hub 206 shown in fig. 10. Thus, the process 211000 may be embodied as a set of computer-executable instructions stored in the memory 249 that, when executed by the processor 244, cause a computer system (e.g., the surgical hub 211801) to perform the described steps.

As described above under the heading "SURGICAL hub (SURGICAL hub)", a computer system, such as the SURGICAL hub 211801, may be connected to or paired with a variety of SURGICAL devices, such as SURGICAL instruments, generators, smoke ejectors, displays, and the like. Through their connection to these surgical devices, the surgical hub 211801 may receive an array of perioperative data from these pairs of surgical devices as they are used during a surgical procedure. Further, as described above under the heading "situational awareness" (sitatinoala walense), the surgical hub 211801 may determine the context of the surgical procedure being performed (e.g., the type of procedure or the surgical steps being performed) based at least in part on the perioperative data received from these connected surgical devices. Accordingly, the processor 244 executing the process 211000 receives 211002 perioperative data from one or more surgical devices connected to or paired with the surgical hub 211801 and determines 211004 a surgical context based at least in part on the received perioperative data utilizing situational awareness. The surgical context determined by situational awareness by the surgical hub 211801 may be used to inform the assessment of the surgical personnel performing the surgical procedure.

Thus, the processor 244 captures 211006 one OR more images of the surgical personnel performing the surgical procedure via, for example, the camera 211802 positioned within the OR 211800. The captured image or images may comprise still images or moving images (i.e., video). Images of the surgical personnel may be captured at a variety of angles and magnifications using different filters and the like. In one implementation, the cameras 211802 are arranged within the OR211800 so that they can collectively visualize each surgical personnel performing the procedure.

Accordingly, the processor 244 determines 211008 the physical characteristics of the one or more surgical personnel from the captured one or more images. For example, the physical characteristics may include a pose (as discussed in connection with fig. 18-19) or a wrist angle (as discussed in connection with fig. 20-21). In other implementations, the physical characteristic may include a position, orientation, angle, or rotation of the individual's head, shoulders, torso, elbows, legs, hips, etc. The physical characteristics may be determined 211008 using a variety of machine vision, image processing, object recognition, and optical tracking techniques. In one aspect, the physical characteristic may be determined 211008 by: the captured image is processed to detect edges of objects in the image and the detected image is compared to a template of the body part being evaluated. Once the evaluated body part has been identified, its position, orientation, and other characteristics may be tracked by comparing the motion of the tracked body part relative to the known position of the camera 211802. In another aspect, a tag-based optical system may be utilized to determine 211008 a physical characteristic (e.g., an active tag embedded in the uniform of a surgical person that emits electromagnetic radiation or other signals that may be received by camera 211802 or other sensors connected to surgical hub 211801). By tracking the movement of the markers relative to the camera 211802, the processor 244 may thus determine the corresponding position and orientation of the body part.

Thus, the processor 244 evaluates 211010 the determined physical characteristics of the surgical personnel based on the baseline. In one aspect, the baseline may correspond to a surgical context determined via situational awareness. The processor 244 may retrieve the baselines for the various physical characteristics from a memory (e.g., memory 249 shown in fig. 10), for example, according to a given surgical context. The baseline may include a value or range of values for a particular physical characteristic to be tracked during a particular surgical context. The type of physical characteristic evaluated in the different surgical contexts may be the same or unique for each particular surgical context.

In one aspect, the processor 244 may provide feedback to the surgical personnel in real time during the surgical procedure. The real-time feedback may include graphical notifications OR recommendations displayed on the display 211806 within the OR211800, audio feedback transmitted by the surgical hub 211801 OR surgical instrument 211810, OR the like. In addition, feedback may include trocar port placement offsets, surgical instrument movement from one trocar port to another, adjusting the positioning of the patient being operated on (e.g., positioning or rolling at an increased surgical table angle), and other such recommendations to improve access to the surgical site and minimize non-ideal surgical techniques exhibited by the surgical personnel. In another aspect, the processor 244 may provide post-operative feedback to the surgical personnel. Post-operative feedback may include a graphical overlay or notification displayed on the captured video of the procedure that is viewable by the surgical personnel for learning purposes, a post-operative report indicating when the surgical personnel deviated from baseline or a particular surgical step, and the like. Any visually identifiable physical characteristic (or combination of physical characteristics) may be used as a basis for advising the surgical personnel of the technical improvement exhibited.

In one aspect, one or more of the steps of process 211000 may be performed by a second computer system or a remote computer system, such as a CLOUD computing system described under the heading "CLOUD SYSTEM HARDWARE AND FUNCTIONAL MODULES". For example, the surgical hub 211801 may receive 211002 perioperative data from a connected surgical device, determine 211004 a surgical background based at least in part on the perioperative data, capture 211006 or receive images of surgical personnel 211803 via the camera 211802, and determine 211008 physical characteristics of the surgical personnel 211803, as described above. However, in this aspect, rather than performing the evaluation on the surgical hub 211801, the surgical hub 211801 may instead transmit data regarding the physical characteristics and the determined surgical context to a second computer system, such as a cloud computing system. The cloud computing system may then perform an evaluation by determining whether the determined physical characteristic deviates from a baseline physical characteristic corresponding to the surgical context. In some aspects, the baseline physical characteristics may be determined or calculated from data aggregated from all surgical hubs 211801 communicatively connected to the cloud computing system, which allows the cloud computing system to compare the technology of surgical personnel 211803 across multiple medical facilities. Thus, the cloud computing system may transmit a comparison between the physical characteristics determined by the surgical hub 211801 and the corresponding baselines stored on or determined by the cloud computing system. Upon receiving the results, the surgical hub 211801 may then take appropriate action (e.g., display a notification if the skill of the surgical personnel 211803 deviates from the baseline, as described above). 
In other aspects, one or more additional or different steps of process 211000 can be performed by other computing systems communicatively coupled to the first computing system. In some aspects, such connected computer systems may be embodied as distributed computing systems.

18-19 illustrate a hypothetical implementation of the process 211000 illustrated in FIG. 17, wherein the physical characteristic being evaluated is the pose of the surgical personnel. Fig. 18 is a diagram illustrating a series of models 211050a, 211050b, 211050c, 211050d of a surgical person 211052 during a course of a surgical procedure, according to at least one aspect of the present disclosure. Correspondingly, fig. 19 is a graph 211100 depicting measured pose of the surgical personnel shown in fig. 18 over time, in accordance with at least one aspect of the present disclosure. Fig. 16-17 should also be referenced in the following description of fig. 18-19. Thus, the surgical hub 211801 performing the process 211000 may analyze the pose of the surgical personnel and provide a recommendation if the pose of the staff member deviates from the baseline. Poor, unexpected, or otherwise incorrect posture may indicate, for example, that the surgeon is fatigued, has difficulty with a particular surgical procedure, is using the surgical instrument incorrectly, has incorrectly positioned the surgical instrument, or is operating in other ways that may create a dangerous, potentially risky, situation. Thus, monitoring the posture of the surgical staff during the course of the surgical procedure and providing notification when the staff deviates from the baseline posture may be beneficial to alert the user to their risky behavior so that they may take corrective action or allow other individuals to take corrective action (e.g., change a tired staff to a fresher individual).

Referring to fig. 19, vertical axis 211102 of fig. 211100 represents the posture of an individual, and horizontal axis 211104 represents time. The first model 211050a in FIG. 18 corresponds to time t in FIG. 19 during surgery1The second model 211050b corresponds to time t2The third model 211050c corresponds to time t3And the fourth model 211050d corresponds to time t4. In tandem, fig. 18 and 19 illustrate the posture of an individual assessed as gradually deviating from one or more baseline positions during the course of a surgical procedure.

In one aspect, the posture of an individual evaluated by a computer system may be quantified as a measure of positional deviation corresponding to one or more positions of the individual's body from corresponding initial or threshold positions. For example, fig. 18 shows the changes in head position 211054, shoulder position 211056, and hip position 211058 of a modeled individual over time of first line 211055, second line 211057, and third line 211059, respectively. In aspects utilizing a marker-based optical system, a surgeon's uniform may have markers located at one or more of these locations, which may be tracked by the optical system, for example. In aspects utilizing a markerless optical system, the optical system can be configured to be capable of identifying a surgical person and optically tracking one or more body parts of the identified surgical personOr the position and movement of the body position. In addition, head, shoulder and hip positions 211054, 211056, 211058 may be compared to a baseline head position 211060, a baseline shoulder position 211062 and a baseline hip position 211064, respectively. The baseline positions 211060, 211062, 211064 may correspond to initial positions of respective body parts (i.e., time t in fig. 19)0The location of the body part) or may be a predetermined threshold against which the location of the body part is compared. In one aspect, the posture metric (as shown by vertical axis 211102 of fig. 211100) may be equal to the distance between one of the body positions 211054, 211056, 211058 and its corresponding baseline position 211060, 211062, 211064. In another aspect, the posture metric may be equal to the cumulative distance between more than one of the body positions 211054, 211056, 211058 and their corresponding baseline positions 211060, 211062, 211064. 
The first line 211108 in the graph 211100 represents the raw pose metric values over time and the second line 211106 represents the normalized pose metric values over time. In various aspects, the process 211000 may evaluate 211010 whether the physical characteristic (in this case, the posture) has deviated from a baseline from raw data or mathematically processed (e.g., normalized) data.

In one aspect, the surgical hub 211801 performing process 211000 may compare the calculated pose metric to one or more thresholds and then take various actions accordingly. In the depicted implementation, the surgical hub 211801 compares the pose metric to a first threshold 211110 and a second threshold 211112. If the normalized pose metric represented by the second line 211106 exceeds the first threshold 211110, the surgical hub 211801 may be configured to provide a first notification OR alert to surgical personnel in the OR211800 indicating that a particular individual's morphology is potentially at risk. Further, if the normalized postural metric represented by the second line 211106 exceeds the second threshold 211112, the surgical hub 211801 may be configured to be able to provide a second notification OR warning to the user in the OR211800 indicating that there is a high degree of risk in the particular individual's morphology. For example, at time t4As represented by the fourth model 211050dThe pose metric of the evaluated surgical personnel exceeds a first threshold 211110; thus, the surgical hub 211801 may be configured to provide a first or initial alert to the surgical personnel.

20-21 illustrate a hypothetical implementation of the process 211000 illustrated in FIG. 17, wherein the physical characteristic being evaluated is the wrist angle of the surgical personnel. Fig. 20 is a depiction of a surgeon holding a surgical instrument 211654 in accordance with at least one aspect of the present disclosure. Correspondingly, fig. 21 is a scatter plot 211700 of wrist angle versus surgical outcome in accordance with at least one aspect of the present disclosure. Fig. 16-17 should also be referenced in the following description of fig. 20-21. Thus, the surgical hub 211801 performing the process 211000 can analyze the wrist angle of the hand of the surgical personnel holding the surgical instrument 211654 and provide a recommendation if the wrist angle of the personnel deviates from the baseline. Holding the surgical instrument incorrectly, as evidenced by extreme wrist angles relative to the surgical instrument, may indicate, for example, that the surgeon is utilizing the surgical instrument incorrectly, has positioned the surgical instrument incorrectly, is using the incorrect surgical instrument for a particular surgical procedure, or is operating in other potentially risky ways that may be dangerous.

In this implementation, the angle of the individual's wrist 211650 is defined as the angle a between the longitudinal axis 211656 of the surgical instrument 211654 held by the surgeon and the longitudinal axis 211652 (i.e., proximal-to-distal axis) of the individual's hand. In other implementations, the wrist angle may be defined as, for example, the angle between the individual's hand and forearm. In the scatter plot 211700 of fig. 21, the vertical axis 211702 represents the wrist angle α, and the horizontal axis 211704 represents the surgical outcome. The portions of horizontal axis 211704 to the right and left of vertical axis 211702 may correspond to positive and negative surgical results, respectively, for example. A variety of different surgical results may be compared to the surgeon's wrist angle a, such as whether a particular surgical step or firing of the surgical instrument 211654 results in excessive bleeding, the incidence of reoperation of the surgical procedure, and so forth. Furthermore, the surgical outcome may be quantified in a number of different ways, depending on the particular type of surgical outcome compared to the surgeon's wrist angle α. For example, if the surgical result is bleeding after a particular firing of the surgical instrument 211654, the horizontal axis 211704 may represent the degree or amount of blood along the incision line from the firing of the surgical instrument 211654. Further, the wrist angle α of each plotted point in the scatter plot 211700 may represent the wrist angle α at a particular instant in the surgical procedure, the average wrist angle α during a particular step of the surgical procedure, the overall average wrist angle during the surgical procedure, and so forth. Further, whether the wrist angle α corresponds to the average wrist angle α at a particular instant in time or the wrist angle α may correspond to the type of surgical outcome with which the wrist angle α is being compared. 
For example, if the surgical result represented by horizontal axis 211704 is a bleeding volume from the firing of the surgical instrument 211654, the vertical axis 211702 may represent the wrist angle α at the instant the surgical instrument 211654 is fired. As another example, if the surgical outcome represented by horizontal axis 211704 is an incidence of re-operation of a particular surgical type, vertical axis 211702 may represent an average wrist angle α during the surgical procedure.

In one aspect, the surgical hub 211801 performing process 211000 can compare the calculated wrist angle a to one or more thresholds and then take various actions accordingly. In the depicted implementation, the surgical hub 211801 determines whether the surgeon's wrist angle a falls within a first zone defined by the first threshold 211708a and the second threshold 211708b, within a second zone defined by the third threshold 211706a and the fourth threshold 211706b, or outside of the second zone. If the wrist angle a measured by the surgical hub 211801 during the course of the surgical procedure falls between the first threshold 221708a and the second threshold 221708b, the surgical hub 211801 may be configured to be able to determine that the wrist angle a is within acceptable parameters and take no action. If the surgeon's wrist angle a falls between the first and second thresholds 221708a, 221708b and the third and fourth thresholds 221706a, 221706b, the surgical hub 211801 may be configured to provide a first notification OR warning to the surgical personnel in the OR211800 indicating that there is a potential risk with the particular individual's morphology. Further, if the surgeon's wrist angle a falls outside of the third threshold 221706a and the fourth threshold 221706b, the surgical hub 211801 may be configured to provide a second notification OR warning to the user in the OR211800 indicating that there is a high degree of risk with the particular individual's morphology.

In some aspects, the various thresholds or baselines to which the monitored physical characteristics are compared may be empirically determined. The surgical hub 211801 and/or cloud computing system described above under the heading "CLOUD SYSTEM HARDWARE AND FUNCTIONAL MODULES" may capture data relating to various physical characteristics of the surgical personnel from a sample population of surgical procedures for analysis. In one aspect, the computer system may associate those physical characteristics with various surgical outcomes, and then set a threshold or baseline according to the particular physical characteristics of the surgeons or other surgical personnel associated with the highest degree of positive surgical outcomes. Thus, the surgical hub 211801 performing the process 211000 may provide a notification or warning when the surgical personnel deviate from best practice. In another aspect, the computer system may set a threshold or baseline according to the physical characteristics most commonly exhibited within the sample population. Thus, the surgical hub 211801 performing the process 211000 may provide a notification or warning when the surgical personnel deviate from the most common practice. For example, in fig. 21, the first threshold 211708a and the second threshold 211708b may be set such that they correspond to the most common wrist angle α (i.e., the most dense portion of the scatter plot 211700) exhibited by surgeons when performing a particular surgical procedure. Thus, when the surgical hub 211801 performing the process 211000 determines that the surgeon's wrist angle α deviates from the empirically determined baseline defined by the first threshold 211708a and the second threshold 211708b, the surgical hub 211801 may provide a notification to the surgeon or take other action, as discussed above.
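One way to derive such an empirical baseline can be sketched as follows. This `baseline_from_samples` helper, its quantile choices, and the idea of ranking samples by an outcome score are assumptions for illustration, not the document's specified method:

```python
def baseline_from_samples(angles, outcomes=None, top_fraction=0.25):
    """Derive (low, high) baseline thresholds from a sample population.

    If outcome scores are given, the baseline spans the angles from the
    best-outcome fraction of the samples (best-practice baseline).
    Otherwise it spans the interquartile range of all samples (the most
    commonly exhibited values -- the dense part of the scatter plot).
    """
    if outcomes is not None:
        # Rank angles from best outcome to worst, keep the top fraction.
        ranked = [a for _, a in sorted(zip(outcomes, angles), reverse=True)]
        n = max(1, int(len(ranked) * top_fraction))
        subset = ranked[:n]
    else:
        # Drop the lowest and highest quartiles of the sorted sample.
        subset = sorted(angles)
        q = len(subset) // 4
        subset = subset[q: len(subset) - q] or subset
    return min(subset), max(subset)
```

The returned pair would play the role of the first and second thresholds 211708a, 211708b in the comparison described above.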

In one aspect, the physical characteristics tracked by the surgical hub 211801 may be differentiated according to product type. Thus, the surgical hub 211801 may be configured to notify the surgical personnel when the particular physical characteristic being tracked indicates that a different product type would be more suitable. For example, the surgical hub 211801 may be configured to notify the surgeon when the surgeon's arm and/or wrist pose deviates from the baseline for the particular surgical instrument currently being utilized, and thus indicate that a different surgical instrument would be more appropriate.

In one aspect, the surgical hub 211801 can be configured to compare an external orientation of the surgical instrument 211810 to an internal orientation of its end effector. The external orientation of the surgical instrument 211810 may be determined via the camera 211802 and optical system described above. The internal orientation of the end effector of the surgical instrument 211810 may be determined via an endoscope or another scope for visualizing the surgical site. By comparing the external orientation and the internal orientation of the surgical instrument 211810, the surgical hub 211801 can then determine whether a different type of surgical instrument 211810 would be more appropriate. For example, if the external orientation of the surgical instrument 211810 deviates from the internal orientation of the end effector of the surgical instrument 211810 by more than a threshold degree, the surgical hub 211801 may be configured to provide a notification to the surgical personnel.
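The external-versus-internal orientation comparison can be sketched as a simple angular-difference check. Representing each orientation as a single angle in degrees and the 20-degree default threshold are illustrative assumptions:

```python
def orientation_mismatch(external_deg, internal_deg, threshold_deg=20.0):
    """Compare the externally observed instrument orientation with the
    end-effector orientation seen through the scope, both in degrees.

    Returns True when the deviation exceeds the threshold, i.e. when a
    notification should be raised. Threshold is an assumed default.
    """
    # Smallest absolute angular difference, accounting for wrap-around.
    diff = abs((external_deg - internal_deg + 180.0) % 360.0 - 180.0)
    return diff > threshold_deg
```

In a full system each orientation would come from a separate vision pipeline (room camera versus endoscope), with this check running once both estimates are available.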

In summary, a computer system such as the surgical hub 211801 may be configured to be able to provide recommendations to surgical personnel (e.g., a surgeon) when their technique begins to deviate from best or common practice. In some aspects, the computer system may be configured to be able to provide notification or feedback only when an individual repeatedly exhibits suboptimal behavior during the course of a given surgical procedure. The notification provided by the computer system may suggest, for example, that the surgical personnel adjust their technique to conform to the best technique for the type of procedure, that they switch to a more appropriate instrument, or the like.

In one aspect, the computer system (e.g., surgical hub 211801) may be configured to allow surgical personnel to compare their technique to their own prior performance, rather than to a baseline established from a sampling population or preprogrammed into the computer system. In other words, the baseline with which the computer system compares the surgical personnel may be a prior performance by the same personnel in a prior instance of a particular surgical procedure type or with a particular type of surgical instrument. Such aspects may be useful for allowing surgeons to track improvement in their surgical techniques or to document a trial period of new surgical products. Thus, the surgical hub 211801 may be configured to be able to evaluate a product during a trial period and provide highlights of the product's use during a given time period. In one aspect, the surgical hub 211801 may be programmed to be particularly sensitive to deviations between the performance of the surgical personnel and the corresponding baseline, such that the surgical hub 211801 may reinforce the appropriate technique for using the surgical device while the trial period is ongoing. In one aspect, the surgical hub 211801 may be configured to record usage of a new surgical product and compare and contrast the new product with a previous baseline product's usage. When two different products are utilized, the surgical hub 211801 may also provide a post-analysis view to highlight the similarities and differences noted between the tracked physical characteristics of the surgeon. In addition, the surgical hub 211801 may allow surgeons to compare populations of procedures between new and old surgical products. The recommendations provided by the surgical hub 211801 may include, for example, comparison videos showing the use of the new product.
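Comparing personnel against their own prior performance might be sketched as follows. The `SelfBaseline` class, its (procedure type, instrument type) key, the mean-of-history baseline, and the fixed tolerance are all hypothetical illustration choices:

```python
from collections import defaultdict


class SelfBaseline:
    """Track an individual's prior performance per (procedure, instrument)
    pair and flag deviations from that personal baseline."""

    def __init__(self, tolerance=10.0):
        # History of a tracked characteristic, keyed by context.
        self.history = defaultdict(list)
        self.tolerance = tolerance

    def record(self, key, value):
        """Store a measurement from a completed procedure."""
        self.history[key].append(value)

    def deviates(self, key, value):
        """True when a new measurement strays from the personal baseline."""
        prior = self.history[key]
        if not prior:
            return False  # no personal baseline established yet
        baseline = sum(prior) / len(prior)
        return abs(value - baseline) > self.tolerance
```

During a trial period the tolerance could be tightened so the system is, as described above, particularly sensitive to deviations from the baseline.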

In one aspect, the computer system (e.g., surgical hub 211801) may be configured to allow surgeons to compare their technique directly to that of other surgeons, rather than to a baseline established from a sampling population or preprogrammed into the computer system.

In one aspect, the computer system (e.g., surgical hub 211801) may be configured to be able to analyze trends in surgical device usage as surgeons become more experienced in performing particular surgical procedures (or surgical procedures in general) or in using new surgical instruments. For example, the computer system may identify movements, behaviors, and other physical characteristics that change significantly as the surgeon becomes more experienced. Thus, the computer system may recognize when the surgeon exhibits a suboptimal technique early in the surgeon's learning curve and may provide recommendations for the optimal approach before the suboptimal technique becomes ingrained.
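Such a learning-curve trend analysis could, for example, fit a least-squares slope to a tracked characteristic across successive procedures: a large slope suggests the behavior is still changing with experience, while a slope near zero suggests it has stabilized. This `trend_slope` helper is an illustrative assumption, not a method specified by the document:

```python
def trend_slope(values):
    """Least-squares slope of a tracked characteristic (e.g., average
    wrist angle) over successive procedures, indexed 0, 1, 2, ...

    Returns 0.0 for degenerate inputs (fewer than two points).
    """
    n = len(values)
    xs = range(n)
    mean_x = (n - 1) / 2.0
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den if den else 0.0
```

A hub could recompute this slope after each procedure and surface a recommendation while the slope still indicates an actively changing (and potentially suboptimal) technique.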
