Wireless pairing of a surgical device with another device within a sterile surgical field based on device usage and context awareness

Document No.: 1131504    Publication date: 2020-10-02

Note: This technology, "Wireless pairing of a surgical device with another device within a sterile surgical field based on device usage and context awareness," was created by F. E. Shelton IV and J. L. Harris on 2018-11-14. Its main content is as follows: A surgical system includes a first surgical device including a control circuit. The control circuit is configured to contextually sense an event occurring proximate the first surgical device based on data received from a database, a patient monitoring device, or a paired surgical device, or any combination thereof. The control circuit is configured to wirelessly pair with a second surgical device based on usage of the first surgical device and the events contextually perceived by the first surgical device.

1. A surgical system, comprising:

a first surgical device comprising a control circuit configured to:

situationally perceive an event occurring in proximity to the first surgical device according to data received from a database, a patient monitoring device, or a paired surgical device, or any combination of the database, the patient monitoring device, or the paired surgical device; and

wirelessly pair with a second surgical device based on the use of the first surgical device and the event that the first surgical device contextually perceives.

2. The surgical system of claim 1, wherein the event contextually perceived by the first surgical device comprises a first user using the first surgical device and a second user using the second surgical device.

3. The surgical system of claim 2, wherein the event comprising the first user using the first surgical device comprises the first user gripping a handle of the first surgical device.

4. The surgical system of claim 3, wherein the event comprising the first user gripping the handle of the first surgical device comprises the first user gripping the handle of the first surgical device, allowing a transceiver in the handle of the first surgical device to communicate with an identifier worn by the first user, and allowing communication between the first surgical device and a surgical hub through the identifier.

5. The surgical system of claim 2, wherein the event contextually perceived by the first surgical device comprises a location of the first surgical device and a location of the second surgical device.

6. The surgical system of claim 5, wherein the control circuit is configured to determine the location of the second surgical device based on a wireless signal transmitted by the second surgical device to the first surgical device.

7. The surgical system of claim 1, wherein the control circuit is further configured to simultaneously activate each of the first and second surgical devices for a predetermined period of time when no tissue or patient is sensed.

8. The surgical system of claim 1, wherein the first surgical device is located within a sterile field and the second surgical device is located outside of the sterile field when the first surgical device is wirelessly paired with the second surgical device.

9. The surgical system of claim 1, wherein the control circuit is further configured to wirelessly pair with a communication device.

10. The surgical system of claim 1, wherein the event contextually perceived by the first surgical device comprises determining a distance between the first surgical device and a tissue structure within a patient.

11. A method, comprising:

situationally sensing, by a control circuit within a first surgical device, an event occurring in the vicinity of the first surgical device from data received from a database, a patient monitoring device, or a paired surgical device, or any combination thereof; and

wirelessly pairing, by the control circuit, with a second surgical device as a function of the use of the first surgical device and the event contextually perceived by the first surgical device.

12. The method of claim 11, wherein the contextual awareness by the control circuitry within the first surgical device comprises contextual awareness by the control circuitry within the first surgical device that a first user is using the first surgical device and that a second user is using the second surgical device.

13. The method of claim 12, wherein the contextual awareness, by the control circuitry within the first surgical device, that the first user is using the first surgical device comprises contextual awareness, by the control circuitry within the first surgical device, that the first user is gripping a handle of the first surgical device.

14. The method of claim 13, further comprising allowing a transceiver in the handle of the first surgical device to communicate with an identifier worn by the first user, and

allowing communication between the first surgical device and a surgical hub through the identifier.

15. The method of claim 12, wherein the contextual awareness, by the control circuitry within the first surgical device, that the first user is using the first surgical device and that the second user is using the second surgical device comprises contextual awareness, by the control circuitry within the first surgical device, of a location of the first surgical device and a location of the second surgical device.

16. The method of claim 15, further comprising determining, by the control circuit, the location of the second surgical device based on a wireless signal transmitted by the second surgical device to the first surgical device.

17. The method of claim 11, further comprising activating, by the control circuit, the first and second surgical devices each for a predetermined period of time when no tissue or patient is sensed.

18. The method of claim 11, wherein wirelessly pairing, by the control circuit, with the second surgical device according to use of the first surgical device comprises wirelessly pairing, by the control circuit, with the second surgical device located outside of a sterile field while the first surgical device is within the sterile field.

19. The method of claim 11, further comprising wirelessly pairing, by the control circuit, with a communication device.

20. The method of claim 11, further comprising determining, by the control circuit, a distance between the first surgical device and a tissue structure within a patient.

Background

The present disclosure relates to various surgical systems. Surgical procedures are often performed in surgical operating theaters or rooms of medical facilities such as, for example, hospitals. A sterile field is typically created around the patient. The sterile field may include the scrubbed team members, who are properly attired, as well as all equipment and fixtures in the field. Various surgical devices and systems are utilized in performing surgical procedures.

Disclosure of Invention

One aspect of a surgical system may include a first surgical device having a control circuit configured to contextually sense events occurring in proximity to the first surgical device based on data received from a database, a patient monitoring device, or a paired surgical device, or any combination of the database, the patient monitoring device, or the paired surgical device, and wirelessly pair with a second surgical device based on usage of the first surgical device and the events contextually sensed by the first surgical device.
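
To make the pairing logic concrete, the following Python sketch shows one way such a decision could be organized in the control circuit; the class names, event types, and pairing criterion are illustrative assumptions and are not taken from the disclosure.

from dataclasses import dataclass, field

@dataclass
class ContextEvent:
    # An event inferred from database, patient-monitor, or paired-device data (illustrative).
    source: str   # e.g., "database", "patient_monitor", "paired_device"
    kind: str     # e.g., "device_in_use", "device_location"
    payload: dict = field(default_factory=dict)

@dataclass
class SurgicalDeviceControl:
    device_id: str
    in_use: bool = False
    events: list = field(default_factory=list)
    paired_with: set = field(default_factory=set)

    def perceive(self, event: ContextEvent) -> None:
        # The control circuit accumulates contextually sensed events.
        self.events.append(event)

    def should_pair_with(self, other_id: str) -> bool:
        # Hypothetical criterion: pair only when this device is in use and a sensed
        # event indicates the other device is also in use nearby.
        return self.in_use and any(
            e.kind == "device_in_use" and e.payload.get("device_id") == other_id
            for e in self.events
        )

    def pair(self, other_id: str) -> bool:
        # Stands in for the actual wireless handshake with the second device.
        if self.should_pair_with(other_id):
            self.paired_with.add(other_id)
            return True
        return False

# Example: the first device senses that a second device is in use and pairs with it.
dev1 = SurgicalDeviceControl("stapler-01", in_use=True)
dev1.perceive(ContextEvent("paired_device", "device_in_use", {"device_id": "energy-02"}))
print(dev1.pair("energy-02"))  # True

In a real device the pair step would be replaced by the radio handshake, and the events would originate from the database, the patient monitoring device, or already-paired devices as described above.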

In one aspect of the surgical system, the event that is contextually perceived by the first surgical device includes the first user using the first surgical device and the second user using the second surgical device.

In one aspect of the surgical system, the event comprising the first user using the first surgical device comprises the first user gripping a handle of the first surgical device.

In one aspect of the surgical system, the event comprising the first user gripping the handle of the first surgical device may include the first user gripping the handle of the first surgical device, thereby allowing a transceiver in the handle of the first surgical device to communicate with an identifier worn by the first user and allowing communication between the first surgical device and the surgical hub through the identifier.
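
A minimal sketch of this identifier-mediated chain follows, assuming hypothetical class names (SurgicalHub, WearableIdentifier, DeviceHandle) and a simple message format; it only illustrates the idea that gripping the handle opens a path from the device, through the worn identifier, to the hub.

class SurgicalHub:
    def receive(self, user_id: str, message: dict) -> dict:
        # The hub learns which user is holding which device.
        return {"ack": True, "user": user_id, "device": message["device_id"]}

class WearableIdentifier:
    # Stand-in for an identifier (e.g., a wristband tag) worn by the first user.
    def __init__(self, user_id: str, hub: SurgicalHub):
        self.user_id = user_id
        self.hub = hub

    def relay_to_hub(self, message: dict) -> dict:
        # The identifier forwards device messages on to the surgical hub.
        return self.hub.receive(self.user_id, message)

class DeviceHandle:
    # Handle transceiver that becomes active when the handle is gripped.
    def __init__(self, device_id: str):
        self.device_id = device_id

    def on_grip(self, identifier: WearableIdentifier) -> dict:
        # Gripping the handle lets the handle transceiver reach the worn identifier,
        # which in turn opens a communication path between the device and the hub.
        return identifier.relay_to_hub({"device_id": self.device_id, "event": "gripped"})

hub = SurgicalHub()
badge = WearableIdentifier("surgeon-A", hub)
print(DeviceHandle("stapler-01").on_grip(badge))  # {'ack': True, 'user': 'surgeon-A', 'device': 'stapler-01'}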

In one aspect of the surgical system, the event that is contextually perceived by the first surgical device may include a location of the first surgical device and a location of the second surgical device.

In one aspect of the surgical system, the control circuit is configured to determine a position of the second surgical device based on a wireless signal transmitted by the second surgical device to the first surgical device.
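
One common way to turn a received wireless signal into a position estimate is a log-distance path-loss calculation on the received signal strength; the sketch below assumes a reference power at 1 m and a path-loss exponent, and it is not asserted to be the specific method of the disclosure.

def estimate_distance_m(rssi_dbm: float,
                        rssi_at_1m_dbm: float = -50.0,
                        path_loss_exponent: float = 2.0) -> float:
    # Log-distance path-loss model: d = 10 ** ((RSSI_1m - RSSI) / (10 * n)).
    # rssi_at_1m_dbm and path_loss_exponent are assumed calibration constants.
    return 10 ** ((rssi_at_1m_dbm - rssi_dbm) / (10.0 * path_loss_exponent))

# Example: a -62 dBm signal with these assumed constants maps to roughly 4 m.
print(round(estimate_distance_m(-62.0), 1))

A single reading of this kind is noisy, so any practical use would average several samples or fuse them with other context before relying on the result.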

In one aspect of the surgical system, the control circuit is further configured to simultaneously activate each of the first and second surgical devices for a predetermined period of time when no tissue or patient is sensed.

In one aspect of the surgical system, when the first surgical device is wirelessly paired with the second surgical device, the first surgical device is located within the sterile field and the second surgical device is located outside the sterile field.

In one aspect of the surgical system, the control circuit is further configured to wirelessly pair with a communication device.

In one aspect of the surgical system, the event contextually perceived by the first surgical device may include determining a distance between the first surgical device and a tissue structure within the patient.

One aspect of the method may include situationally sensing, by a control circuit within the first surgical device, an event occurring in proximity to the first surgical device according to data received from a database, a patient monitoring device, or a paired surgical device, or any combination of the database, the patient monitoring device, or the paired surgical device, and wirelessly pairing, by the control circuit, with the second surgical device according to the use of the first surgical device and the events situationally sensed by the first surgical device.

In one aspect of the method, the contextual awareness by the control circuitry within the first surgical device may include contextual awareness by the control circuitry within the first surgical device that the first user is using the first surgical device and that the second user is using the second surgical device.

In one aspect of the method, the contextual awareness, by the control circuitry within the first surgical device, that the first user is using the first surgical device may include contextual awareness, by the control circuitry within the first surgical device, that the first user is gripping a handle of the first surgical device.

In one aspect, the method may further include allowing a transceiver in the handle of the first surgical device to communicate with an identifier worn by the first user, and allowing communication between the first surgical device and the surgical hub through the identifier.

In one aspect of the method, the contextual awareness, by the control circuitry within the first surgical device, that the first user is using the first surgical device and that the second user is using the second surgical device may include contextual awareness, by the control circuitry within the first surgical device, of a position of the first surgical device and a position of the second surgical device.

In one aspect, the method may further include determining, by the control circuitry, a position of the second surgical device based on a wireless signal transmitted by the second surgical device to the first surgical device.

In one aspect, the method may further include activating, by the control circuit, each of the first and second surgical devices for a predetermined period of time when no tissue or patient is sensed.

In one aspect of the method, wirelessly pairing, by the control circuitry, the first surgical device with the second surgical device in accordance with use of the first surgical device can include wirelessly pairing, by the control circuitry, the first surgical device, located within the sterile field, with the second surgical device located outside of the sterile field.

In one aspect, the method may also include wirelessly pairing, by the control circuit, with a communication device.

In one aspect, the method may further include determining, by the control circuitry, a distance between the first surgical device and a tissue structure within the patient.

Drawings

The aspects described herein, both as to organization and method of operation, together with further objects and advantages thereof, may best be understood by reference to the following description, taken in connection with the accompanying drawings, which are set forth below.

Fig. 1 is a block diagram of a computer-implemented interactive surgical system in accordance with at least one aspect of the present disclosure.

Fig. 2 is a surgical system for performing a surgical procedure in an operating room according to at least one aspect of the present disclosure.

Fig. 3 is a surgical hub paired with a visualization system, a robotic system, and a smart instrument according to at least one aspect of the present disclosure.

Fig. 4 is a partial perspective view of a surgical hub housing and a composite generator module slidably received in a drawer of the surgical hub housing according to at least one aspect of the present disclosure.

Fig. 5 is a perspective view of a combined generator module having bipolar, ultrasonic and monopolar contacts and a smoke evacuation component according to at least one aspect of the present disclosure.

Fig. 6 illustrates a single power bus attachment for a plurality of lateral docking ports of a lateral modular housing configured to be capable of receiving a plurality of modules in accordance with at least one aspect of the present disclosure.

Fig. 7 illustrates a vertical modular housing configured to be capable of receiving a plurality of modules in accordance with at least one aspect of the present disclosure.

Fig. 8 illustrates a surgical data network including a modular communication hub configured to connect modular devices located in one or more operating rooms of a medical facility or any room in a medical facility dedicated to surgical operations to a cloud in accordance with at least one aspect of the present disclosure.

Fig. 9 illustrates a computer-implemented interactive surgical system in accordance with at least one aspect of the present disclosure.

Fig. 10 illustrates a surgical hub including a plurality of modules coupled to a modular control tower according to at least one aspect of the present disclosure.

Fig. 11 illustrates one aspect of a Universal Serial Bus (USB) hub device in accordance with at least one aspect of the present disclosure.

Fig. 12 is a block diagram of a cloud computing system including a plurality of smart surgical instruments coupled to a surgical hub connectable to cloud components of the cloud computing system in accordance with at least one aspect of the present disclosure.

Fig. 13 is a functional module architecture of a cloud computing system according to at least one aspect of the present disclosure.

Fig. 14 illustrates a diagram of a context aware surgical system in accordance with at least one aspect of the present disclosure.

Fig. 15 is a timeline depicting context awareness of a surgical hub, in accordance with at least one aspect of the present disclosure.

Fig. 16 is a diagram of pairing of a personal-owned wireless device with a surgical hub, according to at least one aspect of the present disclosure.

Fig. 17 is a diagram of a cartridge configured to wirelessly communicate with a surgical hub according to at least one aspect of the present disclosure.

Fig. 17A depicts inductive power coupling between adjacent coils in accordance with at least one aspect of the present disclosure.

Fig. 18 is a block diagram of a resonant inductive wireless power system in accordance with at least one aspect of the present disclosure.

Fig. 19A is a diagram of a surgical hub detecting a room perimeter in accordance with at least one aspect of the present disclosure.

Fig. 19B is a diagram of a room perimeter including one or more interfering beacons, in accordance with at least one aspect of the present disclosure.

Fig. 20 is a diagram of interactions between an identifier worn by a user and a surgical instrument, according to at least one aspect of the present disclosure.

Fig. 21 is a diagram of a surgical system including a magnetic field generator for detecting a position and orientation of a surgical device relative thereto in accordance with at least one aspect of the present disclosure.

Fig. 22 is a diagram depicting a system for utilizing lidar to determine a position of a device relative to a measurement site selected by a user, in accordance with at least one aspect of the present disclosure.

Fig. 23 is a diagram of a system for determining a relative position of a device via a dual antenna receiver in accordance with at least one aspect of the present disclosure.

Fig. 24 is a graph depicting possible detected signal strengths in accordance with at least one aspect of the present disclosure.

Description

The applicant of the present patent application owns the following U.S. patent applications, filed November 6, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:

U.S. patent application 16/182,224 entitled "SURGICAL NETWORK, INSTRUMENT, AND CLOUD RESPONSES BASED ON VALIDATION OF RECEIVED DATASET AND AUTHENTICATION OF ITS SOURCE AND INTEGRITY";

U.S. patent application 16/182,230 entitled "SURGICAL SYSTEM FOR PRESENTING INFORMATION INTERPRETED FROM EXTERNAL DATA";

U.S. patent application 16/182,233 entitled "MODIFICATION OF SURGICAL SYSTEMS CONTROL PROGRAMS BASED ON MACHINE LEARNING";

U.S. patent application 16/182,239 entitled "ADJUSTMENT OF DEVICE CONTROL PROGRAMS BASED ON STRATIFIED CONTEXTUAL DATA IN ADDITION TO THE DATA";

U.S. patent application 16/182,243 entitled "SURGICAL HUB AND MODULAR DEVICE RESPONSE ADJUSTMENT BASED ON SITUATIONAL AWARENESS";

U.S. patent application 16/182,248 entitled "DETECTION AND ESCALATION OF SECURITY RESPONSES OF SURGICAL INSTRUMENTS TO INCREASING SEVERITY THREATS";

U.S. patent application 16/182,251, entitled "INTERACTIVE SURGICAL SYSTEM";

U.S. patent application 16/182,260 entitled "AUTOMATED DATA SCALING, ALIGNMENT, AND ORGANIZING BASED ON PREDEFINED PARAMETERS WITHIN SURGICAL NETWORKS";

U.S. patent application No. 16/182,267 entitled "SENSING THE PATIENT POSITION and orientation and tuning THE same Mono-POLAR RETURN PAD ELECTRODE TO Process POSITION and orientation TO A SURGICAL NETWORK";

U.S. patent application No. 16/182,249 entitled "POWERED SURGICAL TOOL WITH PREDEFINED ADJUSTABLE CONTROL ALGORITHM FOR CONTROLLING END EFFECTOR PARAMETER";

U.S. patent application 16/182,246 entitled "ADJUSTMENTS BASED ON AIRBORNE PARTICLE PROPERTIES";

U.S. patent application 16/182,256 entitled "ADJUSTMENT OF A SURGICAL DEVICE FUNCTION BASED ON SITUATIONAL AWARENESS";

U.S. patent application 16/182,242 entitled "REAL-TIME ANALYSIS OF COMPREHENSIVE COST OF ALL INSTRUMENTATION USED IN SURGERY UTILIZING DATA FLUIDITY TO TRACK INSTRUMENTS THROUGH STOCKING AND IN-HOUSE PROCESSES";

U.S. patent application 16/182,255 entitled "USAGE AND TECHNIQUE ANALYSIS OF SURGEON/STAFF PERFORMANCE AGAINST A BASELINE TO OPTIMIZE DEVICE UTILIZATION FOR BOTH CURRENT AND FUTURE PROCEDURES";

U.S. patent application 16/182,269 entitled "IMAGE CAPTURING OF THE AREAS OUTSIDE THE ABDOMEN TO IMPROVE PLACEMENT AND CONTROL OF A SURGICAL DEVICE IN USE";

U.S. patent application 16/182,278 entitled "COMMUNICATION OF DATA WHERE ASURGICAL NETWORKS USE CONTEXT OF THE DATA AND REQUIREMENTS OF A RECEIVINGSYSTEM/USER TO INFONCE INCLUSION OR LINKAGE OF DATA AND METADATA TOESTABILITY CONTENT";

U.S. patent application 16/182,290 entitled "SURGICAL NETWORK RECOMMENDATIONS FROM REAL TIME ANALYSIS OF PROCEDURE VARIABLES AGAINST A BASELINE HIGHLIGHTING DIFFERENCES FROM THE OPTIMAL SOLUTION";

U.S. patent application 16/182,232 entitled "CONTROL OF A SURGICAL SYSTEM THROUGH A SURGICAL BARRIER";

U.S. patent application 16/182,227 entitled "SURGICAL NETWORK DETERMINATION OF COMMUNICATION, INTERACTION, OR PROCESSING BASED ON SYSTEM OR DEVICES";

U.S. patent application 16/182,229 entitled "ADJUSTMENT OF STAPLE HEIGHT OF AT LEAST ONE ROW OF STAPLES BASED ON THE SENSED TISSUE THICKNESS OR FORCE IN CLOSING";

U.S. patent application 16/182,234 entitled "STAPLING DEVICE WITH BOTH COMPULSORY AND DISCRETIONARY LOCKOUTS BASED ON SENSED PARAMETERS";

U.S. patent application 16/182,240 entitled "POWERED STAPLING DEVICE CONFIGURED TO ADJUST FORCE, ADVANCEMENT SPEED, AND OVERALL STROKE OF CUTTING MEMBER BASED ON SENSED PARAMETER OF FIRING OR CLAMPING";

U.S. patent application 16/182,235 entitled "VARIATION OF RADIO FREQUENCY AND ULTRASONIC POWER LEVEL IN COOPERATION WITH VARYING CLAMP ARM PRESSURE TO ACHIEVE PREDEFINED HEAT FLUX OR POWER APPLIED TO TISSUE"; and

U.S. patent application 16/182,238 entitled "ULTRASONIC ENERGY DEVICE WHICH VARIES PRESSURE APPLIED BY CLAMP ARM TO PROVIDE THRESHOLD CONTROL AT A CUT PROGRESSION LOCATION".

The applicant of the present patent application owns the following U.S. provisional patent applications, filed September 10, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:

U.S. provisional patent application 62/729,183 entitled "A CONTROL FOR A SURGICAL NETWORK OR SURGICAL NETWORK CONNECTED DEVICE THAT ADJUSTS ITS FUNCTION BASED ON A SENSED SITUATION OR USAGE";

U.S. provisional patent application 62/729,177 entitled "AUTOMATED DATA SCALING, ALIGNMENT, AND ORGANIZING BASED ON PREDEFINED PARAMETERS WITHIN A SURGICAL NETWORK BEFORE TRANSMISSION";

U.S. provisional patent application 62/729,176 entitled "INDIRECT COMMAND AND CONTROL OF A FIRST OPERATING ROOM SYSTEM THROUGH THE USE OF A SECOND OPERATING ROOM SYSTEM WITHIN A STERILE FIELD WHERE THE SECOND OPERATING ROOM SYSTEM HAS PRIMARY AND SECONDARY OPERATING MODES";

U.S. provisional patent application 62/729,185 entitled "POWERED STAPLING DEVICE THAT IS CAPABLE OF ADJUSTING FORCE, ADVANCEMENT SPEED, AND OVERALL STROKE OF CUTTING MEMBER OF THE DEVICE BASED ON SENSED PARAMETER OF FIRING OR CLAMPING";

U.S. provisional patent application 62/729,184 entitled "POWERED SURGICAL TOOL WITH A PREDEFINED ADJUSTABLE CONTROL ALGORITHM FOR CONTROLLING AT LEAST ONE END EFFECTOR PARAMETER AND A MEANS FOR LIMITING THE ADJUSTMENT";

U.S. provisional patent application No. 62/729,182 entitled "SENSING THE PATIENT POSITIONIONNAND control UTILIZING THE MONO POLAR RETURN PAD ELECTRODE TO PROVIDEO STATIONATIONAL AWARENESS TO THE HUB";

U.S. provisional patent application 62/729,191 entitled "SURGICAL NETWORK RECOMMENDATIONS FROM REAL TIME ANALYSIS OF PROCEDURE VARIABLES AGAINST A BASELINE HIGHLIGHTING DIFFERENCES FROM THE OPTIMAL SOLUTION";

U.S. provisional patent application 62/729,195 entitled "ULTRASONIC ENERGY DEVICE WHICH VARIES PRESSURE APPLIED BY CLAMP ARM TO PROVIDE THRESHOLD CONTROL AT A CUT PROGRESSION LOCATION"; and

U.S. provisional patent application 62/729,186, entitled "WIRELESS PAIRING OF A SURGICAL DEVICE WITH ANOTHER DEVICE WITHIN A STERILE SURGICAL FIELD BASED ON THE USAGE AND SITUATIONAL AWARENESS OF DEVICES".

The applicant of the present patent application owns the following U.S. patent applications, filed August 28, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:

U.S. patent application 16/115,214 entitled "ESTIMATING STATE OF ULTRASONIC END EFFECTOR AND CONTROL SYSTEM THEREFOR";

U.S. patent application 16/115,205 entitled "TEMPERATURE CONTROL OF ULTRASONIC END EFFECTOR AND CONTROL SYSTEM THEREFOR";

U.S. patent application 16/115,233 entitled "RADIO FREQUENCY ENERGY DEVICE FOR DELIVERING COMBINED ELECTRICAL SIGNALS";

U.S. patent application No. 16/115,208 entitled "CONTROLLING AN ULTRASONIC SURGICAL INSTRUMENT ACCORDING TO TISSUE LOCATION";

U.S. patent application 16/115,220 entitled "CONTROLLING ACTIVATION OF AN ULTRASONIC SURGICAL INSTRUMENT ACCORDING TO THE PRESENCE OF TISSUE";

U.S. patent application 16/115,232, entitled "DETERMINING TISSUE COMPOSITION VIA AN ULTRASONIC SYSTEM";

U.S. patent application No. 16/115,239 entitled "DETERMINING THE STATE OF AN ULTRASONIC ELECTROMECHANICAL SYSTEM ACCORDING TO FREQUENCY SHIFT";

U.S. patent application 16/115,247 entitled "DETERMINING THE STATE OF AN ULTRASONIC END EFFECTOR";

U.S. patent application 16/115,211 entitled "SITUATIONAL AWARENESS OF ELECTROSURGICAL SYSTEMS";

U.S. patent application 16/115,226, entitled "MECHANISMS FOR CONTROLLINGDIFFERENT ELECTROMECHANICAL SYSTEMS OF AN ELECTROSURGICAL INSTRUMENT";

U.S. patent application 16/115,240 entitled "DETECTION OF END EFFECTOR IMMERSION IN LIQUID";

U.S. patent application 16/115,249 entitled "INTERRUPTION OF ENERGY DUE TO INADVERTENT CAPACITIVE COUPLING";

U.S. patent application 16/115,256 entitled "INCREASING RADIO FREQUENCY TO CREATE PAD-LESS MONOPOLAR LOOP";

U.S. patent application 16/115,223 entitled "BIPOLAR COMBINATION DEVICE THAT AUTOMATICALLY ADJUSTS PRESSURE BASED ON ENERGY MODALITY"; and

U.S. patent application 16/115,238 entitled "ACTIVATION OF ENERGY DEVICES".

The applicant of the present patent application owns the following U.S. provisional patent applications, filed August 23, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:

U.S. provisional patent application 62/721,995 entitled "CONTROLLING AN ULTRASONIC SURGICAL INSTRUMENT ACCORDING TO TISSUE LOCATION";

U.S. provisional patent application 62/721,998 entitled "SITUATIONAL AWARENESS OF ELECTROSURGICAL SYSTEMS";

U.S. provisional patent application 62/721,999 entitled "INTERRUPTION OF ENERGY DUE TO INADVERTENT CAPACITIVE COUPLING";

U.S. provisional patent application 62/721,994 entitled "BIPOLAR COMBINATION DEVICE THAT AUTOMATICALLY ADJUSTS PRESSURE BASED ON ENERGY MODALITY"; and

U.S. provisional patent application 62/721,996 entitled "RADIO FREQUENCY ENERGY DEVICE FOR DELIVERING COMBINED ELECTRICAL SIGNALS".

The applicant of the present patent application owns the following U.S. provisional patent applications, filed June 30, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:

U.S. provisional patent application 62/692,747 entitled "SMART ACTIVATION OF AN ENERGYDEVICE BY ANOTHER DEVICE";

U.S. provisional patent application 62/692,748, entitled "SMART ENERGY ARCHITECTURE"; and

U.S. provisional patent application 62/692,768, entitled "SMART ENERGY DEVICES".

The applicant of the present patent application owns the following U.S. patent applications, filed June 29, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:

U.S. patent application serial No. 16/024,090, entitled "CAPACITIVE COUPLED RETURN PATH PAD WITH SEPARABLE ARRAY ELEMENTS";

U.S. patent application Ser. No. 16/024,057 entitled "CONTROLLING A SURGICAL INSTRUMENT ACCORDING TO SENSED CLOSURE PARAMETERS";

U.S. patent application Ser. No. 16/024,067 entitled "SYSTEMS FOR ADJUSTING END EFFECTOR PARAMETERS BASED ON PERIOPERATIVE INFORMATION";

U.S. patent application Ser. No. 16/024,075 entitled "SAFETY SYSTEMS FOR SMART POWERED SURGICAL STAPLING";

U.S. patent application Ser. No. 16/024,083 entitled "SAFETY SYSTEMS FOR SMART POWERED SURGICAL STAPLING";

U.S. patent application Ser. No. 16/024,094 entitled "SURGICAL SYSTEMS FOR DETECTING END EFFECTOR TISSUE DISTRIBUTION IRREGULARITIES";

U.S. patent application Ser. No. 16/024,138 entitled "SYSTEM FOR DETECTING PROXIMITY OF SURGICAL END EFFECTOR TO CANCEROUS TISSUE";

U.S. patent application Ser. No. 16/024,150 entitled "SURGICAL INSTRUMENT CARTRIDGE SENSOR ASSEMBLIES";

U.S. patent application Ser. No. 16/024,160 entitled "VARIABLE OUTPUT CARTRIDGE SENSOR ASSEMBLY";

U.S. patent application Ser. No. 16/024,124 entitled "SURGICAL INSTRUMENT HAVING A FLEXIBLE ELECTRODE";

U.S. patent application Ser. No. 16/024,132 entitled "SURGICAL INSTRUMENT HAVING A FLEXIBLE CIRCUIT";

U.S. patent application Ser. No. 16/024,141 entitled "SURGICAL INSTRUMENT WITH A TISSUE MARKING ASSEMBLY";

U.S. patent application Ser. No. 16/024,162 entitled "SURGICAL SYSTEMS WITH PRIORITIZED DATA TRANSMISSION CAPABILITIES";

U.S. patent application Ser. No. 16/024,066 entitled "SURGICAL EVACUATION SENSING AND MOTOR CONTROL";

U.S. patent application Ser. No. 16/024,096 entitled "SURGICAL EVACUATION SENSOR ARRANGEMENTS";

U.S. patent application Ser. No. 16/024,116 entitled "SURGICAL EVACUATION FLOW PATHS";

U.S. patent application Ser. No. 16/024,149 entitled "SURGICAL EVACUATION SENSING AND GENERATOR CONTROL";

U.S. patent application Ser. No. 16/024,180, entitled "SURGICAL EVACUATION SENSING AND DISPLAY";

U.S. patent application Ser. No. 16/024,245 entitled "COMMUNICATION OF SMOKE EVACUATION SYSTEM PARAMETERS TO HUB OR CLOUD IN SMOKE EVACUATION MODULE FOR INTERACTIVE SURGICAL PLATFORM";

U.S. patent application Ser. No. 16/024,258 entitled "SMOKE EVACUATION SYSTEM INCLUDING A SEGMENTED CONTROL CIRCUIT FOR INTERACTIVE SURGICAL PLATFORM";

U.S. patent application Ser. No. 16/024,265 entitled "SURGICAL EVACUATION SYSTEM WITH A COMMUNICATION CIRCUIT FOR COMMUNICATION BETWEEN A FILTER AND A SMOKE EVACUATION DEVICE"; and

U.S. patent application Ser. No. 16/024,273, entitled "DUAL IN-SERIES LARGE AND SMALL DROPLET FILTERS".

The applicant of the present patent application owns the following U.S. provisional patent applications, filed June 28, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:

U.S. provisional patent application Ser. No. 62/691,228, entitled "A Method of using a formed fluid circuits with multiple sensors with electronic devices";

U.S. provisional patent application Ser. No. 62/691,227 entitled "CONTROLLING A SURGICAL INSTRUMENT ACCORDING TO SENSED CLOSURE PARAMETERS";

U.S. provisional patent application Ser. No. 62/691,230 entitled "SURGICAL INSTRUMENT HAVING A FLEXIBLE ELECTRODE";

U.S. provisional patent application Ser. No. 62/691,219 entitled "SURGICAL EVACUATION SENSING AND MOTOR CONTROL";

U.S. provisional patent application Ser. No. 62/691,257 entitled "COMMUNICATION OF SMOKE EVACUATION SYSTEM PARAMETERS TO HUB OR CLOUD IN SMOKE EVACUATION MODULE FOR INTERACTIVE SURGICAL PLATFORM";

U.S. provisional patent application Ser. No. 62/691,262 entitled "SURGICAL EVACUATION SYSTEM WITH A COMMUNICATION CIRCUIT FOR COMMUNICATION BETWEEN A FILTER AND A SMOKE EVACUATION DEVICE"; and

U.S. provisional patent application serial No. 62/691,251, entitled "DUAL IN-SERIES LARGE AND SMALL DROPLET FILTERS".

The applicant of the present patent application owns the following U.S. provisional patent application, filed April 19, 2018, the disclosure of which is incorporated herein by reference in its entirety:

U.S. provisional patent application serial No. 62/659,900, entitled "METHOD OF HUB COMMUNICATION".

The applicant of the present patent application owns the following U.S. provisional patent applications, filed March 30, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:

U.S. provisional patent application No. 62/650,898, entitled "CAPACITIVE COUPLED RETURN PATH PAD WITH SEPARABLE ARRAY ELEMENTS", filed March 30, 2018;

U.S. provisional patent application Ser. No. 62/650,887 entitled "SURGICAL SYSTEMS WITH OPTIMIZED SENSING CAPABILITIES";

U.S. provisional patent application Ser. No. 62/650,882 entitled "SMOKE EVACUATION MODULE FOR INTERACTIVE SURGICAL PLATFORM"; and

U.S. provisional patent application Ser. No. 62/650,877, entitled "SURGICAL SMOKE EVACUATION SENSING AND CONTROLS".

The applicant of the present patent application owns the following U.S. patent applications, filed March 29, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:

U.S. patent application Ser. No. 15/940,641 entitled "INTERACTIVE SURGICAL SYSTEMS WITH ENCRYPTED COMMUNICATION CAPABILITIES";

U.S. patent application Ser. No. 15/940,648 entitled "INTERACTIVE SURGICAL SYSTEMS WITH CONDITION HANDLING OF DEVICES AND DATA CAPABILITIES";

U.S. patent application Ser. No. 15/940,656 entitled "SURGICAL HUB COORDINATION OF CONTROL AND COMMUNICATION OF OPERATING ROOM DEVICES";

U.S. patent application Ser. No. 15/940,666 entitled "SPATIAL AWARENESS OF SURGICAL HUBS IN OPERATING ROOMS";

U.S. patent application Ser. No. 15/940,670 entitled "COOPERATIVE UTILIZATION OF DATA DERIVED FROM SECONDARY SOURCES BY INTELLIGENT SURGICAL HUBS";

U.S. patent application Ser. No. 15/940,677 entitled "SURGICAL HUB CONTROL ARRANGEMENTS";

U.S. patent application Ser. No. 15/940,632 entitled "DATA STRIPPING METHOD TO INTERROGATE PATIENT RECORDS AND CREATE ANONYMIZED RECORD";

U.S. patent application Ser. No. 15/940,640 entitled "COMMUNICATION HUB AND STORAGE DEVICE FOR STORING PARAMETERS AND STATUS OF A SURGICAL DEVICE TO BE SHARED WITH CLOUD BASED ANALYTICS SYSTEMS";

U.S. patent application Ser. No. 15/940,645 entitled "SELF DESCRIBING DATA PACKETS GENERATED AT AN ISSUING INSTRUMENT";

U.S. patent application Ser. No. 15/940,649 entitled "DATA PAIRING TO INTERCONNECT A DEVICE MEASURED PARAMETER WITH AN OUTCOME";

U.S. patent application Ser. No. 15/940,654 entitled "SURGICAL HUB SITUATIONAL AWARENESS";

U.S. patent application Ser. No. 15/940,663 entitled "SURGICAL SYSTEM DISTRIBUTED PROCESSING";

U.S. patent application Ser. No. 15/940,668 entitled "AGGREGATION AND REPORTING OF SURGICAL HUB DATA";

U.S. patent application Ser. No. 15/940,671 entitled "SURGICAL HUB SPATIAL AWARENESS TO DETERMINE DEVICES IN OPERATING THEATER";

U.S. patent application Ser. No. 15/940,686 entitled "DISPLAY OF ALIGNMENT OF STAPLE CARTRIDGE TO PRIOR LINEAR STAPLE LINE";

U.S. patent application Ser. No. 15/940,700 entitled "STERILE FIELD INTERACTIVE CONTROL DISPLAYS";

U.S. patent application Ser. No. 15/940,629 entitled "COMPUTER IMPLEMENTED INTERACTIVE SURGICAL SYSTEMS";

U.S. patent application Ser. No. 15/940,704 entitled "USE OF LASER LIGHT AND RED-GREEN-BLUE COLORATION TO DETERMINE PROPERTIES OF BACK SCATTERED LIGHT";

U.S. patent application Ser. No. 15/940,722 entitled "CHARACTERIZATION OF TISSUE IRREGULARITIES THROUGH THE USE OF MONO-CHROMATIC LIGHT REFRACTIVITY";

U.S. patent application Ser. No. 15/940,742 entitled "DUAL CMOS ARRAY IMAGING";

U.S. patent application Ser. No. 15/940,636 entitled "ADAPTIVE CONTROL PROGRAM UPDATES FOR SURGICAL DEVICES";

U.S. patent application Ser. No. 15/940,653 entitled "ADAPTIVE CONTROL PROGRAM UPDATES FOR SURGICAL HUBS";

U.S. patent application Ser. No. 15/940,660 entitled "CLOUD-BASED MEDICAL ANALYTICS FOR CUSTOMIZATION AND RECOMMENDATIONS TO A USER";

U.S. patent application Ser. No. 15/940,679 entitled "CLOUD-BASED MEDICAL ANALYTICS FOR LINKING OF LOCAL USAGE TRENDS WITH THE RESOURCE ACQUISITION BEHAVIORS OF LARGER DATA SET";

U.S. patent application Ser. No. 15/940,694 entitled "CLOUD-BASED MEDICAL ANALYTICS FOR MEDICAL FACILITY SEGMENTED INDIVIDUALIZATION OF INSTRUMENT FUNCTION";

U.S. patent application Ser. No. 15/940,634 entitled "CLOUD-BASED MEDICAL ANALYTICS FOR SECURITY AND AUTHENTICATION TRENDS AND REACTIVE MEASURES";

U.S. patent application Ser. No. 15/940,706 entitled "DATA HANDLING AND PRIORITIZATION IN A CLOUD ANALYTICS NETWORK";

U.S. patent application Ser. No. 15/940,675 entitled "CLOUD INTERFACE FOR COUPLED SURGICAL DEVICES";

U.S. patent application Ser. No. 15/940,627 entitled "DRIVE ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS";

U.S. patent application Ser. No. 15/940,637 entitled "COMMUNICATION ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS";

U.S. patent application Ser. No. 15/940,642 entitled "CONTROL FOR ROBOT-ASSISTED SURGICAL PLATFORMS";

U.S. patent application Ser. No. 15/940,676 entitled "AUTOMATIC TOOL ADJUSTMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS";

U.S. patent application Ser. No. 15/940,680 entitled "CONTROL FOR ROBOT-ASSISTED SURGICAL PLATFORMS";

U.S. patent application Ser. No. 15/940,683 entitled "COOPERATIVE SURGICAL ACTION FOR ROBOT-ASSISTED SURGICAL PLATFORMS";

U.S. patent application Ser. No. 15/940,690 entitled "DISPLAY ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS"; and

U.S. patent application Ser. No. 15/940,711, entitled "SENSING ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS".

The applicant of the present patent application owns the following U.S. provisional patent applications, filed March 28, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:

U.S. provisional patent application serial No. 62/649,302 entitled "INTERACTIVE SURGICAL SYSTEMS WITH ENCRYPTED COMMUNICATION CAPABILITIES";

U.S. provisional patent application Ser. No. 62/649,294 entitled "DATA STRIPPING METHOD TO INTERROGATE PATIENT RECORDS AND CREATE ANONYMIZED RECORD";

U.S. provisional patent application Ser. No. 62/649,300 entitled "SURGICAL HUB SITUATIONAL AWARENESS";

U.S. provisional patent application Ser. No. 62/649,309 entitled "SURGICAL HUB SPATIAL AWARENESS TO DETERMINE DEVICES IN OPERATING THEATER";

U.S. provisional patent application Ser. No. 62/649,310 entitled "COMPUTER IMPLEMENTED INTERACTIVE SURGICAL SYSTEMS";

U.S. provisional patent application Ser. No. 62/649,291 entitled "USE OF LASER LIGHT AND RED-GREEN-BLUE COLORATION TO DETERMINE PROPERTIES OF BACK SCATTERED LIGHT";

U.S. provisional patent application Ser. No. 62/649,296 entitled "ADAPTIVE CONTROL PROGRAM UPDATES FOR SURGICAL DEVICES";

U.S. provisional patent application Ser. No. 62/649,333 entitled "CLOUD-BASED MEDICAL ANALYTICS FOR CUSTOMIZATION AND RECOMMENDATIONS TO A USER";

U.S. provisional patent application Ser. No. 62/649,327 entitled "CLOUD-BASED MEDICAL ANALYTICS FOR SECURITY AND AUTHENTICATION TRENDS AND REACTIVE MEASURES";

U.S. provisional patent application Ser. No. 62/649,315 entitled "DATA HANDLING AND PRIORITIZATION IN A CLOUD ANALYTICS NETWORK";

U.S. provisional patent application Ser. No. 62/649,313 entitled "CLOUD INTERFACE FOR COUPLED SURGICAL DEVICES";

U.S. provisional patent application Ser. No. 62/649,320 entitled "DRIVE ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS";

U.S. provisional patent application Ser. No. 62/649,307 entitled "AUTOMATIC TOOL ADJUSTMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS"; and

U.S. provisional patent application serial No. 62/649,323, entitled "SENSING ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS".

The applicant of the present patent application owns the following U.S. provisional patent applications, filed March 8, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:

U.S. provisional patent application Ser. No. 62/640,417 entitled "TEMPERATURE CONTROL IN ULTRASONIC DEVICE AND CONTROL SYSTEM THEREFOR"; and

U.S. provisional patent application serial No. 62/640,415 entitled "ESTIMATING STATE OF ULTRASONIC END EFFECTOR AND CONTROL SYSTEM THEREFOR".

The applicant of the present patent application owns the following U.S. provisional patent applications, filed December 28, 2017, the disclosure of each of which is incorporated herein by reference in its entirety:

U.S. provisional patent application serial No. 62/611,341, entitled "INTERACTIVE SURGICAL PLATFORM";

U.S. provisional patent application Ser. No. 62/611,340 entitled "CLOUD-BASED MEDICAL ANALYTICS"; and

U.S. provisional patent application Ser. No. 62/611,339, entitled "ROBOT ASSISTED SURGICAL PLATFORM".

Before explaining various aspects of the surgical device and generator in detail, it should be noted that the illustrative examples are not limited in application or use to the details of construction and arrangement of parts illustrated in the accompanying drawings and description. The illustrative examples may be implemented alone or in combination with other aspects, variations, and modifications, and may be practiced or carried out in various ways. Furthermore, unless otherwise indicated, the terms and expressions employed herein have been chosen for the purpose of describing the illustrative examples for the convenience of the reader and are not for the purpose of limiting the invention. Moreover, it is to be understood that one or more of the below-described aspects and/or examples may be combined with any one or more of the other below-described aspects and/or examples.

Surgical hub

Referring to fig. 1, a computer-implemented interactive surgical system 100 includes one or more surgical systems 102 and a cloud-based system (e.g., cloud 104, which may include a remote server 113 coupled to a storage device 105). Each surgical system 102 includes at least one surgical hub 106 in communication with cloud 104, which may include a remote server 113. In one example, as shown in fig. 1, the surgical system 102 includes a visualization system 108, a robotic system 110, and a handheld smart surgical instrument 112 configured to communicate with each other and/or with the hub 106. In some aspects, surgical system 102 may include M number of hubs 106, N number of visualization systems 108, O number of robotic systems 110, and P number of handheld intelligent surgical instruments 112, where M, N, O and P are integers greater than or equal to one.
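
A minimal data-structure sketch of the M/N/O/P composition described above is shown below; the field and identifier names are hypothetical.

from dataclasses import dataclass, field
from typing import List

@dataclass
class InteractiveSurgicalSystem:
    # Illustrative composition: M hubs, N visualization systems, O robotic systems,
    # and P handheld smart instruments, each count being at least one.
    hubs: List[str] = field(default_factory=list)
    visualization_systems: List[str] = field(default_factory=list)
    robotic_systems: List[str] = field(default_factory=list)
    smart_instruments: List[str] = field(default_factory=list)

    def counts(self) -> tuple:
        # Returns (M, N, O, P).
        return (len(self.hubs), len(self.visualization_systems),
                len(self.robotic_systems), len(self.smart_instruments))

system = InteractiveSurgicalSystem(
    hubs=["hub-106"],
    visualization_systems=["visualization-108"],
    robotic_systems=["robot-110"],
    smart_instruments=["instrument-112a", "instrument-112b"],
)
print(system.counts())  # (1, 1, 1, 2)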

In various aspects, the smart instrument 112 as described herein with reference to fig. 1-7 may be implemented as surgical instruments 200018 (fig. 17), 200062 (fig. 20), 200072a, b (fig. 21), 200088 and 200078a, b (fig. 23), surgical devices 200078a, b (fig. 22), and the visualization system 200086 (fig. 23). The smart instruments 112 (e.g., devices 1a-1n), such as surgical instruments 200018 (fig. 17), 200062 (fig. 20), 200072a, b (fig. 21), 200088 and 200078a, b (fig. 23), surgical devices 200078a, b (fig. 22), and the visualization system 200086 (fig. 23), are configured to be operable in the surgical data network 201, as described with reference to fig. 8.

Fig. 2 shows an example of a surgical system 102 for performing a surgical procedure on a patient lying on an operating table 114 in a surgical room 116. The robotic system 110 is used as part of the surgical system 102 during surgery. The robotic system 110 includes a surgeon's console 118, a patient side cart 120 (surgical robot), and a surgical robot hub 122. The patient side cart 120 can manipulate at least one removably coupled surgical tool 117 through a minimally invasive incision in the patient's body while the surgeon views the surgical site through the surgeon's console 118. An image of the surgical site may be obtained by the medical imaging device 124, which may be manipulated by the patient side cart 120 to orient the imaging device 124. The robot hub 122 may be used to process images of the surgical site for subsequent display to the surgeon via the surgeon's console 118.

Other types of robotic systems may be readily adapted for use with the surgical system 102. Various examples of robotic systems and surgical tools suitable for use in the present disclosure are described in U.S. provisional patent application serial No. 62/611,339 entitled "ROBOT ASSISTED SURGICAL PLATFORM," filed December 28, 2017, the disclosure of which is incorporated herein by reference in its entirety.

Various examples of cloud-based analysis performed by the cloud 104 and suitable for use with the present disclosure are described in U.S. provisional patent application serial No. 62/611,340 entitled "CLOUD-BASED MEDICAL ANALYTICS," filed December 28, 2017, the disclosure of which is incorporated herein by reference in its entirety.

In various aspects, the imaging device 124 includes at least one image sensor and one or more optical components. Suitable image sensors include, but are not limited to, Charge Coupled Device (CCD) sensors and Complementary Metal Oxide Semiconductor (CMOS) sensors.

The optical components of the imaging device 124 may include one or more illumination sources and/or one or more lenses. One or more illumination sources may be directed to illuminate portions of the surgical field. The one or more image sensors may receive light reflected or refracted from the surgical field, including light reflected or refracted from tissue and/or surgical instruments.

The one or more illumination sources may be configured to radiate electromagnetic energy in the visible spectrum as well as in the invisible spectrum. The visible spectrum (sometimes referred to as the optical spectrum or the luminous spectrum) is that portion of the electromagnetic spectrum that is visible to (i.e., detectable by) the human eye and may be referred to as visible light or simply light. A typical human eye will respond to wavelengths in air from about 380 nm to about 750 nm.

The invisible spectrum (i.e., the non-luminous spectrum) is the portion of the electromagnetic spectrum that lies below and above the visible spectrum (i.e., wavelengths below about 380 nm and above about 750 nm). The human eye cannot detect the invisible spectrum. Wavelengths greater than about 750 nm are longer than the red visible spectrum, and they become invisible infrared (IR), microwave, and radio electromagnetic radiation. Wavelengths less than about 380 nm are shorter than the violet spectrum, and they become invisible ultraviolet, x-ray, and gamma-ray electromagnetic radiation.
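
The approximate 380 nm and 750 nm limits quoted above can be captured in a small helper, shown here only as a worked example of those boundary values.

def classify_wavelength_nm(wavelength_nm: float) -> str:
    # Classify a wavelength against the approximate visible-band limits cited above.
    if wavelength_nm < 380.0:
        return "invisible (ultraviolet, x-ray, or gamma-ray)"
    if wavelength_nm <= 750.0:
        return "visible"
    return "invisible (infrared, microwave, or radio)"

print(classify_wavelength_nm(550.0))   # visible
print(classify_wavelength_nm(1000.0))  # invisible (infrared, microwave, or radio)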

In various aspects, the imaging device 124 is configured for use in minimally invasive surgery. Examples of imaging devices suitable for use in the present disclosure include, but are not limited to, arthroscopes, angioscopes, bronchoscopes, cholangioscopes, colonoscopes, cystoscopes, duodenoscopes, enteroscopes, esophagogastro-duodenoscopes (gastroscopes), endoscopes, laryngoscopes, nasopharyngo-nephroscopes, sigmoidoscopes, thoracoscopes, and intrauterine scopes.

In one aspect, the imaging device employs multispectral monitoring to distinguish topography from underlying structures. A multispectral image is an image that captures image data across a particular range of wavelengths of the electromagnetic spectrum. The wavelengths may be separated by filters or by using instruments that are sensitive to specific wavelengths, including light from frequencies outside the visible range, such as IR and ultraviolet. Spectral imaging may allow extraction of additional information that the human eye fails to capture with its red, green, and blue receptors. The use of multispectral Imaging is described in more detail under the heading "Advanced Imaging Acquisition Module" of U.S. provisional patent application serial No. 62/611,341, entitled "INTERACTIVE SURGICAL PLATFORM," filed on 28.12.2017, the disclosure of which is incorporated herein by reference in its entirety. Multispectral monitoring may be a useful tool for repositioning the surgical site after completion of a surgical task to perform one or more of the previously described tests on the treated tissue.

It is self-evident that strict sterilization of the operating room and surgical equipment is required during any surgical procedure. The stringent hygiene and sterilization conditions required in a "surgical room" (i.e., an operating room or treatment room) require the highest possible sterility of all medical devices and equipment. Part of this sterilization process is the need to sterilize anything that comes into contact with the patient or penetrates the sterile field, including the imaging device 124 and its attachments and components. It should be understood that the sterile field may be considered a designated area that is considered free of microorganisms, such as within a tray or within a sterile towel, or the sterile field may be considered an area around the patient that has been prepared for a surgical procedure. The sterile field may include the scrubbed team members, who are properly attired, as well as all equipment and fixtures in the field.

In various aspects, the visualization system 108 includes one or more imaging sensors, one or more image processing units, one or more storage arrays, and one or more displays, which are strategically arranged relative to the sterile zone, as shown in fig. 2. In one aspect, the visualization system 108 includes interfaces for HL7, PACS, and EMR. Various components of the visualization system 108 are described under the heading "Advanced Imaging Acquisition Module" of U.S. provisional patent application serial No. 62/611,341, entitled "INTERACTIVE SURGICAL PLATFORM," filed on 28.12.2017, the disclosure of which is incorporated by reference herein in its entirety.

As shown in fig. 2, a main display 119 is positioned in the sterile field to be visible to the operator at the surgical table 114. Further, the visualization tower 111 is positioned outside the sterile field. Visualization tower 111 includes a first non-sterile display 107 and a second non-sterile display 109 facing away from each other. The visualization system 108 guided by the hub 106 is configured to be able to utilize the displays 107, 109, and 119 to coordinate the flow of information to operators inside and outside the sterile zone. For example, the hub 106 may cause the visualization system 108 to display a snapshot of the surgical site recorded by the imaging device 124 on the non-sterile display 107 or 109 while maintaining a real-time feed of the surgical site on the main display 119. A snapshot on non-sterile display 107 or 109 may allow a non-sterile operator to, for example, perform diagnostic steps associated with a surgical procedure.

In one aspect, hub 106 is further configured to be able to route diagnostic inputs or feedback entered by non-sterile operators at visualization tower 111 to a main display 119 within the sterile field, where it can be viewed by the sterile operator on the operating floor. In one example, the input may be a modified form of a snapshot displayed on non-sterile display 107 or 109, which may be routed through hub 106 to main display 119.
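
The hub-directed routing described above (a live feed kept on the sterile-field display, with snapshots and feedback exchanged with the non-sterile displays) can be sketched as follows; the class and method names are illustrative assumptions rather than an actual interface of the system.

class DisplayRouter:
    # Illustrative hub-directed routing between sterile and non-sterile displays.
    def __init__(self):
        self.primary_display = []                        # main display 119 inside the sterile field
        self.non_sterile_displays = {107: [], 109: []}   # displays on the visualization tower

    def show_live_feed(self, frame: str) -> None:
        # The real-time surgical-site feed stays on the primary (sterile-field) display.
        self.primary_display.append(("live", frame))

    def push_snapshot(self, display_id: int, snapshot: str) -> None:
        # A recorded snapshot is sent to a non-sterile display at the visualization tower.
        self.non_sterile_displays[display_id].append(("snapshot", snapshot))

    def route_feedback_to_sterile_field(self, annotated: str) -> None:
        # Feedback entered by the non-sterile operator is routed back to the primary display.
        self.primary_display.append(("feedback", annotated))

router = DisplayRouter()
router.show_live_feed("frame-0001")
router.push_snapshot(107, "site-snapshot")
router.route_feedback_to_sterile_field("site-snapshot+annotation")
print(router.primary_display)  # [('live', 'frame-0001'), ('feedback', 'site-snapshot+annotation')]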

Referring to fig. 2, a surgical instrument 112 is used in surgery as part of the surgical system 102. Hub 106 is further configured to coordinate the flow of information to the display of surgical instrument 112. For example, the coordinated information flow is further described in U.S. provisional patent application serial No. 62/611,341 entitled "INTERACTIVE SURGICAL PLATFORM," filed on 28.12.2017, the disclosure of which is incorporated herein by reference in its entirety. Diagnostic inputs or feedback entered by a non-sterile operator at the visualization tower 111 may be routed by the hub 106 to a surgical instrument display 115 within the sterile field, where the inputs or feedback may be viewed by the operator of the surgical instrument 112. Exemplary Surgical instruments suitable for use in the Surgical system 102 are described, for example, under the heading "Surgical Instrument Hardware" of U.S. provisional patent application serial No. 62/611,341 entitled "INTERACTIVE SURGICAL PLATFORM," filed 2017, 12, 28, the disclosure of which is incorporated herein by reference in its entirety.

Referring now to fig. 3, hub 106 is depicted in communication with visualization system 108, robotic system 110, and handheld intelligent surgical instrument 112. The hub 106 includes a hub display 135, an imaging module 138, a generator module 140 (which may include a monopolar generator 142, a bipolar generator 144, and/or an ultrasound generator 143), a communication module 130, a processor module 132, and a memory array 134. In certain aspects, as shown in fig. 3, the hub 106 further includes a smoke evacuation module 126, a suction/irrigation module 128, and/or an operating room mapping module 133.

During surgery, the application of energy to tissue for sealing and/or cutting is typically associated with smoke evacuation, aspiration of excess fluid, and/or irrigation of the tissue. Fluid lines, power lines, and/or data lines from different sources are often tangled during surgery. Addressing this problem during surgery can waste valuable time. Untangling the lines may require disconnecting them from their respective modules, which may require resetting the modules. The hub modular housing 136 provides a unified environment for managing power, data, and fluid lines, which reduces the frequency of entanglement between such lines.

Aspects of the present disclosure provide a surgical hub for use in a surgical procedure involving the application of energy to tissue at a surgical site. The surgical hub includes a hub housing and a combined generator module slidably received in a docking station of the hub housing. The docking station includes data contacts and power contacts. The combined generator module includes two or more of an ultrasonic energy generator component, a bipolar RF energy generator component, and a monopolar RF energy generator component seated in a single unit. In one aspect, the combined generator module further includes a smoke evacuation component, at least one energy delivery cable for connecting the combined generator module to a surgical instrument, at least one smoke evacuation component configured to evacuate smoke, fluid, and/or particulates generated by the application of therapeutic energy to the tissue, and a fluid line extending from the remote surgical site to the smoke evacuation component.

In one aspect, the fluid line is a first fluid line and the second fluid line extends from the remote surgical site to a suction and irrigation module slidably received in the hub housing. In one aspect, the hub housing includes a fluid interface.

Certain surgical procedures may require more than one energy type to be applied to tissue. One energy type may be more advantageous for cutting tissue, while a different energy type may be more advantageous for sealing tissue. For example, a bipolar generator may be used to seal tissue, while an ultrasonic generator may be used to cut the sealed tissue. Aspects of the present disclosure provide a solution in which the hub modular housing 136 is configured to accommodate different generators and facilitate interactive communication therebetween. One of the advantages of the hub modular housing 136 is the ability to quickly remove and/or replace various modules.

Aspects of the present disclosure provide a modular surgical housing for use in a surgical procedure involving the application of energy to tissue. The modular surgical housing includes a first energy generator module configured to generate a first energy for application to tissue, and a first docking station comprising a first docking port that includes first data and power contacts, wherein the first energy generator module is slidably movable into electrical engagement with the power and data contacts, and wherein the first energy generator module is slidably movable out of electrical engagement with the first power and data contacts.

In addition to the above, the modular surgical housing further comprises a second energy generator module configured to generate a second energy, different from the first energy, for application to tissue, and a second docking station comprising a second docking port that includes second data and power contacts, wherein the second energy generator module is slidably movable into electrical engagement with the power and data contacts, and wherein the second energy generator module is slidably movable out of electrical engagement with the second power and data contacts.

In addition, the modular surgical housing further includes a communication bus between the first docking port and the second docking port configured to facilitate communication between the first energy generator module and the second energy generator module.
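
As a rough sketch of the modular-housing arrangement just described, the following Python example models docking stations that share a communication bus; the names and message format are hypothetical.

class ModularHousing:
    # Illustrative sketch: docking stations whose docked modules share a communication bus.
    def __init__(self):
        self.docked = {}    # docking port number -> module name
        self.bus_log = []   # messages exchanged over the shared backplane

    def dock(self, port: int, module: str) -> None:
        # Sliding a module into a docking station engages its power and data contacts.
        self.docked[port] = module

    def undock(self, port: int) -> None:
        # Sliding a module out disengages it from the housing.
        self.docked.pop(port, None)

    def bus_send(self, src_port: int, dst_port: int, message: str) -> bool:
        # The bus lets docked energy generator modules coordinate with one another.
        if src_port in self.docked and dst_port in self.docked:
            self.bus_log.append((self.docked[src_port], self.docked[dst_port], message))
            return True
        return False

housing = ModularHousing()
housing.dock(1, "ultrasonic-generator")
housing.dock(2, "bipolar-rf-generator")
print(housing.bus_send(1, 2, "coordinate-energy-delivery"))  # True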

Referring to fig. 3-7, aspects of the present disclosure are presented as a hub modular housing 136 that allows for modular integration of the generator module 140, the smoke evacuation module 126, and the suction/irrigation module 128. The hub modular housing 136 also facilitates interactive communication between the modules 140, 126, 128. As shown in fig. 5, the generator module 140 may be a generator module with integrated monopolar, bipolar, and ultrasonic components supported in a single housing unit 139 that is slidably inserted into the hub modular housing 136. As shown in fig. 5, the generator module 140 may be configured to be connectable to a monopolar device 146, a bipolar device 147, and an ultrasound device 148. Alternatively, the generator module 140 may include a series of monopolar generator modules, bipolar generator modules, and/or ultrasonic generator modules that interact through the hub modular housing 136. The hub modular housing 136 can be configured to facilitate the insertion of multiple generators and the interactive communication between generators docked into the hub modular housing 136 such that the generators act as a single generator.

In one aspect, the hub modular housing 136 includes a modular power and communications backplane 149 having external and wireless communications connections to enable removable attachment of the modules 140, 126, 128 and interactive communications therebetween.

In one aspect, the hub modular housing 136 includes a docking station or drawer 151, also referred to herein as a drawer, configured to slidably receive the modules 140, 126, 128. Fig. 4 illustrates a partial perspective view of the surgical hub housing 136 and a combined generator module 145 slidably received in a docking station 151 of the surgical hub housing 136. A docking port 152 with power and data contacts on a rear side of the combined generator module 145 is configured to engage a corresponding docking port 150 with power and data contacts of a corresponding docking station 151 of the hub modular housing 136 as the combined generator module 145 is slid into position within the corresponding docking station 151 of the hub modular housing 136. In one aspect, the combined generator module 145 includes bipolar, ultrasonic, and monopolar modules integrated together into a single housing unit 139, as shown in fig. 5.

In various aspects, the smoke evacuation module 126 includes a fluid line 154 that conveys captured/collected smoke and/or fluid away from the surgical site and to, for example, the smoke evacuation module 126. Vacuum suction originating from the smoke evacuation module 126 may draw the smoke into an opening of a utility conduit at the surgical site. The utility conduit, coupled to the fluid line 154, may be in the form of a flexible tube terminating at the smoke evacuation module 126. The utility conduit and the fluid line 154 define a fluid path that extends toward the smoke evacuation module 126 received in the hub housing 136.

In various aspects, the suction/irrigation module 128 is coupled to a surgical tool that includes an irrigation fluid line and a suction fluid line. In one example, the irrigation fluid line and the suction fluid line are in the form of flexible tubes extending from the surgical site toward the suction/irrigation module 128. One or more drive systems may be configured to irrigate fluid to, and aspirate fluid from, the surgical site.

In one aspect, a surgical tool includes a shaft having an end effector at a distal end thereof, at least one energy treatment associated with the end effector, a suction tube, and an irrigation tube. The suction tube may have an inlet at a distal end thereof, and the suction tube extends through the shaft. Similarly, the irrigation tube may extend through the shaft and may have an inlet adjacent the energy delivery tool. The energy delivery tool is configured to deliver ultrasonic and/or RF energy to the surgical site and is coupled to the generator module 140 by a cable that extends initially through the shaft.

The irrigation tube may be in fluid communication with a fluid source, and the aspiration tube may be in fluid communication with a vacuum source. The fluid source and/or vacuum source may be seated in the suction/irrigation module 128. In one example, the fluid source and/or vacuum source may be seated in the hub housing 136 independently of the suction/irrigation module 128. In such examples, the fluid interface can connect the suction/irrigation module 128 to a fluid source and/or a vacuum source.

In one aspect, the modules 140, 126, 128 on the hub modular housing 136 and/or their corresponding docking stations may include alignment features configured to enable alignment of the docking ports of the modules into engagement with their corresponding ports in the docking stations of the hub modular housing 136. For example, as shown in fig. 4, the combined generator module 145 includes side brackets 155 configured to be slidably engageable with corresponding brackets 156 of corresponding docking stations 151 of the hub modular housing 136. The brackets cooperate to guide the docking port contacts of the combined generator module 145 into electrical engagement with the docking port contacts of the hub modular housing 136.

In some aspects, the drawers 151 of the hub modular housing 136 are the same or substantially the same size, and the modules are sized to be received in the drawers 151. For example, the side brackets 155 and/or 156 may be larger or smaller depending on the size of the module. In other aspects, the drawers 151 are sized differently and are each designed to accommodate a particular module.

In addition, the contacts of a particular module may be keyed to engage the contacts of a particular drawer to avoid inserting the module into a drawer having unmatched contacts.
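
By way of illustration only, the keying concept described above can be modeled in software as a simple compatibility check; the Python sketch below uses invented module and drawer identifiers that are not part of this disclosure.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class DockingDrawer:
        drawer_id: str
        contact_key: str  # physical/electrical keying pattern of the drawer contacts

    @dataclass(frozen=True)
    class Module:
        module_id: str
        contact_key: str  # keying pattern on the module's docking port

    def can_dock(module: Module, drawer: DockingDrawer) -> bool:
        # A module seats only in a drawer whose contact keying matches its own.
        return module.contact_key == drawer.contact_key

    generator_drawer = DockingDrawer("drawer-1", "GEN-A")
    smoke_drawer = DockingDrawer("drawer-2", "SMOKE-B")
    combined_generator = Module("combined-generator", "GEN-A")

    assert can_dock(combined_generator, generator_drawer)    # matched contacts
    assert not can_dock(combined_generator, smoke_drawer)    # unmatched contacts rejected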

As shown in fig. 4, the docking port 150 of one drawer 151 may be coupled to the docking port 150 of another drawer 151 by a communication link 157 to facilitate interactive communication between modules seated in the hub modular housing 136. Alternatively or additionally, the docking port 150 of the hub modular housing 136 can facilitate wireless interactive communication between modules seated in the hub modular housing 136. Any suitable wireless communication may be employed, such as, for example, Air Titan-Bluetooth.

Fig. 6 illustrates a single power bus attachment for multiple lateral docking ports of a lateral modular housing 160 configured to receive multiple modules of a surgical hub 206. The lateral modular housing 160 is configured to laterally receive and interconnect the modules 161. The modules 161 are slidably inserted into docking stations 162 of the lateral modular housing 160, which includes a backplane for interconnecting the modules 161. As shown in fig. 6, the modules 161 are arranged laterally in the lateral modular housing 160. Alternatively, the modules 161 may be arranged vertically in a lateral modular housing.

Fig. 7 illustrates a vertical modular housing 164 configured to receive a plurality of modules 165 of the surgical hub 106. The modules 165 are slidably inserted into docking stations or drawers 167 of the vertical modular housing 164, which includes a backplane for interconnecting the modules 165. Although the drawers 167 of the vertical modular housing 164 are arranged vertically, in some cases the vertical modular housing 164 may include laterally arranged drawers. Further, the modules 165 may interact with one another through the docking ports of the vertical modular housing 164. In the example of fig. 7, a display 177 is provided for displaying data related to the operation of the modules 165. In addition, the vertical modular housing 164 includes a main module 178 that seats a plurality of sub-modules slidably received in the main module 178.

In various aspects, the imaging module 138 includes an integrated video processor and a modular light source, and is adapted for use with a variety of imaging devices. In one aspect, the imaging device is constructed of a modular housing that can be fitted with a light source module and a camera module. The housing may be a disposable housing. In at least one example, the disposable housing is removably coupled to a reusable controller, the light source module, and the camera module. The light source module and/or the camera module may be selected according to the type of surgical procedure. In one aspect, the camera module includes a CCD sensor. In another aspect, the camera module includes a CMOS sensor. In another aspect, the camera module is configured for scanning beam imaging. Also, the light source module may be configured to be capable of delivering white light or different light depending on the surgical procedure.

During a surgical procedure, it may be inefficient to remove a surgical device from a surgical site and replace the surgical device with another surgical device that includes a different camera or a different light source. Temporary loss of vision at the surgical site can lead to undesirable consequences. The modular imaging apparatus of the present disclosure is configured to enable the replacement of a light source module or a camera module during a surgical procedure without having to remove the imaging apparatus from the surgical site.

In one aspect, an imaging device includes a tubular housing including a plurality of channels. The first channel is configured to slidably receive a camera module that may be configured for snap-fit engagement with the first channel. The second channel is configured to slidably receive a light source module that may be configured for snap-fit engagement with the second channel. In another example, the camera module and/or the light source module may be rotated within their respective channels to a final position. Threaded engagement may be used instead of snap-fit engagement.

In various examples, multiple imaging devices are placed at different locations in a surgical field to provide multiple views. The imaging module 138 may be configured to be able to switch between imaging devices to provide an optimal view. In various aspects, the imaging module 138 may be configured to be able to integrate images from different imaging devices.
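
As a rough, purely illustrative sketch of the switching behavior described above (the device names and scoring are invented for this example and are not part of the disclosure), an imaging module could select the feed reporting the best current view:

    from dataclasses import dataclass

    @dataclass
    class ImagingDevice:
        name: str
        view_quality: float  # e.g., a visibility/focus score reported for the current view

    def select_best_view(devices: list[ImagingDevice]) -> ImagingDevice:
        # Switch the displayed feed to whichever imaging device reports the best view.
        return max(devices, key=lambda d: d.view_quality)

    feeds = [ImagingDevice("scope-anterior", 0.72), ImagingDevice("scope-lateral", 0.91)]
    active_feed = select_best_view(feeds)   # the imaging module would display "scope-lateral"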

Various image processors and imaging devices suitable for use with the present disclosure are described in U.S. Patent No. 7,995,045, entitled COMBINED SBI AND CONVENTIONAL IMAGE PROCESSOR, issued August 9, 2011, which is incorporated by reference herein in its entirety. Further, U.S. Patent No. 7,982,776, entitled SBI MOTION ARTIFACT REMOVAL APPARATUS AND METHOD, issued July 19, 2011, which is incorporated herein by reference in its entirety, describes various systems for removing motion artifacts from image data. Such systems may be integrated with the imaging module 138. Further, U.S. Patent Application Publication No. 2011/0306840, entitled CONTROLLABLE MAGNETIC SOURCE TO FIXTURE INTRACORPOREAL APPARATUS, published December 15, 2011, and U.S. Patent Application Publication No. 2014/0243597, entitled SYSTEM FOR PERFORMING A MINIMALLY INVASIVE SURGICAL PROCEDURE, published August 28, 2014, are each incorporated herein by reference in their entirety.

Fig. 8 illustrates a surgical data network 201 including a modular communication hub 203 configured to enable connection of modular devices located in one or more operating rooms of a medical facility, or in any room of a medical facility specially equipped for surgical operations, to a cloud-based system (e.g., a cloud 204 that may include a remote server 213 coupled to a storage device 205). In one aspect, the modular communication hub 203 includes a network hub 207 and/or a network switch 209 in communication with a network router. The modular communication hub 203 may also be coupled to a local computer system 210 to provide local computer processing and data manipulation. The surgical data network 201 may be configured to be passive, intelligent, or switching. A passive surgical data network acts as a conduit for data, enabling it to be transferred from one device (or segment) to another device (or segment) as well as to cloud computing resources. An intelligent surgical data network includes additional features to enable monitoring of traffic through the surgical data network and to configure each port in the network hub 207 or network switch 209. An intelligent surgical data network may be referred to as a manageable hub or switch. A switching hub reads the destination address of each packet and then forwards the packet to the correct port.

Modular devices 1a-1n located in an operating room may be coupled to the modular communication hub 203. The network hub 207 and/or the network switch 209 may be coupled to a network router 211 to connect the devices 1a-1n to the cloud 204 or the local computer system 210. Data associated with the devices 1a-1n may be transmitted via the router to the cloud-based computers for remote data processing and manipulation. Data associated with the devices 1a-1n may also be transmitted to the local computer system 210 for local data processing and manipulation. Modular devices 2a-2m located in the same operating room may also be coupled to the network switch 209. The network switch 209 may be coupled to the network hub 207 and/or the network router 211 to connect the devices 2a-2m to the cloud 204. Data associated with the devices 2a-2m may be transmitted via the network router 211 to the cloud 204 for data processing and manipulation. Data associated with the devices 2a-2m may also be transmitted to the local computer system 210 for local data processing and manipulation.

It should be understood that the surgical data network 201 may be expanded by interconnecting multiple network hubs 207 and/or multiple network switches 209 with multiple network routers 211. The modular communication hub 203 may be housed in a modular control tower configured to receive a plurality of devices 1a-1n/2a-2m. The local computer system 210 may also be contained in a modular control tower. The modular communication hub 203 is connected to a display 212 to display images obtained by some of the devices 1a-1n/2a-2m, for example, during surgery. In various aspects, the devices 1a-1n/2a-2m may include, for example, an imaging module 138 coupled to an endoscope, a generator module 140 coupled to an energy-based surgical device, a smoke evacuation module 126, a suction/irrigation module 128, a communication module 130, a processor module 132, a memory array 134, a surgical device connected to a display, a non-contact sensor module, and/or other modular devices that may be connected to the modular communication hub 203 of the surgical data network 201.

In one aspect, the surgical data network 201 may include a combination of network hubs, network switches, and network routers that connect the devices 1a-1n/2a-2m to the cloud. Any or all of the devices 1a-1n/2a-2m coupled to the network hub or network switch may collect data in real time and transmit the data to cloud computers for data processing and manipulation. It should be appreciated that cloud computing relies on shared computing resources rather than using local servers or personal devices to process software applications. The term "cloud" may be used as a metaphor for "the Internet," although the term is not so limited. Accordingly, the term "cloud computing" may be used herein to refer to a type of Internet-based computing in which different services (such as servers, storage devices, and applications) are delivered over the Internet to the modular communication hub 203 and/or the computer system 210 located in a surgical room (e.g., a fixed, mobile, temporary, or field operating room or space) and to devices connected to the modular communication hub 203 and/or the computer system 210. The cloud infrastructure may be maintained by a cloud service provider. In this case, the cloud service provider may be an entity that coordinates the use and control of the devices 1a-1n/2a-2m located in one or more operating rooms. Cloud computing services can perform a large amount of computation based on data collected by smart surgical instruments, robots, and other computerized devices located in the operating room. The hub hardware enables multiple devices or connections to be connected to a computer that communicates with the cloud computing resources and storage devices.
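
The following Python sketch is purely illustrative of the data flow just described (devices feeding both local and cloud processing); the record fields and queue names are invented for this example and are not taken from the disclosure.

    # Stand-ins for the local computer system and the cloud upload path.
    local_queue: list[dict] = []
    cloud_queue: list[dict] = []

    def route_device_data(record: dict, to_local: bool = True, to_cloud: bool = True) -> None:
        # Data collected by a modular device may be processed locally, in the cloud, or both.
        if to_local:
            local_queue.append(record)
        if to_cloud:
            cloud_queue.append(record)

    route_device_data({"device": "1a", "metric": "smoke_flow", "value": 3.2})
    route_device_data({"device": "2b", "metric": "generator_power_w", "value": 45.0}, to_local=False)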

By applying cloud computing data processing techniques to the data collected by the devices 1a-1n/2a-2m, the surgical data network can provide improved surgical outcomes, reduced costs, and improved patient satisfaction. At least some of the devices 1a-1n/2a-2m may be employed to observe tissue conditions to assess leakage or perfusion of sealed tissue after tissue sealing and cutting procedures. At least some of the devices 1a-1n/2a-2m may be employed to identify pathologies, such as the effects of disease, using cloud-based computing to examine data including images of body tissue samples for diagnostic purposes. This includes localization and margin confirmation of tissues and phenotypes. At least some of the devices 1a-1n/2a-2m may be employed to identify anatomical structures of the body using various sensors integrated with imaging devices and techniques, such as overlaying images captured by multiple imaging devices. The data (including image data) collected by the devices 1a-1n/2a-2m may be transmitted to the cloud 204 or the local computer system 210, or both, for data processing and manipulation, including image processing and manipulation. Such data analysis may further employ outcome analysis processing, and the use of standardized approaches may provide beneficial feedback to confirm surgical treatments and surgeon behaviors, or to suggest modifications thereto.

In one implementation, the operating room devices 1a-1n may be connected to the modular communication hub 203 through a wired channel or a wireless channel, depending on the configuration of the devices 1a-1n with respect to the network hub. In one aspect, the hub 207 may be implemented as a local network broadcaster operating at the physical layer of the Open Systems Interconnection (OSI) model. The hub provides connectivity to the devices 1a-1n located in the same operating room network. The hub 207 collects data in the form of packets and transmits them to the router in half-duplex mode. The hub 207 does not store any media access control/internet protocol (MAC/IP) addresses used to transfer device data. Only one of the devices 1a-1n can transmit data through the hub 207 at a time. The hub 207 has no routing tables or intelligence regarding where to send information; it broadcasts all network data across each connection, as well as to the remote server 213 (fig. 9) through the cloud 204. The hub 207 can detect basic network errors such as collisions, but broadcasting all information to multiple ports can present a security risk and lead to bottlenecks.

In another implementation, the operating room devices 2a-2m may be connected to the network switch 209 via a wired channel or a wireless channel. Network switch 209 operates in the data link layer of the OSI model. The network switch 209 is a multicast device for connecting devices 2a-2m located in the same operating room to the network. Network switch 209 sends data in frames to network router 211 and operates in full duplex mode. Multiple devices 2a-2m may transmit data simultaneously through the network switch 209. The network switch 209 stores and uses the MAC addresses of the devices 2a-2m to transmit data.

The network hub 207 and/or the network switch 209 are coupled to the network router 211 to connect to the cloud 204. The network router 211 operates in the network layer of the OSI model. The network router 211 creates routes for transmitting data packets received from the network hub 207 and/or the network switch 209 to cloud-based computer resources for further processing and manipulation of the data collected by any or all of the devices 1a-1n/2a-2m. The network router 211 may be employed to connect two or more different networks located at different locations, such as, for example, different operating rooms of the same medical facility or different networks located in different operating rooms of different medical facilities. The network router 211 sends data in packets to the cloud 204 and operates in full-duplex mode. Multiple devices may transmit data simultaneously. The network router 211 transmits data using IP addresses.
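
The contrast drawn above between the network hub (a physical-layer repeater with no address table) and the network switch (a data-link-layer device that learns MAC addresses) can be summarized in the following simplified, illustrative Python sketch; the class and field names are invented for this example.

    class NetworkHub:
        # Physical-layer repeater: every frame is repeated on all other ports.
        def __init__(self, num_ports: int):
            self.ports = {p: [] for p in range(num_ports)}

        def receive(self, in_port: int, frame: dict) -> None:
            for port, queue in self.ports.items():
                if port != in_port:
                    queue.append(frame)          # no MAC/IP awareness; broadcast to all

    class NetworkSwitch:
        # Data-link-layer device: learns source MACs and forwards by destination MAC.
        def __init__(self, num_ports: int):
            self.ports = {p: [] for p in range(num_ports)}
            self.mac_table: dict[str, int] = {}

        def receive(self, in_port: int, frame: dict) -> None:
            self.mac_table[frame["src_mac"]] = in_port        # learn where the sender lives
            out_port = self.mac_table.get(frame["dst_mac"])
            if out_port is None:                              # unknown destination: flood
                for port, queue in self.ports.items():
                    if port != in_port:
                        queue.append(frame)
            else:
                self.ports[out_port].append(frame)            # forward only to the learned port

    switch = NetworkSwitch(num_ports=4)
    switch.receive(0, {"src_mac": "AA", "dst_mac": "BB", "payload": b"hello"})  # floods
    switch.receive(1, {"src_mac": "BB", "dst_mac": "AA", "payload": b"reply"})  # port 0 only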

In one example, hub 207 may be implemented as a USB hub, which allows multiple USB devices to be connected to a host. A USB hub may extend a single USB port to multiple tiers so that more ports are available for connecting devices to a host system computer. Hub 207 may include wired or wireless capabilities for receiving information over a wired channel or a wireless channel. In one aspect, a wireless USB short-range, high bandwidth wireless radio communication protocol may be used for communication between devices 1a-1n and devices 2a-2m located in an operating room.

In other examples, the operating room devices 1a-1n/2a-2m may communicate with the modular communication hub 203 via the Bluetooth wireless technology standard for exchanging data between fixed and mobile devices over short distances (using short-wavelength UHF radio waves of 2.4 to 2.485 GHz in the ISM band) and building personal area networks (PANs). In other aspects, the operating room devices 1a-1n/2a-2m may communicate with the modular communication hub 203 via a variety of wireless or wired communication standards or protocols, including but not limited to Wi-Fi (IEEE 802.11 series), WiMAX (IEEE 802.16 series), IEEE 802.20, Long Term Evolution (LTE) and Ev-DO, HSPA+, HSDPA+, HSUPA+, EDGE, GSM, GPRS, CDMA, TDMA, DECT, Ethernet, derivatives thereof, as well as any other wireless and wired protocols designated 3G, 4G, 5G, and beyond. The computing module may include a plurality of communication modules. For example, a first communication module may be dedicated to shorter-range wireless communications such as Wi-Fi and Bluetooth, and a second communication module may be dedicated to longer-range wireless communications such as GPS, EDGE, GPRS, CDMA, WiMAX, LTE, Ev-DO, and the like.

The modular communication hub 203 may serve as a central connection for one or all of the operating room devices 1a-1n/2a-2m and handle a data type called a frame. The frames carry data generated by the devices 1a-1n/2a-2 m. When the modular communication hub 203 receives the frame, it is amplified and transmitted to the network router 211, which transmits the data to the cloud computing resources using a plurality of wireless or wired communication standards or protocols as described herein.

Modular communication hub 203 may be used as a stand-alone device or connected to a compatible network hub and network switch to form a larger network. The modular communication hub 203 is generally easy to install, configure and maintain, making it a good option to network the operating room devices 1a-1n/2a-2 m.

Fig. 9 illustrates a computer-implemented interactive surgical system 200. The computer-implemented interactive surgical system 200 is similar in many respects to the computer-implemented interactive surgical system 100. For example, the computer-implemented interactive surgical system 200 includes one or more surgical systems 202 that are similar in many respects to the surgical system 102. Each surgical system 202 includes at least one surgical hub 206 in communication with a cloud 204, which may include a remote server 213. In one aspect, the computer-implemented interactive surgical system 200 includes a modular control tower 236 that is connected to a plurality of operating room devices, such as, for example, intelligent surgical instruments, robots, and other computerized devices located in an operating room. As shown in fig. 10, the modular control tower 236 includes a modular communication hub 203 coupled to the computer system 210. As shown in the example of fig. 9, the modular control tower 236 is coupled to an imaging module 238 coupled to an endoscope 239, a generator module 240 coupled to an energy device 241, a smoke evacuation module 226, a suction/irrigation module 228, a communication module 230, a processor module 232, a storage array 234, a smart device/instrument 235 optionally coupled to a display 237, and a non-contact sensor module 242. The operating room devices are coupled to cloud computing resources and data storage via the modular control tower 236. A robot hub 222 may also be connected to the modular control tower 236 and the cloud computing resources. The devices/instruments 235, visualization system 208, etc. may be coupled to the modular control tower 236 via wired or wireless communication standards or protocols, as described herein. The modular control tower 236 may be coupled to a hub display 215 (e.g., monitor, screen) to display and overlay images received from the imaging module, device/instrument display, and/or other visualization systems 208. The hub display may also display data received from devices connected to the modular control tower in combination with images and overlaid images.

Fig. 10 shows the surgical hub 206 including a plurality of modules coupled to a modular control tower 236. The modular control tower 236 includes a modular communication hub 203 (e.g., a network connectivity device) and a computer system 210 to provide, for example, local processing, visualization, and imaging. As shown in fig. 10, the modular communication hub 203 may be connected in a hierarchical configuration to expand the number of modules (e.g., devices) that may be connected to the modular communication hub 203 and transmit data associated with the modules to the computer system 210, cloud computing resources, or both. As shown in fig. 10, each of the network hubs/switches in modular communication hub 203 includes three downstream ports and one upstream port. The upstream hub/switch is connected to the processor to provide a communication connection with the cloud computing resources and the local display 217. Communication with the cloud 204 may be through a wired or wireless communication channel.

The surgical hub 206 employs the non-contact sensor module 242 to measure the dimensions of the operating room and uses ultrasonic or laser-type non-contact measurement devices to generate a map of the operating room. An ultrasound-based non-contact sensor module scans the operating room by transmitting a burst of ultrasound and receiving the echo as it bounces off the perimeter walls of the operating room, as described under the heading "Surgical Hub Spatial Awareness Within an Operating Room" in U.S. Provisional Patent Application Serial No. 62/611,341, entitled INTERACTIVE SURGICAL PLATFORM, filed December 28, 2017, which is incorporated herein by reference in its entirety, in which the sensor module is configured to determine the size of the operating room and to adjust the Bluetooth pairing distance limit. A laser-based non-contact sensor module scans the operating room by transmitting laser pulses, receiving laser pulses bounced off the perimeter walls of the operating room, and comparing the phase of the transmitted pulses to the received pulses to determine the size of the operating room and to adjust the Bluetooth pairing distance limit.
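
As a rough illustration of the time-of-flight arithmetic implied above (not the actual algorithm of the sensor module; the constants, names, and margin are assumptions for this sketch), a room dimension can be estimated from an ultrasonic echo delay and used to cap the Bluetooth pairing distance:

    SPEED_OF_SOUND_M_PER_S = 343.0   # approximate speed of sound in air at room temperature

    def wall_distance_from_echo(echo_delay_s: float) -> float:
        # Round-trip time of an ultrasonic burst -> one-way distance to the wall.
        return SPEED_OF_SOUND_M_PER_S * echo_delay_s / 2.0

    def pairing_distance_limit(room_dimensions_m: list[float], margin: float = 0.9) -> float:
        # Keep the Bluetooth pairing distance limit inside the mapped operating room.
        return margin * min(room_dimensions_m)

    # Hypothetical echoes from two opposing walls (35 ms and 29 ms round trip).
    dimensions = [wall_distance_from_echo(0.035), wall_distance_from_echo(0.029)]
    limit_m = pairing_distance_limit(dimensions)   # about 4.5 m for this example room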

Computer system 210 includes a processor 244 and a network interface 245. The processor 244 is coupled via a system bus to the communication module 247, storage 248, memory 249, non-volatile memory 250, and input/output interface 251. The system bus can be any of several types of bus structure, including the memory bus or memory controller, a peripheral bus or external bus, and/or a local bus using any of a variety of available bus architectures, including, but not limited to, a 9-bit bus, Industry Standard Architecture (ISA), Micro Channel Architecture (MCA), Extended ISA (EISA), Intelligent Drive Electronics (IDE), VESA Local Bus (VLB), Peripheral Component Interconnect (PCI), USB, Advanced Graphics Port (AGP), Personal Computer Memory Card International Association bus (PCMCIA), Small Computer System Interface (SCSI), or any other peripheral bus.

Processor 244 may be any single-core or multi-core processor, such as those known under the trade name ARM Cortex, available from Texas Instruments. In one aspect, the processor may be, for example, an LM4F230H5QR ARM Cortex-M4F processor core, available from Texas Instruments, which includes on-chip memory of 256 KB single-cycle flash or other non-volatile memory (up to 40 MHz), a prefetch buffer for improved performance above 40 MHz, 32 KB of single-cycle serial random access memory (SRAM), an internal read-only memory (ROM) loaded with StellarisWare software, 2 KB of electrically erasable programmable read-only memory (EEPROM), and/or one or more pulse width modulation (PWM) modules, one or more quadrature encoder input (QEI) analogs, and one or more 12-bit analog-to-digital converters (ADCs) with 12 analog input channels, details of which are available in the product data sheet.

In one aspect, the processor 244 may comprise a safety controller comprising two controller-based families, such as TMS570 and RM4x, known under the trade name Hercules ARM Cortex R4, also manufactured by Texas Instruments. The safety controller may be configured specifically for IEC 61508 and ISO 26262 safety-critical applications, among others, to provide advanced integrated safety features while delivering scalable performance, connectivity, and memory options.

The system memory includes volatile memory and non-volatile memory. The basic input/output system (BIOS), containing the basic routines to transfer information between elements within the computer system, such as during start-up, is stored in nonvolatile memory. For example, nonvolatile memory can include ROM, Programmable ROM (PROM), Electrically Programmable ROM (EPROM), EEPROM, or flash memory. Volatile memory includes Random Access Memory (RAM), which acts as external cache memory. Further, RAM may be available in a variety of forms, such as SRAM, Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), and Direct Rambus RAM (DRRAM).

The computer system 210 also includes removable/non-removable, volatile/nonvolatile computer storage media, such as, for example, magnetic disk storage. Disk storage devices include, but are not limited to, devices such as a magnetic disk drive, floppy disk drive, tape drive, Jaz drive, Zip drive, LS-60 drive, flash memory card, or memory stick. In addition, disk storage devices can include storage media separately or in combination with other storage media including, but not limited to, an optical disk drive such as a compact disk ROM device (CD-ROM), a compact disk recordable drive (CD-R drive), a compact disk rewritable drive (CD-RW drive) or a digital versatile disk ROM drive (DVD-ROM). To facilitate connection of the disk storage devices to the system bus, a removable or non-removable interface may be used.

It is to be appreciated that the computer system 210 includes software that acts as an intermediary between users and the basic computer resources described in suitable operating environments. Such software includes an operating system. An operating system, which may be stored on disk storage, is used to control and allocate resources of the computer system. System applications utilize the operating system to manage resources through program modules and program data stored in system memory or on disk storage. It is to be appreciated that the various components described herein can be implemented with various operating systems or combinations of operating systems.

A user enters commands or information into the computer system 210 through input devices coupled to the I/O interface 251. Input devices include, but are not limited to, a pointing device such as a mouse, trackball, stylus, touch pad, keyboard, microphone, joystick, game pad, satellite dish, scanner, TV tuner card, digital camera, digital video camera, web camera, and the like. These and other input devices are connected to the processor through the system bus via interface ports. The interface port(s) include, for example, a serial port, a parallel port, a game port, and a USB. The output device uses the same type of port as the input device. Thus, for example, a USB port may be used to provide input to a computer system and to output information from the computer system to an output device. Output adapters are provided to illustrate that there are some output devices (such as monitors, displays, speakers, and printers) that require special adapters among other output devices.

The computer system 210 may operate in a networked environment using logical connections to one or more remote computers, such as a cloud computer, or local computers. The remote cloud computer can be a personal computer, a server, a router, a network PC, a workstation, a microprocessor-based appliance, a peer device or other common network node and the like, and typically includes many or all of the elements described relative to the computer system. For purposes of clarity, only a memory storage device with a remote computer is illustrated. The remote computer is logically connected to the computer system through a network interface and then physically connected via a communications connection. Network interfaces encompass communication networks such as Local Area Networks (LANs) and Wide Area Networks (WANs). LAN technologies include Fiber Distributed Data Interface (FDDI), Copper Distributed Data Interface (CDDI), Ethernet/IEEE 802.3, token Ring/IEEE 802.5, and the like. WAN technologies include, but are not limited to, point-to-point links, circuit switching networks like Integrated Services Digital Networks (ISDN) and variations thereon, packet switching networks, and Digital Subscriber Lines (DSL).

In various aspects, the computer system 210, imaging module 238, and/or visualization system 208 of fig. 10, and/or the processor module 232 of fig. 9-10 may include an image processor, an image processing engine, a media processor, or any dedicated Digital Signal Processor (DSP) for processing digital images. The image processor may employ parallel computing with single instruction, multiple data (SIMD) or multiple instruction, multiple data (MIMD) techniques to increase speed and efficiency. The digital image processing engine may perform a series of tasks. The image processor may be a system on a chip having a multi-core processor architecture.

A communication connection refers to the hardware/software used to connect the network interface to the bus. While the communication connection is shown for illustrative clarity inside the computer system, it can also be external to the computer system 210. The hardware/software necessary for connection to the network interface includes, for exemplary purposes only, internal and external technologies such as modems (including regular telephone-grade modems, cable modems, and DSL modems), ISDN adapters, and Ethernet cards.

In various aspects, the device/instrument 235 described with reference to fig. 9-10 may be implemented as a surgical instrument 200018 (fig. 17), 200062 (fig. 20), 200072a, b (fig. 21), 200088 and 200078a, b (fig. 23), a surgical device 200078a, b (fig. 22), and a visualization system 200086 (fig. 23). Thus, the surgical instruments 200018 (fig. 17), 200062 (fig. 20), 200072a, b (fig. 21), 200088 and 200078a, b (fig. 23), the surgical devices 200078a, b (fig. 22), and the visualization system 200086 (fig. 23) are configured to be engageable with the modular control tower 236 and the surgical hub 206. Once connected to the surgical hub 206, the surgical instruments 200018 (fig. 17), 200062 (fig. 20), 200072a, b (fig. 21), 200088 and 200078a, b (fig. 23), surgical devices 200078a, b (fig. 22), and visualization system 200086 (fig. 23) are configured to be engageable with the cloud 204, server 213, other hub-connected instruments, hub display 215, or visualization system 209, or a combination thereof. In addition, once connected to the hub 206, the surgical instruments 200018 (fig. 17), 200062 (fig. 20), 200072a, b (fig. 21), 200088 and 200078a, b (fig. 23), the surgical devices 200078a, b (fig. 22), and the visualization system 200086 (fig. 23) may utilize the processing circuitry available in the hub local computer system 210.

Fig. 11 illustrates a functional block diagram of one aspect of a USB hub 300 device in accordance with at least one aspect of the present disclosure. In the illustrated aspect, the USB hub device 300 employs a Texas Instruments TUSB2036 integrated circuit hub. The USB hub 300 is a CMOS device that provides an upstream USB transceiver port 302 and up to three downstream USB transceiver ports 304, 306, 308 according to the USB 2.0 specification. The upstream USB transceiver port 302 is a differential root data port that includes a differential data positive (DP0) input paired with a differential data negative (DM0) input. The three downstream USB transceiver ports 304, 306, 308 are differential data ports, where each port includes a differential data positive (DP1-DP3) output paired with a differential data negative (DM1-DM3) output.

The USB hub 300 device is implemented with a digital state machine rather than a microcontroller and does not require firmware programming. Fully compatible USB transceivers are integrated into the circuitry for the upstream USB transceiver port 302 and all downstream USB transceiver ports 304, 306, 308. The downstream USB transceiver ports 304, 306, 308 support both full-speed devices and low-speed devices by automatically setting the slew rate according to the speed of the device attached to the port. The USB hub 300 device may be configured to be capable of being in a bus-powered mode or a self-powered mode and includes hub power logic 312 for managing power.

The USB hub 300 device includes a serial interface engine (SIE) 310. The SIE 310 is the front end of the USB hub 300 hardware and handles most of the protocol described in section 8 of the USB specification. The SIE 310 typically handles signaling up to the transaction level. Its processing functions may include: packet identification, transaction sequencing, SOP, EOP, RESET, and RESUME signal detection/generation, clock/data separation, non-return-to-zero inverted (NRZI) data encoding/decoding and bit stuffing, CRC generation and checking (token and data), packet ID (PID) generation and checking/decoding, and/or serial-parallel/parallel-serial conversion. The SIE 310 receives a clock input 314 and is coupled to suspend/resume logic and frame timer 316 circuitry and hub repeater circuitry 318 to control communication between the upstream USB transceiver port 302 and the downstream USB transceiver ports 304, 306, 308 through port logic circuits 320, 322, 324. The SIE 310 is coupled to a command decoder 326 via interface logic 328 to control commands from a serial EEPROM via a serial EEPROM interface 330.
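
Two of the SIE functions named above, NRZI encoding and bit stuffing, are standard USB line-coding steps. The short Python sketch below illustrates them conceptually; it is not TUSB2036 firmware (the device is a hardware state machine), and the sample payload is arbitrary.

    def bit_stuff(bits: list[int]) -> list[int]:
        # USB bit stuffing: insert a 0 after every run of six consecutive 1s.
        out, run = [], 0
        for b in bits:
            out.append(b)
            run = run + 1 if b == 1 else 0
            if run == 6:
                out.append(0)
                run = 0
        return out

    def nrzi_encode(bits: list[int], initial_level: int = 1) -> list[int]:
        # NRZI as used by USB: a 0 toggles the line level, a 1 leaves it unchanged.
        level, out = initial_level, []
        for b in bits:
            if b == 0:
                level ^= 1
            out.append(level)
        return out

    payload = [1, 1, 1, 1, 1, 1, 1, 0]        # seven 1s in a row force one stuffed 0
    line_levels = nrzi_encode(bit_stuff(payload))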

In various aspects, the USB hub 300 can connect 127 functions configured in up to six logical layers (tiers) to a single computer. Further, the USB hub 300 can connect to all external devices using a standardized four-wire cable that provides both communication and power distribution. The power configurations are bus-powered mode and self-powered mode. The USB hub 300 may be configured to support four power management modes: a bus-powered hub with either individual-port power management or ganged-port power management, and a self-powered hub with either individual-port power management or ganged-port power management. In one aspect, using a USB cable, the USB hub 300 is plugged into a USB host controller via the upstream USB transceiver port 302, and the downstream USB transceiver ports 304, 306, 308 are exposed for connecting USB-compatible devices and the like.

Additional details regarding the structure and function of the surgical hub and/or surgical hub network can be found in U.S. Provisional Patent Application No. 62/659,900, entitled METHOD OF HUB COMMUNICATION, filed April 19, 2018, which is hereby incorporated by reference in its entirety.

Cloud system hardware and functional modules

Fig. 12 is a block diagram of a computer-implemented interactive surgical system in accordance with at least one aspect of the present disclosure. In one aspect, a computer-implemented interactive surgical system is configured to monitor and analyze data related to the operation of various surgical systems including a surgical hub, surgical instruments, robotic devices, and an operating room or medical facility. A computer-implemented interactive surgical system includes a cloud-based analysis system. While the cloud-based analysis system is described as a surgical system, it is not necessarily so limited and may generally be a cloud-based medical system. As shown in fig. 12, the cloud-based analysis system includes a plurality of surgical instruments 7012 (which may be the same as or similar to instrument 112), a plurality of surgical hubs 7006 (which may be the same as or similar to hub 106), and a surgical data network 7001 (which may be the same as or similar to network 201) to couple the surgical hubs 7006 to cloud 7004 (which may be the same as or similar to cloud 204). Each of the plurality of surgical hubs 7006 is communicatively coupled to one or more surgical instruments 7012. The hub 7006 is also communicatively coupled to the cloud 7004 of the computer-implemented interactive surgical system via a network 7001. The cloud 7004 is a remote centralized hardware and software source for storing, manipulating, and transmitting data generated based on the operation of various surgical systems. As shown in fig. 12, access to cloud 7004 is enabled via a network 7001, which may be the internet or some other suitable computer network. The surgical hub 7006 coupled to the cloud 7004 may be considered a client side of a cloud computing system (i.e., a cloud-based analysis system). The surgical instrument 7012 is paired with a surgical hub 7006 for use in controlling and effecting various surgical procedures or operations as described herein.

In addition, the surgical instrument 7012 can include a transceiver for transmitting data to and from its corresponding surgical hub 7006 (which can also include a transceiver). The combination of the surgical instrument 7012 and the corresponding hub 7006 may indicate a particular location for providing a medical procedure, such as an operating room in a medical facility (e.g., a hospital). For example, the memory of the surgical hub 7006 may store location data. As shown in fig. 12, the cloud 7004 includes central servers 7013 (which may be the same as or similar to remote server 113 in fig. 1 and/or remote server 213 in fig. 9), a hub application server 7002, a data analysis module 7034, and an input/output ("I/O") interface 7007. The central servers 7013 of the cloud 7004 collectively host a cloud computing system that includes monitoring requests from the client surgical hubs 7006 and managing the processing capacity of the cloud 7004 for executing the requests. Each of the central servers 7013 includes one or more processors 7008 coupled to a suitable memory device 7010, which may include volatile memory, such as Random Access Memory (RAM), and non-volatile memory, such as magnetic storage. The memory device 7010 may include machine-executable instructions that, when executed, cause the processor 7008 to execute the data analysis module 7034 for cloud-based data analysis, operations, recommendations, and other operations described below. Further, the processor 7008 may execute the data analysis module 7034 independently or in conjunction with hub applications executed independently by the hubs 7006. The central servers 7013 also include a database 2212 of aggregated medical data that may reside in memory 2210.

Based on the connections to the various surgical hubs 7006 via the network 7001, the cloud 7004 may aggregate the data generated by the various surgical instruments 7012 and their corresponding hubs 7006. Such aggregated data may be stored within the database 7011 of aggregated medical data of the cloud 7004. In particular, the cloud 7004 may advantageously perform data analysis and operations on the aggregated data to generate insights and/or perform functions that the individual hubs 7006 could not achieve on their own. To this end, as shown in fig. 12, the cloud 7004 and the surgical hubs 7006 are communicatively coupled to transmit and receive information. The I/O interface 7007 is connected to the plurality of surgical hubs 7006 via the network 7001. In this manner, the I/O interface 7007 may be configured to enable the transfer of information between the surgical hubs 7006 and the database 7011 of aggregated medical data. Accordingly, the I/O interface 7007 may facilitate read/write operations of the cloud-based analytics system. Such read/write operations may be performed in response to requests from the hubs 7006. These requests may be transmitted to the hubs 7006 through the hub applications. The I/O interface 7007 may include one or more high-speed data ports, which may include Universal Serial Bus (USB) ports, IEEE 1394 ports, and Wi-Fi and Bluetooth I/O interfaces for connecting the cloud 7004 to the hubs 7006. The hub application server 7002 of the cloud 7004 is configured to host and provide shared capabilities to software applications (e.g., hub applications) executed by the surgical hubs 7006. For example, the hub application server 7002 may manage requests made by the hub applications through the hubs 7006, control access to the database 7011 of aggregated medical data, and perform load balancing. The data analysis module 7034 is described in more detail with reference to fig. 13.

The particular cloud computing system configurations described in this disclosure are specifically designed to address various issues arising in the context of medical procedures and procedures performed using medical devices (such as the surgical instruments 7012, 112). In particular, the surgical instrument 7012 can be a digital surgical device configured to interact with the cloud 7004 for implementing techniques that improve performance of a surgical procedure. Various surgical instruments 7012 and/or the surgical hub 7006 may include touch-controlled user interfaces so that a clinician can control aspects of the interaction between the surgical instrument 7012 and the cloud 7004. Other suitable user interfaces for control may also be used, such as a user interface for auditory control.

Fig. 13 is a block diagram illustrating a functional architecture of a computer-implemented interactive surgical system in accordance with at least one aspect of the present disclosure. The cloud-based analysis system includes a plurality of data analysis modules 7034 executable by the processors 7008 of the cloud 7004 for providing data analysis solutions to problems specifically arising in the medical field. As shown in fig. 13, the functionality of the cloud-based data analysis modules 7034 may be facilitated via a hub application 7014 hosted by the hub application server 7002, which is accessible on the surgical hub 7006. The cloud processors 7008 and hub applications 7014 may operate in conjunction to execute the data analysis modules 7034. An application program interface (API) 7016 defines the set of protocols and routines corresponding to the hub application 7014. In addition, the API 7016 manages the storage and retrieval of data into and from the database 7011 of aggregated medical data for the operation of the applications 7014. The cache 7018 also stores data (e.g., temporarily) and is coupled to the API 7016 for more efficient retrieval of the data used by the applications 7014. The data analysis modules 7034 in fig. 13 include a resource optimization module 7020, a data collection and aggregation module 7022, an authorization and security module 7024, a control program update module 7026, a patient outcome analysis module 7028, a recommendation module 7030, and a data classification and prioritization module 7032. According to some aspects, the cloud 7004 may also implement other suitable data analysis modules. In one aspect, the data analysis modules are configured to provide specific recommendations based on analysis of trends, outcomes, and other data.

For example, the data collection and aggregation module 7022 may be used to generate self-describing data (e.g., metadata) including identification of salient features or configurations (e.g., trends), management of redundant data sets that may be grouped by surgery, but not necessarily locked to actual surgical dates and surgeons, and storage of data in paired data sets. In particular, the set of data generated by operation of the surgical instrument 7012 may include applying a binary classification, e.g., bleeding or non-bleeding events. More generally, a binary classification may be characterized as a desired event (e.g., a successful surgical procedure) or an undesired event (e.g., a mis-fired or misused surgical instrument 7012). The aggregated self-descriptive data may correspond to individual data received from various groups or subgroups of the surgical hub 7006. Thus, the data collection and aggregation module 7022 may generate aggregated metadata or other organizational data based on the raw data received from the surgical hub 7006. To this end, the processor 7008 may be operatively coupled to a hub application 7014 and a database 7011 of aggregated medical data for executing a data analysis module 7034. The data collection and aggregation module 7022 may store aggregated organizational data in a database 2212 of aggregated medical data.
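
A purely illustrative Python sketch of the aggregation idea described above follows; the event fields, procedure names, and hub identifiers are invented for this example and do not come from the disclosure. Grouping by procedure type while omitting dates and surgeon identities mirrors the idea of redundant data sets that are not locked to actual surgical dates and surgeons.

    from collections import defaultdict

    def aggregate_events(events: list[dict]) -> dict:
        # Group raw instrument events by procedure type and tally a binary outcome
        # (here, bleeding vs. non-bleeding) into a self-describing summary.
        summary: dict = defaultdict(lambda: {"bleeding": 0, "non_bleeding": 0, "total": 0})
        for e in events:
            bucket = summary[e["procedure"]]
            bucket["bleeding" if e["bleeding"] else "non_bleeding"] += 1
            bucket["total"] += 1
        return dict(summary)

    events = [
        {"procedure": "colorectal", "bleeding": False, "hub_id": "hub-A"},
        {"procedure": "colorectal", "bleeding": True,  "hub_id": "hub-B"},
        {"procedure": "thoracic",   "bleeding": False, "hub_id": "hub-A"},
    ]
    metadata = aggregate_events(events)   # aggregated, organized data ready for storage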

The resource optimization module 7020 may be configured to analyze the aggregated data to determine an optimal use of resources for a particular medical facility or group of medical facilities. For example, the resource optimization module 7020 may determine an optimal order point for surgical stapling instruments 7012 for a group of medical facilities based on the corresponding predicted demand for such instruments 7012. The resource optimization module 7020 may also evaluate the resource usage or other operational configurations of various medical facilities to determine whether resource usage can be improved. Similarly, the recommendation module 7030 may be configured to analyze the aggregated organizational data from the data collection and aggregation module 7022 to provide recommendations. For example, the recommendation module 7030 may recommend to a medical facility (e.g., a medical services provider, such as a hospital) that a particular surgical instrument 7012 should be upgraded to an improved version based on, for example, a higher-than-expected error rate. In addition, the recommendation module 7030 and/or the resource optimization module 7020 may recommend better supply chain parameters, such as product reorder points, and provide recommendations for different surgical instruments 7012, their uses, or surgical steps to improve surgical outcomes. The medical facilities may receive such recommendations via the corresponding surgical hubs 7006. More specific recommendations regarding the parameters or configurations of various surgical instruments 7012 may also be provided. The hubs 7006 and/or the surgical instruments 7012 may also each have a display screen that displays data or recommendations provided by the cloud 7004.
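
As a sketch of the kind of supply-chain arithmetic such a recommendation might rest on (the formula is the classic reorder-point rule; the demand, lead-time, and safety-stock figures are hypothetical and not from the disclosure):

    import math

    def reorder_point(daily_demand: float, lead_time_days: float, safety_stock: float = 0.0) -> int:
        # Reorder when on-hand stock falls to expected demand over the lead time plus safety stock.
        return math.ceil(daily_demand * lead_time_days + safety_stock)

    # Hypothetical facility group: ~12 staple cartridges used per day, a 5-day resupply
    # lead time, and 10 cartridges kept as safety stock.
    cartridge_reorder_point = reorder_point(daily_demand=12, lead_time_days=5, safety_stock=10)  # 70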

The patient outcome analysis module 7028 may analyze surgical outcomes associated with the currently used operating parameters of the surgical instruments 7012. The patient outcome analysis module 7028 may also analyze and evaluate other potential operating parameters. In this regard, the recommendation module 7030 may recommend using these other potential operating parameters based on their producing better surgical outcomes (such as better sealing or less bleeding). For example, the recommendation module 7030 may transmit a recommendation to the surgical hub 7006 as to when to use a particular cartridge with a corresponding stapling surgical instrument 7012. Thus, the cloud-based analysis system, while controlling for common variables, may be configured to analyze large collections of raw data and provide centralized recommendations (advantageously determined based on the aggregated data) to a plurality of medical facilities. For example, a cloud-based analysis system may analyze, evaluate, and/or aggregate data based on the type of medical practice, the type of patient, the number of patients, geographic similarities between medical providers, which medical providers/facilities use similar types of instruments, and so forth, in ways that no individual medical facility could analyze independently.

The control program update module 7026 may be configured to execute various surgical instrument 7012 recommendations when a corresponding control program is updated. For example, patient outcome analysis module 7028 may identify correlations linking particular control parameters to successful (or unsuccessful) outcomes. Such correlations may be resolved when updated control programs are transmitted to the surgical instrument 7012 via the control program update module 7026. Updates to the instrument 7012 transmitted via the corresponding hub 7006 may incorporate aggregated performance data collected and analyzed by the data collection and aggregation module 7022 of the cloud 7004. Additionally, the patient outcome analysis module 7028 and the recommendation module 7030 may identify improved methods of using the instrument 7012 based on the aggregated performance data.

The cloud-based analytics system may include security features implemented by the cloud 7004. These security features may be managed by the authorization and security module 7024. Each surgical hub 7006 may have associated unique credentials, such as a username, password, and other suitable security credentials. These credentials may be stored in memory 7010 and associated with the allowed cloud access levels. For example, based on providing accurate credentials, the surgical hub 7006 may be granted access to communicate with the cloud to a predetermined degree (e.g., may only participate in transmitting or receiving certain defined types of information). To this end, the database 7011 of aggregated medical data of the cloud 7004 may include a database of authorization credentials for verifying the accuracy of the provisioned credentials. Different credentials may be associated with different levels of permission to interact with cloud 7004, such as a predetermined level of access for receiving data analytics generated by cloud 7004.

Further, for security purposes, the cloud may maintain a database of hubs 7006, instruments 7012, and other devices, which may include a "blacklist" of prohibited devices. In particular, a blacklisted surgical hub 7006 may not be allowed to interact with the cloud, while a blacklisted surgical instrument 7012 may not have functional access to its corresponding hub 7006 and/or may be prevented from functioning fully when paired with its corresponding hub 7006. Additionally or alternatively, the cloud 7004 may flag instruments 7012 based on incompatibility or other specified criteria. In this way, counterfeit medical devices and improper reuse of such devices throughout the cloud-based analysis system can be identified and addressed.

The surgical instruments 7012 may use wireless transceivers to transmit wireless signals that may represent, for example, authorization credentials for accessing the corresponding hub 7006 and the cloud 7004. Wired transceivers may also be used to transmit signals. Such authorization credentials may be stored in the respective memory devices of the surgical instruments 7012. The authorization and security module 7024 may determine whether the authorization credentials are accurate or counterfeit. The authorization and security module 7024 may also dynamically generate authorization credentials for enhanced security. The credentials may also be encrypted, such as by using hash-based encryption. Upon transmitting proper authorization credentials, the surgical instrument 7012 may transmit a signal to the corresponding hub 7006, and ultimately to the cloud 7004, to indicate that the instrument 7012 is ready to acquire and transmit medical data. In response, the cloud 7004 may transition to a state enabled for receiving medical data for storage into the database 7011 of aggregated medical data. This readiness for data transmission may be indicated by, for example, a light indicator on the instrument 7012. The cloud 7004 may also transmit signals to the surgical instrument 7012 for updating its associated control program. The cloud 7004 may transmit signals directed to a particular class of surgical instruments 7012 (e.g., electrosurgical instruments) so that software updates to control programs are transmitted only to the appropriate surgical instruments 7012. Further, the cloud 7004 may be used to implement system-wide solutions to address local or global problems based on selective data transmission and authorization credentials. For example, if a group of surgical instruments 7012 is identified as having a common manufacturing defect, the cloud 7004 may change the authorization credentials corresponding to the group to implement an operational lockout of the group.
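
A minimal sketch of hash-based credential checking with per-device access levels and a blacklist, in the spirit of the description above; the hashing scheme, identifiers, secrets, and access levels are all assumptions for this example, not the system's actual authorization design.

    import hashlib
    import hmac

    # Server-side store (invented values): device ID -> (salted credential hash, access level).
    SALTS = {"hub-7006-01": b"salt01", "instr-7012-09": b"salt09"}
    CREDENTIAL_DB = {
        "hub-7006-01": (hashlib.sha256(SALTS["hub-7006-01"] + b"hub-secret").hexdigest(), "read-write"),
        "instr-7012-09": (hashlib.sha256(SALTS["instr-7012-09"] + b"instr-secret").hexdigest(), "read-only"),
    }
    BLACKLIST = {"instr-7012-99"}   # devices barred from interacting with the cloud

    def authorize(device_id: str, presented_secret: bytes) -> str | None:
        # Return the device's permitted access level, or None if unknown, blacklisted, or wrong.
        if device_id in BLACKLIST or device_id not in CREDENTIAL_DB:
            return None
        expected_hash, access_level = CREDENTIAL_DB[device_id]
        presented_hash = hashlib.sha256(SALTS[device_id] + presented_secret).hexdigest()
        return access_level if hmac.compare_digest(expected_hash, presented_hash) else None

    assert authorize("hub-7006-01", b"hub-secret") == "read-write"
    assert authorize("instr-7012-09", b"wrong-secret") is None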

The cloud-based analysis system may allow monitoring of multiple medical facilities (e.g., hospitals) to determine improved practices and suggest changes accordingly (e.g., via the recommendation module 2030). Thus, the processors 7008 of the cloud 7004 may analyze data associated with an individual medical facility to identify the facility and aggregate that data with other data associated with other medical facilities in a group. For example, groups may be defined based on similar operating practices or geographic locations. In this way, the cloud 7004 can provide analysis and recommendations across a group of medical facilities. The cloud-based analysis system may also be used to enhance situational awareness. For example, the processor 7008 may predictively model the effects of recommendations on the cost and effectiveness of a particular facility (relative to overall operations and/or various medical procedures). The cost and effectiveness associated with that particular facility may also be compared to the corresponding local areas of other facilities or to any other comparable facility.

The data classification and prioritization module 7032 may prioritize and classify data based on criticality (e.g., the severity, unexpectedness, or suspiciousness of the medical events associated with the data). Such classification and prioritization can be used in conjunction with the functions of the other data analysis modules 7034 described above to improve the cloud-based analysis and operations described herein. For example, the data classification and prioritization module 7032 may assign priority to the data analyses performed by the data collection and aggregation module 7022 and the patient outcome analysis module 7028. Different priority levels may elicit specific responses from the cloud 7004 (corresponding to their urgency level), such as escalation for an accelerated response, special handling, exclusion from the database 7011 of aggregated medical data, or other suitable responses. Further, if desired, the cloud 7004 can transmit a request (e.g., a push message) for additional data from the corresponding surgical instrument 7012 through the hub application server. The push message may cause a notification to be displayed on the corresponding hub 7006 requesting supporting or additional data. Such a push message may be warranted when the cloud detects a significant irregularity or anomaly and cannot determine the cause of the irregularity. The central server 7013 may be programmed to trigger the push message in certain significant circumstances, such as when the data is determined to differ from an expected value by more than a predetermined threshold or when it appears that security has been compromised, for example.
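
As a rough illustration of the criticality-based handling described above, the sketch below maps a classified priority level to a response; the scoring thresholds and response strings are purely hypothetical and not taken from the disclosure.

```python
from enum import Enum

class Priority(Enum):
    ROUTINE = 1
    ELEVATED = 2
    CRITICAL = 3

def classify(severity: float, unexpectedness: float) -> Priority:
    """Classify data criticality from normalized 0-1 scores (thresholds are hypothetical)."""
    score = severity + unexpectedness
    if score > 1.5:
        return Priority.CRITICAL
    if score > 0.8:
        return Priority.ELEVATED
    return Priority.ROUTINE

def respond(priority: Priority) -> str:
    """Map a priority level to an illustrative cloud response."""
    return {
        Priority.CRITICAL: "escalate and push a request for additional instrument data",
        Priority.ELEVATED: "flag for special handling",
        Priority.ROUTINE: "aggregate into the medical data database",
    }[priority]
```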

In various aspects, the surgical instrument 7012 described above with reference to fig. 12 and 13 may be implemented as the surgical instruments 200018 (fig. 17), 200062 (fig. 20), 200072a, b (fig. 21), 200088 and 200078a, b (fig. 23), the surgical devices 200078a, b (fig. 22), and the visualization system 200086 (fig. 23). Thus, the surgical instruments 200018 (fig. 17), 200062 (fig. 20), 200072a, b (fig. 21), 200088 and 200078a, b (fig. 23), the surgical devices 200078a, b (fig. 22), and the visualization system 200086 (fig. 23) are configured to be engageable with the surgical hub 7006 and the network 2001, which is configured to be engageable with the cloud 7004. Thus, the processing power provided by the central server 7013 and the data analysis module 7034 is configured to process information (e.g., data and control) from the surgical instruments 200018 (fig. 17), 200062 (fig. 20), 200072a, b (fig. 21), 200088 and 200078a, b (fig. 23), the surgical devices 200078a, b (fig. 22), and the visualization system 200086 (fig. 23).

Additional details regarding the cloud analysis system can be found in U.S. provisional patent application 62/659,900, entitled "METHOD OF HUB COMMUNICATION," filed April 19, 2018, which is hereby incorporated by reference in its entirety.

Context awareness

While a "smart" device that includes a control algorithm responsive to sensed data may be an improvement over a "dumb" device that operates without regard to sensed data, some sensed data can be incomplete or inconclusive when considered in isolation, i.e., without the context of the type of surgical procedure being performed or the type of tissue being operated on. Without knowing the surgical context (e.g., the type of tissue being operated on or the type of procedure being performed), the control algorithm may control the modular device incorrectly or sub-optimally given the particular context-free sensed data. For example, the optimal manner in which a control algorithm controls a surgical instrument in response to a particular sensed parameter may vary depending on the particular tissue type being operated on. This is because different tissue types have different properties (e.g., resistance to tearing) and thus respond differently to actions taken by a surgical instrument. Therefore, it may be desirable for a surgical instrument to take different actions even when the same measurement of a particular parameter is sensed. As one particular example, the optimal manner in which a surgical stapling and severing instrument is controlled in response to the instrument sensing an unexpectedly high force to close its end effector will vary depending on whether the tissue type is susceptible or resistant to tearing. For tissue that tears easily (such as lung tissue), the instrument's control algorithm would optimally ramp the motor speed down in response to an unexpectedly high force to close, to avoid tearing the tissue. For tissue that is resistant to tearing (such as stomach tissue), the instrument's control algorithm would optimally ramp the motor speed up in response to an unexpectedly high force to close, to ensure that the end effector is properly clamped on the tissue. Without knowing whether lung tissue or stomach tissue has been clamped, the control algorithm may make a suboptimal decision.
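
To make the example concrete, the following sketch shows one way a tissue-aware closure algorithm might ramp the motor speed; the tissue categories and scaling factors are illustrative assumptions, not values from the disclosure.

```python
def adjust_motor_speed(current_speed: float, closure_force: float,
                       force_threshold: float, tissue_type: str) -> float:
    """Ramp the closure motor speed up or down based on the inferred tissue type."""
    if closure_force <= force_threshold:
        return current_speed  # Force is within expectations; no adjustment needed.
    if tissue_type == "lung":
        return current_speed * 0.5  # Fragile tissue: slow down to avoid tearing.
    if tissue_type == "stomach":
        return current_speed * 1.5  # Tear-resistant tissue: speed up to fully clamp.
    return current_speed  # Unknown tissue: make no context-free adjustment.
```

Without the tissue_type input, i.e., without context, the same closure_force reading cannot be mapped to a single correct action.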

One solution utilizes a surgical hub that includes a system configured to derive information about the surgical procedure being performed based on data received from various data sources, and then control the paired modular devices accordingly. In other words, the surgical hub is configured to infer information about the surgical procedure from the received data and then control the modular devices paired with the surgical hub based on the inferred context of the surgical procedure. Fig. 14 illustrates a diagram of a context aware surgical system 5100 in accordance with at least one aspect of the present disclosure. In some examples, the data source 5126 includes, for example, a modular device 5102 (which may include sensors configured to be able to detect parameters associated with the patient and/or the modular device itself), a database 5122 (e.g., an EMR database containing patient records), and a patient monitoring device 5124 (e.g., a Blood Pressure (BP) monitor and an Electrocardiogram (EKG) monitor).

The surgical hub 5104, which may be similar in many respects to the hub 106, may be configured to be able to derive contextual information related to the surgical procedure from the data, e.g., based on the particular combination of received data or the particular order in which the data is received from the data sources 5126. The contextual information inferred from the received data may include, for example, the type of surgical procedure being performed, the particular step of the surgical procedure that the surgeon is performing, the type of tissue being operated on, or the body cavity that is the subject of the procedure. This ability of some aspects of the surgical hub 5104 to derive or infer information related to the surgical procedure from the received data may be referred to as "context awareness." In one example, the surgical hub 5104 may incorporate a context awareness system, which is the hardware and/or programming associated with the surgical hub 5104 that derives contextual information related to the surgical procedure from the received data.

The context awareness system of the surgical hub 5104 may be configured to be able to derive contextual information from the data received from the data sources 5126 in a number of different ways. In one example, the context awareness system includes a pattern recognition system or machine learning system (e.g., an artificial neural network) that has been trained on training data to correlate various inputs (e.g., data from the database 5122, the patient monitoring devices 5124, and/or the modular devices 5102) with corresponding contextual information about the surgical procedure. In other words, the machine learning system may be trained to accurately derive contextual information about the surgical procedure from the provided inputs. In another example, the context awareness system may include a lookup table that stores pre-characterized contextual information about the surgical procedure in association with one or more inputs (or ranges of inputs) corresponding to the contextual information. In response to a query with one or more inputs, the lookup table may return the corresponding contextual information, which the context awareness system uses to control the modular devices 5102. In one example, the contextual information received by the context awareness system of the surgical hub 5104 is associated with a particular control adjustment or set of control adjustments for one or more modular devices 5102. In another example, the context awareness system includes a further machine learning system, lookup table, or other such system that generates or retrieves one or more control adjustments for one or more modular devices 5102 when provided with contextual information as input.
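
The lookup-table variant described above might be sketched as follows; the keys, context labels, and control adjustments are illustrative placeholders rather than entries from the disclosure.

```python
from typing import Optional

CONTEXT_TABLE = {
    ("insufflator", "thoracoscope"): "VATS procedure",
    ("insufflator", "laparoscope"): "abdominal laparoscopic procedure",
}

CONTROL_ADJUSTMENTS = {
    "VATS procedure": {"smoke_extractor_speed": "low", "energy_profile": "lung"},
    "abdominal laparoscopic procedure": {"smoke_extractor_speed": "high",
                                         "energy_profile": "stomach"},
}

def infer_context(paired_devices: tuple) -> Optional[str]:
    """Return pre-characterized contextual information for a combination of inputs."""
    return CONTEXT_TABLE.get(paired_devices)

def control_adjustments(context: str) -> dict:
    """Retrieve the control adjustments associated with the inferred context."""
    return CONTROL_ADJUSTMENTS.get(context, {})

# Example: a query with the combination of paired modular devices as the input.
context = infer_context(("insufflator", "thoracoscope"))   # "VATS procedure"
settings = control_adjustments(context)                    # lung energy profile
```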

The surgical hub 5104 incorporating a context awareness system provides a number of benefits to the surgical system 5100. One benefit includes improved interpretation of sensed and collected data, which in turn improves the accuracy with which the data is processed and/or used during the surgical procedure. Returning to the previous example, a context aware surgical hub 5104 may determine the type of tissue being operated on; thus, when an unexpectedly high force to close the end effector of the surgical instrument is detected, the context aware surgical hub 5104 can properly ramp the surgical instrument's motor speed up or down for the tissue type.

As another example, the type of tissue being operated on may affect the adjustments made to the compression rate and load thresholds of a surgical stapling and severing instrument for a particular tissue gap measurement. The context aware surgical hub 5104 may infer whether the surgical procedure being performed is a chest or abdominal procedure, allowing the surgical hub 5104 to determine whether the tissue clamped by the end effector of the surgical stapling and severing instrument is lung tissue (for a chest procedure) or stomach tissue (for an abdominal procedure). The surgical hub 5104 can then appropriately adjust the compression rate and load thresholds of the surgical stapling and severing instrument for the type of tissue.

As yet another example, the type of body cavity being operated in during an insufflation procedure can affect the function of a smoke extractor. The context aware surgical hub 5104 may determine whether the surgical site is under pressure (by determining that the surgical procedure is utilizing insufflation) and determine the procedure type. Because a given type of procedure is generally performed within a particular body cavity, the surgical hub 5104 can then appropriately control the motor speed of the smoke extractor for the body cavity being operated in. Thus, the context aware surgical hub 5104 may provide consistent smoke evacuation for both chest and abdominal procedures.

As yet another example, the type of procedure being performed may affect the optimal energy level at which an ultrasonic surgical instrument or radio frequency (RF) electrosurgical instrument operates. Arthroscopic procedures, for example, require higher energy levels because the end effector of the ultrasonic surgical instrument or RF electrosurgical instrument is immersed in fluid. The context aware surgical hub 5104 can determine whether the surgical procedure is an arthroscopic procedure. The surgical hub 5104 may then adjust the RF power level or ultrasonic amplitude (i.e., the "energy level") of the generator to compensate for the fluid-filled environment. Relatedly, the type of tissue being operated on may affect the optimal energy level at which an ultrasonic surgical instrument or RF electrosurgical instrument operates. The context aware surgical hub 5104 can determine the type of surgical procedure being performed and then customize the energy level of the ultrasonic surgical instrument or RF electrosurgical instrument according to the expected tissue profile for the surgical procedure. Furthermore, the context aware surgical hub 5104 may be configured to be able to adjust the energy level of the ultrasonic surgical instrument or RF electrosurgical instrument throughout the course of the surgical procedure, rather than just on a procedure-by-procedure basis. The context aware surgical hub 5104 may determine which step of the surgical procedure is being performed or will be performed subsequently and then update the control algorithm for the generator and/or the ultrasonic surgical instrument or RF electrosurgical instrument to set the energy level at a value appropriate for the expected tissue type, according to the step of the surgical procedure.

As yet another example, data may be drawn from additional data sources 5126 to improve the conclusions that the surgical hub 5104 draws from a single data source 5126. The context aware surgical hub 5104 may augment the data it receives from the modular devices 5102 with contextual information about the surgical procedure that has been constructed from other data sources 5126. For example, the context aware surgical hub 5104 may be configured to determine, from video or image data received from a medical imaging device, whether hemostasis has occurred (i.e., whether bleeding at the surgical site has stopped). However, in some cases the video or image data may be inconclusive. Therefore, in one example, the surgical hub 5104 may also be configured to compare a physiological measurement (e.g., blood pressure sensed by a BP monitor communicatively connected to the surgical hub 5104) with the visual or image data of hemostasis (e.g., from the medical imaging device 124 (fig. 2) communicatively coupled to the surgical hub 5104) to determine the integrity of the suture or tissue weld. In other words, the context awareness system of the surgical hub 5104 may take the physiological measurement data into account to provide additional context when analyzing the visualization data. The additional context may be useful when the visualization data on its own is ambiguous or incomplete.

Another benefit includes actively and automatically controlling the paired modular devices 5102 according to the particular step of the surgical procedure being performed to reduce the number of times medical personnel need to interact with or control the surgical system 5100 during the surgical procedure. For example, if the context aware surgical hub 5104 determines that a subsequent step of the procedure requires the use of an RF electrosurgical instrument, it may actively activate a generator connected to the instrument. Actively activating the energy source allows the instrument to be ready for use as soon as the previous step of the procedure is completed.

As another example, the context aware surgical hub 5104 may determine whether a different view or degree of magnification on the display is required for the current or subsequent step of the surgical procedure based on features that the surgeon expects to need to view at the surgical site. The surgical hub 5104 may then actively change the displayed view accordingly (e.g., provided by the medical imaging device for the visualization system 108), such that the display is automatically adjusted throughout the surgical procedure.

As yet another example, the context aware surgical hub 5104 can determine which step of the surgical procedure is being performed or will be performed subsequently and whether particular data or comparisons between data will be required for that step of the surgical procedure. The surgical hub 5104 may be configured to automatically call up data screens based on the step of the surgical procedure being performed, without waiting for the surgeon to request this particular information.

Another benefit includes checking for errors during the setup of the surgical procedure or during the course of the surgical procedure. For example, the context aware surgical hub 5104 may determine whether the operating room is properly or optimally set up for the surgical procedure to be performed. The surgical hub 5104 may be configured to determine the type of surgical procedure being performed, retrieve (e.g., from memory) the corresponding manifests, product locations, or setup requirements, and then compare the current operating room layout to the standard layout determined by the surgical hub 5104 for the type of surgical procedure being performed. In one example, the surgical hub 5104 may be configured to be able to compare, for example, a list of items scanned for the procedure by a suitable scanner and/or a list of devices paired with the surgical hub 5104 to a recommended or expected list of items and/or devices for the given surgical procedure, as shown in the sketch below. If there are any discrepancies between the lists, the surgical hub 5104 may be configured to provide an alert indicating that a particular modular device 5102, patient monitoring device 5124, and/or other surgical item is missing. In one example, the surgical hub 5104 may be configured to be able to determine the relative distance or position of the modular devices 5102 and the patient monitoring devices 5124, e.g., via proximity sensors. The surgical hub 5104 can compare the relative positions of the devices to a recommended or expected layout for the particular surgical procedure. If there are any discrepancies between the layouts, the surgical hub 5104 may be configured to provide an alert indicating that the current layout for the surgical procedure deviates from the recommended layout.
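
A minimal sketch of the setup check described above follows; the expected-item manifest is hypothetical, and a real hub would retrieve it from memory or the cloud.

```python
EXPECTED_ITEMS = {
    # Hypothetical manifest; a real hub would retrieve this from memory or the cloud.
    "VATS segmentectomy": {"surgical stapler", "endoscope", "insufflator", "smoke extractor"},
}

def check_setup(procedure: str, scanned_items: set, paired_devices: set) -> list:
    """Compare scanned items and paired devices against the expected manifest."""
    expected = EXPECTED_ITEMS.get(procedure, set())
    present = scanned_items | paired_devices
    return [f"Missing item for {procedure}: {item}" for item in sorted(expected - present)]

# Example: an alert would be raised for the missing smoke extractor.
alerts = check_setup("VATS segmentectomy",
                     {"surgical stapler", "endoscope"}, {"insufflator"})
```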

As another example, the context aware surgical hub 5104 can determine whether the surgeon (or other medical personnel) is making mistakes or otherwise deviating from the expected course of action during the surgical procedure. For example, the surgical hub 5104 may be configured to determine the type of surgical procedure being performed, retrieve (e.g., from memory) a corresponding list of steps or order of device usage, and then compare the steps being performed or the devices being used during the surgical procedure to the expected steps or devices determined by the surgical hub 5104 for the type of surgical procedure being performed. In one example, the surgical hub 5104 may be configured to provide an alert indicating that an unexpected action is being performed or an unexpected device is being used at a particular step in the surgical procedure.

In general, the context awareness system for the surgical hub 5104 improves surgical results by adjusting the surgical instruments (and other modular devices 5102) for the particular context of each surgical procedure, such as for different tissue types, and verifying actions during the surgical procedure. The context aware system also improves the efficiency of the surgeon performing the surgery by automatically suggesting next steps, providing data, and adjusting the display and other modular devices 5102 in the operating room according to the specific context of the surgery.

In one aspect, as described below with reference to fig. 24-40, the modular device 5102 is implemented as the surgical instruments 200018 (fig. 17), 200062 (fig. 20), 200072a, b (fig. 21), 200088 and 200078a, b (fig. 23), the surgical devices 200078a, b (fig. 22), and the visualization system 200086 (fig. 23). Thus, the modular device 5102, implemented as the surgical instruments 200018 (fig. 17), 200062 (fig. 20), 200072a, b (fig. 21), 200088 and 200078a, b (fig. 23), the surgical devices 200078a, b (fig. 22), and the visualization system 200086 (fig. 23), is configured to be operable as a data source 5126 and to interact with the database 5122 and the patient monitoring devices 5124. The modular device 5102, implemented as the surgical instruments 200018 (fig. 17), 200062 (fig. 20), 200072a, b (fig. 21), 200088 and 200078a, b (fig. 23), the surgical devices 200078a, b (fig. 22), and the visualization system 200086 (fig. 23), is further configured to be able to interact with the surgical hub 5104 to provide information (e.g., data and control) to the surgical hub 5104 and to receive information (e.g., data and control) from the surgical hub 5104.

Referring now to fig. 15, a timeline 5200 depicting the context awareness of a hub, such as the surgical hub 106 or 206 (fig. 1-11), is shown. The timeline 5200 illustrates a surgical procedure and the contextual information that the surgical hub 106, 206 may derive from the data received from the data sources at each step of the surgical procedure. The timeline 5200 depicts the typical steps that the nurses, surgeons, and other medical personnel would take during the course of a lung segmentectomy procedure, beginning with the setup of the operating room and ending with the transfer of the patient to a post-operative recovery room.

The context aware surgical hub 106, 206 receives data from the data sources throughout the course of the surgical procedure, including data generated each time medical personnel utilize a modular device that is paired with the surgical hub 106, 206. The surgical hub 106, 206 may receive this data from the paired modular devices and other data sources and continually derive inferences about the procedure being performed (i.e., contextual information) as new data is received, such as which step of the procedure is being performed at any given time. The context awareness system of the surgical hub 106, 206 can, for example, record data pertaining to the procedure for generating reports, confirm the steps being taken by the medical personnel, provide data or prompts that may be relevant to the particular procedural step (e.g., via a display screen), adjust modular devices based on the context (e.g., activate monitors, adjust the field of view (FOV) of a medical imaging device, or change the energy level of an ultrasonic surgical instrument or RF electrosurgical instrument), and take any other such actions described above.

As a first step 5202 in this exemplary procedure, the hospital staff retrieves the patient's EMR from the hospital's EMR database. Based on the selected patient data in the EMR, the surgical hub 106, 206 determines that the procedure to be performed is a chest procedure.

In a second step 5204, the staff scans the incoming medical supplies for the procedure. The surgical hub 106, 206 cross-references the scanned supplies with a list of supplies used in various types of procedures and confirms that the mix of supplies corresponds to a chest procedure. In addition, the surgical hub 106, 206 may also be able to determine that the procedure is not a wedge procedure (because the incoming supplies either lack certain supplies required for a chest wedge procedure or otherwise do not correspond to a chest wedge procedure).

In a third step 5206, the medical personnel scan the patient band via a scanner communicatively connected to the surgical hub 106, 206. The surgical hub 106, 206 may then confirm the patient's identity based on the scanned data.

In a fourth step 5208, the medical staff turn on the ancillary equipment. The ancillary equipment utilized may vary according to the type of surgical procedure and the techniques to be used by the surgeon, but in this illustrative case it includes a smoke extractor, an insufflator, and a medical imaging device. When activated, the ancillary equipment, being modular devices, may automatically pair with the surgical hub 106, 206 located within a particular vicinity of the modular devices as part of their initialization process. The surgical hub 106, 206 may then derive contextual information about the surgical procedure by detecting the types of modular devices with which it pairs during the pre-operative or initialization phase. In this particular example, the surgical hub 106, 206 determines that the surgical procedure is a video-assisted thoracoscopic surgery (VATS) procedure based on this particular combination of paired modular devices. Based on the combination of data from the patient's EMR, the list of medical supplies to be used in the procedure, and the types of modular devices that connect to the hub, the surgical hub 106, 206 can generally infer the specific procedure that the surgical team will perform. Once the surgical hub 106, 206 knows the specific procedure being performed, the surgical hub 106, 206 may retrieve the steps of the procedure from memory or from the cloud and then cross-reference the data it subsequently receives from the connected data sources (e.g., modular devices and patient monitoring devices) to infer which step of the surgical procedure the surgical team is performing.

In a fifth step 5210, the practitioner attaches EKG electrodes and other patient monitoring devices to the patient. The EKG electrodes and other patient monitoring devices are able to pair with the surgical hub 106, 206. As the surgical hub 106, 206 begins to receive data from the patient monitoring devices, it confirms that the patient is in the operating room.

In a sixth step 5212, the medical personnel induce anesthesia in the patient. The surgical hub 106, 206 may infer that the patient is under anesthesia based on data from the modular devices and/or patient monitoring devices, including, for example, EKG data, blood pressure data, ventilator data, or combinations thereof. Upon completion of the sixth step 5212, the pre-operative portion of the lung segmentectomy procedure is completed and the operative portion begins.

In a seventh step 5214, the patient's lung that is being operated on is collapsed (while ventilation is switched to the contralateral lung). The surgical hub 106, 206 may infer from the ventilator data, for example, that the patient's lung has been collapsed. The surgical hub 106, 206 may infer that the operative portion of the procedure has commenced, as it can compare the detection of the patient's lung collapsing to the expected steps of the procedure (which may have been accessed or retrieved previously) and thereby determine that collapsing the lung is the first operative step in this particular procedure.

In an eighth step 5216, the medical imaging device (e.g., an endoscope) is inserted and video from the medical imaging device is initiated. The surgical hub 106, 206 receives the medical imaging device data (i.e., video or image data) through its connection to the medical imaging device. Upon receipt of the medical imaging device data, the surgical hub 106, 206 may determine that the laparoscopic portion of the surgical procedure has begun. Furthermore, the surgical hub 106, 206 may determine that the particular procedure being performed is a segmentectomy rather than a lobectomy (note that a wedge procedure has already been discounted by the surgical hub 106, 206 based on the data received at the second step 5204 of the procedure). The data from the medical imaging device 124 (fig. 2) may be used to determine contextual information regarding the type of procedure being performed in a number of different ways, including by determining the angle at which the medical imaging device is oriented for visualization with respect to the patient's anatomy, monitoring the number of medical imaging devices utilized (i.e., activated and paired with the surgical hub 106, 206), and monitoring the types of visualization devices utilized. For example, one technique for performing a VATS lobectomy places the camera in the lower anterior corner of the chest cavity above the diaphragm, whereas one technique for performing a VATS segmentectomy places the camera in an anterior intercostal position relative to the segmental fissure. Using pattern recognition or machine learning techniques, for example, the context awareness system may be trained to recognize the positioning of the medical imaging device according to the visualization of the patient's anatomy. As another example, one technique for performing a VATS lobectomy utilizes a single medical imaging device, whereas another technique for performing a VATS segmentectomy utilizes multiple cameras. As yet another example, one technique for performing a VATS segmentectomy utilizes an infrared light source (which may be communicatively coupled to the surgical hub as part of the visualization system) to visualize the segmental fissure, which is not utilized in a VATS lobectomy. By tracking any or all of this data from the medical imaging device, the surgical hub 106, 206 can thereby determine the specific type of surgical procedure being performed and/or the technique being used for a particular type of surgical procedure.

In a ninth step 5218, the surgical team begins the dissection step of the procedure. The surgical hub 106, 206 may infer that the surgeon is dissecting to mobilize the patient's lung because it receives data from the RF or ultrasonic generator indicating that an energy instrument is being fired. The surgical hub 106, 206 may cross-reference the received data with the retrieved steps of the surgical procedure to determine that an energy instrument being fired at this point in the procedure (i.e., after the previously discussed steps of the procedure are completed) corresponds to the dissection step. In some cases, the energy instrument may be an energy tool mounted to a robotic arm of a robotic surgical system.

In a tenth step 5220, the surgical team proceeds to the ligation step of the procedure. The surgical hub 106, 206 may infer that the surgeon is ligating arteries and veins because it receives data from the surgical stapling and severing instrument indicating that the instrument is being fired. As with the previous step, the surgical hub 106, 206 may derive this inference by cross-referencing the receipt of data from the surgical stapling and severing instrument with the retrieved steps of the procedure. In some cases, the surgical instrument may be a surgical tool mounted to a robotic arm of a robotic surgical system.

In an eleventh step 5222, the segmentectomy portion of the procedure is performed. The surgical hub 106, 206 may infer that the surgeon is transecting soft tissue based on data from the surgical stapling and severing instrument, including data from its cartridge. The cartridge data may correspond to, for example, the size or type of staples being fired by the instrument. Because different types of staples are utilized for different types of tissue, the cartridge data can indicate the type of tissue being stapled and/or transected. In this case, the type of staples being fired is utilized for soft tissue (or other similar tissue types), which allows the surgical hub 106, 206 to infer that the segmentectomy portion of the procedure is being performed.

In a twelfth step 5224, the node dissection step is performed. The surgical hub 106, 206 may infer that the surgical team is dissecting a node and performing a leak test based on data received from the generator indicating that an RF or ultrasonic instrument is being fired. For this particular procedure, an RF or ultrasonic instrument being utilized after the soft tissue was transected corresponds to the node dissection step, which allows the surgical hub 106, 206 to make this inference. It should be noted that surgeons regularly switch back and forth between surgical stapling/severing instruments and surgical energy (i.e., RF or ultrasonic) instruments depending on the particular step of the procedure, because different instruments are better adapted for particular tasks. Therefore, the particular sequence in which the stapling/severing instruments and surgical energy instruments are used may indicate which step of the procedure the surgeon is performing. Moreover, in certain instances, robotic tools may be used for one or more steps of a surgical procedure, and/or handheld surgical instruments may be used for one or more steps of the surgical procedure. One or more surgeons may, for example, alternate between robotic tools and handheld surgical instruments and/or use them simultaneously. Upon completion of the twelfth step 5224, the incisions are closed and the post-operative portion of the procedure begins.

In a thirteenth step 5226, the patient's anesthesia is reversed. The surgical hub 106, 206 may infer that the patient is emerging from anesthesia based on, for example, ventilator data (i.e., the patient's breathing rate begins to increase).

Lastly, in a fourteenth step 5228, the medical personnel remove the various patient monitoring devices from the patient. The surgical hub 106, 206 may thus infer that the patient is being transferred to a recovery room when the hub loses EKG, BP, and other data from the patient monitoring devices. As can be seen from the description of this illustrative procedure, the surgical hub 106, 206 may determine or infer when each step of a given surgical procedure is taking place according to data received from the various data sources that are communicatively coupled to the surgical hub 106, 206.
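
For illustration, the sketch below condenses the inference pattern of timeline 5200 into a simple event-to-step cross-reference; the event strings and step list are simplified stand-ins, not the actual data the hub would receive.

```python
PROCEDURE_STEPS = [
    # (expected data-source event, inferred contextual information)
    ("EMR retrieved",                      "chest procedure planned"),
    ("supplies scanned",                   "procedure type confirmed"),
    ("patient monitors streaming",         "patient is in the operating room"),
    ("ventilator shows single-lung mode",  "lung collapsed; operative portion begins"),
    ("energy generator firing",            "dissection step"),
    ("stapler firing",                     "ligation / transection step"),
    ("breathing rate increasing",          "anesthesia reversal"),
]

def infer_steps(events):
    """Yield the inferred step each time the next expected event arrives in order."""
    index = 0
    for event in events:
        if index < len(PROCEDURE_STEPS) and event == PROCEDURE_STEPS[index][0]:
            yield PROCEDURE_STEPS[index][1]
            index += 1

# Example:
for step in infer_steps(["EMR retrieved", "supplies scanned"]):
    print(step)   # "chest procedure planned", then "procedure type confirmed"
```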

In various aspects, the surgical instruments 200018 (fig. 17), 200062 (fig. 20), 200072a, b (fig. 21), 200088 and 200078a, b (fig. 23), the surgical devices 200078a, b (fig. 22), and the visualization system 200086 (fig. 23) are configured to be able to operate with context awareness in a hub environment, such as the surgical hub 106 or 206 (fig. 1-11), for example, as illustrated by the timeline 5200. Context awareness is further described in U.S. provisional patent application serial No. 62/659,900, entitled "METHOD OF HUB COMMUNICATION," filed April 19, 2018, the entire contents of which are incorporated herein by reference. In certain instances, operation of a robotic surgical system (including, for example, the various robotic surgical systems disclosed herein) may be controlled by the hub 106, 206 based on its context awareness and/or feedback from its components and/or based on information from the cloud 104.

Wireless hub interaction

Device-to-device intercommunication

In various aspects, various techniques are described herein for pairing devices with rules that define device interactions. Thus, wireless interactive pairing for a surgical hub device is described herein.

In one aspect, wireless pairing of a surgical device with another device within a sterile surgical field is based on the usage of the devices and on context awareness. In one aspect, context awareness may include awareness of which user is controlling which devices, based on locator devices worn by the users. In one aspect, when an active device does not sense tissue or a patient, pairing between devices may be based on simultaneous activation of both devices within a predetermined amount of time. In another aspect, one device can be within the sterile field and another device can be outside of the sterile field.

Pairing of personally owned wireless devices

Various techniques for pairing personally owned wireless devices are described herein. In one aspect, an encryption key may be used to authenticate a smartphone, wearable device, or other personally owned device that is assigned to a given user. Given certain input elements, the personal device may require the hub to perform defined functions. In one aspect, porting a personally owned device into the system provides a link from the device to the surgical hub to run internal functions. For example, a device may be connected to the hub and music from a library or playlist on the device may be ported (i.e., streamed) to the hub's speakers. As another example, a phone or other such device may be connected to the hub, and options for the device may be linked through the hub to allow calls to be ported through the hub's monitors and speakers. In one application, an auto-reply voice or text message may be sent in response to an incoming call or text, indicating that the user cannot be reached while the user's device is connected to the hub, unless, for example, the call or text is from a selected subset of numbers (e.g., from other doctors who may be calling about a consult on a case). In another application, the contact list from the linked phone may be stored so that calls to the surgeon's phone during the surgical procedure can be answered or ignored depending on whether the call comes from a number on the contact list.
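
The call-handling behavior described above might be sketched as follows; the permitted numbers, contact list, and reply text are hypothetical.

```python
PERMITTED_NUMBERS = {"+1-555-0100", "+1-555-0101"}   # e.g., consulting physicians
AUTO_REPLY = "The user is in surgery and cannot be reached at the moment."

def handle_incoming_call(caller_number: str, contact_list: dict) -> str:
    """Decide how a call routed through the hub is handled during a procedure."""
    if caller_number in PERMITTED_NUMBERS:
        return f"ring through ({contact_list.get(caller_number, 'unknown caller')})"
    if caller_number in contact_list:
        return f"show caller ID '{contact_list[caller_number]}' on the secondary display"
    return f"send auto-reply: {AUTO_REPLY}"

# Example:
contacts = {"+1-555-0100": "Dr. Reyes", "+1-555-0199": "Family"}
handle_incoming_call("+1-555-0100", contacts)   # rings through to the surgeon
handle_incoming_call("+1-555-0199", contacts)   # caller ID shown on the secondary display
```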

In one aspect, the surgical hub may be configured to display functionally imported data (e.g., data imported from a mobile device) on an auxiliary display, based on the hub's awareness of the type of data and/or how commonly the data is used. In one aspect, the information may be displayed on the auxiliary display when the data is uploaded/imported to the surgical hub. In another aspect, when interaction is available, an interactive menu may become operable on the primary or active display when the data is uploaded/imported to the surgical hub. For example, when a call is received by a mobile device connected to the surgical hub, caller ID information from the mobile device's contact list may pop up on a selected monitor visible to the surgeon and nurses. As another example, the caller ID information may be displayed on a secondary monitor used to display supplemental information such as device settings, or on a configurable computer tablet positioned in the sterile field that the surgeon may touch to answer the call if desired, in order to avoid cluttering the main surgical screen with pop-up windows. As another example, depending on the particular sensed user, the number of times the user has used the auxiliary device, and other parameters, the hub may be configured to be able to highlight the most common and/or most appropriate options or menus for the particular interaction. In some aspects, the hub may be configured to be able to display these options or menus on the user interface without interfering with the task at hand.

Fig. 16 depicts an example of pairing a personally owned wireless device 200002 with a surgical hub 200006. The wireless device 200002 and the surgical hub 200006 may communicate with each other through a wireless link 200004. As disclosed above, the surgical hub 200006 may display the imported data received from the wireless device 200002 on one or more displays visible to the surgical team member. In one aspect, the surgical hub 200006 can cause imported data to be displayed on a primary or active display monitor 200008. In another aspect, the surgical hub 200006 can cause imported data to be displayed on the secondary display monitor 200010.

Communicating with a smart cartridge of a hub without passing through an attached device

Various techniques are described herein for communicating with a smart cartridge of a hub without utilizing an instrument with an attached cartridge as the communication medium.

In various aspects, the cartridge can be configured such that there is a wired connection between the device and the cartridge, and physical contact is required between the instrument and the cartridge to transfer power to the cartridge. In one such aspect, the cartridge can include a circuit for identification that includes a portion that requires both the sled and at least one staple of the instrument to contact one another for continuity. If either the sled or staples are not in contact with the circuit, no power transfer to the cartridge will occur and the device will be locked. In these aspects, the described circuitry can be utilized to provide an auxiliary or backup method of locking the instrument against use with a used cartridge.

In various aspects, the cartridge can be configured to be able to communicate with the hub without requiring any power from a surgical instrument (e.g., a surgical stapler).

In one such aspect, insertion of the cartridge into the device provides a momentary amount of power to the cartridge, which is then configured to be able to communicate directly with the hub without passing through the device. In some aspects, the cartridge does not include an on-board battery or power source. In some aspects, a small amount of power may be drawn upon connection and during the transmission, after which the power consumption of the cartridge ceases. For example, fig. 17 is a diagram of a cartridge 200012 configured to be capable of wireless communication with the surgical hub 200006, in accordance with at least one aspect of the present disclosure. In one aspect, the communication may be accomplished through a wireless communication circuit 200028 embedded in the cartridge 200012. In this example, power is transferred wirelessly from the device to the cartridge through inductive coupling. In one aspect, a first transmit antenna coil 200014 is printed into a wall 200016 of the channel of the instrument 200018. A second receiver coil 200020 can be printed on the mating surface of the cartridge 200012. When the two coils are close to and overlap each other, power may be transferred from the transmit antenna coil 200014 to the receiver coil 200020. In some aspects, power 200024 can be supplied to the instrument 200018 via any suitable conductor, such as through a flex circuit conductor 200026, and conducted to the transmit coil 200014.

Fig. 17A depicts the overlap 200022 of the transmit coil 200014 and the receiver coil 200020. The transmit coil 200014 may receive power 200024 from the instrument 200018. The amount and proximity of the overlap 200022 between the transmit coil 200014 and the receiver coil 200020 can determine the amount of power received by the receiver coil 200020. The power in the receiver coil 200020 can be used to power the communication circuit 200028.

In such aspects, the proximity and alignment of the transmit coil 200014 and the receiver coil 200020 can be achieved with lug (ear-like) features 200030 formed in the body of the cartridge 200012. The lug features 200030 can be configured to align the cartridge 200012 within the channel of the instrument 200018 when the cartridge 200012 is inserted into the instrument 200018. The lug features 200030 can be configured to align the cartridge within the channel of the instrument 200018 by mating with corresponding slot features 200032 formed in the channel.

In some aspects, the cartridge and/or instrument further comprises a resonant circuit to increase the efficiency of the power transfer between them. For example, fig. 18 is a block diagram of a resonant inductive wireless power system 200034 in accordance with at least one aspect of the present disclosure. The resonant inductive wireless power system 200034 may include a transmitter oscillator 200040 that receives power from, for example, a power supply 200042. The transmitter oscillator 200040 can supply AC current to the transmit coil 200044. The resonant inductive wireless power system 200034 may also include, for example, a rectifier 200046 that may receive power from the transmit coil 200044 via the receiver coil 200048. The receiver coil 200048 may be coupled to the transmit coil 200044 by the magnetic (B) field generated by the transmit coil 200044. In some aspects, the rectifier 200046 can convert the AC power received from the transmitter oscillator 200040 to DC power for delivery to the load 200050. In one example, the load 200050 can include the communication circuit 200028. The resonant inductive wireless power system 200034 may further comprise, for example, one or more resonant coils 200036a, b, each made, for example, of copper wire and resonating with its internal capacitance (indicated by the dashed capacitors 200038a, b) at a resonant frequency (e.g., 10 MHz). In some aspects, the resonant coils 200036a, b can have matched impedances to optimize the power transfer from the transmitter oscillator 200040 to the rectifier 200046.
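
As a worked example of the resonance relationship implied above, the sketch below uses the standard LC relation f = 1/(2π√(LC)); the 10 MHz frequency comes from the text, while the 25 pF capacitance is an assumed value for illustration only.

```python
import math

def resonant_frequency(inductance_h: float, capacitance_f: float) -> float:
    """Resonant frequency (Hz) of an LC tank: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance_h * capacitance_f))

def required_inductance(target_hz: float, capacitance_f: float) -> float:
    """Inductance (H) needed to resonate at target_hz with a given capacitance."""
    return 1.0 / ((2.0 * math.pi * target_hz) ** 2 * capacitance_f)

# With an assumed 25 pF internal capacitance, resonating at 10 MHz requires
# roughly 10 microhenries of coil inductance.
L = required_inductance(10e6, 25e-12)   # ~1.0e-5 H
f = resonant_frequency(L, 25e-12)       # ~1.0e7 Hz, i.e., back to 10 MHz
```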

In another aspect, the cartridge 200012 can include a battery that can power the communication circuit 200028 when the cartridge 200012 is inserted into the instrument 200018. In this regard, the communication circuit 200028 may be powered regardless of the power state of the instrument 200018.

In another aspect, a sterile scanning pad can be configured to scan the instrument 200018 and/or the cartridge 200012. In operation, the scanning pad may be present on the back table within the operating room (OR), and a healthcare professional may scan the instrument 200018 or cartridge 200012 by placing it on the scanning pad. When the instrument 200018 or cartridge 200012 is opened and placed on the scanning pad, data from the instrument 200018 or cartridge 200012 can be provided to the hub. In some aspects, the instrument 200018 or cartridge 200012 can be scanned, e.g., via radio frequency (RF), to activate the instrument 200018 or cartridge 200012 and be tracked by the hub. In other aspects, there may be a wired connection from the pad to the hub to provide power for scanning.

Detecting an environment and setting a geofenced area

Various technologies are described herein for detecting an environment and establishing a geofence.

Fig. 19A is a diagram of a detection area or room, such as an operating room (OR), around a surgical hub, in accordance with at least one aspect of the present disclosure. In one aspect, the perimeter 200052 of the space detectable by the surgical hub 200006 can be defined by one or more standalone beacons 200054a-d with directional antennas. In one aspect, the beacons 200054a-d may be placed at desired locations within the room in which the hub 200006 is located or is to be located. In one aspect, the perimeter 200052 defined by the beacons 200054a-d may be used by the surgical hub 200006 to form a device detection space. The beacons 200054a-d can be used, for example, to define regions having regular three-dimensional shapes or irregular three-dimensional shapes. In some applications, as few as three beacons (generally 200054) may be used to define a simple device detection perimeter, such as the interior of a square or rectangular room. In other aspects, more than three beacons 200054a-d may be used to define detection zones having irregular shapes, such as the one shown in fig. 19A.

In some aspects, the beacons 200054a-d may be active or passive. Active beacons 200054a-d may actively transmit information for receipt by the hub 200006 without the hub 200006 transmitting any information to them. Passive beacons 200054a-d may be activated only after receiving one or more transmissions from the hub 200006, and may then transmit response signals in response to receiving an initiating query from the hub 200006. The signals transmitted by the beacons 200054a-d may be in any suitable form, including, but not limited to, wireless signals, acoustic signals, or optical signals. The signals transmitted by the beacons 200054a-d may include any suitable information, such as identification information, location information, or any other information that the hub 200006 may use to determine the locations of the beacons 200054a-d, thereby permitting the hub 200006 to determine the perimeter 200052.

As disclosed above, the perimeter 200052 may define a detection zone in which the hub 200006 may scan for one or more surgical instruments or other devices. The hub 200006 may identify devices within the detection zone as potentially associated with a surgical procedure. It is to be appreciated that, in this regard, devices located outside of the detection zone may not be identified by the hub 200006 as potentially associated with the surgical procedure. Alternatively, beacons may be used to define exclusion zones in which the hub 200006 does not identify devices. In some aspects, the transmission angle of the signals from the beacons 200054a-d may be adjustable. A plurality of beacons 200054a-d may be placed on the floor or walls around the operating room at approximately 90-degree intervals to define the perimeter 200052. In some aspects, the perimeter 200052 can form a surgical instrument detection zone. In some aspects, when the beacon assembly is set up, the detection angle of each beacon may be visually indicated with a light beam.
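
For illustration, a minimal sketch of how the hub might test whether a reported device position lies inside the beacon-defined perimeter is shown below, using a standard ray-casting point-in-polygon test; planar floor-plane coordinates are assumed, which is an assumption not stated in the disclosure.

```python
def inside_perimeter(point, perimeter):
    """Return True if an (x, y) point lies inside the polygon of beacon positions."""
    x, y = point
    inside = False
    n = len(perimeter)
    for i in range(n):
        x1, y1 = perimeter[i]
        x2, y2 = perimeter[(i + 1) % n]
        crosses = (y1 > y) != (y2 > y)
        if crosses and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside

# Example: a rectangular detection zone defined by four beacons (coordinates in meters).
beacons = [(0, 0), (6, 0), (6, 4), (0, 4)]
inside_perimeter((2, 1), beacons)   # True: device is in the detection zone
inside_perimeter((8, 1), beacons)   # False: device is outside the zone
```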

Fig. 19B depicts aspects of a geofence system that may also include "jamming" beacons 200056. In some aspects, a spatial region may be protected from receiving transmissions from the hub, or devices within the spatial region may be shielded from receiving transmissions from the hub 200006. For example, a "jamming" beacon 200056 can be placed at, near, or inside a perimeter to jam hub or device signals, thereby preventing devices within the exclusion zone defined by the jamming beacon 200056 from connecting to the surgical hub. In various applications, a "jamming" beacon may be utilized, for example, to define a shielded area, a sterile table, an instrument cabinet 200058 in the operating room, or a storage area between operating rooms.

It can be appreciated that the use of "jamming" beacons 200056 to define an exclusion zone can operate differently than the use of the beacons 200054a-d to do so. For example, a "jamming" beacon 200056 may be associated with a movable instrument cabinet 200058. The "jamming" function of the "jamming" beacon 200056 may prevent the hub 200006 from establishing communication with the medical instruments stored in the instrument cabinet 200058, regardless of the location of the instrument cabinet 200058.

In some applications, locating the beacons 200054a-d along the boundaries of a room, such as an operating room, may establish a controlled means of determining the true size and orientation of the operating room relative to the hub 200006. In other applications, locating the beacons 200054a-d at the boundaries of the sterile field may distinguish disposable instruments that are turned on and ready for use from capital equipment or from instruments that are available but not yet turned on.

Instant pairing between multiple controllers and controlled devices

In one aspect, the hub and/or the devices to which the hub may be connected may be configured to be able to pair with each other wirelessly and interactively. Thus, multiple controllers and controlled devices may be configured to be able to enter into wireless pairing instantly, without any direct user control. For example, fig. 20 is a diagram of a user and device pairing 200060 between the hub 200006, a user-worn identifier 200066, and a surgical instrument 200062, in accordance with at least one aspect of the present disclosure. In the depicted aspect, the identifier 200066 may be worn on or attached to each user's hand. The identifier 200066 may interact with a receiver 200064 attached to or integral with the surgical device 200062. In one aspect, the receiver 200064 can be integrated within the handle of the surgical device 200062. The identifier 200066 and the receiver 200064 may be configured to be able to communicate via near field communication (NFC) or another such communication protocol.

In operation, the receiver 200064 of the device automatically pairs the device 200062 with the identifier 200066 each time the user picks up the device 200062. In response to the pairing between the receiver 200064 and the identifier 200066, the hub 200006 identifies the device 200062, permitting the hub 200006 to control and/or receive status data from the device 200062. In some aspects, the hub 200006 may communicate directly with the device 200062. In other aspects, the hub 200006 can communicate with the device 200062 via a communication link from the hub 200006 through the identifier 200066 to the device receiver 200064. The NFC link allows the surgical device 200062 to communicate with the identifier 200066, which in turn communicates with the hub 200006. In some aspects, the identifier 200066 may act as a communication relay 200068 between the hub 200006 and the surgical device 200062, permitting identification and/or sensor information from the surgical device 200062 to be transmitted to the hub 200006, and control data to be transmitted from the hub 200006 to control the surgical device 200062.

In some other aspects, the identifier 200066 can transmit information to one or both of the hub 200006 and the surgical device 200062. In some aspects, the information from the identifier 200066 may include an identification of the user. In some other aspects, the information from the identifier 200066 may include which hand is using the surgical device 200062. In some additional aspects, the hub 200006 can also provide appropriate identification information for each device to one or both of the identifier 200066 and the surgical device 200062 to allow them to communicate directly or through the hub 200006 to coordinate activation of controls with activation of device functions.
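
The pairing-and-relay behavior described above might be sketched as follows; the classes and message formats are illustrative and do not represent an actual NFC stack or the disclosed hardware.

```python
class Hub:
    def __init__(self):
        self.log = []

    def receive(self, message):
        self.log.append(message)

class Identifier:
    """Worn by the user; acts as communication relay 200068 between device and hub."""
    def __init__(self, user_id, hub):
        self.user_id = user_id
        self.hub = hub

    def relay_to_hub(self, message):
        self.hub.receive(dict(message, user_id=self.user_id))

class SurgicalDevice:
    def __init__(self, device_id):
        self.device_id = device_id
        self.paired_identifier = None

    def on_grip(self, identifier):
        """Automatic pairing each time the user picks up the device."""
        self.paired_identifier = identifier
        identifier.relay_to_hub({"device_id": self.device_id, "event": "paired"})

# Example: the hub learns both which device was picked up and by whom.
hub = Hub()
stapler = SurgicalDevice("stapler-01")
stapler.on_grip(Identifier("surgeon-A", hub))
# hub.log == [{"device_id": "stapler-01", "event": "paired", "user_id": "surgeon-A"}]
```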

Method of interchanging control of a paired instrument between two controllers

In various aspects, control of an instrument paired with a surgical hub may be switched interchangeably between different surgical hubs.

Initiation of a change in control between a paired instrument and a surgical hub may be controlled and/or indicated to the user/other devices in different ways. In one aspect, a predefined sequence can be used by a user to indicate the release of the controlled device to a controlling device (e.g., a surgical hub) and/or an associated device (e.g., other devices connected to the surgical hub).

The designation of a new relationship between the controlling device and the controlled device may be controlled and/or indicated to the user/other devices in different ways. In one aspect, once released, or when not paired with a control system within the operating room's local network, a series of steps may be used to link the two systems so that one system controls the other. In an alternative aspect, a sterile field control and interactive device can be utilized to display all of the paired links within the operating room and reassign them in a different order.

The identification and notification of control changes to the device can be achieved in different ways without the use of a control device. In one aspect, the illumination of the handheld device's built-in display may be configured to change from a first color (e.g., blue or green) to a second color (e.g., red) and/or from a first state (e.g., solid) to a second state (e.g., flashing) to indicate and notify the user of a change in the device's control state. For example, the first color and/or first state may indicate that the device is under control (e.g., the device is paired with a surgical hub), and the second color and/or second state may indicate that no control device is connected to the instrument. Further, the illumination may surround the perimeter of the device's built-in display. Further, the illumination may also be provided by a light-transmissive plastic surrounding the control module. In an alternative aspect, the device may be outlined on the primary display, and the color and/or state of the outline around the device (or a component of the device, such as the shaft of an instrument) may indicate its control status (i.e., whether or not the device is paired with a control device).
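
As a small illustration of the indication scheme above, the sketch below maps the control state to an illumination color and state; the specific color and state choices follow the example in the text and are not a required mapping.

```python
def pairing_indicator(paired: bool) -> dict:
    """Map the device's control state to an illumination color and state."""
    if paired:
        return {"color": "green", "state": "solid"}   # device under control / paired
    return {"color": "red", "state": "flashing"}      # no control device connected

pairing_indicator(paired=True)    # {'color': 'green', 'state': 'solid'}
pairing_indicator(paired=False)   # {'color': 'red', 'state': 'flashing'}
```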

In one aspect, control may be shared from more than one control device to a single controlled device. For example, the system may be used to enable two wireless control devices to control the same device simultaneously or to control multiple devices from a single control device.

Detection of device position and orientation

Various techniques for detecting a position and orientation of a device are described herein.

In one aspect, measurements relative to a ground coordinate system, or relative to each other, may be displayed. In such aspects, the display system can be configured to display user-selectable measurements of device position relative to the patient, the hub, or another device (e.g., a trocar). Fig. 21 depicts an aspect of a surgical suite 200070 in which surgical instruments (e.g., surgical instruments 200072a, b) are used as part of a surgical procedure.

In one aspect, the display system can be configured to display the current position of the surgical instrument 200072a, b relative to the local coordinate system. In another aspect, the display system can be configured to be able to calculate whether there is or will be an interaction between the surgical instruments 200072a, b. In one aspect, the display can switch from displaying local coordinate measurements to interaction calculations when the surgical instruments 200072a, b are near each other or near tissue. Interaction calculations can be used to avoid accidental collisions between the surgical instruments 200072a, b, or to allow the user to coordinate the movement of the two surgical instruments 200072a, b to specifically control the interaction between them.

In one aspect, the display system is configured to display the true position of the surgical instruments 200072a, b relative to an externally established frame of reference. For example, triangulation beacons in communication with the hub may be positioned around the OR to establish the position and orientation of any devices within the operating room (see, e.g., figs. 19A, 19B). Further, a beacon may be attached to each of the surgical instruments 200072a, b to establish the position of each of the surgical instruments 200072a, b relative to each other, to other devices, and/or to other beacons. In one aspect, a trocar may be marked with a beacon, which allows the hub 200006 to identify which surgical instrument 200072a, b is currently inserted into the trocar. The display system can display an identifier of the surgical instrument (e.g., surgical instrument 200072a, b) so that the surgical instrument and the trocar into which it is inserted remain associated on the display.

By determining the relative position and/or orientation of the surgical instruments 200072a, b with respect to each other or with respect to other instruments, the hub 200006 can provide members of a surgical team with the angle, insertion depth, and relative orientation of the surgical instruments 200072a, b and/or the end effector of each of the surgical instruments 200072a, b. In some aspects, the position and/or orientation of the surgical instrument 200072a, b can be determined relative to the patient, surgical site, or incision site to locate critical instruments.

As disclosed above, the surgical instrument 200072a, b and/or other devices may include one or more beacons to help determine their relative position and/or orientation with respect to each other. Such beacons may be based on RF, magnetic, or another energy waveform capable of penetrating tissue as well as air, for transmitting and receiving triangulation signals. In some aspects, the hub 200006 may receive triangulation signals transmitted by beacons. In some aspects, the triangulation signals may include identifier information that permits the hub 200006 to determine which beacon is associated with which triangulation signal. In some aspects, an elongated surgical instrument (e.g., surgical instrument 200072a, b) may have multiple beacons attached to the handle and shaft, such that the orientation of the instrument shaft relative to the instrument handle may be determined by the hub 200006.
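
A minimal sketch of how the hub could estimate the shaft direction from two triangulated beacon positions (one on the handle, one on the shaft) is given below; the positions and function names are illustrative assumptions:

```python
# Hypothetical sketch: shaft orientation from a handle beacon and a shaft beacon.
import math

def shaft_direction(handle_pos, shaft_pos):
    """Unit vector pointing from the handle beacon toward the shaft beacon."""
    v = [s - h for s, h in zip(shaft_pos, handle_pos)]
    norm = math.sqrt(sum(c * c for c in v))
    return [c / norm for c in v]

# e.g., handle beacon at the origin, shaft beacon ~120 mm away along the shaft axis
print(shaft_direction((0.0, 0.0, 0.0), (84.85, 84.85, 0.0)))  # roughly [0.707, 0.707, 0.0]
```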

As disclosed above, the position and/or orientation of a surgical instrument may be determined relative to the position and/or orientation of another surgical instrument or other surgical device. In another aspect, the position and/or orientation of the surgical instrument 200072a, b can be determined relative to one or more local references. In some aspects, the one or more local references may include one or more wireless or RF beacons disposed within the surgical suite. In another aspect, the local reference may include a magnetic field generator 200074 on a shelf or mounted on a wall or ceiling within the operating room. The magnetic field generator 200074 may be configured to be able to generate a predefined magnetic field within the room, as depicted in fig. 21. Further, each surgical instrument or medical device may include one or more built-in or attached sensors to detect the magnetic field (or the RF field used with one or more RF beacons) and determine the device orientation relative to that field.

Each device (e.g., surgical instrument 200072a, b) may transmit position and/or orientation information to the hub 200006 via a wired or wireless communication system to allow the hub 200006 to track the position and orientation of the device. In one aspect, each surgical instrument 200072a, b can comprise several sensors capable of detecting their respective distances and orientations relative to a predefined magnetic field. Multiple sensors may be useful for surgical instruments that include an elongated shaft connected to a handheld unit. For example, magnetic sensors may be disposed within the handheld unit, halfway along the length of the elongate shaft, and at a distal end effector attached to the elongate shaft. The instrument can then report the position and orientation of its elongate shaft and end effector to a central programming system (e.g., executed by the hub 200006). The programming system may then calculate and track the use and configuration of all instruments in the operating room and display or highlight to the user on the visual display when there is an interaction or special condition.

In another aspect, each of the surgical instruments 200072a, b can define a coordinate system local to the instrument. In some aspects, the local coordinate system may be determined relative to one or more local references, such as the magnetic field generator 200074. In another example, a local coordinate system can be established relative to a local ground, such as a trocar port on a patient. The use of a local ground in the vicinity of the surgical instrument 200072a, b allows a local coordinate system to be established with increased spatial resolution compared to a coordinate system based on a distant beacon, such as the magnetic field generator 200074. Such a higher resolution coordinate system can provide detailed information about the position and orientation of a surgical instrument passing through the trocar. Further, for training purposes, the trocar position itself may be used to help understand port placement and other procedures and to provide information to other systems both intra- and post-operatively.

In one aspect, a first frame of reference is established relative to a device (e.g., an endoscope) positioned inside a patient, and a second frame of reference is established outside the patient relative to a predefined location. Further, the system can include means for linking one reference frame to the other, enabling the instrument position, and the jaw position relative to the tissue, to be established. Thus, the position and orientation of the device can be determined from two separate, interrelated coordinate systems.
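
The sketch below illustrates, under simplifying assumptions (a planar pose rather than a full 3D pose, and illustrative coordinates), how a point known in the internal (endoscope) frame could be mapped into the external frame once the two frames are linked:

```python
# Hypothetical sketch: linking an internal frame to an external frame by composing
# a rotation and a translation (2D simplification of a full 3D pose transform).
import math

def to_external_frame(point_in_scope_frame, scope_pose_in_external_frame):
    """scope_pose_in_external_frame = (x, y, heading_radians) of the endoscope."""
    sx, sy, heading = scope_pose_in_external_frame
    px, py = point_in_scope_frame
    cos_h, sin_h = math.cos(heading), math.sin(heading)
    return (sx + cos_h * px - sin_h * py,
            sy + sin_h * px + cos_h * py)

# A jaw position 30 mm ahead of the endoscope tip; endoscope at (100, 200) mm, rotated 90 degrees
print(to_external_frame((30.0, 0.0), (100.0, 200.0, math.pi / 2)))  # (100.0, 230.0)
```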

In one aspect, a coupling sensor may be used to link an internal visualization image within the surgical site to an external visualization image of the surgical field in order to coordinate the position of the end effector of the surgical instrument relative to patient tissue in the surgical field with the external position and orientation of the handle of the surgical instrument. For example, the primary internal visualization system may be used to determine the position, distance, and velocity between various aspects of the instrument and the tissue of interest within the body. In one aspect, the primary internal visualization system may use a dedicated frame capture imaging device. Such a device may capture images of the internal surgical site by using light beams reflected from the internal structures of the surgical site and any devices disposed therein. Thus, the refraction of the light beam by the tissue, rather than the reflectivity of the tissue, can be used to determine the distance between the internal tissue structure and the device.

In one aspect, lidar may be used as a measurement method for this type of system. Lidar measurements may use a pulsed laser to create a pattern and then measure the reflected pulses. In some aspects, such techniques may be referred to as laser scanning. In various aspects, the techniques may employ a CMOS array multi-laser light source for advanced visualization. For example, fig. 22 depicts such a system 200076 for determining the position of a surgical device 200078a, b relative to a user-selected measurement site 200080 using lidar in accordance with at least one aspect of the present disclosure. As depicted in fig. 22, the primary internal visualization system may permit a user of the surgical device 200078a, b to estimate a distance 200082 between the end effectors of the surgical devices 200078a, b. In some aspects, the surgical hub can display the position of the end effector within the surgical site. In some additional aspects, if the surgical devices 200078a, b are near or within a minimum collision distance of each other, the surgical hub may provide an alert, such as a visual indicator on a display, to warn the user of the surgical devices 200078a, b.
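
A minimal sketch of the pulsed time-of-flight calculation underlying such lidar ranging is shown below; the numerical example is illustrative only:

```python
# Minimal time-of-flight sketch: distance is half the round-trip time times the speed of light.
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def tof_distance_m(round_trip_time_s):
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

# A reflected pulse returning after ~0.67 nanoseconds corresponds to roughly 10 cm.
print(f"{tof_distance_m(0.67e-9) * 100:.1f} cm")
```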

In another aspect, RF may be used to determine the position of the end effector within the abdominal cavity or within any internal surgical field. Fig. 23 depicts such a system. Radio frequency time of flight is one method of determining the distance to a smart device. For example, the primary transmitter and receiver may be used on an endoscope or visualization system 200086. In one aspect, the primary transmitter may include a first antenna 200084a and the receiver may include a second antenna 200084b. In another aspect, the first antenna 200084a may serve as both a transmitting element and a receiving element. Similarly, the second antenna 200084b may function as both a transmitting element and a receiving element. By incorporating the primary transmitter and receiver into the tip of the visualization system 200086, the receiver can measure the distance from the visualization system 200086 to the first target device relative to the visualization focus, allowing the user to measure from a frame of reference based on what the user can see.

In one aspect, the antenna array 200083 associated with the endoscope or visualization system 200086 may comprise a first antenna 200084a and a second antenna 200084b. In one aspect, one antenna (such as the first antenna 200084a) of the antenna array 200083 may be configured to transmit signals at one frequency, while a second antenna (such as the second antenna 200084b) of the antenna array 200083 may be configured to receive signals transmitted back from the first target surgical instrument 200088. As one example, the frequency of the signal transmitted by the antenna array 200083 may be approximately 13.56 MHz. In another example, the strength of the signal received by the first target surgical instrument 200088 may be about -36 dBm RSSI. In some aspects, the return signal to the antenna array 200083 may be transmitted by the first target surgical instrument 200088 at a frequency different from the frequency of the signal transmitted by the antenna array 200083. Such a communication protocol is considered full duplex communication 200090. Different transmit and receive frequencies may be used to prevent the transmit signal from being interfered with by the receive signal (and vice versa). Further, the different transmit and receive frequencies may permit measurement of the round trip time of the signal to and from the first target surgical instrument 200088. In some aspects, the round trip time of the signal to and from the first target surgical instrument 200088 may be used to calculate the distance of the first target surgical instrument 200088 from the antenna array 200083.
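
Under the assumption that the target instrument responds after a known processing delay, the round-trip-time ranging described above could be computed as in the sketch below (the delay and timing values are hypothetical):

```python
# Hypothetical sketch: round-trip-time ranging over a full-duplex RF link.
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def rf_rtt_distance_m(round_trip_s, responder_delay_s):
    propagation_s = max(round_trip_s - responder_delay_s, 0.0)  # two-way propagation time
    return SPEED_OF_LIGHT_M_PER_S * propagation_s / 2.0

# e.g., 1.002 microseconds measured round trip with an assumed 1.000 microsecond responder delay
print(f"{rf_rtt_distance_m(1.002e-6, 1.000e-6):.2f} m")  # ~0.30 m
```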

In another aspect, the distance from the antenna array 200083 to the first target surgical instrument 200088 may be calculated based on the power loss of the signal transmitted by the antenna array 200083 or of the response signal transmitted by the first target surgical instrument 200088. Geometric factors such as spreading of the transmitted signal over distance and absorption losses due to the medium between the antenna array 200083 and the first target surgical instrument 200088 permit such distance measurements. Generally, the distance between the antenna array 200083 and the first target surgical instrument 200088 is related to the ratio of the signal strength received by the first target surgical instrument 200088 to the signal strength originally transmitted by the antenna array 200083. Alternatively, the distance between the antenna array 200083 and the first target surgical instrument 200088 may be calculated from the ratio of the signal strength of the response signal received by the antenna array 200083 to the strength of the signal transmitted by the first target surgical instrument 200088. In some examples of this technique, the signal transmitted by the first target surgical instrument 200088 may encode information related to the signal strength of the transmitted signal.
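
One common way to turn such a transmit/receive power ratio into a distance estimate is the log-distance path-loss model sketched below; the reference loss at 1 m and the path-loss exponent are assumptions, not values from the disclosure:

```python
# Hypothetical sketch: distance from power loss via a log-distance path-loss model.
def rssi_distance_m(tx_power_dbm, rx_power_dbm,
                    ref_loss_db_at_1m=40.0, path_loss_exponent=2.0):
    path_loss_db = tx_power_dbm - rx_power_dbm
    return 10 ** ((path_loss_db - ref_loss_db_at_1m) / (10 * path_loss_exponent))

# Transmit at 0 dBm, receive at -36 dBm (the example RSSI mentioned above)
print(f"{rssi_distance_m(0.0, -36.0):.2f} m")  # ~0.63 m under these assumptions
```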

Thus, the intelligent system can determine the relative position by receiving and then returning a signal. The receive array may include a Field Programmable Gate Array (FPGA) and a microcontroller configured to process measurements in real time at the speeds required by the plurality of instruments. In one aspect, the receiver antenna array 200083 may comprise two different antennas, such as the first antenna 200084a and the second antenna 200084b. The system may compare the difference between the signals received on the two antennas (first antenna 200084a and second antenna 200084b) and triangulate the source position in 3D space, as depicted in fig. 23. Fig. 23 is a diagram of a system for determining a relative position of a device via a dual antenna array 200083 in accordance with at least one aspect of the present disclosure. In the system shown in fig. 23, a dual antenna array 200083 is disposed on the endoscope 200086 and receives either actively transmitted signals or passive signals from the device to determine the relative position of the device. In one aspect, the passive signal technique may include the full duplex communication system 200090 depicted relative to the first target surgical instrument 200088. In another aspect, the active signal communication 200092 may relate to the second target surgical instrument 200094. The location of the device may be determined based on the detected signal strength, as shown in fig. 24.

Fig. 24 depicts a graph 200110 for an example of the spatial resolution achievable when determining the position of multiple target surgical instruments based on detected signal strength. The abscissa represents the ratio of the signal strengths (in dBm, for example) of the wireless communication between the target surgical instrument and a transceiver mounted on a reference device. The ordinate is the distance (e.g., in centimeters) that can be resolved based on the signal strength ratio. It can be observed in graph 200110 that the difference between the maximum distance 200112 and the minimum distance 200114 can increase as the signal strength ratio increases.

Returning to fig. 23, in another aspect, the end effector of the instrument (e.g., the second target surgical instrument 200094) may include one or more transmitters 200096 configured to continuously ping the receivers of an antenna array 200083 secured to the visualization device 200086. In some non-limiting examples, the transmitter 200096 can transmit signals at a frequency between about 860 MHz and about 960 MHz. In some examples, the transmitted signal may have a signal strength of about -60 dBm. The one or more transmitters 200096 can transmit a unique ID and the expected strength of the signal, so the receivers of the antenna array 200083 can then calculate the distance based on the received strength. In another aspect, the one or more transmitters 200096 can transmit signals to be received by multiple antennas (e.g., the first antenna 200084a and the second antenna 200084b of the antenna array 200083). The difference in the time of receipt or signal strength of the transmitted signal determined by the first antenna 200084a and the second antenna 200084b may be used to triangulate the position of the one or more transmitters 200096 and, thus, the position of the end effector of the second target surgical instrument 200094.
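
For the two-antenna case, the time-of-receipt difference can be converted into an angle of arrival, which together with a range estimate locates the transmitter; a hedged sketch with illustrative spacing and timing follows:

```python
# Hypothetical sketch: angle of arrival from the time-difference-of-arrival at two antennas.
import math

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def angle_of_arrival_deg(time_difference_s, antenna_spacing_m):
    path_difference_m = SPEED_OF_LIGHT_M_PER_S * time_difference_s
    ratio = max(-1.0, min(1.0, path_difference_m / antenna_spacing_m))  # clamp numerical noise
    return math.degrees(math.asin(ratio))

# 50 mm antenna spacing; the signal reaches one antenna ~0.1 ns later than the other
print(f"{angle_of_arrival_deg(1.0e-10, 0.05):.1f} degrees")  # ~36.9 degrees
```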

In another aspect, an RFID tag can be placed on or in the end effector of each target surgical instrument. The RFID tag may be activated by a signal transmitted by a transmitting antenna. In some aspects, the transmit antenna may be part of the antenna array 200083 disposed on the surgical visualization device 200086. In some aspects, each antenna (e.g., the first antenna 200084a and the second antenna 200084b) of the antenna array 200083 may act as a separate transmit antenna. Alternatively, one antenna of the antenna array 200083 may be a transmit antenna and another antenna of the antenna array 200083 may be a receive antenna. Thus, the strength of the transmitted signal received by the RFID tag may be used to determine the distance of the RFID tag from the transmitter antenna. In another aspect, the power transmission strength of the transmitted signal may be varied, allowing the distance to be determined using the wake-up process of the RFID tag. The wake-up process of the RFID tag may be initiated by receiving a radio frequency signal having a power greater than a threshold power. It can be appreciated that the power of the transmitted signal decays with increasing distance. Thus, an RFID tag disposed at a distance at which the received signal is too weak will not enter the wake-up process. However, an RFID tag disposed at a close distance may receive the transmitted signal with sufficient power to initiate the wake-up process. In any of these examples, the transmitter antenna transmits a power signal for receipt by a passive RFID tag on the end effector. Upon receiving a transmitted signal with sufficient power, the RFID tag may wake up and then transmit a return RF signal to be received by the receiver antenna. The return signal may include a unique identifier that the system may use to measure distances from itself to multiple devices within the surgical site.
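
The variable-power wake-up approach could be realized roughly as sketched below: the reader sweeps its transmit power and records the lowest power at which the tag responds, and a free-space path-loss assumption maps that threshold to a distance. The tag sensitivity and frequency are assumptions, not values from the disclosure:

```python
# Hypothetical sketch: distance from the minimum transmit power that wakes a passive RFID tag,
# assuming free-space path loss.
import math

def wakeup_distance_m(min_tx_power_dbm, tag_sensitivity_dbm=-18.0, frequency_hz=915e6):
    allowed_loss_db = min_tx_power_dbm - tag_sensitivity_dbm
    wavelength_m = 299_792_458.0 / frequency_hz
    # Free-space path loss: loss_db = 20 * log10(4 * pi * d / wavelength); solve for d.
    return (wavelength_m / (4 * math.pi)) * 10 ** (allowed_loss_db / 20.0)

# If the tag first wakes when the reader transmits at 10 dBm:
print(f"{wakeup_distance_m(10.0):.2f} m")  # ~0.65 m under these assumptions
```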

Returning to fig. 23, in another aspect, a single scanning array laser may be used to detect only the distance 200098 between itself and structures within the body 200099. The scanning laser array may be cycled out of sequence with the primary visualization system 200086 to prevent light from the rangefinder from interfering with light from the primary visualization device. Alternatively, an energy source outside the sensing capabilities of the primary visualization array may be utilized. If the main visualization device can detect EMR from the near infrared to the near ultraviolet, then the scanning laser array can use a light/EMR source that falls entirely within the ultraviolet spectrum. Alternatively, ultrasound, microwave, or RF may be used to move the ranging signal completely into another region of the spectrum to prevent interference between the scanning array and the visualization device. For example, ultrasonic diffuse and retro-reflective sensors may be used to determine the distance and size of an object through a gaseous medium within their range (e.g., Senix or Pepperl + Fuchs ultrasonic sensors). As one example, the distance measurement 200098 between the primary visualization system 200086 and a particular structure within the body 200099 can be combined with the measurements that determine the location of the first target surgical instrument 200088 to calculate the distance between the first target surgical instrument 200088 and the particular structure within the body 200099. As another example, a contact ultrasound sensor may be used to interrogate tissue, fluids, etc. for use in an imaging device. As yet another example, a combination of these two sources may be used to determine the tissue location and instrument location within the insufflated gas of the patient's abdomen.
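
The combination described in the paragraph above reduces to a vector computation once both the ranged structure and the instrument are expressed in the visualization system's frame of reference; the coordinates below are illustrative:

```python
# Hypothetical sketch: instrument-to-structure distance from two positions expressed in the
# visualization system's frame of reference.
import math

def instrument_to_structure_mm(structure_pos_mm, instrument_pos_mm):
    return math.dist(structure_pos_mm, instrument_pos_mm)

structure = (0.0, 0.0, 80.0)     # tissue structure ranged 80 mm in front of the scope tip
instrument = (15.0, -5.0, 60.0)  # measured position of the first target instrument's end effector
print(f"{instrument_to_structure_mm(structure, instrument):.1f} mm")  # ~25.5 mm
```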

In another aspect, infrared identification and tracking may be used via a camera that projects light into and views the operating room. For example, at least two separate reflectors, or one reflector having an aspect ratio in at least two planes, may be used to determine the position and orientation of a target surgical instrument relative to the trocar and relative to an endoscopic image within the patient.

Examples

Various aspects of the subject matter described herein are set forth in the following numbered examples:
