Interactive surgical system
Reading note: This technology, Interactive surgical system, was created by F. E. Shelton IV and J. L. Harris, 2018-11-14. The invention discloses a surgical instrument. The surgical instrument includes an end effector, a user interface, and a control circuit. The end effector is configured to deploy staples into tissue grasped by the end effector and to cut the grasped tissue during a firing stroke. The control circuit is configured to cause at least one parameter setting associated with the firing stroke to be displayed on the user interface, cause interpretation information related to the firing stroke to be displayed on the user interface concurrently with the at least one parameter setting, wherein the interpretation information is based on external data, and recommend an adjustment to the at least one parameter setting through the user interface, wherein the recommended adjustment is based on the interpretation information.
1. A surgical instrument, comprising:
an end effector configured to deploy staples into tissue grasped by the end effector and to cut the grasped tissue during a firing stroke;
a user interface; and
a control circuit configured to:
cause at least one parameter setting associated with the firing stroke to be displayed on the user interface;
cause interpretation information related to the firing stroke to be displayed on the user interface concurrently with the at least one parameter setting, wherein the interpretation information is based on external data; and
recommend, via the user interface, an adjustment to the at least one parameter setting, wherein the recommended adjustment is based on the interpretation information.
2. The surgical instrument of claim 1, wherein the external data originates from a measurement device independent of the surgical instrument.
3. The surgical instrument of claim 1, wherein the external data is transmitted to the surgical instrument over a wireless communication link.
4. The surgical instrument of claim 1, wherein the interpretation information is updated in real time.
5. The surgical instrument of claim 1, wherein the interpretation information is updated at a predetermined update rate.
6. The surgical instrument of claim 1, wherein the interpretation information relates to tissue hemostasis.
7. The surgical instrument of claim 1, wherein the interpretation information relates to hemostasis of tissue previously treated with the end effector.
8. The surgical instrument of claim 1, wherein the at least one parameter setting comprises a speed setting of the firing stroke.
9. The surgical instrument of claim 1, wherein the at least one parameter setting comprises a wait time before beginning the firing stroke.
10. A surgical instrument, comprising:
an end effector configured to perform a function for treating tissue grasped by the end effector;
a user interface; and
a control circuit configured to:
cause at least one parameter setting associated with the function to be displayed on the user interface;
cause interpretation information related to the function to be displayed on the user interface concurrently with the at least one parameter setting, wherein the interpretation information is based on external data; and
recommend, via the user interface, an adjustment to the at least one parameter setting, wherein the recommended adjustment is based on the interpretation information.
11. The surgical instrument of claim 10, wherein the interpretation information is updated in real time.
12. The surgical instrument of claim 10, wherein the interpretation information is updated at a predetermined update rate.
13. The surgical instrument of claim 10, wherein the interpretation information relates to tissue hemostasis.
14. The surgical instrument of claim 10, wherein the interpretation information relates to hemostasis of tissue previously treated with the end effector.
15. The surgical instrument of claim 10, wherein the interpretation information relates to a blood pressure of a selected blood vessel.
16. The surgical instrument of claim 10, wherein the at least one parameter setting comprises a speed setting of a firing stroke.
17. The surgical instrument of claim 10, wherein the at least one parameter setting comprises a wait time before starting a firing stroke.
18. A surgical instrument for use with a medical imaging device and a surgical hub, the surgical hub including a visualization module in communication with the medical imaging device, the surgical instrument comprising:
an end effector configured to perform a function for treating tissue grasped by the end effector;
a user interface; and
a control circuit configured to:
receive an input from the surgical hub indicating a location of a critical structure, determined by the visualization module, relative to a current field of view of the medical imaging device; and
cause the user interface to recommend an adjustment to change the location of the critical structure relative to the current field of view of the medical imaging device based on the received input.
19. The surgical instrument of claim 18, wherein the critical structure is the end effector.
20. The surgical instrument of claim 18, wherein the adjustment comprises selecting an auto-centering mode.
Background
The present disclosure relates to various surgical systems. Surgical procedures are often performed in operating theaters or rooms of medical facilities such as, for example, hospitals. A sterile field is typically created around the patient. The sterile field may include the properly attired, scrubbed members of the surgical team, as well as all of the equipment and fixtures in the field. Various surgical devices and systems are utilized in performing surgical procedures.
Furthermore, in the digital and information age, medical systems and facilities are often slow to implement systems or procedures that utilize newer and improved technologies, owing to patient-safety considerations and a general expectation of maintaining traditional practices. As a result, medical systems and facilities may lack communication and shared knowledge with other adjacent or similarly situated facilities. To improve patient outcomes, it is desirable to find ways to better interconnect medical systems and facilities.
Disclosure of Invention
In various embodiments, a surgical instrument is disclosed that includes an end effector, a user interface, and a control circuit. The end effector is configured to deploy staples into tissue grasped by the end effector and to cut the grasped tissue during a firing stroke. The control circuit is configured to cause at least one parameter setting associated with the firing stroke to be displayed on the user interface, cause interpretation information related to the firing stroke to be displayed on the user interface concurrently with the at least one parameter setting, wherein the interpretation information is based on external data, and recommend an adjustment to the at least one parameter setting through the user interface, wherein the recommended adjustment is based on the interpretation information.
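The control-circuit behavior described above (display a firing-stroke parameter setting, display interpretation information derived from external data alongside it, and recommend an adjustment based on that information) can be sketched in Python. This is a minimal illustration only: all names, the hemostasis-based interpretation rule, and the halve-the-speed recommendation are assumptions for the sketch, not details taken from this disclosure.

```python
from dataclasses import dataclass


@dataclass
class ExternalData:
    """External data, e.g. from a measurement device independent of the
    instrument (hypothetical field; the disclosure specifies no format)."""
    bleeding_detected: bool


class ControlCircuit:
    """Minimal sketch: pair a firing-stroke parameter setting with
    interpretation information and a recommended adjustment."""

    def __init__(self, firing_speed_mm_s: float):
        self.settings = {"firing_speed_mm_s": firing_speed_mm_s}

    def interpret(self, data: ExternalData) -> str:
        # Interpretation information is based on the external data
        # (here, a simple hemostasis assessment).
        return "incomplete hemostasis" if data.bleeding_detected else "hemostasis achieved"

    def recommend(self, data: ExternalData) -> dict:
        # The recommended adjustment is based on the interpretation
        # information: slow the firing stroke if hemostasis is incomplete.
        info = self.interpret(data)
        recommended = self.settings["firing_speed_mm_s"]
        if info == "incomplete hemostasis":
            recommended = round(recommended * 0.5, 3)
        # In the described instrument, all three items would be shown
        # concurrently on the user interface.
        return {
            "parameter_setting_mm_s": self.settings["firing_speed_mm_s"],
            "interpretation": info,
            "recommended_speed_mm_s": recommended,
        }


circuit = ControlCircuit(firing_speed_mm_s=3.0)
print(circuit.recommend(ExternalData(bleeding_detected=True)))
```

Claims 8 and 9 suggest other candidate settings (for example, a wait time before beginning the firing stroke) that could be recommended in the same way.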
In various embodiments, a surgical instrument is disclosed that includes an end effector, a user interface, and a control circuit. The end effector is configured to perform a function for treating tissue grasped by the end effector. The control circuit is configured to cause at least one parameter setting associated with the function to be displayed on the user interface, cause interpretation information related to the function to be displayed on the user interface concurrently with the at least one parameter setting, wherein the interpretation information is based on external data, and recommend an adjustment to the at least one parameter setting through the user interface, wherein the recommended adjustment is based on the interpretation information.
In various embodiments, a surgical instrument for use with a medical imaging device and a surgical hub including a visualization module in communication with the medical imaging device is disclosed. The surgical instrument includes an end effector, a user interface, and a control circuit. The end effector is configured to perform a function for treating tissue grasped by the end effector. The control circuit is configured to receive an input from the surgical hub indicating a position of a critical structure relative to a current field of view of the medical imaging device as determined by the visualization module, and cause the user interface to recommend an adjustment to change the position of the critical structure relative to the current field of view of the medical imaging device based on the received input.
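The field-of-view behavior described above can be sketched similarly: given the critical structure's location relative to the current field of view (supplied by the surgical hub's visualization module), recommend an adjustment, such as selecting an auto-centering mode, that would move the structure toward the center. The normalized image coordinates and the 10% tolerance are illustrative assumptions, not details from the disclosure.

```python
from dataclasses import dataclass


@dataclass
class HubInput:
    """Hypothetical hub input: critical-structure location in normalized
    image coordinates, where (0.5, 0.5) is the center of the field of view."""
    structure_x: float
    structure_y: float


def recommend_view_adjustment(inp: HubInput, tolerance: float = 0.1) -> str:
    """Recommend an adjustment that would recenter the critical structure."""
    dx = inp.structure_x - 0.5
    dy = inp.structure_y - 0.5
    if abs(dx) <= tolerance and abs(dy) <= tolerance:
        return "no adjustment needed"
    # Recommend panning opposite to the structure's offset from center,
    # e.g. by selecting an auto-centering mode (claim 20).
    return f"recommend auto-centering: pan {-dx:+.2f} horizontally, {-dy:+.2f} vertically"


print(recommend_view_adjustment(HubInput(structure_x=0.9, structure_y=0.5)))
```

As claim 19 notes, the "critical structure" may itself be the end effector, so the same recommendation logic could keep the working instrument centered in view.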
Drawings
The aspects described herein, however, both as to organization and method of operation, together with further objects and advantages thereof, may best be understood by reference to the following description, taken in connection with the accompanying drawings, which are set forth below.
Fig. 1 is a block diagram of a computer-implemented interactive surgical system in accordance with at least one aspect of the present disclosure.
Fig. 2 is a surgical system for performing a surgical procedure in an operating room according to at least one aspect of the present disclosure.
Fig. 3 is a surgical hub paired with a visualization system, a robotic system, and a smart instrument according to at least one aspect of the present disclosure.
Fig. 4 is a partial perspective view of a surgical hub housing and a composite generator module slidably received in a drawer of the surgical hub housing according to at least one aspect of the present disclosure.
Fig. 5 is a perspective view of a combined generator module having bipolar, ultrasonic and monopolar contacts and a smoke evacuation component according to at least one aspect of the present disclosure.
Fig. 6 illustrates a single power bus attachment for a plurality of lateral docking ports of a lateral modular housing configured to be capable of receiving a plurality of modules in accordance with at least one aspect of the present disclosure.
Fig. 7 illustrates a vertical modular housing configured to be capable of receiving a plurality of modules in accordance with at least one aspect of the present disclosure.
Fig. 8 illustrates a surgical data network including a modular communication hub configured to connect modular devices located in one or more operating rooms of a medical facility or any room in a medical facility dedicated to surgical operations to a cloud in accordance with at least one aspect of the present disclosure.
Fig. 9 illustrates a computer-implemented interactive surgical system in accordance with at least one aspect of the present disclosure.
Fig. 10 illustrates a surgical hub including a plurality of modules coupled to a modular control tower according to at least one aspect of the present disclosure.
Fig. 11 illustrates one aspect of a Universal Serial Bus (USB) hub device in accordance with at least one aspect of the present disclosure.
Fig. 12 is a block diagram of a cloud computing system including a plurality of smart surgical instruments coupled to a surgical hub connectable to cloud components of the cloud computing system in accordance with at least one aspect of the present disclosure.
Fig. 13 is a functional module architecture of a cloud computing system according to at least one aspect of the present disclosure.
Fig. 14 illustrates a diagram of a situation-aware surgical system in accordance with at least one aspect of the present disclosure.
Fig. 15 is a timeline depicting situational awareness of a surgical hub, according to at least one aspect of the present disclosure.
Fig. 16 illustrates a surgical device including a user interface and a surgical stapling end effector in accordance with at least one aspect of the present disclosure.
Fig. 17 is a schematic view of various components of the surgical device of Fig. 16.
Fig. 18 is a logic flow diagram of a process depicting a control program or logic configuration for displaying interpretation information based on external data in accordance with at least one aspect of the present disclosure.
Fig. 19 is a logic flow diagram of a process depicting a control program or logic configuration for displaying interpretation information based on external data in accordance with at least one aspect of the present disclosure.
Fig. 20 illustrates a surgical device including a user interface and a surgical stapling end effector in accordance with at least one aspect of the present disclosure.
Fig. 21 is a logic flow diagram of a process depicting a control program or logic configuration for adjusting parameter settings of the surgical device of Fig. 20 in accordance with at least one aspect of the present disclosure.
Fig. 22 is a logic flow diagram of a process depicting a control program or logic configuration for adjusting parameter settings of the surgical device of Fig. 20 in accordance with at least one aspect of the present disclosure.
Fig. 23 illustrates a surgical device including a user interface and a surgical stapling end effector in accordance with at least one aspect of the present disclosure.
Fig. 24 is a logic flow diagram of a process depicting a control program or logic configuration for automatically adjusting a field of view of a medical imaging device relative to a detected critical structure in accordance with at least one aspect of the present disclosure.
Fig. 25 is a logic flow diagram of a process depicting a control program or logic configuration for obtaining user permission to automatically adjust a field of view of a medical imaging device relative to a critical structure in accordance with at least one aspect of the present disclosure.
Detailed Description
The applicant of the present patent application owns the following U.S. patent applications filed on November 6, 2018, the disclosures of each of which are incorporated herein by reference in their entirety:
U.S. patent application 16/182,224 entitled "SURGICAL NETWORK, INSTRUMENT, AND CLOUD RESPONSES BASED ON VALIDATION OF RECEIVED DATASET AND AUTHENTICATION OF ITS SOURCE AND INTEGRITY";
U.S. patent application 16/182,230 entitled "SURGICAL SYSTEM FOR PRESENTING INFORMATION INTERPRETED FROM EXTERNAL DATA";
U.S. patent application 16/182,233 entitled "MODULATION OF SURGICAL SYSTEMS CONTROL PROCESSES BASED ON MACHINE LEARNING";
U.S. patent application 16/182,239 entitled "ADJUSTMENT OF DEVICE CONTROL PROGRAMS BASED ON STRATIFIED CONTEXTUAL DATA IN ADDITION TO THE DATA";
U.S. patent application 16/182,243 entitled "SURGICAL HUB AND MODULAR DEVICE RESPONSE ADJUSTMENT BASED ON SITUATIONAL AWARENESS";
U.S. patent application 16/182,248 entitled "DETECTION AND ESCALATION OF SECURITY RESPONSES OF SURGICAL INSTRUMENTS TO INCREASING SEVERITY THREATS";
U.S. patent application 16/182,260 entitled "AUTOMATED DATA SCALING, ALIGNMENT, AND ORGANIZING BASED ON PREDEFINED PARAMETERS WITHIN SURGICAL NETWORKS";
U.S. patent application 16/182,267 entitled "SENSING THE PATIENT POSITION AND CONTACT UTILIZING THE MONO-POLAR RETURN PAD ELECTRODE TO PROVIDE SITUATIONAL AWARENESS TO A SURGICAL NETWORK";
U.S. patent application 16/182,249 entitled "POWERED SURGICAL TOOL WITH PREDEFINED ADJUSTABLE CONTROL ALGORITHM FOR CONTROLLING END EFFECTOR PARAMETER";
U.S. patent application 16/182,246 entitled "ADJUSTMENTS BASED ON AIRBORNE PARTICLE PROPERTIES";
U.S. patent application 16/182,256 entitled "ADJUSTMENT OF A SURGICAL DEVICE FUNCTION BASED ON SITUATIONAL AWARENESS";
U.S. patent application 16/182,242 entitled "REAL-TIME ANALYSIS OF COMPREHENSIVE COST OF ALL INSTRUMENTATION USED IN SURGERY UTILIZING DATA FLUIDITY TO TRACK INSTRUMENTS THROUGH STOCKING AND IN-HOUSE PROCESSES";
U.S. patent application 16/182,255 entitled "USAGE AND TECHNIQUE ANALYSIS OF SURGEON/STAFF PERFORMANCE AGAINST A BASELINE TO OPTIMIZE DEVICE UTILIZATION FOR BOTH CURRENT AND FUTURE PROCEDURES";
U.S. patent application 16/182,269 entitled "IMAGE CAPTURING OF THE AREAS OUTSIDE THE ABDOMEN TO IMPROVE PLACEMENT AND CONTROL OF A SURGICAL DEVICE IN USE";
U.S. patent application 16/182,278 entitled "COMMUNICATION OF DATA WHERE A SURGICAL NETWORK USES CONTEXT OF THE DATA AND REQUIREMENTS OF A RECEIVING SYSTEM/USER TO INFLUENCE INCLUSION OR LINKAGE OF DATA AND METADATA TO ESTABLISH CONTINUITY";
U.S. patent application 16/182,290 entitled "SURGICAL NETWORK RECOMMENDATIONS FROM REAL TIME ANALYSIS OF PROCEDURE VARIABLES AGAINST A BASELINE HIGHLIGHTING DIFFERENCES FROM THE OPTIMAL SOLUTION";
U.S. patent application 16/182,232 entitled "CONTROL OF A SURGICAL SYSTEM THROUGH A SURGICAL BARRIER";
U.S. patent application 16/182,227 entitled "SURGICAL NETWORK DETERMINATION OF PRIORITIZATION OF COMMUNICATION, INTERACTION, OR PROCESSING BASED ON SYSTEM OR DEVICE NEEDS";
U.S. patent application 16/182,231 entitled "WIRELESS PAIRING OF A SURGICAL DEVICE WITH ANOTHER DEVICE WITHIN A STERILE SURGICAL FIELD BASED ON THE USAGE AND SITUATIONAL AWARENESS OF DEVICES";
U.S. patent application 16/182,229 entitled "ADJUSTMENT OF STAPLE HEIGHT OF AT LEAST ONE ROW OF STAPLES BASED ON THE SENSED TISSUE THICKNESS OR FORCE IN CLOSING";
U.S. patent application 16/182,234 entitled "STAPLING DEVICE WITH BOTH COMPULSORY AND DISCRETIONARY LOCKOUTS BASED ON SENSED PARAMETERS";
U.S. patent application 16/182,240 entitled "POWERED STAPLING DEVICE CONFIGURED TO ADJUST FORCE, ADVANCEMENT SPEED, AND OVERALL STROKE OF CUTTING MEMBER BASED ON SENSED PARAMETER OF FIRING OR CLAMPING";
U.S. patent application 16/182,235 entitled "VARIATION OF RADIO FREQUENCY AND ULTRASONIC POWER LEVEL IN COOPERATION WITH VARYING CLAMP ARM PRESSURE TO ACHIEVE PREDEFINED HEAT FLUX OR POWER APPLIED TO TISSUE"; and
U.S. patent application 16/182,238 entitled "ULTRASONIC ENERGY DEVICE WHICH VARIES PRESSURE APPLIED BY CLAMP ARM TO PROVIDE THRESHOLD CONTROL AT A CUT PROGRESSION LOCATION".
The applicant of the present patent application owns the following U.S. provisional patent applications filed on September 10, 2018, the disclosures of each of which are incorporated herein by reference in their entirety:
U.S. provisional patent application 62/729,183 entitled "A CONTROL FOR A SURGICAL NETWORK OR SURGICAL NETWORK CONNECTED DEVICE THAT ADJUSTS ITS FUNCTION BASED ON A SENSED SITUATION OR USAGE";
U.S. provisional patent application 62/729,177 entitled "AUTOMATED DATA SCALING, ALIGNMENT, AND ORGANIZING BASED ON PREDEFINED PARAMETERS WITHIN A SURGICAL NETWORK BEFORE TRANSMISSION";
U.S. provisional patent application 62/729,176 entitled "INDIRECT COMMAND AND CONTROL OF A FIRST OPERATING ROOM SYSTEM THROUGH THE USE OF A SECOND OPERATING ROOM SYSTEM WITHIN A STERILE FIELD WHERE THE SECOND OPERATING ROOM SYSTEM HAS PRIMARY AND SECONDARY OPERATING MODES";
U.S. provisional patent application 62/729,185 entitled "POWERED STAPLING DEVICE THAT IS CAPABLE OF ADJUSTING FORCE, ADVANCEMENT SPEED, AND OVERALL STROKE OF CUTTING MEMBER OF THE DEVICE BASED ON SENSED PARAMETER OF FIRING OR CLAMPING";
U.S. provisional patent application 62/729,184 entitled "POWERED SURGICAL TOOL WITH A PREDEFINED ADJUSTABLE CONTROL ALGORITHM FOR CONTROLLING AT LEAST ONE END EFFECTOR PARAMETER AND A MEANS FOR LIMITING THE ADJUSTMENT";
U.S. provisional patent application 62/729,182 entitled "SENSING THE PATIENT POSITION AND CONTACT UTILIZING THE MONO-POLAR RETURN PAD ELECTRODE TO PROVIDE SITUATIONAL AWARENESS TO THE HUB";
U.S. provisional patent application 62/729,191 entitled "SURGICAL NETWORK RECOMMENDATIONS FROM REAL TIME ANALYSIS OF PROCEDURE VARIABLES AGAINST A BASELINE HIGHLIGHTING DIFFERENCES FROM THE OPTIMAL SOLUTION";
U.S. provisional patent application 62/729,195 entitled "ULTRASONIC ENERGY DEVICE WHICH VARIES PRESSURE APPLIED BY CLAMP ARM TO PROVIDE THRESHOLD CONTROL AT A CUT PROGRESSION LOCATION"; and
U.S. provisional patent application 62/729,186 entitled "WIRELESS PAIRING OF A SURGICAL DEVICE WITH ANOTHER DEVICE WITHIN A STERILE SURGICAL FIELD BASED ON THE USAGE AND SITUATIONAL AWARENESS OF DEVICES".
The applicant of the present patent application owns the following U.S. patent applications filed on August 28, 2018, the disclosures of each of which are incorporated herein by reference in their entirety:
U.S. patent application 16/115,214 entitled "ESTIMATING STATE OF ULTRASONIC END EFFECTOR AND CONTROL SYSTEM THEREFOR";
U.S. patent application 16/115,205 entitled "TEMPERATURE CONTROL OF ULTRASONIC END EFFECTOR AND CONTROL SYSTEM THEREFOR";
U.S. patent application 16/115,233 entitled "RADIO FREQUENCY ENERGY DEVICE FOR DELIVERING COMBINED ELECTRICAL SIGNALS";
U.S. patent application 16/115,208 entitled "CONTROLLING AN ULTRASONIC SURGICAL INSTRUMENT ACCORDING TO TISSUE LOCATION";
U.S. patent application 16/115,220 entitled "CONTROLLING ACTIVATION OF AN ULTRASONIC SURGICAL INSTRUMENT ACCORDING TO THE PRESENCE OF TISSUE";
U.S. patent application 16/115,232 entitled "DETERMINING TISSUE COMPOSITION VIA AN ULTRASONIC SYSTEM";
U.S. patent application 16/115,239 entitled "DETERMINING THE STATE OF AN ULTRASONIC ELECTROMECHANICAL SYSTEM ACCORDING TO FREQUENCY SHIFT";
U.S. patent application 16/115,247 entitled "DETERMINING THE STATE OF AN ULTRASONIC END EFFECTOR";
U.S. patent application 16/115,211 entitled "SITUATIONAL AWARENESS OF ELECTROSURGICAL SYSTEMS";
U.S. patent application 16/115,226 entitled "MECHANISMS FOR CONTROLLING DIFFERENT ELECTROMECHANICAL SYSTEMS OF AN ELECTROSURGICAL INSTRUMENT";
U.S. patent application 16/115,240 entitled "DETECTION OF END EFFECTOR IMMERSION IN LIQUID";
U.S. patent application 16/115,249 entitled "INTERRUPTION OF ENERGY DUE TO INADVERTENT CAPACITIVE COUPLING";
U.S. patent application 16/115,256 entitled "INCREASING RADIO FREQUENCY TO CREATE PAD-LESS MONOPOLAR LOOP";
U.S. patent application 16/115,223 entitled "BIPOLAR COMBINATION DEVICE THAT AUTOMATICALLY ADJUSTS PRESSURE BASED ON ENERGY MODALITY"; and
U.S. patent application 16/115,238 entitled "ACTIVATION OF ENERGY DEVICES".
The applicant of the present patent application owns the following U.S. provisional patent applications filed on August 23, 2018, the disclosures of each of which are incorporated herein by reference in their entirety:
U.S. provisional patent application 62/721,995 entitled "CONTROLLING AN ULTRASONIC SURGICAL INSTRUMENT ACCORDING TO TISSUE LOCATION";
U.S. provisional patent application 62/721,998 entitled "SITUATIONAL AWARENESS OF ELECTROSURGICAL SYSTEMS";
U.S. provisional patent application 62/721,999 entitled "INTERRUPTION OF ENERGY DUE TO INADVERTENT CAPACITIVE COUPLING";
U.S. provisional patent application 62/721,994 entitled "BIPOLAR COMBINATION DEVICE THAT AUTOMATICALLY ADJUSTS PRESSURE BASED ON ENERGY MODALITY"; and
U.S. provisional patent application 62/721,996 entitled "RADIO FREQUENCY ENERGY DEVICE FOR DELIVERING COMBINED ELECTRICAL SIGNALS".
The applicant of the present patent application owns the following U.S. provisional patent applications filed on June 30, 2018, the disclosures of each of which are incorporated herein by reference in their entirety:
U.S. provisional patent application 62/692,747 entitled "SMART ACTIVATION OF AN ENERGY DEVICE BY ANOTHER DEVICE";
U.S. provisional patent application 62/692,748 entitled "SMART ENERGY ARCHITECTURE"; and
U.S. provisional patent application 62/692,768 entitled "SMART ENERGY DEVICES".
The applicant of the present patent application owns the following U.S. patent applications filed on June 29, 2018, the disclosures of each of which are incorporated herein by reference in their entirety:
U.S. patent application Ser. No. 16/024,090 entitled "CAPACITIVE COUPLED RETURN PATH PAD WITH SEPARABLE ARRAY ELEMENTS";
U.S. patent application Ser. No. 16/024,057 entitled "CONTROLLING A SURGICAL INSTRUMENT ACCORDING TO SENSED CLOSURE PARAMETERS";
U.S. patent application Ser. No. 16/024,067 entitled "SYSTEMS FOR ADJUSTING END EFFECTOR PARAMETERS BASED ON PERIOPERATIVE INFORMATION";
U.S. patent application Ser. No. 16/024,075 entitled "SAFETY SYSTEMS FOR SMART POWERED SURGICAL STAPLING";
U.S. patent application Ser. No. 16/024,083 entitled "SAFETY SYSTEMS FOR SMART POWERED SURGICAL STAPLING";
U.S. patent application Ser. No. 16/024,094 entitled "SURGICAL SYSTEMS FOR DETECTING END EFFECTOR TISSUE DISTRIBUTION IRREGULARITIES";
U.S. patent application Ser. No. 16/024,138 entitled "SYSTEMS FOR DETECTING PROXIMITY OF SURGICAL END EFFECTOR TO CANCEROUS TISSUE";
U.S. patent application Ser. No. 16/024,150 entitled "SURGICAL INSTRUMENT CARTRIDGE SENSOR ASSEMBLIES";
U.S. patent application Ser. No. 16/024,160 entitled "VARIABLE OUTPUT CARTRIDGE SENSOR ASSEMBLY";
U.S. patent application Ser. No. 16/024,124 entitled "SURGICAL INSTRUMENT HAVING A FLEXIBLE ELECTRODE";
U.S. patent application Ser. No. 16/024,132 entitled "SURGICAL INSTRUMENT HAVING A FLEXIBLE CIRCUIT";
U.S. patent application Ser. No. 16/024,141 entitled "SURGICAL INSTRUMENT WITH A TISSUE MARKING ASSEMBLY";
U.S. patent application Ser. No. 16/024,162 entitled "SURGICAL SYSTEMS WITH PRIORITIZED DATA TRANSMISSION CAPABILITIES";
U.S. patent application Ser. No. 16/024,066 entitled "SURGICAL EVACUATION SENSING AND MOTOR CONTROL";
U.S. patent application Ser. No. 16/024,096 entitled "SURGICAL EVACUATION SENSOR ARRANGEMENTS";
U.S. patent application Ser. No. 16/024,116 entitled "SURGICAL EVACUATION FLOW PATHS";
U.S. patent application Ser. No. 16/024,149 entitled "SURGICAL EVACUATION SENSING AND GENERATOR CONTROL";
U.S. patent application Ser. No. 16/024,180 entitled "SURGICAL EVACUATION SENSING AND DISPLAY";
U.S. patent application Ser. No. 16/024,245 entitled "COMMUNICATION OF SMOKE EVACUATION SYSTEM PARAMETERS TO HUB OR CLOUD IN SMOKE EVACUATION MODULE FOR INTERACTIVE SURGICAL PLATFORM";
U.S. patent application Ser. No. 16/024,258 entitled "SMOKE EVACUATION SYSTEM INCLUDING A SEGMENTED CONTROL CIRCUIT FOR INTERACTIVE SURGICAL PLATFORM";
U.S. patent application Ser. No. 16/024,265 entitled "SURGICAL EVACUATION SYSTEM WITH A COMMUNICATION CIRCUIT FOR COMMUNICATION BETWEEN A FILTER AND A SMOKE EVACUATION DEVICE"; and
U.S. patent application Ser. No. 16/024,273 entitled "DUAL IN-SERIES LARGE AND SMALL DROPLET FILTERS".
The applicant of the present patent application owns the following U.S. provisional patent applications filed on June 28, 2018, the disclosures of each of which are incorporated herein by reference in their entirety:
U.S. provisional patent application Ser. No. 62/691,228 entitled "A METHOD OF USING REINFORCED FLEX CIRCUITS WITH MULTIPLE SENSORS WITH ELECTROSURGICAL DEVICES";
U.S. provisional patent application Ser. No. 62/691,227 entitled "CONTROLLING A SURGICAL INSTRUMENT ACCORDING TO SENSED CLOSURE PARAMETERS";
U.S. provisional patent application Ser. No. 62/691,230 entitled "SURGICAL INSTRUMENT HAVING A FLEXIBLE ELECTRODE";
U.S. provisional patent application Ser. No. 62/691,219 entitled "SURGICAL EVACUATION SENSING AND MOTOR CONTROL";
U.S. provisional patent application Ser. No. 62/691,257 entitled "COMMUNICATION OF SMOKE EVACUATION SYSTEM PARAMETERS TO HUB OR CLOUD IN SMOKE EVACUATION MODULE FOR INTERACTIVE SURGICAL PLATFORM";
U.S. provisional patent application Ser. No. 62/691,262 entitled "SURGICAL EVACUATION SYSTEM WITH A COMMUNICATION CIRCUIT FOR COMMUNICATION BETWEEN A FILTER AND A SMOKE EVACUATION DEVICE"; and
U.S. provisional patent application Ser. No. 62/691,251 entitled "DUAL IN-SERIES LARGE AND SMALL DROPLET FILTERS".
The applicant of the present patent application owns the following U.S. provisional patent application filed on April 19, 2018, the disclosure of which is incorporated herein by reference in its entirety:
U.S. provisional patent application serial No. 62/659,900 entitled "METHOD OF HUB COMMUNICATION".
The applicant of the present patent application owns the following U.S. provisional patent applications filed on March 30, 2018, the disclosures of each of which are incorporated herein by reference in their entirety:
U.S. provisional patent application 62/650,898 entitled "CAPACITIVE COUPLED RETURN PATH PAD WITH SEPARABLE ARRAY ELEMENTS", filed March 30, 2018;
U.S. provisional patent application Ser. No. 62/650,887 entitled "SURGICAL SYSTEMS WITH OPTIMIZED SENSING CAPABILITIES";
U.S. provisional patent application Ser. No. 62/650,882 entitled "SMOKE EVACUATION MODULE FOR INTERACTIVE SURGICAL PLATFORM"; and
U.S. provisional patent application Ser. No. 62/650,877 entitled "SURGICAL SMOKE EVACUATION SENSING AND CONTROLS".
The applicant of the present patent application owns the following U.S. patent applications filed on March 29, 2018, the disclosures of each of which are incorporated herein by reference in their entirety:
U.S. patent application Ser. No. 15/940,641 entitled "INTERACTIVE SURGICAL SYSTEMS WITH ENCRYPTED COMMUNICATION CAPABILITIES";
U.S. patent application Ser. No. 15/940,648 entitled "INTERACTIVE SURGICAL SYSTEMS WITH CONDITION HANDLING OF DEVICES AND DATA CAPABILITIES";
U.S. patent application Ser. No. 15/940,656 entitled "SURGICAL HUB COORDINATION OF CONTROL AND COMMUNICATION OF OPERATING ROOM DEVICES";
U.S. patent application Ser. No. 15/940,666 entitled "SPATIAL AWARENESS OF SURGICAL HUBS IN OPERATING ROOMS";
U.S. patent application Ser. No. 15/940,670 entitled "COOPERATIVE UTILIZATION OF DATA DERIVED FROM SECONDARY SOURCES BY INTELLIGENT SURGICAL HUBS";
U.S. patent application Ser. No. 15/940,677 entitled "SURGICAL HUB CONTROL ARRANGEMENTS";
U.S. patent application Ser. No. 15/940,632 entitled "DATA STRIPPING METHOD TO INTERROGATE PATIENT RECORDS AND CREATE ANONYMIZED RECORD";
U.S. patent application Ser. No. 15/940,640 entitled "COMMUNICATION HUB AND STORAGE DEVICE FOR STORING PARAMETERS AND STATUS OF A SURGICAL DEVICE TO BE SHARED WITH CLOUD BASED ANALYTICS SYSTEMS";
U.S. patent application Ser. No. 15/940,645 entitled "SELF DESCRIBING DATA PACKETS GENERATED AT AN ISSUING INSTRUMENT";
U.S. patent application Ser. No. 15/940,649 entitled "DATA PAIRING TO INTERCONNECT A DEVICE MEASURED PARAMETER WITH AN OUTCOME";
U.S. patent application Ser. No. 15/940,654 entitled "SURGICAL HUB SITUATIONAL AWARENESS";
U.S. patent application Ser. No. 15/940,663 entitled "SURGICAL SYSTEM DISTRIBUTED PROCESSING";
U.S. patent application Ser. No. 15/940,668 entitled "AGGREGATION AND REPORTING OF SURGICAL HUB DATA";
U.S. patent application Ser. No. 15/940,671 entitled "SURGICAL HUB SPATIAL AWARENESS TO DETERMINE DEVICES IN OPERATING THEATER";
U.S. patent application Ser. No. 15/940,686 entitled "DISPLAY OF ALIGNMENT OF STAPLE CARTRIDGE TO PRIOR LINEAR STAPLE LINE";
U.S. patent application Ser. No. 15/940,700 entitled "STERILE FIELD INTERACTIVE CONTROL DISPLAYS";
U.S. patent application Ser. No. 15/940,629 entitled "COMPUTER IMPLEMENTED INTERACTIVE SURGICAL SYSTEMS";
U.S. patent application Ser. No. 15/940,704 entitled "USE OF LASER LIGHT AND RED-GREEN-BLUE COLORATION TO DETERMINE PROPERTIES OF BACK SCATTERED LIGHT";
U.S. patent application Ser. No. 15/940,722 entitled "CHARACTERIZATION OF TISSUE IRREGULARITIES THROUGH THE USE OF MONO-CHROMATIC LIGHT REFRACTIVITY";
U.S. patent application Ser. No. 15/940,742 entitled "DUAL CMOS ARRAY IMAGING";
U.S. patent application Ser. No. 15/940,636 entitled "ADAPTIVE CONTROL PROGRAM UPDATES FOR SURGICAL DEVICES";
U.S. patent application Ser. No. 15/940,653 entitled "ADAPTIVE CONTROL PROGRAM UPDATES FOR SURGICAL HUBS";
U.S. patent application Ser. No. 15/940,660 entitled "CLOUD-BASED MEDICAL ANALYTICS FOR CUSTOMIZATION AND RECOMMENDATIONS TO A USER";
U.S. patent application Ser. No. 15/940,679 entitled "CLOUD-BASED MEDICAL ANALYTICS FOR LINKING OF LOCAL USAGE TRENDS WITH THE RESOURCE ACQUISITION BEHAVIORS OF LARGER DATA SET";
U.S. patent application Ser. No. 15/940,694 entitled "CLOUD-BASED MEDICAL ANALYTICS FOR MEDICAL FACILITY SEGMENTED INDIVIDUALIZATION OF INSTRUMENT FUNCTION";
U.S. patent application Ser. No. 15/940,634 entitled "CLOUD-BASED MEDICAL ANALYTICS FOR SECURITY AND AUTHENTICATION TRENDS AND REACTIVE MEASURES";
U.S. patent application Ser. No. 15/940,706 entitled "DATA HANDLING AND PRIORITIZATION IN A CLOUD ANALYTICS NETWORK";
U.S. patent application Ser. No. 15/940,675 entitled "CLOUD INTERFACE FOR COUPLED SURGICAL DEVICES";
U.S. patent application Ser. No. 15/940,627 entitled "DRIVE ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS";
U.S. patent application Ser. No. 15/940,637 entitled "COMMUNICATION ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS";
U.S. patent application Ser. No. 15/940,642 entitled "CONTROLS FOR ROBOT-ASSISTED SURGICAL PLATFORMS";
U.S. patent application Ser. No. 15/940,676 entitled "AUTOMATIC TOOL ADJUSTMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS";
U.S. patent application Ser. No. 15/940,680 entitled "CONTROLLERS FOR ROBOT-ASSISTED SURGICAL PLATFORMS";
U.S. patent application Ser. No. 15/940,683 entitled "COOPERATIVE SURGICAL ACTIONS FOR ROBOT-ASSISTED SURGICAL PLATFORMS";
U.S. patent application Ser. No. 15/940,690 entitled "DISPLAY ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS"; and
U.S. patent application Ser. No. 15/940,711 entitled "SENSING ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS".
The applicant of the present patent application owns the following U.S. provisional patent applications, filed on March 28, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:
U.S. provisional patent application Ser. No. 62/649,302 entitled "INTERACTIVE SURGICAL SYSTEMS WITH ENCRYPTED COMMUNICATION CAPABILITIES";
U.S. provisional patent application Ser. No. 62/649,294 entitled "DATA STRIPPING METHOD TO INTERROGATE PATIENT RECORDS AND CREATE ANONYMIZED RECORD";
U.S. provisional patent application Ser. No. 62/649,300 entitled "SURGICAL HUB SITUATIONAL AWARENESS";
U.S. provisional patent application Ser. No. 62/649,309 entitled "SURGICAL HUB SPATIAL AWARENESS TO DETERMINE DEVICES IN OPERATING THEATER";
U.S. provisional patent application Ser. No. 62/649,310 entitled "COMPUTER IMPLEMENTED INTERACTIVE SURGICAL SYSTEMS";
U.S. provisional patent application Ser. No. 62/649,291 entitled "USE OF LASER LIGHT AND RED-GREEN-BLUE COLORATION TO DETERMINE PROPERTIES OF BACK SCATTERED LIGHT";
U.S. provisional patent application Ser. No. 62/649,296 entitled "ADAPTIVE CONTROL PROGRAM UPDATES FOR SURGICAL DEVICES";
U.S. provisional patent application Ser. No. 62/649,333 entitled "CLOUD-BASED MEDICAL ANALYTICS FOR CUSTOMIZATION AND RECOMMENDATIONS TO A USER";
U.S. provisional patent application Ser. No. 62/649,327 entitled "CLOUD-BASED MEDICAL ANALYTICS FOR SECURITY AND AUTHENTICATION TRENDS AND REACTIVE MEASURES";
U.S. provisional patent application Ser. No. 62/649,315 entitled "DATA HANDLING AND PRIORITIZATION IN A CLOUD ANALYTICS NETWORK";
U.S. provisional patent application Ser. No. 62/649,313 entitled "CLOUD INTERFACE FOR COUPLED SURGICAL DEVICES";
U.S. provisional patent application Ser. No. 62/649,320 entitled "DRIVE ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS";
U.S. provisional patent application Ser. No. 62/649,307 entitled "AUTOMATIC TOOL ADJUSTMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS"; and
U.S. provisional patent application Ser. No. 62/649,323 entitled "SENSING ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS".
The applicant of the present patent application owns the following U.S. provisional patent applications, filed on March 8, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:
U.S. provisional patent application Ser. No. 62/640,417 entitled "TEMPERATURE CONTROL IN ULTRASONIC DEVICE AND CONTROL SYSTEM THEREFOR"; and
U.S. provisional patent application Ser. No. 62/640,415 entitled "ESTIMATING STATE OF ULTRASONIC END EFFECTOR AND CONTROL SYSTEM THEREFOR".
The applicant of the present patent application owns the following U.S. provisional patent applications, filed on December 28, 2017, the disclosure of each of which is incorporated herein by reference in its entirety:
U.S. provisional patent application Ser. No. 62/611,341 entitled "INTERACTIVE SURGICAL PLATFORM";
U.S. provisional patent application Ser. No. 62/611,340 entitled "CLOUD-BASED MEDICAL ANALYTICS"; and
U.S. provisional patent application Ser. No. 62/611,339 entitled "ROBOT ASSISTED SURGICAL PLATFORM".
Before explaining various aspects of the surgical devices and generators in detail, it should be noted that the illustrative examples are not limited in application or use to the details of construction and arrangement of parts illustrated in the accompanying drawings and description. The illustrative examples may be implemented alone or in combination with other aspects, variations, and modifications, and may be practiced or carried out in various ways. Furthermore, unless otherwise indicated, the terms and expressions employed herein have been chosen for the purpose of describing the illustrative examples for the convenience of the reader and are not for the purpose of limiting the invention. Moreover, it is to be understood that one or more of the following-described aspects, expressions of aspects, and/or examples may be combined with any one or more of the other following-described aspects, expressions of aspects, and/or examples.
Surgical hub
Referring to fig. 1, a computer-implemented interactive surgical system 100 includes one or more
Fig. 2 shows an example of a
Other types of robotic systems may be readily adapted for use with the
Various examples of cloud-based analytics performed by the cloud 104 and suitable for use with the present disclosure are described in U.S. provisional patent application Ser. No. 62/611,340 entitled "CLOUD-BASED MEDICAL ANALYTICS," filed December 28, 2017, the disclosure of which is incorporated herein by reference in its entirety.
In various aspects, the
The optical components of the
The one or more illumination sources may be configured to radiate electromagnetic energy in the visible spectrum as well as in the invisible spectrum. The visible spectrum (sometimes referred to as the optical spectrum or the luminous spectrum) is that portion of the electromagnetic spectrum that is visible to (i.e., detectable by) the human eye, and may be referred to as visible light, or simply light. A typical human eye will respond to wavelengths in air from about 380 nm to about 750 nm.
The invisible spectrum (i.e., the non-luminous spectrum) is the portion of the electromagnetic spectrum that lies below and above the visible spectrum (i.e., wavelengths below about 380 nm and above about 750 nm). The human eye cannot detect the invisible spectrum. Wavelengths greater than about 750 nm are longer than the red visible spectrum, and they become invisible infrared (IR), microwave, and radio electromagnetic radiation. Wavelengths less than about 380 nm are shorter than the violet spectrum, and they become invisible ultraviolet, x-ray, and gamma-ray electromagnetic radiation.
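The wavelength boundaries described above can be expressed as a short sketch. The function name and the returned labels are illustrative only and are not part of the disclosure; the boundaries follow the approximate 380 nm and 750 nm limits stated in the preceding paragraphs.

```python
def classify_wavelength(nm: float) -> str:
    """Classify electromagnetic radiation by its wavelength in air (nanometers),
    using the approximate visible-spectrum bounds of ~380 nm to ~750 nm."""
    if nm < 380:
        # shorter than the violet spectrum
        return "invisible (ultraviolet, x-ray, or gamma-ray range)"
    if nm <= 750:
        return "visible"
    # longer than the red visible spectrum
    return "invisible (infrared, microwave, or radio range)"
```

For example, a 550 nm source falls in the visible range, while a 1000 nm source falls in the invisible infrared range.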
In various aspects, the
In one aspect, the imaging device employs multispectral monitoring to distinguish topography from underlying structures. A multispectral image is an image that captures image data across a particular range of wavelengths of the electromagnetic spectrum. The wavelengths may be separated by filters or by using instruments that are sensitive to specific wavelengths, including light from frequencies outside the visible range, such as IR and ultraviolet. Spectral imaging may allow extraction of additional information that the human eye fails to capture with its red, green, and blue receptors. The use of multispectral imaging is described in more detail under the heading "Advanced Imaging Acquisition Module" of U.S. provisional patent application Ser. No. 62/611,341, entitled "INTERACTIVE SURGICAL PLATFORM," filed December 28, 2017, the disclosure of which is incorporated herein by reference in its entirety. Multispectral monitoring may be a useful tool for relocating a surgical site after a surgical task is completed, to perform one or more of the previously described tests on the treated tissue.
It is axiomatic that strict sterilization of the operating room and surgical equipment is required during any surgical procedure. The stringent hygiene and sterilization conditions required in a "surgical theater" (i.e., an operating room or treatment room) necessitate the highest possible sterility of all medical devices and equipment. Part of this sterilization process is the need to sterilize anything that comes in contact with the patient or penetrates the sterile field, including the
In various aspects, the
As shown in fig. 2, a
In one aspect,
Referring to fig. 2, a
Referring now to fig. 3,
During surgery, the application of energy to tissue for sealing and/or cutting is typically associated with smoke evacuation, aspiration of excess fluid, and/or irrigation of the tissue. Fluid lines, power lines, and/or data lines from different sources are often entangled during surgery. Untangling the lines during surgery may waste valuable time. Detangling the lines may require disconnecting them from their respective modules, which may require resetting the modules. The hub modular housing 136 provides a unified environment for managing power, data, and fluid lines, which reduces the frequency of entanglement between such lines.
Aspects of the present disclosure provide a surgical hub for use in a surgical procedure involving the application of energy to tissue at a surgical site. The surgical hub includes a hub housing and a combined generator module slidably receivable in a docking station of the hub housing. The docking station includes data contacts and power contacts. The combined generator module includes two or more of an ultrasonic energy generator component, a bipolar RF energy generator component, and a monopolar RF energy generator component seated in a single unit. In one aspect, the combined generator module further comprises at least one energy delivery cable for connecting the combined generator module to a surgical instrument, at least one smoke evacuation component configured to evacuate smoke, fluid, and/or particulates generated by the application of therapeutic energy to the tissue, and a fluid line extending from the remote surgical site to the smoke evacuation component.
In one aspect, the fluid line is a first fluid line, and a second fluid line extends from the remote surgical site to a suction and irrigation module slidably received in the hub housing. In one aspect, the hub housing includes a fluid interface.
Certain surgical procedures may require more than one energy type to be applied to tissue. One energy type may be more advantageous for cutting tissue, while a different energy type may be more advantageous for sealing tissue. For example, a bipolar generator may be used to seal tissue, while an ultrasonic generator may be used to cut the sealed tissue. Aspects of the present disclosure provide a solution in which the hub modular housing 136 is configured to accommodate different generators and facilitate interactive communication therebetween. One of the advantages of the hub modular housing 136 is the ability to quickly remove and/or replace various modules.
Aspects of the present disclosure provide a modular surgical housing for use in a surgical procedure involving the application of energy to tissue. The modular surgical housing includes a first energy generator module configured to generate a first energy for application to tissue, and a first docking station comprising a first docking port that includes first data and power contacts, wherein the first energy generator module is slidably movable into electrical engagement with the first power and data contacts, and wherein the first energy generator module is slidably movable out of electrical engagement with the first power and data contacts.
In addition to the above, the modular surgical housing further comprises a second energy generator module configured to generate a second energy, different from the first energy, for application to tissue, and a second docking station comprising a second docking port that includes second data and power contacts, wherein the second energy generator module is slidably movable into electrical engagement with the second power and data contacts, and wherein the second energy generator module is slidably movable out of electrical engagement with the second power and data contacts.
In addition, the modular surgical housing further includes a communication bus between the first docking port and the second docking port configured to facilitate communication between the first energy generator module and the second energy generator module.
Referring to fig. 3-7, aspects of the present disclosure are presented as a hub modular housing 136 that allows for modular integration of the generator module 140, smoke evacuation module 126, and suction/irrigation module 128. The hub modular housing 136 also facilitates interactive communication between the modules 140, 126, 128. As shown in fig. 5, the generator module 140 may be a generator module with integrated monopolar, bipolar, and ultrasound components supported in a
In one aspect, the hub modular housing 136 includes a modular power and communications backplane 149 having external and wireless communications connections to enable removable attachment of the modules 140, 126, 128 and interactive communications therebetween.
In one aspect, the hub modular housing 136 includes a docking cradle or drawer 151 (also referred to herein as a drawer) configured to slidably receive the modules 140, 126, 128. Fig. 4 illustrates a partial perspective view of the surgical hub housing 136 and the
In various aspects, the smoke evacuation module 126 includes a
In various aspects, the suction/irrigation module 128 is coupled to a surgical tool that includes an irrigation fluid line and a suction fluid line. In one example, the irrigation fluid line and the suction fluid line are in the form of flexible tubes extending from the surgical site toward the suction/irrigation module 128. One or more drive systems may be configured to irrigate fluid to, and aspirate fluid from, the surgical site.
In one aspect, a surgical tool includes a shaft having an end effector at a distal end thereof, at least one energy treatment device associated with the end effector, an aspiration tube, and an irrigation tube. The aspiration tube may have an inlet at a distal end thereof, and the aspiration tube extends through the shaft. Similarly, the irrigation tube may extend through the shaft and may have an inlet adjacent the energy delivery tool. The energy delivery tool is configured to deliver ultrasonic and/or RF energy to the surgical site and is coupled to the generator module 140 by a cable that initially extends through the shaft.
The irrigation tube may be in fluid communication with a fluid source, and the aspiration tube may be in fluid communication with a vacuum source. The fluid source and/or vacuum source may be seated in the suction/irrigation module 128. In one example, the fluid source and/or vacuum source may be seated in the hub housing 136 independently of the suction/irrigation module 128. In such examples, the fluid interface can connect the suction/irrigation module 128 to a fluid source and/or a vacuum source.
In one aspect, the modules 140, 126, 128 on the hub modular housing 136 and/or their corresponding docking stations may include alignment features configured to enable alignment of the docking ports of the modules into engagement with their corresponding ports in the docking stations of the hub modular housing 136. For example, as shown in fig. 4, the combined
In some aspects, the drawers 151 of the hub modular housing 136 are the same or substantially the same size, and the modules are sized to be received in the drawers 151. For example, the
In addition, the contacts of a particular module may be keyed to engage the contacts of a particular drawer to avoid inserting the module into a drawer having unmatched contacts.
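The keying described above amounts to a match between a module's contact pattern and a drawer's contact pattern. A minimal sketch of that check follows; the module names, drawer numbers, and key identifiers are invented for illustration and do not appear in the disclosure.

```python
# Hypothetical key patterns: a module may only be inserted into a drawer
# whose contact keying matches its own, preventing mismatched connections.
MODULE_KEYS = {
    "generator": "key-A",
    "smoke_evacuation": "key-B",
    "suction_irrigation": "key-C",
}
DRAWER_KEYS = {1: "key-A", 2: "key-B", 3: "key-C"}

def can_dock(module: str, drawer: int) -> bool:
    """Return True only when the module's key matches the drawer's key."""
    return MODULE_KEYS.get(module) == DRAWER_KEYS.get(drawer)
```

Under this sketch, the generator module seats only in drawer 1, and attempting to seat it in drawer 2 is rejected.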
As shown in fig. 4, the docking port 150 of one drawer 151 may be coupled to the docking port 150 of another drawer 151 by a communication link 157 to facilitate interactive communication between modules seated in the hub modular housing 136. Alternatively or additionally, the docking port 150 of the hub modular housing 136 can facilitate wireless interactive communication between modules seated in the hub modular housing 136. Any suitable wireless communication may be employed, such as, for example, Air Titan-Bluetooth.
Fig. 6 illustrates a single power bus attachment for multiple lateral docking ports of a lateral
Fig. 7 illustrates a vertical
In various aspects, the imaging module 138 includes an integrated video processor and a modular light source, and is adapted for use with a variety of imaging devices. In one aspect, the imaging device is comprised of a modular housing that can be assembled with a light source module and a camera module. The housing may be a disposable housing. In at least one example, the disposable housing is removably coupled to a reusable controller, a light source module, and a camera module. The light source module and/or the camera module may be selected according to the type of surgical procedure. In one aspect, the camera module includes a CCD sensor. In another aspect, the camera module includes a CMOS sensor. In another aspect, the camera module is configured for scanning beam imaging. Likewise, the light source module may be configured to be capable of delivering white light or different light, depending on the surgical procedure.
During a surgical procedure, it may be inefficient to remove a surgical device from a surgical site and replace the surgical device with another surgical device that includes a different camera or a different light source. Temporary loss of vision at the surgical site can lead to undesirable consequences. The modular imaging apparatus of the present disclosure is configured to enable the replacement of a light source module or a camera module during a surgical procedure without having to remove the imaging apparatus from the surgical site.
In one aspect, an imaging device includes a tubular housing including a plurality of channels. The first channel is configured to slidably receive a camera module that may be configured for snap-fit engagement with the first channel. The second channel is configured to slidably receive a light source module that may be configured for snap-fit engagement with the second channel. In another example, the camera module and/or the light source module may be rotated within their respective channels to a final position. Threaded engagement may be used instead of snap-fit engagement.
In various examples, multiple imaging devices are placed at different locations in a surgical field to provide multiple views. The imaging module 138 may be configured to be able to switch between imaging devices to provide an optimal view. In various aspects, the imaging module 138 may be configured to be able to integrate images from different imaging devices.
Various image processors and imaging devices suitable for use in the present disclosure are described in U.S. patent No. 7,995,045 entitled "COMBINED SBI AND CONVENTIONAL IMAGE PROCESSOR," issued August 9, 2011, which is incorporated by reference herein in its entirety. In addition, U.S. patent No. 7,982,776 entitled "SBI MOTION ARTIFACT REMOVAL APPARATUS AND METHOD," issued July 19, 2011, which is incorporated herein by reference in its entirety, describes various systems for removing motion artifacts from image data; such systems may be integrated with the imaging module 138. Furthermore, U.S. patent application publication No. 2011/0306840 entitled "CONTROLLABLE MAGNETIC SOURCE TO FIXTURE INTRACORPOREAL APPARATUS," published December 15, 2011, and U.S. patent application publication No. 2014/0243597 entitled "SYSTEM FOR PERFORMING A MINIMALLY INVASIVE SURGICAL PROCEDURE," published August 28, 2014, are each incorporated herein by reference in its entirety.
Fig. 8 illustrates a
Modular devices 1a-1n located in an operating room may be coupled to a
It should be understood that
In one aspect, the
By applying cloud computer data processing techniques to the data collected by the devices 1a-1n/2a-2m, the surgical data network can provide improved surgical outcomes, reduced costs, and improved patient satisfaction. At least some of the devices 1a-1n/2a-2m may be employed to observe tissue conditions to assess leakage or perfusion of sealed tissue after a tissue sealing and cutting procedure. At least some of the devices 1a-1n/2a-2m may be employed to identify pathologies, such as the effects of disease, using cloud-based computing to examine data, including images of samples of body tissue, for diagnostic purposes. This includes localization and margin confirmation of tissue and phenotypes. At least some of the devices 1a-1n/2a-2m may be employed to identify anatomical structures of the body using various sensors integrated with imaging devices and techniques, such as overlaying images captured by multiple imaging devices. The data (including image data) collected by the devices 1a-1n/2a-2m may be transmitted to the
In one implementation, the operating room devices 1a-1n may be connected to the
In another implementation, the operating room devices 2a-2m may be connected to the
In one example,
In other examples, the operating room devices 1a-1n/2a-2m may communicate with the
The
Fig. 9 illustrates a computer-implemented interactive surgical system 200. The computer-implemented interactive surgical system 200 is similar in many respects to the computer-implemented interactive surgical system 100. For example, the computer-implemented interactive surgical system 200 includes one or more surgical systems 202 that are similar in many respects to the
Fig. 10 shows the
The
In one aspect,
The system memory includes volatile memory and non-volatile memory. The basic input/output system (BIOS), containing the basic routines to transfer information between elements within the computer system, such as during start-up, is stored in non-volatile memory. For example, the non-volatile memory can include ROM, programmable ROM (PROM), electrically programmable ROM (EPROM), EEPROM, or flash memory. Volatile memory includes random-access memory (RAM), which acts as external cache memory. Moreover, RAM is available in many forms, such as SRAM, dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and direct Rambus RAM (DRRAM).
The
It is to be appreciated that the
A user enters commands or information into the
The
In various aspects, the
A communication connection refers to the hardware/software used to interface the network to the bus. While a communication connection is shown for exemplary clarity within the computer system, it can also be external to
Fig. 11 illustrates a functional block diagram of one aspect of a USB hub 300 device in accordance with at least one aspect of the present disclosure. In the illustrated aspect, the USB hub device 300 employs a Texas Instruments TUSB2036 integrated circuit hub. The USB hub 300 is a CMOS device that provides an upstream USB transceiver port 302 and up to three downstream USB transceiver ports 304, 306, 308 according to the USB 2.0 specification. The upstream USB transceiver port 302 is a differential root data port that includes a differential data positive (DP0) input paired with a differential data negative (DM0) input. The three downstream USB transceiver ports 304, 306, 308 are differential data ports, where each port includes a differential data positive (DP1-DP3) output paired with a differential data negative (DM1-DM3) output.
The USB hub 300 device is implemented with a digital state machine rather than a microcontroller and does not require firmware programming. Fully compatible USB transceivers are integrated into the circuitry for the upstream USB transceiver port 302 and all downstream USB transceiver ports 304, 306, 308. The downstream USB transceiver ports 304, 306, 308 support both full-speed devices and low-speed devices by automatically setting the slew rate according to the speed of the device attached to the port. The USB hub 300 device may be configured to be capable of being in a bus-powered mode or a self-powered mode and includes hub power logic 312 for managing power.
The USB hub 300 device includes a serial interface engine (SIE) 310. The SIE 310 is the front end of the USB hub 300 hardware and handles most of the protocols described in
In various aspects, the USB hub 300 may connect 127 functions, configured in up to six logical layers (tiers), to a single computer. Further, the USB hub 300 may be connected to all external devices using a standardized four-wire cable that provides both communication and power distribution. The power configurations are a bus-powered mode and a self-powered mode. The USB hub 300 may be configured to support four power management modes: a bus-powered hub with either individual-port power management or ganged-port power management, and a self-powered hub with either individual-port power management or ganged-port power management. In one aspect, using a USB cable, the upstream USB transceiver port 302 of the USB hub 300 is plugged into a USB host controller, and the downstream USB transceiver ports 304, 306, 308 are exposed for connecting USB-compatible devices, and so forth.
Additional details regarding the structure and function of the surgical hub and/or surgical hub network can be found in U.S. provisional patent application No. 62/659,900 entitled "METHOD OF HUB COMMUNICATION," filed April 19, 2018, which is hereby incorporated by reference in its entirety.
Cloud system hardware and functional module
Fig. 12 is a block diagram of a computer-implemented interactive surgical system in accordance with at least one aspect of the present disclosure. In one aspect, a computer-implemented interactive surgical system is configured to monitor and analyze data related to the operation of various surgical systems, including surgical hubs, surgical instruments, robotic devices, and operating rooms or medical facilities. A computer-implemented interactive surgical system includes a cloud-based analysis system. While the cloud-based analysis system is described as a surgical system, it is not necessarily so limited and may generally be a cloud-based medical system. As shown in fig. 12, the cloud-based analysis system includes a plurality of surgical instruments 7012 (which may be the same as or similar to instrument 112), a plurality of surgical hubs 7006 (which may be the same as or similar to hub 106), and a surgical data network 7001 (which may be the same as or similar to network 201) to couple the
In addition, the surgical instrument 7012 can include a transceiver for transmitting data to and from its corresponding surgical hub 7006 (which can also include a transceiver). The combination of the surgical instrument 7012 and the
Based on the connections to the various
The particular cloud computing system configurations described in this disclosure are specifically designed to address various issues arising in the context of medical procedures and procedures performed using medical devices (such as the surgical instruments 7012, 112). In particular, the surgical instrument 7012 can be a digital surgical device configured to interact with the cloud 7004 for implementing techniques that improve performance of a surgical procedure. Various surgical instruments 7012 and/or the
Fig. 13 is a block diagram illustrating a functional architecture of a computer-implemented interactive surgical system in accordance with at least one aspect of the present disclosure. The cloud-based analysis system includes a plurality of
For example, the data collection and
The patient
The control
The cloud-based analytics system may include security features implemented by the cloud 7004. These security features may be managed by the authorization and
Further, for security purposes, the cloud may maintain a database of
The surgical instrument 7012 may use the wireless transceiver to transmit a wireless signal, which may represent, for example, authorization credentials for accessing the corresponding
The cloud-based analysis system may allow monitoring of multiple medical facilities (e.g., medical facilities such as hospitals) to determine improved practices and suggest changes accordingly (e.g., via suggestion module 2030). Thus, the processor 7008 of the cloud 7004 may analyze data associated with each medical facility to identify the facility and aggregate the data with other data associated with other medical facilities in the group. For example, groups may be defined based on similar operational practices or geographic locations. In this way, the cloud 7004 can provide analysis and recommendations across a group of medical facilities. Cloud-based analytics systems may also be used to enhance situational awareness. For example, the processor 7008 may predictively model the impact of the recommendations on the cost and effectiveness of a particular facility (relative to the overall operation and/or various medical procedures). The cost and effectiveness associated with that particular facility may also be compared to corresponding local areas of other facilities or any other comparable facility.
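The cross-facility aggregation described above can be sketched as a simple grouping computation: facilities are bucketed by a shared attribute (e.g., geographic location or operational practice) and a metric is averaged per group. The record fields, group keys, and the "outcome score" metric below are illustrative assumptions, not elements of the disclosure.

```python
from collections import defaultdict

def aggregate_by_group(facilities):
    """Group facility records by (region, practice profile) and return a
    simple per-group average outcome score, as a sketch of how a cloud
    analytics system might aggregate data across comparable facilities."""
    groups = defaultdict(list)
    for facility in facilities:
        key = (facility["region"], facility["practice_profile"])
        groups[key].append(facility["outcome_score"])
    # average each group's scores so groups can be compared and benchmarked
    return {key: sum(scores) / len(scores) for key, scores in groups.items()}
```

A facility's own score could then be compared against its group's average to surface recommendations for improved practices.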
Data classification and
Additional details regarding the cloud analysis system can be found in U.S. provisional patent application No. 62/659,900 entitled "METHOD OF HUB COMMUNICATION," filed April 19, 2018, which is hereby incorporated by reference in its entirety.
Situation awareness
While a "smart" device that includes a control algorithm responsive to sensed data can be an improvement over a "dumb" device that operates without regard to sensed data, some sensed data can be incomplete or inconclusive when considered in isolation, i.e., without the context of the type of surgical procedure being performed or the type of tissue being operated on. Without knowing the surgical context (e.g., the type of tissue being operated on or the type of procedure being performed), the control algorithm may control the modular device incorrectly or suboptimally given the particular context-free sensed data. For example, the optimal manner in which a control algorithm controls a surgical instrument in response to a particular sensed parameter can vary according to the particular tissue type being operated on. This is because different tissue types have different properties (e.g., resistance to tearing) and thus respond differently to actions taken by surgical instruments. Therefore, it may be desirable for a surgical instrument to take different actions even when the same measurement for a particular parameter is sensed. As one specific example, the optimal manner in which a surgical stapling and severing instrument is controlled in response to the instrument sensing an unexpectedly high force to close its end effector will vary according to whether the tissue type is susceptible or resistant to tearing. For tissues that are susceptible to tearing, such as lung tissue, the instrument's control algorithm would optimally ramp down the motor speed in response to an unexpectedly high force to close, in order to avoid tearing the tissue. For tissues that are resistant to tearing, such as stomach tissue, the instrument's control algorithm would optimally ramp up the motor speed in response to an unexpectedly high force to close, in order to ensure that the end effector is clamped properly on the tissue. Without knowing whether lung or stomach tissue has been clamped, the control algorithm may make a suboptimal decision.
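The tissue-dependent response described above can be sketched as a small decision rule. This is a minimal illustration of the principle, assuming a single force threshold and fixed 20% ramp steps; the function name, tissue labels, and step sizes are invented for illustration and do not reflect actual device parameters.

```python
def adjust_motor_speed(current_speed, closure_force, force_threshold, tissue_type):
    """On an unexpectedly high closure force, ramp the firing motor down for
    tear-prone tissue (e.g., lung) and up for tear-resistant tissue (e.g.,
    stomach); otherwise leave the speed unchanged. Thresholds and the 20%
    step size are illustrative assumptions."""
    if closure_force <= force_threshold:
        return current_speed  # force within expectations: no adjustment
    if tissue_type == "tear_prone":  # e.g., lung tissue
        return current_speed * 0.8   # ramp down to avoid tearing the tissue
    return current_speed * 1.2       # e.g., stomach tissue: ramp up to ensure proper clamping
```

The same sensed measurement (an unexpectedly high closure force) thus produces opposite adjustments depending on the tissue context, which is precisely the information a context-free control algorithm lacks.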
One solution utilizes a surgical hub that includes a system configured to derive information about the surgical procedure being performed based on data received from various data sources, and then control the paired modular devices accordingly. In other words, the surgical hub is configured to infer information about the surgical procedure from the received data and then control the modular devices paired with the surgical hub based on the inferred context of the surgical procedure. Fig. 14 illustrates a diagram of a situation-aware surgical system 5100 in accordance with at least one aspect of the present disclosure. In some examples, the data source 5126 includes, for example, a modular device 5102 (which may include sensors configured to be able to detect parameters associated with the patient and/or the modular device itself), a database 5122 (e.g., an EMR database containing patient records), and a patient monitoring device 5124 (e.g., a Blood Pressure (BP) monitor and an Electrocardiogram (EKG) monitor).
The surgical hub 5104, which may be similar in many respects to the
The situational awareness system of the surgical hub 5104 may be configured to derive contextual information from data received from the data source 5126 in a number of different ways. In one example, the situational awareness system includes a pattern recognition system or machine learning system (e.g., an artificial neural network) that has been trained on training data to associate various inputs (e.g., data from the database 5122, the patient monitoring device 5124, and/or the modular device 5102) with corresponding contextual information about the surgical procedure. In other words, the machine learning system may be trained to accurately derive contextual information about the surgical procedure from the provided inputs. In another example, the situational awareness system may include a look-up table that stores pre-characterized contextual information about the surgical procedure in association with one or more inputs (or input ranges) corresponding to the contextual information. In response to a query with one or more inputs, the lookup table may return corresponding context information that the situational awareness system uses to control the modular device 5102. In one example, the contextual information received by the situational awareness system of the surgical hub 5104 is associated with a particular control adjustment or set of control adjustments for one or more modular devices 5102. In another example, the situational awareness system includes additional machine learning systems, look-up tables, or other such systems that generate or retrieve one or more control adjustments for one or more of the modular devices 5102 when providing contextual information as input.
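The look-up-table form of the situational awareness system can be sketched as follows. The input keys, contextual values, and control adjustments are illustrative assumptions, not values from this disclosure.

```python
# Illustrative look-up-table form of a situational awareness system.
# Inputs (or input combinations) map to pre-characterized contextual
# information, which in turn maps to control adjustments for modular devices.
CONTEXT_TABLE = {
    ("insufflation_on", "chest_access"): "chest procedure",
    ("insufflation_on", "abdominal_access"): "abdominal procedure",
}

CONTROL_ADJUSTMENTS = {
    "chest procedure": {"tissue_type": "lung", "closure_ramp": "down"},
    "abdominal procedure": {"tissue_type": "stomach", "closure_ramp": "up"},
}

def infer_context(*inputs):
    """Query the look-up table with one or more inputs."""
    return CONTEXT_TABLE.get(tuple(inputs))

def adjustments_for(context):
    """Retrieve control adjustments for the inferred context, if any."""
    return CONTROL_ADJUSTMENTS.get(context, {})
```

A trained machine learning system would replace the first table with a learned mapping, but the overall flow (inputs → contextual information → control adjustments) is the same.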
The surgical hub 5104 incorporating the situational awareness system provides a number of benefits to the surgical system 5100. One benefit includes improved interpretation of sensed and collected data, which in turn improves the processing accuracy and/or use of the data during the surgical procedure. Returning to the previous example, the situation-aware surgical hub 5104 may determine the type of tissue being operated on; thus, when an unexpectedly high force is detected for closing the end effector of the surgical instrument, the situation-aware surgical hub 5104 can ramp the surgical instrument's motor speed up or down as appropriate for the tissue type.
As another example, the type of tissue being operated on may affect the adjustment of the compressibility and loading thresholds of the surgical stapling and severing instrument for a particular tissue gap measurement. The situational aware surgical hub 5104 can infer whether the surgical procedure being performed is a chest procedure or an abdominal procedure, allowing the surgical hub 5104 to determine whether the tissue held by the end effector of the surgical stapling and severing instrument is lung tissue (for chest procedures) or stomach tissue (for abdominal procedures). The surgical hub 5104 can then appropriately adjust the compression rate and load thresholds of the surgical stapling and severing instrument for the type of tissue.
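The procedure-to-parameter selection above can be sketched as a small table keyed by the inferred procedure type. The numeric thresholds are placeholders, not values from this disclosure.

```python
# Hypothetical stapler parameter table keyed by inferred procedure type.
# The compression rates and load thresholds are illustrative placeholders.
STAPLER_PARAMS = {
    "chest":     {"compression_rate_mm_s": 0.5, "load_threshold_n": 40.0},
    "abdominal": {"compression_rate_mm_s": 1.0, "load_threshold_n": 80.0},
}

def stapler_settings(procedure_type):
    """Select compression rate and load threshold for the inferred procedure."""
    return STAPLER_PARAMS[procedure_type]
```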
As yet another example, the type of body cavity that is manipulated during an insufflation procedure may affect the function of the smoke extractor. The situational awareness surgical hub 5104 may determine whether the surgical site is under pressure (by determining that the surgical procedure is utilizing insufflation) and determine the type of procedure. Since one type of procedure is typically performed within a particular body cavity, the surgical hub 5104 can then appropriately control the motor speed of the smoke extractor for the body cavity in which it is operating. Thus, the situational awareness surgical hub 5104 may provide consistent smoke output for both chest and abdominal surgery.
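The smoke extractor behavior described above might be sketched as follows; the motor speeds and cavity names are assumed for illustration.

```python
def smoke_extractor_speed(uses_insufflation, body_cavity):
    """Pick a smoke extractor motor speed (RPM, illustrative values).

    If the procedure does not use insufflation, the site is not pressurized
    and the extractor can idle; otherwise the speed is chosen per cavity,
    since a given procedure type is typically performed in one cavity.
    """
    if not uses_insufflation:
        return 0
    # The smaller chest cavity is assumed to need less flow than the
    # insufflated abdomen to yield consistent smoke output.
    return {"chest": 1200, "abdominal": 1800}.get(body_cavity, 1500)
```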
As yet another example, the type of procedure being performed may affect the optimal energy level at which an ultrasonic surgical instrument or a Radio Frequency (RF) electrosurgical instrument operates. For example, arthroscopic surgery requires higher energy levels because the end effector of an ultrasonic surgical instrument or RF electrosurgical instrument is immersed in fluid. The situation aware surgical hub 5104 can determine whether the surgical procedure is an arthroscopic procedure. The surgical hub 5104 may then adjust the RF power level or ultrasound amplitude (i.e., the "energy level") of the generator to compensate for the fluid-filled environment. Relatedly, the type of tissue being operated on may affect the optimal energy level at which the ultrasonic surgical instrument or RF electrosurgical instrument operates. The situational awareness surgical hub 5104 can determine the type of surgical procedure being performed and then customize the energy level of the ultrasonic surgical instrument or RF electrosurgical instrument, respectively, according to the expected tissue profile of the surgical procedure. Further, the situation-aware surgical hub 5104 may be configured to enable adjustment of the energy level of the ultrasonic surgical instrument or RF electrosurgical instrument throughout the surgical procedure, rather than only on a procedure-by-procedure basis. The situation aware surgical hub 5104 can determine the steps of the surgical procedure being performed or to be performed subsequently and then update the control algorithm for the generator and/or ultrasonic surgical instrument or RF electrosurgical instrument to set the energy level at a value appropriate for the desired tissue type according to the surgical procedure.
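The step-by-step energy-level selection above can be sketched as a schedule keyed by procedure step, with a correction for a fluid-filled (arthroscopic) environment. The step names, tissue profiles, power levels, and the 1.5× fluid compensation factor are all assumptions for illustration.

```python
# Illustrative per-step energy schedule; names and levels are assumed.
ENERGY_BY_STEP = {
    "dissection":  {"tissue": "vessel", "rf_power_w": 30},
    "transection": {"tissue": "parenchyma", "rf_power_w": 45},
}

def generator_energy(step, arthroscopic):
    """Select an RF power level for the current (or next) procedure step."""
    level = ENERGY_BY_STEP[step]["rf_power_w"]
    if arthroscopic:
        # Assumed compensation for the end effector being immersed in fluid.
        level = int(level * 1.5)
    return level
```

Because the schedule is keyed by step, the hub can update the generator as the procedure advances rather than only once per procedure.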
As yet another example, data may be extracted from additional data sources 5126 to improve the conclusion that the surgical hub 5104 extracts from one data source 5126. The situation aware surgical hub 5104 can augment the data it receives from the modular device 5102 with contextual information about the surgical procedure that has been built from other data sources 5126. For example, the situational awareness surgical hub 5104 may be configured to determine whether hemostasis has occurred (i.e., whether bleeding at the surgical site has stopped) based on video or image data received from the medical imaging device. However, in some cases, the video or image data may be uncertain. Thus, in one example, the surgical hub 5104 may also be configured to compare physiological measurements (e.g., blood pressure sensed by a BP monitor communicatively connected to the surgical hub 5104) with hemostatic visual or image data (e.g., from the medical imaging device 124 (fig. 2) communicatively coupled to the surgical hub 5104) to determine the integrity of the suture or tissue weld. In other words, the situational awareness system of the surgical hub 5104 may take into account the physiological measurement data to provide additional context when analyzing the visualization data. Additional context may be useful when the visualization data itself may be ambiguous or incomplete.
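The hemostasis cross-check described above can be sketched as follows. The 15 mmHg significance margin and the three-way verdict labels are assumptions; the point is only that a second data stream resolves ambiguity in the first.

```python
def hemostasis_assessment(image_bleeding, systolic_bp, baseline_bp):
    """Combine visual and physiological evidence of bleeding.

    image_bleeding: True, False, or None when the image data is ambiguous.
    A drop of more than 15 mmHg from baseline (an assumed margin) is
    treated as physiological evidence of ongoing bleeding.
    """
    bp_dropping = systolic_bp < baseline_bp - 15
    if image_bleeding is None:
        # Ambiguous visualization: fall back on the blood-pressure trend.
        return "suspect bleeding" if bp_dropping else "hemostasis likely"
    if image_bleeding and bp_dropping:
        return "bleeding confirmed"
    if image_bleeding:
        return "suspect bleeding"
    return "hemostasis likely"
```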
Another benefit includes actively and automatically controlling the paired modular devices 5102 according to the particular step of the surgical procedure being performed to reduce the number of times medical personnel need to interact with or control the surgical system 5100 during the surgical procedure. For example, if the situation-aware surgical hub 5104 determines that a subsequent step of the procedure requires the use of an RF electrosurgical instrument, it may actively activate a generator connected to the instrument. Actively activating the energy source allows the instrument to be ready for use as soon as the previous step of the procedure is completed.
As another example, the situation aware surgical hub 5104 may determine whether a different view or degree of magnification on the display is required for the current or subsequent step of the surgical procedure based on features that the surgeon expects to need to view at the surgical site. The surgical hub 5104 may then actively change the displayed view accordingly (e.g., provided by the medical imaging device for the visualization system 108), such that the display is automatically adjusted throughout the surgical procedure.
As yet another example, the situation-aware surgical hub 5104 can determine which step of the surgical procedure is being performed or will be performed next and whether particular data, or comparisons between data, are required for that step of the surgical procedure. The surgical hub 5104 may be configured to automatically invoke a data screen based on the step of the surgical procedure being performed, without waiting for the surgeon to request this particular information.
Another benefit includes checking for errors during setup of the surgical procedure or during the course of the surgical procedure. For example, the situation-aware surgical hub 5104 may determine whether the operating room is properly or optimally set up for the surgical procedure to be performed. The surgical hub 5104 may be configured to determine the type of surgical procedure being performed, retrieve (e.g., from memory) a corresponding manifest, product location, or setup requirements, and then compare the current operating room layout to the standard layout determined by the surgical hub 5104 for the type of surgical procedure being performed. In one example, the surgical hub 5104 may be configured to compare, for example, a list of items for the procedure scanned by a suitable scanner, and/or a list of devices paired with the surgical hub 5104, with a recommended or expected list of items and/or devices for the given surgical procedure. If there is any discrepancy between the lists, the surgical hub 5104 may be configured to provide an alert indicating that a particular modular device 5102, patient monitoring device 5124, and/or other surgical item is missing. In one example, the surgical hub 5104 may be configured to determine the relative distance or location of the modular device 5102 and the patient monitoring device 5124, e.g., via proximity sensors. The surgical hub 5104 can compare the relative positions of the devices to a recommended or expected layout for the particular surgical procedure. If there is any discrepancy between the layouts, the surgical hub 5104 may be configured to provide an alert indicating that the current layout for the surgical procedure deviates from the recommended layout.
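The list-comparison check described above reduces to set differences between what is present (scanned items, paired devices) and what the procedure expects. This sketch uses assumed item names for illustration.

```python
def setup_discrepancies(scanned_items, paired_devices,
                        expected_items, expected_devices):
    """Return the items and devices expected for the procedure but absent.

    Any non-empty result would drive an alert indicating a missing
    modular device, patient monitoring device, or other surgical item.
    """
    missing_items = sorted(set(expected_items) - set(scanned_items))
    missing_devices = sorted(set(expected_devices) - set(paired_devices))
    return missing_items, missing_devices
```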
As another example, the situational awareness surgical hub 5104 can determine whether the surgeon (or other medical personnel) is making mistakes or otherwise deviating from the expected course of action during the surgical procedure. For example, the surgical hub 5104 may be configured to determine the type of surgical procedure being performed, retrieve (e.g., from memory) a corresponding list of steps or order of device usage, and then compare the steps being performed or the devices being used during the surgical procedure to the expected steps or devices determined by the surgical hub 5104 for the type of surgical procedure being performed. In one example, the surgical hub 5104 may be configured to provide an alert indicating that an unexpected action is being performed or an unexpected device is being used at a particular step in the surgical procedure.
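The deviation check above can be sketched as a comparison between the retrieved step list and the observed device usage; the step structure and device names are assumed for illustration.

```python
def check_step(procedure_steps, step_index, observed_device):
    """Return an alert string if an unexpected device is in use at this step.

    procedure_steps is the retrieved list of expected steps, each mapping
    to the device expected at that step; None means no deviation detected.
    """
    expected = procedure_steps[step_index]["device"]
    if observed_device != expected:
        return ("alert: expected %s at step %d, but %s is in use"
                % (expected, step_index, observed_device))
    return None
```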
In general, the situational awareness system for the surgical hub 5104 improves surgical results by adjusting the surgical instruments (and other modular devices 5102) for the particular context of each surgical procedure, such as for different tissue types, and verifying actions during the surgical procedure. The situational awareness system also improves the surgeon's efficiency in performing the surgical procedure by automatically suggesting next steps, providing data, and adjusting the display and other modular devices 5102 in the operating room, depending on the particular context of the procedure.
Referring now to fig. 15, a
The situation aware
As a
In a
In a
Fourth, the medical staff opens the
In a
In a seventh step 5214, the patient's lung being operated on is collapsed (while ventilation is switched to the contralateral lung). For example, the
In an
In a ninth step 5218, the surgical team begins the dissection step of the procedure. The
In a
In a
A
Finally, a
Situational awareness is further described in U.S. provisional patent application serial No. 62/659,900, entitled "METHOD OF HUB COMMUNICATION," filed April 19, 2018, which is incorporated herein by reference in its entirety. In certain instances, operation of the robotic surgical system (including, for example, the various robotic surgical systems disclosed herein) may be controlled by the
Automatic correlation/integration of data streams
In various aspects, a computer system (e.g., a cloud analysis system, a surgical hub, and/or a data warehouse system) may be programmed to perform automatic correlation of multiple data streams and/or integration of multiple data streams into a single interface having one or more focal regions of otherwise accessible or overlayable data. In one aspect, real-time interpretation information may be displayed to a user on a device, where the interpretation is based on data from at least one function of the device and data from a second, different source.
In one aspect, the interpretation information displayed to the user may be based on at least one function of the device together with data that does not originate from a data source within the device. In another aspect, the at least one additional source may include a measurement device capable of determining parameters related to patient and device functions, and the instrument may automatically update aspects of the ancillary information on the display of the device in real time. In one aspect, real-time updating is accomplished by the surgical instrument repeatedly recalculating new data in a user-defined selection form.
Referring to fig. 16, a modular device in the form of a handheld
Referring to the upper left corner of fig. 16, the field of
In the example of fig. 16, the
Referring to fig. 17, a schematic view of a
As shown in the lower right hand corner of fig. 16, the
The
In another example, the interpretation information associated with the blood pressure of a blood vessel in the tissue being grasped may be displayed simultaneously with the firing speed and/or the waiting time parameters of the firing stroke, for example. In various examples, blood pressure may be monitored separately, and blood pressure data or information interpreted from blood pressure data may be transmitted to the
In another example, the at least one function may be grasping tissue. Tissue compression or pressure may be a parameter of tissue grasping, which may be displayed on the
In at least one example, the interpretation information is based on information that does not originate from at least one data source within the
This data interpretation may occur locally at the
Routing data and/or information between the
In various examples, the data source may be a sensor on another modular device. In various examples, the data source is an imaging device. In at least one example, the data source can be one or more components of the visualization system 108 (fig. 1). Various components of the
In various examples, the imaging device can be configured to record and/or process imaging data related to functions performed by the
In various examples, the interpretation information is displayed by the
Referring to fig. 17 and 18, the
Referring to fig. 17 and 19, the
Automated image interpretation
In various aspects, a surgical hub (or surgical instrument or other system) may be configured to reduce captured images to a representation, such as a transection result. In one aspect, the surgical hub (e.g., 106, 206) or its imaging module (e.g., 138, 238) can include algorithms to decompose the pixels of an image (e.g., an image captured by an endoscope) and perform calculations on color differences between tissue and the end effector and/or on leaks (e.g., bubbles, dyes, or blood) to determine the presence, amount, and location of any leaks (L) or end effectors, as shown in the lower left corner of fig. 16. For example, the algorithm may be programmed to compare pixel and sub-pixel weights from the resulting image against expected pixel values. The detected leak in the previously treated tissue may then be presented to a user of the
In one aspect, the surgical hub (e.g., 106, 206) or an imaging module thereof (e.g., 138, 238) may include a classification algorithm for performing digital image processing on an image (e.g., an image captured by an endoscope) to identify (classify) end effectors, bleeding, air bubbles, and other events as distinct from other tissue classes within the image. For example, the algorithm may be programmed to perform comparative pixelation, where the image is reduced to a constrained grid pattern and each element of the grid is reduced to one of, for example, 256 pixel color values. A first scan may remove from the analysis all pixels whose color does not belong to the category sought (e.g., bleeding). Potential hemorrhage areas are then compared to adjacent areas or to previous frames in order to identify flowing blood within the image.
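The comparative-pixelation scan described above can be sketched on a toy grid of color names. Real images would use quantized color values rather than strings; the colors and grid here are assumptions for illustration.

```python
def candidate_bleed_cells(grid, sought=frozenset({"red", "dark_red"})):
    """First scan: keep only grid cells whose color is in the sought category."""
    return {(r, c) for r, row in enumerate(grid)
            for c, color in enumerate(row) if color in sought}

def flowing_blood(prev_grid, cur_grid):
    """Cells newly turned blood-colored between frames suggest flowing blood."""
    return candidate_bleed_cells(cur_grid) - candidate_bleed_cells(prev_grid)
```

The frame-to-frame difference stands in for the "compared to previous frames" step: a static red region (e.g., already-treated tissue) drops out, while spreading red regions survive.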
In one aspect, the surgical hub (e.g., 106, 206) or an imaging module thereof (e.g., 138, 238) may include algorithms to perform feature extraction image processing to reduce the image from almost infinite variation in aspects to a region formed by reducing the number of random variables to a set of similar variables. For example, a user may select a type of tissue or a feature of an anatomical structure, and the imaging system may reduce the characteristic variation within the image to a uniform average aspect of the selected feature. This would allow the user to find limits on, for example, tissue planes, boundaries of different organs, or tissue surfaces damaged by infection or cancer.
In one aspect, the surgical hub (e.g., 106, 206) or imaging module (e.g., 138, 238) thereof may include a pattern recognition algorithm to identify the target feature. A variety of such techniques are disclosed in "Artificial Intelligence-Assisted Polyp Detection for Colonoscopy: Initial Experience," Misawa, Masashi et al., Gastroenterology, Vol. 154, No. 8, pp. 2027-2029.e3, which is hereby incorporated by reference in its entirety and is accessible at www.gastrojournal.org/article/S0016-5085(18)30415-3/pdf.
In various examples, the
Image manipulation
In various aspects, the algorithm can be programmed to manipulate one image feed to fit another feed, allowing visualization of a static image on a dynamic image. In one aspect, the algorithm may use landmarks and a definable elasticity of the overlaid image relative to the primary feed, allowing the overlaid image to distort and follow the motion of the underlying anchor points. This would allow, for example, a pre-operative CT scan of a tumor or surgical site to be overlaid on a real-time feed from an endoscope during surgery. This may be used, for example, to extrapolate the preoperative image laterally or in depth from the portion of the scan that is exposed to visualization, allowing the surgeon to see tissue or structures that are currently occluded from the visible view on the user display.
User selectable data set
In various aspects, the surgical hub (e.g., 106, 206) or its imaging module (e.g., 138, 238) may be configured to receive a user-selectable, highlighted, or markable data set that will display its changing data digitally, graphically, or as a highlighted area on another image feed.
User-selectable data sets may be used in a variety of surgical contexts. For example, a blood pressure monitor may be selected for a particular blood vessel or capillary because that vessel will be transected and the surgeon wants to continuously observe the pressure calculated in that region, either to monitor the proximity of the dissection to the blood vessel or an adjacent nerve, or as a means of deciding how long to coagulate a particular region prior to transection. As another example, a surgeon may select a series of blood vessels so that the surgical hub system provides a continuously updated visual feed of the amount of blood moving through those vessels while they are skeletonized, dissected, and then transected separately. In this example, a laser Doppler visualization system may show the magnitude of blood flow measurements over a wide area superimposed on the visual image, together with fluctuations in blood flow as the dissection steps interact with the area of blood flow. Such interpreted information may be displayed on the
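A user-selected data set of the kind described above might be modeled as a small tracked object whose readings drive a display highlight. The class name, the pressure-below-threshold semantics for "coagulated enough to transect," and the threshold value are all assumptions for this sketch.

```python
class WatchedVessel:
    """Hypothetical user-selected data set: a vessel whose pressure is
    streamed and rendered as a highlight on the live feed."""

    def __init__(self, name, coagulation_threshold_mmHg):
        self.name = name
        self.threshold = coagulation_threshold_mmHg
        self.readings = []   # continuously updated pressure history

    def update(self, pressure_mmHg):
        """Record a reading; True signals the region reads as coagulated
        (assumed criterion: pressure has fallen below the threshold)."""
        self.readings.append(pressure_mmHg)
        return pressure_mmHg < self.threshold
```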
In various aspects, the user may interact with the display of information on the
Fig. 20 is a diagram illustrating a
In at least one example, if the processed image shows something the user should be aware of, then hyperspectral imaging-based highlighting (i.e., processing the image to visualize a particular type of structure) may trigger a warning indicator even if the user does not request a particular imaging associated with the warning. For example, if a critical structure is detected but not visible under direct visualization, the surgical hub (e.g., 106, 206) and/or the
In at least one example, as shown in fig. 20, an un-prompted adjustment 205009 to a parameter setting (firing speed setting 205005) is recommended through a
Referring to fig. 17 and 21, the
Referring to fig. 17 and 22, the
Fig. 23 is a schematic diagram illustrating a
In various examples, automatic adjustment of the field of view of the medical imaging device can include automatic focusing and/or centering based on the position of a critical structure (such as an end effector of the surgical device 205001). In other examples, the critical structure may be an anatomical structure or a surgical site location. In at least one example, the center of the area visualized on the
Fig. 24 is a logic flow diagram depicting a process 205100 for automatically adjusting a control program or logic configuration of a medical imaging device relative to a field of view of a detected critical structure. The process 205100 includes detecting 205102 critical structures and evaluating 205104 their position relative to the field of view of the imaging device.
The imaging module (e.g., 138, 238) may employ various suitable image interpretation techniques as described above to detect 205102 critical structures and/or to assess 205104 their location relative to the field of view of the imaging device. In one example, the surgical hub (e.g., 106, 206) or its imaging module (e.g., 138, 238) may include algorithms to decompose the pixels of an image (e.g., an image captured by an endoscope) and perform calculations to determine color differences between critical structures and the surrounding environment. The determined color differences are used to detect 205102 critical structures and/or to evaluate 205104 their position relative to the field of view of the imaging device. In another aspect, the surgical hub (e.g., 106, 206) or an imaging module thereof (e.g., 138, 238) may include a classification algorithm for performing digital image processing on images (e.g., images captured by an endoscope) to detect 205102 (classify) critical structures and/or to assess 205104 their location relative to the field of view of the imaging device.
Where it is determined 205106 that the critical structure is at the edge of the current field of view of the imaging device, and the medical imaging device is able to adjust the field of view on the trajectory of the critical structure (e.g., end effector), the
In at least one example, the visualization system (e.g.,
If the critical structure is an end effector of a surgical device, the end effector may be articulated to a new position at the center of the field of view, for example, according to instructions from a surgical hub. Alternatively, the imaging device may be moved to reposition the field of view relative to the critical structure.
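The edge test and recentering decision of processes 205100/205200 can be sketched in two dimensions. The square field of view, coordinate convention, and edge margin are assumptions for illustration.

```python
def fov_adjustment(structure_xy, fov_center, fov_half_size, edge_margin):
    """Decide whether to recenter the field of view on a tracked structure.

    If the structure lies within edge_margin of the field-of-view border
    (an assumed square FOV), return the structure's position as the new
    center; otherwise keep the current center.
    """
    dx = structure_xy[0] - fov_center[0]
    dy = structure_xy[1] - fov_center[1]
    near_edge = (abs(dx) > fov_half_size - edge_margin or
                 abs(dy) > fov_half_size - edge_margin)
    return structure_xy if near_edge else fov_center
```

In practice the recentering could be realized either by articulating the end effector toward the center or by moving the imaging device, as the surrounding text notes.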
Fig. 25 is a logic flow diagram depicting a process 205200 for automatically adjusting a control program or logic configuration of a medical imaging device relative to a field of view of a detected critical structure. The process 205200 includes receiving input from a surgical hub (e.g., 106, 206) indicating a location of a critical structure, determined by a visualization or imaging module (e.g., 138, 238), relative to the current field of view of the medical imaging device. The process 205200 also includes causing the 205204
In various examples, the medical imaging device includes a camera that aligns the
Various aspects of the subject matter described herein are set forth in the following numbered examples: