Teleoperation method for actively engaging and disengaging a surgical robotic system

Document No.: 156784 · Publication date: 2021-10-26

Reading note: This technology, "Teleoperation method for actively engaging and disengaging a surgical robotic system," was created by T. J. Cohen, J. Savall, A. L. Freiin von Kapri, and E. M. Johnson on 2018-12-31. Its main content includes: A method for engaging and disengaging a surgical instrument of a surgical robotic system, the method comprising receiving a sequence of user inputs from one or more user interface devices of the surgical robotic system; determining, by one or more processors communicatively coupled to the user interface device and the surgical instrument, whether the sequence of user inputs indicates intentional engagement or disengagement of a teleoperation mode in which the surgical instrument is controlled by user inputs received from the user interface device; in response to a determination of engagement, transitioning the surgical robotic system into the teleoperation mode; and in response to a determination of disengagement, transitioning the surgical robotic system out of the teleoperation mode such that the user interface device is prevented from controlling the surgical instrument.

1. A method for engaging and disengaging a surgical instrument of a surgical robotic system, the method comprising:

receiving a sequence of user inputs from one or more user interface devices of the surgical robotic system;

determining, by one or more processors communicatively coupled to the user interface device and the surgical instrument, whether the sequence of user inputs indicates intentional engagement or disengagement of a teleoperation mode in which the surgical instrument is controlled by user inputs received from the user interface device;

in response to the determination of engagement, transitioning the surgical robotic system into the teleoperation mode; and

in response to the determination of disengagement, transitioning the surgical robotic system out of the teleoperation mode such that the user interface device is prevented from controlling the surgical instrument.

2. The method of claim 1, wherein the user interface device comprises at least one of a hand-held user input device and a foot pedal.

3. The method of claim 1, wherein the sequence of user inputs indicating engagement of the teleoperation mode is different from the sequence of user inputs indicating disengagement of the teleoperation mode.

4. The method of claim 1, wherein the sequence of user inputs indicating engagement of the teleoperation mode is the same as the sequence of user inputs indicating disengagement of the teleoperation mode.

5. The method of claim 1, wherein the sequence of user inputs comprises a sequence of first user inputs from a first user interface device and a sequence of second user inputs from a second user interface device.

6. The method of claim 1, wherein the one or more user interface devices comprise a handheld user input device and a clutch pedal, and the sequence of user inputs comprises a sequence of first user inputs received from the handheld user input device and a sequence of second user inputs received from the clutch pedal.

7. The method of claim 1, wherein the sequence of user inputs indicating intentional engagement or disengagement corresponds to pressing and holding a clutch pedal of the one or more user interface devices and double tapping a finger clutch of the one or more user interface devices.

8. The method of claim 1, wherein the sequence of user inputs indicative of intentional engagement or disengagement corresponds to pressing and holding a clutch pedal of the one or more user interface devices and squeezing another one of the one or more user interface devices.

9. The method of claim 1, wherein the sequence of user inputs indicating intentional engagement or disengagement corresponds to double tapping a clutch pedal of the one or more user interface devices.

10. The method of claim 1, wherein the method further comprises determining whether at least one of the following conditions is met prior to transitioning the surgical robotic system into the teleoperational mode: the chair of the surgical robotic system is locked, the user is looking at the screen of the surgical robotic system, and the user interface device is within the surgical robotic system workspace.

11. The method of claim 1, further comprising providing user feedback to alert a user that the teleoperational mode is engaged or disengaged.

12. A surgical robotic system comprising:

a surgical instrument;

a user console including a user interface device and a foot pedal, the user interface device being mechanically ungrounded with respect to the user console; and

one or more processors communicatively coupled to the surgical instrument and the user console, the processors configured to:

receive a sequence of user actions through the user interface device and/or the foot pedal;

determine that the surgical robotic system is in a non-teleoperational mode and that the sequence of user actions indicates intentional engagement; and

transition the surgical robotic system into a teleoperational mode in which the surgical instrument is controlled by user inputs received from the user interface device and the foot pedal.

13. The surgical robotic system as claimed in claim 12, wherein the user interface device comprises a first handheld user input device and a second handheld user input device, and the sequence of user actions indicating intentional engagement comprises tapping or squeezing both the first and second handheld user input devices.

14. The surgical robotic system as claimed in claim 12, wherein the foot pedal is a clutch pedal and the sequence of user actions indicative of intentional engagement includes pressing and holding the clutch pedal and tapping or squeezing the user interface device.

15. The surgical robotic system as claimed in claim 12, wherein the foot pedal is a clutch pedal and the sequence of user actions indicative of intentional engagement includes tapping the clutch pedal and tapping or squeezing the user interface device.

16. A surgical robotic system comprising:

a surgical instrument;

a user console including a user interface device and a foot pedal, the user interface device being mechanically ungrounded with respect to the user console; and

one or more processors communicatively coupled to the surgical instrument and the user console, the processors configured to:

receive a sequence of user actions through the user interface device and/or the foot pedal;

determine that the surgical robotic system is in a teleoperational mode and that the sequence of user actions indicates intentional disengagement, wherein in the teleoperational mode the surgical instrument is controlled by user input received from the user interface device and the foot pedal; and

transition the surgical robotic system into a non-teleoperational mode in which the user interface device or foot pedal is prevented from controlling the surgical instrument.

17. The system of claim 16, wherein the foot pedal is a clutch pedal and the sequence of user actions indicating intentional disengagement includes double tapping of the clutch pedal.

18. The system of claim 16, wherein the foot pedal is a clutch pedal and the sequence of user actions indicative of intentional disengagement includes pressing and holding the clutch pedal and tapping a finger clutch of the user interface device.

19. The system of claim 16, wherein the foot pedal is a clutch pedal and the sequence of user actions indicative of intentional disengagement includes pressing and holding the clutch pedal and squeezing the user interface device.

20. The system of claim 19, wherein the user interface device is a handheld user input device including a sensor operable to detect user input, and at least one of the user inputs includes docking of the handheld user input device.

Technical Field

Embodiments related to a robotic system are disclosed. More particularly, embodiments related to a surgical robotic system and a corresponding method for engaging and disengaging a teleoperation mode of the surgical robotic system are disclosed.

Background

Endoscopic surgery involves looking inside a patient's body and performing a procedure within the body using an endoscope and other surgical tools. For example, laparoscopic surgery may use a laparoscope to access and view the abdominal cavity. Endoscopic surgery may be performed using hand tools and/or a surgical robotic system that provides robotic assistance.

The surgical robotic system may be remotely operated by a surgeon to command a robotically assisted tool located at an operating table. Such remote manipulation of the robotically assisted tool by the surgeon may generally be referred to as teleoperation. For example, a surgeon may use a computer control console, located either in the operating room or in a different city, to command a robot to manipulate surgical tools mounted on the operating table. The robotically controlled surgical tool may be an endoscope mounted on a robotic arm. Thus, endoscopic surgery may be performed by a remote surgeon using a surgical robotic system.

The surgeon may provide input commands to the surgical robotic system, and one or more processors of the surgical robotic system may control system components in response to the input commands. For example, a surgeon may hold a user input device (such as a joystick or a computer mouse) in her hand that she manipulates to generate control signals to cause movement of surgical robotic system components (e.g., actuators of the robotic system, robotic arms, and/or surgical tools).

Disclosure of Invention

The surgical robotic system may be considered to have various system modes. The primary mode of operation may be referred to herein as a teleoperation mode. A surgical robotic system is considered to be in the teleoperation mode when a user actively controls, or is able to actively control, surgical robotic system components (e.g., actuators, robotic arms, surgical tools, and/or endoscopes), for example, via a user interface device or foot pedals. On the other hand, when the user is unable to actively control the surgical robotic system components, the system is deemed to be out of the teleoperation mode, or in a non-teleoperation mode. For example, the system may be considered out of the teleoperation mode, or in a non-teleoperation mode, when (1) the surgical robotic system does not accept user input, (2) user input commands to a graphical user interface (GUI) of the system are accepted but cannot control an associated surgical robotic system component, or (3) the user cannot yet control a surgical robotic system component but a sequence of intentional actions will allow the user to enter, or engage, the teleoperation mode. In teleoperated surgical robotic systems, it is important to provide a safe and simple way for a user to actively engage the teleoperation mode, to avoid injury due to inadvertent tool movement. Furthermore, it is desirable to provide a way for a user to safely disengage from the teleoperation mode whenever the user actively wants to disengage, or performs an action that makes teleoperation no longer the primary focus. While surgical robotic systems may include various safety requirements to prevent inadvertent operation, those requirements may not cover the specific operations of engaging and/or disengaging the teleoperation mode, and/or may be overly complex and thus difficult for a user to remember and/or follow.
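The mode bookkeeping described above can be illustrated with a minimal sketch. The patent provides no code, so all names here (SystemMode, ModeManager) are hypothetical:

```python
from enum import Enum, auto

class SystemMode(Enum):
    TELEOP = auto()      # UID inputs actively drive the surgical instrument
    NON_TELEOP = auto()  # inputs may be accepted but cannot move the instrument

class ModeManager:
    """Tracks whether the system is in the teleoperation mode."""

    def __init__(self) -> None:
        self.mode = SystemMode.NON_TELEOP  # start out of teleoperation

    def engage(self) -> None:
        self.mode = SystemMode.TELEOP

    def disengage(self) -> None:
        self.mode = SystemMode.NON_TELEOP

    def motion_allowed(self) -> bool:
        # Instrument motion is only ever commanded while in teleop mode.
        return self.mode is SystemMode.TELEOP
```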

Accordingly, the present invention is directed to a surgical robotic system that can detect an intended user action, a sequence of intended user actions, or a set of intended user actions, and automatically engage or disengage a teleoperation mode based on the detection. Representatively, in one aspect, the sequence or set of intended user actions is detected by a user interface device. For example, the sequence of intentional user action(s) may be a set of predetermined gestures detected by the user interface device, such as the user tapping on or squeezing the user interface device. Alternatively, the intentional action(s) may be an action that results in the user interface device being in a particular position, orientation, and/or location, such as the user docking the user interface device in a docking station or positioning the user interface device in close proximity to the docking station. Still further, the action(s) may be actions related to foot pedals of the surgical robotic system, alone or in combination with the actions previously discussed in connection with the user interface device. For example, the action(s) or set of actions may include tapping a clutch pedal, or pressing and holding the clutch pedal in combination with tapping and/or squeezing the user interface device.
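One plausible way to recognize such predetermined sequences is to match a trailing window of input events against a table of patterns. This is only a sketch; the event labels are invented for illustration and are not the patent's terms:

```python
# Hypothetical event labels for the gestures described above.
INTENTIONAL_PATTERNS = [
    ("pedal_press_hold", "uid_double_tap"),  # hold clutch pedal + double tap UID
    ("pedal_press_hold", "uid_squeeze"),     # hold clutch pedal + squeeze UID
    ("pedal_double_tap",),                   # double tap the clutch pedal alone
    ("uid_docked",),                         # dock the UID in its docking station
]

def is_intentional_sequence(events: list[str]) -> bool:
    """Return True when the most recent events match any predetermined pattern."""
    return any(tuple(events[-len(p):]) == p for p in INTENTIONAL_PATTERNS)
```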

Representatively, in one aspect, the user interface device may be a handheld user input or interface device that includes a handle that a surgeon may manipulate with her hand to generate input commands that move an actuator to which a robotic surgical tool and/or end effector is coupled in the surgical robotic system. The surgeon may move the handle within a workspace, such as the range of motion of a linkage system connected to the handle, to remotely cause corresponding movement of the actuator when in the teleoperation mode. When the limits of the workspace are reached, such as when the linkage system is fully extended, the surgeon may press a clutch button or finger clutch so that inputs to the surgical robotic system do not cause movement of the actuator; in turn, the robotic surgical tool and/or end effector are paused, or otherwise held in the current position. That is, when the finger clutch is pressed, the handle may be repositioned within the workspace without causing the corresponding movement of the actuator, robotic surgical tool, and/or end effector that would otherwise occur if the finger clutch were not pressed. To actuate the finger clutch, the surgeon must apply a force large enough to oppose the return spring force of the clutch; for example, the surgeon must press down on the finger clutch. When the surgeon releases this downward press on the finger clutch, the robotic surgical tool and/or end effector are no longer paused in the current position, and the user may continue movement of the actuators and associated components using the user interface device. It should be understood, however, that the single action of pressing and holding the finger clutch to pause the associated robotic surgical component in the current position, or releasing the finger clutch to continue movement of the component, should not be construed as an intentional sequence of user actions that may be used to disengage the teleoperation mode. Rather, the clutching or pausing of the associated robotic surgical component is a more temporary operation, caused by a relatively simple user action, than disengaging the teleoperation mode. In particular, during the clutching operation, the signal from the user interface device that moves the actuator, robotic surgical tool, and/or end effector is temporarily suspended, or overridden, so that operation is paused; but once a single action occurs, such as the user removing her finger from the finger clutch, the desired movement may continue. In contrast, when the teleoperation mode is disengaged, for example in response to the intentional action(s) by the user as previously discussed, the user interface device may be considered persistently disconnected from the actuator, tool, and/or end effector. For example, when the teleoperation mode is disengaged, signals from the user interface device may no longer be transmitted to the actuator, robotic surgical tool, and/or end effector, and/or the user interface device may remain disconnected until a relatively complex and intentional sequence of actions is performed by the user. The sequence of intentional actions is one that indicates the user's intent to engage the teleoperation mode, and may be more complex than the single action used for clutching or pausing the system. In other words, the system will not engage (or disengage) the teleoperation mode when a single clutching action is detected. Rather, a sequence of intentional actions must be detected for the system to re-engage the teleoperation mode and then reconnect the user interface device with the actuator, tool, and/or end effector. Thus, in contrast to a clutching operation, simply reversing the first action will not automatically reconnect the user interface device with the actuator, tool, and/or end effector. Rather, another intentional action (which, in some cases, may be the same as the first intentional action, i.e., a symmetric action) is required to reconnect the user interface device to the actuator, tool, and/or end effector, and thus engage the teleoperation mode.
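The distinction drawn here, a clutch pause that a single release undoes versus a disengagement that persists until a full intentional sequence is detected, can be sketched as a command gate. The class and attribute names are illustrative, not the patent's implementation:

```python
class CommandGate:
    """Forwards UID motion commands to the actuators, honoring both the
    temporary finger clutch and the persistent teleop engagement state."""

    def __init__(self) -> None:
        self.clutched = False  # toggled by a single press/release action
        self.engaged = False   # toggled only by an intentional sequence

    def on_finger_clutch(self, pressed: bool) -> None:
        # A single action pauses motion; releasing resumes it.
        self.clutched = pressed

    def on_intentional_sequence(self) -> None:
        # A symmetric gesture engages when disengaged, and vice versa.
        self.engaged = not self.engaged

    def forward(self, motion_command):
        if self.engaged and not self.clutched:
            return motion_command  # delivered to actuator/tool/end effector
        return None                # suppressed: paused or disengaged
```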

More particularly, in one aspect, the present invention relates to a method for engaging and disengaging a surgical instrument of a surgical robotic system. The method may include receiving a sequence of user inputs from one or more user interface devices of the surgical robotic system. The method may further include determining, by one or more processors communicatively coupled to the user interface device and the surgical instrument, whether the sequence of user inputs indicates intentional engagement or disengagement of a teleoperation mode in which the surgical instrument is controlled by user inputs received from the user interface device. In response to a determination of engagement, the surgical robotic system may transition into the teleoperation mode; and in response to a determination of disengagement, the surgical robotic system may transition out of the teleoperation mode such that the user interface device is prevented from controlling the surgical instrument. The user interface device may include at least one of a handheld user input device and a foot pedal. The sequence of user inputs indicating engagement may be different from the sequence indicating disengagement; in other cases, the two sequences may be the same. In some cases, the sequence of user inputs may include a sequence of first user inputs from a first user interface device and a sequence of second user inputs from a second user interface device. In a still further aspect, the one or more user interface devices may include a handheld user input device and a clutch pedal, and the sequence of user inputs includes a sequence of first user inputs received from the handheld user input device and a sequence of second user inputs received from the clutch pedal. In some aspects, the sequence of user inputs indicating intentional engagement or disengagement may correspond to pressing and holding a clutch pedal of the one or more user interface devices and double tapping a finger clutch of the one or more user interface devices. Still further, the sequence may correspond to pressing and holding a clutch pedal and squeezing another of the one or more user interface devices, or to double tapping the clutch pedal. The method may further include determining whether at least one of the following conditions is met prior to transitioning the surgical robotic system into the teleoperation mode: the chair of the surgical robotic system is locked, the user is looking at the screen of the surgical robotic system, and the user interface device is within the surgical robotic system workspace. The method may further include providing user feedback to alert a user that the teleoperation mode is engaged or disengaged.
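The additional conditions named in the method (chair locked, gaze on screen, UID within the workspace) amount to a pre-engagement check. A sketch follows; note the claim requires only at least one condition, while this illustrative version conservatively requires all three:

```python
def may_enter_teleop(chair_locked: bool,
                     gaze_on_screen: bool,
                     uid_in_workspace: bool) -> bool:
    """Gate the transition into teleoperation on the safety conditions.

    Illustrative only: requires all three conditions, although the
    method as claimed checks that at least one is met.
    """
    return chair_locked and gaze_on_screen and uid_in_workspace
```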

In another aspect, the present invention relates to a surgical robotic system comprising: a surgical instrument; a user console including a user interface device and a foot pedal, the user interface device being mechanically ungrounded with respect to the user console; and one or more processors communicatively coupled to the surgical instrument and the user console. The processor may be configured to receive a sequence of user actions through the user interface device and/or the foot pedal, determine that the surgical robotic system is in a non-teleoperational mode and that the sequence of user actions indicates intentional engagement, and transition the surgical robotic system into a teleoperational mode in which the surgical instrument is controlled by user inputs received from the user interface device and the foot pedal. In one aspect, the user interface device may include a first handheld user input device and a second handheld user input device, and the sequence of user actions indicating intentional engagement includes tapping or squeezing both the first and second handheld user input devices. In another aspect, the foot pedal may be a clutch pedal and the sequence of user actions indicating intentional engagement includes pressing and holding the clutch pedal and tapping or squeezing the user interface device, or the sequence of user actions indicating intentional engagement includes tapping the clutch pedal and tapping or squeezing the user interface device.

In another aspect, a surgical robotic system may include: a surgical instrument; a user console including a user interface device and a foot pedal, the user interface device being mechanically ungrounded with respect to the user console; and one or more processors communicatively coupled to the surgical instrument and the user console. In this aspect, the processor may be configured to receive a sequence of user actions through the user interface device and/or the foot pedal; determine that the surgical robotic system is in a teleoperational mode, in which the surgical instrument is controlled by user input received from the user interface device and the foot pedal, and that the sequence of user actions indicates intentional disengagement; and transition the surgical robotic system into a non-teleoperational mode in which the user interface device or the foot pedal is prevented from controlling the surgical instrument. In one aspect, the foot pedal is a clutch pedal and the sequence of user actions indicating intentional disengagement includes double tapping of the clutch pedal. In another aspect, the sequence of user actions indicating intentional disengagement may include pressing and holding the clutch pedal and tapping a finger clutch of the user interface device, or pressing and holding the clutch pedal and squeezing the user interface device. In a still further aspect, the user interface device is a handheld user input device including a sensor operable to detect user input, and at least one of the user inputs includes docking of the handheld user input device.

The above summary does not include an exhaustive list of all aspects of the present invention. It is contemplated that the present invention includes all systems and methods that can be practiced from all suitable combinations of the various aspects summarized above, as well as those disclosed in the detailed description below and particularly pointed out in the claims filed with the application. Such combinations have particular advantages not specifically recited in the above summary.

Drawings

Embodiments of the invention are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings in which like references indicate similar elements. It should be noted that references to "an" or "one" embodiment of the invention in this disclosure are not necessarily to the same embodiment, and they mean at least one. Moreover, for the sake of simplicity and to reduce the total number of figures, a given figure may be used to illustrate features of more than one embodiment of the invention, and not all elements in a figure may be required for a given embodiment.

Fig. 1 is a schematic view of an example surgical robotic system in an operating arena, according to an embodiment.

FIG. 2 is a schematic diagram of a user interface device having a finger clutch, according to an embodiment.

Fig. 3 is a perspective view of a user interface device according to an embodiment.

Fig. 4 is a perspective view of a user interface device in a docking station according to an embodiment.

Fig. 5 is a block diagram of a computer portion of a surgical robotic system, according to an embodiment.

Fig. 6 is a block diagram of an exemplary process for disengaging a teleoperational mode, according to an embodiment.

Fig. 7 is a block diagram of an exemplary process for engaging a teleoperational mode, according to an embodiment.

Fig. 8 is a block diagram of an exemplary process for engaging a teleoperational mode, according to an embodiment.

Fig. 9 is a block diagram of a computer portion of a surgical robotic system, according to an embodiment.

Detailed Description

Embodiments describe processes for engaging and disengaging a teleoperation mode based on user actions in conjunction with, for example, a User Interface Device (UID) that may be used by a robotic system to control an actuator that moves a robotic arm or tool. The robotic system may be a surgical robotic system, the robotic arm may be a surgical robotic arm, and the tool may be a surgical tool. However, the UID may be used by other systems, such as interventional cardiology systems, vision systems, or aircraft systems, to control other output components, to name just a few possible applications.

In various embodiments, reference is made to the accompanying drawings. In the following description, numerous specific details are set forth, such as specific configurations, dimensions, and processes, in order to provide a thorough understanding of the embodiments. However, certain embodiments may be practiced without one or more of these specific details, or in combination with other known methods and configurations. In other instances, well-known processes and manufacturing techniques have not been described in particular detail in order not to unnecessarily obscure the description. Reference throughout this specification to "one embodiment," "an embodiment," or the like means that a particular feature, structure, configuration, or characteristic described is included in at least one embodiment. Thus, appearances of the phrases "one embodiment," "an embodiment," or the like in various places throughout this specification are not necessarily referring to the same embodiment. Furthermore, the particular features, structures, configurations, or characteristics may be combined in any suitable manner in one or more embodiments.

The use of relative terms throughout the description may refer to relative positions or orientations. For example, "distal" may indicate a first direction away from a reference point (e.g., away from the user). Similarly, "proximal" may indicate a position in a second direction opposite the first direction (e.g., toward the user). However, such terms are provided to establish relative frames of reference and are not intended to limit, for example, the use or orientation of a UID to the particular configurations described in the various embodiments below.

Referring to fig. 1, this is a schematic view of an example surgical robotic system 1 in an operating site. The robotic system 1 includes a user console 2, a control tower 3, and one or more surgical robotic arms 4 at a surgical robotic platform 5 (e.g., table, bed, etc.). The system 1 may incorporate any number of devices, tools, or accessories for performing a procedure on a patient 6. For example, the system 1 may include one or more surgical tools 7 for performing a procedure. The surgical tool 7 may be an end effector attached to the distal end of the surgical arm 4 for performing a surgical procedure.

During surgery, each surgical tool 7 may be operated manually, robotically, or both. For example, the surgical tool 7 may be a tool for accessing, viewing, or manipulating the internal anatomy of the patient 6. In an embodiment, the surgical tool 7 is a grasper that can grasp tissue of the patient. The surgical tool 7 may be controlled manually by a bedside operator 8, or it may be controlled robotically via actuated movement of the surgical robotic arm 4 to which it is attached. The robotic arm 4 is shown as a table-mounted system, but in other configurations the arm 4 may be mounted on a cart, a ceiling or side wall, or another suitable structural support.

Typically, a remote operator 9, such as a surgeon or other operator, may use the user console 2 to remotely manipulate the arm 4 and/or the attached surgical tool 7, i.e., by teleoperation. As shown in fig. 1, user console 2 may be located in the same operating room as the rest of system 1. However, in other environments, user console 2 may be located in an adjacent or nearby room, or it may be at a remote location, such as in a different building, city, or country. User console 2 may include a chair 10, foot-operated controls 13, one or more handheld user input or interface devices (UIDs) 14, and at least one user display 15 configured to display, for example, a view of the surgical site within patient 6. In the example user console 2, remote operator 9 sits in the chair 10 and views the user display 15 while manipulating the foot-operated controls 13 and a handheld UID 14 in order to remotely control the arm 4 and the surgical tool 7 (which is mounted on the distal end of arm 4).

In some variations, the bedside operator 8 may also operate the system 1 in an "over the bed" mode, in which the bedside operator 8 (user) is at the side of the patient 6 and simultaneously manipulates a robotically driven tool (such as an end effector attached to the arm 4) and a manual laparoscopic tool, for example holding a handheld UID 14 in one hand and the manual laparoscopic tool in the other. For example, the bedside operator's left hand may manipulate the handheld UID to control the robotic components, while the bedside operator's right hand manipulates the manual laparoscopic tool. Thus, in these variations, the bedside operator 8 may perform both robot-assisted minimally invasive surgery and manual laparoscopic surgery on the patient 6.

During an example procedure (surgery), the patient 6 is prepped and draped in a sterile fashion to achieve anesthesia. Initial access to the surgical site may be performed manually while the arms of the robotic system 1 are in a stowed or retracted configuration (to facilitate access to the surgical site). Once access is complete, initial positioning or preparation of the robotic system 1 (including its arms 4) may be performed. Next, the procedure is performed by remote operator 9 at user console 2, who uses the foot-operated controls 13 and the UIDs 14 to manipulate the various end effectors, and possibly an imaging system, to perform the procedure. Manual assistance may also be provided at the operating table by sterile-gowned bedside staff, e.g., bedside operator 8, who may perform tasks such as retracting tissue and performing manual repositioning and tool changes on one or more of the robotic arms 4. Non-sterile personnel may also be present to assist remote operator 9 at user console 2. When the procedure is complete, the system 1 and user console 2 may be configured or set in a state to facilitate post-operative procedures, such as cleaning or disinfection, and healthcare record entry or printout via user console 2.

In one embodiment, remote operator 9 holds and moves UID 14 to provide input commands to move a robotic arm actuator 17 in robotic system 1. UID 14 may be communicatively coupled to the rest of robotic system 1, for example, via console computer system 16. UID 14 may generate spatial state signals corresponding to the movement of UID 14, such as the position and orientation of the handheld housing of the UID, and the spatial state signals may be input signals for controlling the motion of robotic arm actuator 17. The robotic system 1 may use control signals derived from the spatial state signals to control proportional motion of the actuator 17. In one embodiment, a console processor of console computer system 16 receives the spatial state signals and generates the corresponding control signals. Based on these control signals, which control how actuator 17 is energized to move the segments or links of arm 4, the movement of the corresponding surgical tool attached to the arm may mimic the movement of UID 14. Similarly, interaction between remote operator 9 and UID 14 may generate, for example, a grasp control signal that causes the jaws of a grasper of surgical tool 7 to close and grasp the tissue of patient 6.
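The derivation of control signals from spatial state signals, with proportional motion, might look like the following sketch. The pose representation and scale factor are assumptions for illustration, not values from the patent:

```python
import numpy as np

def control_from_spatial_state(prev_pose: np.ndarray,
                               curr_pose: np.ndarray,
                               scale: float = 0.5) -> np.ndarray:
    """Turn a change in UID pose into a proportional actuator command.

    Poses are illustrative 6-vectors (x, y, z, roll, pitch, yaw); the
    scale factor makes instrument motion proportional to, and typically
    smaller than, the surgeon's hand motion.
    """
    return scale * (curr_pose - prev_pose)
```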

UID 14 may also be provided to command other operations of surgical robotic system 1. For example, UID 14 may include a finger clutch, and pressing on the finger clutch may generate a clutch signal to pause movement of actuator 17 and, correspondingly, of surgical robotic arm 4 and surgical tool 7, as described below. For example, when a user presses a finger clutch of UID 14 with a finger, the finger clutch may generate a clutch signal, and the clutch signal may be an input signal to pause all movements of actuator(s) 17 and, correspondingly, of surgical robotic arm 4 and surgical tool 7. When all movements of the surgical robotic arm 4 and the surgical tool 7 are paused, there is no movement in any direction and no change in the orientation of the surgical robotic arm 4 or the surgical tool 7. The clutch signal may be referred to as a "clutch activation signal" when assertion of the signal pauses movement of the actuator(s) 17. Similarly, the input signal may be a "clutch deactivation signal" when no touch by the operator 9 is detected and movement of the actuator(s) 17 is not paused. A clutch signal, such as a clutch activation signal, when asserted, may pause movement of the robotic arm and surgical tool regardless of the spatial state signal. Thus, the clutch signal effectively overrides the actuation commands derived from the spatial state signal.
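The override behavior, an asserted clutch activation signal pausing motion regardless of the spatial state signal, reduces to a simple guard, sketched here with assumed names:

```python
def actuator_command(spatial_delta, clutch_active: bool):
    """An asserted clutch signal overrides any motion derived from the
    spatial state signal; no command reaches the actuators."""
    if clutch_active:
        return None  # all movement of arm and tool is paused
    return spatial_delta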

Furthermore, UID 14 may be used to detect an intended user action, a sequence of intended user actions, or a set of intended user actions that indicates a desire by the user to engage or disengage a teleoperation mode. For example, UID 14 may detect a tap, double tap, squeeze, or other gesture associated with UID 14, and the system may be programmed to recognize one or more of these intended user action(s) as a command to engage or disengage the teleoperation mode, and then send an engage or disengage signal to the surgical robotic system accordingly. In one embodiment, the intentional action(s) required to disengage or engage the teleoperation mode may be the same, or otherwise considered symmetric actions. For example, a double tap on the finger clutch may be used to engage the teleoperation mode when not in the teleoperation mode, and a double tap on the finger clutch may be used to disengage the teleoperation mode when in the teleoperation mode. In other cases, the intentional actions for engaging and disengaging the teleoperation mode may be different (e.g., asymmetric). Further, in some cases, the intentional user action(s) may include positioning UID 14 in docking station 132 (or at least near docking station 132), or may be intentional user action(s) with respect to another input device of surgical robotic system 1. For example, foot-operated controller(s) 13 may include a clutch pedal, and tapping the clutch pedal, alone or in combination with any of the intentional actions previously discussed, may engage and/or disengage the teleoperation mode. In an embodiment, one or more sensors, such as capacitive sensing pads, may be located on UID 14, foot-operated controller(s) 13 (e.g., the clutch pedal), and/or docking station 132; an intended user action or a set of intended user actions may be detected by the sensing pads, which send an engage and/or disengage signal to one or more processors of surgical robotic system 1 to engage and/or disengage the teleoperation mode.

Surgical robotic system 1 may include several UIDs 14, with a respective control signal generated for each UID that controls the actuators of the respective arm 4 and the surgical tool (end effector). For example, remote operator 9 may move first UID 14 to control the movement of actuator 17 in the left robotic arm, where the actuator responds by moving a link, gear, etc. in arm 4. Similarly, movement of second UID 14 by remote operator 9 controls the movement of another actuator 17, which in turn moves other links, gears, etc. of robotic system 1. The robotic system 1 may comprise a right arm 4 fixed to a bed or table on the right side of the patient, and a left arm 4 at the left side of the patient. The actuators 17 may comprise one or more motors controlled such that they drive the rotation of the joints of the arm 4 to change the orientation of the endoscope or grasper of the surgical tool 7 attached to the arm, e.g. relative to the patient. The motion of several actuators 17 in the same arm 4 may be controlled by spatial state signals generated by a particular UID 14. UID 14 may also control the movement of the respective surgical tool graspers. For example, each UID 14 may generate a respective grasping signal to control movement of an actuator (e.g., a linear actuator) that opens or closes jaws of a grasper at a distal end of surgical tool 7 to grasp tissue within patient 6.

In some aspects, communication between the platform 5 and the user console 2 may be through a control tower 3, which may translate user commands received from the user console 2 (and more particularly, from the console computer system 16) into robot control commands that are transmitted to the arms 4 on the robotic platform 5. The control tower 3 may also transmit status and feedback from the platform 5 back to the user console 2. The communication connections between the robotic platform 5, the user console 2, and the control tower 3 may use any suitable data communication protocol over wired and/or wireless links. Any wired connections may optionally be built into the floor and/or the walls or ceiling of the operating room. The robotic system 1 may provide video output to one or more displays, including displays within the operating room and remote displays accessible via the internet or other networks. The video output or feed may also be encrypted to ensure privacy, and all or part of the video output may be saved to a server or an electronic healthcare recording system.

It will be understood that the operating room scenario in fig. 1 is illustrative and may not accurately represent certain medical practices.

Referring to fig. 2, a schematic diagram of a UID with a finger clutch is shown, according to an embodiment. UID 14 may include a device housing 202 to be held by an operator or user 107. For example, user 107 may hold device housing 202 between several fingers and move UID 14 within a workspace. The workspace may be a range within reach of the user 107. As described below, UID 14 may include tracking sensors to detect a position and/or orientation of device housing 202 as user 107 moves UID 14, and the detected position and/or orientation may be associated with another component of the surgical robotic system. For example, the tracking sensor may detect translation, rotation, or tilting of the device housing 202 within the workspace. The tracking sensors may include accelerometers and/or gyroscopes or other inertial sensors. Movement of UID 14 within the workspace may cause corresponding movement of a surgical robotic arm of the surgical robotic system, a surgical tool, or an end effector (e.g., a grasper or jaws) of the surgical tool.

UID 14 may include a clutch mechanism to decouple movement of UID 14 from movement of surgical robotic arm 112 and/or surgical tool 104. For example, UID 14 may include a finger clutch 206 mounted on device housing 202 to engage and disengage the surgical robotic system. The finger clutch 206 may be so called because it may be actuated by a single pressing action from the finger of the user 107. That is, when the user 107 presses the finger clutch 206 with a finger, the touch may be detected as a clutch input. In response to the clutch input, the movement of UID 14 detected by the tracking sensor may not be used by the one or more processors to control movement of the surgical robotic system. When the clutch input is removed (when the touch ends), movement of UID 14 may again cause corresponding movement of the surgical robotic system. That is, when finger clutch 206 is released, for example by removing a finger from finger clutch 206, UID 14 movement may again be detected by surgical robotic system 1 and used as a motion control input.

When the limits of the workspace have been reached, the clutching mechanism of UID 14 may allow user 107 to reposition UID 14 within the workspace. For example, by extending the arm completely from a starting position in a first direction while holding UID 14, user 107 may reach a limit of the workspace, such as an edge of the workspace. To reposition UID 14 within the workspace and allow additional movement in the direction of the workspace edge, user 107 may press finger clutch 206 with the index finger to disconnect the robotic system from the movement of UID 14. User 107 may then move UID 14 back to the starting position within the workspace and release the clutch by lifting the index finger from finger clutch 206. Additional movement in the first direction may then be performed by moving UID 14 to command movement of surgical robotic arm 112.

UID 14 may also include a sensor 204 or sensing portion to detect an intended user action, a sequence of intended user actions, or a set of intended user actions for engaging and/or disengaging a teleoperational mode. In some embodiments, the sensor 204 may be positioned within the finger clutch 206, along the finger clutch 206, on the finger clutch 206, or otherwise near the finger clutch 206 such that it is easily accessible by the user's finger. For example, the sensor 204 may be a pressure sensor or a conductive sensor located within the finger clutch 206, along the finger clutch 206, on the finger clutch 206, or otherwise near the finger clutch 206. Representatively, sensor 204 may be a portion of an outer touch surface of finger clutch 206 that faces outwardly toward the ambient environment. When a finger of the user 107 touches the outer touch surface of the finger clutch 206, the finger is separated from the conductive pads within the finger clutch 206 by the wall thickness of the clutch cover. The clutch cover may be formed of a dielectric material (e.g., plastic) and, thus, when a conductive finger of the user 107 touches the external touch surface, the capacitance across the walls of the finger clutch cover will change.
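Reading such a capacitive touch sensor typically reduces to comparing the measured capacitance against a no-touch baseline. The threshold below is an invented figure for illustration only, not a value from the patent:

```python
def touch_detected(capacitance_f: float,
                   baseline_f: float,
                   threshold_f: float = 1.5e-12) -> bool:
    """Report a touch when capacitance across the clutch cover wall
    shifts from its no-touch baseline by more than a threshold.

    Values are in farads; the 1.5 pF default is illustrative.
    """
    return abs(capacitance_f - baseline_f) > threshold_f
```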

Referring to fig. 3, a perspective view of a UID is shown, according to an embodiment. As can be seen from this view, device housing 202 of UID 14 may further include a gripping surface 302 to be held between the fingers of user 107. The device housing 202 may have one or more rounded or bulbous surface contours. For example, the device housing 202 may be generally oval or egg-shaped, or it may be an ellipsoid. In an embodiment, the portion of the device housing 202 in front of a circumferential ridge 310 of the device housing 202 may be shorter and have a less gradual profile or taper than the portion of the device housing 202 behind the ridge 310.

In an embodiment, the finger clutch 206 is mounted on the housing end 304. For example, the housing end 304 may be the distal end of the device housing 202. The housing end 304 may be a location or surface located at the end of the housing 202 in the first longitudinal direction. For example, the location may be the edge of the housing 202 that is farthest from the opposite end of the housing (e.g., the proximal end 306).

The finger clutch 206 may extend distally from the housing end 304. Positioning finger clutch 206 at the front of UID 14 may allow user 107 to easily reach forward and touch finger clutch 206, and in turn allow sensor 204 to detect such a touch by the index finger while the user holds gripping surface 302 between the thumb and other fingers. UID 14 may thus be sized and shaped to be held comfortably in the hand of user 107. Command signals input through UID 14 may be transmitted to computer system 16 by wire or, more preferably, over a wireless connection.

Representatively, in some embodiments, UID 14 may be a portable handheld user input device or controller that is not grounded with respect to another component of the surgical robotic system. For example, UID 14 may be ungrounded whether connected to or disconnected from the user console. The term "ungrounded" is intended to refer to implementations in which, for example, both UIDs are neither mechanically nor kinematically constrained with respect to the user console. For example, a user may hold UID 14 in hand and move it freely to any possible position and orientation within a space limited only by, for example, a tracking mechanism of the user console. Signals (e.g., tracking sensor signals, clutch signals, or engage/disengage teleoperation mode signals) may be communicated wirelessly between UID 14 and computer system 16. Further, a power source (such as a rechargeable battery) may be housed within the housing of UID 14 so that UID 14 need not be connected mechanically to a power source, such as by a wire or cable. Because UID 14 may be unconnected and/or mechanically ungrounded, remote operator 9 and/or bedside operator 8 may carry UID 14 and move about one or more locations within the operating site (e.g., the bedside of patient 6) during a procedure. In this regard, UID 14 allows operator 9 and/or operator 8 to control arm 4 and/or surgical tool 7 from one or more locations within the operating site (e.g., various locations near the bedside of patient 6). Further, when at the bedside, the operator may control arm 4 and/or surgical tool 7 using UID 14 and a manual tool simultaneously. The simultaneous manipulation of arm 4 and/or surgical tool 7 and the manual tool reduces the need for another operator (e.g., a surgical assistant) to manipulate the manual tool while remote operator 9 is manipulating arm 4 and/or surgical tool 7.

In addition, the portable handheld UID 14 allows operator 8 or 9 to alternate, during the course of a procedure, between sitting at the user console 2 controlling arm 4 and/or tool 7 and standing at another location within the operating site (e.g., near the bedside). For example, for certain surgical procedures, the operator 8 or 9 may find his or her movements more comfortable or easier to control while sitting at the user console 2, and for others, closer to the patient's bedside. Thus, the operator may move between different positions and/or locations as dictated by the procedure.

In other embodiments, wires may extend from UID 14 to connect UID 14 to computer system 16 and/or console 10. The wires may provide power to UID 14 and may transmit sensor signals to computer system 16, such as tracking sensor signals, clutch signals, or engage/disengage teleoperation mode signals. UID 14 may thus be a peripheral device for entering commands into computer system 16. UID 14 may be used in combination with other peripheral input devices. For example, a clutch pedal (e.g., foot-operated controller(s) 13) may be connected to the computer system 16 to provide clutch input or engagement/disengagement input to the surgical robotic system 1. Although each UID 14 may be clutched individually to suspend teleoperation of the respective surgical robotic arm or surgical tool, the respective surgical robotic arms or tools may also be clutched simultaneously by pressing a clutch pedal. Accordingly, movement of actuator 17 may be commanded by UID 14 and other peripheral input devices of computer system 16, such as a clutch pedal operated by a user's foot.

Referring to fig. 4, a perspective view of a UID in a docking station is illustrated, according to an embodiment. Representatively, docking station 132 may form an interior chamber sized to receive the proximal end 306 of UID 14. Furthermore, in some embodiments, docking station 132 may further include a docking sensor 404 that detects when UID 14 is in, or near, docking station 132. For example, docking sensor 404 may be a pressure sensor or a capacitive sensor that detects when UID 14 contacts a surface of docking station 132. In other aspects, sensor 404 may be a sensor that detects when UID 14 is in close proximity to docking station 132, but not necessarily in contact with docking station 132. For example, sensor 404 may be a proximity sensor that detects the proximity of UID 14 to docking station 132. In still further instances, UID 14 may include a sensor, in addition to or in lieu of docking sensor 404, to detect when UID 14 is in contact with or in close proximity to docking station 132. Once docking sensor 404 detects UID 14 at docking station 132, docking sensor 404 sends a signal to surgical robotic system 1 indicating that UID 14 is docked (or nearby and about to be docked) and, thus, for example, that the teleoperation mode should be disengaged. In addition, docking sensor 404 may also detect when UID 14 is no longer docked at, or near, docking station 132, and thus send a signal indicating that the user is ready to enter the teleoperation mode and that the various components (e.g., actuators, tools, and/or end effectors) are ready to be re-engaged.
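Wiring the docking sensor's two transitions into the mode logic could look like the sketch below, reusing the hypothetical ModeManager from the earlier sketch. Note that, per the text, undocking only signals readiness; engagement still requires the intentional sequence:

```python
class DockingMonitor:
    """Illustrative handler for docking sensor 404 state changes."""

    def __init__(self, mode_manager) -> None:
        self.mode = mode_manager
        self.ready_to_engage = False

    def on_change(self, docked: bool) -> None:
        if docked:
            # Docking the UID indicates teleoperation should disengage.
            self.mode.disengage()
            self.ready_to_engage = False
        else:
            # Undocking indicates readiness to re-engage; actual
            # engagement still requires an intentional sequence.
            self.ready_to_engage = True
```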

Fig. 5 is a block diagram of a computer portion of a surgical robotic system, according to an embodiment. As illustrated in fig. 5, UIDs 14 may include a UID 14L and a UID 14R. UID 14L may be operated by the left hand of the user, while UID 14R may be operated by the right hand of the user. For example, UID 14L may be used to control a surgical robotic component at, e.g., the left side of a patient, while UID 14R may be used to control a surgical robotic component at, e.g., the right side of the patient, as previously discussed. Each of UIDs 14L and 14R may include an interior volume for receiving various electronics and/or other components. For example, each of UIDs 14L and 14R may include a UID processor 506 installed within device housing 202. UID processor 506 may include circuitry for analog and digital signal processing, including sense amplifier circuitry and analog-to-digital conversion circuitry for interfacing with capacitive sensors, and logic circuitry including programmable logic or a programmable digital processor. The UID processor 506 may be mounted on a printed circuit board with various sensor terminals to connect the UID processor 506 to device sensors, such as the finger clutch sensor 502 or the squeeze sensor 504. Batteries (not shown) may be mounted on or otherwise associated with the printed circuit board to power the electronic components of UIDs 14L and 14R.

UIDs 14R and 14L may each further include a finger clutch sensor 502. The finger clutch sensor 502 may be substantially the same as, for example, the sensor 204 previously discussed with reference to fig. 2. Representatively, the finger clutch sensor 502 may be a sensor that can detect a predetermined intended user action, or set of intended user actions, indicative of a desire by the user to engage or disengage a teleoperation mode. For example, the finger clutch sensor 502 may be a capacitive sensor associated with the finger clutch 206, which may be actuated when the user 107 taps a finger on the finger clutch cover. The UID processor 506 may be configured to determine that a predetermined intended user action (or sequence of intended user actions) has been performed by the user 107 in response to a change in capacitance of the finger clutch sensor 502. For example, in embodiments, UID processor 506 may be configured to determine that a predetermined intended user action, sequence of intended user actions, or set of user actions has occurred when the change in capacitance persists for a predetermined period of time and/or repeats within a predetermined interval. For example, the finger clutch sensor 502 may detect a single or double finger tap by the user 107 on the finger clutch 206 and transmit a corresponding signal to the UID processor 506. A single or double tap may be a gesture of the user 107 that includes touching a finger to the finger clutch sensor 502 for a predetermined period of time (e.g., 0.5 seconds or less) and, in the case of a double tap, at a predetermined interval (e.g., 0.5 seconds or less between taps). When the detected change in capacitance is greater than a predetermined threshold over the predetermined period of time (and/or interval), UID processor 506 may determine that user 107 has performed one or a set of predetermined intentional actions indicating a desire to engage and/or disengage the teleoperation mode. Accordingly, UID processor 506 may then generate a corresponding signal that is transmitted to surgical system processor 912 (e.g., of computer system 16). The surgical system processor 912 may then determine whether the signal indicates a desire to engage or disengage the teleoperation mode, for example based on whether the system is already in the teleoperation mode. Once the associated command is determined, an engage signal to engage the teleoperation mode, or a disengage signal to disengage it, may be generated or otherwise initiated by the processor to transition the surgical robotic system into (e.g., enter) or out of (e.g., exit) the teleoperation mode.
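The tap timing described here (touches of at most about 0.5 s, separated by at most about 0.5 s) suggests a classifier along these lines. The event representation is an assumption for illustration:

```python
def is_double_tap(touches: list[tuple[float, float]],
                  max_touch_s: float = 0.5,
                  max_gap_s: float = 0.5) -> bool:
    """Classify the last two touches as a double tap.

    touches: (start, end) times in seconds, most recent last, using the
    0.5 s figures given in the text.
    """
    if len(touches) < 2:
        return False
    (s1, e1), (s2, e2) = touches[-2:]
    short = (e1 - s1) <= max_touch_s and (e2 - s2) <= max_touch_s
    close = (s2 - e1) <= max_gap_s
    return short and close
```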

It will be appreciated that the finger clutch sensor 502 may be one or more sensor types for detecting a touch by the user 107. More particularly, while the finger clutch sensor 502 has been primarily described as including a capacitive sensor, the finger clutch sensor 502 may be a different type of sensor or combination of sensors to determine that the user 107 has touched the finger clutch. For example, finger clutch sensor 502 may include a proximity sensor that may detect the proximity of a finger of user 107 to finger clutch 206 or generally UID 14L or 14R before the finger of user 107 actually contacts the surface. Thus, the above-described embodiments are intended to include different types of sensors that detect touch based on the presence or proximity of an object, and in some cases, do not require detection of a threshold force exerted by the object on the finger clutch 206.

UIDs 14L and 14R may further include squeeze sensors 504 associated with device housing 202. In an embodiment, the squeeze sensor 504 generates a squeeze signal when squeezed. More particularly, the squeeze sensor 504 is configured to generate a squeeze signal in response to a squeeze of the device housing 202. Thus, the squeeze sensor 504 may detect when the user 107 engages in an intentional user action, a sequence of intentional user actions, or a set of intentional user actions involving squeezing the device housing 202 as an indication of the user's desire to engage or disengage the teleoperation mode. For example, the squeeze sensor 504 may detect when the user 107 squeezes the device housing 202 once or repeatedly. Squeeze sensor 504 may then output a corresponding squeeze signal to UID processor 506, which in turn transmits the signal to surgical robotic system processor 912. Surgical robotic system processor 912 may then determine whether the squeeze signal indicates a desire by the user to engage and/or disengage the teleoperation mode. For example, when a squeeze is detected within a predetermined time period (e.g., 0.5 seconds or less), and at a predetermined interval (e.g., less than 0.5 seconds between squeezes in the case of a double squeeze), the processor will interpret this as an intentional user action, a sequence of intentional user actions, or a set of intentional user actions to engage and/or disengage the teleoperation mode. In some cases, the squeeze sensor 504 may be, or may be associated with, a grip flex circuit that deforms when the user 107 squeezes the device housing 202, and the physical deformation may be converted into an electrical signal, such as a capacitive signal. In other cases, the squeeze sensor 504 may be an optical sensor, such as a proximity sensor that can detect the proximity (or change in distance) of an inner wall of the device housing 202. In another example, the squeeze sensor 504 may include an ultrasonic sensor, a magnetic sensor, an inductive sensor, or another suitable kind of proximity sensor. Further, it should be understood that in some embodiments the user may also squeeze the device housing 202 to control the grasping action of the end effector; however, the squeeze associated with the grasping action is different from an intentional single or double squeeze indicating a desire by the user to engage and/or disengage the teleoperation mode, and the processor 912 is programmed to distinguish between the two to avoid any inadvertent operation.
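Distinguishing the intentional double squeeze from a sustained grasp-control squeeze can be sketched with the same interval representation. Thresholds follow the 0.5 s figures in the text; the labels are invented:

```python
def classify_squeeze(squeezes: list[tuple[float, float]],
                     max_len_s: float = 0.5,
                     max_gap_s: float = 0.5) -> str:
    """squeezes: (start, end) intervals in seconds, most recent last."""
    if len(squeezes) >= 2:
        (s1, e1), (s2, e2) = squeezes[-2:]
        if ((e1 - s1) <= max_len_s and (e2 - s2) <= max_len_s
                and (s2 - e1) <= max_gap_s):
            return "mode_gesture"  # intentional double squeeze
    if squeezes and (squeezes[-1][1] - squeezes[-1][0]) > max_len_s:
        return "grasp"             # sustained squeeze drives the grasper
    return "none"
```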

UID 14 may include other circuitry. For example, UID 14 may include a drop detection sensor to prevent inadvertent instrument movement, e.g., when UID 14 is dropped. For example, the drop detection sensor may generate a drop signal in response to entering a free-fall state upon being dropped. In embodiments, the drop detection sensor may be a tracking sensor that monitors movement of UID 14. When the tracking sensor detects movement corresponding to the drop state, the sensor generates a clutch signal to pause all motion of the surgical robotic system 1. Furthermore, in some embodiments, UID 14 may include a docking sensor. For example, docking sensor 404, discussed previously, may be coupled to UID 14 in addition to, or instead of, docking station 132. The docking sensor may detect when UID 14 is docked, or is otherwise near docking station 132 in a manner that suggests that the user is about to dock UID 14.
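
For illustration, one plausible (hypothetical) way a tracking or inertial sensor could recognize a free-fall state is to watch for sustained near-zero acceleration magnitude; the thresholds below are assumptions, not values from the embodiments:

```python
import math

GRAVITY = 9.81             # m/s^2
FREE_FALL_FRACTION = 0.3   # assumed: below 30% of 1 g suggests free fall
MIN_FREE_FALL_TIME = 0.05  # assumed seconds of sustained low acceleration

class DropDetector:
    """Illustrative free-fall detector for pausing motion when a UID drops."""

    def __init__(self):
        self.free_fall_since = None

    def update(self, accel_xyz, t):
        """accel_xyz: (ax, ay, az) in m/s^2 from a tracking/IMU sensor; t in
        seconds. Returns True once a drop is detected, so motion can pause."""
        magnitude = math.sqrt(sum(a * a for a in accel_xyz))
        if magnitude < FREE_FALL_FRACTION * GRAVITY:
            if self.free_fall_since is None:
                self.free_fall_since = t
            elif t - self.free_fall_since >= MIN_FREE_FALL_TIME:
                return True   # sustained near-zero acceleration: likely dropped
        else:
            self.free_fall_since = None
        return False
```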

Alternatively, each of docking stations 510L and 510R associated with a respective one of UIDs 14L and 14R may include a docking sensor 512. The docking sensor 512 may be substantially the same as the docking sensor 404 previously discussed with reference to fig. 4. For example, docking sensor 512 may be associated with (e.g., mounted to) each docking station 510L and 510R and may detect when a respective one of UIDs 14L and 14R is positioned within, or proximal to, docking station 510L or 510R. For example, docking sensor 512 may be a pressure sensor or a capacitive sensor that can detect when UID 14L or 14R contacts the surface of docking station 510L or 510R. In other aspects, docking sensor 512 may be a sensor that detects when UID 14L or 14R is in close proximity to docking station 510L or 510R, but not necessarily in contact with it. Once docking sensor 512 detects UID 14L or 14R at docking station 510L or 510R, docking sensor 512 sends a signal to surgical system processor 912. The surgical system processor 912 may then use this information to determine whether this is an intentional user action indicative of the user's desire to engage and/or disengage the teleoperational mode. Docking sensor 512 may also detect when UID 14L or 14R is no longer docked at, or in proximity to, docking station 510L or 510R and send a corresponding signal to surgical system processor 912 for use in determining whether to engage and/or disengage the teleoperational mode.
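
A short sketch of how a proximity-style docking sensor reading might be turned into docked/undocked events; the distance threshold is an assumption for illustration only:

```python
DOCK_PROXIMITY_MM = 20.0   # assumed distance below which a UID counts as docked

def docking_event(distance_mm, was_docked):
    """Return 'docked', 'undocked', or None given the measured distance from a
    UID to its docking station and the previous docked state."""
    docked = distance_mm <= DOCK_PROXIMITY_MM
    if docked and not was_docked:
        return 'docked'      # UID just arrived at (or near) the station
    if not docked and was_docked:
        return 'undocked'    # UID just left the station
    return None              # no state change
```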

The surgical robotic system may further include a foot-operated controller such as foot pedal 13. User 107 may use foot pedal 13 to control surgical robotic components (e.g., actuators, tools, and/or end effectors) or, in the case of a clutch pedal, to pause operation, as previously discussed. For example, user 107 may press on foot pedal 13 to control a corresponding surgical robotic component, or to suspend operation of a surgical robotic component (in the case of a clutch pedal). In some cases, foot pedal 13 may include or may be a clutch pedal having a clutch sensor 508 which, similar to any of sensors 502, 504, and 512 previously discussed, may be used to detect a predetermined intentional user action or a set of intentional user actions that indicate a desire by user 107 to engage or disengage the teleoperational mode. In this case, however, the action may be performed with the user's foot. For example, the clutch sensor 508 may detect an intentional user action, such as a single tap of the user's foot on the clutch pedal, or a double tap of the user's foot on the clutch pedal. For example, when a foot tap on the pedal is detected within a predetermined time period (e.g., 0.5 seconds or less) and, in the case of a double tap, at a predetermined interval (e.g., 0.5 seconds or less between taps), the processor will interpret this as an intentional user action, a sequence of intentional user actions, or a set of intentional user actions indicating a desire to engage and/or disengage the teleoperational mode. Once such an action is detected, the clutch sensor 508 may send a corresponding signal to the surgical system processor 912, and the processor 912 may determine whether to engage and/or disengage the teleoperational mode based on the signal and, for example, whether the system is already in the teleoperational mode.

An intentional user action, a sequence of intentional user actions, or a set of intentional user actions, as well as various combinations thereof, which may be detected by the sensors disclosed herein for determining whether to engage and/or disengage the teleoperational mode, will now be described in more detail with reference to figs. 6-8. Representatively, fig. 6 is a block diagram of an exemplary process for disengaging the teleoperational mode, and figs. 7 and 8 illustrate exemplary processes for engaging the teleoperational mode upon detection of an intentional user action or a set of intentional user actions. As previously discussed, the intentional user action, sequence of intentional user actions, or set of intentional user actions may include a single action, a repeated action, and/or a combination of actions detected by the surgical robotic system and recognized by the surgical robotic system as a user command to engage and/or disengage the teleoperational mode. Furthermore, in some embodiments, the same sequence or set of intentional user actions used, for example, to disengage the teleoperation mode may also be used to engage the teleoperation mode. In this regard, an intentional user action, sequence of intentional user actions, or set of intentional user actions for engaging and/or disengaging the teleoperational mode may be considered a symmetric action. In other embodiments, the intentional user actions, sequences of intentional user actions, or sets of intentional user actions are different, or asymmetric. It should further be appreciated that because both UIDs 14L and 14R may be mechanically ungrounded, uncoupled from any other component of the surgical robotic system, or otherwise unconstrained, the intentional user actions, sequences of intentional user actions, or sets of intentional user actions, and/or various combinations of actions, may be performed easily and at any number of locations within the surgical site.

Referring now to fig. 6 in more detail, fig. 6 illustrates a process for disengaging the teleoperational mode, where process 600 may include an initial operation of detecting intentional user action(s) (block 602). Representatively, in one embodiment, the sequence of intentional user action(s) detected by the surgical robotic system is a double tap of the UID finger clutch on both UIDs simultaneously. For example, the user may double tap left UID 14L and right UID 14R at the same time, and each double tap may be detected by a respective finger clutch sensor 502 associated with each UID 14L and 14R, as previously discussed. It should be appreciated that the double tap or other intentional user action(s) detected by the finger clutch sensor 502 to engage or disengage the teleoperational mode is different from the user input that would cause a clutching operation (e.g., pressing and holding the finger clutch) to pause the system, as previously discussed.
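
As a rough sketch, the simultaneity requirement might reduce to comparing the completion times of the two double taps; the window below is an assumed value for illustration, not from the embodiments:

```python
SIMULTANEITY_WINDOW = 0.3  # assumed seconds within which taps count as simultaneous

def both_uids_double_tapped(left_time, right_time):
    """True when double taps on the left and right UIDs complete within the
    assumed simultaneity window (times in seconds; None if not yet seen)."""
    if left_time is None or right_time is None:
        return False
    return abs(left_time - right_time) <= SIMULTANEITY_WINDOW
```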

In another embodiment, the intentional user action(s) detected at operation 602 may be a docking action in which one or more of the UIDs are docked in, or positioned near, their respective docking stations. Typically, docking of both UIDs 14L and 14R, docking one of UIDs 14L or 14R, or positioning one or both UIDs 14L and 14R near their respective docking stations 510L and/or 510R may be the intentional action(s) that are detected and indicate a desire by a user to disengage the teleoperational mode. For example, in one aspect, docking of both UIDs 14L and 14R (or positioning both UIDs 14L and 14R near a docking station) may result in disengagement of the teleoperation mode and may prevent all operations controlled by UIDs 14L and 14R. In other embodiments, docking of one of UIDs 14L or 14R (or positioning one of UIDs 14L and 14R near a docking station) may result in disengagement of the teleoperation mode and may prevent all operations controlled by UIDs 14L and 14R. In still further embodiments, docking of only UID 14L (or positioning UID 14L near a docking station) may be used to disengage the teleoperation mode only with respect to surgical robotic components controlled by UID 14L, and not with respect to surgical robotic components controlled by UID 14R. In other words, the user may continue to operate the surgical robotic components controlled by UID 14R. Similarly, docking of UID 14R (or positioning UID 14R near a docking station) may be used to disengage the teleoperation mode only with respect to surgical robotic components controlled by UID 14R, and not with respect to surgical robotic components controlled by UID 14L. Further, it should be appreciated that docking of one or both of UIDs 14L and/or 14R may be used to disengage the teleoperational mode for any number of surgical robotic components (e.g., one surgical robotic component, two surgical robotic components, three surgical robotic components, or more).
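
The several docking variants above could be captured, purely as an illustrative sketch, by a policy function mapping per-UID docking states to the set of sides whose control is disengaged; the policy names are invented for this sketch:

```python
def apply_docking_policy(docked, policy='per_uid'):
    """docked: dict like {'L': True, 'R': False}. Returns the set of UID sides
    whose teleoperation control should be disengaged under the given policy."""
    if policy == 'any_disengages_all':
        # docking either UID disengages control of both
        return {'L', 'R'} if any(docked.values()) else set()
    if policy == 'per_uid':
        # each docked UID disengages only its own components
        return {side for side, is_docked in docked.items() if is_docked}
    if policy == 'both_required':
        # only docking both UIDs disengages the mode
        return {'L', 'R'} if all(docked.values()) else set()
    raise ValueError(f"unknown policy: {policy}")
```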

It should further be appreciated that although specific intentional user action(s) have been mentioned, it is contemplated that multiple intentional user actions and/or combinations of intentional user action(s) with respect to various input devices of the system may be detected in operation 602. Table 1 provides a list of predetermined intended user action(s), sequence of user action(s), set of intended user action(s), and/or combination of intended user action(s) that may be performed in combination (e.g., in order) and detected and recognized by the system as indicating that the user desires to engage and/or disengage the teleoperational mode.

TABLE 1

It should be further understood that in some embodiments, in addition to the intentional user action(s) previously discussed, the user may press and hold the clutch pedal prior to performing any one or more of the action(s). For example, in one embodiment, the user may press and hold the clutch pedal and then (1) single-tap both finger clutches, (2) double-tap both finger clutches, (3) single-squeeze both UIDs, and/or (4) double-squeeze both UIDs. For example, when UIDs 14L and 14R are at a position/orientation that makes the finger clutches hard to reach, if the user desires to disengage, the user may press and hold the clutch pedal, orient the UIDs to better reach the finger clutches, double tap the UID finger clutches (at which point the system disengages), and release the clutch pedal.
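
The press-and-hold gating described here might, as a sketch only, look like the following; the gesture labels are placeholders standing in for the enumerated actions, not identifiers from the embodiments:

```python
# Gestures accepted while the clutch pedal is held (placeholder names)
PEDAL_GATED_GESTURES = {'single_tap_both', 'double_tap_both',
                        'single_squeeze_both', 'double_squeeze_both'}

def pedal_gated_gesture(pedal_held, gesture):
    """Recognize the 'press and hold the clutch pedal, then perform a UID
    gesture' pattern; only sketches the gating logic."""
    return pedal_held and gesture in PEDAL_GATED_GESTURES
```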

Returning now to process 600, process 600 may further include determining whether the surgical robotic system is operating in the teleoperational mode (block 604). For example, since, as previously discussed, in some embodiments the intentional user action(s) indicating a desire to disengage the teleoperational mode may be the same as those used to engage the teleoperational mode, the current mode of the surgical robotic system may be determined in order to interpret the action(s). In the event that it is determined that the surgical robotic system is in the teleoperational mode, in operation 606, the sequence of intentional action(s) is determined to indicate a desire to exit the teleoperational mode. The teleoperational mode of the surgical robotic system is then disengaged in operation 608.
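
Blocks 604-608 (and, symmetrically, blocks 704-708 discussed below) amount to interpreting one recognized gesture in light of the current mode; a minimal, non-authoritative sketch:

```python
class TeleopModeController:
    """Interprets a recognized intentional gesture in light of the current
    mode, as in blocks 604-608 / 704-708 (sketch; not the patented logic)."""

    def __init__(self):
        self.teleop_engaged = False

    def on_intentional_gesture(self):
        if self.teleop_engaged:
            self.teleop_engaged = False    # block 608: disengage
            return 'disengaged'
        self.teleop_engaged = True         # block 708: engage
        return 'engaged'
```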

Further, in some embodiments, process 600 includes an optional user feedback operation (block 610) that provides feedback to the user, alerting them to the mode transition and/or the current operating mode. Representatively, the feedback may be visual (e.g., on a screen or LED lights on the display 15 or console 2), auditory (e.g., tones from the surgeon bridge or control tower 130), or tactile (e.g., UID vibration). Any one or a combination of these feedback mechanisms is possible, for example, when the user disengages the teleoperational mode. For example, haptic feedback mechanisms may be incorporated into one or more of UIDs 14L and 14R that give the user a physical sensation that they have disengaged control of the associated surgical robotic tool. The visual feedback may be an LED strip on or around the display 15; for example, a green light may indicate that the system is disengaged from the teleoperational mode and a red light may indicate that the system is engaged in the teleoperational mode. Still further, the user may receive on-screen visual feedback from a graphical user interface (which may involve text, icons, or both) alerting them to the current mode. The notification may automatically disappear after the user completes disengagement of the teleoperational mode, or after a predetermined period of time. An audible alert (e.g., a beep, tone, etc.) is further contemplated.
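
Purely as an illustration of block 610 (and block 710 below), feedback dispatch might be sketched as follows; the facade class and its method names are hypothetical, not part of the described system, and the colors follow the examples in the text (green for disengaged, red for engaged):

```python
class ConsoleUI:
    """Hypothetical feedback facade; a real system would drive LEDs, speakers,
    and UID haptics (none of these method names come from the patent)."""
    def set_led(self, color): print(f"LED: {color}")
    def play_tone(self): print("tone")
    def vibrate_uids(self): print("UID haptic pulse")
    def show_banner(self, text): print(f"banner: {text}")

def notify_mode_change(mode, ui):
    """Dispatch optional user feedback after a mode transition."""
    if mode == 'disengaged':
        ui.set_led('green')
        ui.vibrate_uids()                  # haptic cue: control released
        ui.show_banner('Teleoperation disengaged')
    else:
        ui.set_led('red')
        ui.show_banner('Teleoperation engaged')
    ui.play_tone()                         # audible cue in either case
```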

Fig. 7 illustrates a process for engaging a teleoperational mode. The process 700 may include an initial operation of detecting intentional user action(s) (block 702). Representatively, in one embodiment, the intentional user action(s) detected by the surgical robotic system is double tapping the UID finger clutches on both UIDs simultaneously to engage the teleoperational mode. For example, the user may double tap left UID 14L and right UID 14R at the same time, and each double tap may be detected by a corresponding finger clutch sensor 502 associated with each UID 14L and 14R, as previously discussed.

It should further be appreciated that while specific intentional user action(s) indicating a desire to engage the teleoperational mode have been mentioned, it is contemplated that multiple intentional user actions and/or combinations of intentional user action(s) may be performed in combination (e.g., in sequence) using one or more input devices of the system and detected in operation 702. Representatively, in one embodiment, the intentional user action(s) detected by the surgical robotic system may be any one or more of the intentional user actions listed in table 2.

TABLE 2

It should be further understood that in some embodiments, in addition to the intentional user action(s) previously discussed, the user may press and hold the clutch pedal prior to performing any one or more of the action(s). For example, in one embodiment, the user may press and hold the clutch pedal and then (1) single-tap both finger clutches, (2) double-tap both finger clutches, (3) single-squeeze both UIDs, and/or (4) double-squeeze both UIDs. For example, if a user desires to orient UIDs 14L and 14R in a particular manner (e.g., to match a tool), the user may also press and hold the clutch pedal, then double tap the UID finger clutch (on one or both of UIDs 14L and 14R), orient the UIDs as desired, and then release the clutch pedal. Further, in some embodiments, prior to engaging the teleoperational mode, the user may need to perform various unlocking gestures, such as actuating a lock icon on a graphical user interface and then aligning the UID, as will be discussed in more detail with reference to fig. 8.

Returning now to process 700, process 700 may further include determining whether the surgical robotic system is operating outside of the teleoperational mode (block 704). For example, since, as previously discussed, in some embodiments the intentional user action(s) or sequence of intentional user actions indicating a desire to engage the teleoperational mode may be the same as those used to disengage the teleoperational mode, the current mode of the surgical robotic system may be determined in order to interpret the action(s). In the event that it is determined that the surgical robotic system is not in the teleoperational mode, in operation 706, the sequence of intentional actions is determined to indicate a desire to engage the teleoperational mode. The teleoperational mode of the surgical robotic system is then engaged in operation 708.

Further, in some embodiments, process 700 includes an optional user feedback operation (block 710) that provides feedback to the user, alerting them to the mode transition and/or the current operating mode. Typically, the feedback may be visual (e.g., on a screen or LED lights on the display 15 or console 2), auditory (e.g., tones from the surgeon bridge or control tower 130), or tactile (e.g., UID vibration). Any one or a combination of these feedback mechanisms is possible, for example, when the user engages the teleoperational mode. For example, haptic feedback mechanisms may be incorporated into one or more of UIDs 14L and 14R that give the user a physical sensation that they have engaged control of the associated surgical robotic tool. The visual feedback may be an LED strip on or around the display 15; for example, a red light may indicate that the system is engaged in the teleoperational mode and a green light may indicate that the system is disengaged from the teleoperational mode. Still further, the user may receive on-screen visual feedback from a graphical user interface (which may involve text, icons, or both) alerting the user that the system is in the teleoperational mode. The notification may automatically disappear after the user completes engagement of the teleoperational mode, or after a predetermined period of time. An audible alert (e.g., a beep, tone, etc.) is further contemplated.

Further, it should be appreciated that, in addition to the intentional user action(s) described herein, there may be additional safety requirements that must be met prior to engaging the teleoperational mode, to ensure that it is safe to enter the teleoperational mode. Typically, the surgical robotic system may further include additional or alternative intentional actions, including alignment requirements that must be met before the teleoperational mode is engaged. Thus, it is contemplated that in some embodiments, detection of intentional user action(s), such as described with reference to process 700, may occur after such additional or alternative actions occur, for example, and before a final alignment operation. This process is illustrated in more detail in fig. 8. Representatively, a process 800 for engaging the teleoperational mode includes determining that the surgical robotic system is in a non-teleoperational mode (block 802) and detecting a first set or sequence of intentional user actions (block 804). Representative actions may be, for example, locking the chair, detecting that the user is looking at the display screen, detecting that a UID is not docked, and detecting that the UID is in the workspace (e.g., using a tracking system). Next, a second set or sequence of intentional user action(s) indicating a desire to enter the teleoperation mode, as previously discussed, is detected (block 806). The process 800 further includes an additional operation of detecting alignment of at least one of the UIDs with a corresponding surgical robotic component (e.g., an actuator, a tool, and/or an end effector) (block 808). For example, during a surgical procedure, a surgical robotic tool may be positioned within a patient's body and the rotation/orientation of the tool displayed to the user. Before engaging teleoperation, the user must align the rotation/orientation of the UID being held with the rotation/orientation of the tool within the patient, as viewed on the display. Then, upon detecting each of the first and second intentional user action(s) and satisfying the alignment operation (e.g., operations 804-808), the teleoperational mode is engaged (block 810).
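
A sketch of the gating in blocks 804-810, assuming orientations are available as unit quaternions and using an assumed angular tolerance; none of these values or names come from the embodiments:

```python
import math

ALIGN_TOLERANCE_DEG = 15.0   # assumed angular tolerance for UID/tool alignment

def quaternion_angle_deg(q1, q2):
    """Smallest rotation angle between two unit quaternions (w, x, y, z)."""
    dot = abs(sum(a * b for a, b in zip(q1, q2)))
    return math.degrees(2.0 * math.acos(min(1.0, dot)))

def may_engage(first_checks_ok, gesture_ok, uid_quat, tool_quat):
    """Gate engagement on the first action set (block 804), the intentional
    gesture (block 806), and UID/tool alignment (block 808)."""
    aligned = quaternion_angle_deg(uid_quat, tool_quat) <= ALIGN_TOLERANCE_DEG
    return first_checks_ok and gesture_ok and aligned
```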

Referring now to fig. 9, fig. 9 is a block diagram of a computer portion of a surgical robotic system for performing the operations previously discussed, according to an embodiment. Surgical robotic system 1 may generally include UID(s) 14, user console 2 having computer system 16, and robotic components 104, 112 associated with actuators 17. Computer system 16 and UID 14 have circuitry suitable for the particular function, and thus, the illustrated circuitry is provided as an example and not a limitation.

One or more processors of the user console 2 may control portions of the surgical robotic system 1, such as the surgical robotic arm 112 and/or the surgical tool 104. UID 14 may be communicatively coupled to computer system 16 (of user console 2) and/or surgical robotic system 1 to provide input commands that are processed by one or more processors of system 1 to control movement of surgical robotic arm 112 and/or surgical tool 104 mounted on the arm. For example, UID 14 may transmit electrical command signal 902 to computer system 16, such as a spatial state signal generated by UID processor 506 in response to a signal from tracking sensor 922, or a clutch signal generated by UID processor 506. The electrical signal may be an input command that causes movement of the surgical robotic system 1 or pauses movement of the surgical robotic system 1. Further, the input command may correspond to intentional user action(s) detected by UID 14 indicating a desire to engage and/or disengage the teleoperational mode, which may be used by a processor of system 1 to engage and/or disengage the teleoperational mode, as described with reference to figs. 6-8.
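
As an illustrative sketch only, a command signal such as signal 902 might be modeled as a small message type; the field names and enumerated kinds below are assumptions, not the actual signal format of the described system:

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

class CommandKind(Enum):
    SPATIAL_STATE = auto()   # pose/motion derived from the tracking sensor
    CLUTCH = auto()          # pause instrument motion
    ENGAGE = auto()          # request to enter the teleoperational mode
    DISENGAGE = auto()       # request to exit the teleoperational mode

@dataclass
class UIDCommand:
    """Sketch of a command message a UID processor might send to the console."""
    kind: CommandKind
    uid_side: str                      # 'L' or 'R'
    timestamp: float                   # seconds, sender clock
    payload: Optional[dict] = None     # e.g., pose data for SPATIAL_STATE
```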

The input electrical signals may be transmitted by UID processor 506 to console processor 906 of computer system 16 via a wired or wireless connection. For example, UID 14 may transmit command signals 902 to console processor 906 via a wire. Alternatively, UID 14 may transmit command signals 902 to console processor 906 via wireless communication link 904. Wireless communication links may be established by computer system 16 and the respective RF circuitry of UID 14. The wireless communication may be via radio frequency signals, e.g., Wi-Fi or short-range signals, and/or a suitable wireless communication protocol, such as Bluetooth.

The console processor 906 of the computer system 16 may execute instructions to perform the various functions and capabilities described above. Instructions executed by console processor(s) 906 of user console 2 may be retrieved from a local memory (not shown), which may include a non-transitory machine-readable medium. The instructions may be in the form of an operating system program having device drivers to control components of the surgical robotic system 1 (e.g., the actuators 17 operably coupled to the surgical robotic arm(s) 112 or surgical tool(s) 104), and/or engage/disengage teleoperational modes.

In an embodiment, console processor 906 controls the components of user console 2. For example, one or more seat actuators 909 may receive commands from the console processor 906 to control movement of the seat 122. The seat actuator(s) 909 may move the seat 122 in one or more degrees of freedom (such as forward/backward, backrest tilt, headrest position, etc.), for example, to align the seat 122 with the display 15 (to engage the teleoperational mode). Console processor 906 may also transmit video data for presentation on display 15. Thus, console processor 906 may control the operation of user console 2. Input commands to the seat actuator(s) 909 or console processor 906 may be input by a user via foot pedal(s) 13 or another input device 911 such as a keyboard or joystick.

The console processor 906 may output control signals 903 to other components of the surgical robotic system 1 via a link 910. Control signals 903 may be transmitted to control the movement of the surgical robotic system 1. In an embodiment, the computer system 16 is communicatively coupled to downstream components of the surgical robotic system 1, such as the control tower 130, via a wired or wireless link. The link may transmit the control signal 903 to one or more surgical system processors 912. For example, at least one processor 912 may be located in the control tower 130 and may be communicatively coupled to system components, such as the surgical robotic platform 5 or one or more displays 920. Actuator 17 of surgical robotic system 1 may receive control signals from surgical system processor 912 to cause movement of arm 112 and/or tool 104 corresponding to movement of UID 14. The control signals may also pause the movement of the robotic components by engaging and/or disengaging an interlock of the surgical robotic system 1 when the user 107 presses the finger clutch 206 or drops the UID 14, or may transition the system into or out of the teleoperational mode.

In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will be evident that various modifications may be made thereto without departing from the broader spirit and scope of the invention as set forth in the following claims. The specification and drawings are, accordingly, to be regarded in an illustrative sense rather than a restrictive sense.
