Combining strain-based shape sensing with catheter control

Document No. 554725 · Published 2021-05-14

This technology, "Combining strain-based shape sensing with catheter control," was created by C. F. Graetzel and D. P. Noonan on 2019-08-05. Abstract: The present invention provides robotic systems and methods for navigating luminal networks that may improve strain-based shape sensing. In one aspect, the system can compare the strain-based shape data to shape data determined based on robot data (e.g., kinematic model data, torque measurements, mechanical model data, command data, etc.) and adjust the strain-based shape data as needed. Any portion of the strain-based shape data may be adjusted, weighted differently, or discarded based on the comparison. For example, data from a trusted source may indicate that the shape of the instrument exhibits or should exhibit one or more characteristics. If the system determines that any portion of the strain-based shape data is inconsistent with such characteristics, the system may adjust the portion of the strain-based shape data such that the adjusted strain-based shape data is consistent with the characteristics of the instrument.

1. A method of controlling an instrument within an interior region of a body, the method comprising:

accessing robot data about the instrument;

accessing strain data from an optical fiber positioned within the instrument, the strain data indicative of strain on a portion of the instrument positioned within the interior region of the body;

determining shape data based on the strain data;

comparing the robot data and the shape data;

adjusting the shape data based on the comparison of the robot data and the shape data;

determining an estimated state of the instrument based on the adjusted shape data; and

outputting the estimated state of the instrument.

2. The method of claim 1, wherein adjusting the shape data comprises modifying at least a portion of the shape data such that the determination of the estimated state of the instrument is based on the modified portion of the shape data.

3. The method of claim 1, wherein adjusting the shape data comprises removing at least a portion of the shape data such that the determination of the estimated state of the instrument is not based on the removed portion of the shape data.

4. The method of claim 1, further comprising:

accessing Electromagnetic (EM) data captured using (i) an EM sensor located proximal to a tip of the instrument and (ii) an EM field generator located outside the body;

comparing the EM data and the shape data; and

further adjusting the shape data based on the comparison of the EM data and the shape data.

5. The method of claim 1, further comprising:

accessing image data captured by an imaging device located proximal to a tip of the instrument;

comparing the image data and the shape data; and

further adjusting the shape data based on the comparison of the image data and the shape data.

6. The method of claim 1, wherein the strain data is generated based on a Fiber Bragg Grating (FBG) produced on a portion of the optical fiber.

7. The method of claim 1, wherein the shape data comprises one of a curvature value of the portion of the instrument or time history data of the portion of the instrument.

8. The method of claim 7, further comprising adjusting the shape data based on determining that the curvature value is greater than or equal to a threshold curvature value in the robot data.

9. The method of claim 7, further comprising adjusting the shape data based on determining that the time history data satisfies a threshold time history condition in the robot data.

10. The method of claim 1, further comprising adjusting the shape data based on temperature changes.

11. The method of claim 1, further comprising adjusting the shape data based on determining that a tip of the instrument is articulating.

12. The method of claim 1, further comprising adjusting the shape data based on determining that a non-shape changing strain is being applied to the instrument.

13. The method of claim 1, further comprising assigning a confidence value to the robot data corresponding to a first portion of the instrument based on determining that the first portion includes a distal end of the instrument, the confidence value being higher than a confidence value assigned to the shape data corresponding to the first portion.

14. The method of claim 1, further comprising assigning a confidence value to the robot data corresponding to a first portion of the instrument based on determining that the first portion includes a proximal end of the instrument, the confidence value being lower than a confidence value assigned to the shape data corresponding to the first portion.

15. The method of claim 1, further comprising determining an estimated state of a sheath covering the instrument based on the estimated state of the instrument.

16. The method of claim 1, further comprising assigning a confidence value to the shape data based on a comparison of the shape data and additional data indicative of a shape of a sheath covering the instrument.

17. The method of claim 1, further comprising:

determining that damage to the instrument is imminent based on the estimated state of the instrument; and

controlling the instrument to avoid the damage.

18. The method of claim 1, further comprising:

determining that a mismatch between the robot data and the shape data has been detected for at least a threshold amount of time; and

outputting a warning indicating that the instrument may be damaged.

19. A non-transitory computer readable storage medium having stored thereon instructions that, when executed, cause a processor of an apparatus to at least:

access robot data about an instrument;

access strain data from an optical fiber positioned within the instrument, the strain data indicative of strain on a portion of the instrument positioned within an interior region of a body;

determine shape data based on the strain data;

compare the robot data and the shape data;

adjust the shape data based on the comparison of the robot data and the shape data;

determine an estimated state of the instrument based on the adjusted shape data; and

output the estimated state of the instrument.

20. The non-transitory computer-readable storage medium of claim 19, wherein adjusting the shape data comprises modifying at least a portion of the shape data such that the determination of the estimated state of the instrument is based on the modified portion of the shape data.

21. The non-transitory computer-readable storage medium of claim 19, wherein adjusting the shape data comprises removing at least a portion of the shape data such that the determination of the estimated state of the instrument is not based on the removed portion of the shape data.

22. The non-transitory computer readable storage medium of claim 19, wherein the instructions, when executed, further cause the processor to:

access Electromagnetic (EM) data captured using (i) an EM sensor located proximal to a tip of the instrument and (ii) an EM field generator located outside the body;

compare the EM data and the shape data; and

further adjust the shape data based on the comparison of the EM data and the shape data.

23. The non-transitory computer readable storage medium of claim 19, wherein the instructions, when executed, further cause the processor to:

access image data captured by an imaging device located proximal to a tip of the instrument;

compare the image data and the shape data; and

further adjust the shape data based on the comparison of the image data and the shape data.

24. The non-transitory computer-readable storage medium of claim 19, wherein adjusting the shape data comprises adjusting a confidence value of the shape data.

25. The non-transitory computer readable storage medium of claim 19, wherein the strain data is generated based on a Fiber Bragg Grating (FBG) produced on a portion of the optical fiber.

26. The non-transitory computer-readable storage medium of claim 19, wherein the shape data comprises one of a curvature value of the portion of the instrument or time history data of the portion of the instrument.

27. The non-transitory computer readable storage medium of claim 26, wherein the instructions, when executed, further cause the processor to adjust the shape data based on determining that the curvature value is greater than or equal to a threshold curvature value in the robot data.

28. The non-transitory computer readable storage medium of claim 26, wherein the instructions, when executed, further cause the processor to adjust the shape data based on determining that the time history data satisfies a threshold time history condition in the robot data.

29. The non-transitory computer readable storage medium of claim 19, wherein the instructions, when executed, further cause the processor to adjust the shape data based on a change in temperature.

30. The non-transitory computer readable storage medium of claim 19, wherein the instructions, when executed, further cause the processor to adjust the shape data based on determining that a tip of the instrument is articulating.

31. The non-transitory computer readable storage medium of claim 19, wherein the instructions, when executed, further cause the processor to adjust the shape data based on determining that a non-shape-changing strain is being applied to the instrument.

32. The non-transitory computer readable storage medium of claim 19, wherein the instructions, when executed, further cause the processor to assign a confidence value to the robot data corresponding to a first portion of the instrument based on determining that the first portion includes a distal end of the instrument, the confidence value being higher than a confidence value assigned to the shape data corresponding to the first portion.

33. The non-transitory computer readable storage medium of claim 19, wherein the instructions, when executed, further cause the processor to assign a confidence value to the robot data corresponding to a first portion of the instrument based on determining that the first portion includes a proximal end of the instrument, the confidence value being lower than a confidence value assigned to the shape data corresponding to the first portion.

34. The non-transitory computer readable storage medium of claim 19, wherein the instructions, when executed, further cause the processor to determine an estimated state of a sheath covering the instrument based on the estimated state of the instrument.

35. The non-transitory computer readable storage medium of claim 19, wherein the instructions, when executed, further cause the processor to assign a confidence value to the shape data based on a comparison of the shape data and additional data indicative of a shape of a sheath covering the instrument.

36. The non-transitory computer readable storage medium of claim 19, wherein the instructions, when executed, further cause the processor to:

determine that damage to the instrument is imminent based on the estimated state of the instrument; and

control the instrument to avoid the damage.

37. The non-transitory computer readable storage medium of claim 19, wherein the instructions, when executed, further cause the processor to:

determine that a mismatch between the robot data and the shape data has been detected for at least a threshold amount of time; and

output a warning indicating that the instrument may be damaged.

38. A medical robotic system for controlling instruments within an interior region of a body, the system comprising:

an instrument having an optical fiber positioned within the instrument;

a sensor configured to generate strain data indicative of strain on a portion of the instrument positioned within the interior region of the body;

an instrument positioning device attached to the instrument and configured to move the instrument;

at least one computer-readable memory having executable instructions stored thereon; and

one or more processors in communication with the at least one computer-readable memory and configured to execute the instructions to cause the system to at least:

access robot data about the instrument;

access the strain data;

determine shape data based on the strain data;

compare the robot data and the shape data;

adjust the shape data based on the comparison of the robot data and the shape data;

determine an estimated state of the instrument based on the adjusted shape data; and

output the estimated state of the instrument.

39. The medical robotic system according to claim 38, wherein adjusting the shape data includes modifying at least a portion of the shape data such that the determination of the estimated state of the instrument is based on the modified portion of the shape data.

40. The medical robotic system according to claim 38, wherein adjusting the shape data includes removing at least a portion of the shape data such that the determination of the estimated state of the instrument is not based on the removed portion of the shape data.

41. The medical robotic system according to claim 38, wherein the instructions, when executed, further cause the system to:

access Electromagnetic (EM) data captured using (i) an EM sensor located proximal to a tip of the instrument and (ii) an EM field generator located outside the body;

compare the EM data and the shape data; and

further adjust the shape data based on the comparison of the EM data and the shape data.

42. The medical robotic system according to claim 38, wherein the instructions, when executed, further cause the system to:

access image data captured by an imaging device located proximal to a tip of the instrument;

compare the image data and the shape data; and

further adjust the shape data based on the comparison of the image data and the shape data.

43. The medical robotic system according to claim 38, wherein adjusting the shape data includes adjusting a confidence value of the shape data.

44. The medical robotic system according to claim 38, wherein the strain data is generated based on a Fiber Bragg Grating (FBG) produced on a portion of the optical fiber.

45. The medical robotic system according to claim 38, wherein the shape data includes one of a curvature value of the portion of the instrument or time history data of the portion of the instrument.

46. The medical robotic system of claim 45, wherein the instructions, when executed, further cause the system to adjust the shape data based on determining that the curvature value is greater than or equal to a threshold curvature value in the robot data.

47. The medical robotic system of claim 45, wherein the instructions, when executed, further cause the system to adjust the shape data based on determining that the time history data satisfies a threshold time history condition in the robot data.

48. The medical robotic system according to claim 38, wherein the instructions, when executed, further cause the system to adjust the shape data based on temperature changes.

49. The medical robotic system according to claim 38, wherein the instructions, when executed, further cause the system to adjust the shape data based on determining that a tip of the instrument is articulating.

50. The medical robotic system according to claim 38, wherein the instructions, when executed, further cause the system to adjust the shape data based on determining that a non-shape-changing strain is being applied to the instrument.

51. The medical robotic system according to claim 38, wherein the instructions, when executed, further cause the system to assign a confidence value to the robot data corresponding to a first portion of the instrument based on determining that the first portion includes a distal end of the instrument, the confidence value being higher than a confidence value assigned to the shape data corresponding to the first portion.

52. The medical robotic system according to claim 38, wherein the instructions, when executed, further cause the system to assign a confidence value to the robot data corresponding to a first portion of the instrument based on determining that the first portion includes a proximal end of the instrument, the confidence value being lower than a confidence value assigned to the shape data corresponding to the first portion.

53. The medical robotic system according to claim 38, wherein the instructions, when executed, further cause the system to determine an estimated state of a sheath covering the instrument based on the estimated state of the instrument.

54. The medical robotic system according to claim 38, wherein the instructions, when executed, further cause the system to assign a confidence value to the shape data based on a comparison of the shape data and additional data indicative of a shape of a sheath covering the instrument.

55. The medical robotic system according to claim 38, wherein the instructions, when executed, further cause the system to:

determine that damage to the instrument is imminent based on the estimated state of the instrument; and

control the instrument to avoid the damage.

56. The medical robotic system according to claim 38, wherein the instructions, when executed, further cause the system to:

determine that a mismatch between the robot data and the shape data has been detected for at least a threshold amount of time; and

output a warning indicating that the instrument may be damaged.

57. A method of controlling an instrument within an interior region of a body, the method comprising:

accessing robot data about the instrument;

accessing strain data from an optical fiber positioned within the instrument, the strain data indicative of strain on a portion of the instrument positioned within the interior region of the body;

determining shape data based on the strain data;

comparing the robot data and the shape data;

adjusting a confidence value associated with the shape data based on the comparison of the robot data and the shape data;

determining an estimated state of the instrument based on the adjusted confidence value; and

outputting the estimated state of the instrument.

58. The method of claim 57, further comprising:

accessing Electromagnetic (EM) data captured using (i) an EM sensor located proximal to a tip of the instrument and (ii) an EM field generator located outside the body;

comparing the EM data and the shape data; and

adjusting the confidence value associated with the shape data further based on the comparison of the EM data and the shape data.

59. The method of claim 57, further comprising:

accessing image data captured by an imaging device located proximal to a tip of the instrument;

comparing the image data and the shape data; and

adjusting the confidence value associated with the shape data further based on the comparison of the image data and the shape data.

60. The method of claim 57, wherein the strain data is generated based on a Fiber Bragg Grating (FBG) produced on a portion of the optical fiber.

61. The method of claim 57, wherein the shape data includes one of a curvature value of the portion of the instrument or time history data of the portion of the instrument.

62. The method of claim 61, further comprising adjusting the confidence value based on determining that the curvature value is greater than or equal to a threshold curvature value in the robot data.

63. The method of claim 61, further comprising adjusting the confidence value based on determining that the time history data satisfies a threshold time history condition in the robot data.

64. The method of claim 57, further comprising adjusting the confidence value based on a change in temperature.

65. The method of claim 57, further comprising adjusting the confidence value based on a determination that a tip of the instrument is articulating.

66. The method of claim 57, further comprising adjusting the confidence value based on determining that a non-shape-changing strain is being applied to the instrument.

67. The method of claim 57, further comprising assigning a confidence value to the robot data corresponding to a first portion of the instrument based on determining that the first portion includes a distal end of the instrument, the confidence value being higher than a confidence value assigned to the shape data corresponding to the first portion.

68. The method of claim 57, further comprising assigning a confidence value to the robot data corresponding to a first portion of the instrument based on determining that the first portion includes a proximal end of the instrument, the confidence value being lower than a confidence value assigned to the shape data corresponding to the first portion.

69. The method of claim 57, further comprising determining an estimated state of a sheath covering the instrument based on the estimated state of the instrument.

70. The method of claim 57, further comprising adjusting the confidence value further based on a comparison of the shape data and additional data indicative of a shape of a sheath covering the instrument.

71. The method of claim 57, further comprising:

determining that damage to the instrument is imminent based on the estimated state of the instrument; and

controlling the instrument to avoid the damage.

72. The method of claim 57, further comprising:

determining that a mismatch between the robot data and the shape data has been detected for at least a threshold amount of time; and

outputting a warning indicating that the instrument may be damaged.

Technical Field

The systems and methods disclosed herein relate to surgical robots, and more particularly to navigation of medical instruments within a tubular network of a patient's body.

Background

Bronchoscopy is a medical procedure that allows a physician to examine the internal condition of a patient's pulmonary airways, such as the bronchi and bronchioles. The lung airways carry air from the trachea, or windpipe, to the lungs. During the medical procedure, a thin, flexible tubular tool known as a bronchoscope may be inserted into the patient's mouth and passed down the patient's throat into his or her lung airways; the patient is typically anesthetized in order to relax the throat and lung cavities for surgical examination and operation during the procedure.

The bronchoscope may include a light source and a small camera that allow the physician to examine the patient's trachea and airways. A rigid tube may be used in conjunction with the bronchoscope for surgical purposes, for example, when there is a significant amount of bleeding in the patient's lungs or when a large object obstructs the patient's throat; when rigid tubes are used, the patient is typically anesthetized. Robotic bronchoscopes provide great advantages when navigating a tubular network: they are easy to use and allow therapy and biopsies to be administered conveniently even during the bronchoscopy stage.

In addition to mechanical devices or platforms (e.g., the robotic bronchoscope described above), various methods and software models can be used to assist with the surgical procedure. For example, a computed tomography (CT) scan of the patient's lungs is typically performed pre-operatively as part of the surgical examination. The data from the CT scan may be used to generate a three-dimensional (3D) model of the airways of the patient's lungs, and the generated 3D model gives the physician a visual reference that may be useful during the operative portion of the surgical examination.

However, even when employing medical devices (e.g., robotic bronchoscopes) and existing methods (e.g., performing CT scans and generating 3D models), prior techniques for navigating tubular networks still present challenges. For example, motion estimation of a medical device (e.g., a bronchoscopic tool) inside the patient's body, based on changes in the device's position and orientation, may be inaccurate, so the device's location may not be accurately or correctly determined within the patient's body in real time. Inaccurate position information for such instruments can provide misleading information to physicians who use 3D models as visual references during medical procedures.

Accordingly, there is a need for improved techniques for navigating in a network of tubular structures.

Disclosure of Invention

Robotic systems and methods for navigating luminal networks that may improve strain-based shape sensing are described. In one aspect, the system may compare the strain-based shape data to shape data determined based on robot data (e.g., command data, force and distance data, mechanical model data, kinematic model data, etc.) and adjust the strain-based shape data as needed. Any portion of the strain-based shape data may be adjusted, weighted differently, or discarded based on the comparison. For example, data from a trusted source may indicate that the shape of the instrument exhibits or should exhibit one or more characteristics. If the system determines that any portion of the strain-based shape data is inconsistent with such characteristics, the system may adjust the portion of the strain-based shape data such that the adjusted strain-based shape data is consistent with the characteristics of the instrument.
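As a rough sketch of this compare-and-adjust step, the following Python example down-weights or discards strain-based curvature samples that disagree with a robot-data-derived expectation. The function, thresholds, and data layout are illustrative assumptions, not taken from this disclosure.

```python
import numpy as np

def adjust_strain_shape(strain_curvature, robot_curvature,
                        max_mismatch=0.5, discard_factor=3.0):
    """Blend per-segment strain-based curvature with a robot-data estimate.

    strain_curvature, robot_curvature: 1-D arrays of curvature (1/m)
    sampled at the same points along the instrument (hypothetical layout).
    """
    strain_curvature = np.asarray(strain_curvature, dtype=float)
    robot_curvature = np.asarray(robot_curvature, dtype=float)
    mismatch = np.abs(strain_curvature - robot_curvature)

    # Confidence falls off linearly with mismatch; segments whose
    # disagreement exceeds discard_factor * max_mismatch are dropped
    # entirely (confidence 0), mirroring "adjusted, weighted, or discarded".
    confidence = np.clip(1.0 - mismatch / (discard_factor * max_mismatch), 0.0, 1.0)
    confidence[mismatch <= max_mismatch] = 1.0

    # Weighted blend: trust the strain data where confident, fall back to
    # the robot-data expectation where not.
    adjusted = confidence * strain_curvature + (1.0 - confidence) * robot_curvature
    return adjusted, confidence

# Example: a spurious strain reading at one segment is pulled back
# toward the kinematic-model expectation.
strain = [0.10, 0.12, 2.50, 0.11]   # 1/m; third sample is an outlier
robot = [0.10, 0.11, 0.12, 0.11]
adjusted, conf = adjust_strain_shape(strain, robot)
print(adjusted, conf)
```

Here the comparison is a simple per-segment mismatch test; an actual system could form the expected shape from any of the robot data sources listed above (command data, torque measurements, kinematic or mechanical models).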

Accordingly, one aspect relates to a method of navigating an instrument within an interior region of a body. The method can comprise the following steps: accessing robot data about an instrument; accessing strain data from an optical fiber positioned within an instrument, the strain data indicative of strain on a portion of the instrument positioned within an interior region of a body; determining shape data based on the strain data; comparing the robot data and the shape data; adjusting the shape data based on a comparison of the robot data and the shape data; determining an estimated state of the instrument based on the adjusted shape data; and outputting the estimated state of the instrument.

The aspects described in the preceding paragraphs may also include one or more of the following features in any combination: (a) wherein adjusting the shape data comprises modifying at least a portion of the shape data such that the determination of the estimated state of the instrument is based on the modified portion of the shape data; (b) wherein adjusting the shape data comprises removing at least a portion of the shape data such that the determination of the estimated state of the instrument is not based on the removed portion of the shape data; (c) wherein the method further comprises accessing EM data captured using (i) an Electromagnetic (EM) sensor located proximal to a tip of the instrument and (ii) at least one external EM sensor or EM field generator located outside the body, comparing the EM data to the shape data, and further adjusting the shape data based on the comparison of the EM data and the shape data; (d) wherein the method further comprises accessing image data captured by an imaging device located proximal to the tip of the instrument, comparing the image data to the shape data, and further adjusting the shape data based on the comparison of the image data and the shape data; (e) wherein the strain data is generated based on a Fiber Bragg Grating (FBG) produced on a portion of the optical fiber; (f) wherein the shape data includes one of a curvature value of the portion of the instrument or time history data of the portion of the instrument; (g) wherein the method further comprises adjusting the shape data based on determining that the curvature value is greater than or equal to a threshold curvature value in the robot data; (h) wherein the method further comprises adjusting the shape data based on determining that the time history data satisfies a threshold time history condition in the robot data; (i) wherein the method further comprises adjusting the shape data based on a temperature change; (j) wherein the method further comprises adjusting the shape data based on determining that the tip of the instrument is articulating; (k) wherein the method further comprises adjusting the shape data based on determining that a non-shape-changing strain is being applied to the instrument; (l) wherein the method further comprises assigning a confidence value to the robot data corresponding to a first portion of the instrument based on determining that the first portion includes the distal end of the instrument, the confidence value being higher than the confidence value assigned to the shape data corresponding to the first portion; (m) wherein the method further comprises assigning a confidence value to the robot data corresponding to a first portion of the instrument based on determining that the first portion includes the proximal end of the instrument, the confidence value being lower than the confidence value assigned to the shape data corresponding to the first portion; (n) wherein the method further comprises determining an estimated state of a sheath covering the instrument based on the estimated state of the instrument; (o) wherein the method further comprises assigning a confidence value to the shape data based on a comparison of the shape data and additional data indicative of a shape of a sheath covering the instrument; (p) wherein the method further comprises determining that damage to the instrument is imminent based on the estimated state of the instrument, and controlling the instrument to avoid the damage; and (q) wherein the method further comprises determining that a mismatch between the robot data and the shape data has been detected for at least a threshold amount of time, and outputting a warning indicating that the instrument may be damaged.

Another aspect relates to a method of navigating an instrument within an interior region of a body. The method can comprise the following steps: accessing robot data about an instrument; accessing strain data from an optical fiber positioned within an instrument, the strain data indicative of strain on a portion of the instrument positioned within an interior region of a body; determining shape data based on the strain data; comparing the robot data and the shape data; adjusting a confidence value associated with the shape data based on the comparison of the robot data and the shape data; determining an estimated state of the instrument based on the adjusted confidence value; and outputting the estimated state of the instrument.

The aspects described in the preceding paragraphs may also include one or more of the following features in any combination: (a) wherein the method further comprises accessing EM data captured using (i) an Electromagnetic (EM) sensor located proximal to a tip of the instrument and (ii) at least one external EM sensor or EM field generator located outside the body, comparing the EM data to the shape data, and adjusting the confidence value associated with the shape data further based on the comparison of the EM data and the shape data; (b) wherein the method further comprises accessing image data captured by an imaging device located proximal to the tip of the instrument, comparing the image data to the shape data, and adjusting the confidence value associated with the shape data further based on the comparison of the image data and the shape data; (c) wherein the strain data is generated based on a Fiber Bragg Grating (FBG) produced on a portion of the optical fiber; (d) wherein the shape data includes one of a curvature value of the portion of the instrument or time history data of the portion of the instrument; (e) wherein the method further comprises adjusting the confidence value based on determining that the curvature value is greater than or equal to a threshold curvature value in the robot data; (f) wherein the method further comprises adjusting the confidence value based on determining that the time history data satisfies a threshold time history condition in the robot data; (g) wherein the method further comprises adjusting the confidence value based on a temperature change; (h) wherein the method further comprises adjusting the confidence value based on determining that the tip of the instrument is articulating; (i) wherein the method further comprises adjusting the confidence value based on determining that a non-shape-changing strain is being applied to the instrument; (j) wherein the method further comprises assigning a confidence value to the robot data corresponding to a first portion of the instrument based on determining that the first portion includes the distal end of the instrument, the confidence value being higher than the confidence value assigned to the shape data corresponding to the first portion; (k) wherein the method further comprises assigning a confidence value to the robot data corresponding to a first portion of the instrument based on determining that the first portion includes the proximal end of the instrument, the confidence value being lower than the confidence value assigned to the shape data corresponding to the first portion; (l) wherein the method further comprises determining an estimated state of a sheath covering the instrument based on the estimated state of the instrument; (m) wherein the method further comprises adjusting the confidence value further based on a comparison of the shape data and additional data indicative of a shape of a sheath covering the instrument; (n) wherein the method further comprises determining that damage to the instrument is imminent based on the estimated state of the instrument, and controlling the instrument to avoid the damage; and (o) wherein the method further comprises determining that a mismatch between the robot data and the shape data has been detected for at least a threshold amount of time, and outputting a warning indicating that the instrument may be damaged.
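To make the confidence-value variant concrete, here is a minimal sketch (all names, tolerances, and weights are hypothetical) that lowers the strain-based confidence when it conflicts with robot data and then fuses tip estimates from several sources by confidence-weighted averaging:

```python
import numpy as np

def fuse_tip_estimates(estimates, confidences):
    """Confidence-weighted average of tip-position estimates (N x 3)."""
    estimates = np.asarray(estimates, dtype=float)
    w = np.asarray(confidences, dtype=float)
    return (w[:, None] * estimates).sum(axis=0) / w.sum()

# Hypothetical tip positions (mm) derived from strain, robot, and EM data.
strain_tip = np.array([10.0, 5.0, 42.0])
robot_tip = np.array([10.2, 5.1, 40.0])
em_tip = np.array([10.1, 5.0, 40.2])

# Lower the strain-based confidence if it disagrees with the robot data
# by more than a (made-up) 1.5 mm tolerance.
strain_conf = 0.9 if np.linalg.norm(strain_tip - robot_tip) < 1.5 else 0.3
fused = fuse_tip_estimates([strain_tip, robot_tip, em_tip],
                           [strain_conf, 0.8, 0.7])
print(fused)
```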

Drawings

The disclosed aspects will hereinafter be described in conjunction with the appended drawings, provided to illustrate and not to limit the disclosed aspects, wherein like designations denote like elements.

Fig. 1A illustrates an exemplary surgical robotic system, according to one embodiment.

Fig. 1B-1F illustrate various perspective views of a robotic platform coupled to the surgical robotic system shown in fig. 1A, according to one embodiment.

Fig. 2 illustrates an exemplary command console for an exemplary surgical robotic system, according to one embodiment.

Fig. 3A illustrates an isometric view of an exemplary independent drive mechanism of the Instrument Device Manipulator (IDM) shown in fig. 1A, according to one embodiment.

Fig. 3B illustrates a conceptual diagram showing how force may be measured by the strain gauge of the independent drive mechanism shown in fig. 3A, according to one embodiment.

FIG. 4A shows a top view of an exemplary endoscope according to one embodiment.

FIG. 4B illustrates an exemplary endoscope cross-section of the endoscope shown in FIG. 4A, according to one embodiment.

Fig. 4C illustrates an exemplary strain-based shape sensing mechanism, according to one embodiment.

Fig. 4D-4E illustrate the actual shape of an exemplary endoscope and a strain-based shape prediction for the endoscope, according to one embodiment.

Fig. 5 shows an exemplary schematic setup of an EM tracking system included in a surgical robotic system, according to one embodiment.

Fig. 6A-6B illustrate an exemplary anatomical lumen and an exemplary 3D model of the anatomical lumen, according to one embodiment.

Fig. 7 illustrates a computer-generated 3D model representing an anatomical space, in accordance with one embodiment.

FIG. 8A illustrates a high-level overview of an example block diagram of a navigation configuration system, according to one embodiment.

Fig. 8B illustrates a block diagram showing exemplary modules included in a strain-based algorithm module, according to one embodiment.

Fig. 8C illustrates a block diagram that shows an example of robot data stored in a robot data store, according to one embodiment.

FIG. 9 illustrates an exemplary block diagram of a shape data determination module according to one embodiment.

FIG. 10 illustrates an exemplary block diagram of a shape data comparison module and a shape data adjustment module, according to one embodiment.

FIG. 11 illustrates an exemplary block diagram of a shape-based state estimation module, according to one embodiment.

Fig. 12 illustrates a flow diagram showing an exemplary method operable by a surgical robotic system or components thereof for determining and adjusting shape data, according to one embodiment.

Fig. 13 illustrates a conceptual diagram illustrating an exemplary method operable by a surgical robotic system or components thereof for operating an instrument, according to one embodiment.

Fig. 14 illustrates a conceptual diagram illustrating an exemplary method operable by a surgical robotic system or components thereof for operating an instrument, according to one embodiment.

Reference will now be made in detail to several embodiments, examples of which are illustrated in the accompanying drawings. It should be noted that wherever practicable similar or like reference numbers may be used in the figures and may indicate similar or like functionality. The figures depict embodiments of the systems (or methods) described herein for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles described herein.

Detailed Description

I. Surgical robot system

Fig. 1A illustrates an exemplary surgical robotic system 100, according to one embodiment. The surgical robotic system 100 includes a base 101 coupled to one or more robotic arms, such as robotic arm 102. The base 101 is communicatively coupled to a command console, described further in Section II. Command console. The base 101 may be positioned such that the robotic arm 102 has access to perform a surgical procedure on a patient, while a user, such as a physician, may control the surgical robotic system 100 from the comfort of the command console. In some embodiments, the base 101 may be coupled to a surgical table or bed for supporting the patient. Although not shown in fig. 1A for clarity, the base 101 may include subsystems such as control electronics, pneumatics, power sources, light sources, and the like. The robotic arm 102 includes multiple arm segments 110 coupled at joints 111, which provide the robotic arm 102 with multiple degrees of freedom, e.g., seven degrees of freedom corresponding to seven arm segments. The base 101 may contain a power source 112, a pneumatic pressure device 113, and control and sensor electronics 114, including components such as a central processing unit, data bus, control circuitry, and memory, as well as associated actuators, such as motors, to move the robotic arm 102. The electronics 114 in the base 101 may also process and transmit control signals communicated from the command console.

In some embodiments, the base 101 includes wheels 115 to transport the surgical robotic system 100. The mobility of the surgical robotic system 100 helps to accommodate space constraints in the surgical operating room and facilitates proper positioning and movement of the surgical device. Further, the mobility allows the robotic arm 102 to be configured such that the robotic arm 102 does not interfere with the patient, the physician, the anesthesiologist, or any other equipment. During the procedure, a user may control the robotic arm 102 using a control device, such as a command console.

In some embodiments, the robotic arm 102 includes set-up joints that use a combination of brakes and counterbalances to maintain the position of the robotic arm 102. The counterbalances may include gas springs or coil springs. The brakes (e.g., fail-safe brakes) may include mechanical and/or electrical components. Further, the robotic arm 102 may be a gravity-assisted, passively supported robotic arm.

Each robotic arm 102 may be coupled to an instrument device manipulator (IDM) 117 using a mechanism changer interface (MCI) 116. The IDM 117 may be removed and replaced with a different type of IDM; for example, a first type of IDM manipulates an endoscope, while a second type of IDM manipulates a laparoscope. The MCI 116 includes connectors for transferring pneumatic pressure, electrical power, electrical signals, and optical signals from the robotic arm 102 to the IDM 117. The MCI 116 may be a set screw or a base plate connector. The IDM 117 manipulates surgical instruments, such as the endoscope 118, using techniques including direct drive, harmonic drive, gear drives, belts and pulleys, magnetic drives, and the like. The MCI 116 is interchangeable based on the type of IDM 117 and can be customized for a certain type of surgical procedure. The robotic arm 102 may include joint-level torque sensing and a wrist at its distal end, such as the KUKA LBR5 robotic arm.

The endoscope 118 is a tubular flexible surgical instrument that is inserted into a patient's anatomy to capture images of the anatomy (e.g., body tissue). In particular, the endoscope 118 includes one or more imaging devices (e.g., cameras or other types of optical sensors) that capture the images. The imaging device may include one or more optical components, such as an optical fiber, a fiber array, or a lens. The optical components move with the tip of the endoscope 118 such that movement of the tip of the endoscope 118 causes a change in the images captured by the imaging device. The endoscope 118 is further described in Section IV. Endoscope with reference to fig. 3A-4B.

The robotic arm 102 of the surgical robotic system 100 manipulates the endoscope 118 using elongated motion members. The elongated motion members may include pull wires (also referred to as push-pull wires), cables, fibers, or flexible shafts. For example, the robotic arm 102 actuates multiple pull wires coupled to the endoscope 118 to deflect the distal end of the endoscope 118. The pull wires may comprise both metallic and non-metallic materials, such as stainless steel, Kevlar, tungsten, carbon fiber, and the like. The endoscope 118 may exhibit nonlinear behavior in response to the forces applied by the elongated motion members. The nonlinear behavior may be based on the stiffness and compressibility of the endoscope 118, as well as on variability in slack or stiffness between the different elongated motion members.
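As a toy illustration of why this nonlinearity matters, the sketch below models a dead-band caused by pull-wire slack followed by a reduced-gain response from shaft compressibility; every parameter is made up for illustration, not drawn from this disclosure.

```python
def tip_angle_from_wire(displacement_mm, slack_mm=1.0, gain_deg_per_mm=12.0,
                        compressibility=0.15):
    """Toy pull-wire model: no articulation until the slack is taken up,
    then a reduced-gain response due to shaft compression.
    All parameters are illustrative, not measured values."""
    effective = max(0.0, displacement_mm - slack_mm)
    return gain_deg_per_mm * effective * (1.0 - compressibility)

for d in [0.5, 1.0, 2.0, 4.0]:
    print(f"{d:4.1f} mm commanded -> {tip_angle_from_wire(d):5.1f} deg actual")
```

Mismatches like this between commanded wire displacement and actual tip articulation are one reason the system compares robot-data-derived shape estimates against strain-based measurements.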

Fig. 1B-1F show various perspective views of the surgical robotic system 100 coupled to a robotic platform 150 (or surgical bed), according to various embodiments. Specifically, fig. 1B shows a side view of the surgical robotic system 100 with the robotic arm 102 manipulating the endoscope 118 to insert it into a patient lying on the robotic platform 150. Fig. 1C shows a top view of the surgical robotic system 100 and the robotic platform 150, with the endoscope 118, manipulated by the robotic arm, inserted into the patient. Fig. 1D shows a perspective view of the surgical robotic system 100 and the robotic platform 150, with the endoscope 118 controlled to be positioned horizontally parallel to the robotic platform. Fig. 1E shows another perspective view of the surgical robotic system 100 and the robotic platform 150, with the endoscope 118 controlled to be positioned relatively perpendicular to the robotic platform; in more detail, in fig. 1E, the angle between the horizontal surface of the robotic platform 150 and the endoscope 118 is 75 degrees. Fig. 1F shows the perspective view of the surgical robotic system 100 and the robotic platform 150 shown in fig. 1E, where, in more detail, the angle between the endoscope 118 and a virtual line 160 connecting one end 180 of the endoscope 118 and the robotic arm 102 positioned relatively farther from the robotic platform is 90 degrees.

II. Command console

Fig. 2 illustrates an exemplary command console 200 for the exemplary surgical robotic system 100, according to one embodiment. The command console 200 includes a console base 201, display modules 202 such as monitors, and control modules such as a keyboard 203 and a joystick 204. In some embodiments, one or more functions of the command console 200 may be integrated into the base 101 of the surgical robotic system 100 or into another system communicatively coupled to the surgical robotic system 100. A user 205, such as a physician, uses the command console 200 to remotely control the surgical robotic system 100 from an ergonomic position.

The console base 201 may include a central processing unit, memory unit, data bus and associated data communication ports that are responsible for interpreting and processing signals such as camera imagery and tracking sensor data from, for example, the endoscope 118 shown in fig. 1. In some embodiments, both the console base 201 and the base 101 perform signal processing to achieve load balancing. The console base 201 may also process commands and instructions provided by the user 205 through the control modules 203 and 204. In addition to the keyboard 203 and joystick 204 shown in fig. 2, the control module may include other devices such as a computer mouse, a touch pad, a trackball, a control pad, a video game controller, and sensors (e.g., motion sensors or cameras) that capture hand gestures and finger gestures.

The user 205 may use the command console 200 to control a surgical instrument, such as the endoscope 118, in a speed mode or a position control mode. In speed mode, the user 205 uses the control modules to directly control the pitch and yaw motion of the distal end of the endoscope 118 based on direct manual control. For example, motion on the joystick 204 may be mapped to yaw and pitch motion of the distal end of the endoscope 118. The joystick 204 may provide haptic feedback to the user 205; for example, the joystick 204 vibrates to indicate that the endoscope 118 cannot be translated or rotated further in a certain direction. The command console 200 may also provide visual feedback (e.g., a pop-up message) and/or audible feedback (e.g., a beep) to indicate that the endoscope 118 has reached its maximum translation or rotation.
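A speed-mode control tick might look roughly like the following sketch, where joystick deflection commands pitch/yaw rates and hitting an articulation limit triggers the haptic cue described above. The rates and limits are illustrative assumptions, not values from this disclosure.

```python
def velocity_mode_step(joy_x, joy_y, pitch_deg, yaw_deg, dt=0.02,
                       rate_deg_per_s=30.0, limit_deg=90.0):
    """One control tick: joystick axes in [-1, 1] command yaw/pitch rates.
    Returns the new angles and whether to vibrate the joystick (limit hit)."""
    pitch = pitch_deg + joy_y * rate_deg_per_s * dt
    yaw = yaw_deg + joy_x * rate_deg_per_s * dt
    at_limit = abs(pitch) > limit_deg or abs(yaw) > limit_deg
    pitch = max(-limit_deg, min(limit_deg, pitch))   # clamp to articulation range
    yaw = max(-limit_deg, min(limit_deg, yaw))
    return pitch, yaw, at_limit

print(velocity_mode_step(joy_x=0.5, joy_y=-1.0, pitch_deg=89.9, yaw_deg=0.0))
```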

In position control mode, the command console 200 uses a three-dimensional (3D) map of the patient and a predetermined computer model of the patient to control a surgical instrument, such as the endoscope 118. The command console 200 provides control signals to the robotic arm 102 of the surgical robotic system 100 to maneuver the endoscope 118 to a target location. Due to its reliance on the 3D map, position control mode requires accurate mapping of the patient's anatomy.

In some embodiments, the user 205 may manually manipulate the robotic arm 102 of the surgical robotic system 100 without using the command console 200. During setup in a surgical operating room, the user 205 may move the robotic arm 102, endoscope 118, and other surgical equipment into position around the patient. The surgical robotic system 100 may rely on force feedback and inertial control from the user 205 to determine the appropriate configuration of the robotic arm 102 and equipment.

The display module 202 may include an electronic monitor, a virtual reality viewing device such as goggles or glasses, and/or other types of display devices. In some embodiments, the display module 202 is integrated with the control module, for example, as a tablet device with a touchscreen. Further, the user 205 may use the integrated display module 202 and control module both to view data and to input commands to the surgical robotic system 100.

The display module 202 may display a 3D image using a stereoscopic device, such as a visor or goggles. The 3D image provides an "inside view" (i.e., an endoscopic view), which is a computer 3D model showing the anatomy of the patient. The "inside view" provides a virtual environment of the patient's interior and the expected location of the endoscope 118 within the patient. The user 205 compares the "inside view" model to the actual images captured by the camera to help mentally orient and confirm that the endoscope 118 is in the correct (or approximately correct) position within the patient. The "inside view" provides information about the anatomy surrounding the distal end of the endoscope 118, such as the shape of the patient's small intestine or colon. The display module 202 may simultaneously display a 3D model of the anatomy around the distal end of the endoscope 118 and a CT scan. Additionally, the display module 202 may overlay the already-determined navigation path of the endoscope 118 on the 3D model and on scans/images (e.g., CT scans) generated based on pre-operative model data.

In some embodiments, a model of the endoscope 118 is displayed together with the 3D model to help indicate the status of the surgical procedure. For example, a CT scan may identify a lesion in the anatomy where a biopsy may be required. During operation, the display module 202 may show a reference image captured by the endoscope 118 corresponding to the current position of the endoscope 118. The display module 202 may automatically display different views of the model of the endoscope 118 depending on user settings and the particular surgical procedure. For example, the display module 202 may show an overhead fluoroscopic view of the endoscope 118 during the navigation step as the endoscope 118 approaches the operative region of the patient.

III. Instrument device manipulator

Fig. 3A illustrates an isometric view of an exemplary independent drive mechanism of the IDM 117 shown in fig. 1, according to one embodiment. The independent drive mechanism may tighten or loosen pull wires 321, 322, 323, and 324 of an endoscope (e.g., independently of each other) by rotating output shafts 305, 306, 307, and 308 of the IDM 117, respectively. Just as the output shafts 305, 306, 307, and 308 transmit forces down to the pull wires 321, 322, 323, and 324, respectively, through angular motion, the pull wires 321, 322, 323, and 324 transmit forces back to the output shafts. The IDM 117 and/or the surgical robotic system 100 may measure the transmitted forces using a sensor (e.g., a strain gauge, described further below).

Fig. 3B shows a conceptual diagram illustrating how a force may be measured by the strain gauge 334 of the independent drive mechanism shown in fig. 3A, according to one embodiment. A force 331 may be directed away from the output shaft 305, which is coupled to the motor mount 333 of the motor 337. Thus, the force 331 causes a horizontal displacement of the motor mount 333. Additionally, the strain gauge 334, horizontally coupled to the motor mount 333, experiences strain in the direction of the force 331. The strain may be measured as the ratio of the horizontal displacement of the tip 335 of the strain gauge 334 to the overall horizontal width 336 of the strain gauge 334.
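Expressed as code, that measurement is a simple ratio, after which force can be recovered from a calibration constant. The calibration value below is hypothetical, chosen only to make the example concrete.

```python
def measured_strain(tip_displacement_mm, gauge_width_mm):
    """Strain as the ratio of tip displacement to total gauge width."""
    return tip_displacement_mm / gauge_width_mm

def wire_force(strain, calibration_n_per_unit_strain=2500.0):
    """Convert strain to pull-wire force using a made-up calibration constant."""
    return strain * calibration_n_per_unit_strain

# ~5 N for these illustrative numbers (0.02 mm displacement, 10 mm gauge).
print(wire_force(measured_strain(0.02, 10.0)))
```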

In some embodiments, the IDM 117 includes additional sensors, such as inclinometers or accelerometers, to determine the orientation of the IDM 117. Based on measurements from the additional sensors and/or the strain gauge 334, the surgical robotic system 100 can calibrate readings from the strain gauge 334 to account for gravitational load effects. For example, if the IDM 117 is oriented on its side, the weight of certain components of the IDM 117 may cause strain on the motor mount 333. Without accounting for gravitational load effects, the strain gauge 334 may thus measure strain that was not caused by strain on the output shaft.
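A gravity-compensation step of the kind described could be sketched as follows, where the tilt angle comes from the IDM's inclinometer or accelerometer. The sinusoidal gravity-strain model and its magnitude are assumptions for illustration; the disclosure does not specify the compensation model.

```python
import math

def compensate_gravity(raw_strain, tilt_deg, gravity_strain_on_side=3e-4):
    """Subtract the orientation-dependent strain caused by the weight of
    IDM components. tilt_deg is the angle of the output-shaft axis from
    vertical, so 90 means the IDM is lying on its side. Assumes the
    gravity-induced strain scales with sin(tilt), an illustrative model."""
    gravity_component = gravity_strain_on_side * math.sin(math.radians(tilt_deg))
    return raw_strain - gravity_component

# IDM on its side: 3e-4 of the raw reading is attributed to gravity.
print(compensate_gravity(5e-4, tilt_deg=90.0))
```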

IV. Endoscope

IV.A. Top view of endoscope

Fig. 4A shows a top view of an exemplary endoscope 118, according to one embodiment. The endoscope 118 includes a tubular component of a leader 415 nested or partially nested within, and longitudinally aligned with, a tubular component of a sheath 411. The sheath 411 includes a proximal sheath segment 412 and a distal sheath segment 413. The leader 415 has a smaller outer diameter than the sheath 411 and includes a proximal leader segment 416 and a distal leader segment 417. The sheath base 414 and the leader base 418 actuate the distal sheath segment 413 and the distal leader segment 417, respectively, for example, based on control signals from a user of the surgical robotic system 100. The sheath base 414 and the leader base 418 are, for example, part of the IDM 117 shown in fig. 1.

Both the sheath base 414 and the leader base 418 include drive mechanisms (e.g., the independent drive mechanism further described in Section III. Instrument device manipulator with reference to fig. 3A-3B) to control the pull wires coupled to the sheath 411 and the leader 415. For example, the sheath base 414 generates tensile loads on the pull wires coupled to the sheath 411 to deflect the distal sheath segment 413. Similarly, the leader base 418 generates tensile loads on the pull wires coupled to the leader 415 to deflect the distal leader segment 417. Both the sheath base 414 and the leader base 418 may also include couplings for routing pneumatic pressure, electrical power, electrical signals, or optical signals from the IDM to the sheath 411 and the leader 415, respectively. A pull wire may include a steel coil along its length within the sheath 411 or the leader 415, which transfers axial compression back to the origin of the load, e.g., the sheath base 414 or the leader base 418, respectively.

Since the pull wires coupled to the sheath 411 and the leader 415 provide multiple degrees of freedom, the endoscope 118 can easily navigate the anatomy of a patient. For example, four or more pull wires may be used in the sheath 411 and/or the leader 415, providing eight or more degrees of freedom. In other embodiments, up to three pull wires may be used, providing up to six degrees of freedom. The sheath 411 and the leader 415 may be rotated up to 360 degrees about a longitudinal axis 406 to provide additional degrees of motion. The combination of rotational angles and multiple degrees of freedom provides user-friendly and instinctive control of the endoscope 118 for a user of the surgical robotic system 100. Although not shown in fig. 4A, the endoscope 118 may include one or more optical fibers for sensing the shape of one or more portions of the endoscope 118. For example, as shown in fig. 4B, an optical fiber can be included in the leader portion of the endoscope 118. Alternatively or additionally, an optical fiber can be included in the sheath portion of the endoscope 118. As will be explained in more detail below, information from the optical fiber may be used in conjunction with information from other input sources (such as other input sensors, modeling data, known attributes and characteristics of the endoscope, etc.) to enhance performance of the navigation system, catheter control, and the like.

IV.B. Endoscope cross-sectional views

FIG. 4B illustrates an exemplary endoscope cross-section 430 of the endoscope 118 shown in FIG. 4A, according to one embodiment. In fig. 4B, the endoscope cross-section 430 includes an illumination source 432, an Electromagnetic (EM) coil 434, and a shape sensing fiber 436. The illumination source 432 provides light to illuminate an interior portion of the anatomical space. The provided light can allow an imaging device disposed at the tip of the endoscope 118 to record images of the space, which can then be transmitted to a computer system, such as the command console 200, for processing as described herein. The EM coil 434 may be used with an EM tracking system to detect the position and orientation of the tip of the endoscope 118 while the tip is disposed within the anatomical system. In some embodiments, the coils may be angled to provide sensitivity to EM fields along different axes, giving the ability to measure all six degrees of freedom: three positional and three angular. In other embodiments, only a single coil may be disposed within the tip of the endoscope 118, with its axis oriented along the endoscope shaft of the endoscope 118; due to the rotational symmetry of such a system, it is insensitive to roll about its axis, so only five degrees of freedom can be detected in this case. The endoscope cross-section 430 also includes a working channel 438 through which a surgical instrument, such as a biopsy needle, may be inserted along the endoscope shaft, allowing access to the area near the end of the endoscope. As used herein, "instrument" may refer to surgical instruments, medical instruments, and any other instrument or device that may be navigated in a luminal network.

Although the illustrated embodiment is disclosed as including an illumination source 432 and EM coil 434 and corresponding imaging device and EM tracking system, it is contemplated that modified embodiments of the endoscope described herein may be devoid of one or more of such features. Additionally, although the shape sensing fiber 436 is described as being integrated into the endoscope 118, in other embodiments, any of the one or more shape sensing fibers 436 may alternatively be a removable working channel device that is insertable into the working channel 438 and removable from the working channel 438 after performing shape sensing. In other embodiments, the shape sensing fiber 436 may be mounted externally to the endoscope 118.

IV.C. Shape sensing optical fiber

FIG. 4C shows a system 450 having a shape detector 452, an endoscope 454, and an optical fiber 456; the shape detector 452 can be used to generate and detect the light used for determining the shape of the instrument. The optical fiber 456 may include one or more segments of a Fiber Bragg Grating (FBG) 458 that reflects certain wavelengths of light while transmitting other wavelengths. The grating 458 may comprise a series of refractive index modulations that produce a spatial periodicity in the refractive index. During fabrication of the grating 458, the modulations may be spaced apart by a known distance, causing reflection of a known band of wavelengths. The shape detector 452 may transmit light through the optical fiber 456 and receive light reflected from the optical fiber 456. The shape detector 452 may also generate reflection spectrum data based on the wavelengths of light reflected by the grating 458.
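To make the relationship between the reflected spectrum and strain concrete, the following is a minimal sketch, assuming typical silica-fiber constants; the photoelastic coefficient and the reference wavelength below are illustrative values, not parameters given in this disclosure:

```python
# Hypothetical sketch: converting the shift of an FBG's reflected Bragg
# wavelength into axial strain. p_e is an assumed photoelastic constant
# typical for silica fiber; real systems calibrate this per fiber.

PHOTOELASTIC_CONST = 0.22  # p_e, dimensionless (assumption)

def strain_from_wavelength_shift(measured_nm: float, reference_nm: float) -> float:
    """Return axial strain from a Bragg wavelength shift, using the
    first-order relation delta_lambda / lambda = (1 - p_e) * strain."""
    shift_nm = measured_nm - reference_nm
    return shift_nm / (reference_nm * (1.0 - PHOTOELASTIC_CONST))

# Example: a 1550 nm grating whose reflection peak moves to 1550.012 nm
# indicates roughly +10 microstrain at that grating's location.
strain = strain_from_wavelength_shift(1550.012, 1550.0)
print(f"{strain * 1e6:.1f} microstrain")
```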

As shown in FIG. 4C, a single optical fiber may include multiple sets of fiber Bragg gratings. The endoscope 454 can include a plurality of optical fibers, and the shape detector 452 can detect and analyze signals from more than one optical fiber. One or more optical fibers may be included in the leader 415 of FIG. 4A, the sheath 411 of FIG. 4A, or both. Although the endoscope 454 is used as an example, the techniques described herein may be applied to any other elongated instrument. The shape detector 452 may be operably coupled with a controller configured to determine the geometry or configuration of the optical fiber 456, and thus of at least a portion of the endoscope 454 (or other elongated instrument, such as a catheter), based on spectral analysis of the detected reflected light signals.

A controller (e.g., the surgical robotic system 500) within or in communication with the shape detector 452 may analyze the reflected spectral data to generate position and orientation data of the endoscope 454 in two or three dimensions. In particular, as the endoscope 454 is bent, the internally positioned optical fibers 456 are also bent, which can induce strain on the optical fibers 456. When strain is induced on the fiber 456, the pitch of the modulation will change depending on the amount of strain on the fiber 456. To measure the strain, light is sent down the fiber 456 and the characteristics of the returning light are measured. For example, the grating 458 may produce a reflected wavelength that varies with strain on the fiber 456 (as well as other factors such as temperature). Based on the particular wavelength of light reflected by the grating 458, the system may determine an amount of strain on the optical fiber 456, and further predict the shape of the optical fiber 456 based on the amount of strain (e.g., based on how the strain characteristics of a "straight" endoscope may differ from those of a "bent" endoscope). Thus, the system can determine, for example, the degree to which the endoscope 454 has been bent in one or more directions (e.g., in response to a command from the surgical robotic system 500) by identifying differences in the reflected spectrum data.

In some embodiments, the optical fiber 456 includes multiple cores within a single cladding. In such embodiments, each core may operate as a separate optical fiber, with sufficient distance and cladding separating the cores so that the light in each core does not significantly interact with the light carried in the other cores. In other embodiments, the number of cores may vary, or each core may be contained in a separate optical fiber. When strain and shape analysis is applied to a multi-core fiber, bending of the fiber 456 can induce strain on the cores, which can be measured by monitoring the wavelength shifts in each core. By positioning two or more cores off-axis in the optical fiber 456, bending of the fiber induces a different strain on each core. These strains vary with the local bending of the fiber. For example, if regions of the cores containing the gratings 458 are located at points where the fiber 456 is bent, the gratings can be used to determine the amount of bending at those points. These data, combined with the known spacing of the gratings 458, can be used to reconstruct the shape of the fiber 456.
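As an illustration of this idea, the sketch below recovers the local curvature and bend direction from the strains of three off-axis cores; the core geometry (three cores at 120-degree spacing at an assumed radius from the fiber axis) and the strain model are illustrative assumptions, not the specific construction of the fiber 456:

```python
# Hypothetical sketch: local curvature and bend direction from the strains
# measured by three off-axis cores of a multi-core fiber. Core positions
# and radius are assumptions for illustration.
import numpy as np

CORE_ANGLES = np.deg2rad([0.0, 120.0, 240.0])  # angular core positions (rad)
CORE_RADIUS = 35e-6                            # core offset from axis (m)

def curvature_from_core_strains(strains):
    """Fit eps_i = eps0 + a*cos(theta_i) + b*sin(theta_i) and return
    (curvature in 1/m, bend direction in rad). The common-mode term eps0
    absorbs axial strain and temperature, so only the bend-induced
    differential strain drives the curvature estimate."""
    A = np.column_stack([np.ones_like(CORE_ANGLES),
                         np.cos(CORE_ANGLES),
                         np.sin(CORE_ANGLES)])
    eps0, a, b = np.linalg.lstsq(A, np.asarray(strains), rcond=None)[0]
    curvature = np.hypot(a, b) / CORE_RADIUS
    bend_direction = np.arctan2(-b, -a)  # cores on the inside of a bend compress
    return curvature, bend_direction

# Example: a pure 10 1/m bend toward theta = 0 compresses core 0.
kappa, theta = curvature_from_core_strains([-3.5e-4, 1.75e-4, 1.75e-4])
print(kappa, np.rad2deg(theta))  # ~10.0 1/m, ~0 degrees
```

Repeating this estimate at each grating location and integrating the resulting curvature vectors over the known grating spacing yields a reconstruction of the fiber's shape.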

The optical fiber is well suited to data acquisition within the patient because no line of sight to the shape sensing optical fiber is required. Various systems and methods for monitoring the shape and relative position of an optical fiber in three dimensions are described in U.S. Patent Application Publication No. 2006/0013523, entitled "Fiber optic position and shape sensing device and method relating thereto," filed on July 13, 2005, and U.S. Patent No. 6,389,187, entitled "Optical Fiber Bend Sensor," filed on June 17, 1998, the contents of both of which are incorporated herein by reference in their entirety.

Although the illustrated embodiment uses a fiber with Bragg gratings, in a modified variation the optical fiber may include minor imperfections that result in variations of the refractive index along the core of the optical fiber. These variations can result in a small amount of backscatter known as Rayleigh scattering. Changes in the strain or temperature of the optical fiber cause changes in the effective length of the optical fiber. This change in effective length results in a change in the spatial position of the Rayleigh scattering points. Cross-correlation techniques can measure this change in the Rayleigh scattering, and information about strain can be extracted from it. These techniques may include the use of optical frequency domain reflectometry in much the same manner as with low-reflectivity fiber gratings.
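A minimal sketch of such a cross-correlation measurement is shown below; the spectra, bin width, and strain coefficient are illustrative assumptions (a constant of roughly -0.15 GHz per microstrain near 1550 nm is often quoted for silica fiber, but real systems calibrate this):

```python
# Hypothetical sketch: estimating the spectral shift of one fiber segment's
# Rayleigh backscatter signature by cross-correlating a measured spectrum
# against a reference spectrum recorded in the unstrained state.
import numpy as np

def rayleigh_spectral_shift(reference, measured, bin_ghz):
    """Return the frequency shift (GHz) that best aligns the two spectra.
    Positive lag means the measured spectrum sits at higher bins."""
    ref = (reference - reference.mean()) / reference.std()
    mea = (measured - measured.mean()) / measured.std()
    xcorr = np.correlate(mea, ref, mode="full")
    lag = int(np.argmax(xcorr)) - (len(ref) - 1)  # shift in bins
    return lag * bin_ghz

GHZ_PER_MICROSTRAIN = -0.15  # assumed calibration constant near 1550 nm

def strain_from_shift(shift_ghz):
    """Convert a Rayleigh spectral shift into microstrain."""
    return shift_ghz / GHZ_PER_MICROSTRAIN
```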

Methods and apparatus for calculating birefringence in an optical fiber based on Rayleigh scattering, and apparatus and methods for measuring strain in an optical fiber using spectral shifts of Rayleigh scattering, can be found in PCT Publication WO 2006/099056, filed on March 9, 2006, and U.S. Patent 6,545,760, filed on March 24, 2000, both of which are incorporated herein by reference in their entirety. Birefringence may be used to measure axial strain and/or temperature in the waveguide.

IV.D. Improving strain-based shape data

Strain-based shape sensing may allow the shape of an endoscope or other instrument to be reconstructed by measuring strain along an optical fiber running within the instrument. The strain measurements capture the spatiotemporal variation of the reflection of light by the gratings within the fiber. Because the spacing between the gratings affects the reflection, the reflections can be used to measure strain at precise locations along the fiber (and thus the instrument). However, in some cases, strain-based shape sensing may be negatively affected by noise, and it may then be difficult to distinguish actual strain changes from spurious ones.

The improved strain-based shape sensing system may utilize other data available to the system (e.g., robot data, image data, EM data, etc.) to improve the accuracy of (or adjust the confidence of) state estimates for its strain-based shape sensing or for determinations based on such strain-based shape sensing. Alternatively or additionally, the improved strain-based shape sensing system may utilize shape data determined based on its strain-based shape sensing to improve the accuracy of (or adjust the confidence of) its other data (e.g., robot data, image data, EM data, etc.) or state estimates determined based on such data.

Fig. 4D-4E illustrate how the system can leverage the information available to the system to improve, adjust, or weight its strain-based shape sensing. The system may access data available to the system, such as robot data (e.g., command data, force and distance data, mechanical model data, kinematic model data, etc.), and determine certain characteristics about the shape of the instrument (or a particular portion thereof) being navigated within the patient based on such data. Such characteristics may include curvature information (e.g., a maximum curvature that the instrument is capable of assuming, or a range of curvature values that are acceptable given the current force and distance data indicated by the robot data), movement information (e.g., a maximum speed that the instrument is capable of moving, or a range of speed values that are acceptable given the current force and distance data indicated by the robot data), sheath information (e.g., a current shape of a sheath covering one or more portions of the instrument), and so forth. Upon determining that the strain-based shape prediction does not satisfy one or more constraints indicated by the characteristics determined based on the robot-based data, the system may adjust the strain-based shape prediction such that the adjusted strain-based shape prediction satisfies the constraints, reduce a confidence or weight associated with the particular strain-based shape prediction, or ignore the strain-based shape prediction.

FIG. 4D shows an actual shape 472 of the endoscope 118, a robot-data-based shape prediction 473 of the endoscope 118, and a strain-based shape prediction 474 of the endoscope 118. The actual shape 472 exhibits an actual curvature 476, while the robot-data-based shape prediction 473 exhibits a predicted curvature 477, and the strain-based shape prediction 474 exhibits a predicted curvature 478. In the example of FIG. 4D, the system may determine, based on the robot data, one or more conditions that the endoscope 118 is expected to satisfy (e.g., the curvature value at a given point along the endoscope 118 should be within a predetermined range of values, or should be within a range of values determined based on the tension on the pull wire and/or the distance the pull wire has been actuated). Upon determining that a portion of the strain-based shape prediction 474 does not satisfy such conditions (e.g., by indicating a predicted curvature value outside of a range of expected curvature values determined based on the robot data corresponding to the endoscope 118), the system may adjust the strain-based shape prediction 474 such that the portion of the strain-based shape prediction 474 satisfies the conditions (e.g., such that the shape data no longer indicates a predicted curvature value outside of the range of expected curvature values), reduce a confidence or weight associated with the portion of the strain-based shape prediction 474, or ignore the portion of the strain-based shape prediction 474 (e.g., refrain from using the portion of the strain-based shape prediction 474 when estimating the current state of the endoscope 118). For example, as shown in FIG. 4D, the system may determine, based on the robot data (e.g., tension and distance), a robot-data-based shape prediction 473 that exhibits a predicted curvature 477 at a given point. The system may compare the predicted curvature 477 to the predicted curvature 478 exhibited by the strain-based shape prediction 474. Upon determining that the predicted curvature 478 is different from the predicted curvature 477, the system may adjust the predicted curvature 478 to be equal to the predicted curvature 477. Alternatively, upon determining that the predicted curvature 478 is not within a given threshold range (e.g., ± 10%) of the predicted curvature 477, the system may adjust the predicted curvature 478 to be within the threshold range (e.g., set it to the upper limit of the threshold range if the predicted curvature 478 is above the range, and set it to the lower limit of the threshold range if the predicted curvature 478 is below the range). Additionally or alternatively, the system may decrease a confidence value associated with the strain-based shape prediction 474 based on determining that the predicted curvature 478 is different from the predicted curvature 477 (or that the predicted curvature 478 is not within the given threshold range), and/or increase a confidence value associated with the strain-based shape prediction 474 based on determining that the predicted curvature 478 is equal to the predicted curvature 477 (or that the predicted curvature 478 is within the given threshold range).

FIG. 4E illustrates a robot-data-based shape prediction 482 of the endoscope 118 and a strain-based shape prediction 484 of the endoscope 118. The robot-data-based shape prediction 482 exhibits a predicted movement 486, while the strain-based shape prediction 484 exhibits a predicted movement 488. As described with reference to FIG. 4D, the system may determine, based on the robot data, one or more conditions that the endoscope 118 is expected to satisfy. For example, based on the robot data, the system may determine that the speed at which the endoscope 118 is moving should be within a certain range of values. Upon determining that a portion of the strain-based shape prediction 484 does not satisfy such conditions, the system may adjust the strain-based shape prediction 484 such that the portion of the strain-based shape prediction 484 satisfies the conditions (e.g., such that the shape data no longer indicates a predicted velocity value outside of the range of expected velocity values), lower the confidence or weight associated with the portion of the strain-based shape prediction 484, or ignore the portion of the strain-based shape prediction 484 (e.g., refrain from using the portion of the strain-based shape prediction 484 in estimating the current state of the endoscope 118). For example, as shown in FIG. 4E, the system may determine, based on the robot data (e.g., tension and/or distance varying over time), a robot-data-based shape prediction 482 that exhibits a predicted movement 486. The system may compare the predicted movement 486 to the predicted movement 488 exhibited by the strain-based shape prediction 484. Upon determining that the predicted movement 488 is different from the predicted movement 486, the system may adjust the predicted movement 488 to be the same as the predicted movement 486. Alternatively, upon determining that the predicted movement 488 is not within a given threshold range (e.g., a movement speed within ± 10% of the movement speed of the predicted movement 486), the system may adjust the predicted movement 488 to be within the threshold range (e.g., set it to the upper limit of the threshold movement speed range if the predicted movement speed of the strain-based shape prediction 484 is above the range, and set it to the lower limit of the threshold movement speed range if the predicted movement speed of the strain-based shape prediction 484 is below the range). Additionally or alternatively, the system may decrease a confidence value associated with the strain-based shape prediction 484 based on determining that the predicted movement 488 is different from the predicted movement 486 (or that the predicted movement 488 is not within the given threshold range), and/or increase a confidence value associated with the strain-based shape prediction 484 based on determining that the predicted movement 488 is the same as the predicted movement 486 (or that the predicted movement 488 is within the given threshold range).
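The clamp-and-reweight pattern of FIGS. 4D-4E can be summarized in a short sketch; the 10% tolerance band and the confidence steps below are illustrative assumptions, not values prescribed by this disclosure, and the same routine applies whether the compared quantity is a curvature (FIG. 4D) or a movement speed (FIG. 4E):

```python
# Hypothetical sketch: clamp a strain-based estimate (curvature or speed)
# into a tolerance band around the robot-data-based prediction and adjust
# the confidence attached to the strain-based prediction accordingly.

def adjust_strain_estimate(strain_value, robot_value, confidence,
                           tolerance=0.10, step=0.05):
    """Return (adjusted_value, adjusted_confidence). Assumes positive
    robot_value; tolerance and step are illustrative parameters."""
    lower = robot_value * (1.0 - tolerance)
    upper = robot_value * (1.0 + tolerance)
    if lower <= strain_value <= upper:
        # Consistent with the robot data: keep the value, raise confidence.
        return strain_value, min(1.0, confidence + step)
    # Inconsistent: clamp to the nearer band edge and lower confidence.
    adjusted = upper if strain_value > upper else lower
    return adjusted, max(0.0, confidence - step)

# Example: a strain-based curvature of 14.0 1/m vs. a robot-predicted
# 10.0 1/m is clamped to the 11.0 1/m upper limit, with reduced confidence.
value, conf = adjust_strain_estimate(14.0, 10.0, confidence=0.8)
print(value, conf)  # 11.0 0.75
```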

The process of acquiring strain data and other data (some or all of which may be used to improve strain-based shape data) and determining a state estimate is described in more detail below with reference to fig. 8-11.

V. Registration transformation of EM system to 3D model

V.A. Schematic setup of EM tracking system

In certain embodiments, the EM tracking system may be used in conjunction with the systems described herein. FIG. 5 shows an exemplary schematic setup of such an EM tracking system 505 that may be included in a surgical robotic system 500, according to one embodiment. In FIG. 5, the EM tracking system 505 includes a plurality of robotic components (e.g., window field generators and reference sensors, as described below). The surgical robotic system 500 includes a surgical bed 511 for holding the body of a patient. Below the bed 511 is a Window Field Generator (WFG) 512 configured to sequentially activate a set of EM coils (e.g., the EM coils 434 shown in FIG. 4B). The WFG 512 generates an Alternating Current (AC) magnetic field over a large volume; for example, in some cases, it may form an AC field in a volume of about 0.5 m × 0.5 m × 0.5 m.

Additional fields may be applied by additional field generators to help track instruments within the body. For example, a Planar Field Generator (PFG) may be attached to a system arm adjacent to the patient and oriented to provide an EM field at an angle. Reference sensors 513 may be placed on the patient's body to provide local EM fields that further increase tracking accuracy. Each of the reference sensors 513 may be attached to the command module 515 by a cable 514. The cables 514 are connected to the command module 515 through an interface unit 516, which handles communication with its corresponding device and provides power. The interface unit 516 is coupled to a System Control Unit (SCU) 517, which acts as the overall interface controller for the various entities described above. The SCU 517 also drives the field generators (e.g., the WFG 512) and collects sensor data from the interface unit 516, from which the position and orientation of the sensors within the body are calculated. The SCU 517 may be coupled to a Personal Computer (PC) 518 to allow user access and control.

The command module 515 is also connected to various IDMs 519 coupled to the surgical robotic system 500, as described herein. The IDM 519 is typically coupled to a single surgical robotic system (e.g., surgical robotic system 500) and is used to control and receive data from its respective connected robotic component; such as a robotic endoscopic tool or a robotic arm. As described above, the IDM 519 is coupled to an endoscopic tool (not shown here) of the surgical robotic system 500, as an example.

The command module 515 receives data communicated from the endoscopic tool. The type of data received depends on the corresponding type of instrument attached. For example, exemplary received data includes sensor data (e.g., image data, EM data), robot data (e.g., command data, force and distance data, mechanical model data, kinematic model data, etc.), control data, and/or video data. To better process video data, a Field Programmable Gate Array (FPGA)520 may be configured to handle image processing. Comparing the data obtained from the various sensors, devices, and field generators allows the SCU 517 to accurately track the motion of the different components of the surgical robotic system 500, as well as, for example, the position and orientation of these components.

In order to track a sensor through the patient's anatomy, the EM tracking system 505 may require a process known as "registration," in which the system finds the geometric transformation that aligns a single object between different coordinate systems. For example, a particular anatomical site on the patient has two different representations: one in 3D model coordinates and one in EM sensor coordinates. To establish consistency and a common language between these two coordinate systems, the EM tracking system 505 needs to find the transformation that links these two representations, i.e., the registration. For example, the position of the EM tracker relative to the position of the EM field generator may be mapped into the 3D coordinate system to identify the corresponding position in the 3D model.
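One standard way to find such a transformation is a least-squares fit of a rotation and translation between paired points expressed in both frames; the sketch below shows this only to make the idea concrete, under the assumptions of paired landmark points and a rigid-body transform, which are illustrative rather than the specific registration method of this disclosure:

```python
# Hypothetical sketch: rigid registration between EM-sensor coordinates and
# 3D-model coordinates from paired landmark points (Kabsch-style SVD fit).
import numpy as np

def register_rigid(em_points, model_points):
    """Return (R, t) such that model ~= R @ em + t, for Nx3 point arrays."""
    em = np.asarray(em_points, dtype=float)
    model = np.asarray(model_points, dtype=float)
    em_c, model_c = em.mean(axis=0), model.mean(axis=0)
    H = (em - em_c).T @ (model - model_c)          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                             # proper rotation, det = +1
    t = model_c - R @ em_c
    return R, t

def em_to_model(R, t, em_position):
    """Map one tracked EM sensor position into 3D-model coordinates."""
    return R @ np.asarray(em_position) + t
```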

V.B.3D model representation

Fig. 6A-6B illustrate an exemplary anatomic lumen 600 and an exemplary 3D model 620 of the anatomic lumen, according to one embodiment. More specifically, fig. 6A-6B illustrate centerline coordinates, diameter measurements, and the relationship of the anatomical space between the actual anatomical lumen 600 and its 3D model 620. In fig. 6A, an anatomical lumen 600 is generally longitudinally tracked by centerline coordinates 601, 602, 603, 604, 605, and 606, where each centerline coordinate is generally near the center of a tomographic scan slice of the lumen. The centerline coordinates are connected and visualized by centerline 607. The volume of the lumen may be further visualized by measuring the diameter of the lumen at each centerline coordinate, e.g., diameters 608, 609, 610, 611, 612, and 613 represent measurements of the anatomical lumen 600 corresponding to coordinates 601, 602, 603, 604, 605, and 606.

FIG. 6B illustrates an exemplary 3D model 620 of the anatomical lumen 600 shown in FIG. 6A, according to one embodiment. In FIG. 6B, the anatomical lumen 600 is visualized in 3D space by first locating the centerline coordinates 601, 602, 603, 604, 605, and 606 in 3D space based on the centerline 607. For example, at each centerline coordinate, the lumen diameter may be visualized as a 2D circular space (e.g., 2D circular space 630) with the diameters 608, 609, 610, 611, 612, and 613. By connecting these 2D circular spaces to form a 3D space, the anatomical lumen 600 is approximated and visualized as the 3D model 620. A more accurate approximation may be determined by increasing the resolution of the centerline coordinates and measurements, i.e., by increasing the density of centerline coordinates and measurements for a given lumen or subsegment. The centerline coordinates may also include markers to indicate points of interest to the physician, including lesions.

In some embodiments, the preoperative software package is also used to analyze and derive a navigation path based on the generated 3D model of the anatomical space. For example, the software package may derive the shortest navigation path to a single lesion (marked by a centerline coordinate) or to several lesions. The navigation path may be presented to the operator intraoperatively in two or three dimensions according to the preference of the operator. In some implementations, the navigation path (or a portion thereof) may be selected by the operator preoperatively. The path selection may include identifying one or more target locations (also referred to simply as "targets") within the patient's anatomy.

Fig. 7 shows a computer-generated 3D model 700 representing an anatomical space, according to one embodiment. As discussed above in fig. 6A-6B, the 3D model 700 may be generated using a centerline 701 obtained by viewing a preoperatively generated CT scan. In some embodiments, the computer software is able to map the navigation path 702 within the tubular network to access the surgical site 703 (or other target) within the 3D model 700. In some embodiments, the surgical site 703 may be connected to individual centerline coordinates 704, which allows the computer algorithm to topologically search the centerline coordinates of the 3D model 700 to obtain the best path 702 within the tubular network. In certain embodiments, the topological search of the path 702 may be constrained by certain operator-selected parameters, such as the location of one or more targets, one or more waypoints, and the like.
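As a concrete illustration of such a topological search, the sketch below runs Dijkstra's algorithm over a graph of centerline branch points; the graph structure and airway names are illustrative assumptions, not the actual data model of the 3D model 700:

```python
# Hypothetical sketch: shortest navigation path through a tubular network
# modeled as a graph of centerline nodes with edge lengths in millimeters.
import heapq

def shortest_path(edges, start, target):
    """edges: {node: [(neighbor, distance_mm), ...]}. Returns the node list
    of the lowest-cost path from start to target, or None if unreachable."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == target:
            return path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, dist in edges.get(node, []):
            if neighbor not in visited:
                heapq.heappush(queue, (cost + dist, neighbor, path + [neighbor]))
    return None

# Example: a toy airway graph from the trachea to a target segment.
airway_graph = {
    "trachea": [("left_main", 120.0), ("right_main", 110.0)],
    "left_main": [("lb6", 45.0)],
    "right_main": [],
}
print(shortest_path(airway_graph, "trachea", "lb6"))
# ['trachea', 'left_main', 'lb6']
```

Operator-selected waypoints, as described above, could be incorporated by searching each leg of the path separately.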

In some embodiments, the distal end of the endoscopic tool is tracked within the patient anatomy, and the tracked position of the endoscopic tool within the patient anatomy is mapped and placed within the computer model, which enhances the navigation capabilities of the tubular network. To track the distal working end of the endoscopic tool, i.e., the position and orientation of the working end, a variety of methods may be employed, either individually or in combination.

In a sensor-based method for positioning, a sensor, such as an EM tracker, may be coupled to the distal working end of an endoscopic tool to provide a real-time indication of the progress of the endoscopic tool. In EM-based tracking, an EM tracker embedded in the endoscopic tool measures changes in the electromagnetic field formed by one or more EM transmitters. The transmitter (or field generator) may be placed near the patient (e.g., as part of a surgical bed) to create a low-intensity magnetic field. This induces a small current in the sensor coil of the EM tracker that is related to the distance and angle between the sensor and the generator. The electrical signals may then be digitized by an interface unit (on-chip or PCB) and sent via cable/wiring back to the system cart and then to the command module. The data can then be processed to interpret the current data and calculate the precise position and orientation of the sensor relative to the emitter. Multiple sensors may be used at different locations in the endoscopic tool, such as in the leader and sheath, in order to calculate the various positions of these components. Thus, based on readings from the artificially generated EM field, the EM tracker may detect changes in field strength as it moves through the patient anatomy.

VI. Navigation configuration system

VI.A. High-level overview of navigation configuration system

FIG. 8A illustrates an exemplary block diagram of a navigation configuration system 900 according to one embodiment. In FIG. 8A, the navigation configuration system 900 includes a plurality of input data stores, a navigation module 905 that receives various types of input data from the plurality of input data stores, and an output navigation data store 990 that receives output navigation data from the navigation module 905. The block diagram of the navigation configuration system 900 shown in FIG. 8A is merely an example; in alternative embodiments not shown, the navigation configuration system 900 may include different and/or additional elements, or may omit one or more of the elements shown in FIG. 8A. Likewise, the functions performed by the various elements of the navigation configuration system 900 may vary from embodiment to embodiment. The navigation configuration system 900 may be similar to the navigation system described in U.S. Patent 9,727,963, issued in August 2017, which is incorporated by reference herein in its entirety.

As used herein, input data refers to raw or processed data collected from input devices (e.g., command modules, optical sensors, EM sensors, IDMs) and used to generate estimated state information for the endoscope 118 (or other instrument) as well as output navigation data. The plurality of input data stores 901-941 may include a strain data store 901, an image data store 910, an EM data store 920, a robot data store 930, a 3D model data store 940, and an other data store 941. Each of the input data stores 901-941 stores data of the type indicated by its name for access and use by the navigation module 905. The strain data may include one or more strain measurements along the endoscope 118 (e.g., generated and/or stored by the shape detector 452 of FIG. 4C).

The image data may include one or more image frames captured by the imaging device at the instrument tip, as well as information, such as a frame rate or timestamp, that allows the time elapsed between pairs of frames to be determined.

The robotic data may include data that is typically used by the system for functions related to control of the instrument (e.g., the endoscope 118 and/or a sheath thereof) and/or physical movement of the instrument or a portion of the instrument (e.g., an instrument tip or sheath) within the tubular network. The robotic data may allow the state of the instrument to be inferred based on data measured while navigating the instrument within the tubular network. The kinematic model and the dynamic model may be generated based on a priori information collected during a calibration phase. This a priori information may be stored on the device and read and utilized by the robot to improve the driving, control, and navigation of the instrument, and to improve other types of data available to the robot (e.g., EM data, image data, strain-based shape data, etc.). The robot data may include parameters specific to each instrument.

FIG. 8C illustrates examples of robot data that may be stored in the robot data store 930 of FIG. 8A. As shown in FIG. 8C, the robot data may include: command data 931 indicating, for example, that the instrument tip is to reach a particular anatomical site and/or change its orientation (e.g., articulation data indicating that the instrument is to exhibit a desired articulation at a particular pitch, roll, and yaw, and insertion and retraction data indicating insertion and retraction of one or both of the leader and the sheath); force and distance data 932 (e.g., the distance the pull wires have been actuated since the device was loaded onto the robot, the amount of force applied to the pull wires as measured by the torque sensors in the IDM, the amount of insertion force applied by the robotic arms to insert or retract the instrument, etc.); mechanical model data 933 representing mechanical motion of the elongate members of the instrument (e.g., motion of one or more pull wires, tendons, or shafts of the endoscope that drives the actual movement of the medical instrument within the tubular network); kinematic model data 934 representing the motion and shape of the instrument (e.g., geometric parameters indicative of the position of the instrument, and/or any changes in the geometric parameters relative to a predetermined or reference position or coordinate system); and so on.
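Purely as an illustration of how these categories of robot data might be organized for storage, consider the following sketch; the field names and units are assumptions for exposition, not the actual schema of the robot data store 930:

```python
# Hypothetical sketch: containers for the kinds of robot data enumerated
# above (cf. command data 931 and force and distance data 932). All field
# names and units are illustrative assumptions.
from dataclasses import dataclass, field
from typing import List

@dataclass
class CommandData:                      # cf. command data 931
    pitch_deg: float                    # commanded articulation pitch
    roll_deg: float                     # commanded articulation roll
    yaw_deg: float                      # commanded articulation yaw
    insertion_mm: float                 # leader/sheath insertion (+) or retraction (-)

@dataclass
class ForceDistanceData:                # cf. force and distance data 932
    pull_wire_travel_mm: List[float] = field(default_factory=list)
    pull_wire_torque_nmm: List[float] = field(default_factory=list)  # IDM torque sensors
    insertion_force_n: float = 0.0      # force applied by the robotic arm

@dataclass
class RobotDataRecord:
    timestamp_s: float
    command: CommandData
    force_distance: ForceDistanceData
```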

The EM data may be acquired by one or more EM sensors (e.g., located proximal to the instrument tip) and/or an EM tracking system as described above. The 3D model data may in particular be derived from a 2D CT scan as described above.

The output navigation data store 990 receives and stores the output navigation data provided by the navigation module 905. The output navigation data is indicative of information that is helpful in guiding the instrument through the patient's anatomy and, in one example, through the tubular network to reach a particular destination within the tubular network, and is based on estimated state information of the instrument at each time instant. The estimated state information may include a position and orientation of the instrument within the tubular network. In one embodiment, updated output navigation data indicative of the motion and position/orientation information of the instrument is provided in real-time as the instrument moves within the tubular network, which better facilitates its navigation within the tubular network.

To determine the output navigation data, the navigation module 905 locates (or determines) an estimated state of the instrument within the tubular network. As shown in FIG. 8A, the navigation module 905 also includes various algorithm modules, such as a strain-based algorithm module 945, an EM-based algorithm module 950, an image-based algorithm module 960, a robot-based algorithm module 970, an algorithm module 971 based on other data, and so on. These modules may each primarily consume certain types of input data and contribute different types of data to the state estimator 980. As shown in FIG. 8A, the different kinds of data output by these modules (labeled, for illustration, as strain-based, EM-based, image-based, robot-based, and other-data-based estimated state data) may be generally referred to as "intermediate data." In some cases, the navigation module 905 determines, based on the estimated state of the instrument, that damage to or failure of the instrument (e.g., buckling, prolapse, etc.) is imminent. In such cases, the navigation module 905 may cause the instrument to be controlled in a manner that avoids the damage or failure. The detailed components of each algorithm module and of the state estimator 980 are described in more detail below.

VI.B. Navigation module

VI.B.1. State estimator

As described above, the navigation module 905 also includes a state estimator 980 and a plurality of algorithm modules that employ different algorithms to navigate in the tubular network. For clarity of description, the state estimator 980 is described first, followed by a description of the various modules that exchange data with the state estimator 980.

A state estimator 980 included in the navigation module 905 receives the various intermediate data and provides an estimated state of the instrument tip (or other portion of the instrument) as a function of time, wherein the estimated state is indicative of estimated position and orientation information of the instrument tip (or other portion of the instrument) within the tubular network. The estimated state data is stored in an estimated state data store 985 included in the state estimator 980. Although the description herein is described in the context of determining estimated position and orientation information of an instrument tip (or other portion of an instrument) within a tubular network, in other arrangements, this information may be used to determine estimated position and orientation information of the instrument tip (or other portion of the instrument) relative to a patient in general.

VI.B.2. Estimated state data store

The estimated state data store 985 may include a bifurcation data store, a location data store, a depth data store, and an orientation data store. However, this particular breakdown of the estimated state data store 985 is merely an example; in alternative embodiments not shown, different and/or additional data stores may be included in the estimated state data store 985.

The various repositories introduced above represent the estimated state data in various ways. Bifurcation data may refer to the location of a medical instrument relative to a set of branches (e.g., bifurcations, trifurcations, or branches into more than three branches) within a tubular network. For example, the bifurcation data may be a set of branch choices selected by the instrument as it traverses the tubular network, based on, for example, a larger set of available branches provided by a 3D model that maps the entire tubular network. The bifurcation data may also include information ahead of the position of the instrument tip, such as branches (bifurcations) that the instrument tip is close to but not yet traversing, but which may be detected, for example, based on current position information of the tip relative to the 3D model or based on captured images of upcoming bifurcations.

The position data may indicate the three-dimensional position of some portion of the instrument within the tubular network, or the three-dimensional position of some portion of the tubular network itself. The position data may be in the form of absolute positions, or positions relative to, e.g., the 3D model of the tubular network. For example, the position data may include an indication of the position of the instrument within a particular branch. The identification of a particular branch may also be stored as a segment identification (ID) that uniquely identifies the particular segment of the model in which the instrument tip is located.

The depth data may be indicative of depth information of the instrument tip within the tubular network. Exemplary depth data includes the total insertion (absolute) depth of the instrument within the patient and the (relative) depth within the identified branch (e.g., the segment identified by location data store 1087). The depth data may be determined based on position data about both the tubular network and the instrument.

The orientation data may be indicative of orientation information of the instrument tip, and may include overall roll, pitch, and yaw relative to the 3D model, as well as pitch, roll, and yaw within the identified bifurcation.

VI.B.3. Data output to algorithm modules

As shown in fig. 8A, the state estimator 980 provides the estimated state data back to the algorithm module for use in generating more accurate intermediate data that the state estimator uses to generate improved and/or updated estimated states, and so on to form a feedback loop. State estimator 980 receives estimated state data from one or more of the algorithm modules shown in fig. 8A. The state estimator 980 uses this data to generate "estimated state data (previous)" associated with the timestamp "t-1". The state estimator 980 then provides the data to one or more of the algorithm modules (which may be a combination of different algorithm modules than the algorithm module from which the estimated state data was previously received). The "estimated state data (previous)" may be based on a combination of different types of intermediate data (e.g., image data, mechanical model data, command data, dynamics model data) associated with the time stamp "t-1" generated and received from different algorithm modules. For example, estimated state data based on a combination of non-strain based data 972 may be provided to strain based algorithm module 945, and strain based algorithm module 945 may determine and output estimated state data based on strain to state estimator 980.

Next, one or more of the algorithm modules run their respective algorithms using the received estimated state data (previous) to output improved and updated estimated state data to the state estimator 980; this output is represented by the "estimated state data (current)" shown for the respective algorithm modules and is associated with the timestamp "t". This process may be repeated for future timestamps to generate estimated state data.

Since the state estimator 980 may use several different kinds of intermediate data to arrive at its estimate of the state of the instrument within the tubular network, the state estimator 980 is configured to account for the various kinds of errors and uncertainties, in both measurement and analysis, that each type of underlying data (robot, EM, image) and each type of algorithm module may create or carry into the intermediate data considered in determining the estimated state. To address these, two concepts are discussed: probability distributions and confidence values.

As used herein, the term "probability" in the phrase "probability distribution" refers to the likelihood that an estimate of the possible position and/or orientation of the instrument is correct. For example, different probabilities may be calculated by one of the algorithm modules indicating the relative likelihood that the instrument is in one of several different possible branches within the tubular network. In one embodiment, the type of probability distribution (e.g., discrete distribution or continuous distribution) is selected to match the characteristics of the estimated state (e.g., the type of estimated state, such as continuous position information versus discrete branch selection). For example, the estimated state used to identify which segment the instrument has entered at a trifurcation may be represented by a discrete probability distribution, and may include three discrete values of 20%, 30%, and 50% representing the chance of being in a position within each of the three branches as determined by one of the algorithm modules. As another example, the estimated state may include an instrument roll angle of 40 ± 5 degrees, and the segment depth of the instrument tip within a bifurcation may be 4 ± 1 mm, each represented by a Gaussian distribution, which is a type of continuous probability distribution. Different methods or modalities may be used to generate the probabilities, and these will vary by algorithm module, as described more fully below with reference to later figures.

In contrast, as used herein, a "confidence value" reflects a measure of confidence in a state estimate provided by one of the algorithms based on one or more factors. For strain-based algorithms that use shape sensing fibers, factors such as temperature, proximity to the proximal end of the catheter, etc. may affect the confidence in the estimation of the state. For example, thermal expansion and contraction of the fiber portion may falsely indicate that the instrument is bending. Further, in some embodiments, the strain measurement of the distal portion of the instrument relies on shape/position data determined based on the strain measurement of the proximal portion of the instrument (e.g., closer to the shape detector 452), and any error in the strain measurement of the proximal portion may be amplified in the strain measurement of the distal portion. For EM-based algorithms, factors such as distortion of the EM field, inaccuracy of the EM registration, patient offset or motion, and patient breathing may affect the confidence in the state estimation. In particular, the confidence value in the state estimate provided by the EM-based algorithm may depend on the patient's breathing cycle, the motion of the patient or the EM field generator, and the location within the anatomy where the instrument tip is located. For image-based algorithms, exemplary factors that may affect the confidence value in the state estimation include the lighting conditions of the location within the anatomy where the image was captured, the presence of fluids, tissue, or other obstructions against or in front of the optical sensor where the image was captured, the patient's breathing, the condition of the patient's own tubular network (e.g., lungs) such as general fluid inside the tubular network and obstruction of the tubular network, and the particular operating techniques used in, for example, navigation or image capture.

For example, one factor may be that a particular algorithm has different levels of accuracy at different depths in the patient's lungs, such that relatively close to the airway opening the particular algorithm may have high confidence in its estimates of the instrument's position and orientation, but the farther the instrument travels toward the bottom of the lungs, the more that confidence value may drop. Generally, the confidence value is based on one or more system factors related to the process by which a result is determined, whereas the probability is a relative measure that arises when a single algorithm attempts to determine the correct result from multiple possibilities based on the underlying data.

For example, the mathematical formulas for calculating the result of an estimated state represented by a discrete probability distribution (e.g., branch/segment identification for a trifurcation, involving three values of the estimated state) may be as follows:

S_1 = C_EM × P_1,EM + C_Image × P_1,Image + C_Robot × P_1,Robot

S_2 = C_EM × P_2,EM + C_Image × P_2,Image + C_Robot × P_2,Robot

S_3 = C_EM × P_3,EM + C_Image × P_3,Image + C_Robot × P_3,Robot

In the above exemplary formulas, S_i (i = 1, 2, 3) represents a possible example value of the estimated state in the case where three possible segments are identified in or present in the 3D model; C_EM, C_Image, and C_Robot represent the confidence values corresponding to the EM-based, image-based, and robot-based algorithms; and P_i,EM, P_i,Image, and P_i,Robot represent the probabilities of segment i according to the respective algorithms.

To better illustrate the concepts of probability distributions and confidence values associated with estimated states, a detailed example is provided here. In this example, the user attempts to identify the segment in which the instrument tip is located at a particular trifurcation within the central airway (the predicted region) of the tubular network, and the three algorithm modules used include an EM-based algorithm, an image-based algorithm, and a robot-based algorithm. In this example, the probability distribution corresponding to the EM-based algorithm may be 20% in the first branch, 30% in the second branch, and 50% in the third (last) branch, and the confidence value applied to the EM-based algorithm in the central airway is 80%. For the same example, the probability distribution corresponding to the image-based algorithm may be 40%, 20%, and 40% for the first, second, and third branches, and the confidence value applied to the image-based algorithm is 30%; the probability distribution corresponding to the robot-based algorithm may be 10%, 60%, and 30% for the first, second, and third branches, and the confidence value applied to the robot-based algorithm is 20%. The difference between the confidence values applied to the EM-based algorithm and to the image-based algorithm indicates that the EM-based algorithm may be a better choice for segment identification in the central airway than the image-based algorithm. An exemplary mathematical calculation of the final estimated state may be: for the first branch, 20% × 80% + 40% × 30% + 10% × 20% = 30%; for the second branch, 30% × 80% + 20% × 30% + 60% × 20% = 42%; and for the third branch, 50% × 80% + 40% × 30% + 30% × 20% = 58%.

In this example, the output estimated state of the instrument tip may be these resulting values (e.g., 30%, 42%, and 58%), or a value derived from them, such as the determination that the instrument tip is in the third branch.
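The same combination can be written as a few lines of code; the sketch below simply evaluates the confidence-weighted sums defined above and reproduces the worked example's numbers (the module names and data layout are illustrative assumptions):

```python
# Hypothetical sketch: confidence-weighted combination of per-algorithm
# probability distributions, S_i = sum over algorithms of C_alg * P_i,alg.

def fuse(confidences, distributions):
    """confidences: {alg: C_alg}; distributions: {alg: [P_1, P_2, ...]}.
    Returns the combined score S_i for each candidate branch."""
    n = len(next(iter(distributions.values())))
    return [sum(confidences[alg] * distributions[alg][i] for alg in confidences)
            for i in range(n)]

scores = fuse(
    {"em": 0.80, "image": 0.30, "robot": 0.20},
    {"em":    [0.20, 0.30, 0.50],
     "image": [0.40, 0.20, 0.40],
     "robot": [0.10, 0.60, 0.30]},
)
print(scores)                               # ~[0.30, 0.42, 0.58], matching the text
best_branch = scores.index(max(scores)) + 1  # 3: the third branch is selected
```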

As described above, the estimated states may be represented in a number of different ways. For example, the estimated state may also include an absolute depth from the airway to the instrument tip location, and a data set representing the set of branches traversed by the instrument within the tubular network, which is a subset of the entire set of branches provided by, for example, a 3D model of the patient's lungs. Applying the probability distribution and the confidence value to the estimated state allows for improving the accuracy of estimating the position and/or orientation of the instrument tip within the tubular network.

VI.B.4. Strain-based algorithm module

VI.B.4.i. Elements of the strain-based algorithm module

The strain-based algorithm module 945 determines an estimated state of the instrument within the tubular network using the strain data. FIGS. 8B and 9-11 illustrate modules that may be included in the strain-based algorithm module 945. As shown in FIG. 8B, the strain-based algorithm module 945 may include (i) a shape data determination module 906 for determining shape data based on the strain data, (ii) a shape data comparison module 907 for comparing the shape data to the robot data, (iii) a shape data adjustment module 908 for adjusting the shape data (or the confidence in the shape data) based on the comparison between the shape data and the robot data, and (iv) a shape-based state estimation module 909 for determining shape-based estimated state data based on the adjusted shape data (or the adjusted confidence in the shape data). Although shown as separate components, the modules 906-909 may be implemented as one or more hardware components (e.g., a single component, separate components, or any number of components), one or more software components (e.g., a single component, separate components, or any number of components), or any combination thereof. The modules 906-909 are described in more detail below with reference to FIGS. 9-11.

VI.B.4.ii. Determining shape data

Fig. 9 illustrates an exemplary shape data determination module that may be included in the strain-based algorithm module 945. As shown in fig. 9, the shape data determination module 906 receives strain data from the strain data store 901 and outputs shape data to the shape data store 902. The shape data determination module 906 may determine shape data based on the strain data received from the strain data store 901. As described with reference to fig. 8A, the strain data may include one or more measurements of strain along the one or more optical fibers 456 (or one or more cores therein) generated and/or stored by the shape detector 452 of fig. 4C. The shape data may include angles, coordinates, or a combination thereof that indicate the current shape of the instrument. In some cases, the shape data may include curvature information (e.g., curvature values of one or more portions of the instrument), orientation information (e.g., roll, pitch, and/or yaw of one or more portions of the instrument), position information (e.g., a position of one or more portions of the instrument in a reference coordinate system used by the system to navigate the instrument, for example), and/or other information that may be used to indicate a shape of the instrument.

VI.B.4.iii. Using robot data to improve shape data

FIG. 10 illustrates an exemplary shape data comparison module and shape data adjustment module that may be included in the strain-based algorithm module 945. As shown in FIG. 10, the shape data comparison module 907 receives data from a plurality of the data stores 902-941. For example, the received data may include shape data from the shape data store 902 and robot data from the robot data store 930. The shape data comparison module 907 may compare the received shape data with the received robot data and determine whether the received shape data is consistent with the received robot data.

As described herein, robot data may include, for example, kinematic model data indicative of the movement of an instrument expected to be caused by a given set of control commands. The shape data comparison module 907 may compare the movement indicated by the robot data to the movement indicated by the shape data received from the shape data determination module 906. Based on the comparison, the shape data comparison module 907 may output a comparison result indicating whether the shape data is consistent with the robot data and a degree of difference between the shape data and the robot data. For example, the comparison result may indicate that the curvature of the instrument indicated by the shape data is outside of the range of acceptable curvatures indicated by the robot data (e.g., beyond the highest acceptable curvature by a certain amount). As another example, the comparison may indicate that the shape data corresponding to the particular portion of the instrument is inconsistent with the torque measurements included in the robot data (e.g., the measurements of the torque applied to the pull wires).

VI.B.4.iv. Using data other than robot data to improve shape data

In other embodiments, the shape data comparison module 907 may compare the shape data to image data received from the image data store 910, EM data received from the EM data store 920, 3D model data received from the 3D model data store 940, other data received from the other data store 941, and/or any combination of data received from two or more of the data stores 910-941.

For example, the shape data comparison module 907 may determine an expected orientation of the instrument (e.g., at or near a distal end of the instrument) based on the image data received from the image data repository 910. The shape data comparison module 907 may then determine whether the shape data does not coincide with the expected orientation of the instrument (e.g., the image data indicates that the tip of the instrument points in a direction parallel to the anatomical lumen, but the shape data indicates that the tip of the instrument points to the inner wall of the anatomical lumen).

In another example, the shape data comparison module 907 may determine, based on the 3D model data received from the 3D model data store 940, that the anatomical lumen in which the instrument is located has a range of possible coordinate values. The shape data comparison module 907 may then determine whether the shape data indicates that the instrument is outside a range of possible coordinate values, or whether the shape data indicates that the instrument is shaped in a manner that will not fit in the anatomical lumen.

In yet another example, the shape data comparison module 907 may determine, based on the EM data received from the EM data store 920, a set of coordinate values corresponding to the current position of the instrument in the reference coordinate system. The shape data comparison module 907 may then determine whether the shape data is inconsistent with the expected position of the instrument (e.g., whether the set of coordinate values indicated by the shape data is different from the set of coordinate values indicated by the EM data, or deviates from the set of coordinate values indicated by the EM data by more than a threshold amount).

In yet another example, the fluoroscopic (X-ray) image may be analyzed by a computer vision algorithm to extract the profile of the instrument, and then the shape data comparison module 907 may determine whether the shape data is inconsistent with the extracted profile of the instrument.

In yet another example, different sensing modalities may fit into the working channel 438 and may be connected to and operated with the system. These sensing modalities include radial endobronchial ultrasound (REBUS) probes, multispectral imaging (spectroscopy), and tomographic imaging (optical coherence tomography, confocal microscopy, two-photon excitation microscopy, etc.). Using the sensor data generated by these sensing modalities, the shape data comparison module 907 may determine whether the shape data is inconsistent with the sensor data.

VI.B.4.v. Other examples of shape data comparison

In some embodiments, the shape data comparison module 907 determines that a mismatch between the shape data and the robot data has been detected for more than a threshold amount of time, and outputs a warning indicating that the instrument may be damaged. For example, the shape data comparison module 907 may determine that the last five comparison results output to the shape data adjustment module 908 indicate that the shape data is inconsistent with the robot data and output a warning (e.g., indicating that the instrument may be damaged, stuck, or otherwise malfunctioning).
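A minimal sketch of this persistence check is shown below; the five-result window follows the example in the text, while the class structure and warning mechanism are illustrative assumptions:

```python
# Hypothetical sketch: warn when the last several shape-vs.-robot-data
# comparison results all indicate a mismatch, suggesting the instrument may
# be damaged, stuck, or otherwise malfunctioning.
from collections import deque

class MismatchMonitor:
    def __init__(self, window: int = 5):
        self.results = deque(maxlen=window)  # recent consistency flags

    def record(self, is_consistent: bool) -> bool:
        """Record one comparison result; return True if a warning should fire
        (i.e., the window is full and every result in it was a mismatch)."""
        self.results.append(is_consistent)
        window_full = len(self.results) == self.results.maxlen
        return window_full and not any(self.results)

monitor = MismatchMonitor()
for consistent in [False, False, False, False, False]:
    if monitor.record(consistent):
        print("warning: possible instrument damage or malfunction")
```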

Although not shown in fig. 10, additionally or alternatively, the shape data comparison module 907 may compare the shape data to estimated state data received from the state estimator 980. In some cases, the shape of the sheath may be known (e.g., based on shape sensing using optical fibers within the sheath, or robotic data corresponding to the sheath). In such cases, the shape data comparison module 907 may access shape data corresponding to a sheath surrounding the instrument and compare the shape data of the instrument to the shape data of the sheath.

In some cases, the shape data comparison module 907 determines that the robot data has a higher confidence value than the shape data and, based on the determination, compares the shape data to the robot data. Alternatively, in some cases, the shape data comparison module 907 determines that the robot data has a lower confidence value than the shape data and based on that determination, refrains from comparing the shape data to the robot data.

For example, at or near the distal end of the instrument, the confidence value assigned to the shape data or strain data may be lower than the confidence value assigned to the robot data because, as described above, the strain measurement of the distal portion of the instrument may rely on the shape/position data determined based on the strain measurement of the proximal portion of the instrument (e.g., closer to the shape detector 452), and any error in the strain measurement of the proximal portion may be magnified in the strain measurement of the distal portion. On the other hand, at or near the proximal end of the instrument, the confidence value assigned to the shape data or strain data may be higher than the confidence value assigned to the robot data.

In some embodiments, the shape data is compared to robot data at or near the distal end of the instrument and adjusted as needed, but the shape data is not compared to robot data at or near the proximal end of the instrument. In other embodiments, the shape data is compared to robot data at or near the distal and proximal ends of the instrument and adjusted as needed.

VI.B.4.vi. Adjusting shape data using the comparison results

The shape data comparison module 907 outputs the comparison result to the shape data adjustment module 908. The comparison results may indicate which portion (if any) of the shape data does not satisfy the one or more conditions indicated by the data (e.g., robot data) to which the shape data is compared. For example, as described with reference to fig. 4D, the comparison may indicate that the shape data corresponding to a portion of the endoscope 118 indicates that the portion exhibits a curvature value that is inconsistent with the robot data. In another example, as described with reference to fig. 4E, the comparison may indicate that the shape data corresponding to a portion of the endoscope 118 indicates that the portion is moving at a speed that is inconsistent with the robot data.

In other cases, the comparison result may indicate that the direction of the instrument tip indicated by the shape data deviates from the direction of the instrument tip indicated by the robot data by more than a threshold amount, that the shape data indicates a shape of the instrument such that a portion of the instrument would lie outside the anatomical lumen, or that the position of the instrument indicated by the shape data deviates from the position of the instrument indicated by the robot data by more than a threshold amount. The comparison results may indicate an error or deviation from system expectations based on one or more of data from various sources and/or estimated states from state estimator 980.

Based on the received comparison results, shape data adjustment module 908 adjusts the shape data and outputs the adjusted shape data to shape data store 902. For example, upon determining that the curvature value indicated by the shape data is inconsistent with the robot data, the shape data adjustment module 908 may modify the shape data such that the curvature value is within the acceptable curvature range indicated by the robot data. As another example, upon determining that the current speed indicated by the shape data is inconsistent with the robot data, the shape data adjustment module 908 may modify the shape data such that the current speed is within an acceptable speed range indicated by the robot data. In yet another example, upon determining that some or all of the shape data does not satisfy one or more conditions indicated by the robot data, shape data adjustment module 908 may discard such shape data instead of adjusting the shape data. In yet another example, upon determining that a portion or all of the shape data does not satisfy one or more conditions indicated by the robot data, the shape data adjustment module 908 may decrease the confidence of the shape data (e.g., by decreasing a confidence value associated with the shape data) instead of adjusting the shape data. The adjusted shape data is stored in shape data store 902. In some cases, the adjusted shape data is stored in another data store different from shape data store 902.
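A minimal sketch of the clamp-or-penalize behavior just described, assuming hypothetical scalar curvature and speed values and a RobotDataLimits structure holding the acceptable ranges (all names are illustrative, not part of the described system):

```python
from dataclasses import dataclass

@dataclass
class RobotDataLimits:
    """Hypothetical acceptable ranges implied by the robot data."""
    min_curvature: float
    max_curvature: float
    max_speed: float

def adjust_shape_data(curvature, speed, confidence, limits):
    """Clamp out-of-range strain-derived values to the robot-data range and
    lower the associated confidence when an adjustment was needed."""
    adj_curvature = min(max(curvature, limits.min_curvature), limits.max_curvature)
    adj_speed = min(speed, limits.max_speed)
    if (adj_curvature, adj_speed) != (curvature, speed):
        confidence *= 0.5  # illustrative penalty; the disclosure leaves the policy open
    return adj_curvature, adj_speed, confidence

limits = RobotDataLimits(min_curvature=0.0, max_curvature=0.05, max_speed=2.0)
print(adjust_shape_data(0.09, 1.0, 0.9, limits))  # -> (0.05, 1.0, 0.45)
```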

In some cases, the shape data adjustment module 908 may make alternative or additional adjustments based on other factors. For example, the shape data adjustment module 908 may adjust the shape data (or adjust the confidence of the shape data) based on temperature changes, using the thermal expansion and contraction characteristics of the optical fiber. The shape data adjustment module 908 may make such an adjustment in response to determining that the received comparison result indicates that the shape data is inconsistent with at least one other data source. In other cases, the shape data determination module 906 takes the current temperature into account when determining the shape data based on the received strain data, and the shape data adjustment module 908 does not make additional temperature-based adjustments to the shape data.
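For context, a common first-order model of an FBG's Bragg wavelength shift (standard fiber-sensing physics, not taken from this disclosure) separates the mechanical strain term from the thermal term; a temperature-based adjustment of the kind described above could subtract the thermal contribution before the shape is computed:

```latex
\frac{\Delta\lambda_B}{\lambda_B} = (1 - p_e)\,\varepsilon + (\alpha_\Lambda + \alpha_n)\,\Delta T
```

Here \(\varepsilon\) is the mechanical strain, \(\Delta T\) the temperature change, \(p_e\) the effective photo-elastic coefficient, \(\alpha_\Lambda\) the thermal expansion coefficient, and \(\alpha_n\) the thermo-optic coefficient of the fiber.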

In some embodiments, the shape data adjustment module 908 may adjust the shape data (or adjust the confidence of the shape data) based on whether the tip of the instrument is articulating. Alternatively or in addition, the shape data adjustment module 908 may adjust the shape data (or adjust the confidence of the shape data) based on whether a non-shape-changing strain (e.g., due to temperature, articulation pattern, etc.) is applied to the instrument. The shape data adjustment module 908 may make one or both of these adjustments in response to determining that the received comparison result indicates that the shape data is inconsistent with at least one other data source.

Fig. 11 illustrates an exemplary shape-based state estimation module that may be included in the strain-based algorithm module 945. As shown in fig. 11, the shape-based state estimation module 909 receives the adjusted shape data from the shape data store 902 and determines shape-based estimated state data based on the adjusted shape data and previous estimated state data received from the estimated state data store 985. The shape-based state estimation module 909 outputs the shape-based estimated state data to the estimated state data store 985. The process may be repeated to generate estimated state data for future timestamps. In some cases, the shape-based state estimation module 909 determines an estimated state of a sheath covering the instrument based on the estimated instrument state.
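The disclosure does not fix a particular estimation scheme for module 909; purely as an illustration, a simple blend of the state implied by the adjusted shape data with the previous estimate might look like the following sketch (the smoothing factor alpha and the flat-list state representation are assumptions):

```python
def estimate_state(adjusted_shape, previous_state, alpha=0.7):
    """Blend the state implied by the adjusted shape data with the previous
    estimate. alpha and the list-of-coordinates state representation are
    assumptions; the disclosure does not specify the estimation scheme."""
    return [alpha * s + (1.0 - alpha) * p
            for s, p in zip(adjusted_shape, previous_state)]

# e.g., a state of (x, y, z) tip coordinates in mm
print(estimate_state([10.0, 5.0, 2.0], [9.0, 5.5, 2.2]))  # -> approx. [9.7, 5.15, 2.06]
```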

VII.A. Overview of Shape Data Adjustment Based on Robot Data

Fig. 12 is a flow diagram illustrating an exemplary method operable by a surgical robotic system or components thereof for determining and adjusting shape data based on other data available to the surgical robotic system (such as robot data), according to one embodiment. For example, the steps of the method 1200 shown in fig. 12 may be performed by a processor and/or other components of a medical robotic system (e.g., the surgical robotic system 500) or an associated system (e.g., the strain-based algorithm module 945 of the navigation configuration system 900). For convenience, the method 1200 is described as being performed by a surgical robotic system (also referred to simply as a "system" in connection with the description of the method 1200).

Method 1200 begins at block 1201. At block 1205, the system accesses robot data regarding an instrument navigated (or to be navigated) within the interior region of the body. The robot data may include data related to control of the instrument (e.g., endoscope 118 and/or a sheath thereof) and/or physical movement of a portion of the instrument (e.g., an instrument tip or sheath) within the tubular network. As described above, the robot data may include command data, force and distance data, mechanical model data, kinematic model data, and the like.

At block 1210, the system accesses strain data from an optical fiber positioned within the instrument. The strain data may be indicative of strain on a portion of the instrument positioned within the interior region of the body. In some cases, the strain data is indicative of one or both of strain on the distal end of the instrument and strain on the proximal end of the instrument. The strain data may be generated by the shape detector 452 and stored in the strain data store 901, and the system may access the strain data from the strain data store 901.

At block 1215, the system determines shape data based on the strain data. For example, based on the strain on a particular portion of the instrument indicated by the strain data, the system may predict the shape of that portion of the instrument. The shape data may include angles, coordinates, or a combination thereof that indicate the current shape of the instrument. In some cases, the shape data may include curvature information (e.g., curvature values of one or more portions of the instrument), orientation information (e.g., roll, pitch, and/or yaw of one or more portions of the instrument), position information (e.g., a position of one or more portions of the instrument in a reference coordinate system used by the system to navigate the instrument), and/or other information that may be used to indicate a shape of the instrument.
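To make the strain-to-shape step concrete, here is a minimal planar sketch that integrates per-segment curvature values along the instrument to recover positions. It is illustrative only: a real system would reconstruct in 3D and account for torsion/roll, and the uniform segment discretization is an assumption.

```python
import math

def shape_from_curvature(curvatures, segment_length):
    """Integrate per-segment curvature values (1/radius) along the fiber to
    obtain 2D positions and headings, starting from the origin."""
    x, y, heading = 0.0, 0.0, 0.0
    points = [(x, y)]
    for kappa in curvatures:
        heading += kappa * segment_length  # bend accumulated over the segment
        x += segment_length * math.cos(heading)
        y += segment_length * math.sin(heading)
        points.append((x, y))
    return points

# Example: a quarter-circle of radius 100 mm discretized into 10 segments.
arc = (math.pi / 2) * 100.0
points = shape_from_curvature([0.01] * 10, segment_length=arc / 10)
```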

At block 1220, the system compares the robot data and the shape data. In some embodiments, the comparing comprises determining whether a particular value included in the shape data satisfies a corresponding condition indicated by the robot data. For example, the robot data accessed by the system may indicate that the instrument cannot be controlled in a manner that results in a curvature value greater than a maximum curvature value or outside of a given curvature value range. In such examples, the system may determine whether the curvature value for a given portion of the instrument indicated by the shape data exceeds the maximum curvature value or falls outside the curvature value range indicated by the robot data for that portion. In another example, the robot data accessed by the system may indicate that the instrument cannot move faster than a maximum speed or outside of a particular range of movement. In such examples, the system may determine whether the movement (e.g., velocity, movement path, or other time history data) of a given portion of the instrument indicated by the shape data satisfies the movement conditions (e.g., maximum velocity, movement velocity range, etc.) indicated by the robot data for that portion. In other cases, similar techniques may be applied such that the system may determine whether any parameter value indicated by the shape data satisfies the corresponding shape condition indicated by the robot data (e.g., minimum, maximum, and/or range values beyond which a given parameter value in the shape data is, or may be, erroneous).
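A minimal sketch of the curvature check in block 1220, assuming a hypothetical per-portion curvature list and a single maximum-curvature limit derived from the robot data (both illustrative):

```python
MAX_CURVATURE = 0.05  # hypothetical limit (1/mm) implied by the robot data

def violates_robot_limits(curvature_per_portion, max_curvature=MAX_CURVATURE):
    """Return indices of instrument portions whose strain-derived curvature
    exceeds what the robot data says the instrument can be driven to."""
    return [i for i, kappa in enumerate(curvature_per_portion)
            if abs(kappa) > max_curvature]

# Example: portion 2 would be flagged as inconsistent with the robot data.
print(violates_robot_limits([0.01, 0.02, 0.09, 0.03]))  # -> [2]
```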

At block 1225, the system adjusts the shape data based on the comparison of the robot data and the shape data. In some embodiments, the adjusting includes modifying at least a portion of the shape data such that the determination (at block 1230) of the estimated state of the instrument is based on the modified portion of the shape data. For example, upon determining that the curvature value indicated by the shape data exceeds the maximum curvature value indicated by the robot data, the system may modify the shape data such that the curvature value is less than or equal to the maximum curvature value indicated by the robot data. As another example, upon determining that the current speed indicated by the shape data exceeds the maximum speed indicated by the robot data, the system may modify the shape data such that the current speed is less than or equal to the maximum speed indicated by the robot data. In other embodiments, the adjusting includes removing at least a portion of the shape data such that the determination (at block 1230) of the estimated state of the instrument is not based on the removed portion of the shape data. For example, upon determining that some or all of the shape data does not satisfy one or more conditions indicated by the robot data, the system may discard such shape data or disregard such shape data when determining the estimated state at block 1230.

Adjusting the shape data may also include assigning confidence values or weights to the shape data or adjusting such confidence values or weights assigned to the shape data. For example, the system may increase a confidence value or weight associated with the shape data when it is determined that the shape data satisfies one or more conditions indicated by the robot data. Alternatively, the system may decrease the confidence value or weight associated with the shape data upon determining that the shape data does not satisfy the one or more conditions indicated by the robot data.

At block 1230, the system determines an estimated state of the instrument based on the adjusted shape data. In some cases, the system may determine an estimated state of the instrument based on a combination of the adjusted shape data and data from one or more data stores shown in fig. 8A and/or one or more estimated state data from state estimator 980 in fig. 8A. At block 1235, the system outputs an estimated state of the instrument. The method 1200 ends at block 1240.

VII.B. Shape Data Adjustment Procedure Overview

Fig. 13 is a conceptual diagram illustrating an exemplary method operable by a surgical robotic system or components thereof for operating an instrument, according to one embodiment. For example, the steps of diagram 1300 shown in fig. 13 may be performed by a processor and/or other components of a medical robotic system (e.g., surgical robotic system 500) or an associated system (e.g., the strain-based algorithm module 945 of navigation configuration system 900). For convenience, the process illustrated in diagram 1300 is described as being performed by a surgical robotic system (also referred to simply as a "system").

As shown in fig. 13, the shape data 1305 and the robot data 1310 are fed into a decision block 1315. At block 1315, the system determines from the robot data 1310 whether the shape data 1305 is acceptable. Upon determining that the shape data is not acceptable, the system proceeds to block 1320 and adjusts the shape data. Upon determining that the shape data is acceptable, the system proceeds to block 1325 to drive the instrument based at least on the shape data (or adjusted shape data), and/or proceeds to block 1330 to navigate the instrument based at least on the shape data (or adjusted shape data).

VII.C. Shape Data Confidence Adjustment Procedure Overview

Fig. 14 is a conceptual diagram illustrating an exemplary method operable by a surgical robotic system or components thereof for operating an instrument, according to one embodiment. For example, the steps of diagram 1400 shown in fig. 14 may be performed by a processor and/or other components of a medical robotic system (e.g., surgical robotic system 500) or an associated system (e.g., the strain-based algorithm module 945 of navigation configuration system 900). For convenience, the process illustrated in diagram 1400 is described as being performed by a surgical robotic system (also referred to simply as a "system").

As shown in fig. 14, the system acquires shape data 1405 and robot data 1410 and generates weighted data 1415. The weighted data 1415 may be a weighted sum of the shape data 1405 and the robot data 1410, where the shape data 1405 and the robot data 1410 are weighted based on their respective confidence values. If the confidence value associated with the shape data 1405 is higher than that of the robot data 1410, the shape represented by the weighted data 1415 may be closer to the shape represented by the shape data 1405. On the other hand, if the confidence value associated with the robot data 1410 is higher than that of the shape data 1405, the shape represented by the weighted data 1415 may be closer to the shape represented by the robot data 1410. At block 1420, the system determines from the robot data 1410 whether the shape data 1405 is acceptable. Upon determining that the shape data 1405 is not acceptable, the system proceeds to block 1425 and adjusts the confidence value associated with the shape data 1405. The system then proceeds to block 1430 to drive the instrument based at least on the weighted data reflecting the adjusted confidence value, and/or to block 1435 to navigate the instrument based at least on the weighted data reflecting the adjusted confidence value. On the other hand, upon determining that the shape data 1405 is acceptable, the system proceeds to block 1430 to drive the instrument based at least on the weighted data 1415, and/or to block 1435 to navigate the instrument based at least on the weighted data 1415.
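A minimal sketch of the confidence-weighted combination that could produce the weighted data 1415, assuming both shapes are sampled as matching lists of 2D points and the confidences are scalars (all names are illustrative):

```python
def fuse_shapes(shape_strain, shape_robot, conf_strain, conf_robot):
    """Per-point weighted average of the strain-based and robot-data-based
    shape estimates, weighted by their respective confidence values."""
    total = conf_strain + conf_robot
    w_strain, w_robot = conf_strain / total, conf_robot / total
    return [(w_strain * s[0] + w_robot * r[0],
             w_strain * s[1] + w_robot * r[1])
            for s, r in zip(shape_strain, shape_robot)]

fused = fuse_shapes([(0.0, 0.0), (1.0, 1.0)],
                    [(0.0, 0.0), (1.2, 0.8)],
                    conf_strain=0.75, conf_robot=0.25)
print(fused)  # -> [(0.0, 0.0), (1.05, 0.95)]: pulled toward the higher-confidence shape
```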

As just described with reference to fig. 14, various embodiments described herein can adjust the confidence value to improve navigation or control of the medical instrument. For example, in some cases, the navigation system may use the adjusted confidence value to reduce the weight assigned to the strain-based shape data when determining the position of the medical instrument relative to the patient anatomy, as may be represented by a pre-operative model of the patient. As described elsewhere in this disclosure, a navigation system (see, e.g., fig. 8A) may receive multiple different state estimates of a medical instrument from corresponding state estimators, and according to the embodiment shown in fig. 14, the navigation system may reduce the weight given to a state derived from strain-based shape data based on the comparison. Depending on the adjusted confidence, the navigation system may ignore state estimates derived from the strain-based shape data, or may reduce their effect on determining the estimated state of the medical instrument.

It should be understood that the reverse is also possible in embodiments contemplated by the present disclosure. That is, if a comparison between the strain-based shape data and the robot-data-based shape data determines that the two types of data closely match (e.g., agree to within a threshold amount), the navigation system may increase the confidence or weight of the state estimate derived from the strain-based shape data.

Similarly, as depicted in fig. 14, some embodiments may include a control system that controls the actuation of the medical instrument based on a comparison between the strain-based shape data and the robot data. Such a control system may use or ignore the strain-based shape data, based on the comparison, when controlling the pose of the medical instrument.

Although fig. 4D, 4E, and 9-14 are described herein with respect to robot data, in other embodiments, other data may be used instead of or in combination with robot data. Additionally, while some of the techniques described herein are described with reference to a surgical robotic system, in other embodiments, such techniques may be applied to non-surgical systems, such as medical robotic systems and systems for controlling instruments within internal regions of the body that do not involve surgery.

VIII. Implementing Systems and Terminology

Implementations disclosed herein provide systems, methods, and devices for combining strain-based shape sensing with catheter control during navigation of a luminal network.

It should be noted that, as used herein, the terms "couple," "coupling," "coupled," or other variations of the word "couple" may indicate either an indirect connection or a direct connection. For example, if a first element is "coupled" to a second element, the first element can be indirectly connected to the second element via another element or directly connected to the second element.

The functions described herein may be stored as one or more instructions on a processor-readable or computer-readable medium. The term "computer-readable medium" refers to any available medium that can be accessed by a computer or processor. By way of example, and not limitation, such media can include Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory, compact disc read only memory (CD-ROM) or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. It should be noted that computer-readable media may be tangible and non-transitory. As used herein, the term "code" may refer to software, instructions, code or data that is executable by a computing device or processor.

The methods disclosed herein comprise one or more steps or actions for achieving the described method. The method steps and/or actions may be interchanged with one another without departing from the scope of the claims. In other words, unless a specific order of steps or actions is required for proper operation of the method being described, the order and/or use of specific steps and/or actions may be modified without departing from the scope of the claims.

The term "plurality", as used herein, means two or more. For example, a plurality of components indicates two or more components. The term "determining" encompasses a variety of actions, and thus "determining" can include calculating, computing, processing, deriving, investigating, looking up (e.g., looking up in a table, a database or another data structure), ascertaining and the like. Additionally, "determining" may include receiving (e.g., receiving information), accessing (e.g., accessing data in a memory), and the like. Additionally, "determining" may include resolving, selecting, choosing, establishing, and the like.

The phrase "based on" does not mean "based only on," unless expressly specified otherwise. In other words, the phrase "based on" describes that "is based only on" and "is based at least on" both.

The previous description of the disclosed implementations is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these implementations will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other implementations without departing from the scope of the invention. For example, it should be understood that one of ordinary skill in the art will be able to employ a number of corresponding alternative and equivalent structural details, such as equivalent ways of fastening, mounting, coupling, or engaging tool components, equivalent mechanisms for generating specific actuation motions, and equivalent mechanisms for delivering electrical energy. Thus, the present invention is not intended to be limited to the implementations shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
