Ultrasound imaging apparatus, treatment support system, and image display method

Document No.: 1910839    Publication date: 2021-12-03

Reading note: This technology, "Ultrasound imaging apparatus, treatment support system, and image display method," was designed and created by 竹岛启纯, 田中智彦, and 今井亮 on 2021-03-12. Its main content is as follows.

The invention provides an ultrasonic imaging apparatus, a treatment support system, and an image display method. Provided is an ultrasonic imaging apparatus capable of grasping the three-dimensional positional relationship between a guide wire and an imaging target such as a blood vessel using a linear array probe. The ultrasonic imaging apparatus includes: an ultrasonic probe that irradiates an object with ultrasonic waves, receives the reflected waves, and also receives ultrasonic waves from a beacon inserted inside the object; a probe position acquisition unit that acquires the three-dimensional position and orientation of the ultrasonic probe; a beacon position acquisition unit that determines the three-dimensional position of the beacon from the relative position or relative speed of the beacon with respect to the ultrasonic probe, obtained from an ultrasonic image based on the ultrasonic waves received by the probe, and from the three-dimensional position and orientation of the probe acquired by the probe position acquisition unit; and a display image forming unit that forms the image shown on the display unit using an ultrasonic image based on the ultrasonic waves received by the ultrasonic probe.

1. An ultrasonic imaging apparatus is characterized by comprising:

an ultrasonic probe that irradiates an object with ultrasonic waves and receives reflected waves of the ultrasonic waves, and receives the ultrasonic waves from a beacon inserted inside the object;

a probe position acquisition unit that acquires a three-dimensional position and orientation of the ultrasonic probe;

a beacon position acquiring unit that acquires a three-dimensional position of the beacon based on a relative position or a relative speed of the beacon with respect to the ultrasonic probe, which is obtained from an ultrasonic image based on ultrasonic waves received by the ultrasonic probe, and the three-dimensional position and the orientation of the ultrasonic probe, which are acquired by the probe position acquiring unit; and

a display image forming unit that forms an image displayed on a display unit using an ultrasonic image based on the ultrasonic waves received by the ultrasonic probe.

2. The ultrasonic imaging apparatus according to claim 1,

the ultrasonic probe is a linear array probe.

3. The ultrasonic imaging apparatus according to claim 2,

wherein the beacon is inserted into a blood vessel of the subject and moved along the blood vessel, and

the beacon position acquiring unit acquires the relative position of the beacon with respect to the ultrasonic probe based on at least one of a change in the position of the beacon and the ultrasonic probe in the imaging plane of the ultrasonic probe and a change in the rotational position of the imaging plane of the ultrasonic probe with respect to the beacon.

4. The ultrasonic imaging apparatus according to claim 3,

the beacon position acquiring unit performs filtering processing to reduce an error when acquiring a relative position of the beacon with respect to the ultrasonic probe.

5. The ultrasonic imaging apparatus according to claim 4,

the beacon position acquisition unit performs filtering processing by modeling an error when determining the relative position of the beacon with respect to the ultrasonic probe.

6. The ultrasonic imaging apparatus according to claim 5,

the beacon position acquisition unit performs the filtering processing by estimating the statistically most probable state from an error model, the position and positional change of the beacon with respect to the ultrasonic probe, and the position and positional change of the ultrasonic probe.

7. The ultrasonic imaging apparatus according to claim 4,

the ultrasonic imaging apparatus further comprises a robot arm that operates the ultrasonic probe, and

the probe position acquisition unit uses the operation position information of the robot arm as the three-dimensional position of the ultrasonic probe.

8. The ultrasonic imaging apparatus according to claim 7,

the robot arm tracks the beacon so that the beacon is recognized in a captured image of a short-axis (circular) cross section of the blood vessel captured with the ultrasonic probe.

9. The ultrasonic imaging apparatus according to claim 7,

the robot arm tracks the beacon so that the beacon is recognized in a captured image of a long-axis cross section of the blood vessel captured with the ultrasonic probe.

10. The ultrasonic imaging apparatus according to claim 7,

the robot arm tracks the beacon so that the beacon position acquisition unit obtains the relative position of the beacon with respect to the ultrasonic probe from at least one of a change in the position of the beacon and the ultrasonic probe in the imaging plane of the ultrasonic probe and a change in the rotational position of the imaging plane of the ultrasonic probe with respect to the beacon.

11. The ultrasonic imaging apparatus according to claim 2,

the display image forming unit forms a display image based on the anatomical structure information of the subject acquired in advance and the three-dimensional position of the beacon obtained by the beacon position acquiring unit.

12. The ultrasonic imaging apparatus according to claim 11,

the display image forming unit forms the display image while taking into account deformation of the living body.

13. A treatment support system characterized by comprising:

the ultrasonic imaging apparatus according to any one of claims 1 to 12; and

a guide wire having a beacon at its distal end.

14. An image display method for an ultrasonic imaging apparatus comprising a linear array probe that irradiates an object with ultrasonic waves, receives the reflected waves, and receives ultrasonic waves from a beacon inserted inside the object, the image display method being characterized by:

acquiring the three-dimensional position and orientation of the linear array probe,

determining a relative position of the beacon with respect to the linear array probe from the ultrasound image received in the linear array probe,

deriving the three-dimensional position of the beacon from the relative position of the beacon and the three-dimensional position and orientation of the linear array probe,

controlling the position of the linear array probe based on the three-dimensional position of the beacon, and

displaying an ultrasound image based on ultrasound received with the linear array probe.

Technical Field

The present invention relates to an ultrasound imaging apparatus, a treatment support system, and an image display method for capturing an ultrasound image while a guide wire carrying an ultrasound generation source, such as a photoacoustic source, is inserted into the body.

Background

Catheter treatment is a surgical method that places less burden on the patient than open surgery such as a thoracotomy, and is therefore widely used, mainly for the treatment of vascular stenosis. In catheter treatment, it is important to grasp the relationship between the region to be treated and the catheter, and X-ray fluoroscopy is used as an imaging method to assist this. Further, as in patent document 1, there are attempts to use an ultrasonic image as the auxiliary image instead of an X-ray fluoroscopic image.

In detail, patent document 1 discloses the following technique: for ultrasonic waves generated by an ultrasonic wave generation source mounted on a guide wire, the position of the guide wire tip is estimated using the differences in arrival time of those ultrasonic waves at the elements of the array constituting the ultrasonic probe, or using the image of the generation source, which depends on its distance from the imaging region; the relative positional relationship between the imaging result and the guide wire tip is then grasped using this estimate.

Documents of the prior art

Patent document

Patent document 1: JP patent publication 2019-213680

According to the above conventional technique, the position of the tip of the insert (guide wire) within the imaging region can be estimated. However, to grasp the three-dimensional positional relationship between a living-body structure to be imaged, such as a blood vessel, and the distal end of the insert, a two-dimensional array probe, in which the elements constituting the ultrasonic probe are arranged in a matrix, is required.

On the other hand, while the positional relationship of the catheter or the like relative to the imaging region can be grasped, its absolute position cannot be detected; for example, the position cannot be registered (position-matched) with information on the living body obtained by other imaging methods.

Disclosure of Invention

An object of the present invention is to provide an ultrasound imaging apparatus, a treatment support system, and an image display method that enable grasping of a three-dimensional positional relationship between a guide wire and an imaging target such as a blood vessel using a linear array probe.

In order to solve the above problem, an ultrasonic imaging apparatus according to the present invention includes: an ultrasonic probe that irradiates an object with ultrasonic waves and receives reflected waves of the ultrasonic waves, and receives the ultrasonic waves from a beacon inserted inside the object; a probe position acquisition unit that acquires a three-dimensional position and orientation of the ultrasonic probe; a beacon position acquiring unit that acquires a three-dimensional position of the beacon based on a relative position or a relative speed of the beacon with respect to the ultrasonic probe, which is obtained from an ultrasonic image based on ultrasonic waves received by the ultrasonic probe, and the three-dimensional position and the orientation of the ultrasonic probe, which are acquired by the probe position acquiring unit; and a display image forming unit that forms an image displayed on the display unit using an ultrasonic image based on the ultrasonic waves received by the ultrasonic probe.

ADVANTAGEOUS EFFECTS OF INVENTION

According to the present invention, the linear array probe can be used to show the operator the three-dimensional positional relationship between the guide wire and the imaging target such as the blood vessel.

Drawings

Fig. 1 is a diagram illustrating an overall configuration of a medical support system according to an embodiment.

Fig. 2 is a diagram illustrating the appearance of the distal end portion of the guide wire.

Fig. 3 is a block diagram of the ultrasonic imaging apparatus.

Fig. 4 is a flowchart of the processing of the ultrasonic imaging apparatus.

Fig. 5 is a diagram illustrating a coordinate system of the ultrasonic probe.

Fig. 6 is a diagram illustrating a method of detecting the absolute position of the PA signal generation source from the positional changes of the ultrasonic probe and the PA signal generation source.

Fig. 7 is a diagram illustrating a method of detecting the absolute position of the PA signal generation source from a change in the rotational position of the imaging surface of the ultrasonic probe.

Fig. 8 is a flowchart of a process of detecting the absolute position of the PA signal generation source using the velocity of the ultrasonic probe.

Fig. 9 is a block diagram of an ultrasonic imaging apparatus having another configuration.

Fig. 10 is another processing flowchart of the ultrasonic imaging apparatus.

Fig. 11 is a diagram showing a display example of the display unit.

Fig. 12 is a diagram showing another display example of the display unit.

Fig. 13A is a diagram showing an example of a movement plan of the robot arm of the operation planning unit.

Fig. 13B is a diagram showing an example of an ultrasonic image of the ultrasonic probe.

Fig. 14A is a diagram showing another example of the movement plan of the robot arm of the operation planning unit.

Fig. 14B is a diagram showing another example of an ultrasonic image of the ultrasonic probe.

Description of reference numerals

10 ultrasonic wave generating apparatus

11 guide wire (organism inserting apparatus)

12 optical fiber

13 PA Signal Generation Source (photoacoustic ultrasound Generation Source) (Beacon)

20 ultrasonic probe

30 ultrasonic imaging unit

31 transmitting part

32 receiving part

33 input unit

34 display part

35 memory

40 control part

41 Signal processing part

411 reflected ultrasonic signal processing unit

412 PA Signal analysis section (ultrasonic Signal analysis section)

42 PA sound source position detecting unit

421 relative position detecting part

422 relative speed measuring part

423 position filter

424 absolute position detecting unit (beacon position acquiring unit)

425 Probe speed measuring part (Probe position acquiring part)

43 display image forming part

44 3D anatomical information

45 operation planning part

51 displaying an image

52 display image

80 test object

90 mechanical arm

91 Probe position/attitude sensor (Probe position acquisition part)

100 medical assistance system

Detailed Description

Embodiments of the present invention are described in detail below with reference to the accompanying drawings.

Fig. 1 is a diagram illustrating an overall configuration of an ultrasonic imaging apparatus according to the present embodiment and a catheter treatment support system (hereinafter, referred to as a medical support system) using the same.

Fig. 2 is a diagram illustrating the shape of the distal end portion of the guide wire.

As shown in fig. 1, the medical support system 100 includes: a biological insertion instrument 11 (guide wire) on which an ultrasonic wave generation device 10, comprising an ultrasonic wave generation source 13 (beacon) and a light generation unit 15, is mounted; an ultrasonic probe 20 (probe); a robot arm 90; an ultrasound imaging unit 30 that acquires an ultrasound image of the subject 80 into which the biological insertion instrument 11 is inserted; and its display unit 34.

The biological insertion instrument 11 is, for example, a therapeutic instrument such as a balloon catheter, a microcatheter, or a nutrition catheter, or an elongated tubular medical instrument such as a guide wire for delivering the therapeutic instrument to a target site. The following describes a case where the biological insertion instrument is a guide wire.

The ultrasonic wave generating apparatus 10 including the PA signal generating source 13 that generates an ultrasonic wave signal by a Photoacoustic (PA) effect is described below, but an ultrasonic wave may be generated by a piezoelectric element.

As shown in fig. 1 and 2, the ultrasonic wave generating device 10 includes: an optical fiber 12 disposed in the hollow portion of the flexible hollow guide wire 11; a photoacoustic ultrasonic wave generation source 13 fixed to the insertion-side end face of the optical fiber 12; and a light generation unit 15, connected to the other end of the optical fiber 12 (the end opposite the one to which the photoacoustic ultrasonic wave generation source 13 is fixed), that generates laser light. The optical fiber 12 functions as a light guide member that guides the laser light generated by the light generation unit 15 to the photoacoustic ultrasonic wave generation source 13 at the distal end. The ultrasonic wave generating device 10 together with the hollow guide wire 11 is referred to as a photoacoustic-source-mounted wire.

As shown in fig. 2, the ultrasonic probe 20 detects ultrasonic waves emitted from a photoacoustic ultrasonic wave generation source 13 (hereinafter referred to as a PA signal generation source 13) at the distal end of the guide wire 11 inserted into the blood vessel 82. Then, the ultrasonic imaging unit 30 superimposes the detected image of the PA signal generation source 13 on the cross-sectional images of the subject 80 and the blood vessel 82 formed based on the ultrasonic waves emitted from the ultrasonic probe 20.

This makes it possible to grasp the positions of the guide wire 11 in the subject 80 and the blood vessel 82.

The PA signal generating source 13 is made of a material that absorbs laser light and thermally expands to emit ultrasonic waves (PA signals), for example a known dye (photosensitizer), metal nanoparticles, or a carbon-based compound. The tip of the optical fiber 12, including the PA signal generating source 13, is covered with a resin sealing member. In fig. 2 the PA signal generating source 13 is positioned at the distal end of the guide wire 11, but it is not limited to the wire tip.

Next, details of the ultrasonic imaging unit 30 constituting the ultrasonic imaging apparatus according to the embodiment will be described. Fig. 3 is a block diagram of an ultrasonic imaging apparatus included in the ultrasonic imaging unit 30.

The ultrasonic imaging unit 30 includes: a control unit 40, described later; a transmission unit 31 that transmits an ultrasonic signal to the ultrasonic probe 20; a receiving unit 32 that receives the reflected waves (RF signals) detected by the ultrasonic probe 20 and performs processing such as delay-and-sum beamforming; an input unit 33 with which the user inputs the conditions and instructions required for imaging; a display unit 34 that displays the ultrasonic images acquired by the ultrasonic imaging unit 30, a GUI (Graphical User Interface), and the like; and a memory 35 that stores the processing results, the display images formed from them by the display image forming unit 43, and the like.

The control unit 40 includes: a signal processing unit 41 that processes an ultrasonic image (including a reflected ultrasonic signal and a PA signal) received by the ultrasonic probe 20; a PA sound source position detection unit 42 that detects a sound source position of the PA signal from the signal processed by the signal processing unit 41; a display image forming unit 43 for forming an image to be displayed on the display unit 34 using the PA sound source position detected by the PA sound source position detecting unit 42 and the previously acquired 3D anatomical information 44 of the subject; and an operation planning unit 45 for determining the operation position of the ultrasonic probe 20 by using the sound source position detected by the PA sound source position detecting unit 42 and the previously acquired 3D anatomical information 44 of the subject.

The 3D anatomical information 44 may be a 3D volume image of the subject acquired by CT (Computed Tomography) or MRI (Magnetic Resonance Imaging), or a 3D volume image acquired by a sweep of an ultrasonic imaging apparatus.

The signal processing unit 41 includes: a reflected ultrasonic signal processing unit 411 that creates an ultrasonic image, such as a B-mode image, from the RF signals received as reflected waves by the receiving unit 32; and an ultrasonic signal analyzing unit (PA signal analyzing unit) 412 that, based on the emission timing of the laser light from the light generating unit 15, detects and processes the PA signal emitted from the PA signal generating source 13 and detected by each transducer element of the ultrasonic probe 20.

The control unit 40 has the same configuration as a general ultrasonic imaging apparatus except that a PA signal analyzing unit 412 that receives a PA signal and a PA sound source position detecting unit 42 that detects the generation position of the PA signal are added.

The PA sound source position detection unit 42 includes: a relative position detection unit 421 that estimates the position of the PA signal generation source 13 within the imaging region from the PA signal analyzed by the PA signal analysis unit 412; a relative velocity measuring unit 422 that differentiates the position of the PA signal generating source 13 detected by the relative position detecting unit 421 to derive a velocity; a probe speed measuring unit 425 that measures the speed of the ultrasonic probe 20 from the operation position information of the robot arm 90; an absolute position detection unit 424 that detects the absolute position of the PA signal generation source 13 from the speed measured by the relative velocity measurement unit 422 and the probe speed measured by the probe speed measuring unit 425; and a position filter 423 that reduces errors by filtering the absolute position detected by the absolute position detecting unit 424.

The position filter 423 reduces the error contained in the detection result of the absolute position detecting unit 424 by an arbitrary filtering process. For example, if the PA signal generating source 13 moves at low speed, the position detection error can be reduced by smoothing with a filter such as a moving average filter or an LPF (Low-Pass Filter).
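As a concrete illustration of this kind of smoothing, the sketch below applies a moving-average filter to successive 3-D position estimates. This is illustrative Python, not code from the patent; the window length and function name are assumed parameters.

```python
from collections import deque

def make_moving_average_filter(window=5):
    """Return a smoother for successive 3-D position samples.

    A moving average acts as a low-pass filter: high-frequency detection
    noise is attenuated, while a slowly moving source (the assumption in
    the text above) is tracked with little lag.
    """
    history = deque(maxlen=window)  # keeps only the last `window` samples

    def smooth(position):
        history.append(position)
        n = len(history)
        # Average each coordinate over the samples currently in the window.
        return tuple(sum(p[i] for p in history) / n for i in range(3))

    return smooth
```

With a window of 3, feeding in (0, 0, 0), (3, 0, 0), (6, 0, 0) yields the mean (3, 0, 0) for the last sample.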

The position filter 423 may also be a filter that takes into account an error model of the detection errors that occur in principle in the position detection method of the PA sound source position detecting unit 42. Specifically, a Kalman filter, a particle filter, or the like can be used, which estimates the statistically most probable state by comprehensively considering the position and velocity of the PA signal generation source 13 detected by the PA sound source position detection section 42, the position, velocity, and attitude of the ultrasonic probe 20, and the error model of the PA sound source position detection section 42.
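A minimal one-dimensional constant-velocity Kalman filter, sketched below, illustrates the idea of estimating the statistically most probable state from a motion model and a measurement-noise model. The state model, the noise parameters q and r, and the function name are assumptions for illustration, not the patent's implementation.

```python
import numpy as np

def kalman_step(x, P, z, dt, q=1e-3, r=0.25):
    """One predict/update cycle of a 1-D constant-velocity Kalman filter.

    x : state vector [position, velocity]
    P : 2x2 state covariance
    z : noisy position measurement
    q, r : assumed process and measurement noise magnitudes
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity motion model
    H = np.array([[1.0, 0.0]])              # only position is measured
    Q = q * np.eye(2)
    # Predict: propagate the state and its uncertainty forward by dt.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: blend the prediction with the measurement via the Kalman gain.
    S = H @ P @ H.T + r
    K = P @ H.T / S
    x = x + (K * (z - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P
```

Iterating this step with successive position measurements drives the estimate toward the true position while the velocity component absorbs consistent motion.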

Details of other configurations in the PA sound source position detection unit 42 will be described later.

Part or all of the functions of the control unit 40 can be realized by executing, on a computer having a CPU, a GPU, and memory, software that implements those functions. Alternatively, some or all of the functions may be realized by hardware such as electronic circuits, ASICs, or FPGAs. The control unit 40 may be implemented on a single computer, or its functions may be distributed across multiple computers.

The ultrasonic probe 20 is a 1D array probe (linear array probe) in which a large number of transducer elements are arranged in one dimension. In addition, various other ultrasonic probes 20 can be used, such as a 1.5D array probe, which adds element rows in the direction orthogonal to the array direction of the 1D array probe, and a 2D array probe, which has a large number of elements arranged in two dimensions. The signal processing unit 41 adopts an analysis method suited to the type of ultrasonic probe 20 used.

Next, the operation of the medical support system 100 according to the embodiment will be described.

Here, the ultrasonic imaging apparatus performs ultrasonic imaging of the subject 80 using the 1D array ultrasonic probe 20 while the operator inserts into the subject's body a guide wire 11 (for example, via a guide catheter) having the PA signal generation source 13 at its distal end, and the apparatus monitors the distal end position of the guide wire 11 based on the PA signal. The case where the ultrasonic imaging apparatus operates the robot arm 90 so that the ultrasonic probe 20 tracks the tip of the guide wire is described below.

Fig. 4 is a flowchart of the processing of the ultrasonic imaging apparatus.

In step S401, the ultrasound imaging apparatus determines whether or not the tracking assist operation of the ultrasound probe 20 is enabled; if not (no in S401), the processing ends. If enabled (yes in S401), the process proceeds to step S402.

In step S402, the ultrasound imaging apparatus performs reflected ultrasound imaging (hereinafter referred to as an imaging mode) by the ultrasound probe 20. In this imaging mode, ultrasonic imaging similar to that of a conventional ultrasonic imaging apparatus is performed.

Specifically, the transmission unit 31 causes the ultrasonic probe 20 to transmit ultrasonic waves, and the ultrasonic probe 20 receives the waves reflected from the tissue inside the subject. The receiving unit 32 applies processing such as delay-and-sum beamforming to the signals received from the ultrasonic probe 20 for each frame and sends them to the signal processing unit 41. The reflected ultrasonic signal processing unit 411 creates an ultrasonic image, for example a B-mode image, from the frame signal and supplies it to the display image forming unit 43, which forms the image to be displayed on the display unit 34.

In this imaging mode, when the ultrasonic probe 20 is a 1D array probe, the intensity of the reflected waves is obtained along the array direction and the depth direction, so a two-dimensional map of reflected-wave intensity is obtained.

When a 2D array probe is used as the ultrasonic probe 20, a single execution of the imaging mode yields three-dimensional reflected-wave intensity information combining the probe surface and the depth direction.

In step S403, the ultrasonic imaging apparatus receives the PA signal with the ultrasonic probe 20 (hereinafter, the PA reception mode). In the PA reception mode, the PA signal from the PA signal generation source 13 is monitored while the catheter is inserted into the subject's body, for example into a blood vessel.

Specifically, in the PA reception mode, the operation of the transmission unit 31 is temporarily stopped, the light generation unit 15 is operated, and the pulsed laser light is emitted from the light generation unit 15. The light emitted from the light generating unit 15 is irradiated to the PA signal generating source 13 via the optical fiber 12 of the guide wire 11 inserted into the body. The PA signal (ultrasonic wave) is generated from the photoacoustic material constituting the PA signal generation source 13 by the irradiation light, and is detected by each element of the ultrasonic probe 20.

The PA signal analyzing unit 412 creates signal data synchronized with the light irradiation from the light generating unit 15 based on the PA signal received by the ultrasonic probe 20, and supplies the signal data to the relative position detecting unit 421 of the PA sound source position detecting unit 42. To synchronize the received PA signals, the timing of light irradiation may be obtained from a trigger signal output to the PA signal analyzing unit 412 when the light generating unit 15 emits light, or it may be estimated from the PA signals received by the individual elements of the ultrasonic probe 20.

In the case of using an ultrasonic wave generating device 10 that generates ultrasonic waves with a piezoelectric element, the signal generation timing can be estimated by transmitting ultrasonic waves from the transmitting unit 31; alternatively, synchronization can be achieved by a trigger signal or an external signal source, as when using the PA signal generating source.

In step S404, the PA sound source position detection unit 42 detects the position of the PA signal generation source 13 from the PA signal delivered from the PA signal analysis unit 412 and the information on the absolute (three-dimensional) position and orientation of the ultrasonic probe 20 delivered from the robot arm 90.

Specifically, in the PA sound source position detection unit 42, first, the relative position detection unit 421 detects the relative position of the PA signal generation source 13 with respect to the ultrasonic probe 20 from the PA signal delivered from the PA signal analysis unit 412.

Next, the relative velocity measuring unit 422 detects the relative velocity from the temporal change in the detected relative position, and the probe speed measuring unit 425 detects the velocity and angular velocity of the ultrasonic probe 20 from the temporal change in the absolute position and orientation information delivered from the robot arm 90.

The relative velocity measuring unit 422 may measure the relative velocity using the doppler effect generated in the received PA signal.

Then, the absolute position detection section 424 detects the absolute position of the PA signal generation source 13 from these values.
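The differencing and combination steps above can be sketched as follows. This is illustrative Python; the function names and the use of a 3x3 rotation matrix to represent the probe orientation are assumptions, not the patent's stated representation.

```python
import numpy as np

def finite_difference_velocity(p_prev, p_curr, dt):
    """Approximate a velocity vector as (change in position) / dt,
    as done for both the beacon's relative position and the probe pose."""
    return (np.asarray(p_curr, dtype=float) - np.asarray(p_prev, dtype=float)) / dt

def beacon_world_position(probe_pos, probe_rot, beacon_rel):
    """Absolute beacon position: the probe's world position plus the
    probe-frame offset of the beacon rotated by the probe's orientation."""
    return (np.asarray(probe_pos, dtype=float)
            + np.asarray(probe_rot, dtype=float) @ np.asarray(beacon_rel, dtype=float))
```

For example, a beacon 2 units deep below a probe at (1, 1, 1) with identity orientation sits at world position (1, 1, 3).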

The detected absolute position of the PA signal generating source 13 is filtered by the position filter 423 to reduce the detection error.

In step S405, the display image forming unit 43 forms display contents and displays the display contents on the display unit 34.

Specifically, the display image forming unit 43 forms a display image that lets the operator grasp the three-dimensional positional relationship of the PA signal generation source 13, using the reflected ultrasonic image formed by the reflected ultrasonic signal processing unit 411, the position of the PA signal generation source 13 detected by the PA sound source position detecting unit 42, and the previously acquired 3D volume image of the subject 80's anatomy recorded in the 3D anatomical information 44, and displays the image on the display unit 34.

A specific display example of the display unit 34 will be described with reference to fig. 11 and 12.

In step S406, the operation planning unit 45 plans the operation of the robot arm 90 using the acquired 3D anatomical information 44 of the subject 80, the probe position and coordinates obtained from the robot arm 90, and the reflected ultrasonic image and PA sound source position obtained by the ultrasonic imaging apparatus, and instructs the robot arm 90 of the operation position.

For example, the operation planning unit 45 drives the robot arm 90 so that the PA signal generation source 13 stays at a predetermined position in the reflected ultrasonic image of step S402, thereby tracking the PA signal generation source 13. The ultrasonic probe 20 can thus continuously receive the PA signal (ultrasonic waves) generated by the PA signal generation source 13.
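One simple way to realize such tracking is a proportional controller on the beacon's position in the image. The sketch below is an assumed illustration, not the operation planning unit's actual algorithm; the function name and gain are hypothetical.

```python
def tracking_command(beacon_img_xy, target_img_xy, gain=0.5):
    """Proportional command that nudges the probe so the PA source stays
    at a predetermined point of the reflected ultrasonic image.

    Returns an in-plane displacement for the robot arm. A real system
    would also limit speed and enforce contact-force constraints.
    """
    ex = target_img_xy[0] - beacon_img_xy[0]  # image-plane error, x
    ey = target_img_xy[1] - beacon_img_xy[1]  # image-plane error, y
    return (gain * ex, gain * ey)
```

For a beacon drifting to image coordinate (2, 0) with target (0, 0) and gain 0.5, the command is (-1.0, 0.0), moving the probe back toward the source.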

A more specific example of tracking the ultrasonic probe 20 will be described later with reference to fig. 13A and 14A.

Then, the process returns to step S401, and steps S402 to S406 are repeated.

The interleaving of the imaging mode (step S402) and the PA reception mode (step S403) is not limited to the flowchart of fig. 4. For example, the apparatus may repeat a cycle of four executions of the imaging mode followed by one execution of the PA reception mode, or alternate one execution of each.
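The interleaving described here can be expressed as a small scheduling helper. This is illustrative Python; the 'B'/'PA' labels and the function name are assumptions.

```python
def mode_schedule(n_frames, imaging_per_pa=4):
    """Build an interleaved acquisition schedule: `imaging_per_pa` imaging
    ('B') frames followed by one PA-reception ('PA') frame, repeated.

    imaging_per_pa=4 reproduces the 4:1 example above; 1 gives strict
    alternation of the two modes.
    """
    schedule = []
    while len(schedule) < n_frames:
        schedule.extend(['B'] * imaging_per_pa + ['PA'])
    return schedule[:n_frames]
```

Calling `mode_schedule(10, imaging_per_pa=4)` yields four 'B' frames, one 'PA' frame, and then the pattern repeats.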

The processing of the absolute position detection unit 424 in step S404 in fig. 4 will be described in detail later.

Fig. 5 is a diagram for explaining the coordinate system of the ultrasonic probe 20 used in the following description. The coordinate system has: a major axis 22, the array direction of the acoustic element array 21 of the ultrasonic probe 20 (a 1D array probe); a minor axis 23, the axis parallel to the array receiving surface and orthogonal to the major axis 22; and a depth axis 24, the axis orthogonal to the array receiving surface. The origin of the coordinate system is the center of the surface of the acoustic element array 21 in the major-axis and minor-axis directions.

When the ultrasonic probe 20 is a 2D array probe having a large number of arrays arranged in two-dimensional directions, the orientations of the major axis 22 and the minor axis 23 can be arbitrarily determined.

First, a method of detecting the absolute position of the PA signal generation source 13 when the ultrasonic probe 20 moves in the direction of the minor axis 23, that is, from the change in the relative position (translational movement) between the PA signal generation source 13 and the ultrasonic probe 20, will be described with reference to fig. 6.

As shown in fig. 6, when the ultrasonic probe 20, a 1D array probe, is translated in the direction of the minor axis 23 (translation 711) from 712B to 712A, the absolute position of the ultrasonic probe 20 at each of 712A and 712B can be obtained from the robot arm 90. Further, since the PA signal analyzing unit 412 can measure the arrival time of the PA signal at the ultrasonic probe 20 with reference to the light emission of the light generating unit 15, the distances 713A and 713B from the PA signal generation source 13 to each of 712A and 712B can be obtained.

Therefore, since the position 715 of the PA signal generation source 13 is the intersection of the arc 714A of radius 713A and the arc 714B of radius 713B, its absolute position can be calculated from the absolute position of the ultrasonic probe 20 at each of 712A and 712B and the distances 713A and 713B to the PA signal generation source 13.

Specifically, assuming that the arrival times of the PA signal before and after the movement are tb and ta, respectively, the distances 713A (la) and 713B (lb) to the PA signal generation source 13 are obtained by equation (1).

[ mathematical formula 1 ]

lb=c·tb,la=c·ta (1)

Where c is the speed of sound.

Then, when the ultrasonic probe 20 is translated from 712B to 712A at the velocity v over the minute time Δt, the position 715 (yPA, zPA) of the PA signal generation source 13 is obtained by equation (2).

[ mathematical formula 2 ]

yPA = (lb² − la² − (v·Δt)²)/(2·v·Δt), zPA = √(la² − yPA²) (2)

Where y is the relative position in the direction of the minor axis 23, z is the relative position in the direction of the depth axis 24, and the minute time Δt is taken to be sufficiently short, so that approximately la ≈ lb.
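The computation of equation (1) and the arc intersection of fig. 6 can be sketched as follows; the algebraic form is reconstructed from the geometry (two circles of radii la and lb whose centers are separated by v·Δt along the minor axis), and the speed-of-sound constant is an assumed example value:

```python
import math

C = 1540.0  # assumed speed of sound in soft tissue [m/s]

def pa_position_from_translation(ta, tb, v, dt):
    """Relative (y, z) position of the PA signal generation source with
    respect to the probe after the move, from the arrival times tb (before)
    and ta (after) a translation along the minor axis at velocity v over
    the minute time dt. la and lb follow equation (1); the intersection of
    the two arcs of fig. 6 gives the source position."""
    la, lb = C * ta, C * tb              # equation (1)
    d = v * dt                           # distance moved by the probe
    y = (lb**2 - la**2 - d**2) / (2.0 * d)
    z = math.sqrt(max(la**2 - y**2, 0.0))
    return y, z
```

For a source at (y, z) = (3, 4) relative to the probe after a move of d = 2, the two distances are la = 5 and lb = √41, and the function recovers (3, 4).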

In the above description, the PA signal generation source 13 is assumed to be stationary, but it may be moving. In that case, an error may occur in the detected absolute position of the PA signal generation source 13 owing to its movement, but the error is mitigated by the position filter 423.

Next, a method of detecting the absolute position of the PA signal generation source 13 when the ultrasonic probe 20 is rotationally moved about the depth axis 24, that is, from the rotational change of the imaging plane of the ultrasonic probe 20, will be described with reference to fig. 7.

As shown in fig. 7, when the ultrasonic probe 20, a 1D array probe, is rotated from 722B to 722A in the rotation direction 721 about the depth axis 24, the PA signal analyzing unit 412 can acquire the long-axis positions 723B and 723A of the PA signal generation source 13 before and after the rotation, and the absolute positions of 723B and 723A can be obtained from the probe position and posture acquired by the robot arm 90.

Therefore, since the position 725 of the PA signal generation source 13 is the intersection of the straight line 724A and the straight line 724B, its absolute position can be calculated from the absolute positions of 723A and 723B.

Specifically, when the change in the position of the PA signal generation source 13 in the direction of the long axis 22 before and after the movement is xPA, the angular velocity of the rotation 721 about the depth axis 24 is ω, the time between the measurements before and after the movement is Δt, the arrival time of the PA signal is t, and the sound velocity is c, the position 725 (yPA, zPA) of the PA signal generation source 13 is obtained by equation (3).

[ mathematical formula 3 ]

yPA = xPA/(ω·Δt), zPA = √((c·t)² − yPA²) (3)
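This rotation-based localization can be sketched as follows under a small-angle assumption; the algebraic form is reconstructed from the geometry of fig. 7 (the measurement interval dt and the speed-of-sound value are assumptions, and the exact published form of equation (3) is not quoted here):

```python
import math

C = 1540.0  # assumed speed of sound [m/s]

def pa_position_from_rotation(x_shift, omega, dt, t_arrival):
    """Relative (y, z) position of the PA signal generation source from the
    long-axis shift x_shift of the source observed while the probe rotates
    about the depth axis at angular velocity omega over the interval dt;
    t_arrival is the PA-signal arrival time. For a small rotation angle
    omega*dt, the apparent long-axis shift is proportional to the
    out-of-plane offset y."""
    y = x_shift / (omega * dt)           # out-of-plane (minor-axis) offset
    r = C * t_arrival                    # range measured from the array
    z = math.sqrt(max(r**2 - y**2, 0.0))
    return y, z
```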

In the above-described methods of detecting the absolute position of the PA signal generation source 13 from the positional change of the ultrasonic probe 20 and the PA signal generation source 13, or from the rotational change of the imaging plane of the ultrasonic probe 20, the absolute position is detected using the positions before and after the movement. When the time difference before and after the movement is short, however, the absolute position of the PA signal generation source 13 can instead be detected using the velocity of the ultrasonic probe 20.

A process flow of detecting the absolute position of the PA signal generating source 13 by the PA sound source position detecting unit 42 using the velocity of the ultrasonic probe 20 will be described with reference to fig. 8.

In this flow, the absolute position of the PA signal generation source 13 is detected by translation and rotation extracted from the movement of the ultrasonic probe 20.

In step S801, the probe speed measuring section 425 measures the velocity and angular velocity of the movement of the ultrasonic probe 20. These may instead be acquired from the robot arm 90.

In step S802, the absolute position detection unit 424 extracts the translational velocity in the direction of the minor axis 23 of the ultrasonic probe 20 and the rotational velocity component of the ultrasonic probe 20 around the depth axis 24 from the measured velocity and angular velocity of the movement of the ultrasonic probe 20.

In step S803, the absolute position detecting unit 424 determines whether or not the extracted translational velocity in the direction of the minor axis 23 is equal to 0; if it is equal to 0 ("= 0" at S803), the process proceeds to step S805, and if not ("≠ 0" at S803), to step S804.

In step S804, the absolute position detection unit 424 detects the absolute position of the PA signal generation source 13 from the translational velocity of the ultrasonic probe 20 in the direction of the minor axis 23, in the same manner as the method described with reference to fig. 6. Specifically, the PA signal generation source 13 is assumed to lie at the point whose distances from the 2 probe positions, separated by the translational velocity times the velocity measurement time, are equal to the distances measured by the PA signal analysis unit 412, and its absolute position is detected accordingly. The process then proceeds to step S805.

In step S805, the absolute position detecting unit 424 determines whether or not the extracted rotational speed of the ultrasonic probe 20 about the depth axis 24 is equal to 0; if it is equal to 0 ("= 0" at S805), the process proceeds to step S807, and if not ("≠ 0" at S805), to step S806.

In step S806, the absolute position detection unit 424 detects the absolute position of the PA signal generation source 13 using the rotational speed of the ultrasonic probe 20 about the depth axis 24, as in the method described with reference to fig. 7. Specifically, the absolute position of the PA signal generation source 13 is detected by assuming that it is located at the intersection of the normals at the 2 probe orientations separated by the rotation-speed measurement time. The process then proceeds to step S807.

In step S807, the position filter 423 performs filter processing on at least one of the absolute positions of the PA signal generation source 13 obtained in step S804 and step S806. Thus, the position detection is performed with high accuracy.
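The branching of steps S801 to S807 can be summarized in a short sketch; the three callables stand in for the fig. 6 detector, the fig. 7 detector, and the position filter 423, and their names are placeholders rather than identifiers from this description:

```python
def detect_absolute_position(v_minor, omega_depth,
                             detect_from_translation,
                             detect_from_rotation,
                             position_filter):
    """Run whichever detectors apply (S803/S804 for translation,
    S805/S806 for rotation) and filter the resulting estimates (S807)."""
    estimates = []
    if v_minor != 0:                          # S803 -> S804
        estimates.append(detect_from_translation(v_minor))
    if omega_depth != 0:                      # S805 -> S806
        estimates.append(detect_from_rotation(omega_depth))
    return position_filter(estimates)         # S807

# With only a translational component, only the fig. 6 detector runs.
result = detect_absolute_position(
    0.01, 0.0,
    detect_from_translation=lambda v: (1.0, 2.0),
    detect_from_rotation=lambda w: (9.0, 9.0),
    position_filter=lambda e: e[0] if e else None)
```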

In the above description, the case where a 1D array probe is used as the ultrasonic probe 20 has been described, but a 2D array probe having a large number of elements arranged in two dimensions may also be used as the ultrasonic probe 20.

In this case, since the three-dimensional relative position of the PA signal generation source 13 with respect to the ultrasonic probe 20 can be acquired directly, without the translational movement and rotation of the ultrasonic probe 20, the three-dimensional absolute position of the PA signal generation source 13 can be detected by adding the acquired three-dimensional relative position to the three-dimensional absolute position and posture of the ultrasonic probe 20 notified from the robot arm 90.
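With a 2D array probe, the detection thus reduces to a coordinate transform of the measured relative position by the probe pose. A minimal sketch (representing the orientation as a 3×3 rotation matrix is an implementation choice assumed here, not specified in the text):

```python
import numpy as np

def absolute_pa_position(probe_pos, probe_rot, rel_pos):
    """Absolute position of the PA signal generation source: the relative
    position measured by the 2D array probe, transformed by the probe's
    absolute position probe_pos and orientation matrix probe_rot."""
    return (np.asarray(probe_pos, float)
            + np.asarray(probe_rot, float) @ np.asarray(rel_pos, float))

# A probe at (1, 0, 0), rotated 90 degrees about the depth axis, seeing the
# source 1 unit along its long axis, places the source at (1, 1, 0).
p = absolute_pa_position([1.0, 0.0, 0.0],
                         [[0.0, -1.0, 0.0],
                          [1.0,  0.0, 0.0],
                          [0.0,  0.0, 1.0]],
                         [1.0, 0.0, 0.0])
```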

In the ultrasonic imaging apparatus and the treatment support system described above, the ultrasonic probe 20 is operated by the robot arm 90, and the three-dimensional absolute position of the PA signal generation source 13 is detected based on the absolute position of the ultrasonic probe 20 detected by the robot arm 90. Alternatively, a probe position/posture sensor 91 for detecting the absolute position of the ultrasonic probe 20 may be provided. In that case as well, the ultrasonic probe 20 may be operated by the robot arm 90 while the absolute position of the ultrasonic probe 20 is detected by the probe position/posture sensor 91.

The probe position/orientation sensor 91 may be incorporated in the ultrasonic probe 20 or may be separately configured.

The probe position/orientation sensor 91 is not limited to a particular type as long as it can measure the position and orientation of the probe; for example, it may be configured using a combination of positioning sensors such as a geomagnetic sensor, an acceleration sensor, and a gyro sensor, or a sensor using an external camera.

Fig. 9 is a block diagram showing the configuration of an ultrasound imaging apparatus and a treatment support system in which the probe position/orientation sensor 91 detects the absolute position of the ultrasound probe 20.

The ultrasound imaging apparatus of fig. 9 differs from the ultrasound imaging apparatus described with reference to fig. 3 in that the robot arm 90 and the operation planning unit 45 are removed and the probe position/posture sensor 91 is added; the rest of the configuration is the same.

The probe position/orientation sensor 91 periodically detects the absolute position and orientation of the ultrasonic probe 20 and notifies the control unit 40 of the absolute position and orientation.

The probe speed measuring unit 425 detects the velocity and angular velocity of the ultrasonic probe 20 from the temporal change of the information on the absolute position and orientation (direction) of the ultrasonic probe 20 delivered from the probe position/orientation sensor 91, instead of from the robot arm 90.
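The temporal differentiation can be sketched as a finite difference over two consecutive pose samples; for brevity the orientation is reduced here to a single rotation angle about the depth axis, which is a simplification of the full orientation the sensor delivers:

```python
import numpy as np

def probe_velocities(pos_prev, pos_curr, yaw_prev, yaw_curr, dt):
    """Finite-difference estimate of the probe's translational velocity and
    of its angular velocity about the depth axis, from two pose samples
    delivered dt seconds apart by the probe position/orientation sensor."""
    v = (np.asarray(pos_curr, float) - np.asarray(pos_prev, float)) / dt
    omega = (yaw_curr - yaw_prev) / dt
    return v, omega
```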

The display image forming unit 43 performs the same processing as in the case where the three-dimensional absolute position of the PA signal generation source 13 is detected based on the absolute position of the ultrasonic probe 20 detected by the robot arm 90.

Fig. 10 is a flowchart of the processing of the ultrasonic imaging apparatus.

Compared with the processing flow of the ultrasonic imaging apparatus described in fig. 4, the differences are that step S406, in which the operation planning unit instructs the operation of the robot arm, is eliminated, and that step S404 is replaced with step S407.

In step S407, the PA sound source position detection unit 42 detects the position of the PA signal generation source 13 from the PA signal delivered from the PA signal analysis unit 412 and the information on the absolute position (three-dimensional) and orientation (direction) of the ultrasonic probe 20 delivered from the probe position/orientation sensor 91.

Thereby, even when the ultrasonic probe 20 is manually operated, the three-dimensional absolute position of the PA signal generating source 13 can be detected.

Although the embodiments of position detection of the PA signal generation source 13 have been described above, the embodiments can be combined as appropriate as long as there is no technical contradiction, and such a combination is also included in the present invention.

Here, a display image displayed on the display unit 34 in step S405 in fig. 4 and 10 will be described.

Fig. 11 is a diagram showing an example of a display image 51 displayed on the display unit 34.

As shown in fig. 11, the display image 51 displayed on the display unit 34 includes: an image 511 showing the position of the PA signal generation source 13 in the blood vessel 82 of the subject 80 in a long-axis cross section of the blood vessel 82; and an image 512 showing a transverse cross section of the blood vessel 82. The images 511 and 512 display the body of the subject 80 and the blood vessel 82, as well as a display 519 of the position of the PA signal generation source 13 and a display 518 of its trajectory. These display elements may be displayed in their entirety or in part.

The display images 511 and 512 may be formed by the display image forming unit 43 from images captured by the ultrasound imaging unit 30. Alternatively, for anatomical structures outside the imaging area of the ultrasound imaging unit 30, the display image forming unit 43 may render, as CG (Computer Graphics), the position of the PA signal generation source 13 detected by the PA sound source position detection unit 42 together with the previously acquired 3D anatomical information 44 of the subject 80.

The content of the image displayed on the display unit 34 may include any image processing for assisting a surgical operation, such as a display highlighting a site of interest, e.g., a blood vessel or a lesion, existing in the ultrasound image or the CG, a display showing a lesion position outside the screen, and the like.

In the image formation by CG, in order to register the previously acquired 3D anatomical information 44 of the subject 80 to positions on the absolute coordinates detected by the robot arm 90, feature points in the ultrasonic image formed by the display image forming unit 43 are compared with feature points in the previously acquired 3D anatomical information 44, and matching parts are detected. For example, a branch of a blood vessel or a bone may be used as a feature point; other feature points may also be used.
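Once matching feature points (e.g., vessel branches) have been paired, the rigid alignment itself can be computed in closed form. A sketch using the Kabsch algorithm, which is one standard choice for this step and not necessarily the method used here:

```python
import numpy as np

def rigid_registration(src, dst):
    """Least-squares rotation R and translation t mapping the matched
    feature points src (ultrasound-image landmarks) onto dst (the same
    landmarks in the pre-acquired 3D anatomical information), via the
    Kabsch algorithm."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)            # cross-covariance of landmarks
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t
```

Applying R and t to the ultrasound-frame coordinates then places the pre-acquired anatomy in the absolute coordinate system.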

In the image formation by CG, the display image may be formed using the previously acquired 3D anatomical information 44 directly. Alternatively, the deformation between the previously acquired anatomical structure and the anatomical structure during surgery may be detected with reference to a reflected ultrasound image formed during surgery, and the previously acquired 3D anatomical information 44 may be deformed so as to better fit the intraoperative structure. For example, the following processing is conceivable: the vascular wall in the reflected ultrasound image formed during the operation is identified, and the vascular wall contained in the previously acquired 3D anatomical information 44 is deformed into the same shape as the vascular wall in the reflected ultrasound image.

The content of the image displayed on the display unit 34 is not limited to a screen displaying images from the 2 directions illustrated in fig. 11; as long as the stereoscopic positional relationship between the lesion and the PA signal generation source 13 can be presented, an image from 1 direction may be displayed, or images may be displayed by third-angle projection. Further, images from arbitrary viewpoints may be displayed on 4 or more screens; the display method and the viewpoints of the displayed images are not limited.

Fig. 12 is a diagram showing an example of a display image 52 by third-angle projection.

The display image 52 is composed of images from 3 directions: a side view 521 of the blood vessel, a transverse cross section 522 of the blood vessel, and a top view 523 of the blood vessel.

Reference numeral 528 in fig. 12 denotes the trajectory of the PA signal generation source 13, and 529 denotes the position of the PA signal generation source 13.

Next, the operation instruction of the robot arm planned by the operation planning unit 45 in step S406 in fig. 4 will be described in detail.

An example of the movement plan of the robot arm 90 by the operation planning unit 45 will be described with reference to fig. 13A and 14A.

In fig. 13A, when the PA signal generation source 13 moves from the position 613 before the movement to the position 614 after the movement, the robot arm 90 drives the ultrasonic probe 20 from the position 611 before the movement to the position 612 after the movement so that the imaging region of the ultrasonic probe 20 keeps capturing a transverse cross section of the blood vessel 82, thereby tracking the PA signal generation source 13.

Fig. 13B is an ultrasonic image in the case where the robot arm 90 drives the ultrasonic probe 20 as in fig. 13A.

The blood vessel 82 of the subject 80 and the PA signal generation source 13 (position 615) are depicted in the ultrasound image, and the position 615 of the PA signal generation source 13 in the blood vessel 82 can be easily grasped.

In fig. 14A, the robot arm 90 drives the ultrasonic probe 20 from the position 621 before the movement to the position 622 after the movement so as to track the positions 623 and 624 of the PA signal generation source 13 before and after its movement, such that the imaging region of the ultrasonic probe 20 includes the long-axis cross section of the blood vessel 82.

Fig. 14B is an ultrasonic image of the case where the robot arm 90 drives the ultrasonic probe 20 as in fig. 14A.

The blood vessel 82 of the subject 80 and the PA signal generation source 13 (position 625) are depicted in the ultrasound image, and when the occlusion portion 83 is present in the blood vessel 82 of the subject 80, the positional relationship between the occlusion portion 83 and the PA signal generation source 13 (position 625) can be easily grasped.

The driving method of the ultrasonic probe 20 is not limited to the operation method of fig. 13A and 14A, as long as the reflected ultrasonic signal and the PA signal necessary for forming the display image can be obtained in the display image formation of step S405 in fig. 4 and 10.

For example, as described with reference to fig. 8, position detection can be performed as long as at least one of the velocity of the translation 711 described with reference to fig. 6 and the angular velocity of the rotation 721 described with reference to fig. 7 is not 0.

Therefore, the ultrasonic probe 20 is oscillated by the robot arm 90 so that the translational velocity vd of the translation 711 and the angular velocity ωd of the rotation 721 satisfy equation (4), whereby the translation 711 and the rotation 721 never become 0 at the same time. This allows the position detection of the PA signal generation source 13 to be continued at all times. Here, v0 and ω0 are constants for adjusting the velocities, and T is the period of the oscillation.

[ mathematical formula 4 ]

vd = v0·sin(2πt/T), ωd = ω0·cos(2πt/T) (4)

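Assuming the quadrature form vd = v0·sin(2πt/T), ωd = ω0·cos(2πt/T), which is a reconstruction consistent with the requirement that the two velocities never vanish simultaneously, the command generation can be sketched as:

```python
import math

def shake_command(t, v0=0.01, omega0=0.1, T=1.0):
    """Translational velocity vd and angular velocity wd commanded at time
    t, driven in quadrature so that they are never both zero (v0, omega0
    and T are illustrative tuning constants, not disclosed values)."""
    vd = v0 * math.sin(2.0 * math.pi * t / T)
    wd = omega0 * math.cos(2.0 * math.pi * t / T)
    return vd, wd
```

At t = 0 the rotation carries the detection; a quarter period later the translation does, so one of the fig. 6 and fig. 7 methods is always applicable.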
The embodiments of the ultrasound imaging apparatus and the catheter treatment support system according to the present invention have been described above, but the respective embodiments can be combined as appropriate as long as they are not technically contradictory, and such a combination is also included in the present invention.
