Medical system and control unit

Document No.: 1617736  Publication date: 2020-01-10

Abstract: This technology, "Medical system and control unit", was designed and created by 菊地大介, 杉江雄生, 中村幸弘, 深沢健太郎, and 池田宪治 on 2018-03-28. [Problem] To make the appearance of images captured by a plurality of imaging devices uniform when the plurality of imaging devices are used in a surgical operation. [Solution] The present disclosure provides a medical system having a plurality of surgical imaging devices and a control unit having a signal processing unit connected with the surgical imaging devices, the signal processing unit coordinating images captured by the respective surgical imaging devices. With this configuration, in the case where a plurality of imaging devices are arranged for surgery, it is possible to make the appearance uniform between images captured by the respective imaging devices.

1. A medical system, comprising:

a plurality of surgical imaging devices; and

a control unit to which each of the surgical imaging devices is connected, the control unit including a signal processing unit that coordinates images captured by the respective surgical imaging devices.

2. The medical system of claim 1, wherein the plurality of surgical imaging devices includes at least two of an endoscope, an exoscope, a microscope, and a surgical field camera.

3. The medical system of claim 1, wherein the signal processing unit switches whether to perform the coordination based on an occurrence of an adjustment trigger.

4. The medical system of claim 1, wherein

the signal processing unit performs processing for the coordination according to occurrence of an adjustment trigger, and

the adjustment trigger occurs by user operation.

5. The medical system of claim 1, wherein

the signal processing unit performs processing for the coordination according to occurrence of an adjustment trigger, and

the adjustment trigger occurs in a case where a plurality of the surgical imaging devices image the same subject.

6. The medical system of claim 1, wherein

the signal processing unit performs processing for the coordination according to occurrence of an adjustment trigger, and

the adjustment trigger occurs according to a state of the surgeon performing the surgery.

7. The medical system of claim 1, wherein

the signal processing unit performs processing for the coordination according to occurrence of an adjustment trigger, and

the adjustment trigger occurs based on identification information identifying a plurality of the surgical imaging devices.

8. The medical system of claim 1, wherein the signal processing unit performs processing to adapt colors between images captured by respective ones of the plurality of surgical imaging devices.

9. The medical system of claim 1, wherein the signal processing unit performs processing to adapt brightness between images captured by respective ones of the plurality of surgical imaging devices.

10. The medical system of claim 1, wherein the signal processing unit performs processing that adapts contrast between images captured by respective ones of the plurality of surgical imaging devices.

11. The medical system of claim 1, wherein the signal processing unit performs processing that adapts resolution between images captured by respective ones of the plurality of surgical imaging devices.

12. The medical system of claim 1, wherein the signal processing unit performs processing that adapts noise between images captured by respective ones of the plurality of surgical imaging devices.

13. The medical system of claim 1, wherein the signal processing unit performs processing that adapts a depth of field between images captured by respective ones of the plurality of surgical imaging devices.

14. The medical system of claim 1, wherein the signal processing unit performs a process of adapting an amount of jitter between images captured by respective ones of the plurality of surgical imaging devices.

15. The medical system of claim 1, wherein the signal processing unit performs processing to adapt depth between stereoscopic images captured by respective ones of the plurality of surgical imaging devices.

16. The medical system of claim 1, wherein the signal processing unit performs processing to adapt a viewing angle between images captured by respective ones of the plurality of surgical imaging devices.

17. The medical system of claim 1, wherein the signal processing unit coordinates an image captured by one of the plurality of surgical imaging devices with reference to an image captured by another of the surgical imaging devices.

18. The medical system of claim 1, wherein the signal processing unit coordinates images captured by a plurality of the surgical imaging devices with reference to an arbitrary target image.

19. A control unit to which each of a plurality of surgical imaging devices is connected, the control unit comprising a signal processing unit that coordinates images captured by respective ones of the surgical imaging devices.

20. A medical system, comprising:

a plurality of surgical imaging devices;

a control unit to which each of the surgical imaging devices is connected; and

an integrated device to which each of a plurality of the control units is connected, the integrated device including a signal processing unit that coordinates images captured by respective ones of the surgical imaging devices.

Technical Field

The present disclosure relates to medical systems and control units.

Background

Conventionally, for example, the following patent document 1 discloses that, in an endoscope apparatus capable of using a probe-type endoscope, two images are accurately matched and a composite image is generated regardless of the position of a probe tip portion and the degree of curvature of a scope tip portion.

Patent document

Patent document 1: Japanese Patent Laid-Open Publication No. 2011-55939

Disclosure of Invention

Problems to be solved by the invention

In medical imaging, there are cases where a plurality of cameras are used simultaneously. For example, in brain surgery or the like, there are cases where an exoscope is used when observing a near portion of a surgical region and an endoscope is used when observing a deep portion of the surgical region. In this case, when the images captured by the plurality of cameras are displayed as they are, the two images have different appearances, which makes the observer feel uncomfortable. Further, for example, even if the same subject is imaged and displayed, disadvantages arise, such as difficulty in recognizing that subjects having different appearances are the same subject and difficulty in recognizing the relationship between the two images.

The technique disclosed in the above-mentioned patent document 1 specifies the magnification/reduction ratio and the amount of phase shift based on the protruding length of the probe tip portion and the bending angle of the scope tip portion, and makes the size of an observation object such as a lesion coincide with its size in a normal observation image. That is, the technique of patent document 1 makes the image sizes coincide in a case where the positional relationship of one image with respect to the other is determined in advance, but takes no measure to adjust the appearance between two images captured by different devices.

Therefore, in the case where a plurality of imaging devices are used for surgery, it is desirable to adjust the appearance between images captured by the respective imaging devices.

Means for solving the problems

According to the present disclosure, a medical system is provided that includes a plurality of surgical imaging devices and a control unit to which each of the surgical imaging devices is connected, the control unit including a signal processing unit that coordinates images captured by the respective surgical imaging devices.

Further, according to the present disclosure, a control unit connected to each of the plurality of surgical imaging devices is provided, the control unit including a signal processing unit that coordinates images captured by the respective surgical imaging devices.

Additionally, in accordance with the present disclosure, a medical system is provided that includes a plurality of surgical imaging devices, a control unit connected to each of the surgical imaging devices, and an integrated device connected to each of the plurality of control units, the integrated device including a signal processing unit that coordinates images captured by the respective surgical imaging devices.

Effects of the invention

As described above, according to the present disclosure, in the case where a plurality of imaging devices are used for surgery, the appearance between images captured by the respective imaging devices can be adjusted.

Note that the above-described effects are not necessarily limiting, and any effect indicated in this specification, or other effects that can be learned from this specification, may be exhibited together with or instead of the above-described effects.

Drawings

Fig. 1 is a schematic diagram showing an outline of the configuration of a surgical system according to each embodiment of the present disclosure.

Fig. 2 is a schematic diagram showing a configuration including an input module that relays between a camera unit and a Camera Control Unit (CCU) in addition to the configuration in fig. 1.

Fig. 3 is a schematic diagram showing the configuration and action of a signal processing unit in the CCU.

Fig. 4 is a schematic diagram illustrating a case where an object is observed using an exoscope and an endoscope at the same time.

Fig. 5 is a schematic diagram illustrating methods of displaying an exoscope image and an endoscope image.

Fig. 6 is a schematic diagram illustrating a processing method of adapting the color tone of the endoscope image to the color tone of the exoscope image.

Fig. 7 is a schematic diagram for explaining an example of adapting the shake between the exoscope image and the endoscope image.

Fig. 8 is a schematic diagram for explaining an example of adapting the shake between the exoscope image and the endoscope image.

Fig. 9 is a schematic diagram showing an example of adapting the brightness (luminance) and the contrast between the exoscope image and the endoscope image.

Fig. 10 is a schematic diagram showing an example of adapting the sense of resolution and the depth of field between the exoscope image and the endoscope image.

Fig. 11 is a schematic diagram showing an example of adapting noise between an exoscope image and an endoscope image.

Fig. 12 is a schematic diagram showing an example of adapting an image direction between an exoscope image and an endoscope image.

Fig. 13 is a schematic diagram showing an example of a case where an exoscope image that can be captured in three dimensions (3D) and an endoscope image that can be captured in 3D are used together.

Fig. 14 is a schematic diagram showing a system in which a plurality of CCUs, each connected to a plurality of camera units, are connected to an integration device 600.

Detailed Description

Hereinafter, advantageous embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. Note that in the present specification and the drawings, constituent elements having substantially the same functional configuration will be denoted by the same reference numerals, and redundant description will be omitted.

Note that the description will be given in the following order.

1. Configuration example of System

2. Arrangement of signal processing units

3. Color adjustment between an endoscope image and an exoscope image

4. Shake adjustment between an endoscope image and an exoscope image

5. Adjustment of brightness and contrast between an endoscope image and an exoscope image

6. Adjustment of sense of resolution and depth of field between an endoscope image and an exoscope image

7. Noise adjustment between an endoscope image and an exoscope image

8. Adjustment of direction and angle of view between an endoscope image and an exoscope image

9. Adjustment of depth perception between an endoscope image and an exoscope image

10. Configuration example including multiple CCUs connected with multiple camera units

1. Configuration example of System

First, an outline of the configuration of a surgical system 1000 according to each embodiment of the present disclosure will be described with reference to fig. 1. As shown in fig. 1, the surgical system 1000 is composed of a plurality of camera units 100, a CCU (camera control unit) 200 to which a plurality of cameras can be connected and which can perform a plurality of outputs, and a plurality of monitors 300. The surgical system 1000 generates a plurality of output images by signal processing based on the input signals from the plurality of camera units 100, and outputs the generated images to the monitors 300.

Fig. 2 is a schematic diagram showing a configuration example of a system that includes, in addition to the configuration in fig. 1, an input module 400 that relays between the camera unit 100 and the CCU 200; in this configuration, the surgical system 1000 inputs an image to the CCU 200 after preprocessing is performed in the input module 400. The input module 400 is used, for example, to ensure compatibility between the camera unit 100 and the CCU 200.

In the present embodiment, the plurality of camera units 100 are a plurality of cameras used for surgery, for example, an endoscope (rigid endoscope or flexible endoscope), an exoscope (external scope), a microscope, a surgical field camera, and the like. The surgical system 1000 may include a light source apparatus that irradiates the subject when the camera unit 100 images the subject.

A plurality of camera units 100 are sometimes used simultaneously during surgery. For example, in brain surgery or the like, there are cases where an exoscope is used when observing a near portion of a surgical region and an endoscope is used when observing a deep portion of the surgical region. As an example, conceivable cases for a cerebral aneurysm in open brain surgery include observing the front side of an affected part with an exoscope while observing the back side of the affected part with an inserted endoscope. In such cases, in the present embodiment, processing is performed to adapt the appearance of the images among the plurality of camera units 100. In other words, in the present embodiment, processing is performed to coordinate the images of the plurality of camera units 100 with each other. Examples of combinations of the plurality of camera units 100 used simultaneously include an endoscope and an exoscope, an endoscope and a surgical microscope, a rigid endoscope and a flexible endoscope, and an endoscope and a surgical field camera. Note that although the endoscope is suitable for observing details of a subject, if the endoscope is moved away from the subject so as to capture a wider range, the image may be distorted. The exoscope includes a dedicated optical system and can capture such an image without distortion; it therefore has the advantage of making the operation easier to perform because, for example, a sufficient distance can be secured between the subject and the exoscope. In addition, the surgical field camera is a camera that captures an image of the entire surgical field.

2. Arrangement of signal processing units

Fig. 3 is a schematic diagram showing the configuration and action of a signal processing unit 210 in the CCU 200 in a system in which two cameras, i.e., an exoscope and an endoscope, are connected to the CCU 200 as the plurality of camera units 100. The signal processing unit 210 is provided in the CCU 200 to which the plurality of camera units 100 are connected, and performs processing to adapt the appearance between an image generated based on an imaging signal obtained from one camera unit 100 and an image generated based on an imaging signal obtained from another camera unit 100.

As shown in fig. 3, the signal processing unit 210 includes an adjustment parameter calculation unit 212, an adjustment parameter application unit 214, and an adjustment parameter application unit 216. The adjustment parameter calculation unit 212 receives input of the image 10 from the exoscope (hereinafter also referred to as the exoscope image 10) and the image 12 from the endoscope (hereinafter also referred to as the endoscope image 12).

Further, the adjustment trigger 14 is input to the adjustment parameter calculation unit 212. The adjustment trigger 14 is information defined from the outside or information derived from the input images, and serves as a trigger for performing adjustment using an adjustment parameter. For example, when it is detected that the exoscope image 10 and the endoscope image 12 both show the same subject, an adjustment trigger is generated.

When the adjustment trigger 14 occurs, processing to adapt the appearance between the exoscope image 10 and the endoscope image 12 is performed. The adjustment parameter is a parameter for performing this processing. On the other hand, in a case where the adjustment trigger 14 does not occur, the processing to adapt the appearance between the two images is not performed, and the exoscope image 10 and the endoscope image 12 are processed independently.

When the adjustment trigger 14 occurs and is input to the adjustment parameter calculation unit 212, the adjustment parameter calculation unit 212 calculates adjustment parameters relating to color, brightness, contrast, depth of field, noise, angle of view, image direction, and the like from the exoscope image 10, the endoscope image 12, or both. The adjustment parameters are sent to one or both of the adjustment parameter application units 214 and 216. On receiving an adjustment parameter, the adjustment parameter application unit 214 applies it to the exoscope image 10; likewise, the adjustment parameter application unit 216 applies it to the endoscope image 12. As described above, the signal processing unit 210 applies the adjustment parameter to one or both of the exoscope image 10 and the endoscope image 12, and generates an output image for each. With this processing, the appearance between the exoscope image 10 and the endoscope image 12 can be adapted, and it is possible to prevent the inconsistency in the appearance of the subject that arises when the user observes an image obtained from one camera unit 100 after observing an image obtained from another camera unit 100.
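The flow through the signal processing unit described above can be sketched as follows. This is a minimal illustration, not part of the disclosure: images are flattened to lists of brightness values, the adjustment parameter is a single gain fitted by least squares, and all function names are hypothetical.

```python
def calculate_adjustment_parameter(exoscope_pixels, endoscope_pixels):
    """Derive a single gain mapping endoscope brightness onto exoscope
    brightness (least-squares fit of y = a * x)."""
    num = sum(x * y for x, y in zip(endoscope_pixels, exoscope_pixels))
    den = sum(x * x for x in endoscope_pixels)
    return num / den if den else 1.0

def apply_adjustment_parameter(pixels, gain):
    """Apply the parameter, clipping to the 8-bit range."""
    return [min(255.0, p * gain) for p in pixels]

def process(exoscope_image, endoscope_image, adjustment_trigger):
    """Coordinate the two images only while the trigger is raised."""
    if not adjustment_trigger:
        # No trigger: both images pass through independently.
        return exoscope_image, endoscope_image
    gain = calculate_adjustment_parameter(exoscope_image, endoscope_image)
    return exoscope_image, apply_adjustment_parameter(endoscope_image, gain)

exo = [100.0, 120.0, 140.0]
endo = [50.0, 60.0, 70.0]
out_exo, out_endo = process(exo, endo, adjustment_trigger=True)
print(out_endo)  # → [100.0, 120.0, 140.0]
```

With the trigger off, `process` returns both images unchanged, matching the independent-processing branch described above.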

The adjustment trigger 14 may be caused to occur according to a user operation, the state (position) of the surgeon, information on the connected devices, or the like, in addition to the case where the exoscope image 10 and the endoscope image 12 show the same subject as described above. In the case of a user operation, the adjustment trigger 14 occurs when the user operates the operation input unit of the CCU 200. In the case of the surgeon's state, for example, the position of the surgeon is determined from the image of a surgical field camera installed in the operating room, and the adjustment trigger 14 is caused to occur when it is determined, based on that position, that the surgeon performs observation with an endoscope in addition to observation with an exoscope. Further, in the case of information on the connected devices, identification information is acquired from each of the plurality of camera units 100 connected to the CCU 200, and the adjustment trigger 14 is caused to occur when a combination of camera units 100 for which the trigger is scheduled to occur is connected to the CCU 200.
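The three trigger sources described above can be sketched as a single predicate. The device identifiers and parameter names here are illustrative assumptions, not taken from the disclosure.

```python
def adjustment_trigger_occurs(user_operation, surgeon_uses_both_scopes,
                              connected_ids, scheduled_combinations):
    """Raise the trigger when any condition holds: an explicit user
    operation, the surgeon's detected state (observing with an endoscope
    in addition to the exoscope), or a scheduled combination of device
    identification information being connected to the CCU."""
    if user_operation or surgeon_uses_both_scopes:
        return True
    return frozenset(connected_ids) in scheduled_combinations

# Hypothetical device IDs for a combination scheduled to trigger.
scheduled = {frozenset({"EXOSCOPE-A", "ENDOSCOPE-B"})}
print(adjustment_trigger_occurs(False, False,
                                ["EXOSCOPE-A", "ENDOSCOPE-B"], scheduled))
# → True
```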

In the case of adapting the appearance between the exoscope image 10 and the endoscope image 12, the endoscope image 12 may be adapted to the exoscope image 10 with the exoscope image 10 as the reference, or the exoscope image 10 may be adapted to the endoscope image 12 with the endoscope image 12 as the reference. Alternatively, both the exoscope image 10 and the endoscope image 12 may be adapted to an arbitrary target image serving as the reference for the appearance.

3. Color adjustment between an endoscope image and an exoscope image

Fig. 4 is a schematic diagram illustrating a case where the subject is observed using the exoscope 110 and the endoscope 120 at the same time. As described above, for example, in brain surgery or the like, since the exoscope 110 is used when observing a near portion of a surgical region and the endoscope 120 is used when observing a deep portion of the surgical region, a case where the exoscope 110 and the endoscope 120 are used simultaneously is conceivable.

At this time, the exoscope 110 and the endoscope 120 irradiate the subject with illumination from different light sources. As shown in fig. 4, the exoscope 110 irradiates the subject 500 with illumination from the light source device (1) 400, and the endoscope 120 irradiates the subject 500 with illumination from the light source device (2) 410.

In addition, the exoscope 110 and the endoscope 120 receive light that has passed through different lenses with different sensors. Therefore, even if the development process is performed with the same parameters, the images of the exoscope 110 and the endoscope 120 generally have different color tones.

Fig. 5 is a schematic diagram illustrating methods of displaying the exoscope image 10 and the endoscope image 12. As shown in fig. 5, the two images can be displayed as follows: on two monitors 300 placed side by side, or by switching between the two images on one monitor 300. However, if two images having different color tones are displayed as they are, the observer may feel uncomfortable because the tones of the two images differ. Further, for example, even if the same subject is imaged and displayed, it becomes difficult to recognize that subjects having different tones are the same subject, which makes it difficult to associate the two images.

Therefore, in the present embodiment, the color tone of the endoscope image 12 and the color tone of the exoscope image 10 are adapted to each other at the time of display. At this time, by applying the adjustment parameter to the exoscope image 10, the endoscope image 12, or both, the color tone can be adjusted between the image of the exoscope 110 and the image of the endoscope 120. Fig. 6 is a schematic diagram illustrating a processing method of adapting the color tone of the image of the endoscope 120 to the color tone of the image of the exoscope 110.

First, it is determined whether the same subject and area are displayed in the two images. At this time, by matching the two images, it can be detected whether they show the same subject, and a common area can be detected. In the example shown in fig. 6, it is detected by block matching that region A of the exoscope image 10 and region B of the endoscope image 12 show the same subject and are a common region.

Next, an example of a processing method in the case where the color tone of the endoscope image 12 is adapted to the color tone of the exoscope image 10 will be described. For the already-matched regions A and B, which show the same position of the same subject, in the case where the color values at corresponding positions (indicated by × marks in fig. 6) are (R1, G1, B1) for the endoscope image 12 and (R2, G2, B2) for the exoscope image 10, the color relationship (RGB values) between the two images can be represented by a linear formula as shown in the following formula (1).

[ mathematical formula 1]

R2 = a·R1 + b·G1 + c·B1
G2 = d·R1 + e·G1 + f·B1 ...(1)
B2 = g·R1 + h·G1 + i·B1

At this time, when the above formula is solved from the RGB values of a plurality of points by the least-squares method and the coefficients a to i are found, the linear formula in formula (1) serves as a conversion formula of the RGB values from the endoscope image 12 to the exoscope image 10. Here, the coefficients (a to i) used for the conversion correspond to the adjustment parameters. In this way, the color tone of the endoscope image 12 can be adapted to the color tone of the exoscope image 10.
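Solving formula (1) by the least-squares method can be sketched as follows, fitting the nine coefficients one output channel at a time via the normal equations. The sample data and function names are illustrative only.

```python
def solve3(m, v):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with
    partial pivoting."""
    a = [row[:] + [v[i]] for i, row in enumerate(m)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        for r in range(3):
            if r != col:
                f = a[r][col] / a[col][col]
                a[r] = [x - f * y for x, y in zip(a[r], a[col])]
    return [a[i][3] / a[i][i] for i in range(3)]

def fit_color_matrix(endo_samples, exo_samples):
    """Least-squares fit of the coefficients a to i in formula (1):
    for each output channel, solve the normal equations X^T X w = X^T y."""
    xtx = [[sum(p[i] * p[j] for p in endo_samples) for j in range(3)]
           for i in range(3)]
    return [solve3(xtx, [sum(p[i] * q[ch]
                             for p, q in zip(endo_samples, exo_samples))
                         for i in range(3)])
            for ch in range(3)]

def convert_color(rgb, m):
    """Apply formula (1): (R2, G2, B2) = M (R1, G1, B1)."""
    return [sum(m[ch][i] * rgb[i] for i in range(3)) for ch in range(3)]

# Toy data: exoscope colors are exactly twice the endoscope colors,
# so the fitted matrix should be twice the identity.
endo = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0), (1.0, 1.0, 1.0)]
exo = [(2.0, 0.0, 0.0), (0.0, 2.0, 0.0), (0.0, 0.0, 2.0), (2.0, 2.0, 2.0)]
M = fit_color_matrix(endo, exo)
result = convert_color((10.0, 20.0, 30.0), M)
```

With real images, the sample pairs would be the RGB values at corresponding × positions inside the matched regions A and B.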

4. Shake adjustment between an endoscope image and an exoscope image

Next, a process of adapting the shake between the exoscope image 10 and the endoscope image 12 in a case where observation is performed using the exoscope 110 and the endoscope 120 at the same time (camera-shake removal processing) will be described based on figs. 7 and 8. As shown in fig. 7, the exoscope 110 is generally fixed by a fixing tool 20, whereas the endoscope 120 is held by the hand of the endoscopist or surgeon. Therefore, in some cases, shake occurs in the endoscope image 12.

Fig. 8 is a schematic diagram showing a processing method for adapting the shake between the exoscope image 10 and the endoscope image 12. Similarly to fig. 6, in fig. 8, region A of the exoscope image 10 and region B of the endoscope image 12 are detected by block matching to show the same subject and to be a common region. As shown in region A in fig. 8, the exoscope image 10 of the exoscope 110 held by the fixing tool 20 does not shake. On the other hand, since the endoscope 120 is held by a person, shake occurs in the endoscope image 12, as shown in region B in fig. 8. Therefore, the shaking endoscope image 12 is adapted to the shake-free exoscope image 10.

When it is determined by block matching that regions A and B show the same subject and area, the endoscope image 12 is tracked against the exoscope image 10, and the shake in the endoscope image 12 is corrected by removing its shake component. By this processing, since the shake in the endoscope image 12 is removed with reference to the fixed exoscope image 10, it is possible to suppress the sense of incongruity that arises when the user moves the line of sight between the endoscope image 12 and the exoscope image 10.

Further, in the case where different areas are shown in the exoscope image 10 and the endoscope image 12 as a result of block matching, the shake in the endoscope image 12 is corrected without using the exoscope image 10. In this case, shake correction of the endoscope image 12 is performed by estimating the shake from the translational, magnification, and rotation components of the endoscope image 12 and applying the inverse matrix of the shake.
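The tracking-based shake removal can be sketched under the simplifying assumption that the shake is pure translation measured from tracked feature points (the disclosure also removes magnification and rotation components, which would use a full similarity transform instead).

```python
def estimate_translation(prev_pts, curr_pts):
    """Average motion of tracked points between the reference positions
    and the current (shaken) frame."""
    n = len(prev_pts)
    dx = sum(c[0] - p[0] for p, c in zip(prev_pts, curr_pts)) / n
    dy = sum(c[1] - p[1] for p, c in zip(prev_pts, curr_pts)) / n
    return dx, dy

def remove_shake(curr_pts, dx, dy):
    """Apply the inverse of the estimated shake component."""
    return [(x - dx, y - dy) for x, y in curr_pts]

prev = [(0.0, 0.0), (10.0, 0.0)]   # positions in the steady exoscope view
curr = [(2.0, 1.0), (12.0, 1.0)]   # same points in the shaken endoscope view
dx, dy = estimate_translation(prev, curr)
stabilized = remove_shake(curr, dx, dy)
print(stabilized)  # → [(0.0, 0.0), (10.0, 0.0)]
```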

5. Adjustment of brightness and contrast between an endoscope image and an exoscope image

Fig. 9 is a schematic diagram showing an example of adapting the brightness (luminance) and the contrast between the exoscope image 10 and the endoscope image 12. The luminance conversion coefficient of each color may be calculated in a similar manner to the case of adapting the color tone described with reference to fig. 6. When adapting the luminance Y obtained by conversion from the RGB values, the luminance Y1 of the exoscope image 10 is converted into the luminance Y2 of the endoscope image 12 by the following formula (2). For the already-matched regions A and B, which show the same position of the same subject, in the case where the luminance values at corresponding positions (indicated by × marks in fig. 9) are Y2 for the endoscope image 12 and Y1 for the exoscope image 10, the luminance relationship between the two images can be represented by a linear formula as shown in the following formula (2). Note that, in formula (2), reference character a denotes a conversion coefficient.

Y2=a·Y1...(2)

The brightness is adjusted by applying a gain to the darker image according to the relationship of R, G, B, or Y. Fig. 9 shows an example in which the exoscope image 10 is adapted to the endoscope image 12. The conversion coefficient may be applied to the entire screen by calculating one coefficient mainly from the central region of the image, or may be applied per region by calculating a coefficient for each corresponding region in the image separately. By calculating and applying a coefficient for each region, the contrast between the exoscope image 10 and the endoscope image 12 can also be made uniform.
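The per-region calculation of the coefficient of formula (2) can be sketched as follows: a least-squares gain is fitted independently for each region, which is what matches the contrast and not just the global brightness. The toy images and names are illustrative.

```python
def region_gains(reference, target, region_size):
    """Fit the gain a of formula (2) (reference ≈ a * target) by least
    squares, independently for each region of the matched images."""
    gains = []
    for r0 in range(0, len(reference), region_size):
        row = []
        for c0 in range(0, len(reference[0]), region_size):
            num = den = 0.0
            for r in range(r0, r0 + region_size):
                for c in range(c0, c0 + region_size):
                    num += reference[r][c] * target[r][c]
                    den += target[r][c] ** 2
            row.append(num / den if den else 1.0)
        gains.append(row)
    return gains

# The left half of the reference is dimmer than the right half, while
# the target is uniformly dark: per-region gains restore both the
# brightness and the brightness difference (i.e., the contrast).
reference = [[100, 100, 200, 200]] * 4
target = [[50, 50, 50, 50]] * 4
gains = region_gains(reference, target, region_size=2)
print(gains)  # → [[2.0, 4.0], [2.0, 4.0]]
```

A single global gain (2.67 here) would match the average brightness but leave the target flat; the per-region coefficients reproduce the contrast as described above.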

6. Adjustment of sense of resolution and depth of field between an endoscope image and an exoscope image

Fig. 10 is a schematic diagram showing an example of adapting the sense of resolution and the depth of field between the endoscope image 12 and the exoscope image 10. The conversion coefficient is calculated by replacing the RGB values in fig. 6 and formula (1) with a value DR, defined as the difference between the maximum value and the minimum value of the pixel values in a predetermined region around the pixel of interest. For the already-matched regions A and B, which show the same position of the same subject, in the case where the values DR at corresponding positions (indicated by × marks in fig. 10) are DRB for the endoscope image 12 and DRA for the exoscope image 10, the relationship between the two images can be represented by a linear formula as shown in the following formula (3). Note that, in formula (3), reference character a denotes a conversion coefficient.

DRB=a·DRA...(3)

According to the DR ratio, the intensity of the enhancement processing of the image having the smaller DR is increased. Also in this case, the conversion coefficient may be applied to the entire screen by calculating one coefficient mainly from the central region of the image, or may be applied per region by calculating a coefficient for each corresponding region separately. Further, by performing the enhancement processing with the image having the deeper depth of field as the reference, the apparent depth of field of the other image can also be increased.
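A sketch of the DR measure and the resulting choice of enhancement strengths follows. This is illustrative only; the actual enhancement filter (e.g., unsharp masking) is not specified here, and the names are assumptions.

```python
def local_dr(img, r, c, radius=1):
    """DR of formula (3): maximum minus minimum pixel value in a
    window around the pixel of interest (r, c)."""
    vals = [img[i][j]
            for i in range(max(0, r - radius), min(len(img), r + radius + 1))
            for j in range(max(0, c - radius),
                           min(len(img[0]), c + radius + 1))]
    return max(vals) - min(vals)

def enhancement_strengths(dr_exo, dr_endo, base=1.0):
    """Give the image with the smaller DR a proportionally stronger
    enhancement, leaving the sharper image at the base strength."""
    if dr_exo < dr_endo:
        return base * dr_endo / dr_exo, base
    if dr_endo < dr_exo:
        return base, base * dr_exo / dr_endo
    return base, base

sharp = [[0, 200, 0], [0, 0, 0], [0, 0, 0]]  # high local contrast
soft = [[0, 50, 0], [0, 0, 0], [0, 0, 0]]    # low local contrast
dr_a = local_dr(sharp, 1, 1)  # 200
dr_b = local_dr(soft, 1, 1)   # 50
print(enhancement_strengths(dr_a, dr_b))  # → (1.0, 4.0)
```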

7. Noise adjustment between an endoscope image and an exoscope image

Fig. 11 is a schematic diagram showing an example of adapting noise between the exoscope image 10 and the endoscope image 12. The method of calculating the conversion coefficient is similar to the case of adapting the color tone described with reference to fig. 6: the RGB values in fig. 6 and formula (1) are replaced with the standard deviation σ of the pixel values in a predetermined region around the pixel of interest.

Also in this case, for the already-matched regions A and B, which show the same position of the same subject, in the case where the noise values (standard deviations σ) at corresponding positions (indicated by × marks in fig. 11) are σB for the endoscope image 12 and σA for the exoscope image 10, the noise relationship between the two images can be represented by a linear formula as shown in the following formula (4). Note that, in formula (4), reference character a denotes a conversion coefficient.

σ_A = a · σ_B ... (4)

Based on the noise ratio, the noise reduction (NR) strength for the noisier image is increased. Further, in addition to simply adjusting the noise reduction strength up or down, when noise reduction is applied to the noisier image, it can be guided by edge information taken from the less noisy image, thereby achieving higher-performance noise reduction.
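Purely as an illustration (not the disclosed implementation), the σ-ratio-driven noise adjustment can be sketched as follows. The window size, the use of repeated box-filter passes as the NR primitive, and the rounding of the ratio into a pass count are all assumptions of this sketch; edge-guided NR from the cleaner image is omitted for brevity:

```python
import numpy as np

def local_sigma(img, y, x, half=3):
    """Standard deviation sigma of the pixel values in a window around (y, x)."""
    win = img[max(y - half, 0):y + half + 1, max(x - half, 0):x + half + 1].astype(float)
    return float(win.std())

def denoise(img, strength):
    """Box-filter noise reduction; a larger `strength` means more smoothing passes."""
    out = img.astype(float)
    h, w = img.shape
    for _ in range(int(strength)):
        p = np.pad(out, 1, mode='edge')
        out = sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0
    return out

def match_noise(img_a, img_b, pts_a, pts_b):
    """Apply stronger NR to whichever image is noisier, per the sigma ratio of formula (4)."""
    sa = np.mean([local_sigma(img_a, y, x) for y, x in pts_a])
    sb = np.mean([local_sigma(img_b, y, x) for y, x in pts_b])
    if sa > sb:
        return denoise(img_a, round(sa / sb)), img_b
    return img_a, denoise(img_b, round(sb / sa))
```

In a real system the NR primitive would be an edge-preserving filter, with edges taken from the less noisy image as the text describes.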

8. Adjustment of direction and angle of view between an exoscope image and an endoscope image

Fig. 12 is a schematic diagram showing an example of adapting the image direction and the angle of view between the endoscope image 12 and the exoscope image 10. In this case, geometric correction is performed based on the matching result between the exoscope image 10 and the endoscope image 12, and correction that fits the orientation of the endoscope image 12 to the orientation of the exoscope image 10 is performed.

Specifically, the positional relationship and the correspondence between the exoscope image 10 and the endoscope image 12 are acquired by, for example, block matching or the like, and geometric correction is performed according to the acquired result. In the example shown in Fig. 12, the image direction of the endoscope image 12 is adapted to the image direction of the exoscope image 10. Note that the endoscope direction appearing in the exoscope image 10 may be detected, so that the correction is switched on and off automatically according to the detected direction. For example, if the endoscope is oriented such that the endoscope image 12 and the exoscope image 10 are upside down with respect to each other, the correction is turned on.
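The block matching and orientation correction above can be sketched as follows (an illustrative NumPy sketch only; the SAD criterion, search range, and the restriction to 90-degree rotations are simplifying assumptions, whereas a real system would perform general geometric correction):

```python
import numpy as np

def block_match(ref, tgt, y, x, half=4, search=6):
    """Find the (dy, dx) offset in `tgt` that best matches the block around (y, x) in `ref` (SAD)."""
    blk = ref[y - half:y + half + 1, x - half:x + half + 1].astype(float)
    best, best_dy, best_dx = np.inf, 0, 0
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = tgt[y + dy - half:y + dy + half + 1,
                       x + dx - half:x + dx + half + 1].astype(float)
            sad = np.abs(blk - cand).sum()
            if sad < best:
                best, best_dy, best_dx = sad, dy, dx
    return best_dy, best_dx

def orientation_correction(ref, tgt):
    """Return the 0/90/180/270-degree rotation of `tgt` whose SAD against `ref` is smallest."""
    candidates = [np.rot90(tgt, k) for k in range(4)]
    sads = [np.abs(ref.astype(float) - c.astype(float)).sum()
            if c.shape == ref.shape else np.inf for c in candidates]
    return candidates[int(np.argmin(sads))]
```

The upside-down case in the text corresponds to `orientation_correction` selecting the 180-degree candidate, i.e. the correction being switched on.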

9. Adjustment of depth perception between an exoscope image and an endoscope image

Fig. 13 is a schematic diagram showing an example of the case where the exoscope image 10, which can be captured in 3D, and the endoscope image 12, which can also be captured in 3D, are used together. For example, when the line of sight is switched from the monitor 300 displaying the 3D exoscope image 10 to the monitor 300 displaying the 3D endoscope image 12, if the sense of depth at the point of interest on the exoscope image 10 differs significantly from that at the corresponding point of interest (corresponding point) on the endoscope image 12, the user feels discomfort when switching the line of sight.

Therefore, the parallax d at the point of interest is detected from the left- and right-eye images of the exoscope image 10 by block matching processing or the like, and similarly, the parallax d' at the corresponding point is detected from the left- and right-eye images of the endoscope image 12. Then, parallax adjustment processing is performed on the left- and right-eye images of the endoscope image 12 so as to establish d' = d.

Conversely, when the line of sight is switched from the monitor 300 displaying the 3D endoscope image 12 to the monitor 300 displaying the 3D exoscope image 10, the parallax adjustment is performed on the left- and right-eye images on the exoscope image 10 side so as to establish d = d'.

Through the parallax adjustment described above, the sense of depth at the point of interest on the exoscope image 10 and at the corresponding point on the endoscope image 12 is adjusted to the same degree, and the user's discomfort when switching the line of sight can therefore be reduced.
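For illustration only, the disparity detection and the d' = d adjustment can be sketched as below. The 1-D SAD search, the symmetric half-shift of the two eye images, and the use of a whole-image roll in place of a proper sub-pixel warp are all assumptions of this sketch:

```python
import numpy as np

def disparity_at(left, right, y, x, half=4, search=16):
    """Horizontal disparity d at (y, x) via 1-D block matching between the eye images.
    Convention: a feature at column x in `left` appears at column x - d in `right`."""
    blk = left[y - half:y + half + 1, x - half:x + half + 1].astype(float)
    best, best_d = np.inf, 0
    for d in range(-search, search + 1):
        cand = right[y - half:y + half + 1, x - d - half:x - d + half + 1].astype(float)
        if cand.shape != blk.shape:
            continue
        sad = np.abs(blk - cand).sum()
        if sad < best:
            best, best_d = sad, d
    return best_d

def adjust_parallax(left, right, current_d, target_d):
    """Shift the eye images horizontally so that the point-of-interest disparity becomes target_d."""
    shift = (target_d - current_d) / 2.0
    # split the correction equally between the two eyes, in opposite directions
    return (np.roll(left, int(round(shift)), axis=1),
            np.roll(right, -int(round(shift)), axis=1))
```

Applying `adjust_parallax` to the endoscope-side pair with `target_d` set to the exoscope-side parallax d realises d' = d; swapping the roles gives the converse adjustment.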

In the case where the depth range differs significantly between the endoscope image 12 and the exoscope image 10, fitting the parallax to one image may produce a subject that protrudes or recedes excessively in the 3D image. Therefore, the depth is estimated from the left- and right-eye images of each of the endoscope image 12 and the exoscope image 10, and in the case where the depth range of one image is much larger than that of the other (that is, where the difference between the two depth ranges exceeds a predetermined value), the parallax adjustment may be skipped, or processing that reduces the degree of the parallax adjustment may be performed.

Note that the point of interest on the image in this embodiment may be specified by the user via a user interface (UI) such as a pointing device, or may be detected automatically by a line-of-sight detection device. In addition, a surgical instrument such as an electric scalpel or forceps may be detected so that, for example, the tip of the instrument, which the surgeon often watches on the image, is set as the point of interest. Alternatively, the central portion of the image, on which the viewpoint is generally focused, may be set as the point of interest.

10. Configuration example including multiple CCUs connected with multiple camera units

In the above example, the surgical system 1000 in which a plurality of camera units 100 are connected to the CCU 200 has been described; however, the present disclosure may also be applied to a system including a plurality of CCUs 200, each having a plurality of camera units 100 connected to it, in which each CCU 200 is connected to an integration apparatus 600. Fig. 14 is a schematic diagram showing such a system 2000, in which each of a plurality of CCUs 200, connected to a plurality of camera units 100, is connected to the integration apparatus 600.

In the system 2000 shown in Fig. 14, information on the images of the camera units 100 is transmitted to the integration apparatus 600 via the CCUs 200. In the system 2000, the signal processing unit 210 shown in Fig. 3 is provided in the integration apparatus 600 instead of in the CCUs 200. The integration apparatus 600 performs the processing of adapting the appearance between the images transmitted from the respective camera units 100 by the function of the signal processing unit 210. Therefore, according to the system 2000 shown in Fig. 14, the appearance of the images of the respective camera units 100, connected to the respective CCUs 200 connected to the integration apparatus 600, can be unified.

Note that the above description mainly concerns the case where observation is performed using both the exoscope image 10 and the endoscope image 12; however, as described with reference to Fig. 5, the present disclosure may also be applied to the case where observation is performed by switching between the exoscope image 10 and the endoscope image 12. In this case, the image immediately before switching is held, and various corrections are performed on the image after switching using the held image. Further, the image to be corrected is not limited to the exoscope image 10 or the endoscope image 12, and may be changed according to the situation.

As described above, according to the present embodiment, in the case where two different camera units 100 are used together to image the same subject, images having the same appearance can be generated. Therefore, whether the two output images are displayed side by side or displayed by switching between them, observation without any sense of discomfort is achieved, and the relationship between the two images becomes easy to understand.

Advantageous embodiments of the present disclosure have been described above in detail with reference to the accompanying drawings; however, the technical scope of the present disclosure is not limited to these examples. It is apparent that persons having ordinary knowledge in the technical field of the present disclosure can conceive various changes or modifications within the scope of the technical ideas described in the claims, and such changes or modifications are of course understood to fall within the technical scope of the present disclosure.

Further, the effects described in this specification are merely illustrative or exemplary, and are not restrictive. That is, the technology according to the present disclosure may exhibit other effects apparent to those skilled in the art from the description of this specification, in addition to or instead of the above-described effects.

Note that the configuration described below is also within the technical scope of the present disclosure.

(1) A medical system, comprising:

a plurality of surgical imaging devices; and

a control unit to which each of the surgical imaging devices is connected, the control unit including a signal processing unit that coordinates images captured by the respective surgical imaging devices.

(2) The medical system according to the above (1), wherein the plurality of surgical imaging devices includes at least two of an endoscope, an exoscope, a microscope, and a surgical scene camera.

(3) The medical system according to the above (1) or (2), wherein the signal processing unit switches whether to perform the coordination according to occurrence of an adjustment trigger.

(4) The medical system according to any one of (1) to (3) above, wherein

The signal processing unit performs processing for coordination according to the occurrence of the adjustment trigger, and

the adjustment trigger occurs by user operation.

(5) The medical system according to any one of (1) to (3) above, wherein

The signal processing unit performs processing for coordination according to the occurrence of the adjustment trigger, and

the adjustment trigger occurs in the case where the plurality of surgical imaging devices image the same subject.

(6) The medical system according to any one of (1) to (3) above, wherein

The signal processing unit performs processing for coordination according to the occurrence of the adjustment trigger, and

the adjustment trigger occurs according to the state of the surgeon performing the operation.

(7) The medical system according to any one of (1) to (3) above, wherein

The signal processing unit performs processing for coordination according to the occurrence of the adjustment trigger, and

the adjustment trigger occurs based on identification information for identifying the plurality of surgical imaging devices.

(8) The medical system according to any one of (1) to (7) above, wherein the signal processing unit performs processing of adapting colors between images captured by respective ones of the plurality of surgical imaging devices.

(9) The medical system according to any one of (1) to (7) above, wherein the signal processing unit performs processing of adapting brightness between images captured by respective ones of the plurality of surgical imaging devices.

(10) The medical system according to any one of (1) to (7) above, wherein the signal processing unit performs processing of adapting contrast between images captured by respective ones of the plurality of surgical imaging devices.

(11) The medical system according to any one of (1) to (7) above, wherein the signal processing unit performs processing of adapting resolution between images captured by respective ones of the plurality of surgical imaging devices.

(12) The medical system according to any one of (1) to (7) above, wherein the signal processing unit performs processing of adapting noise between images captured by respective ones of the plurality of surgical imaging devices.

(13) The medical system according to any one of (1) to (7) above, wherein the signal processing unit performs processing of adapting a depth of field between images captured by respective ones of the plurality of surgical imaging devices.

(14) The medical system according to any one of (1) to (7) above, wherein the signal processing unit performs processing of adapting an amount of shake between images captured by respective ones of the plurality of surgical imaging devices.

(15) The medical system according to any one of (1) to (7) above, wherein the signal processing unit performs processing of adapting a depth between stereoscopic images captured by respective ones of the plurality of surgical imaging devices.

(16) The medical system according to any one of (1) to (7) above, wherein the signal processing unit performs processing of adapting a viewing angle between images captured by respective ones of the plurality of surgical imaging devices.

(17) The medical system according to any one of (1) to (16) above, wherein the signal processing unit coordinates an image captured by another of the surgical imaging devices with reference to an image captured by one of the plurality of surgical imaging devices.

(18) The medical system according to any one of (1) to (16) above, wherein the signal processing unit coordinates the images captured by the plurality of surgical imaging devices with reference to an arbitrary target image.

(19) A control unit to which each of a plurality of surgical imaging devices is connected, the control unit including a signal processing unit that coordinates images captured by the respective surgical imaging devices.

(20) A medical system, comprising:

a plurality of surgical imaging devices;

a control unit to which each of the surgical imaging devices is connected; and

an integrated device to which each of the plurality of control units is connected, the integrated device including a signal processing unit that coordinates images captured by the respective surgical imaging devices.

List of reference numerals

100 camera unit

200 CCU

210 signal processing unit

600 Integrated device

1000 surgical system
