Distance measuring system, light receiving module, and method of manufacturing band pass filter

Document No.: 1102502 · Publication date: 2020-09-25

Reading note: This technology, "Distance measuring system, light receiving module, and method of manufacturing band pass filter", was created by 横川创造, 菊池勇辉, 诹访泰介, and 千代田亮 on 2019-02-19. Abstract: This ranging system is provided with: a light source unit that emits infrared light to an object; a light receiving unit that receives infrared light from the object; and an arithmetic processing unit that acquires distance information about the object based on data from the light receiving unit, wherein an optical member including a band-pass filter that selectively transmits infrared light in a prescribed wavelength range is provided on a light receiving surface side of the light receiving unit, and the light incident surface of the band-pass filter has a concave shape.

1. A ranging system, comprising:

a light source unit that emits infrared light to a target object;

a light receiving unit that receives the infrared light from the target object; and

an arithmetic processing unit that obtains information on a distance to the target object based on data from the light receiving unit,

wherein an optical member including a band-pass filter selectively transparent to infrared light in a predetermined wavelength range is arranged on a light receiving surface side of the light receiving unit, and

the band pass filter has a concave light incident surface.

2. The ranging system according to claim 1,

the optical member includes a lens disposed on a light incident surface side of the band-pass filter, and

an incident angle of the light at the maximum image height with respect to a light incident surface of the band-pass filter is 10 degrees or less.

3. The ranging system according to claim 1,

the half width of the transmission band of the band-pass filter is 50nm or less.

4. The ranging system according to claim 1,

the band-pass filter comprises

a first filter transparent to light in a predetermined wavelength range of infrared light, and

a second filter opaque to visible light and transparent to infrared light.

5. The ranging system according to claim 4,

the first filter and the second filter are stacked and formed on one side of a substrate.

6. The ranging system according to claim 4,

the first filter is formed on one surface of the substrate, and

the second filter is formed on the other surface of the substrate.

7. The ranging system according to claim 4,

the first filter is arranged on the light incidence surface side, and

the second filter is arranged on the light receiving unit side.

8. The ranging system according to claim 7,

the second filter has a concave shape conforming to the light incident surface.

9. The ranging system according to claim 7,

the second filter has a planar shape.

10. The ranging system according to claim 4,

the second filter is arranged on the light incidence surface side, and

the first filter is arranged on the light receiving unit side.

11. The ranging system according to claim 10,

the first filter has a concave shape conforming to the light incident surface.

12. The ranging system according to claim 1,

the light source unit includes an infrared laser element or an infrared light emitting diode element.

13. The ranging system according to claim 1,

the light source unit emits infrared light having a center wavelength of about 850nm, about 905nm, or about 940 nm.

14. The ranging system according to claim 1,

the arithmetic processing unit obtains distance information based on a time of flight of light reflected from the target object.

15. The ranging system according to claim 1,

the infrared light is emitted to the target object in a predetermined pattern, and

the arithmetic processing unit obtains distance information based on a pattern of light reflected from the target object.

16. A light receiving module comprising:

a light receiving unit that receives infrared light; and

an optical member arranged on a light receiving surface side of the light receiving unit and including a band-pass filter selectively transparent to infrared light in a predetermined wavelength range,

wherein the band pass filter has a concave light incident surface.

17. The light receiving module of claim 16,

the optical member includes a lens disposed on a light incident surface side of the band-pass filter.

18. The light receiving module of claim 17,

an incident angle of the light at the maximum image height with respect to a light incident surface of the band-pass filter is 10 degrees or less.

19. A method of manufacturing a bandpass filter, the method comprising:

forming a band-pass filter layer on a diaphragm that is transparent at least to an infrared light component and is capable of plastic deformation;

placing the diaphragm on which the band-pass filter layer has been formed on a mold having a recess formed in one surface and an opening passing through from the recess to the other surface; and

sucking the air in the recess from the other surface through the opening.

20. The method of manufacturing a bandpass filter according to claim 19, the method further comprising:

dividing the diaphragm on which the band-pass filter layer has been formed into a predetermined shape including the concave surface formed by sucking the air in the recess.

Technical Field

The present disclosure relates to a ranging system, a light receiving module, and a method of manufacturing a band pass filter.

Background

In recent years, a ranging system has been proposed in which information on a distance to a target object is obtained by emitting light to the target object and receiving the reflected light (for example, see patent document 1). A configuration that obtains distance information by emitting infrared light and receiving its reflection has advantages: for example, the light source is not very noticeable, and the operation can run in parallel with capturing a normal visible light image.

In terms of reducing interference affecting the measurement, it is preferable to make the wavelength range of the infrared light to be imaged as narrow as possible. For this reason, a band-pass filter transparent only to a specific wavelength band is generally disposed in front of the imaging element.

Reference list

Patent document

Patent document 1: Japanese Patent Application Laid-Open No. 2017-150893

Disclosure of Invention

Problems to be solved by the invention

In order to cope with the reduction in height of the housing of the electronic apparatus, a light receiving module used in a portable electronic apparatus is forced to adopt an optical system with so-called pupil correction, in which the principal ray angle differs greatly between the center and the periphery of the imaging element. The band characteristic of a band-pass filter shifts in the wavelength direction according to the angle of the incident light. Therefore, in order to receive the target light reliably at both the center and the periphery of the light receiving unit including the imaging element, the bandwidth of the band-pass filter must be set wider than would normally be necessary. This increases the influence of disturbing light.

Accordingly, an object of the present disclosure is to provide a ranging system, a light receiving module, and a method of manufacturing a band pass filter, which enable setting a narrow bandwidth for the band pass filter and reducing the influence of disturbance light.

Solution to the problem

In order to achieve the above object, a ranging system according to the present disclosure includes:

a light source unit that emits infrared light to a target object;

a light receiving unit that receives infrared light from a target object; and

an arithmetic processing unit that obtains information on a distance to the target object based on the data from the light receiving unit,

wherein an optical member including a band-pass filter selectively transparent to infrared light in a predetermined wavelength range is arranged on a light receiving surface side of the light receiving unit, and

the band-pass filter has a concave light incident surface.

In order to achieve the above object, a light receiving module according to the present disclosure includes:

a light receiving unit that receives infrared light; and

an optical member arranged on a light receiving surface side of the light receiving unit and including a band-pass filter selectively transparent to infrared light in a predetermined wavelength range,

wherein the band pass filter has a concave light incident surface.

In order to achieve the above object, a method of manufacturing a band pass filter according to the present disclosure includes:

forming a band-pass filter layer on a diaphragm that is transparent to at least an infrared light component and undergoes plastic deformation;

placing the diaphragm on which the band-pass filter layer has been formed on a mold in which a recess is formed on one surface and an opening passing from the recess to the other surface is formed; and is

Air in the recess is sucked from the other surface through the opening.

Drawings

Fig. 1 is a schematic diagram showing a basic configuration of a ranging system according to a first embodiment of the present disclosure.

Fig. 2 is a schematic view showing a configuration of an optical member in the distance measuring system of the reference example.

Fig. 3A is a schematic diagram showing a relationship between an image height and an angle with respect to a principal ray angle (CRA) in an optical member of a reference example. Fig. 3B is a schematic diagram showing characteristics of a band-pass filter in the optical member of the reference example.

Fig. 4A is a schematic view showing the configuration of an optical member in the distance measuring system according to the first embodiment. Fig. 4B is a schematic diagram showing the characteristics of the band-pass filter in the optical member according to the first embodiment.

Fig. 5 is a schematic diagram showing the relationship between the wavelength shift and the angle relative to the CRA in the bandpass filter.

Fig. 6A and 6B are schematic diagrams showing the configuration of a band-pass filter. Fig. 6C is a diagram showing the characteristics of the band-pass filter.

Fig. 7A is a schematic diagram showing the characteristics of the first filter. Fig. 7B is a diagram showing the characteristics of the second filter.

Fig. 8 is a diagram showing a configuration example of the first filter, and fig. 8A is a table showing a stacking relationship. Fig. 8B shows the transmission characteristics of the filter.

Fig. 9 is a diagram showing a configuration example of the second filter, and fig. 9A is a table showing a stacking relationship. Fig. 9B shows the transmission characteristics of the filter.

Fig. 10A, 10B, 10C, and 10D are schematic diagrams illustrating a first method of manufacturing a band pass filter.

Fig. 11A, 11B, 11C, and 11D are schematic diagrams illustrating a second method of manufacturing a band pass filter.

Fig. 12A, 12B, and 12C are schematic diagrams showing another configuration example of the band-pass filter.

Fig. 13A, 13B, 13C, and 13D are schematic diagrams illustrating a third method of manufacturing a bandpass filter.

Fig. 14A, 14B, 14C, and 14D are schematic diagrams illustrating a fourth method of manufacturing a band pass filter.

Fig. 15 is a schematic view showing a configuration of a sheet used in a fifth method of manufacturing a band-pass filter.

Fig. 16A, 16B, and 16C are schematic diagrams illustrating vacuum formation in a fifth method of manufacturing a bandpass filter.

Fig. 17 is a schematic view showing a press working in a fifth method of manufacturing a band pass filter.

Fig. 18A and 18B are schematic views showing a method of manufacturing a light receiving module.

Fig. 19A and 19B are schematic diagrams showing the structure of a light receiving module.

Fig. 20 is a schematic view showing the structure of a light receiving module including a lens.

Fig. 21A, 21B, and 21C are schematic diagrams showing the configuration of a semiconductor device used in a ranging system.

Fig. 22 is a schematic diagram showing a first modification of the distance measuring system.

Fig. 23 is a schematic diagram showing a second modification of the distance measuring system.

Fig. 24 is a schematic diagram showing a third modification of the distance measuring system.

Fig. 25 is a schematic view showing a fourth modification of the distance measuring system.

Fig. 26A and 26B are schematic diagrams showing an example of the arrangement of a light receiving unit and a light source unit in a portable electronic apparatus.

Fig. 27 is a block diagram showing an example of a schematic configuration of a vehicle control system.

Fig. 28 is an explanatory view showing an example of the mounting positions of the vehicle exterior information detector and the imaging unit.

Fig. 29 is a diagram showing an example of a schematic configuration of an endoscopic surgery system.

Fig. 30 is a block diagram showing an example of the functional configuration of the camera head and CCU shown in fig. 29.

Detailed Description

The present disclosure will be described below based on embodiments with reference to the accompanying drawings. The present disclosure is not limited to the embodiments, and various numerical values, materials, and the like in the embodiments are examples. In the following description, the same elements or elements having the same function will be denoted by the same reference numerals without redundant description. Note that description will be made in the following order.

1. General description of ranging system and light receiving module according to the present disclosure

2. First embodiment

3. First modification

4. Second modification example

5. Third modification example

6. Fourth modification example

7. First application example

8. Second application example

9. Arrangements of the present disclosure

[ general description of ranging system and light receiving module according to the present disclosure ]

As described above, the ranging system according to the present disclosure includes:

a light source unit that emits infrared light to a target object;

a light receiving unit that receives infrared light from a target object; and

an arithmetic processing unit that obtains information on a distance to the target object based on the data from the light receiving unit,

wherein an optical member including a band-pass filter selectively transparent to infrared light in a predetermined wavelength range is arranged on a light receiving surface side of the light receiving unit, and

the band-pass filter has a concave light incident surface.

The ranging system according to the present disclosure may have a configuration in which,

the optical member includes a lens disposed on a light incident surface side of the band-pass filter, and

the light at the maximum image height has an incident angle of 10 degrees or less with respect to a light incident surface of the band-pass filter.

The ranging system of the present disclosure, including the above-described preferred configuration, may have a configuration in which,

the half width of the transmission band of the band-pass filter is 50nm or less.

The ranging system of the present disclosure, including the various preferred configurations described above, may have a configuration in which,

the band-pass filter comprises

a first filter transparent to light in a predetermined wavelength range of infrared light, and

a second filter opaque to visible light and transparent to infrared light.

In this case, in one configuration,

the first filter and the second filter may be stacked and formed on one side of the substrate.

Optionally, in a configuration,

the first filter may be formed on one surface of the substrate, and

the second filter may be formed on the other surface of the substrate.

The ranging system of the present disclosure, including the various preferred configurations described above, may have a configuration in which,

the first filter is arranged on the light incidence surface side, and

the second filter is arranged on the light receiving unit side.

In this case, the second filter may have a concave shape conforming to the light incident surface. Alternatively, the second filter may have a planar shape.

Optionally, in a configuration,

the second filter may be arranged on the light incident surface side, and

the first filter may be disposed on the light receiving unit side.

In this case, the first filter may have a concave shape conforming to the light incident surface.

The ranging system of the present disclosure, including the various preferred configurations described above, may have a configuration in which,

the light source unit includes an infrared laser element or an infrared light emitting diode element.

The ranging system of the present disclosure, including the various preferred configurations described above, may have a configuration in which,

the light source unit emits infrared light having a center wavelength of about 850nm, about 905nm, or about 940 nm.

The ranging system of the present disclosure, including the various preferred configurations described above, may have a configuration in which,

the arithmetic processing unit obtains distance information based on a time of flight of light reflected from the target object.

Optionally, in a configuration,

the infrared light may be emitted to the target object in a predetermined pattern, and

the arithmetic processing unit may obtain the distance information based on a pattern of light reflected from the target object.
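The direct time-of-flight computation mentioned above can be sketched in a few lines. This is a minimal illustration, not code from the source: the function names are made up, and the phase-based (indirect ToF) variant and the 20 MHz modulation frequency are included as a common alternative rather than something this document specifies.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def distance_from_tof(round_trip_time_s: float) -> float:
    """Direct ToF: the pulse travels to the target and back, so halve the path."""
    return C * round_trip_time_s / 2.0

def distance_from_phase(phase_rad: float, mod_freq_hz: float) -> float:
    """Indirect ToF: distance from the phase shift of amplitude-modulated light.
    Unambiguous only up to C / (2 * mod_freq_hz)."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)
```

For example, a round trip of about 6.67 ns corresponds to a target roughly 1 m away; with 20 MHz modulation the unambiguous range would be about 7.5 m.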

As described above, the light receiving module according to the present disclosure includes:

a light receiving unit that receives infrared light; and

an optical member arranged on a light receiving surface side of the light receiving unit and including a band-pass filter selectively transparent to infrared light in a predetermined wavelength range,

wherein the band pass filter has a concave light incident surface.

The light receiving module according to the present disclosure may have a configuration in which,

the optical member includes a lens disposed on a light incident surface side of the band-pass filter. In this case, in this configuration, the incident angle of the light at the maximum image height with respect to the light incident surface of the band-pass filter may be 10 degrees or less.

As described above, the method of manufacturing a band pass filter according to the present disclosure includes:

forming a band-pass filter layer on a diaphragm that is transparent at least to an infrared light component and is capable of plastic deformation;

placing the diaphragm on which the band-pass filter layer has been formed on a mold having a recess formed in one surface and an opening passing through from the recess to the other surface; and

sucking the air in the recess from the other surface through the opening.

The method of manufacturing a band pass filter according to the present disclosure may have a configuration in which,

the diaphragm on which the band-pass filter layer has been formed is divided into a predetermined shape including the concave surface formed by sucking the air in the recess.

In the ranging system and the light receiving module of the present disclosure including the various preferred configurations described above, for example, a photoelectric conversion element or an imaging element (such as a CMOS sensor or a CCD sensor in which pixels including various pixel transistors are arranged in a two-dimensional matrix in a row direction and a column direction) may be used as the light receiving unit.

In the ranging system of the present disclosure including the various preferred configurations described above, the arithmetic processing unit that obtains information on the distance to the target object based on data from the light receiving unit may operate in hardware, through physical connections, or based on a program. The same applies to the controller and the like that control the entire ranging system.

[ first embodiment ]

The first embodiment relates to a ranging system and a light receiving module according to the present disclosure.

Fig. 1 is a schematic diagram showing a basic configuration of a ranging system according to a first embodiment of the present disclosure.

The ranging system 1 includes:

a light source unit 70 that emits infrared light to a target object;

a light receiving unit 20 that receives infrared light from a target object; and

an arithmetic processing unit 40 that obtains information on the distance to the target object based on the data from the light receiving unit 20.

An optical member 10 is disposed on a light receiving surface side of the light receiving unit 20, the optical member 10 including a band-pass filter 12 selectively transparent to infrared light in a predetermined wavelength range. The band-pass filter 12 has a light incident surface of a concave shape. The optical member 10 includes a lens (lens group) 11 arranged on the light incident surface side of the band-pass filter 12.

The light receiving unit 20 is constituted by a CMOS sensor or the like, and a signal of the light receiving unit 20 is digitized by the analog-to-digital converting unit 30 and sent to the arithmetic processing unit 40. These operations are controlled by the controller 50.

The light source unit 70 emits, for example, infrared light having a wavelength in the range of about 700 nm to 1100 nm. The light source unit 70 includes a light emitting element such as an infrared laser element or an infrared light emitting diode element. The deviation from the center wavelength is about 1 nm for the former and about 10 nm for the latter. The light source unit 70 is driven by the light source driving unit 60 controlled by the controller 50.

The wavelength of the infrared light emitted by the light source unit 70 may be appropriately selected according to the intended use and configuration of the ranging system. For example, a value such as about 850nm, about 905nm, or about 940nm may be selected as the center wavelength.

The light receiving unit 20, the analog-to-digital conversion unit 30, the arithmetic processing unit 40, the controller 50, and the light source driving unit 60 are formed on a semiconductor substrate including, for example, silicon. They may be configured as a single chip, or may be configured as a plurality of chips according to their functions. This will be described with reference to fig. 21A described later.

The ranging system 1 may be configured as one unit suitable for being built into, for example, a device, or its components may be configured separately.

The basic configuration of the ranging system 1 has been described above. Next, in order to facilitate understanding of the present disclosure, a reference example of a configuration in which a band-pass filter has a planar light incident surface and a problem thereof will be described.

Fig. 2 is a schematic diagram showing a configuration of an optical member in the ranging system of the reference example.

The optical member 90 of the reference example is different from the optical member 10 shown in fig. 1 in that the optical member 90 has a planar band-pass filter 92.

Fig. 3A is a schematic diagram showing a relationship between an image height and an angle with respect to a principal ray angle (CRA) in the optical member of the reference example. Fig. 3B is a schematic diagram showing characteristics of a band-pass filter in the optical member of the reference example.

For example, when the lens is configured to cope with height reduction, it is forced into a configuration in which the chief ray angle differs greatly between the central portion and the peripheral portion of the light receiving unit 20. Fig. 3A shows the relationship between the image height and the angle with respect to the CRA in this case. The graph is normalized to the maximum image height at the light receiving unit 20, which generally corresponds to the four corners of the screen. As the figure shows, the angle with respect to the CRA changes by about 30 degrees between an image height of 0 and the maximum image height.

Therefore, the incident angle of light on the band-pass filter 92 also differs by about 30 degrees between light incident on the central portion of the light receiving unit 20 and light incident on the peripheral portion. When light is obliquely incident on the band-pass filter 92, the optical path length through the filter increases, so that the transmission characteristic shifts to the short wavelength side.

Therefore, for example, in the case where the reception target is infrared light with a center wavelength of 905 nm, the band center of the band-pass filter 92 must be set to a wavelength longer than 905 nm for an angle of 0 with respect to the CRA. Furthermore, the bandwidth must be set so that 905 nm is transmitted over the whole range of 0 to 30 degrees with respect to the CRA. The bandwidth of the band-pass filter 92 therefore has to be wider than would normally be necessary, which increases the influence of interference such as ambient light.

The reference example of the configuration in which the band-pass filter has the planar light incident surface and the problem thereof have been described above.

Subsequently, the first embodiment will be described.

Fig. 4A is a schematic diagram showing the configuration of an optical member in the ranging system according to the first embodiment. Fig. 4B is a schematic diagram showing the characteristics of the band-pass filter in the optical member according to the first embodiment.

As shown in fig. 4A, the band-pass filter 12 in the first embodiment has a light incident surface having a concave shape. With this arrangement, variations in the incident angle of light with respect to the band-pass filter 12 are reduced.

Therefore, for example, in the case where the reception target is infrared light with a center wavelength of 905 nm, the band center of the band-pass filter 12 at an angle of 0 with respect to the CRA can be set close to 905 nm. Furthermore, even when light is incident on the peripheral portion of the light receiving unit 20, the shift of the characteristic of the band-pass filter 12 toward the short wavelength side is small. The bandwidth of the band-pass filter 12 can therefore be set narrower, the influence of interference can be suppressed, and the measurement accuracy can be improved.

Fig. 5 is a schematic diagram showing the relationship between the wavelength shift and the angle relative to the CRA in the bandpass filter. More specifically, the shift amount of the value on the short wavelength side and the shift amount of the value on the long wavelength side of the transmission band of the band-pass filter 12 are shown.

According to fig. 5, the transmission band of the band-pass filter 12 is shifted by about 20nm at an angle of about 30 degrees with respect to the CRA. On the other hand, in the case where the angle with respect to the CRA is about 10 degrees, the shift amount of the transmission band can be suppressed to about one tenth. Therefore, the shape of the band-pass filter 12 is preferably set so that the incident angle of light at the maximum image height with respect to the light incident surface of the band-pass filter 12 is 10 degrees or less. Further, the half width of the transmission band of the band-pass filter 12 is preferably 50nm or less.
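These numbers are consistent with the standard thin-film relation for oblique incidence, λ(θ) = λ0·√(1 − (sin θ / n_eff)²). The sketch below is illustrative only: the effective index n_eff = 2.4 is an assumed value chosen to reproduce the roughly 20 nm shift at 30 degrees, not a figure given in this document.

```python
import math

def blue_shift_nm(center_nm: float, angle_deg: float, n_eff: float = 2.4) -> float:
    """Short-wavelength shift of a dielectric band-pass filter at oblique incidence."""
    s = math.sin(math.radians(angle_deg)) / n_eff
    return center_nm * (1.0 - math.sqrt(1.0 - s * s))
```

For a 905 nm filter this gives a shift of about 20 nm at 30 degrees but only about 2.4 nm at 10 degrees, roughly the one-tenth reduction noted above.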

The band-pass filter 12 may include a first filter transparent to light within a predetermined wavelength range of infrared light and a second filter opaque to visible light and transparent to infrared light. Configuration examples and manufacturing methods of the band-pass filter 12 will be described below with reference to the drawings.

Fig. 6A and 6B are schematic diagrams showing the configuration of a band-pass filter. Fig. 6C is a diagram showing the band-pass filter characteristics.

Fig. 6A shows a configuration example in which the first filter 12A is arranged on the light incident surface side and the second filter 12B on the light receiving unit 20 side. Fig. 6B shows a configuration example in which the second filter 12B is arranged on the light incident surface side and the first filter 12A on the light receiving unit 20 side. Both exhibit the transmission characteristics shown in fig. 6C.

Fig. 7A is a schematic diagram showing the characteristics of the first filter. Fig. 7B is a diagram showing the characteristics of the second filter.

The optical filter may be constituted by, for example, a multilayer film in which a high refractive index material and a low refractive index material are appropriately stacked. However, even when such a filter is designed to transmit the wavelength band containing the target light, light at other wavelengths, for example at wavelengths in a harmonic (multiple) relation to the target band, also exhibits some transmission, as shown schematically for the first filter 12A in fig. 7A. For this reason, the second filter 12B, which is opaque to visible light and transparent to infrared light as shown in fig. 7B, is also included. The characteristic of the filter as a whole is then as shown in fig. 6C.

Fig. 8 shows a configuration example of the first filter: fig. 8A is a table showing the layer stack, and fig. 8B shows the transmission characteristics of the filter.

In this example, the first filter 12A is constituted by an 11-layer multilayer film. Silicon is used as the high refractive index material, and silicon oxide is used as the low refractive index material.

Fig. 9 shows a configuration example of the second filter: fig. 9A is a table showing the layer stack, and fig. 9B shows the transmission characteristics of the filter.

In this example, the second filter 12B is constituted by a 5-layer multilayer film. Silicon is used as the high refractive index material, and silicon oxide is used as the low refractive index material.

Known methods such as CVD, PVD, and ALD can be used to form the multilayer film; ALD, which has advantages such as high-precision film formation and good coverage, is preferably selected.
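The transmittance of such a silicon/silicon oxide stack can be computed with the standard characteristic-matrix (transfer-matrix) method. The sketch below is illustrative only: it models a simple five-pair quarter-wave Si/SiO2 mirror at 905 nm with assumed, dispersion-free indices (Si ≈ 3.6, SiO2 ≈ 1.45), not the actual 11-layer and 5-layer designs of figs. 8 and 9.

```python
import cmath
import math

def transmittance(layers, wavelength_nm, n_in=1.0, n_sub=1.45):
    """Normal-incidence transmittance of a lossless dielectric multilayer.

    layers: list of (refractive_index, physical_thickness_nm) from the
    incidence side; n_in and n_sub are the incident and substrate indices.
    """
    # Accumulate the product of the characteristic matrices of all layers.
    m00, m01, m10, m11 = 1.0 + 0j, 0j, 0j, 1.0 + 0j
    for n, d in layers:
        delta = 2.0 * math.pi * n * d / wavelength_nm  # phase thickness
        c, s = cmath.cos(delta), cmath.sin(delta)
        a00, a01, a10, a11 = c, 1j * s / n, 1j * n * s, c
        m00, m01, m10, m11 = (m00 * a00 + m01 * a10, m00 * a01 + m01 * a11,
                              m10 * a00 + m11 * a10, m10 * a01 + m11 * a11)
    b = m00 + m01 * n_sub
    c_tot = m10 + m11 * n_sub
    return 4.0 * n_in * n_sub / abs(n_in * b + c_tot) ** 2

# Illustrative 5-pair quarter-wave Si/SiO2 stack centered at 905 nm
# (assumed indices; a real design would use measured, dispersive data).
N_SI, N_SIO2, CENTER = 3.6, 1.45, 905.0
stack = [(N_SI, CENTER / (4 * N_SI)), (N_SIO2, CENTER / (4 * N_SIO2))] * 5
```

At the 905 nm design wavelength this particular stack is strongly reflecting (transmittance well below 1%); an actual band-pass design opens a transmission band, for example by inserting half-wave cavity layers into the quarter-wave stack.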

The first filter 12A and the second filter 12B may have a configuration in which they are stacked and formed on one side of the substrate. The manufacturing method will be described below.

Fig. 10A, 10B, 10C, and 10D are schematic diagrams illustrating a first method of manufacturing a band pass filter.

A base material 13 made of a material transparent to infrared light and having a concave surface formed in its front surface is prepared (see fig. 10A), and the second filter 12B composed of a multilayer film is formed on it (see fig. 10B). Next, the first filter 12A composed of a multilayer film is formed on top (see fig. 10C). Thereafter, the stack is divided into predetermined shapes each including a concave surface to obtain the band-pass filters 12 (see fig. 10D).

Note that, in the above example, the second filter 12B is formed first, and then the first filter 12A is formed. However, the order of the two may be interchanged.

Fig. 11A, 11B, 11C, and 11D are schematic diagrams illustrating a second method of manufacturing a band pass filter.

This example is similar to the process flow described with reference to fig. 10, except that a base material 13A having a concave front surface and a correspondingly convex rear surface is used; a repeated description is therefore omitted.

In the above configurations, the first filter 12A and the second filter 12B are stacked on the same side, but another configuration may be used: for example, the first filter 12A may be formed on one surface of the base material and the second filter 12B on the other surface.

Fig. 12A, 12B, and 12C are schematic diagrams showing another configuration example of the band-pass filter.

In fig. 12A and 12B, the first filter 12A and the second filter 12B are arranged at fixed intervals. In fig. 12A, the first filter 12A is arranged on the light incident surface side, and the second filter 12B is arranged on the light receiving unit 20 side. On the other hand, in fig. 12B, the second filter 12B is arranged on the light incident surface side, and the first filter 12A is arranged on the light receiving unit 20 side. Fig. 12C is a modification of fig. 12A, and the second filter 12B is planar.

Fig. 13A, 13B, 13C, and 13D are schematic diagrams illustrating a third method of manufacturing a bandpass filter.

A base material 13A having a concave front surface and a correspondingly convex rear surface is prepared (see fig. 13A), and a first filter 12A composed of a multilayer film is formed on the front surface (see fig. 13B). Next, a second filter 12B composed of a multilayer film is formed on the rear surface of the base material 13A (see fig. 13C). Thereafter, the band-pass filter 12 may be obtained by dividing the stack into predetermined shapes each including a concave surface (see fig. 13D).

Note that, in the above example, the first filter 12A is formed first, and then the second filter 12B is formed. However, the order of the two may be interchanged.

Fig. 14A, 14B, 14C, and 14D are schematic diagrams illustrating a fourth method of manufacturing a band pass filter.

This example is similar to the process flow described with reference to fig. 13, except that a base material 13 having a concave front surface and a flat rear surface is used; a repeated description is therefore omitted.

Fig. 15, 16A, 16B, 16C, and 17 are drawings illustrating a fifth method of manufacturing a band pass filter.

Fig. 15 is a schematic diagram showing the configuration of the diaphragm 15 used in the fifth method of manufacturing the band-pass filter. A diaphragm 15A composed of a material that is transparent at least to the infrared component and that is plastically deformed when an external force is applied is prepared, and a reflection film 12C (a band-pass filter layer, or BPF layer) is vapor-deposited on one surface of the diaphragm 15A. Next, an antireflection film 12D (AR layer) is vapor-deposited on the other surface of the diaphragm 15A. In this way, the diaphragm 15 on which the band-pass filter layer and the like are formed is obtained.

Note that the antireflection film 12D may be vapor-deposited on the diaphragm 15A first, and the reflection film 12C thereafter. Further, the diaphragm 15A may itself be given a band-pass function by mixing in an absorbing material. Specifically, an absorbing material is mixed into, or vapor-deposited on, a resin substrate such as a cycloolefin polymer, polyethylene terephthalate (PET), or polycarbonate to obtain a film having a band-pass characteristic. With this configuration, light in wavelength bands that cannot be removed by the reflection film vapor-deposited on one surface alone can be removed by the diaphragm itself. Note that the diaphragm 15A is not limited to this configuration, and a diaphragm material having no band-pass characteristic may also be used.

Fig. 16A, 16B, and 16C are schematic diagrams illustrating vacuum forming in the fifth method of manufacturing the band-pass filter. A suction die 16 (mold) is prepared in which a concave portion 16A having a predetermined curvature is formed on one surface, and an opening 16B near the center of the concave portion 16A passes through to the other surface (see fig. 16A). Next, the diaphragm 15 is placed on the surface of the suction die 16 on which the concave portion 16A is formed, with the reflection film facing upward (that is, with the antireflection film facing the suction die) (see fig. 16B). Thereafter, the air in the concave portion 16A is sucked out through the opening 16B from the other surface of the suction die 16, and the diaphragm 15 is plastically deformed (see fig. 16C). By then removing the diaphragm 15 from the suction die 16, a diaphragm 15 in which a concave portion having a predetermined curvature is formed can be obtained.

Fig. 17 is a schematic view showing press working in the fifth method of manufacturing the band-pass filter. The diaphragm 15 is vacuum-formed by the method shown in fig. 16A, 16B, and 16C so that a plurality of concave portions are formed on it. Thereafter, the band-pass filter 12 may be obtained by dividing the diaphragm by press working into predetermined shapes each including a concave portion.

With the fifth manufacturing method, the band-pass filter layer is vapor-deposited while the film is still planar, so the layer can be deposited uniformly and the manufacturing cost can be reduced.

The light receiving unit 20 and the optical member 10 may also be configured as an integrated light receiving module. A manufacturing method of the light receiving module and the like will be described below.

Fig. 18A and 18B are schematic views illustrating a method of manufacturing a light receiving module. Fig. 19A and 19B are schematic diagrams illustrating the structure of a light receiving module.

A semiconductor wafer 200 on which a plurality of imaging elements is formed, a wafer-shaped frame 140 in which openings corresponding to the light receiving surfaces are formed, and a wafer 120 on which a plurality of band-pass filters is formed are stacked (see fig. 18A) and then diced into chips having a predetermined shape (see fig. 18B). Fig. 19A shows a cross section of a singulated chip. Reference numeral 14A denotes the frame. In this configuration, a cavity exists between the base material 13 and the light receiving unit 20.

In some configurations, the frame 140 having the openings may be replaced with an adhesive member having no opening. Fig. 19B shows a cross section of a singulated chip having such a configuration. Reference numeral 14B denotes the adhesive member. In this configuration, there is no cavity between the base material 13 and the light receiving unit 20.

Fig. 20 shows an example of a light receiving module further including a lens. In this configuration, the chip and the lens manufactured as described above are incorporated in the housing.

The method of manufacturing the light receiving module and the like have been described above.

As described above, the light receiving unit 20, the analog-to-digital conversion unit 30, the arithmetic processing unit 40, the controller 50, and the light source driving unit 60 shown in fig. 1 may be configured as a single chip or may be configured as a plurality of chips according to their functions. Fig. 21A, 21B, and 21C are schematic diagrams illustrating a configuration of a semiconductor device used in the ranging system.

Subsequently, acquisition of the distance information will be described. In the ranging system 1 shown in fig. 1, the arithmetic processing unit 40 may have a configuration in which distance information is obtained based on the time of flight of light reflected from a target object, or may have a configuration in which infrared light is emitted to a target object in a predetermined pattern, and the arithmetic processing unit 40 obtains distance information based on the pattern of light reflected from the target object. These will be described below as various modifications.

[ first modification ]

Fig. 22 shows a configuration in which distance information is obtained based on the time of flight of the reflected light. In the distance measuring system 1A, a light diffusion member 71 is arranged in front of the light source unit 70 so that diffused light is emitted. The light source unit 70 is modulated at a frequency of, for example, several tens of kHz to several hundreds of MHz, and the distance information can be obtained by detecting the reflected light component in synchronization with this modulation.
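Such synchronous detection is commonly implemented as four-phase (indirect ToF) demodulation. The following sketch illustrates one common convention; the sampling scheme, sign conventions, and the 20 MHz modulation frequency are assumptions for illustration, not the system's specified design.

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def itof_distance(q0, q90, q180, q270, f_mod_hz):
    """Indirect ToF: distance from four correlation samples taken at
    0/90/180/270 degree demodulation phases (one common convention)."""
    phase = math.atan2(q90 - q270, q0 - q180) % (2 * math.pi)
    return C * phase / (4 * math.pi * f_mod_hz)

def ambiguity_range(f_mod_hz):
    """Maximum unambiguous distance for a given modulation frequency."""
    return C / (2 * f_mod_hz)

# Synthetic target at 2.5 m with 20 MHz modulation:
# expected round-trip phase is 4*pi*f*d/c
f, d_true = 20e6, 2.5
phi = 4 * math.pi * f * d_true / C
q0, q90, q180, q270 = math.cos(phi), math.sin(phi), -math.cos(phi), -math.sin(phi)
print(itof_distance(q0, q90, q180, q270, f))  # recovers 2.5 m
print(ambiguity_range(f))                     # ~7.49 m at 20 MHz
```

The ambiguity range illustrates why the modulation frequency is a trade-off: a higher frequency improves depth precision but shortens the unambiguous measurement range.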

[ second modification ]

Fig. 23 also shows a configuration in which distance information is obtained based on the time of flight of reflected light. In the ranging system 1B, the scanning unit 72 scans the light from the light source unit 70. Then, distance information can be obtained by detecting the reflected light component in synchronization with the scanning.
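In a scanned configuration the time of flight is often measured directly per emitted pulse, with the range given by half the round-trip path. A minimal sketch (illustrative values only):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def pulse_distance(round_trip_s):
    """Direct time-of-flight: one-way range from the measured
    round-trip time of a light pulse."""
    return C * round_trip_s / 2.0

def timing_for_resolution(delta_d_m):
    """Round-trip timing precision needed for a given range resolution."""
    return 2.0 * delta_d_m / C

print(pulse_distance(100e-9))       # 100 ns round trip -> ~15 m
print(timing_for_resolution(0.01))  # 1 cm resolution -> ~67 ps
```

The second function shows why direct ToF demands picosecond-class timing electronics: centimeter-level resolution requires resolving tens of picoseconds.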

[ third modification ]

Fig. 24 shows a configuration in which infrared light is emitted onto a target object in a predetermined pattern, and the arithmetic processing unit 40 obtains distance information based on the pattern of the light reflected from the target object. In the ranging system 1C, the pattern projection unit 73 causes the light from the light source unit 70 to be emitted onto the target object in a predetermined pattern. The distance information can be obtained by detecting the spatial distribution of the illuminance pattern on the target object, or the distortion of the pattern image.
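In such structured-light systems (and equally in the stereo arrangement of the fourth modification), the pattern distortion reduces to a lateral displacement (disparity) of pattern features, and depth follows from triangulation under a pinhole model, Z = f·b/d. The baseline and focal length below are assumed example values, not parameters from this disclosure.

```python
def depth_from_disparity(disparity_px, baseline_m, focal_px):
    """Triangulation depth for a projector-camera (or stereo) pair
    under a pinhole model: Z = focal * baseline / disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Assumed rig: 5 cm baseline, focal length of 800 px
print(depth_from_disparity(16.0, 0.05, 800.0))  # 2.5 m
print(depth_from_disparity(40.0, 0.05, 800.0))  # 1.0 m
```

Note the inverse relation: disparity grows as the target gets closer, so near-field depth is resolved more finely than far-field depth.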

[ fourth modification ]

Fig. 25 shows a configuration in which stereoscopic information is also obtained by arranging a plurality of light receiving units at a distance from each other. Note that the configuration may be any one of the following configurations: a configuration of emitting diffused light as in the first modification, a configuration of scanning light from a light source as in the second modification, or a configuration of emitting light in a predetermined pattern as in the third modification. Fig. 26A and 26B are schematic diagrams showing an example of the arrangement of a light receiving unit and a light source unit in the case where the light receiving unit and the light source unit are provided in a portable electronic apparatus.

With the first embodiment, the passband of the band-pass filter can be narrowed and the influence of disturbance light can be reduced, so that high-quality range-finding imaging can be achieved even under external light. Further, by setting the shape of the band-pass filter in accordance with the lens module, a light receiving module having excellent wavelength selectivity can be provided.
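The benefit of keeping rays near normal incidence (the concave light incident surface, and the 10-degree bound of claim 2) can be estimated from the standard blue-shift formula for interference filters, λ(θ) = λ0·sqrt(1 − (sin θ / n_eff)²). The snippet below is a rough estimate only; the effective index n_eff = 2.0 and the 940 nm center wavelength are assumed values.

```python
import math

def shifted_center(lam0_nm, theta_deg, n_eff=2.0):
    """Blue shift of an interference band-pass filter's center
    wavelength with angle of incidence (standard thin-film formula;
    n_eff is an assumed effective index of the stack)."""
    s = math.sin(math.radians(theta_deg)) / n_eff
    return lam0_nm * math.sqrt(1.0 - s * s)

for theta in (0.0, 10.0, 30.0):
    print(theta, round(shifted_center(940.0, theta), 1))
```

At 10 degrees the shift is only a few nanometers, comfortably inside a passband whose half width is 50 nm or less, whereas at 30 degrees the shift reaches roughly 30 nm and would push the target light out of a narrow passband.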

[ first application example ]

The techniques according to the present disclosure may be applied to a variety of products. For example, the techniques according to the present disclosure may be implemented as a device mounted on any type of moving object, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).

Fig. 27 is a block diagram showing a schematic configuration example of a vehicle control system 7000, which is an example of a moving object control system to which the technique according to the present disclosure can be applied. The vehicle control system 7000 includes a plurality of electronic control units connected via a communication network 7010. In the example shown in fig. 27, the vehicle control system 7000 includes a drive system control unit 7100, a body system control unit 7200, a battery control unit 7300, a vehicle exterior information detecting unit 7400, an in-vehicle information detection unit 7500, and an integrated control unit 7600. The communication network 7010 connecting the plurality of control units may be, for example, an in-vehicle communication network conforming to an arbitrary standard, such as a Controller Area Network (CAN), a Local Interconnect Network (LIN), a Local Area Network (LAN), or FlexRay (registered trademark).

Each control unit includes: a microcomputer that performs arithmetic processing according to various programs; a storage unit that stores the programs executed by the microcomputer, parameters used for various calculations, and the like; and a drive circuit that drives the devices to be controlled. Each control unit includes a network interface for communicating with other control units via the communication network 7010, and also includes a communication interface for wired or wireless communication with devices, sensors, and the like inside or outside the vehicle. Fig. 27 shows the functional configuration of the integrated control unit 7600, which includes a microcomputer 7610, a general-purpose communication interface 7620, a dedicated communication interface 7630, a positioning unit 7640, a beacon receiving unit 7650, an in-vehicle device interface 7660, an audio/image output unit 7670, an in-vehicle network interface 7680, and a storage unit 7690. The other control units similarly include a microcomputer, a communication interface, a storage unit, and the like.

The drive system control unit 7100 controls the operation of devices related to the drive system of the vehicle according to various programs. For example, the drive system control unit 7100 functions as a device for controlling a driving force generating device (such as an internal combustion engine or a drive motor) for generating a driving force of a vehicle, a driving force transmitting mechanism for transmitting the driving force to wheels, a steering mechanism that adjusts a steering angle of the vehicle, a brake device that generates a braking force of the vehicle, and the like. The drive system control unit 7100 may have a function as a device for controlling an anti-lock brake system (ABS), an Electronic Stability Control (ESC), or the like.

The drive system control unit 7100 is connected to a vehicle state detector 7110. The vehicle state detector 7110 includes, for example, at least one of: a gyro sensor that detects the angular velocity of the axial rotation of the vehicle body; an acceleration sensor that detects the acceleration of the vehicle; or sensors for detecting the operation amount of the accelerator pedal, the operation amount of the brake pedal, the steering angle of the steering wheel, the engine speed, the wheel rotation speed, and the like. The drive system control unit 7100 performs arithmetic processing using signals input from the vehicle state detector 7110, and controls the internal combustion engine, the drive motor, an electric power steering device, a brake device, and the like.

The body system control unit 7200 controls the operations of various devices mounted on the vehicle body according to various programs. For example, the body system control unit 7200 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps (such as a headlamp, a back-up lamp, a brake lamp, a turn signal lamp, or a fog lamp). In this case, radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, may be input to the body system control unit 7200. The body system control unit 7200 receives the input of these radio waves or signals, and controls the door lock device, the power window device, the lamps, and the like of the vehicle.

The battery control unit 7300 controls the secondary battery 7310 as a power source for driving the motor according to various programs. For example, information such as a battery temperature, a battery output voltage, or a battery remaining capacity is input to the battery control unit 7300 from a battery device including the secondary battery 7310. Battery control unit 7300 performs arithmetic processing using these signals, and performs temperature adjustment control of secondary battery 7310 or control of a cooling device or the like included in the battery device.

The vehicle exterior information detecting unit 7400 detects information on the outside of the vehicle on which the vehicle control system 7000 is mounted. For example, the vehicle exterior information detecting unit 7400 is connected to at least one of an imaging unit 7410 or a vehicle exterior information detector 7420. The imaging unit 7410 includes at least one of a time-of-flight (ToF) camera, a stereo camera, a monocular camera, an infrared camera, or another camera. The vehicle exterior information detector 7420 includes, for example, at least one of an environment sensor for detecting the current weather or climate, or an ambient information detection sensor for detecting other vehicles, obstacles, pedestrians, and the like in the surroundings of the vehicle on which the vehicle control system 7000 is mounted.

The environment sensor may be, for example, at least one of a raindrop sensor that detects rainy weather, a fog sensor that detects fog, a sunshine sensor that detects the degree of sunshine, or a snow sensor that detects snowfall. The ambient information detection sensor may be at least one of an ultrasonic sensor, a radar device, or a LIDAR ("light detection and ranging" or "laser imaging detection and ranging") device. The imaging unit 7410 and the vehicle exterior information detector 7420 may each be provided as an independent sensor or device, or may be provided as an integrated device including a plurality of sensors or devices.

Here, fig. 28 shows an example of the mounting positions of the imaging unit 7410 and the vehicle exterior information detector 7420. The imaging units 7910, 7912, 7914, 7916, and 7918 are provided at, for example, at least one of the positions of the front nose, the side mirrors, the rear bumper, the rear door, and the top of the windshield in the vehicle interior of the vehicle 7900. The imaging unit 7910 provided at the front nose and the imaging unit 7918 provided at the top of the windshield in the vehicle interior mainly acquire images of the area in front of the vehicle 7900. The imaging units 7912 and 7914 provided at the side mirrors mainly acquire images of the areas to the sides of the vehicle 7900. The imaging unit 7916 provided at the rear bumper or the rear door mainly acquires images of the area behind the vehicle 7900. The imaging unit 7918 provided at the top of the windshield in the vehicle interior is mainly used to detect a preceding vehicle, a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, and the like.

Note that fig. 28 shows an example of an imaging range of each of the imaging units 7910, 7912, 7914, and 7916. The imaging range a represents an imaging range of the imaging unit 7910 provided at the nose, the imaging ranges b and c represent imaging ranges of the imaging units 7912 and 7914 provided at the side mirrors, respectively, and the imaging range d represents an imaging range of the imaging unit 7916 provided at the rear bumper or the rear door. For example, by superimposing a plurality of image data captured by the imaging units 7910, 7912, 7914, and 7916, an overhead image of the vehicle 7900 viewed from above can be obtained.

The vehicle exterior information detectors 7920, 7922, 7924, 7926, 7928, and 7930 provided at the front, rear, sides, and corners of the vehicle 7900 and at the top of the windshield in the vehicle interior may be, for example, ultrasonic sensors or radar devices. The vehicle exterior information detectors 7920, 7926, and 7930 provided at the front nose, the rear bumper, the rear door, and the top of the windshield in the vehicle interior of the vehicle 7900 may be, for example, LIDAR devices. These vehicle exterior information detectors 7920 to 7930 are mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, and the like.

Returning to fig. 27, the description will be continued. The vehicle exterior information detecting unit 7400 causes the imaging unit 7410 to capture an image of the outside of the vehicle, and receives the captured image data. Further, the vehicle exterior information detecting unit 7400 receives detection information from the connected vehicle exterior information detector 7420. When the vehicle exterior information detector 7420 is an ultrasonic sensor, a radar device, or a LIDAR device, the vehicle exterior information detecting unit 7400 transmits ultrasonic waves, electromagnetic waves, or the like, and receives information on the reflected waves. Based on the received information, the vehicle exterior information detecting unit 7400 may perform object detection processing or distance detection processing for persons, cars, obstacles, signs, characters on the road surface, and the like. Based on the received information, the vehicle exterior information detecting unit 7400 may also perform environment recognition processing for recognizing rainfall, fog, road surface conditions, and the like, and may calculate the distance to an object outside the vehicle.

Further, the vehicle exterior information detecting unit 7400 may perform image recognition processing or distance detection processing for recognizing a person, a car, an obstacle, a sign, a character on a road surface, or the like based on the received image data. The vehicle exterior information detecting unit 7400 may also generate an overhead image or a panoramic image by performing processing such as distortion correction or positioning on the received image data, and generating a composite image from a plurality of pieces of image data captured by different imaging units 7410. The vehicle exterior information detecting unit 7400 may perform viewpoint conversion processing using a plurality of image data captured by different imaging units 7410.

The in-vehicle information detection unit 7500 detects information on the inside of the vehicle. The in-vehicle information detection unit 7500 is connected to, for example, a driver state detector 7510 that detects the state of the driver. The driver state detector 7510 may include a camera that captures an image of the driver, a biometric sensor that detects biometric information of the driver, a microphone that collects sound in the vehicle interior, and the like. The biometric sensor is provided at, for example, the seat surface, the steering wheel, or the like, and detects biometric information of an occupant seated on the seat or of the driver holding the steering wheel. Based on the detection information input from the driver state detector 7510, the in-vehicle information detection unit 7500 may calculate the degree of fatigue or the degree of concentration of the driver, or determine whether the driver has fallen asleep. The in-vehicle information detection unit 7500 may also perform processing such as noise removal on the collected sound signal.

The integrated control unit 7600 controls the overall operation in the vehicle control system 7000 according to various programs. The integrated control unit 7600 is connected to an input unit 7800. The input unit 7800 includes devices that an occupant can use to perform input operations, such as a touch panel, buttons, a microphone, switches, or a joystick. Data obtained by voice recognition of speech input via the microphone may be input to the integrated control unit 7600. The input unit 7800 may be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile phone or a Personal Digital Assistant (PDA) that supports the operation of the vehicle control system 7000. The input unit 7800 may also be, for example, a camera, in which case the occupant may input information through gestures. Alternatively, data obtained by detecting the motion of a wearable device worn by the occupant may be input. Further, the input unit 7800 may include, for example, an input control circuit that generates an input signal based on information input by the occupant or the like using the input unit 7800 described above, and outputs the input signal to the integrated control unit 7600. By operating the input unit 7800, the occupant or the like inputs various types of data to the vehicle control system 7000 and gives instructions on processing operations.

The storage unit 7690 may include a Read Only Memory (ROM) for storing various programs executed by the microcomputer and a Random Access Memory (RAM) for storing various parameters, calculation results, sensor values, and the like. Further, the storage unit 7690 may include a magnetic storage device such as a Hard Disk Drive (HDD), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.

The general-purpose communication interface 7620 is a general-purpose communication interface that mediates communication with various devices present in the external environment 7750. The general-purpose communication interface 7620 may implement a cellular communication protocol such as Global System for Mobile communications (GSM) (registered trademark), WiMAX, Long Term Evolution (LTE), or LTE-Advanced (LTE-A), or another wireless communication protocol such as wireless LAN (also known as Wi-Fi (registered trademark)) or Bluetooth (registered trademark). The general-purpose communication interface 7620 may connect to a device (for example, an application server or a control server) present on an external network (for example, the Internet, a cloud network, or an operator-specific network) via, for example, a base station or an access point. Further, the general-purpose communication interface 7620 may connect to a terminal present in the vicinity of the vehicle (for example, a terminal of a driver, a pedestrian, or a shop, or a Machine Type Communication (MTC) terminal), for example, using peer-to-peer (P2P) technology.

The dedicated communication interface 7630 is a communication interface that supports communication protocols designed for use in vehicles. The dedicated communication interface 7630 may implement, for example, a standard protocol such as Wireless Access in Vehicle Environment (WAVE), which is a combination of IEEE 802.11p for the lower layer and IEEE 1609 for the upper layer, Dedicated Short Range Communications (DSRC), or a cellular communication protocol. The dedicated communication interface 7630 typically performs V2X communication, a concept that includes at least one of vehicle-to-vehicle communication, vehicle-to-infrastructure communication, vehicle-to-home communication, and vehicle-to-pedestrian communication.

For example, the positioning unit 7640 receives a Global Navigation Satellite System (GNSS) signal from a GNSS satellite (e.g., receives a Global Positioning System (GPS) signal from a GPS satellite), performs positioning, and generates position information including latitude, longitude, and altitude of the vehicle. Note that the positioning unit 7640 may specify the current position by exchanging signals with a wireless access point, or may acquire position information from a terminal such as a mobile phone, PHS, or smart phone having a positioning function.

For example, the beacon receiving unit 7650 receives radio waves or electromagnetic waves transmitted from wireless stations or the like installed on roads, and acquires information such as the current position, traffic congestion, road closures, or required travel time. Note that the function of the beacon receiving unit 7650 may be included in the dedicated communication interface 7630 described above.

The in-vehicle device interface 7660 is a communication interface that mediates connections between the microcomputer 7610 and various in-vehicle devices 7760 present inside the vehicle. The in-vehicle device interface 7660 may establish a wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), Near Field Communication (NFC), or wireless USB (WUSB). Further, the in-vehicle device interface 7660 may establish a wired connection such as Universal Serial Bus (USB), High-Definition Multimedia Interface (HDMI) (registered trademark), or Mobile High-definition Link (MHL) via a connection terminal (not shown) (and a cable, if necessary). The in-vehicle devices 7760 may include, for example, at least one of a mobile device or a wearable device owned by an occupant, or an information device carried in or attached to the vehicle. In addition, the in-vehicle devices 7760 may include a navigation device that searches for a route to an arbitrary destination. The in-vehicle device interface 7660 exchanges control signals or data signals with these in-vehicle devices 7760.

The in-vehicle network interface 7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010. The in-vehicle network interface 7680 transmits and receives a signal or the like based on a predetermined protocol supported by the communication network 7010.

The microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 according to various programs based on information acquired via at least one of the general-purpose communication interface 7620, the dedicated communication interface 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device interface 7660, or the in-vehicle network interface 7680. For example, the microcomputer 7610 may calculate a control target value of the driving force generation device, the steering mechanism, or the brake device based on information acquired from the inside and outside of the vehicle, and output a control command to the drive system control unit 7100. For example, the microcomputer 7610 may perform cooperative control for the purpose of realizing the functions of an Advanced Driver Assistance System (ADAS), including collision avoidance or impact mitigation of the vehicle, follow-up traveling based on the inter-vehicle distance, constant-speed traveling, vehicle collision warning, vehicle lane departure warning, and the like. Further, the microcomputer 7610 may perform cooperative control for the purpose of automated driving (i.e., autonomous traveling that does not depend on the driver's operation) and the like by controlling the driving force generation device, the steering mechanism, the brake device, and the like based on information acquired about the surroundings of the vehicle.

The microcomputer 7610 may generate three-dimensional distance information between the vehicle and surrounding objects such as structures or persons based on information acquired via at least one of the general-purpose communication interface 7620, the dedicated communication interface 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device interface 7660, or the in-vehicle network interface 7680, and create local map information including information on the surroundings of the current position of the vehicle. Further, based on the acquired information, the microcomputer 7610 may predict a danger such as a collision of the vehicle, the approach of a pedestrian or the like, or entry into a closed road, and generate a warning signal. The warning signal may be, for example, a signal for generating a warning sound or lighting a warning lamp.

The audio/image output unit 7670 transmits at least one of an audio output signal and an image output signal to an output device capable of visually or audibly notifying an occupant in the vehicle or the outside of the vehicle of information. In the example of fig. 27, an audio speaker 7710, a display unit 7720, and an instrument panel 7730 are shown as output devices. The display unit 7720 may include, for example, at least one of an in-vehicle display and a head-up display. The display unit 7720 may have an Augmented Reality (AR) display function. In addition to these devices, the output device may be another device, such as a headset, a wearable device (such as a glasses-type display worn by the occupant), a projector, or a lamp. In the case where the output device is a display device, the display device visually displays results obtained from various types of processing performed by the microcomputer 7610, or information received from another control unit, in various forms such as text, images, tables, or graphs. Further, in the case where the output device is an audio output device, the audio output device converts an audio signal including reproduced audio data, acoustic data, and the like into an analog signal, and audibly outputs the analog signal.

Note that in the example shown in fig. 27, at least two control units connected via the communication network 7010 may be integrated into one control unit. Alternatively, each control unit may comprise a plurality of control units. Furthermore, the vehicle control system 7000 may comprise a further control unit (not shown). Furthermore, in the above description, some or all of the functions performed by one of the control units may be provided to another control unit. That is, predetermined arithmetic processing may be performed by any control unit as long as information is transmitted and received via the communication network 7010. Similarly, a sensor or a device connected to any control unit may be connected to another control unit, and a plurality of control units may transmit and receive detection information to and from each other via the communication network 7010.

The technique according to the present disclosure can be applied to an imaging unit such as the vehicle exterior information detection unit in the above-described configuration.

[ second application example ]

The techniques according to the present disclosure may be applied to a variety of products. For example, techniques according to the present disclosure may be applied to endoscopic surgical systems.

Fig. 29 is a diagram showing an example of a schematic configuration of an endoscopic surgery system 5000 to which the technique according to the present disclosure can be applied. Fig. 29 shows a case in which an operator (doctor) 5067 performs an operation on a patient 5071 on a bed 5069 using an endoscopic surgery system 5000. As shown, the endoscopic surgical system 5000 includes an endoscope 5001, other surgical tools 5017, a support arm device 5027 that supports the endoscope 5001, and a cart 5037 on which various devices for endoscopic surgery are mounted.

In endoscopic surgery, instead of cutting and opening the abdominal wall, a plurality of tubular opening instruments called trocars 5025a to 5025d are used to puncture the abdominal wall. Then, the lens barrel 5003 of the endoscope 5001 and the other surgical tools 5017 are inserted into the body cavity of the patient 5071 through the trocars 5025a to 5025d. In the illustrated example, an insufflation tube 5019, an energy treatment tool 5021, and forceps 5023 are inserted into the body cavity of the patient 5071 as the other surgical tools 5017. The energy treatment tool 5021 is used to perform incision and peeling of tissue, sealing of blood vessels, and the like by using high-frequency current or ultrasonic vibration. However, the illustrated surgical tools 5017 are merely an example, and various surgical tools generally used for endoscopic surgery, such as tweezers and a retractor, may be used as the surgical tools 5017.

An image of a surgical site in a body cavity of a patient 5071 captured by an endoscope 5001 is displayed on a display device 5041. The operator 5067 performs a procedure such as excision of a diseased portion, for example, using the energy therapy tool 5021 or the forceps 5023 while observing an image of the surgical site displayed on the display device 5041 in real time. Note that, although not shown, insufflation tube 5019, energy treatment tool 5021, and forceps 5023 are supported by operator 5067, an assistant, or the like during surgery.

(support arm device)

The support arm device 5027 includes an arm 5031 extending from a base 5029. In the illustrated example, the arm 5031 includes joints 5033a, 5033b, and 5033c and links 5035a and 5035b, and is driven under the control of the arm control device 5045. The arm 5031 supports the endoscope 5001 and controls its position and orientation. With this arrangement, the position of the endoscope 5001 can be stably fixed.

(endoscope)

The endoscope 5001 includes a lens barrel 5003, a region of which extending a predetermined length from the distal end is inserted into the body cavity of the patient 5071, and a camera head 5005 connected to the proximal end of the lens barrel 5003. In the illustrated example, the endoscope 5001 is configured as a so-called rigid endoscope having a rigid lens barrel 5003. Alternatively, the endoscope 5001 may be configured as a so-called flexible endoscope having a flexible lens barrel 5003.

The lens barrel 5003 is provided with an opening at its distal end, and an objective lens is mounted in the opening. The endoscope 5001 is connected to a light source device 5043. Light generated by the light source device 5043 is guided to the distal end of the lens barrel 5003 through a light guide extending inside the lens barrel, and is emitted toward an observation target in the body cavity of the patient 5071 through the objective lens. Note that the endoscope 5001 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.

The camera head 5005 is provided with an optical system and an imaging element inside it, and light (observation light) reflected from the observation target is focused on the imaging element through the optical system. The imaging element photoelectrically converts the observation light to generate an electric signal corresponding to the observation light, that is, an image signal corresponding to an observation image. The image signal is transmitted to a Camera Control Unit (CCU) 5039 as raw data. Note that the camera head 5005 has a function of adjusting the magnification and the focal length by appropriately driving the optical system.

Note that the camera head 5005 may be provided with a plurality of imaging elements so as to support, for example, stereoscopic viewing (3D display) or the like. In this case, the lens barrel 5003 is provided with a plurality of relay optical systems inside thereof to guide observation light to each of a plurality of imaging elements.

(various devices mounted on the cart)

The CCU 5039 is constituted by a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), and the like, and integrally controls the operations of the endoscope 5001 and the display device 5041. Specifically, the CCU 5039 performs, on the image signal received from the camera head 5005, various types of image processing for displaying an image based on the image signal, such as development processing (demosaic processing). The CCU 5039 supplies the display device 5041 with the image signal on which the image processing has been performed. Further, the CCU 5039 sends a control signal to the camera head 5005 to control its driving. The control signal may contain information about imaging conditions, such as magnification and focal length.
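As a hedged illustration of the development (demosaic) processing mentioned above, the sketch below collapses each 2×2 block of an RGGB Bayer mosaic into one RGB pixel. Real development processing interpolates missing color samples at full resolution; this half-resolution block average is only a simplified stand-in, not the CCU's actual algorithm.

```python
import numpy as np

def demosaic_rggb(raw: np.ndarray) -> np.ndarray:
    """Naive demosaic of an RGGB Bayer mosaic.

    Each non-overlapping 2x2 block (R G / G B) is collapsed into one
    RGB pixel, so the output is half the input resolution.
    """
    h, w = raw.shape
    assert h % 2 == 0 and w % 2 == 0
    r = raw[0::2, 0::2]                            # top-left of each block
    g = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2.0  # average of both greens
    b = raw[1::2, 1::2]                            # bottom-right of each block
    return np.stack([r, g, b], axis=-1)

# Tiny 2x2 mosaic -> one RGB output pixel.
raw = np.array([[100, 60],
                [40, 20]], dtype=np.float64)
rgb = demosaic_rggb(raw)
print(rgb)  # [[[100.  50.  20.]]]
```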

The CCU 5039 controls the display device 5041 to display an image based on the image signal on which the image processing has been performed by the CCU 5039. For example, in the case where the endoscope 5001 supports imaging at a high resolution such as 4K (3840 horizontal pixels × 2160 vertical pixels) or 8K (7680 horizontal pixels × 4320 vertical pixels), and/or in the case where the endoscope 5001 supports 3D display, a display device supporting high resolution display and/or 3D display may be used as the display device 5041 accordingly. In a case of supporting imaging at high resolution such as 4K or 8K, a display device having a size of 55 inches or more may be used as the display device 5041 to provide a more immersive feeling. Further, a plurality of display devices 5041 having different resolutions and sizes may be provided according to intended use.

The light source device 5043 includes, for example, a light source such as a Light Emitting Diode (LED), and provides emitted light to the endoscope 5001 when imaging a surgical site.

The arm control device 5045 is constituted by a processor such as a CPU, for example, and operates according to a predetermined program to control the driving of the arm 5031 of the support arm device 5027 in accordance with a predetermined control method.

The input device 5047 is an input interface to the endoscopic surgical system 5000. The user can input various types of information and input instructions to the endoscopic surgical system 5000 via the input device 5047. For example, the user inputs various types of information related to the surgery, such as physical information of the patient and information about the surgical procedure, via the input device 5047. Further, for example, the user can input an instruction to drive the arm 5031, an instruction to change the imaging condition (the type, magnification, focal length, and the like of emitted light) of the endoscope 5001, an instruction to drive the energy therapy tool 5021, and the like via the input device 5047.

The type of the input device 5047 is not limited, and various known input devices may be used as the input device 5047. As the input device 5047, for example, a mouse, a keyboard, a touch panel, a switch, a foot switch 5057, and/or a joystick can be applied. In the case where a touch panel is used as the input device 5047, the touch panel may be provided on a display surface of the display device 5041.

Alternatively, the input device 5047 is, for example, a device worn by the user, such as a glasses-type wearable device or a head-mounted display (HMD), and performs various inputs according to the gestures or line of sight of the user detected by these devices. Further, the input device 5047 includes a camera capable of detecting the motion of the user, and performs various inputs according to the gestures or line of sight of the user detected from video captured by the camera. Further, the input device 5047 includes a microphone capable of collecting the voice of the user, and performs various inputs by voice via the microphone. As described above, since the input device 5047 is configured such that various types of information can be input in a non-contact manner, in particular, a user belonging to a clean area (for example, the operator 5067) can operate an apparatus belonging to an unclean area in a non-contact manner. Further, the user can operate the apparatus while holding the surgical tool, and this improves the convenience of the user.

The treatment tool control device 5049 controls the driving of the energy treatment tool 5021 for cauterization or incision of tissue, sealing of blood vessels, and the like. The insufflation device 5051 delivers gas into the body cavity of the patient 5071 through the insufflation tube 5019 to inflate the body cavity, for the purposes of securing the field of view of the endoscope 5001 and securing the working space of the operator. The recorder 5053 is a device that can record various types of information related to the surgery. The printer 5055 is a device that can print various types of information related to the surgery in various formats such as text, images, or graphs.

Characteristic configurations of the endoscopic surgical system 5000 will be described in more detail below.

(support arm device)

The support arm device 5027 includes a base 5029 as a pedestal and an arm 5031 extending from the base 5029. In the illustrated example, the arm 5031 includes a plurality of joints 5033a, 5033b, and 5033c, and a plurality of links 5035a and 5035b connected by the joint 5033b. However, fig. 29 shows the configuration of the arm 5031 in a simplified manner for convenience of explanation. In practice, the shapes, numbers, and arrangements of the joints 5033a to 5033c and the links 5035a and 5035b, the directions of the rotation axes of the joints 5033a to 5033c, and the like may be appropriately set so that the arm 5031 has a desired degree of freedom. For example, the arm 5031 may suitably have a configuration with six or more degrees of freedom. With this arrangement, the endoscope 5001 can be freely moved within the movable range of the arm 5031, and the lens barrel 5003 of the endoscope 5001 can be inserted into the body cavity of the patient 5071 from a desired direction.

The joints 5033a to 5033c are provided with actuators, and the joints 5033a to 5033c have a configuration capable of being rotated about a predetermined rotation axis by the driving of the actuators. The arm control means 5045 controls driving of the actuator so as to control the rotation angle of each of the joints 5033a to 5033c, and controls driving of the arm 5031. With this arrangement, the position and orientation of the endoscope 5001 can be controlled. At this time, the arm control device 5045 may control driving of the arm 5031 by various known control methods such as force control or position control.
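As an illustrative aside, the relationship between joint rotation angles and the position of the endoscope at the end of the arm can be sketched with planar two-link forward kinematics. The link lengths and angles below are hypothetical, and a real support arm such as the one described has six or more degrees of freedom; this is a minimal sketch of the position-control idea, not the arm control device's actual model.

```python
import math

def forward_kinematics(theta1, theta2, l1=0.3, l2=0.25):
    """End-point position of a planar two-link arm (lengths in metres).

    theta1 is the base joint angle and theta2 the second joint angle
    relative to the first link, both in radians. Controlling the joint
    rotation angles determines the end-point position.
    """
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# Base joint at 0 rad, second joint bent 90 degrees.
x, y = forward_kinematics(0.0, math.pi / 2)
print(round(x, 3), round(y, 3))  # 0.3 0.25
```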

For example, the position and orientation of the endoscope 5001 can be controlled by the operator 5067 performing an appropriate operation input via the input device 5047 (including the foot switch 5057), whereby the arm control device 5045 appropriately controls the driving of the arm 5031 in accordance with the operation input. By this control, the endoscope 5001 at the distal end of the arm 5031 can be moved from an arbitrary position to another arbitrary position and then fixedly supported at the position after the movement. Note that the arm 5031 may be operated by a so-called master-slave method. In this case, the arm 5031 can be remotely controlled by the user via an input device 5047 installed at a location remote from the operating room.

Further, in the case where force control is applied, so-called power assist control may be performed, in which the arm control device 5045 receives an external force from the user and drives the actuators of the respective joints 5033a to 5033c so that the arm 5031 moves smoothly in accordance with the external force. With this arrangement, when the user moves the arm 5031 while directly touching it, the arm 5031 can be moved with a relatively light force. Therefore, the endoscope 5001 can be moved more intuitively with a simpler operation, and this improves the convenience of the user.

Here, generally, the endoscope 5001 has been supported by a doctor called an endoscopist during an endoscopic operation. On the other hand, by using the support arm device 5027, the position of the endoscope 5001 can be more reliably fixed without manual operation. This makes it possible to stably obtain an image of the surgical site and smoothly perform the surgery.

Note that the arm control device 5045 need not be provided on the cart 5037. Further, the arm control device 5045 need not be a single device. For example, an arm control device 5045 may be provided for each of the joints 5033a to 5033c of the arm 5031 of the support arm device 5027, and the plurality of arm control devices 5045 may cooperate with each other to control the driving of the arm 5031.

(light source device)

The light source device 5043 provides emitted light to the endoscope 5001 when imaging the surgical site. The light source device 5043 is constituted by a white light source including, for example, an LED, a laser light source, or a combination thereof. At this time, in the case where the white light source includes a combination of RGB laser light sources, the output intensity and the output timing of each color (each wavelength) can be controlled with high accuracy, and this enables adjustment of the white balance of the captured image at the light source device 5043. Further, in this case, by emitting laser light from each of the RGB laser light sources to the observation target in a time-sharing manner, and controlling the driving of the imaging element of the camera head 5005 in synchronization with the emission timing, images corresponding to each of R, G, and B can be captured in a time-sharing manner. According to this method, a color image can be obtained without providing a color filter in the imaging element.
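The time-sharing capture described above amounts to stacking three monochrome frames, each taken while only one laser color illuminates the target, into the channels of a color image. The frames here are synthetic placeholders standing in for sensor readouts.

```python
import numpy as np

# Three monochrome frames captured in a time-sharing manner, each while
# only one of the R, G, B laser sources illuminates the target
# (synthetic 2x2 data; a real imaging element would supply these).
frame_r = np.array([[10, 20], [30, 40]], dtype=np.uint8)
frame_g = np.array([[50, 60], [70, 80]], dtype=np.uint8)
frame_b = np.array([[90, 100], [110, 120]], dtype=np.uint8)

# No color filter is needed on the imaging element: each pixel's channel
# identity comes from which light source was active during the frame,
# not from a Bayer mosaic over the sensor.
color = np.stack([frame_r, frame_g, frame_b], axis=-1)
print(color.shape)   # (2, 2, 3)
print(color[0, 0])   # [10 50 90]
```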

Further, the driving of the light source device 5043 may be controlled so that the intensity of the output light is changed at predetermined time intervals. By controlling the driving of the imaging element of the camera head 5005 in synchronization with the timing of the change in light intensity, acquiring images in a time-sharing manner, and generating a composite image from the images, a high-dynamic-range image without so-called blocked-up shadows or blown-out highlights can be generated.
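The composite-image generation described above is, in essence, exposure fusion. The sketch below merges two frames captured at different light intensities; the mid-gray weighting scheme and the exposure values are illustrative assumptions, not the device's actual algorithm.

```python
import numpy as np

def fuse_exposures(frames, exposures):
    """Merge differently exposed frames into one high-dynamic-range image.

    Each frame is divided by its relative exposure to estimate scene
    radiance, then averaged with weights that favour mid-range pixels,
    i.e. neither blocked-up shadows nor blown-out highlights.
    """
    frames = [np.asarray(f, dtype=np.float64) / 255.0 for f in frames]
    acc = np.zeros_like(frames[0])
    wsum = np.zeros_like(frames[0])
    for frame, exp in zip(frames, exposures):
        w = 1.0 - np.abs(frame - 0.5) * 2.0 + 1e-6  # peak weight at mid-gray
        acc += w * frame / exp
        wsum += w
    return acc / wsum

low = np.full((2, 2), 40)    # frame captured at low light intensity
high = np.full((2, 2), 200)  # frame captured at high light intensity
hdr = fuse_exposures([low, high], exposures=[0.25, 1.0])
print(hdr.shape)  # (2, 2)
```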

Further, the light source device 5043 may have a configuration in which light in a predetermined wavelength band suitable for special light observation can be provided. In special light observation, for example, by utilizing the wavelength dependence of light absorption in body tissue, so-called narrow-band imaging is performed, in which predetermined tissue such as blood vessels in the mucosal surface layer is imaged with high contrast by emitting light in a band narrower than that of the light emitted during normal observation (i.e., white light). Alternatively, in special light observation, fluorescence observation may be performed, in which an image is obtained from fluorescence generated by emitting excitation light. In fluorescence observation, for example, excitation light is emitted to body tissue and fluorescence from the body tissue is observed (autofluorescence observation), or a fluorescence image is obtained by locally injecting an agent such as indocyanine green (ICG) into body tissue and emitting excitation light corresponding to the fluorescence wavelength of the agent to the body tissue. The light source device 5043 may be configured to be able to provide narrow-band light and/or excitation light that can be used for such special light observation.

(Camera head and CCU)

The functions of the camera head 5005 and the CCU 5039 of the endoscope 5001 will be described in more detail with reference to fig. 30. Fig. 30 is a block diagram showing an example of the functional configuration of the camera head 5005 and the CCU 5039 shown in fig. 29.

Referring to fig. 30, the camera head 5005 includes, as its functions, a lens unit 5007, an imaging unit 5009, a driving unit 5011, a communication unit 5013, and a camera head controller 5015. Further, the CCU 5039 includes, as its functions, a communication unit 5059, an image processing unit 5061, and a controller 5063. The camera head 5005 and the CCU 5039 are connected by a transmission cable 5065 to allow bidirectional communication.

First, the functional configuration of the camera head 5005 will be described. The lens unit 5007 is an optical system provided at the connection with the lens barrel 5003. Observation light incident from the distal end of the lens barrel 5003 is guided to the camera head 5005 and incident on the lens unit 5007. The lens unit 5007 is constituted by a combination of a plurality of lenses including a zoom lens and a focus lens. The optical characteristics of the lens unit 5007 are adjusted so that the observation light is focused on the light receiving surface of the imaging element of the imaging unit 5009. Further, the zoom lens and the focus lens are configured such that their positions on the optical axis can be moved to adjust the magnification and focus of a captured image.

The imaging unit 5009 is constituted by an imaging element, and is arranged at a stage subsequent to the lens unit 5007. Observation light having passed through the lens unit 5007 is focused on a light receiving surface of the imaging element, and an image signal corresponding to an observation image is generated by photoelectric conversion. The image signal generated by the imaging unit 5009 is supplied to the communication unit 5013.

As an imaging element included in the imaging unit 5009, for example, a Complementary Metal Oxide Semiconductor (CMOS) type image sensor which has a bayer array and is capable of capturing a color image is used. Note that as the imaging element, an imaging element capable of capturing a high-resolution image of, for example, 4K or higher may be used. An image of the surgical site can be obtained with high resolution, and this allows the operator 5067 to grasp the state of the surgical site in more detail and perform the surgery more smoothly.

Further, the imaging unit 5009 may have a configuration including a pair of imaging elements, one for acquiring a right-eye image signal and the other for acquiring a left-eye image signal, to support 3D display. The 3D display allows the operator 5067 to more accurately grasp the depth of living tissue in the surgical site. Note that in the case where the imaging unit 5009 has a multi-plate type configuration, a plurality of lens units 5007 are provided, one for each imaging element.

Further, the imaging unit 5009 need not be provided in the camera head 5005. For example, the imaging unit 5009 may be disposed inside the lens barrel 5003, just behind the objective lens.

The drive unit 5011 is constituted by an actuator, and moves the zoom lens and the focus lens of the lens unit 5007 by a predetermined distance along the optical axis under the control of the camera head controller 5015. With this arrangement, the magnification and focus of the image captured by the imaging unit 5009 can be appropriately adjusted.

The communication unit 5013 is constituted by communication means for transmitting and receiving various types of information to and from the CCU 5039. The communication unit 5013 transmits the image signal obtained from the imaging unit 5009 as raw data to the CCU 5039 via the transmission cable 5065. At this time, the image signal is preferably transmitted by optical communication so as to display the captured image of the surgical site with low latency. This is because, during surgery, the operator 5067 performs the surgery while observing the state of the affected part from the captured image, and a moving image of the surgical site is required to be displayed in as close to real time as possible in order to perform safer and more reliable surgery. In the case where optical communication is performed, the communication unit 5013 is provided with a photoelectric conversion module that converts an electric signal into an optical signal. The image signal is converted into an optical signal by the photoelectric conversion module and then transmitted to the CCU 5039 via the transmission cable 5065.

Further, the communication unit 5013 receives a control signal for controlling the driving of the camera head 5005 from the CCU 5039. The control signal contains information about imaging conditions, such as information for specifying the frame rate of a captured image, information for specifying the exposure value at the time of imaging, and/or information for specifying the magnification and focus of a captured image. The communication unit 5013 supplies the received control signal to the camera head controller 5015. Note that the control signal from the CCU 5039 may also be transmitted via optical communication. In this case, the communication unit 5013 is provided with a photoelectric conversion module that converts an optical signal into an electric signal. The control signal is converted into an electric signal by the photoelectric conversion module and then supplied to the camera head controller 5015.

Note that the above-described imaging conditions such as the frame rate, exposure value, magnification, and focus are automatically set by the controller 5063 of the CCU 5039 based on the acquired image signal. That is, the endoscope 5001 has a so-called Auto Exposure (AE) function, Auto Focus (AF) function, and Auto White Balance (AWB) function.
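As a hedged illustration of the AE function mentioned above, exposure can be set by a simple feedback rule that scales the current exposure so the mean frame brightness approaches a target. The target value and damping gain below are assumptions for the sketch; the actual controller in the CCU is not specified in this document.

```python
import numpy as np

def auto_exposure_step(image, exposure, target=0.45, gain=0.5):
    """One AE iteration: nudge exposure toward the target mean brightness.

    image: frame normalised to [0, 1], captured at the current exposure.
    The multiplicative update with a damping gain (< 1) is a common,
    simple AE strategy that avoids oscillating around the target.
    """
    mean = float(np.mean(image))
    ratio = target / max(mean, 1e-6)
    return exposure * ratio ** gain

frame = np.full((4, 4), 0.2)  # underexposed frame, mean brightness 0.2
new_exp = auto_exposure_step(frame, exposure=1.0)
print(new_exp > 1.0)  # True  (exposure is increased)
```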

The camera head controller 5015 controls the driving of the camera head 5005 based on the control signal from the CCU 5039 received via the communication unit 5013. For example, the camera head controller 5015 controls the driving of the imaging element of the imaging unit 5009 based on information for specifying the frame rate of a captured image and/or information for specifying the exposure at the time of imaging. Further, for example, the camera head controller 5015 appropriately moves the zoom lens and the focus lens of the lens unit 5007 via the drive unit 5011 based on information for specifying the magnification and focus of a captured image. The camera head controller 5015 may also have a function of storing information for identifying the lens barrel 5003 and the camera head 5005.

Note that the camera head 5005 can be made resistant to autoclave sterilization by arranging the lens unit 5007, the imaging unit 5009, and the like in a sealed structure having high airtightness and water-tightness.

Next, the functional configuration of the CCU 5039 will be described. The communication unit 5059 is constituted by communication means for transmitting and receiving various types of information to and from the camera head 5005. The communication unit 5059 receives the image signal transmitted from the camera head 5005 via the transmission cable 5065. At this time, as described above, the image signal can suitably be transmitted through optical communication. In this case, to support optical communication, the communication unit 5059 is provided with a photoelectric conversion module that converts an optical signal into an electric signal. The communication unit 5059 supplies the image processing unit 5061 with the image signal converted into an electric signal.

Further, the communication unit 5059 transmits a control signal for controlling driving of the camera head 5005 to the camera head 5005. The control signal may also be transmitted by optical communication.

The image processing unit 5061 performs various types of image processing on the image signal transmitted as raw data from the camera head 5005. Examples of the image processing include various types of known signal processing, such as development processing, high-image-quality processing (such as band enhancement processing, super-resolution processing, Noise Reduction (NR) processing, and/or camera shake correction processing), and/or enlargement processing (electronic zoom processing). Further, the image processing unit 5061 performs detection processing on the image signal for performing AE, AF, and AWB.

The image processing unit 5061 is constituted by a processor such as a CPU or a GPU, and the above-described image processing and detection processing can be performed by the processor operating according to a predetermined program. Note that in the case where the image processing unit 5061 is constituted by a plurality of GPUs, the image processing unit 5061 appropriately divides the information related to the image signal and performs the image processing in parallel by the plurality of GPUs.

The controller 5063 performs various controls related to capturing an image of the surgical site by the endoscope 5001 and displaying the captured image. For example, the controller 5063 generates a control signal for controlling the driving of the camera head 5005. At this time, in the case where imaging conditions have been input by the user, the controller 5063 generates the control signal based on the user input. Alternatively, in the case where the endoscope 5001 has the AE function, the AF function, and the AWB function, the controller 5063 appropriately calculates the optimum exposure value, focal length, and white balance from the result of the detection processing performed by the image processing unit 5061, and generates the control signal.

Further, the controller 5063 causes the display device 5041 to display the image of the surgical site based on the image signal on which the image processing unit 5061 has performed image processing. At this time, the controller 5063 recognizes various objects in the image of the surgical site using various image recognition techniques. For example, the controller 5063 can recognize a surgical tool such as forceps, a specific living body part, bleeding, mist when the energy treatment tool 5021 is used, and the like by detecting the shape, color, and the like of the edges of objects in the image of the surgical site. When displaying the image of the surgical site on the display device 5041, the controller 5063 superimposes various types of surgery support information on the image of the surgical site using the recognition result. By superimposing the surgery support information and presenting it to the operator 5067, the surgery can be performed more safely and reliably.

The transmission cable 5065 connecting the camera head 5005 and the CCU 5039 is an electric signal cable supporting electric signal communication, an optical fiber cable supporting optical communication, or a composite cable thereof.

Here, in the illustrated example, wired communication is performed using the transmission cable 5065, but wireless communication may be performed between the camera head 5005 and the CCU 5039. In the case where wireless communication is performed between the two, the transmission cable 5065 does not need to be laid in the operating room. This may address situations where movement of medical personnel in the operating room is impeded by the transmission cable 5065.

Examples of the endoscopic surgical system 5000 to which the technique according to the present disclosure can be applied have been described above. Note that although the endoscopic surgical system 5000 has been described here as an example, a system to which the technique according to the present disclosure can be applied is not limited to this example. For example, the technique according to the present disclosure may be applied to a flexible endoscope system for examination or a microsurgical system.

The technique according to the present disclosure can be applied to, for example, a camera head in the above-described configuration.

[ configuration of the present disclosure ]

Note that the present disclosure may also have the following configuration.

[A1]

A ranging system, comprising:

a light source unit that emits infrared light to a target object;

a light receiving unit that receives infrared light from a target object; and

an arithmetic processing unit that obtains information on a distance to the target object based on the data from the light receiving unit,

wherein an optical member including a band-pass filter selectively transparent to infrared light in a predetermined wavelength range is arranged on a light receiving surface side of the light receiving unit, and

the band pass filter has a concave light incident surface.

[A2]

The ranging system according to [ a1], wherein,

the optical member includes a lens disposed on a light incident surface side of the band-pass filter, and

the light at the maximum image height has an incident angle of 10 degrees or less with respect to a light incident surface of the band-pass filter.

[A3]

The ranging system according to [A1] or [A2], wherein,

the half width of the transmission band of the band-pass filter is 50 nm or less.
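Note that, as an illustrative aside (not part of the enumerated configurations), the reason for combining a concave incident surface, an incident angle of 10 degrees or less, and a narrow pass band can be sketched with the standard thin-film relation for the blue-shift of an interference filter's center wavelength at oblique incidence. The effective refractive index used below is an assumed, illustrative value, not one stated in this disclosure:

```python
import math

# Illustrative sketch only: angle-dependent center-wavelength shift of an
# interference band-pass filter (standard thin-film approximation).
# n_eff = 1.8 is an assumed effective refractive index of the filter stack,
# not a value taken from this disclosure.
def shifted_center_wavelength(lambda0_nm: float, theta_deg: float,
                              n_eff: float = 1.8) -> float:
    """Center wavelength (nm) of the pass band at incidence angle theta_deg."""
    s = math.sin(math.radians(theta_deg)) / n_eff
    return lambda0_nm * math.sqrt(1.0 - s * s)

# At normal incidence there is no shift; at 10 degrees a 940 nm filter
# shifts by only a few nanometers, so a pass band whose half width is
# 50 nm or less still transmits the source wavelength.
print(shifted_center_wavelength(940.0, 0.0))   # 940.0
print(shifted_center_wavelength(940.0, 10.0))
```

This is why keeping the incidence on the concave filter surface within about 10 degrees allows the pass band to be made narrow without the source wavelength falling outside it at the edge of the field.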

[A4]

The ranging system according to any one of [A1] to [A3], wherein,

the band-pass filter comprises

a first filter transparent to light in a predetermined wavelength range of infrared light, and

a second filter opaque to visible light and transparent to infrared light.

[A5]

The ranging system according to [A4], wherein,

the first filter and the second filter are stacked and formed on one side of the substrate.

[A6]

The ranging system according to [A4], wherein,

the first filter is formed on one surface of the substrate, and

the second filter is formed on the other surface of the substrate.

[A7]

The ranging system according to any one of [A4] to [A6], wherein,

the first filter is arranged on the light incidence surface side, and

the second filter is arranged on the light receiving unit side.

[A8]

The ranging system according to [A7], wherein,

the second filter has a concave shape conforming to the light incident surface.

[A9]

The ranging system according to [A7], wherein,

the second filter has a planar shape.

[A10]

The ranging system according to any one of [A4] to [A6], wherein,

the second filter is arranged on the light incidence surface side, and

the first filter is arranged on the light receiving unit side.

[A11]

The ranging system according to [A10], wherein,

the first filter has a concave shape conforming to the light incident surface.

[A12]

The ranging system according to any one of [A1] to [A11], wherein,

the light source unit includes an infrared laser element or an infrared light emitting diode element.

[A13]

The ranging system according to any one of [A1] to [A12], wherein,

the light source unit emits infrared light having a center wavelength of about 850 nm, about 905 nm, or about 940 nm.

[A14]

The ranging system according to any one of [A1] to [A13], wherein,

the arithmetic processing unit obtains distance information based on a time of flight of light reflected from the target object.
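Note that, as a hedged illustration (not part of the enumerated configurations), the time-of-flight relation underlying [A14] reduces to distance = (speed of light × round-trip time) / 2. The function name below is hypothetical:

```python
# Illustrative sketch only: distance from round-trip time of flight.
# Assumes an ideal direct-ToF measurement; the function name is
# hypothetical and not taken from this disclosure.
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def distance_from_tof(round_trip_seconds: float) -> float:
    """Target distance in meters from the round-trip time of flight."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# Example: a 10 ns round trip corresponds to roughly 1.5 m.
print(distance_from_tof(10e-9))
```

In practice the arithmetic processing unit would apply this relation per pixel (or per phase measurement, for indirect ToF) to the data from the light receiving unit.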

[A15]

The ranging system according to any one of [A1] to [A13], wherein,

the infrared light is emitted to the target object in a predetermined pattern, and

the arithmetic processing unit obtains distance information based on a pattern of light reflected from the target object.
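Note that, for pattern-based (structured-light) ranging as in [A15], depth is typically recovered by triangulating the observed displacement of the projected pattern. The following is a minimal sketch; the pinhole-camera model and all names are illustrative assumptions, not taken from this disclosure:

```python
# Illustrative sketch only: pinhole-model triangulation for structured light.
# The projector and camera are assumed separated by a known baseline; a
# projected pattern feature appears shifted (disparity) in the camera image.
def depth_from_pattern_shift(focal_length_px: float, baseline_m: float,
                             disparity_px: float) -> float:
    """Depth in meters from the observed shift of a projected pattern feature."""
    if disparity_px <= 0.0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Example: 800 px focal length, 5 cm baseline, 20 px shift -> 2 m depth.
print(depth_from_pattern_shift(800.0, 0.05, 20.0))
```

The arithmetic processing unit would evaluate such a relation for each decoded pattern feature to build a depth map of the target object.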

[B1]

A light receiving module comprising:

a light receiving unit that receives infrared light; and

an optical member arranged on a light receiving surface side of the light receiving unit and including a band-pass filter selectively transparent to infrared light in a predetermined wavelength range,

wherein the band pass filter has a concave light incident surface.

[B2]

The light receiving module according to [B1], wherein,

the optical member includes a lens disposed on a light incident surface side of the band-pass filter.

[B3]

The light receiving module according to [B2], wherein,

the light at the maximum image height has an incident angle of 10 degrees or less with respect to a light incident surface of the band-pass filter.

[B4]

The light receiving module according to any one of [B1] to [B3], wherein,

the half width of the transmission band of the band-pass filter is 50 nm or less.

[B5]

The light receiving module according to any one of [B1] to [B4], wherein,

the band-pass filter comprises

a first filter transparent to light in a predetermined wavelength range of infrared light, and

a second filter opaque to visible light and transparent to infrared light.

[B6]

The light receiving module according to [B5], wherein,

the first filter and the second filter are stacked and formed on one side of the substrate.

[B7]

The light receiving module according to [B5], wherein,

the first filter is formed on one surface of the substrate, and

the second filter is formed on the other surface of the substrate.

[B8]

The light receiving module according to any one of [B5] to [B7], wherein,

the first filter is arranged on the light incidence surface side, and

the second filter is arranged on the light receiving unit side.

[B9]

The light receiving module according to [B8], wherein,

the second filter has a concave shape conforming to the light incident surface.

[B10]

The light receiving module according to [B8], wherein,

the second filter has a planar shape.

[B11]

The light receiving module according to any one of [B5] to [B7], wherein,

the second filter is arranged on the light incidence surface side, and

the first filter is arranged on the light receiving unit side.

[B12]

The light receiving module according to [B11], wherein,

the first filter has a concave shape conforming to the light incident surface.

List of reference marks

1, 1A, 1B, 1C, and 1D ranging system

10, 10A, 10B, and 90 optical member

11 lens

12, 92 band-pass filter

12A first filter

12B second filter

12C band-pass filter layer

12D antireflection film

13, 13A substrate transparent to infrared light

14A frame

14B adhesive member

15, 15A diaphragm

16 suction mold

16A recess

16B opening

20, 20A, and 20B light receiving unit

30, 30A, and 30B analog-to-digital conversion unit

40, 40A, and 40B arithmetic processing unit

50 controller

60 light source driving unit

70 light source unit

71 light diffusing member

72 scanning unit

73 pattern projection unit

80 synthesis processing unit

120 wafer-shaped band-pass filter group

140 wafer-shaped frame

200 wafer-shaped imaging element group
