Image pickup apparatus

Document No.: 1958053  Publication date: 2021-12-10

Abstract: This technology, "Image pickup apparatus" (摄像装置), was created by 畑野启介 (Hatano Keisuke) on 2020-05-27. Its main content is as follows: The invention provides an image pickup apparatus capable of suppressing degradation of image quality. The image pickup apparatus is provided with: a first semiconductor substrate having a light input surface and provided with a photoelectric conversion portion; a second semiconductor substrate disposed on a side of the first semiconductor substrate opposite to the light input surface; an insulating film provided on a side of the first semiconductor substrate on which the light input surface is provided; at least one of a cutout portion and a hole portion extending in a thickness direction of the insulating film; an injection film injected into at least a part, in a depth direction, of the at least one of the cutout portion and the hole portion; a protective member that is opposed to the first semiconductor substrate, the insulating film being located between the protective member and the first semiconductor substrate; and a bonding member that includes a material different from that of the injection film and is provided between the protective member and the insulating film.

1. An image pickup apparatus comprising:

a first semiconductor substrate including a light input surface and provided with a photoelectric conversion portion;

a second semiconductor substrate disposed on a side of the first semiconductor substrate opposite to the light input surface;

an insulating film provided on a side of the first semiconductor substrate on which the light input surface is arranged;

at least one of a cutout portion and a hole portion extending at least in a thickness direction of the insulating film;

an injection film injected into a part or all of the at least one of the cutout portion and the hole portion in a depth direction;

a protective member that is opposed to the first semiconductor substrate with the insulating film therebetween; and

a bonding member that includes a material different from that of the injection film and is disposed between the protective member and the insulating film.

2. The image pickup apparatus according to claim 1, wherein the cutout portion is provided in the periphery of the insulating film and extends through the insulating film and the first semiconductor substrate.

3. The image pickup apparatus according to claim 2, wherein the injection film contains an insulating material.

4. The image pickup apparatus according to claim 1, further comprising:

a lens opposed to the photoelectric conversion portion, the insulating film being located between the lens and the photoelectric conversion portion; and

a planarization film covering the lens and including the same material as that of the injection film.

5. The image pickup apparatus according to claim 4, wherein a refractive index of a material of the planarization film and a refractive index of a material of the injection film are lower than a refractive index of a material of the lens.

6. The image pickup apparatus according to claim 1, further comprising a pad electrode provided between the first semiconductor substrate and the second semiconductor substrate, wherein,

the hole portion extends through the insulating film and the first semiconductor substrate and reaches the pad electrode.

7. The image pickup apparatus according to claim 6, wherein the injection film contains a conductive material.

8. The image pickup apparatus according to claim 6, wherein the injection film contains a metal material.

9. The image pickup apparatus according to claim 6, further comprising a multilayer wiring layer in which the pad electrode is provided.

10. The image pickup apparatus according to claim 9, further comprising an external connection terminal electrically connected to the pad electrode and provided on a surface of the second semiconductor substrate opposite to the multilayer wiring layer.

11. The image pickup apparatus according to claim 1, wherein the injection film is injected into the entirety, in a depth direction, of the at least one of the cutout portion and the hole portion.

12. The image pickup apparatus according to claim 1, wherein the injection film is injected into a part, in a depth direction, of the at least one of the cutout portion and the hole portion.

13. The image pickup apparatus according to claim 1, wherein both the cutout portion and the hole portion are provided, and the injection film is injected into the cutout portion and the hole portion.

14. The image pickup apparatus according to claim 1, wherein the at least one of the cutout portion and the hole portion has a width gradually decreasing in the depth direction.

15. The image pickup apparatus according to claim 1, wherein the at least one of the cutout portion and the hole portion has a width that decreases stepwise in the depth direction.

Technical Field

The present invention relates to an image pickup apparatus including a semiconductor substrate.

Background

In recent years, image pickup apparatuses such as a CSP (chip size package) have been developed (for example, see patent document 1 and patent document 2). Such an image pickup apparatus includes, for example, a semiconductor substrate and a protective member facing the semiconductor substrate. The semiconductor substrate is provided with a photoelectric conversion portion such as a photodiode. The protective member is bonded to the semiconductor substrate, for example, by a bonding member including a resin material.

List of cited documents

Patent document

Patent document 1: Japanese unexamined patent application publication No. 2015-159275

Patent document 2: Japanese unexamined patent application publication No. 2008-270650

Disclosure of Invention

In such an image pickup apparatus, it is desirable to suppress a decrease in image quality caused by, for example, flare.

Accordingly, it is desirable to provide an image pickup apparatus capable of suppressing a decrease in image quality.

An image pickup apparatus according to an embodiment of the present invention includes: a first semiconductor substrate; a second semiconductor substrate; an insulating film; at least one of a cutout portion and a hole portion; an injection film; a protective member; and a bonding member. The first semiconductor substrate includes a light input surface and is provided with a photoelectric conversion portion. The second semiconductor substrate is disposed on a side of the first semiconductor substrate opposite to the light input surface. The insulating film is provided on a side of the first semiconductor substrate on which the light input surface is disposed. The at least one of the cutout portion and the hole portion extends at least in a thickness direction of the insulating film. The injection film is injected into a part or all of the at least one of the cutout portion and the hole portion in the depth direction. The protective member is opposed to the first semiconductor substrate, and the insulating film is located between the protective member and the first semiconductor substrate. The bonding member includes a material different from that of the injection film and is disposed between the protective member and the insulating film.

In the image pickup apparatus according to the embodiment of the present invention, the injection film is injected into a part or all of the at least one of the cutout portion and the hole portion in the depth direction. The injection film includes a material different from that of the bonding member. Therefore, the bonding member between the protective member and the insulating film can be formed thinner than in the case where the cutout portion or the hole portion is filled with the bonding member.

Drawings

Fig. 1 is a block diagram showing an example of a functional configuration of an image pickup apparatus according to a first embodiment of the present invention.

Fig. 2 is a schematic diagram showing a cross-sectional configuration of a main portion of the image pickup apparatus shown in fig. 1.

Fig. 3 is a schematic diagram showing another example (1) of the cross-sectional configuration of the image pickup apparatus shown in fig. 2.

Fig. 4 is a schematic diagram showing another example (2) of the cross-sectional configuration of the image pickup apparatus shown in fig. 2.

Fig. 5 is a schematic view showing a planar configuration of the cutout portion and the like shown in fig. 2.

Fig. 6A is a schematic cross-sectional view showing a procedure of a manufacturing method of the image pickup apparatus shown in fig. 2.

Fig. 6B is a schematic cross-sectional view showing a process subsequent to fig. 6A.

Fig. 6C is a schematic cross-sectional view showing a process after fig. 6B.

Fig. 6D is a schematic cross-sectional view showing a process after fig. 6C.

Fig. 6E is a schematic cross-sectional view showing a process after fig. 6D.

Fig. 6F is a schematic cross-sectional view showing a process after fig. 6E.

Fig. 6G is a schematic cross-sectional view showing a process after fig. 6F.

Fig. 6H is a schematic cross-sectional view showing a process after fig. 6G.

Fig. 7 is a schematic cross-sectional view showing a procedure of the manufacturing method of the image pickup apparatus shown in fig. 3.

Fig. 8 is a schematic cross-sectional view showing a procedure of the manufacturing method of the image pickup apparatus shown in fig. 4.

Fig. 9 is a schematic diagram showing a cross-sectional configuration of a main portion of an image pickup apparatus according to a comparative example.

Fig. 10A is a schematic diagram for explaining reflected light generated in the image pickup apparatus shown in fig. 9.

Fig. 10B is a schematic diagram for explaining reflected light generated in the image pickup apparatus shown in fig. 2.

Fig. 11 is a schematic diagram showing a cross-sectional configuration of a main portion of an image pickup apparatus according to modification 1.

Fig. 12A is a schematic cross-sectional view showing a procedure of a manufacturing method of the image pickup apparatus shown in fig. 11.

Fig. 12B is a schematic cross-sectional view showing a process subsequent to fig. 12A.

Fig. 13 is a schematic diagram showing a cross-sectional configuration of a main portion of an image pickup apparatus according to modification 2.

Fig. 14A is a schematic cross-sectional view showing a procedure of the manufacturing method of the image pickup apparatus shown in fig. 13.

Fig. 14B is a schematic cross-sectional view showing a process subsequent to fig. 14A.

Fig. 14C is a schematic cross-sectional view showing a process after fig. 14B.

Fig. 14D is a schematic cross-sectional view showing a process after fig. 14C.

Fig. 15 is a schematic diagram showing a cross-sectional configuration of a main portion of an image pickup apparatus according to a second embodiment of the present invention.

Fig. 16 is a schematic view showing a planar configuration of the hole portion shown in fig. 15.

Fig. 17 is a schematic diagram showing another example of the cross-sectional configuration of the image pickup apparatus shown in fig. 15.

Fig. 18A is a schematic cross-sectional view showing a procedure of a manufacturing method of the image pickup apparatus shown in fig. 15.

Fig. 18B is a schematic cross-sectional view showing a process subsequent to fig. 18A.

Fig. 19 is a schematic diagram showing a cross-sectional configuration of a main portion of an image pickup apparatus according to modification 3.

Fig. 20 is a schematic cross-sectional view showing a procedure of the manufacturing method of the image pickup apparatus shown in fig. 19.

Fig. 21 is a block diagram showing an example of an electronic apparatus including the image pickup device shown in fig. 1 and the like.

Fig. 22 is a block diagram showing an example of a schematic configuration of the in-vivo information acquisition system.

Fig. 23 is a diagram showing an example of a schematic configuration of an endoscopic surgery system.

Fig. 24 is a block diagram showing an example of a functional configuration of a camera head and a Camera Control Unit (CCU).

Fig. 25 is a block diagram showing an example of a schematic configuration of a vehicle control system.

Fig. 26 is a view for assisting in explaining an example of mounting positions of the vehicle exterior information detecting unit and the imaging unit.

Detailed Description

Some embodiments of the present invention will be described in detail below with reference to the accompanying drawings. Note that the description will be made in the following order.

1. First embodiment (image pickup apparatus in which an injection film containing an insulating material is provided in the cutout portion)

2. Modification 1 (example in which the injection film is injected into a part of the cutout portion in the depth direction)

3. Modification 2 (example in which the planarization film fills the cutout portion)

4. Second embodiment (image pickup apparatus in which an injection film containing a conductive material is provided in the hole portion)

5. Modification 3 (example having both the cutout portion and the hole portion)

6. Application example (electronic equipment)

7. Practical application example

<1. first embodiment >

(functional configuration of the image pickup apparatus 1)

Fig. 1 shows an example of a functional configuration of an image pickup apparatus (image pickup apparatus 1) according to an embodiment of the present invention. The image pickup apparatus 1 includes a pixel unit 200P and a circuit section 200C that drives the pixel unit 200P. The pixel unit 200P includes, for example, a plurality of light receiving unit regions (pixels P) arranged two-dimensionally. The circuit section 200C includes, for example, a row scanning unit 201, a horizontal selector unit 203, a column scanning unit 204, and a system control unit 202.

In the pixel unit 200P, for example, a pixel driving line Lread (e.g., a row selection line and a reset control line) is arranged for each pixel row, while a vertical signal line Lsig is arranged for each pixel column. The pixel driving line Lread transmits a driving signal for reading signals from the pixel unit 200P. One end of each pixel driving line Lread is connected to the output end of the row scanning unit 201 corresponding to the associated row. The pixel unit 200P includes, for example, a pixel circuit provided for each pixel P.

The row scanning unit 201 includes, for example, a shift register and an address decoder, and functions as, for example, a pixel driver that drives each pixel P of the pixel unit 200P in units of rows. Signals output from the respective pixels P of the pixel row selected and scanned by the row scanning unit 201 are supplied to the horizontal selector unit 203 through the respective vertical signal lines Lsig. The horizontal selector unit 203 includes, for example, amplifiers and horizontal selector switches provided for the respective vertical signal lines Lsig.

The column scanning unit 204 includes, for example, a shift register and an address decoder, and sequentially drives the respective horizontal selector switches of the horizontal selector unit 203 while scanning the horizontal selector switches. By selection and scanning of the column scanning unit 204, signals of the respective pixels P transmitted through the respective vertical signal lines Lsig are sequentially output to the horizontal signal lines 205. The output signals are input to, for example, a signal processor not shown through respective horizontal signal lines 205.

The system control unit 202 receives an externally supplied clock and data such as commands that designate the operation mode. Further, the system control unit 202 outputs data such as internal information of the image pickup apparatus 1. The system control unit 202 also includes a timing generator that generates various timing signals. For example, the system control unit 202 performs drive control of the row scanning unit 201, the horizontal selector unit 203, and the column scanning unit 204 in accordance with the various timing signals generated by the timing generator.
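The row-by-row readout flow described above can be sketched as a small model. This is an illustrative sketch, not part of the patent: the function name and array contents are hypothetical, and the analog signal paths and timing signals are omitted.

```python
# Illustrative model of the readout order described above (not from the
# patent): the row scanning unit 201 selects pixel rows one at a time via
# the drive lines Lread, and the column scanning unit 204 then scans the
# horizontal selector switches so that the signals on the vertical signal
# lines Lsig reach the horizontal signal line 205 column by column.

def read_out(pixel_array):
    """Return pixel values in the order they reach the horizontal signal line."""
    output = []
    for row in pixel_array:       # row scanning unit: one row at a time
        for value in row:         # column scanning unit: Lsig columns in order
            output.append(value)  # transferred to horizontal signal line 205
    return output

frame = [[10, 20, 30],
         [40, 50, 60]]            # a hypothetical 2x3 pixel unit 200P
print(read_out(frame))            # [10, 20, 30, 40, 50, 60]
```

The model only captures the scan order (rows selected sequentially, columns scanned within each row), which is the point of the row/column scanning description.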

(construction of main portion of image pickup apparatus 1)

Fig. 2 is a schematic cross-sectional view showing the configuration of a main part of the image pickup apparatus 1. Referring to fig. 2, a specific configuration of the image pickup apparatus 1 will be explained.

The image pickup apparatus 1 is a CSP, and includes, for example, the logic chip 10, the sensor chip 20, and the protective member 40 in this order. A bonding surface S is formed between the logic chip 10 and the sensor chip 20. Between the sensor chip 20 and the protective member 40, an insulating film 31, a microlens 32, a planarization film 33, and a bonding member 34 are provided in this order from the side on which the sensor chip 20 is arranged. For example, the image pickup apparatus 1 is configured such that the side on which the logic chip 10 is arranged is mounted on a printed circuit board such as a motherboard. On the side on which the logic chip 10 is arranged, the image pickup apparatus 1 includes a redistribution line 51, a solder bump 52, and a protective resin layer 53. For example, the logic chip 10 and the sensor chip 20 are electrically connected through a through via (not shown). As an alternative to through vias, the logic chip 10 and the sensor chip 20 may be electrically connected by a direct metal bond such as a Cu-Cu bond. Here, the microlens 32 corresponds to one specific example of the "lens" of the present invention. The solder bump 52 corresponds to one specific example of the "external connection terminal".

The logic chip 10 includes, for example, a semiconductor substrate 11 and a multilayer wiring layer 12, and has a structure in which they are stacked. The logic chip 10 includes, for example, a logic circuit and a control circuit. The entire circuit section 200C (fig. 1) may be provided in the logic chip 10. Alternatively, a part of the circuit section 200C may be provided in the sensor chip 20, and the remaining part of the circuit section 200C may be provided in the logic chip 10. Here, the semiconductor substrate 11 corresponds to one specific example of the "second semiconductor substrate" of the present invention, and the multilayer wiring layer 12 corresponds to one specific example of the "multilayer wiring layer" of the present invention.

The semiconductor substrate 11 is opposed to the protective member 40 with the multilayer wiring layer 12 and the sensor chip 20 therebetween. The multilayer wiring layer 12 is provided on one main surface (X-Y plane) of the semiconductor substrate 11, and the redistribution line 51 and the like are provided on the other main surface of the semiconductor substrate 11. The semiconductor substrate 11 includes, for example, a silicon (Si) substrate. The thickness (dimension in the Z-axis direction) of the semiconductor substrate 11 is, for example, 50 μm to 150 μm.

The multilayer wiring layer 12 is provided between the semiconductor substrate 11 and the sensor chip 20. The multilayer wiring layer 12 includes a plurality of pad electrodes 12M and an interlayer insulating film 122 that separates the plurality of pad electrodes 12M. The pad electrode 12M is made of, for example, copper (Cu), aluminum (Al), or the like. The interlayer insulating film 122 is made of, for example, a silicon oxide (SiO) film, a silicon nitride (SiN) film, or the like. The multilayer wiring layer 12 includes a plurality of wirings (not shown) separated from each other by the interlayer insulating film 122. For example, the bonding surface S is provided between the multilayer wiring layer 12 and the sensor chip 20.

A hole H is provided at a predetermined position of the semiconductor substrate 11. The hole H is provided for electrical connection of the pad electrode 12M and the redistribution line 51. The hole H extends from the other main surface of the semiconductor substrate 11 to the main surface of the semiconductor substrate 11 through the semiconductor substrate 11, and reaches the pad electrode 12M of the multilayer wiring layer 12.

The redistribution line 51 is disposed near the hole H and covers the side walls and the bottom surface of the hole H. On the bottom surface of the hole H, the redistribution line 51 is in contact with the pad electrode 12M of the multilayer wiring layer 12. The redistribution line 51 extends from the hole H onto the other main surface of the semiconductor substrate 11, and is led to the region where the solder bump 52 is formed. The redistribution line 51 is arranged in a selective region of the other main surface of the semiconductor substrate 11. The redistribution line 51 is made of, for example, copper (Cu), tungsten (W), titanium (Ti), tantalum (Ta), titanium tungsten (TiW), polysilicon, or the like. The thickness of the redistribution line 51 is, for example, about several micrometers to several tens of micrometers.

Between the redistribution lines 51 and the semiconductor substrate 11, an insulating film (not shown) is provided. The insulating film covers the side walls of the holes H from the other main surface of the semiconductor substrate 11. The insulating film includes, for example, a silicon oxide (SiO) film, a silicon nitride (SiN) film, or the like.

The solder bumps 52 are connected to the redistribution line 51 led to the other main surface of the semiconductor substrate 11. The solder bumps 52 serve as external connection terminals for mounting on the printed circuit board, and contain, for example, lead-free high-melting-point solder such as tin (Sn)-silver (Ag)-copper (Cu). For example, a plurality of solder bumps 52 are provided on the other main surface of the semiconductor substrate 11 in a regular arrangement at a predetermined pitch. The arrangement of the solder bumps 52 is appropriately set according to the positions of the bonding pads on the printed circuit board (not shown) on which the image pickup apparatus 1 is to be mounted. The solder bump 52 is electrically connected to the pad electrode 12M of the multilayer wiring layer 12 through the redistribution line 51. Other external connection terminals may be used instead of the solder bumps 52; for example, the external connection terminal may include a metal film of copper (Cu), nickel (Ni), or the like formed using a plating method.

The protective resin layer 53 provided on the other main surface of the semiconductor substrate 11 protects the redistribution line 51. The protective resin layer 53 has an opening for exposing a part of the redistribution line 51. The solder bumps 52 are arranged in the openings of the protective resin layer 53; that is, the solder bump 52 is connected to the redistribution line 51 at a portion exposed from the protective resin layer 53. The protective resin layer 53 is, for example, a solder resist, and includes an epoxy-based resin, a polyimide-based resin, a silicone-based resin, an acrylic resin, or the like.

The sensor chip 20 provided between the logic chip 10 and the protective member 40 includes, for example, a multilayer wiring layer (not shown) and a semiconductor substrate 21 in this order from the side where the logic chip 10 is arranged. Here, the semiconductor substrate 21 corresponds to one specific example of the "first semiconductor substrate" of the present invention.

The multilayer wiring layer of the sensor chip 20 is in contact with the multilayer wiring layer 12 of the logic chip 10; the bonding surface S between the sensor chip 20 and the logic chip 10 is provided between them, for example. The multilayer wiring layer of the sensor chip 20 includes a plurality of wirings and an interlayer insulating film separating the plurality of wirings. In the multilayer wiring layer of the sensor chip 20, for example, a pixel circuit of the pixel unit 200P (fig. 1) is provided.

The semiconductor substrate 21 includes, for example, a silicon (Si) substrate. The semiconductor substrate 21 is provided with a light input surface 21S. For example, one main surface of the semiconductor substrate 21 constitutes the light input surface 21S, and the multilayer wiring layer is provided on the other main surface. In the semiconductor substrate 21 of the sensor chip 20, a photodiode (PD) 211 is provided for each pixel P. The PD 211 is disposed near the light input surface 21S of the semiconductor substrate 21. Here, the PD 211 corresponds to one specific example of the "photoelectric conversion portion" of the present invention.

The insulating film 31 provided between the semiconductor substrate 21 and the microlens 32 has a function of flattening the light input surface 21S of the semiconductor substrate 21. The insulating film 31 contains, for example, silicon oxide (SiO) or the like. Here, the insulating film 31 corresponds to a specific example of the "insulating film" of the present invention.

The microlens 32 on the insulating film 31 is provided, for each pixel P, at a position opposed to the PD 211 of the sensor chip 20. The microlens 32 is configured to collect, for each pixel P, light entering the microlens 32 onto the PD 211. The lens characteristics of the microlens 32 are set to values corresponding to the size of the pixel P. Examples of the lens material of the microlens 32 include a silicon oxide (SiO) film, a silicon nitride (SiN) film, and the like. The microlens 32 may include an organic material. For example, the material constituting the microlens 32 is disposed in a film shape outside the pixel unit 200P. A color filter may be disposed between the microlens 32 and the insulating film 31.

The planarization film 33 is disposed between the microlens 32 and the bonding member 34. The planarization film 33 is provided on almost the entire light input surface 21S of the semiconductor substrate 21 so as to cover the microlens 32. This flattens the light input surface 21S of the semiconductor substrate 21 on which the microlenses 32 are provided. The planarization film 33 includes, for example, a silicon oxide (SiO) film or a resin material. Examples of the resin material include epoxy-based resins, polyimide-based resins, silicone-based resins, and acrylic resins. For example, the planarization film 33 is provided with the cutout portion C along the thickness direction.

For example, the cutout portion C extends from the planarization film 33 in the stacking direction (Z-axis direction) of the image pickup apparatus 1. For example, the cutout portion C is provided in the planarization film 33, the insulating film 31, the sensor chip 20, and the logic chip 10. That is, the cutout portion C extends through the planarization film 33, the insulating film 31, the semiconductor substrate 21, and the multilayer wiring layer 12. The cutout portion C is formed, for example, by digging from the planarization film 33 partway through the semiconductor substrate 11 in the thickness direction (the groove V of fig. 6B, described later). For example, the bottom surface of the cutout portion C is located inside the semiconductor substrate 11 of the logic chip 10. It is sufficient for the cutout portion C to extend at least in the thickness direction of the insulating film 31. For example, the cutout portion C may extend from the insulating film 31 in the stacking direction of the image pickup apparatus 1. The cutout portion C has, for example, a rectangular cross-sectional shape.

Fig. 3 and fig. 4 show other examples of the cross-sectional configuration of the image pickup apparatus 1. As shown in these figures, the cutout portion C of the image pickup apparatus 1 may have a cross-sectional shape other than a rectangular shape. For example, as shown in fig. 3, the cutout portion C may have a tapered shape; specifically, the width of the cutout portion C gradually decreases from the planarization film 33 toward the semiconductor substrate 11. Alternatively, as shown in fig. 4, the cutout portion C may have steps; specifically, the width of the cutout portion C decreases stepwise from the planarization film 33 toward the semiconductor substrate 11.

Fig. 5 shows an example of the planar shape of the cutout portion C. The cross-sectional configuration along the line II-II' shown in fig. 5 corresponds to fig. 2. The cutout portion C is provided, for example, on the periphery of the image pickup apparatus 1 (the insulating film 31), and surrounds the pixel unit 200P in plan view. The planar shape of the cutout portion C is, for example, rectangular.

In the present embodiment, the cutout portion C is filled with the injection film 35. The injection film 35 is distinct from the bonding member 34 and contains a material different from that of the bonding member 34. As described later in detail, this allows the bonding member 34 to be formed thinner than in the case where the cutout portion C is filled with the bonding member 34.

For example, the injection film 35 fills the cutout portion C over its entire depth, starting from the bottom surface of the cutout portion C. The front surface of the planarization film 33 (the surface on the side on which the bonding member 34 is arranged) and the front surface of the injection film 35 are substantially flush with each other. The injection film 35 includes, for example, an insulating material having low water permeability, such as an inorganic insulating material, e.g., silicon nitride (SiN) or silicon oxynitride (SiON). The injection film 35 may include an organic insulating material such as siloxane. As described above, providing the cutout portion C in the periphery of the image pickup apparatus 1 and injecting the injection film 35 having low water permeability into the cutout portion C makes it possible to suppress intrusion of moisture into the image pickup apparatus 1 through the end portion.

The protective member 40 is opposed to the sensor chip 20 with the insulating film 31, the microlens 32, and the planarization film 33 therebetween. The protective member 40 covers the light input surface 21S of the semiconductor substrate 21. The protective member 40 includes, for example, a transparent substrate such as a glass substrate. An IR (infrared) cut filter or the like may be provided, for example, on the front surface of the protective member 40 (the surface opposite to the side on which the sensor chip 20 is arranged) or on the rear surface of the protective member 40 (the surface on the side on which the sensor chip 20 is arranged). The protective member 40 is opposed to the logic chip 10 with the sensor chip 20 therebetween.

The bonding member 34 provided between the protective member 40 and the microlens 32 has, for example, a refractive index substantially the same as that of the protective member 40. For example, in the case where the protective member 40 is a glass substrate, the bonding member 34 preferably includes a material having a refractive index of about 1.51. The bonding member 34 is provided to fill the space between the protective member 40 and the sensor chip 20; that is, the image pickup apparatus 1 has a so-called cavity-less structure. The bonding member 34 includes, for example, a light-transmitting resin material. The thickness of the bonding member 34 is, for example, 10 μm to 50 μm.
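The benefit of matching the bonding member's refractive index to the protective member can be made concrete with normal-incidence Fresnel reflectance. This is an illustrative calculation, not a statement from the patent: the ~1.51 index comes from the text above, while the air-cavity comparison value is an assumption for contrast.

```python
# Normal-incidence Fresnel reflectance R = ((n1 - n2) / (n1 + n2))^2 at a
# planar interface. With the bonding member 34 index-matched to the glass
# protective member 40 (about 1.51, per the text), the glass/resin interface
# reflects essentially nothing, whereas a hypothetical glass/air cavity
# would reflect about 4% of the light, a potential source of flare.

def fresnel_reflectance(n1, n2):
    """Fraction of light reflected at the n1/n2 interface, normal incidence."""
    return ((n1 - n2) / (n1 + n2)) ** 2

glass = 1.51          # protective member 40 (glass substrate)
matched_resin = 1.51  # bonding member 34 matched to the glass
air = 1.00            # hypothetical cavity for comparison

print(fresnel_reflectance(glass, matched_resin))  # 0.0
print(round(fresnel_reflectance(glass, air), 3))  # 0.041
```

This is one way to see why a cavity-less structure with an index-matched resin suppresses the internal reflections that degrade image quality.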

(method of manufacturing the image pickup apparatus 1)

Next, a method of manufacturing the image pickup apparatus 1 will be described with reference to fig. 6A to 6H.

First, as shown in fig. 6A, the logic wafer 10W and the sensor wafer 20W are bonded to form the bonding surface S. The logic wafer 10W includes the semiconductor substrate 11 and the multilayer wiring layer 12. The sensor wafer 20W includes the semiconductor substrate 21 and a multilayer wiring layer (not shown). The PD 211 is formed in the semiconductor substrate 21. Further, on the light input surface 21S of the semiconductor substrate 21, the insulating film 31, the microlens 32, and the planarization film 33 are formed. Each of the logic wafer 10W and the sensor wafer 20W is provided with a plurality of chip regions A. In a later process, the logic wafer 10W is singulated for each chip region A to form the logic chip 10, while the sensor wafer 20W is singulated for each chip region A to form the sensor chip 20.

Next, as shown in fig. 6B, grooves V are formed in the dicing lines between adjacent chip regions A. In a subsequent step, the groove V becomes the notch C of the image pickup device 1. For example, the groove V extends from the front surface of the planarization film 33 through the insulating film 31, the sensor wafer 20W, and the multilayer wiring layer 12, and is dug partway in the thickness direction of the semiconductor substrate 11. The groove V is formed, for example, with a rectangular cross-sectional shape.

Figs. 7 and 8 show other examples of the process of forming the groove V. As shown in fig. 7, the groove V may have a shape whose width gradually decreases from the planarization film 33 toward the semiconductor substrate 11; that is, the groove V may be formed in a tapered shape. By forming the groove V shown in fig. 7, the notch portion C shown in fig. 3 is formed in a subsequent step. Alternatively, as shown in fig. 8, the groove V may be formed in a shape whose width decreases stepwise from the planarization film 33 toward the semiconductor substrate 11. By forming the groove V shown in fig. 8, the notch portion C shown in fig. 4 is formed in a subsequent step.

After the groove V is formed, as illustrated in fig. 6C, an injection film 35 is formed on the planarization film 33 to fill the groove V. The injection film 35 is formed by forming a silicon nitride (SiN) film using, for example, a CVD (chemical vapor deposition) method. At this time, in the groove V whose width decreases from the planarization film 33 toward the semiconductor substrate 11 (figs. 7 and 8), the injection film 35 reaches the bottom of the groove V more easily than in a groove V of constant width. In other words, forming the groove V so that its width becomes smaller from the planarization film 33 toward the semiconductor substrate 11 improves the filling property of the injection film 35.

After the injection film 35 is formed, as shown in fig. 6D, the planarization film 33 and the injection film 35 are planarized. Specifically, the surface of the side on which the injection film 35 is arranged is subjected to CMP (chemical mechanical polishing) or etch-back, so that the front surface of the injection film 35 becomes flush with the front surface of the planarization film 33.

Next, as shown in fig. 6E, the protective member 40 is bonded to the sensor wafer 20W with the planarization film 33 therebetween, using the bonding member 34. As will be described in detail later, the groove V here is filled with the injection film 35. Therefore, the thickness of the bonding member 34 is reduced as compared with the case where the groove V is filled with the bonding member 34.

After the protective member 40 is bonded to the sensor wafer 20W, as shown in fig. 6F, a hole H is formed in the logic wafer 10W. For example, the hole H extends through the semiconductor substrate 11 and reaches the pad electrode 12M of the multilayer wiring layer 12.

After the hole H is formed, as shown in fig. 6G, the redistribution line 51 is formed. The redistribution line 51 is electrically connected to the pad electrode 12M. For example, the redistribution line 51 is formed as follows. First, a resist film is formed on the other main surface of the semiconductor substrate 11, and then an opening is formed in a selective region of the resist film. The opening is formed in the vicinity of the hole H. Next, a copper (Cu) film is formed by an electrolytic plating method using the resist film having the opening as a mask. In this way, the redistribution line 51 can be formed in the selective region near the hole H.

After the redistribution line 51 is formed, as shown in fig. 6H, a protective resin layer 53 is formed so as to cover the redistribution line 51. An opening is formed in the protective resin layer 53 for connecting the solder bump 52 to the redistribution line 51. After the protective resin layer 53 is formed, the solder bump 52 is formed (see fig. 2). For example, the solder bump 52 can be formed by providing a spherical solder material in the opening of the protective resin layer 53 and then subjecting the solder material to heat treatment to form it into a bump shape. Then, dicing is performed along the dicing lines. Thus, the chip regions A are singulated, and the image pickup devices 1 are formed.

In the manufacturing method of the image pickup device 1, the groove V is formed in the dicing line. This relaxes the stress applied to the interfaces between the films of the image pickup device 1 during singulation. Therefore, peeling and cracking of the films can be suppressed, as can the intrusion of moisture into the image pickup device 1 through such peeling and cracking. Further, the groove V is here filled with the injection film 35, which has low water permeability. This makes it possible to suppress the intrusion of moisture into the image pickup device 1 even more effectively.

(Action and effect of the image pickup apparatus 1)

In the imaging device 1 of the present embodiment, the notch C is filled with the injection film 35. This allows the thickness of the bonding member 34 to be reduced as compared with the case where the notch C is filled with the bonding member 34. Hereinafter, this action and effect are illustrated by way of a comparative example.

Fig. 9 shows a schematic cross-sectional structure of a main portion of an image pickup apparatus (image pickup apparatus 100) according to a comparative example. The image pickup apparatus 100 includes a logic chip 10, a sensor chip 20, and a protective member 40. Between the protective member 40 and the sensor chip 20, an insulating film 31, a microlens 32, a planarization film 33, and a bonding member 34 are provided in this order from the side on which the sensor chip 20 is disposed. A notch C is provided from the planarization film 33 to the semiconductor substrate 11 in the periphery of the imaging device 100. In the imaging device 100, the notch C is filled with the bonding member 34. In this respect, the image pickup apparatus 100 differs from the image pickup apparatus 1.

In this imaging apparatus 100, the groove V (see fig. 6B) is filled with the bonding member 34 at the time of manufacturing. This makes it difficult to reduce the thickness of the bonding member 34. In the imaging apparatus 100, the thickness of the bonding member 34 is, for example, more than 50 μm and not more than 200 μm. When the bonding member 34 is thick, the diffusion of light reflected between the sensor chip 20 and the protective member 40 becomes larger. This makes an annular flare spot easily visible.

The relationship between the thickness of the bonding member 34 and the generation of flare spots will be described with reference to figs. 10A and 10B. As shown in figs. 10A and 10B, the reflected light LR is the portion of the light L traveling from the light source toward the sensor chip 20 that is reflected between the sensor chip 20 and the protective member 40. Fig. 10A shows the reflected light LR of the imaging apparatus 100, and fig. 10B shows the reflected light LR of the imaging device 1. The image pickup apparatus 100 includes the bonding member 34 having a thickness t1, and the image pickup apparatus 1 includes the bonding member 34 having a thickness t2. The thickness t1 is greater than the thickness t2 (t1 > t2).

In the imaging devices 1 and 100 having the cavity-less structure, the space between the protective member 40 and the sensor chip 20 is filled with the bonding member 34 having a refractive index equivalent to that of the protective member 40. Therefore, light L reflected from the front surface of the sensor chip 20 enters the protective member 40, and the component incident on its front surface at an angle equal to or greater than the critical angle is totally reflected. This reflected light LR enters the pixel unit 200P (fig. 1). Reducing the distance from the position where the light L directly enters the pixel unit 200P to the position where the reflected light LR enters the pixel unit 200P (the distances d1 and d2 described later) suppresses the flare spot from being recognized. It is to be noted that, in an image pickup apparatus having a cavity structure, such reflected light is unlikely to enter the pixel unit.
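For reference, the critical angle mentioned above follows from Snell's law. This is a minimal sketch assuming the protective member is glass with a refractive index of about 1.51 bounded by air; the function name is not from the source.

```python
import math

# Total-reflection condition: light going from a dense medium (glass, n ~ 1.51)
# toward a rarer one (air) is totally reflected at or beyond the critical angle
# arcsin(n_rare / n_dense), sending it back toward the pixel unit.

def critical_angle_deg(n_dense: float, n_rare: float = 1.0) -> float:
    """Critical angle in degrees for light going from n_dense into n_rare."""
    return math.degrees(math.asin(n_rare / n_dense))

print(f"critical angle: {critical_angle_deg(1.51):.1f} deg")  # about 41.5 deg
```

Any reflected ray steeper than roughly 41.5° from the surface normal is therefore trapped between the sensor chip and the glass front surface.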

In the image pickup apparatus 100, reducing the thickness of the protective member 40 can reduce to some extent the distance d1 from the position where the light L directly enters the pixel unit 200P to the position where the reflected light LR enters the pixel unit 200P. However, since the thickness t1 of the bonding member 34 is large, it is difficult to sufficiently reduce the distance d1 (fig. 10A). In contrast, in the image pickup apparatus 1, the thickness t2 of the bonding member 34 can be easily reduced in addition to the thickness of the protective member 40 (fig. 10B). Therefore, the distance d2 from the position where the light L directly enters the pixel unit 200P to the position where the reflected light LR enters the pixel unit 200P can be sufficiently reduced (d1 > d2). This reduces the visibility of the flare spot.

As described above, in the image pickup device 1 according to the present embodiment, the notch portion C is filled with the injection film 35. Therefore, the thickness of the bonding member 34 (thickness t2) can be reduced as compared with the case where the notch portion C is filled with the bonding member 34. This makes it possible to reduce the diffusion of light reflected between the semiconductor substrate 21 (sensor chip 20) and the protective member 40 (the reflected light LR). Therefore, deterioration in image quality due to flare or the like can be suppressed.

Further, in the imaging device 1, the injection film 35 fills the entire notch portion C in the depth direction. Therefore, the thickness t2 of the bonding member 34 can be reduced more effectively than in the case where the injection film 35 fills only a part of the notch portion C in the depth direction (for example, the imaging device 1A of fig. 11 described later).

Further, in the image pickup device 1, the chip end face is covered with the injection film 35, which has low water permeability. Therefore, the intrusion of moisture from the end face can be suppressed.

Variations of the above-described first embodiment and other embodiments will be described below. In the following description, the same components as those of the above-described embodiment are denoted by the same reference numerals, and their description will be omitted as appropriate.

< modification 1>

Fig. 11 shows a schematic cross-sectional configuration of a main portion of an image pickup apparatus (image pickup apparatus 1A) according to modification 1 of the above-described first embodiment. Here, the injection film 35 fills only a part of the notch C in the depth direction. Otherwise, the image pickup apparatus 1A according to modification 1 has a configuration similar to that of the image pickup apparatus 1 of the above-described first embodiment, and has similar actions and effects.

For example, the notch portion C is provided in the planarization film 33, the insulating film 31, the sensor chip 20, and the logic chip 10, and its bottom surface lies midway in the thickness direction of the semiconductor substrate 11. The cross-sectional shape of the notch portion C is, for example, rectangular (fig. 11), but it may have other cross-sectional shapes (refer to figs. 3 and 4). The height (dimension in the Z-axis direction) of the injection film 35 is smaller than the depth of the notch C, and the front surface of the injection film 35 lies, for example, within the semiconductor substrate 21. That is, in the Z-axis direction, the front surface of the injection film 35 is closer to the bottom surface of the notch C than the front surface of the planarization film 33 is. In the notch portion C, the injection film 35 and the bonding member 34 are stacked in this order from the bottom surface side of the notch portion C.

Such an image pickup apparatus 1A can be manufactured, for example, as follows (fig. 12A and 12B).

First, in a manner similar to that described in the above-described first embodiment, the groove V is formed by digging from the planarization film 33 toward the semiconductor substrate 11 (refer to fig. 6B). For example, the groove V is formed with a rectangular cross-sectional shape. As in the above-described first embodiment, the groove V whose width gradually or stepwise decreases from the insulating film 31 toward the semiconductor substrate 11 may also be formed (figs. 7 and 8).

Next, as shown in fig. 12A, an injection film 35 is formed so as to fill a part of the groove V in the depth direction. The injection film 35 is formed, for example, by applying an organic insulating material such as a resin using a coating method. Examples of the organic insulating material include siloxane and epoxy resin.

After the injection film 35 is formed, as shown in fig. 12B, the protective member 40 is bonded to the sensor wafer 20W using the bonding member 34. Here, a part of the groove V in the depth direction is filled with the injection film 35. Therefore, the thickness of the bonding member 34 is reduced as compared with the case where the bonding member 34 fills the entire depth of the groove V.

After the protective member 40 is bonded to the sensor wafer 20W, the image pickup device 1A can be manufactured in a manner similar to that described above in the first embodiment.

In the imaging device 1A according to the present modification, the injection film 35 fills a part of the notch C in the depth direction. Therefore, the thickness of the bonding member 34 is reduced as compared with the case where the bonding member 34 fills the entire depth of the notch portion C, and image quality degradation due to flare or the like can be suppressed. Further, in the image pickup device 1A, it is sufficient to form the injection film 35 in only a part of the groove V in the depth direction (fig. 12A). This makes the planarization of the injection film 35 and the planarization film 33 (for example, the step of fig. 6D for the image pickup apparatus 1) unnecessary. Therefore, the manufacturing cost of the planarization step can be saved. Further, deterioration of the pixel unit 200P due to the planarization step can be avoided. Generation of noise, for example, can thereby be suppressed, further improving image quality.

< modification 2>

Fig. 13 schematically shows a cross-sectional configuration of a main portion of an image pickup apparatus (image pickup apparatus 1B) according to modification 2 of the above-described first embodiment. Here, the notch portion C is filled with the planarization film 33. Otherwise, the image pickup apparatus 1B according to modification 2 has a configuration similar to that of the image pickup apparatus 1 of the above-described first embodiment, and has similar actions and effects.

The planarization film 33 covers the microlens 32 and fills, for example, the entire notch portion C in the depth direction. The cross-sectional shape of the notch portion C is, for example, rectangular (fig. 13), but it may have other cross-sectional shapes (refer to figs. 3 and 4). The planarization film 33 is provided continuously, for example, from above the microlens 32 to the inside of the notch portion C. That is, the planarization film 33 functions as an injection film in the notch C while also planarizing the light input surface 21S of the semiconductor substrate 21; in other words, the material of the planarization film 33 is the same as that of the injection film. Here, the planarization film 33 corresponds to one specific example of the injection film of the present invention.

The refractive index of the material of the planarization film 33 is preferably lower than that of the material of the microlenses 32. This allows the light entering the microlens 32 to be efficiently collected on the PD 211. For example, in the case where the material of the microlens 32 is a silicon nitride film (refractive index 1.8), siloxane (refractive index 1.4) may be used as the material of the planarization film 33.
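The benefit of a low-index planarization film can be illustrated with the thin-lens (lensmaker's) relation for a plano-convex microlens embedded in a surrounding film: f = R / (n_lens/n_film − 1). The 1.8 and 1.4 index values are the examples from the text; the radius of curvature and the near-matched comparison index below are hypothetical.

```python
# Thin-lens sketch: a plano-convex microlens of index n_lens embedded in a
# planarization film of index n_film focuses with the relative index
# n_lens / n_film. A lower-index film gives a stronger lens (shorter f).

def focal_length(radius_um: float, n_lens: float, n_film: float) -> float:
    """Focal length of a thin plano-convex lens immersed in a film."""
    return radius_um / (n_lens / n_film - 1.0)

R = 1.0  # hypothetical radius of curvature in um
print(focal_length(R, 1.8, 1.4))  # SiN lens in siloxane: strong focusing onto PD 211
print(focal_length(R, 1.8, 1.7))  # near-matched film: weak lens, poor collection
```

As the film index approaches that of the lens, the focal length diverges and the microlens loses its collecting power, which is why the planarization film's index is chosen lower than the microlens's.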

Such an imaging device 1B can be manufactured, for example, as follows (fig. 14A to 14D).

First, as shown in fig. 14A, the logic wafer 10W and the sensor wafer 20W are bonded to form a bonding surface S. The logic wafer 10W includes a semiconductor substrate 11 and a multilayer wiring layer 12. The sensor wafer 20W includes a semiconductor substrate 21 and a multilayer wiring layer (not shown). The PD 211 is formed in the semiconductor substrate 21. Further, on the light input surface 21S of the semiconductor substrate 21, an insulating film 31 and microlenses 32 are formed.

Next, as shown in fig. 14B, grooves V are formed in the dicing lines between adjacent chip regions A. For example, the groove V extends from the front surface of the insulating film 31 through the sensor wafer 20W and the multilayer wiring layer 12, and is dug partway in the thickness direction of the semiconductor substrate 11. The groove V is formed, for example, with a rectangular cross-sectional shape. As in the above-described first embodiment, the groove V whose width gradually or stepwise decreases from the insulating film 31 toward the semiconductor substrate 11 may also be formed (refer to figs. 7 and 8).

After the groove V is formed, as illustrated in fig. 14C, a planarization film 33 is formed on the microlens 32 to fill the groove V. The planarization film 33 is formed by forming a siloxane film using, for example, a CVD method or a coating method.

After the planarization film 33 is formed, as shown in fig. 14D, the protective member 40 is bonded to the sensor wafer 20W with the planarization film 33 therebetween, using the bonding member 34. Here, the groove V is filled with the planarization film 33. Therefore, the thickness of the bonding member 34 is reduced as compared with the case where the groove V is filled with the bonding member 34. Before the protective member 40 is bonded to the sensor wafer 20W, a step of adjusting the thickness of the planarization film 33 by CMP or etch-back may be added.

After the protective member 40 is bonded to the sensor wafer 20W, the image pickup device 1B can be manufactured in a similar manner to that described above in the first embodiment.

In the imaging apparatus 1B according to the present modification, the notch portion C is filled with the planarization film 33. Therefore, the thickness of the bonding member 34 is reduced as compared with the case where the notch portion C is filled with the bonding member 34, and deterioration in image quality due to flare or the like can be suppressed. Further, in the image pickup apparatus 1B, the planarization film 33 both covers the microlens 32 and fills the groove V. This reduces the number of steps compared with performing the step of forming the planarization film 33 and the step of forming the injection film in the groove V (see fig. 6C) separately. Therefore, the manufacturing cost can be reduced.

< second embodiment >

Fig. 15 schematically shows a cross-sectional configuration of a main portion of an image pickup apparatus (image pickup apparatus 2) according to a second embodiment of the present invention. The imaging device 2 includes a hole portion M extending through the planarization film 33, the insulating film 31, and the sensor chip 20 to reach the pad electrode 12M. The hole M is filled with a conductive injection film (injection film 15). That is, the hole M is provided in place of the notch C (fig. 1) of the above-described first embodiment. Otherwise, the image pickup apparatus 2 according to the second embodiment has a configuration similar to that of the image pickup apparatus 1 of the above-described first embodiment, and has similar actions and effects.

Fig. 16 schematically shows an example of the planar configuration of the hole portions M together with the planarization film 33. The cross-sectional configuration along line XXV-XXV' shown in fig. 16 corresponds to that of fig. 15. The imaging device 2 has a plurality of hole portions M outside the pixel unit 200P, arranged to be spaced apart from each other. Each of the plurality of hole portions M has, for example, a rectangular planar shape, and the plurality of hole portions M are arranged, for example, to surround the pixel unit 200P in a plan view. Each hole portion M may also have a planar shape other than a rectangle, for example, a circle.

Although described in detail later, the hole portion M and the injection film 15 are provided for performing inspection using a probe needle in the wafer state, for example, during the manufacturing process of the image pickup device 2. For example, the hole portion M is provided in the planarization film 33, the insulating film 31, the sensor chip 20, and the multilayer wiring layer 12 (logic chip 10), and is formed by digging from the planarization film 33 to the pad electrode 12M of the multilayer wiring layer 12 (hole M in fig. 18A described later). The pad electrode 12M is exposed at the bottom surface of the hole M. The hole portion M has, for example, a rectangular cross-sectional shape, but may have other cross-sectional shapes; for example, the width of the hole portion M may decrease gradually or stepwise from the planarization film 33 toward the multilayer wiring layer 12 (refer to figs. 3 and 4). The hole portion M is arranged, for example, at a position opposite to the hole H.

For example, the injection film 15 fills the entire hole portion M in the depth direction. The front surface of the planarization film 33 (the surface of the side on which the bonding member 34 is arranged) and the front surface of the injection film 15 are substantially flush with each other. The injection film 15 includes, for example, a conductive metal material; examples include, but are not limited to, aluminum (Al), copper (Cu), and nickel (Ni). The injection film 15 is electrically connected to the pad electrode 12M. For example, a wiring connected to the pad electrode 12M may be provided, and the injection film 15 may be connected to that wiring. In this case, the hole portion M may be arranged at a position offset from the position opposite the hole H.

Fig. 17 shows another example of the cross-sectional configuration of the main portion of the image pickup apparatus 2. As shown in the figure, the injection film 15 may fill only a part of the hole portion M in the depth direction. In this case, the height of the injection film 15 is smaller than the depth of the hole portion M, and the front surface of the injection film 15 lies, for example, within the semiconductor substrate 21. That is, in the Z-axis direction, the front surface of the injection film 15 is closer to the bottom surface of the hole portion M (the pad electrode 12M) than the front surface of the planarization film 33 is. In the hole portion M, the injection film 15 and the bonding member 34 are stacked in this order from the bottom surface side.

Such an image pickup apparatus 2 can be manufactured, for example, as follows (fig. 18A and 18B).

First, in a similar manner to that described in the first embodiment above, the logic wafer 10W and the sensor wafer 20W are bonded to form a bonding surface S. The logic wafer 10W includes a semiconductor substrate 11 and a multilayer wiring layer 12. The sensor wafer 20W includes a semiconductor substrate 21 and a multilayer wiring layer (not shown). The PD211 is formed on the semiconductor substrate 21. Further, on the light input surface 21S of the semiconductor substrate 21, an insulating film 31 and microlenses 32 are formed (fig. 6A).

Next, as illustrated in fig. 18A, a plurality of hole portions M extending from the planarization film 33 to the pad electrode 12M are formed. Next, as shown in fig. 18B, the injection film 15 is formed so as to selectively fill the hole portions M. The injection film 15 is formed, for example, by depositing a metal material using a plating method. Thereby, the injection film 15 electrically connected to the pad electrode 12M is formed. For example, the injection film 15 is formed to fill the entire hole M in the depth direction; alternatively, it may be formed to fill only a part of the hole portion M in the depth direction.

After the injection film 15 is formed, a probe needle is applied, for example, to the front surface of the injection film 15 to perform inspection in the wafer state. This enables, for example, detection of defects.

After the injection film 15 is formed, the protective member 40 is bonded to the sensor wafer 20W using the bonding member 34 (see fig. 6E). Here, the hole portion M is filled with the injection film 15. Therefore, the thickness of the bonding member 34 is reduced as compared with the case where the hole portion M is filled with the bonding member 34.

After the protective member 40 is bonded to the sensor wafer 20W, the image pickup device 2 can be manufactured in a manner similar to that described in the above-described first embodiment.

In the image pickup apparatus 2 according to the present embodiment, the hole portion M is filled with the injection film 15. Therefore, the thickness of the bonding member 34 is reduced as compared with the case where the hole portion M is filled with the bonding member 34, and deterioration in image quality due to flare or the like can be suppressed. In the imaging device 2, the hole portion M is filled with the injection film 15 containing a metal material. This makes it easier to maintain strength when forming the hole H at a position opposite to the hole portion M. Further, in the case of performing inspection in the wafer state, a probe needle is applied to the front surface of the injection film 15. The thick injection film 15 alleviates the impact of the needle contact, so that damage to each portion caused by the needle contact can be suppressed.

< modification 3>

Fig. 19 schematically shows a cross-sectional configuration of a main portion of an image pickup apparatus (image pickup apparatus 2A) according to modification 3 of the above-described second embodiment. The imaging device 2A includes a hole portion M and a notch portion C located outside the pixel unit 200P. The notch C is filled with an injection film 35. That is, the imaging device 2A includes both the hole portion M filled with the injection film 15 and the notch portion C filled with the injection film 35. Otherwise, the image pickup apparatus 2A according to modification 3 has a configuration similar to that of the image pickup apparatus 2 of the above-described second embodiment, and has similar actions and effects.

For example, in a manner similar to that described in the above-described first embodiment, the notch portion C is formed by digging from the planarization film 33 to midway in the thickness direction of the semiconductor substrate 11 (groove V in fig. 20 described later). The notch C is provided on the periphery of the imaging device 2A. The cross-sectional shape of the notch portion C is, for example, rectangular (fig. 19), but it may have other cross-sectional shapes (refer to figs. 3 and 4). As in the above-described first embodiment, the injection film 35 filling the notch portion C includes, for example, an insulating material having low water permeability.

Such an imaging device 2A can be manufactured, for example, as follows (fig. 20).

First, in a manner similar to that described in the above-described second embodiment, the structure up to and including the injection film 15 is formed (fig. 18B). Next, as shown in fig. 20, grooves V are formed in the dicing lines between adjacent chip regions A. For example, the groove V extends from the front surface of the planarization film 33 through the insulating film 31, the sensor wafer 20W, and the multilayer wiring layer 12, and is dug partway in the thickness direction of the semiconductor substrate 11. After the groove V is formed, an injection film 35 is formed (see fig. 6C).

After the implantation film 35 is formed, the image pickup device 2A can be manufactured in a manner similar to that described in the above-described first embodiment.

In the imaging device 2A according to the present modification, the hole M is filled with the injection film 15 and the notch C is filled with the injection film 35. Therefore, the thickness of the bonding member 34 is reduced as compared with the case where the hole portion M and the notch portion C are filled with the bonding member 34.

< application example >

The present technology is not limited to application to an image pickup device, and is applicable to electronic apparatuses in general that use an image pickup device as an image pickup unit (photoelectric conversion unit). Examples include image pickup apparatuses such as digital still cameras and video cameras, mobile terminal devices such as mobile phones having an image pickup function, and copying machines using an image pickup device as an image reading unit. It should be noted that an image pickup device sometimes takes the form of a camera module, i.e., a module mounted on an electronic device.

Fig. 21 is a block diagram showing a configuration example of an electronic apparatus 2000 as an example of the electronic apparatus of the present invention. The electronic device 2000 is, for example, a camera module for a mobile device such as a digital camera, a video camera, and a mobile phone. As shown in fig. 21, an electronic apparatus 2000 of the present invention includes, for example, an optical unit including a lens group 2001 or the like, an image pickup device 1, 1A, 1B, 2, or 2A (hereinafter simply referred to as an image pickup device 1), a DSP circuit 2003 which is a camera signal processor, a frame memory 2004, a display unit 2005, a storage unit 2006, an operation unit 2007, and a power supply unit 2008.

The DSP circuit 2003, the frame memory 2004, the display unit 2005, the storage unit 2006, the operation unit 2007, and the power supply unit 2008 are connected to one another via a bus line 2009.

The lens group 2001 receives incident light (image light) from a subject and forms an image on the imaging plane of the image pickup apparatus 1. The image pickup apparatus 1 converts, for each pixel, the amount of the incident light focused by the lens group 2001 onto the imaging plane into an electric signal, and outputs the electric signal as a pixel signal.

The display unit 2005 includes a panel display unit such as a liquid crystal display unit or an organic EL (electro luminescence) display unit, for example, and displays a moving image or a still image captured by the image pickup device 1. The storage unit 2006 records a moving image or a still image captured by the image pickup device 1 on a recording medium such as a DVD (digital versatile disc).

The operation unit 2007 issues operation instructions for various functions of the image pickup apparatus in accordance with operation by a user. The power supply unit 2008 supplies, as appropriate, various types of power serving as operation power for the DSP circuit 2003, the frame memory 2004, the display unit 2005, the storage unit 2006, and the operation unit 2007.

< practical application example of in-vivo information acquisition system >

Furthermore, the technique according to the present invention is applicable to various products. For example, the technique according to the present invention can be applied to an in-vivo information acquisition system for a patient that uses a capsule endoscope.

Fig. 22 is a block diagram showing a schematic configuration of an in-vivo information acquisition system using a capsule endoscope to which the technique according to an embodiment of the present invention (present technique) can be applied.

The in-vivo information acquisition system 10001 includes a capsule endoscope 10100 and an external control device 10200.

At the time of examination, the capsule endoscope 10100 is swallowed by the patient. The capsule endoscope 10100 has an image pickup function and a wireless communication function. Until it is naturally excreted from the patient, the capsule endoscope 10100 moves through the inside of organs such as the stomach and the intestines by peristaltic motion, while successively capturing images of the inside of the organs (hereinafter referred to as in-vivo images) at predetermined time intervals. Then, the capsule endoscope 10100 sequentially transmits the information of the in-vivo images by radio to the external control device 10200 outside the body.

The external control device 10200 integrally controls the operation of the in-vivo information acquisition system 10001. In addition, the external control device 10200 receives information of the in-vivo image transferred thereto from the capsule endoscope 10100, and generates image data for displaying the in-vivo image on a display device (not shown) based on the received information of the in-vivo image.

In this manner, in the in-vivo information acquisition system 10001, in-vivo images depicting the state of the inside of the patient's body can be acquired at any time during the period after the capsule endoscope 10100 is swallowed until it is excreted.

The configuration and function of the capsule endoscope 10100 and the external control device 10200 will be described in detail below.

The capsule endoscope 10100 includes a capsule-shaped casing 10101, which accommodates therein a light source unit 10111, an image pickup unit 10112, an image processing unit 10113, a wireless communication unit 10114, a power feeding unit 10115, a power supply unit 10116, and a control unit 10117.

The light source unit 10111 includes, for example, a light source such as a Light Emitting Diode (LED), and irradiates light on the imaging field of view of the imaging unit 10112.

The image pickup unit 10112 includes an image pickup element and an optical system including a plurality of lenses provided in front of the image pickup element. Reflected light (hereinafter referred to as observation light) of light irradiated on the body tissue as an observation target is condensed by the optical system and introduced into the image pickup element. In the image pickup unit 10112, the incident observation light is photoelectrically converted by the image pickup element, and an image signal corresponding to the observation light is generated. The image signal generated by the image pickup unit 10112 is supplied to the image processing unit 10113.

The image processing unit 10113 includes a processor such as a Central Processing Unit (CPU) or a Graphics Processing Unit (GPU), and performs various types of signal processing on the image signal generated by the image pickup unit 10112. The image processing unit 10113 supplies the image signal subjected to the signal processing to the wireless communication unit 10114 as RAW data.

The wireless communication unit 10114 performs predetermined processing such as modulation processing on the image signal that has been subjected to the signal processing by the image processing unit 10113, and transmits the processed image signal to the external control device 10200 through the antenna 10114A. In addition, the wireless communication unit 10114 receives a control signal related to drive control of the capsule endoscope 10100 from the external control device 10200 via the antenna 10114A. The wireless communication unit 10114 supplies the control signal received from the external control device 10200 to the control unit 10117.

The power feeding unit 10115 includes an antenna coil for power reception, a power regeneration circuit that regenerates electric power from a current generated in the antenna coil, a booster circuit, and the like. The power feeding unit 10115 generates electric power using the principle of so-called non-contact charging.

The power supply unit 10116 includes a storage battery, and stores the electric power generated by the power feeding unit 10115. In fig. 22, in order to avoid complicated illustration, arrows and the like indicating the supply destinations of the power from the power supply unit 10116 are omitted. However, the power stored in the power supply unit 10116 is supplied to the light source unit 10111, the image pickup unit 10112, the image processing unit 10113, the wireless communication unit 10114, and the control unit 10117, and can be used to drive them.

The control unit 10117 includes a processor such as a CPU, and appropriately controls driving of the light source unit 10111, the image pickup unit 10112, the image processing unit 10113, the wireless communication unit 10114, and the power feeding unit 10115 in accordance with a control signal transmitted from the external control device 10200.

The external control device 10200 includes a processor such as a CPU or a GPU, or a microcomputer, a control board on which a processor and a storage element such as a memory are mounted together, or the like. The external control device 10200 transmits a control signal to the control unit 10117 of the capsule endoscope 10100 through the antenna 10200A to control the operation of the capsule endoscope 10100. In the capsule endoscope 10100, for example, the conditions of light irradiation on the observation target in the light source unit 10111 can be changed in accordance with a control signal from the external control device 10200. In addition, the image capturing conditions (for example, the frame rate, the exposure value, and the like of the image pickup unit 10112) can be changed in accordance with a control signal from the external control device 10200. Further, the content of the processing performed by the image processing unit 10113 or the conditions under which the wireless communication unit 10114 transmits a signal (e.g., the transmission interval, the number of transmitted images, etc.) may be changed in accordance with a control signal from the external control device 10200.

In addition, the external control device 10200 performs various image processes on the image signal transmitted thereto from the capsule endoscope 10100 to generate image data for displaying the captured in-vivo image on the display device. As the image processing, for example, various signal processing such as development processing (demosaicing processing), image quality improvement processing (bandwidth enhancement processing, super-resolution processing, Noise Reduction (NR) processing, and/or image stabilization processing), and/or enlargement processing (electronic zoom processing) may be performed. The external control device 10200 controls the driving of the display device to cause the display device to display the captured in-vivo image based on the generated image data. Alternatively, the external control device 10200 may also control a recording device (not shown) to record the generated image data or control a printing device (not shown) to output the generated image data by printing.
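The development processing (demosaicing processing) mentioned above reconstructs a full-color image from a color-mosaiced sensor output. The following is a minimal sketch, assuming a hypothetical RGGB Bayer layout and nearest-neighbor interpolation; actual development processing uses far more elaborate interpolation, and the function name and data layout are illustrative assumptions.

```python
# Minimal nearest-neighbor demosaicing sketch for an assumed RGGB Bayer mosaic.
# raw: flat list of sensor values, row-major; returns one (r, g, b) per pixel.

def demosaic_rggb(raw, width, height):
    def at(x, y):
        return raw[y * width + x]

    out = []
    for y in range(height):
        for x in range(width):
            # Snap to the top-left corner of the 2x2 Bayer cell.
            cx, cy = x - (x % 2), y - (y % 2)
            r = at(cx, cy)          # R sits at (0, 0) of the cell
            g = at(cx + 1, cy)      # one of the two G samples
            b = at(cx + 1, cy + 1)  # B sits at (1, 1) of the cell
            out.append((r, g, b))
    return out
```

For a single 2x2 cell with values `[10, 20, 30, 40]`, every output pixel becomes `(10, 20, 40)`.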

In the above, an example of an in-vivo information acquisition system to which the technique according to the present invention can be applied has been described. The technique according to the present invention is applicable to, for example, the image pickup unit 10112 among the components described above. This makes it possible to improve the accuracy of detection.

< practical application example of endoscopic surgery System >

The technique according to the present invention (present technique) is applicable to various products. For example, the technique according to the present invention may be applied to an endoscopic surgery system.

Fig. 23 is a view showing an example of a schematic configuration of an endoscopic surgery system to which the technique according to the embodiment of the present invention (present technique) can be applied.

In fig. 23, a state is shown in which a surgeon (doctor) 11131 is performing an operation on a patient 11132 on a patient bed 11133 using an endoscopic surgery system 11000. As shown, the endoscopic surgery system 11000 includes an endoscope 11100, other surgical tools 11110 such as a pneumoperitoneum tube 11111 and an energy device 11112, a support arm device 11120 on which the endoscope 11100 is supported, and a cart 11200 on which various devices for endoscopic surgery are loaded.

The endoscope 11100 includes a lens barrel 11101 having a region of a predetermined length from its distal end for insertion into a body cavity of the patient 11132, and a camera 11102 connected to the proximal end of the lens barrel 11101. In the illustrated example, the endoscope 11100 is depicted as a so-called rigid endoscope having a rigid lens barrel 11101. However, the endoscope 11100 may otherwise be a so-called flexible endoscope having a flexible lens barrel 11101.

The lens barrel 11101 has, at its distal end, an opening in which an objective lens is fitted. A light source device 11203 is connected to the endoscope 11100, such that light generated by the light source device 11203 is guided to the distal end of the lens barrel by a light guide extending inside the lens barrel 11101 and is irradiated toward an observation target in the body cavity of the patient 11132 through the objective lens. It is to be noted that the endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.

An optical system and an image pickup element are provided inside the camera 11102, such that reflected light (observation light) from the observation target is condensed on the image pickup element by the optical system. The image pickup element photoelectrically converts the observation light to generate an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image. The image signal is transmitted as RAW data to a Camera Control Unit (CCU) 11201.

The CCU11201 includes a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), or the like, and centrally controls the operations of the endoscope 11100 and the display device 11202. In addition, the CCU11201 receives an image signal from the camera 11102, and performs various image processing for displaying an image based on the image signal, such as development processing (demosaicing processing), on the image signal.

The display device 11202 displays an image based on the image signal on which the image processing has been performed by the CCU11201, under the control of the CCU 11201.

The light source device 11203 includes a light source such as, for example, a Light Emitting Diode (LED), and supplies irradiation light to the endoscope 11100 when imaging a surgical field or the like.

The input device 11204 is an input interface for the endoscopic surgery system 11000. Through the input device 11204, the user can input various types of information and instructions to the endoscopic surgery system 11000. For example, the user inputs an instruction to change the image capturing conditions of the endoscope 11100 (the type of irradiation light, the magnification, the focal distance, or the like).

The treatment tool control device 11205 controls driving of the energy device 11112 for cauterization or incision of tissue, sealing of blood vessels, or the like. The pneumoperitoneum device 11206 feeds gas into the body cavity of the patient 11132 through the pneumoperitoneum tube 11111 to inflate the body cavity, in order to secure the field of view of the endoscope 11100 and secure a working space for the surgeon. The recorder 11207 is a device capable of recording various types of information relating to the surgery. The printer 11208 is a device capable of printing various types of information relating to the surgery in various forms such as text, images, or graphs.

It is to be noted that the light source device 11203, which provides irradiation light to the endoscope 11100 when the surgical field is to be imaged, may include a white light source constituted, for example, of an LED, a laser light source, or a combination thereof. In the case where the white light source includes a combination of red, green, and blue (RGB) laser light sources, the output intensity and the output timing of each color (each wavelength) can be controlled with high accuracy, so that white balance adjustment of the captured image can be performed by the light source device 11203. In this case, if the laser beams from the respective RGB laser light sources are irradiated onto the observation target in a time-division manner and driving of the image pickup element of the camera 11102 is controlled in synchronization with the irradiation timing, images corresponding to R, G, and B can be captured in a time-division manner. According to this method, a color image can be obtained even if the image pickup element is not provided with a color filter.
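The time-division capture described above can be sketched as follows: three monochrome frames, each captured under R, G, or B illumination, are combined channel-wise into one color image. The frame representation (flat lists of per-pixel intensities) and the function name are assumptions for illustration.

```python
# Sketch of composing a color image from three frames captured in a
# time-division manner under R, G, and B illumination (no color filter
# on the image pickup element).

def compose_rgb(frame_r, frame_g, frame_b):
    """Combine three monochrome frames into per-pixel (r, g, b) tuples."""
    assert len(frame_r) == len(frame_g) == len(frame_b)
    return list(zip(frame_r, frame_g, frame_b))
```

For example, `compose_rgb([1, 2], [3, 4], [5, 6])` yields `[(1, 3, 5), (2, 4, 6)]`.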

In addition, the light source device 11203 may be controlled such that the intensity of the light to be output is changed every predetermined time. By controlling driving of the image pickup element of the camera 11102 in synchronization with the timing of the change in light intensity so as to acquire images in a time-division manner and synthesizing the images, an image of a high dynamic range free from blocked-up shadows and blown-out highlights can be generated.
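The synthesis described above can be sketched as a simple two-frame fusion: where the brightly lit frame is clipped, the value is recovered from the dimly lit frame scaled by the known intensity ratio. The 8-bit saturation level and the function name are assumptions for illustration.

```python
# Sketch of high-dynamic-range synthesis from two frames acquired while the
# light intensity is switched every predetermined time.

SATURATION = 255  # sensor full scale (assumed 8-bit)

def fuse_hdr(bright_frame, dark_frame, intensity_ratio):
    """Use the bright frame where it is not clipped; otherwise recover the
    value from the dark frame scaled by the known intensity ratio."""
    fused = []
    for hi, lo in zip(bright_frame, dark_frame):
        if hi < SATURATION:
            fused.append(float(hi))
        else:
            fused.append(lo * intensity_ratio)  # recover blown highlights
    return fused
```

For example, with an intensity ratio of 4, `fuse_hdr([100, 255], [10, 40], 4.0)` yields `[100.0, 160.0]`.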

In addition, the light source device 11203 may be configured to provide light of a predetermined wavelength band suitable for special light observation. In the special light observation, for example, so-called narrow-band light observation is performed, in which predetermined tissue such as blood vessels in a mucosal surface layer is imaged with high contrast by utilizing the wavelength dependence of light absorption in body tissue and irradiating light of a narrower band than the irradiation light (namely, white light) used in ordinary observation. Alternatively, in the special light observation, fluorescence observation may be performed in which an image is obtained from fluorescence generated by irradiation of excitation light. In the fluorescence observation, a fluorescence image can be obtained, for example, by irradiating excitation light onto body tissue and observing fluorescence from the body tissue (autofluorescence observation), or by locally injecting a reagent such as indocyanine green (ICG) into body tissue and irradiating the body tissue with excitation light corresponding to the fluorescence wavelength of the reagent. The light source device 11203 may be configured to provide narrow-band light and/or excitation light suitable for such special light observation.

Fig. 24 is a block diagram showing an example of the functional configuration of the camera 11102 and the CCU11201 illustrated in fig. 23.

The camera 11102 includes a lens unit 11401, an image pickup unit 11402, a drive unit 11403, a communication unit 11404, and a camera control unit 11405. The CCU11201 includes a communication unit 11411, an image processing unit 11412, and a control unit 11413. The camera 11102 and the CCU11201 are connected to each other by a transmission cable 11400 to communicate.

The lens unit 11401 is an optical system provided at the connection with the lens barrel 11101. Observation light taken in from the distal end of the lens barrel 11101 is guided to the camera 11102 and introduced into the lens unit 11401. The lens unit 11401 includes a combination of a plurality of lenses including a zoom lens and a focus lens.

The number of image pickup elements included in the image pickup unit 11402 may be one (so-called single-plate type) or plural (so-called multi-plate type). For example, in the case where the image pickup unit 11402 is configured as a multi-plate type, image signals corresponding to R, G, and B are generated by the respective image pickup elements, and these image signals may be synthesized to obtain a color image. The image pickup unit 11402 may also be configured to have a pair of image pickup elements for respectively acquiring an image signal for the right eye and an image signal for the left eye, for use in three-dimensional (3D) display. If 3D display is performed, the surgeon 11131 can grasp the depth of the living tissue in the surgical region more accurately. It should be noted that in the case where the image pickup unit 11402 is configured as a multi-plate type, a plurality of systems of lens units 11401 are provided corresponding to the respective image pickup elements.

Further, the image pickup unit 11402 is not necessarily provided on the camera 11102. For example, the image pickup unit 11402 may be provided immediately behind the objective lens inside the lens barrel 11101.

The driving unit 11403 includes an actuator, and moves the zoom lens and the focus lens of the lens unit 11401 by a predetermined distance along the optical axis under the control of the camera control unit 11405. Therefore, the magnification and focus of the image captured by the image capturing unit 11402 can be appropriately adjusted.

The communication unit 11404 includes a communication device for transmitting and receiving various types of information to and from the CCU11201. The communication unit 11404 transmits the image signal acquired from the image pickup unit 11402 to the CCU11201 as RAW data via the transmission cable 11400.

In addition, the communication unit 11404 receives a control signal for controlling driving of the camera 11102 from the CCU11201, and supplies the control signal to the camera control unit 11405. The control signal includes information related to image capturing conditions, such as information specifying the frame rate of the captured image, information specifying the exposure value at the time of image capturing, and/or information specifying the magnification and focus of the captured image.

It should be noted that image capturing conditions such as a frame rate, an exposure value, a magnification, a focus, or the like may be designated by a user or may be automatically set by the control unit 11413 of the CCU11201 based on the obtained image signal. In the latter case, the endoscope 11100 has an Auto Exposure (AE) function, an Auto Focus (AF) function, and an Auto White Balance (AWB) function.
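As a rough illustration of how such an automatic setting could be derived from the obtained image signal, the following sketches one auto-exposure (AE) step that nudges the exposure value toward a target mean brightness. The target level, the clamp, and the function name are assumed values for illustration, not taken from the system described above.

```python
# Sketch of one auto-exposure (AE) correction step computed from an
# obtained image signal (flat list of pixel brightness values).

TARGET_LEVEL = 128.0  # desired mean brightness (assumed, 8-bit scale)

def next_exposure(pixels, current_exposure, max_step=2.0):
    """Scale the exposure value so that mean brightness approaches the
    target, clamping the per-frame correction to avoid oscillation."""
    mean = sum(pixels) / len(pixels)
    if mean == 0:
        return current_exposure * max_step  # scene far too dark
    correction = TARGET_LEVEL / mean
    correction = max(1.0 / max_step, min(max_step, correction))
    return current_exposure * correction
```

A frame averaging half the target level doubles the exposure value; a frame already on target leaves it unchanged.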

The camera control unit 11405 controls driving of the camera 11102 based on a control signal from the CCU11201 received through the communication unit 11404.

The communication unit 11411 includes a communication device for transmitting and receiving various types of information to and from the camera 11102. The communication unit 11411 receives an image signal transmitted thereto from the camera 11102 through the transmission cable 11400.

Further, the communication unit 11411 transmits a control signal for controlling driving of the camera 11102 to the camera 11102. The image signal and the control signal may be transmitted by electrical communication or optical communication or the like.

The image processing unit 11412 performs various image processes on the image signal in the form of RAW data transmitted thereto from the camera 11102.

The control unit 11413 performs various types of control related to image capturing of an operation area or the like by the endoscope 11100 and display of a captured image obtained by image capturing of the operation area or the like. For example, the control unit 11413 creates a control signal for controlling driving of the camera 11102.

Further, the control unit 11413 controls the display device 11202 to display a captured image depicting the surgical region or the like, based on the image signal subjected to the image processing by the image processing unit 11412. At this time, the control unit 11413 may recognize various objects in the captured image using various image recognition techniques. For example, by detecting the shapes of edges, colors, and the like of objects included in the captured image, the control unit 11413 can recognize a surgical tool such as forceps, a specific living body region, bleeding, mist produced when the energy device 11112 is used, and the like. When controlling the display device 11202 to display the captured image, the control unit 11413 may use the recognition result to display various types of surgery support information superimposed on the image of the surgical region. When the surgery support information is displayed in a superimposed manner and presented to the surgeon 11131, the burden on the surgeon 11131 can be reduced, and the surgeon 11131 can proceed with the surgery with certainty.

The transmission cable 11400 connecting the camera 11102 and the CCU11201 to each other is an electrical signal cable capable of electrical signal communication, an optical fiber capable of optical communication, or a composite cable capable of electrical communication and optical communication.

Here, although in the illustrated example, the communication is performed by wired communication using the transmission cable 11400, the communication between the camera 11102 and the CCU11201 may also be performed by wireless communication.

In the above, an example of an endoscopic surgery system to which the technique according to the present invention can be applied has been described. The technique according to the present invention is applicable to, for example, the image pickup unit 11402 among the components described above. Applying the technique according to the present invention to the image pickup unit 11402 makes it possible to improve the accuracy of detection.

It is to be noted that although an endoscopic surgery system has been described here as an example, the technique according to the present invention may also be applied to other systems, such as a microscopic surgery system.

< practical application example of moving body >

The technique according to the present invention can be applied to various products. For example, the technology according to the present invention may be implemented as an apparatus mounted on any type of moving body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).

Fig. 25 is a block diagram showing an example of a schematic configuration of a vehicle control system as an example of a mobile body control system to which the technique according to the embodiment of the invention can be applied.

The vehicle control system 12000 includes a plurality of electronic control units connected to each other through a communication network 12001. In the example illustrated in fig. 25, the vehicle control system 12000 includes a drive system control unit 12010, a vehicle body system control unit 12020, a vehicle exterior information detection unit 12030, a vehicle interior information detection unit 12040, and an integrated control unit 12050. Further, a microcomputer 12051, a sound image output section 12052, and an in-vehicle network interface (I/F) 12053 are shown as functional configurations of the integrated control unit 12050.

The drive system control unit 12010 controls the operations of devices related to the drive system of the vehicle according to various types of programs. For example, the drive system control unit 12010 functions as a control device of: a driving force generating device such as an internal combustion engine, a drive motor, or the like for generating a driving force of the vehicle, a driving force transmitting mechanism that transmits the driving force to wheels, a steering mechanism that adjusts a steering angle of the vehicle, a braking device that generates a braking force of the vehicle, or the like.

The vehicle body system control unit 12020 controls the operations of various types of devices provided on the vehicle body according to various types of programs. For example, the vehicle body system control unit 12020 functions as a control device of a keyless entry system, a smart key system, a power window device, or various lamps such as a headlamp, a backup lamp, a brake lamp, a turn lamp, a fog lamp, and the like. In this case, a radio wave transmitted from a mobile device that replaces a key or a signal of various switches may be input to the vehicle body system control unit 12020. The vehicle body system control unit 12020 receives input of these radio waves or signals, and controls a door lock device, a power window device, a lamp, or the like of the vehicle.

The vehicle exterior information detection unit 12030 detects information about the outside of the vehicle on which the vehicle control system 12000 is mounted. For example, the image pickup section 12031 is connected to the vehicle exterior information detection unit 12030. The vehicle exterior information detection unit 12030 causes the image pickup section 12031 to capture an image of the outside of the vehicle, and receives the captured image. Based on the received image, the vehicle exterior information detection unit 12030 may perform processing of detecting an object such as a person, a vehicle, an obstacle, a sign, or a character on a road surface, or processing of detecting the distance to such an object.

The image pickup section 12031 is an optical sensor that receives light and outputs an electric signal corresponding to the amount of received light. The image pickup section 12031 may output the electric signal as an image, or may output the electric signal as information on the measured distance. Further, the light received by the image pickup portion 12031 may be visible light, or may be invisible light such as infrared light.

The in-vehicle information detection unit 12040 detects information about the interior of the vehicle. The in-vehicle information detection unit 12040 is connected to a driver state detection unit 12041 that detects the state of the driver, for example. The driver state detection unit 12041 includes, for example, a camera that photographs the driver. Based on the detection information input from the driver state detection section 12041, the in-vehicle information detection unit 12040 may calculate the degree of fatigue of the driver or the degree of concentration of the driver, or may determine whether the driver is dozing.

The microcomputer 12051 can calculate a control target value of the driving force generation device, the steering mechanism, or the brake device based on information about the interior or exterior of the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and output a control command to the driving system control unit 12010. For example, the microcomputer 12051 may execute cooperative control intended to realize functions of an Advanced Driver Assistance System (ADAS) including collision avoidance or impact mitigation for the vehicle, follow-up driving based on a following distance, vehicle speed hold driving, vehicle collision warning, vehicle lane departure warning, and the like.

Further, by controlling the driving force generation device, the steering mechanism, the brake device, and the like based on the information on the outside or inside of the vehicle acquired by the outside-vehicle information detection unit 12030 or the inside-vehicle information detection unit 12040, the microcomputer 12051 can perform cooperative control aimed at realizing automated driving and the like that enables the vehicle to travel autonomously without depending on the operation of the driver.

Further, based on the information on the outside of the vehicle acquired by the vehicle exterior information detection unit 12030, the microcomputer 12051 may output a control command to the vehicle body system control unit 12020. For example, the microcomputer 12051 may perform cooperative control intended to prevent glare by controlling the headlamps to change from high beam to low beam, for example, according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detecting unit 12030.

The audio/video output unit 12052 transmits an output signal of at least one of audio and video to an output device that can visually or audibly notify a passenger of the vehicle or the outside of the vehicle. In the example of fig. 25, an audio speaker 12061, a display portion 12062, and a dashboard 12063 are illustrated as output devices. The display portion 12062 may include, for example, at least one of an in-vehicle display and a head-up display.

Fig. 26 is a schematic diagram illustrating an example of the mounting position of the imaging section 12031.

In fig. 26, the image pickup section 12031 includes image pickup sections 12101, 12102, 12103, 12104, and 12105.

The image pickup portion 12101, the image pickup portion 12102, the image pickup portion 12103, the image pickup portion 12104, and the image pickup portion 12105 are provided at positions on, for example, a front nose, side mirrors, a rear bumper, and a rear door of the vehicle 12100, and a position of an upper portion of a windshield in the vehicle. The imaging unit 12101 provided at the nose and the imaging unit 12105 provided at the upper portion of the windshield in the vehicle mainly acquire images in front of the vehicle 12100. The image pickup portions 12102 and 12103 provided on the side mirrors mainly acquire images of both sides of the vehicle 12100. An image pickup unit 12104 provided on a rear bumper or a rear door mainly acquires an image of the rear of the vehicle 12100. The imaging unit 12105 provided on the upper portion of the windshield in the vehicle is mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, a signal, a traffic sign, a lane, and the like.

Incidentally, fig. 26 shows an example of the image capturing ranges of the image pickup sections 12101 to 12104. The image capturing range 12111 represents the image capturing range of the image pickup section 12101 provided at the front nose. The image capturing ranges 12112 and 12113 represent the image capturing ranges of the image pickup sections 12102 and 12103 provided on the side mirrors, respectively. The image capturing range 12114 represents the image capturing range of the image pickup section 12104 provided on the rear bumper or the rear door. For example, a bird's-eye view image of the vehicle 12100 as viewed from above is obtained by superimposing the image data captured by the image pickup sections 12101 to 12104.
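The superimposition described above can be sketched as follows, assuming each camera's pixels have already been projected to common ground-plane coordinates (the projection step itself is omitted here); where image capturing ranges overlap, the contributions are averaged. The canvas representation and function name are assumptions for illustration.

```python
# Sketch of superimposing ground-plane samples from several image pickup
# sections into one bird's-eye canvas. camera_samples: one list per camera
# of (x, y, value) points already expressed in canvas coordinates.

def birds_eye(canvas_w, canvas_h, camera_samples):
    acc = [[0.0] * canvas_w for _ in range(canvas_h)]
    cnt = [[0] * canvas_w for _ in range(canvas_h)]
    for samples in camera_samples:
        for x, y, v in samples:
            if 0 <= x < canvas_w and 0 <= y < canvas_h:
                acc[y][x] += v   # accumulate overlapping contributions
                cnt[y][x] += 1
    # Average where cameras overlap; leave uncovered cells at 0.
    return [[acc[y][x] / cnt[y][x] if cnt[y][x] else 0.0
             for x in range(canvas_w)] for y in range(canvas_h)]
```

Two cameras both observing the same ground cell with values 10 and 30 yield an averaged cell value of 20.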

At least one of the imaging sections 12101 to 12104 may have a function of acquiring distance information. For example, at least one of the imaging sections 12101 to 12104 may be a stereo camera including a plurality of image pickup elements, or may be an image pickup element having pixels for phase difference detection.

For example, on the basis of the distance information acquired from the imaging sections 12101 to 12104, the microcomputer 12051 can determine the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of the distance (relative speed with respect to the vehicle 12100), and thereby extract, as a preceding vehicle, the closest three-dimensional object that is, in particular, on the traveling path of the vehicle 12100 and travels in substantially the same direction as the vehicle 12100 at a predetermined speed (for example, equal to or greater than 0 km/h). Further, the microcomputer 12051 can set in advance an inter-vehicle distance to be maintained ahead of the preceding vehicle, and perform automatic brake control (including following stop control), automatic acceleration control (including following start control), and the like. It is thus possible to perform cooperative control intended for automatic driving, which makes the vehicle travel autonomously without depending on the driver's operation.
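The preceding-vehicle extraction described above can be sketched as follows. This is an illustrative sketch only, not part of the patent disclosure; the data-class fields, the heading tolerance, and the function names are hypothetical stand-ins for whatever representation the microcomputer 12051 actually uses.

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    distance_m: float      # distance determined from the imaging sections
    speed_kmh: float       # speed derived from the temporal change of distance
    heading_deg: float     # travel direction relative to the vehicle 12100
    on_travel_path: bool   # whether the object lies on the traveling path

def extract_preceding_vehicle(objects, min_speed_kmh=0.0, heading_tol_deg=10.0):
    # Candidates: objects on the traveling path, moving in substantially the
    # same direction, at or above the predetermined speed.
    candidates = [
        o for o in objects
        if o.on_travel_path
        and abs(o.heading_deg) <= heading_tol_deg
        and o.speed_kmh >= min_speed_kmh
    ]
    # The preceding vehicle is the closest qualifying object, if any.
    return min(candidates, key=lambda o: o.distance_m, default=None)
```

Following-distance control would then be driven by the `distance_m` of the returned object.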

For example, on the basis of the distance information acquired from the imaging sections 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data on three-dimensional objects into data of two-wheeled vehicles, standard vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, extract the classified data, and use the extracted data for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 between obstacles that the driver of the vehicle 12100 can visually recognize and obstacles that are difficult for the driver of the vehicle 12100 to visually recognize. The microcomputer 12051 then determines a collision risk indicating the degree of risk of collision with each obstacle. In a situation where the collision risk is equal to or higher than a set value and there is thus a possibility of collision, the microcomputer 12051 outputs a warning to the driver via the audio speaker 12061 or the display portion 12062, and performs forced deceleration or evasive steering via the drive system control unit 12010. The microcomputer 12051 can thereby assist driving to avoid a collision.
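The risk-threshold decision described above can be illustrated with a minimal sketch. This is not from the patent; the use of inverse time-to-collision as the risk score, the threshold value, and the action labels are all assumptions made for illustration.

```python
def collision_risk(distance_m, closing_speed_mps):
    # A simple risk score: the inverse of time-to-collision.
    # A gap that is not closing poses no collision risk.
    if closing_speed_mps <= 0:
        return 0.0
    return closing_speed_mps / distance_m  # = 1 / time-to-collision

def decide_action(risk, threshold=0.5):
    # At or above the set value: warn the driver and perform forced
    # deceleration or evasive steering via the drive system control unit.
    return "warn_and_brake" if risk >= threshold else "none"
```

In practice such a score would be computed per obstacle and combined with the visible/hard-to-see classification before choosing the warning modality.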

At least one of the imaging sections 12101 to 12104 may be an infrared camera that detects infrared rays. The microcomputer 12051 can, for example, recognize a pedestrian by determining whether or not a pedestrian is present in the images captured by the imaging sections 12101 to 12104. Such recognition of a pedestrian is performed, for example, by a procedure of extracting feature points in the images captured by the imaging sections 12101 to 12104 serving as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points representing the contour of an object to determine whether or not the object is a pedestrian. When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging sections 12101 to 12104 and thus recognizes the pedestrian, the audio/video output unit 12052 controls the display portion 12062 so that a square contour line for emphasis is displayed superimposed on the recognized pedestrian. The audio/video output unit 12052 may also control the display portion 12062 so as to display an icon or the like representing a pedestrian at a desired position.
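The two-step procedure above (feature-point extraction, then pattern matching on contours) can be outlined as follows. This is a deliberately simplified, hypothetical sketch: real feature points are 2-D image coordinates and the matcher is a pattern-matching process against pedestrian templates, whereas here grouping is done on 1-D positions and the matcher is an arbitrary caller-supplied predicate.

```python
def group_into_contours(points, max_gap=2):
    # Naively group sorted 1-D feature-point positions into contiguous runs,
    # standing in for the step that forms a series of feature points
    # representing the contour of an object.
    contours, current = [], []
    for p in sorted(points):
        if current and p - current[-1] > max_gap:
            contours.append(current)
            current = []
        current.append(p)
    if current:
        contours.append(current)
    return contours

def recognize_pedestrians(feature_points, is_pedestrian):
    # Step 2: test each contour with a pedestrian-matching predicate
    # (a stand-in for the pattern matching processing).
    contours = group_into_contours(feature_points)
    return [c for c in contours if is_pedestrian(c)]
```

Contours returned by `recognize_pedestrians` would then drive the display of the emphasizing contour line on the display portion 12062.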

An example of a vehicle control system to which the technique according to the present invention can be applied has been described above. The technique according to the present invention is applicable to the imaging section 12031 in the configuration described above. Applying the technique according to the present invention to the imaging section 12031 makes it possible to acquire a captured image that is easier to view, which in turn makes it possible to reduce fatigue of the driver.

Although the contents of the present invention have been described above with reference to the embodiments and the modified examples, the contents of the present invention are not limited to the above-described embodiments and the like, but may be modified in various ways. For example, the configuration of the image pickup device described in the above-described embodiments and the like is merely exemplary, and other layers may also be included. In addition, the materials and thicknesses of the respective layers are also exemplary, and are not limited to the above-described materials and thicknesses.

Further, in the above-described first embodiment, the case where the notch portion C is provided from the planarization film 33 to the semiconductor substrate 11 has been described. However, it is sufficient for the notch portion C to be provided at least in the thickness direction of the insulating film 31. For example, the notch portion C may be provided in the thickness direction of the planarization film 33 and the insulating film 31 so that the light input surface 21S of the semiconductor substrate 21 is exposed at the bottom surface of the notch portion C. Alternatively, the notch portion C may be provided from the planarization film 33 into the semiconductor substrate 21 such that the bottom surface of the notch portion C is located inside the semiconductor substrate 21.

Further, in the above-described first embodiment, the case where the notch portion C is provided to suppress entry of moisture through the chip end face has been described. In the above-described second embodiment, the case where the hole portion M is provided for inspection in a wafer state has been described. However, the functions of the notch portion and the hole portion of the present invention are not limited thereto. Likewise, the shapes and arrangements of the notch portion and the hole portion of the present invention are not limited to those described in the above-described embodiments and the like.

In addition, in the above-described embodiments and the like, the case where the redistribution line 51 is provided in the hole H of the semiconductor substrate 11 has been described (for example, fig. 2). However, the hole H may be filled with an electric conductor separate from the redistribution line 51, and this electric conductor may be connected to the redistribution line 51.

Further, in the above-described embodiments and the like, an example in which the image pickup apparatus 1 includes two stacked chips (the logic chip 10 and the sensor chip 20) has been described (for example, fig. 2). However, the image pickup apparatus 1 may include three or more stacked chips.

The effects described in the above embodiments and the like are merely illustrative. Other effects may be produced or further included according to the technique of the present invention.

It should be noted that the present invention may have the following configurations. According to the image pickup apparatus having the following configurations, the injection film is injected into a part or the whole of the notch portion or the hole portion in the depth direction, and the injection film includes a material different from that of the bonding member. This makes it possible to reduce the thickness of the bonding member between the protective member and the insulating film, as compared with a case where the notch portion or the hole portion is filled with the bonding member. It is therefore possible to reduce diffusion of light reflected between the semiconductor substrate and the protective member, and thus to suppress degradation of image quality caused by flare or the like.

(1) An image pickup apparatus comprising:

a first semiconductor substrate including a light input surface and provided with a photoelectric conversion portion;

a second semiconductor substrate disposed on a side of the first semiconductor substrate opposite to the light input surface;

an insulating film provided on a side of the first semiconductor substrate on which the light input surface is arranged;

at least one of a notch and a hole extending at least in a thickness direction of the insulating film;

an injection film injected into a part or the whole of the at least one of the notch portion and the hole portion in the depth direction;

a protective member that is opposed to the first semiconductor substrate with the insulating film therebetween; and

a bonding member that includes a material different from that of the injection film and is disposed between the protection member and the insulating film.

(2)

The image pickup apparatus according to (1), wherein the notch portion is provided in a periphery of the insulating film and extends through the insulating film and the first semiconductor substrate.

(3)

The image pickup apparatus according to (2), wherein the injection film includes an insulating material.

(4)

The image pickup apparatus according to any one of (1) to (3), further comprising:

a lens opposed to the photoelectric conversion portion, the insulating film being located between the lens and the photoelectric conversion portion; and

a planarization film covering the lens and including the same material as that of the injection film.

(5)

The image pickup apparatus according to (4), wherein a refractive index of the material of the planarization film and a refractive index of the material of the injection film are lower than a refractive index of the material of the lens.

(6)

The image pickup apparatus according to any one of (1) to (5), further comprising a pad electrode provided between the first semiconductor substrate and the second semiconductor substrate, wherein,

the hole portion extends through the insulating film and the first semiconductor substrate and reaches the pad electrode.

(7)

The image pickup apparatus according to (6), wherein the injection film includes a conductive material.

(8)

The image pickup apparatus according to (6) or (7), wherein the injection film includes a metal material.

(9)

The image pickup apparatus according to any one of (6) to (8), further comprising a multilayer wiring layer in which the pad electrode is provided.

(10)

The image pickup apparatus according to (9), further comprising an external connection terminal electrically connected to the pad electrode and provided on a surface of the second semiconductor substrate opposite to the multilayer wiring layer.

(11)

The image pickup apparatus according to any one of (1) to (10), wherein the injection film is injected into the whole of the at least one of the notch portion and the hole portion in the depth direction.

(12)

The image pickup apparatus according to any one of (1) to (10), wherein the injection film is injected into a part of the at least one of the notch portion and the hole portion in the depth direction.

(13)

The image pickup apparatus according to any one of (1) to (12), wherein both the notch portion and the hole portion are provided, and the injection film is injected into the notch portion and the hole portion.

(14)

The image pickup apparatus according to any one of (1) to (13), wherein the at least one of the notch portion and the hole portion has a width that gradually decreases in the depth direction.

(15)

The image pickup apparatus according to any one of (1) to (14), wherein the at least one of the notch portion and the hole portion has a width that decreases stepwise in the depth direction.

This application claims priority based on Japanese Patent Application No. 2019-104223 filed with the Japan Patent Office on June 4, 2019, the entire contents of which are incorporated herein by reference.

It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors insofar as they come within the scope of the appended claims or the equivalents thereof.
