Solid-state imaging element and solid-state imaging device

Document No.: 1146305 · Publication date: 2020-09-11

Note: This technology, "Solid-state imaging element and solid-state imaging device," was designed by 平田晋太郎, 富樫秀晃, and 兼田有希央 on 2019-03-04. Abstract: The invention provides a solid-state imaging element, which includes a photoelectric conversion layer; a first electrode and a second electrode opposing each other via the photoelectric conversion layer therebetween; a semiconductor layer provided between the first electrode and the photoelectric conversion layer; an accumulation electrode opposed to the photoelectric conversion layer via the semiconductor layer therebetween; an insulating film provided between the accumulation electrode and the semiconductor layer; and a barrier layer provided between the semiconductor layer and the photoelectric conversion layer.

1. A solid-state imaging element comprising:

a photoelectric conversion layer;

a first electrode and a second electrode opposing each other via the photoelectric conversion layer therebetween;

a semiconductor layer provided between the first electrode and the photoelectric conversion layer;

an accumulation electrode opposed to the photoelectric conversion layer via the semiconductor layer therebetween;

an insulating film provided between the accumulation electrode and the semiconductor layer; and

a barrier layer disposed between the semiconductor layer and the photoelectric conversion layer.

2. A solid-state imaging element comprising:

a photoelectric conversion layer;

a first electrode and a second electrode opposing each other via the photoelectric conversion layer therebetween;

a semiconductor layer provided between the first electrode and the photoelectric conversion layer, the semiconductor layer having a potential barrier at a junction surface with respect to the photoelectric conversion layer;

an accumulation electrode opposed to the photoelectric conversion layer via the semiconductor layer therebetween; and

an insulating film disposed between the accumulation electrode and the semiconductor layer.

3. The solid-state imaging element according to claim 1, wherein

the photoelectric conversion layer contains an organic semiconductor material, and

the semiconductor layer includes a semiconductor material having a higher mobility than the organic semiconductor material.

4. The solid-state imaging element according to claim 1, further comprising a semiconductor substrate having a first face and a second face opposed to each other, wherein

the first electrode, the semiconductor layer, the barrier layer, the photoelectric conversion layer, and the second electrode are disposed in this order on the first face of the semiconductor substrate.

5. The solid-state imaging element according to claim 1, further comprising:

a semiconductor substrate having a first face and a second face opposed to each other; and

a multilayer wiring provided between the second face of the semiconductor substrate and the first electrode.

6. The solid-state imaging element according to claim 4, further comprising an inorganic photoelectric conversion portion provided within the semiconductor substrate.

7. The solid-state imaging element according to claim 1, further comprising a transfer electrode provided opposite to the semiconductor layer via the insulating film therebetween, the transfer electrode controlling movement of signal charges in the semiconductor layer.

8. The solid-state imaging element according to claim 1, further comprising a drain electrode provided separately from the first electrode and electrically connected to the semiconductor layer.

9. The solid-state imaging element according to claim 1, further comprising a light-shielding film covering the first electrode via the photoelectric conversion layer therebetween.

10. The solid-state imaging element according to claim 1, wherein the barrier layer comprises silicon oxide, silicon nitride, silicon oxynitride, or an organic material.

11. A solid-state imaging device comprising a plurality of solid-state imaging elements, each of the solid-state imaging elements including:

a photoelectric conversion layer;

a first electrode and a second electrode opposing each other via the photoelectric conversion layer therebetween;

a semiconductor layer provided between the first electrode and the photoelectric conversion layer;

an accumulation electrode opposed to the photoelectric conversion layer via the semiconductor layer therebetween;

an insulating film provided between the accumulation electrode and the semiconductor layer; and

a barrier layer disposed between the semiconductor layer and the photoelectric conversion layer.

12. A solid-state imaging device comprising a plurality of solid-state imaging elements, each of the solid-state imaging elements including:

a photoelectric conversion layer;

a first electrode and a second electrode opposing each other via the photoelectric conversion layer therebetween;

a semiconductor layer provided between the first electrode and the photoelectric conversion layer, the semiconductor layer having a potential barrier at a junction surface with respect to the photoelectric conversion layer;

an accumulation electrode opposed to the photoelectric conversion layer via the semiconductor layer therebetween; and

an insulating film disposed between the accumulation electrode and the semiconductor layer.

13. The solid-state imaging device according to claim 11, further comprising a shield electrode opposed to the semiconductor layer via the insulating film therebetween, the shield electrode being arranged between the accumulation electrodes adjacent to each other.

14. The solid-state imaging device according to claim 11, comprising a plurality of pixels in which the solid-state imaging elements are respectively provided, wherein the semiconductor layer is provided separately for each pixel.

15. The solid-state imaging device according to claim 11, comprising a plurality of pixels provided with the solid-state imaging elements, respectively, wherein the photoelectric conversion layers are provided separately for each pixel.

Technical Field

The present disclosure relates to a solid-state imaging element and a solid-state imaging device using, for example, an organic photoelectric conversion material.

Background

Solid-state imaging elements such as CCD (charge coupled device) image sensors and CMOS (complementary metal oxide semiconductor) image sensors have been used in solid-state imaging devices. The solid-state imaging element is provided with, for example, a photoelectric conversion layer containing an organic photoelectric conversion material (for example, see PTL 1).

List of cited documents

Patent document

PTL 1: Japanese Patent Laid-Open Publication No. 2016-63165

Disclosure of Invention

In such a solid-state imaging element and solid-state imaging device, it is desirable, for example, to suppress the occurrence of transfer failures of signal charges and the like, thereby improving element characteristics.

Therefore, it is desirable to provide a solid-state imaging element and a solid-state imaging device that make it possible to improve element characteristics.

A first solid-state imaging element according to an embodiment of the present disclosure includes: a photoelectric conversion layer; a first electrode and a second electrode opposing each other via the photoelectric conversion layer therebetween; a semiconductor layer provided between the first electrode and the photoelectric conversion layer; an accumulation electrode opposed to the photoelectric conversion layer via the semiconductor layer therebetween; an insulating film provided between the accumulation electrode and the semiconductor layer; and a barrier layer disposed between the semiconductor layer and the photoelectric conversion layer.

The first solid-state imaging device according to an embodiment of the present disclosure includes the first solid-state imaging element according to an embodiment of the present disclosure.

In the first solid-state imaging element and the first solid-state imaging device according to the respective embodiments of the present disclosure, the signal charges generated in the photoelectric conversion layer are accumulated in the semiconductor layer and then read out via the first electrode. Here, the barrier layer provided between the semiconductor layer and the photoelectric conversion layer makes it unlikely that signal charges accumulated in the semiconductor layer return to the photoelectric conversion layer. The barrier layer functions as a potential barrier or a physical barrier to the movement of the signal charges.

The second solid-state imaging element according to an embodiment of the present disclosure includes: a photoelectric conversion layer; a first electrode and a second electrode opposing each other via the photoelectric conversion layer therebetween; a semiconductor layer provided between the first electrode and the photoelectric conversion layer, the semiconductor layer having a potential barrier at a junction surface with respect to the photoelectric conversion layer; an accumulation electrode opposed to the photoelectric conversion layer via the semiconductor layer therebetween; and an insulating film provided between the accumulation electrode and the semiconductor layer.

The second solid-state imaging device according to an embodiment of the present disclosure includes the second solid-state imaging element according to an embodiment of the present disclosure.

In the second solid-state imaging element and the second solid-state imaging device according to the respective embodiments of the present disclosure, the signal charges generated in the photoelectric conversion layer are accumulated in the semiconductor layer and then read out by the first electrode. Here, a potential barrier is provided at the interface between the semiconductor layer and the photoelectric conversion layer, thus making it less likely that signal charges accumulated in the semiconductor layer will return to the photoelectric conversion layer.

According to the first solid-state imaging element and the first solid-state imaging device of the respective embodiments of the present disclosure, the barrier layer is provided between the semiconductor layer and the photoelectric conversion layer, and according to the second solid-state imaging element and the second solid-state imaging device of the respective embodiments of the present disclosure, the potential barrier is provided at the junction surface between the semiconductor layer and the photoelectric conversion layer. In either case, the occurrence of transfer failures of signal charges accumulated in the semiconductor layer can be suppressed, so that element characteristics can be improved.

It should be noted that the foregoing is merely exemplary of the present disclosure. The effects of the present disclosure are not limited to those described above, and may be different effects or may further include other effects.

Drawings

Fig. 1 is a schematic sectional view of a schematic configuration of a solid-state imaging element according to a first embodiment of the present disclosure.

Fig. 2 is a schematic diagram of a planar configuration of the first electrode and the accumulation electrode shown in Fig. 1.

Fig. 3 is a schematic cross-sectional view of another example of the semiconductor layer shown in Fig. 1.

Fig. 4A is a graph (1) illustrating the energy of the barrier layer shown in Fig. 1.

Fig. 4B is a graph (2) illustrating the energy of the barrier layer shown in Fig. 1.

Fig. 5 is a schematic cross-sectional view of another example of the photoelectric conversion layer shown in Fig. 1.

Fig. 6 is a schematic cross-sectional view of another example of the semiconductor layer and the photoelectric conversion layer shown in Fig. 1.

Fig. 7 is an equivalent circuit diagram of the solid-state imaging element shown in Fig. 1.

Fig. 8 is a schematic diagram of the configuration of the first electrode, the accumulation electrode, and various transistors of the solid-state imaging element shown in Fig. 1.

Fig. 9 is a schematic sectional view of a step in the manufacturing method of the solid-state imaging element shown in Fig. 1.

Fig. 10 is a schematic cross-sectional view of a step following Fig. 9.

Fig. 11 is a schematic cross-sectional view of a step following Fig. 10.

Fig. 12 is a schematic cross-sectional view of a step following Fig. 11.

Fig. 13 is a schematic cross-sectional view of a step following Fig. 12.

Fig. 14 is a schematic cross-sectional view of a step following Fig. 13.

Fig. 15 is an explanatory diagram of the operation of the solid-state imaging element shown in Fig. 1.

Fig. 16 is an explanatory diagram of the global shutter driving of the solid-state imaging element shown in Fig. 15.

Fig. 17 is a schematic sectional view of a schematic configuration of a solid-state imaging element according to comparative example 1.

Fig. 18 is a schematic sectional view of a schematic configuration of a solid-state imaging element according to comparative example 2.

Fig. 19 is a schematic sectional view of a schematic configuration of a solid-state imaging element according to modification 1.

Fig. 20 is a schematic sectional view of a schematic configuration of a solid-state imaging element according to modification 2.

Fig. 21 is a schematic sectional view of another example of the solid-state imaging element shown in Fig. 20.

Fig. 22 is a schematic sectional view of a schematic configuration of a solid-state imaging element according to modification 3.

Fig. 23 is a schematic sectional view of another example of the solid-state imaging element shown in Fig. 22.

Fig. 24 is a schematic sectional view of a schematic configuration of a solid-state imaging element according to modification 4.

Fig. 25 is a schematic diagram of a planar configuration of the first electrode, the accumulation electrode, and the transfer electrode shown in Fig. 24.

Fig. 26 is a schematic sectional view of a schematic configuration of a solid-state imaging element according to modification 5.

Fig. 27 is a schematic diagram of a planar configuration of the first electrode, the accumulation electrode, and the discharge electrode shown in Fig. 26.

Fig. 28 is a schematic sectional view of a schematic configuration of a solid-state imaging element according to modification 6.

Fig. 29 is a schematic sectional view of a schematic configuration of a solid-state imaging element according to modification 7.

Fig. 30 is a schematic sectional view of a schematic configuration of a solid-state imaging element according to a second embodiment of the present disclosure.

Fig. 31 is an explanatory diagram of a potential barrier formed by the junction surface shown in Fig. 30.

Fig. 32 is a block diagram showing a configuration of a solid-state imaging device including the solid-state imaging element shown in Fig. 1 and the like.

Fig. 33 is a functional block diagram showing an example of an electronic device (camera) using the imaging element shown in Fig. 32.

Fig. 34 is a block diagram showing an example of a schematic configuration of the in-vivo information acquisition system.

Fig. 35 is a diagram showing an example of a schematic configuration of the endoscopic surgery system.

Fig. 36 is a block diagram showing an example of a functional configuration of a camera head and a Camera Control Unit (CCU).

Fig. 37 is a block diagram showing an example of a schematic configuration of the vehicle control system.

Fig. 38 is an auxiliary explanatory view of an example of the mounting positions of the vehicle exterior information detecting portion and the imaging portion.

Detailed Description

Hereinafter, embodiments of the present disclosure are described in detail with reference to the accompanying drawings. It should be noted that the description is given in the following order.

1. First embodiment (example of solid-state imaging element including barrier layer between semiconductor layer and photoelectric conversion layer)

2. Modification 1 (example of providing multilayer wiring on the first surface of the semiconductor substrate)

3. Modification 2 (example including one photodiode section in a semiconductor substrate)

4. Modification 3 (example in which a photodiode section is not provided in a semiconductor substrate)

5. Modification 4 (example including a transfer electrode between a first electrode and an accumulation electrode)

6. Modification 5 (example including a discharge electrode separated from a first electrode)

7. Modification 6 (example including shield electrode between adjacent accumulation electrodes)

8. Modification 7 (example including light-shielding film opposing first electrode)

9. Second embodiment (solid-state imaging element in which a potential barrier is provided at the interface between a semiconductor layer and a photoelectric conversion layer)

10. Application example 1 (example of solid-state imaging device)

11. Application example 2 (example of electronic device)

12. Application example 3 (application example of in vivo information acquisition System)

13. Application example 4 (application example of endoscopic surgery System)

14. Application example 5 (application example of moving body)

<1. first embodiment >

Fig. 1 schematically shows a cross-sectional configuration of a solid-state imaging element (solid-state imaging element 10) according to a first embodiment of the present disclosure. The solid-state imaging element 10 constitutes one pixel (unit pixel P) in a solid-state imaging device (for example, the solid-state imaging device 1 in Fig. 32), such as a CMOS image sensor used in an electronic apparatus such as a digital still camera or a video camera.

(1-1. constitution of solid-state imaging element)

The solid-state imaging element 10 is, for example, of a so-called longitudinal light-splitting type in which one organic photoelectric conversion portion 20 and two inorganic photoelectric conversion portions 32B and 32R are stacked in the longitudinal direction. The organic photoelectric conversion portion 20 is provided on the first surface 30A (back surface) side of the semiconductor substrate 30. The semiconductor substrate 30 has a second surface 30B (front surface) opposite to the first surface 30A. In the solid-state imaging element 10, light enters from the first surface 30A side (light incident side S1), and a multilayer wiring is provided on the second surface 30B side (wiring layer side S2).

The inorganic photoelectric conversion portions 32B and 32R are each formed to be buried in the semiconductor substrate 30 and laminated in the thickness direction of the semiconductor substrate 30. The organic photoelectric conversion portion 20 includes a photoelectric conversion layer 25 formed using an organic photoelectric conversion material between a pair of electrodes (a first electrode 21A and a second electrode 26) arranged to face each other. The photoelectric conversion layer 25 includes a p-type semiconductor and an n-type semiconductor, and has a bulk heterojunction structure in the layer. The bulk heterojunction structure is a p/n junction surface formed by mixing a p-type semiconductor and an n-type semiconductor.

The organic photoelectric conversion portion 20 and the inorganic photoelectric conversion portions 32B and 32R perform photoelectric conversion by selectively detecting light in mutually different wavelength regions. Specifically, the organic photoelectric conversion portion 20 acquires a green (G) color signal, while the inorganic photoelectric conversion portions 32B and 32R acquire blue (B) and red (R) color signals, respectively, owing to the difference in absorption coefficient. This enables the solid-state imaging element 10 to acquire a plurality of types of color signals in one pixel without using a color filter.

It should be noted that in this embodiment, a description is given of a case where, of the pairs of electrons and holes (electron-hole pairs) generated by photoelectric conversion, electrons are read out as signal charges. In the figures, a "+" appended to "p" or "n" indicates a high p-type or n-type impurity concentration, and "++" indicates a p-type or n-type impurity concentration higher than "+".

The second surface 30B of the semiconductor substrate 30 is provided with, for example, floating diffusions (floating diffusion layers) FD1 (a region 36B within the semiconductor substrate 30), FD2 (a region 37C within the semiconductor substrate 30), and FD3 (a region 38C within the semiconductor substrate 30), transfer transistors Tr2 and Tr3, an amplification transistor (modulation element) AMP, a reset transistor RST, a selection transistor SEL, and a multilayer wiring 40. The multilayer wiring 40 has, for example, a configuration in which wiring layers 41, 42, and 43 are stacked in an insulating layer 44.

Between the first surface 30A of the semiconductor substrate 30 and the organic photoelectric conversion portion 20, for example, a fixed charge layer 27k (a layer having fixed charges), an insulating dielectric layer 27y, and an interlayer insulating layer 22s are provided. A protective layer 51 is provided on the organic photoelectric conversion portion 20 (light incident side S1). Over the protective layer 51, optical members such as a planarization layer (not shown) and an on-chip lens 52 are disposed.

A through electrode 34 is provided between the first surface 30A and the second surface 30B of the semiconductor substrate 30. The organic photoelectric conversion portion 20 is connected, via the through electrode 34, to the gate electrode Gamp of the amplification transistor AMP and to one source/drain region 36B of the reset transistor RST (reset transistor Tr1RST), which also serves as the floating diffusion FD1. This allows the solid-state imaging element 10 to favorably transfer the charges (here, electrons) generated in the organic photoelectric conversion portion 20 on the first surface 30A side of the semiconductor substrate 30 to the second surface 30B side via the through electrode 34, thus improving the characteristics.

The lower end of the through electrode 34 is connected to a connection portion 41A in the wiring layer 41, and the connection portion 41A and the gate electrode Gamp of the amplification transistor AMP are connected to each other via a lower first contact 45. The connection portion 41A and the floating diffusion FD1 (region 36B) are connected to each other via, for example, a lower second contact 46. The upper end of the through electrode 34 is connected to the first electrode 21A via, for example, a connection wiring 39A and an upper first contact 29A.

For example, the through electrode 34 is provided in each solid-state imaging element 10 for each organic photoelectric conversion portion 20. The through electrode 34 has a function as a connector between the organic photoelectric conversion portion 20 and the gate electrode Gamp of the amplification transistor AMP and the floating diffusion FD1, and serves as a transfer path for charges (here, electrons) generated in the organic photoelectric conversion portion 20. The through electrode 34 is made of a metal material such as aluminum, tungsten, titanium, cobalt, hafnium, tantalum, or the like. The through electrode 34 may be composed of a doped silicon material such as PDAS (phosphorus doped amorphous silicon).

The reset gate Grst of the reset transistor RST is disposed adjacent to the floating diffusion FD1 (one source/drain region 36B of the reset transistor RST). This enables the charge accumulated in the floating diffusion FD1 to be reset by the reset transistor RST.

In the solid-state imaging element 10 of the present embodiment, light that has entered the organic photoelectric conversion portion 20 from the second electrode 26 side is absorbed by the photoelectric conversion layer 25. The excitons thus generated move to the interface between the electron donor and the electron acceptor constituting the photoelectric conversion layer 25 and undergo exciton separation, i.e., dissociate into electrons and holes. Of the charges generated here, one type (for example, electrons) is accumulated in the semiconductor layer 23 opposed to the accumulation electrode 21B, and the other type (for example, holes) is discharged to the second electrode 26.
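Purely as a conceptual sketch (not part of this disclosure), the accumulate-then-read cycle described above can be mimicked by a toy model; the class name, method names, and the 60% quantum-efficiency figure are all illustrative assumptions:

```python
# Toy model (illustrative only): during exposure, photogenerated electrons are
# collected in the semiconductor layer under the accumulation electrode; during
# readout, they are transferred to the floating diffusion via the first electrode.

class PixelToy:
    def __init__(self):
        self.accumulated = 0        # electrons held under the accumulation electrode
        self.floating_diffusion = 0  # electrons transferred to FD1

    def expose(self, photons, quantum_efficiency=0.6):
        # Assumed QE of 60%; photogenerated electrons collect in the semiconductor layer.
        self.accumulated += int(photons * quantum_efficiency)

    def read_out(self):
        # Transfer accumulated electrons to the floating diffusion via the readout electrode.
        self.floating_diffusion += self.accumulated
        self.accumulated = 0
        return self.floating_diffusion

pixel = PixelToy()
pixel.expose(1000)
print(pixel.read_out())  # 600 with the assumed 60% quantum efficiency
```

The point of the sketch is only the ordering of phases: charge collects in one place during exposure and is moved in a single transfer at readout, which is what enables the global shutter driving described later.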

Hereinafter, a description is given of the constitution, material, and the like of each portion.

The organic photoelectric conversion portion 20 is an organic photoelectric conversion element that absorbs green light corresponding to a part or all of a selected wavelength region (for example, in a range of 450 nm to 650 nm), and generates electron-hole pairs. The organic photoelectric conversion portion 20 includes, in order from a position close to the first surface 30A of the semiconductor substrate 30, a first electrode 21A and an accumulation electrode 21B, an insulating layer 22, a semiconductor layer 23, a barrier layer 24, a photoelectric conversion layer 25, and a second electrode 26. The insulating layer 22 is provided with an opening 22H; the opening 22H allows the first electrode 21A to be electrically connected to the semiconductor layer 23. The insulating layer 22 is interposed between the accumulation electrode 21B and the semiconductor layer 23.

Fig. 2 shows a planar configuration of the first electrode 21A and the accumulation electrode 21B. The first electrode 21A and the accumulation electrode 21B each have a planar configuration such as a quadrangle, and are disposed apart from each other. For example, the area of the accumulation electrode 21B is larger than that of the first electrode 21A. One pixel (pixel P) has, for example, one first electrode 21A and one accumulation electrode 21B. The first electrode 21A is provided to transfer the charges (here, electrons) generated in the photoelectric conversion layer 25 to the floating diffusion FD1, and functions as a readout electrode. The first electrode 21A is connected to the floating diffusion FD1 via, for example, the upper first contact 29A, the connection wiring 39A, the through electrode 34, the connection portion 41A, and the lower second contact 46.

The accumulation electrode 21B opposed to the photoelectric conversion layer 25 via the semiconductor layer 23 therebetween is provided for accumulating signal charges (for example, electrons) among the charges generated in the photoelectric conversion layer 25 in the semiconductor layer 23. The accumulation electrode 21B is provided in a region opposed to the light receiving surfaces of the inorganic photoelectric conversion portions 32B and 32R formed in the semiconductor substrate 30, and covers these light receiving surfaces. The accumulation electrode 21B is electrically connected to a drive circuit (not shown) via, for example, the upper second contact 29B and the connection wiring 39B. Making the area of the accumulation electrode 21B larger than that of the first electrode 21A results in more electric charges being accumulated.
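As a rough illustration of why a larger electrode area stores more charge, the accumulation electrode, insulating film, and semiconductor layer can be treated as a parallel-plate capacitor; all numerical values below are assumptions for illustration, not figures from this disclosure:

```latex
Q = C\,V = \frac{\varepsilon_0 \varepsilon_r A}{d}\,V
```

For example, assuming a silicon oxide insulating film with εr ≈ 3.9 and d = 50 nm, an electrode area A = 1 μm², and V = 1 V gives C ≈ 6.9 × 10⁻¹⁶ F, i.e. a well capacity of roughly 4.3 × 10³ electrons; doubling A doubles the accumulated charge, consistent with the accumulation electrode 21B being made larger than the first electrode 21A.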

The first electrode 21A and the accumulation electrode 21B are each formed of, for example, a conductive film having light transmissivity, such as ITO (indium tin oxide). However, in addition to ITO, a dopant-doped tin oxide (SnO2)-based material or a dopant-doped zinc oxide (ZnO)-based material may be used as a constituent material of the lower electrode 21. Examples of the zinc oxide-based material include aluminum zinc oxide (AZO) doped with aluminum (Al) as a dopant, gallium zinc oxide (GZO) doped with gallium (Ga), and indium zinc oxide (IZO) doped with indium (In). In addition to these, for example, CuI, InSbO4, ZnMgO, CuInO2, MgIn2O4, CdO, ZnSnO3, and the like may be used.

The first electrode 21A may be made of a conductive material having a light-shielding property. Specifically, the first electrode 21A may be composed of a film of a metal such as aluminum (Al), tungsten (W), titanium (Ti), molybdenum (Mo), tantalum (Ta), copper (Cu), cobalt (Co), or nickel (Ni), or an alloy film thereof, or may be composed of a film containing silicon or oxygen in the metal film.

The insulating layer 22 is provided to electrically isolate the accumulation electrode 21B and the semiconductor layer 23 from each other, and is provided, for example, on an interlayer insulating layer 27s to cover the first electrode 21A and the accumulation electrode 21B. The opening 22H provided in the insulating layer 22 exposes the first electrode 21A, allowing it to come into contact with the semiconductor layer 23. The insulating layer 22 is formed of a single-layer film or a stacked film of two or more of silicon oxide, TEOS (tetraethyl orthosilicate), silicon nitride, silicon oxynitride (SiON), and the like. The insulating layer 22 has a thickness of, for example, 3 nm to 500 nm.

The semiconductor layer 23 provided between the first electrode 21A or the insulating layer 22 and the photoelectric conversion layer 25 is preferably composed of a material having higher charge mobility and a larger band gap than the photoelectric conversion layer 25. The band gap of the constituent material of the semiconductor layer 23 is preferably 3.0 eV or more. Examples of such a material include oxide semiconductor materials such as IGZO, transition metal dichalcogenides, silicon carbide, diamond, graphene, carbon nanotubes, and organic semiconductor materials such as condensed polycyclic hydrocarbon compounds and condensed heterocyclic compounds. The semiconductor layer 23 may be formed of a single film, or may be formed by stacking a plurality of films. The semiconductor layer 23 has a thickness of, for example, 10 nm to 500 nm, preferably 30 nm to 150 nm, and more preferably 50 nm to 100 nm. Providing such a semiconductor layer 23 as a lower layer of the photoelectric conversion layer 25 prevents recombination of charges during charge accumulation, thus improving the transfer efficiency.

The impurity concentration of the constituent material of the semiconductor layer 23 is preferably 1 × 10^18 cm^-3 or less. The semiconductor layer 23 is provided in common for the plurality of solid-state imaging elements 10, for example (Fig. 1).
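The requirements stated above for the semiconductor layer 23 (band gap of 3.0 eV or more, mobility higher than that of the photoelectric conversion layer, impurity concentration of 1 × 10^18 cm^-3 or less) can be captured as a simple screening check. The numeric material values below are commonly quoted ballpark figures assumed here for illustration, not data from this disclosure:

```python
# Screening check for semiconductor-layer candidates per the criteria above.
# All numeric material values are illustrative assumptions.

PHOTOELECTRIC_LAYER_MOBILITY = 1e-3  # cm^2/(V*s), assumed for the organic layer

def qualifies(band_gap_ev, mobility_cm2_vs, impurity_cm3):
    """True if a candidate meets the stated criteria for semiconductor layer 23."""
    return (band_gap_ev >= 3.0
            and mobility_cm2_vs > PHOTOELECTRIC_LAYER_MOBILITY
            and impurity_cm3 <= 1e18)

# IGZO-like ballpark figures (assumed): wide gap, moderate mobility, low doping
print(qualifies(3.2, 10.0, 1e17))    # True
# Crystalline-silicon-like figures (assumed): high mobility but gap below 3.0 eV
print(qualifies(1.1, 1400.0, 1e15))  # False
```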

As shown in fig. 3, the semiconductor layer 23 may be provided separately for each element (pixel). At this time, an element separation layer (element separation layer 20i) is disposed between the semiconductor layers 23 of the adjacent solid-state imaging elements 10.

In this embodiment, the barrier layer 24 is provided between the semiconductor layer 23 and the photoelectric conversion layer 25. The barrier layer 24 controls the movement of charges between the semiconductor layer 23 and the photoelectric conversion layer 25, functioning as a potential barrier or a physical barrier (an energy barrier) to the movement of the signal charges. This makes it unlikely that the signal charges accumulated in the semiconductor layer 23 return to the photoelectric conversion layer 25, thereby suppressing the occurrence of transfer failures, as described in detail later.

Fig. 4A schematically shows the electron affinity of the constituent material of the barrier layer 24. When the signal charges are electrons, the electron affinity (electron affinity EA2) of the constituent material of the barrier layer 24 is smaller than the electron affinity (electron affinity EA1) of the constituent material of the photoelectric conversion layer 25 and the electron affinity (electron affinity EA3) of the constituent material of the semiconductor layer 23. With the barrier layer 24 provided, the electron affinity EA3 of the constituent material of the semiconductor layer 23 may be the same as or smaller than the electron affinity EA1 of the constituent material of the photoelectric conversion layer 25; that is, the degree of freedom in choosing the constituent materials of the semiconductor layer 23 and the photoelectric conversion layer 25 can be increased.

Fig. 4B schematically shows the ionization potential of the constituent material of the barrier layer 24. When the signal charges are holes, the ionization potential (ionization potential IP2) of the constituent material of the barrier layer 24 is larger than the ionization potential (ionization potential IP1) of the constituent material of the photoelectric conversion layer 25 and the ionization potential (ionization potential IP3) of the constituent material of the semiconductor layer 23.
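The conditions of figs. 4A and 4B can be written as simple inequality checks. The sketch below is illustrative only; the energy values in the usage example are assumed placeholders (in eV) chosen for demonstration, not materials data from this disclosure.

```python
def electron_barrier_ok(ea1_pc: float, ea2_barrier: float, ea3_semi: float) -> bool:
    """Electron signal charges (fig. 4A): the barrier layer's electron
    affinity EA2 must be smaller than both EA1 (photoelectric conversion
    layer 25) and EA3 (semiconductor layer 23)."""
    return ea2_barrier < min(ea1_pc, ea3_semi)


def hole_barrier_ok(ip1_pc: float, ip2_barrier: float, ip3_semi: float) -> bool:
    """Hole signal charges (fig. 4B): the barrier layer's ionization
    potential IP2 must be larger than both IP1 and IP3."""
    return ip2_barrier > max(ip1_pc, ip3_semi)


# Assumed placeholder values (eV): a wide-gap barrier material between an
# organic photoelectric conversion layer and an oxide semiconductor layer.
assert electron_barrier_ok(ea1_pc=4.0, ea2_barrier=0.9, ea3_semi=4.3)
assert hole_barrier_ok(ip1_pc=5.2, ip2_barrier=7.0, ip3_semi=5.6)
```

Either check failing indicates that the candidate barrier material would not block back-transfer of the corresponding signal charge.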

The barrier layer 24 is made of, for example, silicon oxide (SiO), silicon nitride (SiN), or silicon oxynitride (SiON), and has a thickness of 0.1 nm to 50 nm, preferably 1 nm to 10 nm. When the signal charges are electrons, the barrier layer 24 may be composed of an organic material having an electron injection blocking function. When the signal charges are holes, the barrier layer 24 may be composed of an organic material having a hole injection blocking function. The barrier layer 24 is provided in common for the plurality of solid-state imaging elements 10, for example (fig. 1).

The photoelectric conversion layer 25, disposed between the barrier layer 24 and the second electrode 26, converts light energy into electric energy. The photoelectric conversion layer 25 includes, for example, two or more organic semiconductor materials, each serving as a p-type semiconductor or an n-type semiconductor (a p-type semiconductor material or an n-type semiconductor material). The photoelectric conversion layer 25 includes a junction plane (p/n junction plane) between the p-type semiconductor material and the n-type semiconductor material in the layer. The p-type semiconductor relatively functions as an electron donor, and the n-type semiconductor relatively functions as an electron acceptor. The photoelectric conversion layer 25 provides a place where excitons generated upon light absorption are separated into electrons and holes; specifically, the excitons separate into electrons and holes at the interface (p/n junction plane) between the electron donor and the electron acceptor.

In addition to the p-type semiconductor material and the n-type semiconductor material, the photoelectric conversion layer 25 may contain an organic semiconductor material, a so-called dye material, that performs photoelectric conversion on light of a predetermined wavelength region while transmitting light of other wavelength regions. In the case where the photoelectric conversion layer 25 is formed using three types of organic semiconductor materials, that is, a p-type semiconductor material, an n-type semiconductor material, and a dye material, the p-type semiconductor material and the n-type semiconductor material are preferably each a material having light transmittance in the visible light region (for example, 450 nm to 800 nm). The photoelectric conversion layer 25 has a thickness of, for example, 50 nm to 500 nm. The photoelectric conversion layer 25 is provided in common for the plurality of solid-state imaging elements 10, for example (fig. 1).

As shown in fig. 5 and 6, the photoelectric conversion layer 25 may be separately provided for each element (pixel). At this time, the protective layer 51 is disposed between the photoelectric conversion layers 25 of the adjacent solid-state imaging elements 10. The semiconductor layer 23 may be provided commonly for a plurality of solid-state imaging elements 10, and the barrier layer 24 and the photoelectric conversion layer 25 may be provided separately for each element (fig. 5). The semiconductor layer 23, the barrier layer 24, and the photoelectric conversion layer 25 may be separately provided for each element (fig. 6).

The photoelectric conversion layer 25 contains, for example, a p-type semiconductor, an n-type semiconductor, and a dye material. Examples of the p-type semiconductor include thiophene derivatives, benzothienobenzothiophene derivatives, ferrocene derivatives, p-phenylene vinylene derivatives, carbazole derivatives, pyrrole derivatives, aniline derivatives, diamine derivatives, phthalocyanine derivatives, subphthalocyanine derivatives, hydrazone derivatives, naphthalene derivatives, anthracene derivatives, phenanthrene derivatives, pyrene derivatives, perylene derivatives, tetracene derivatives, pentacene derivatives, quinacridone derivatives, bithiophene derivatives, benzothiophene derivatives, triarylamine derivatives, picene derivatives, chrysene derivatives, fluoranthene derivatives, porphyrin derivatives, metal complexes including heterocyclic compounds as ligands, polythiophene derivatives, polybenzothiadiazole derivatives, and polyfluorene derivatives. These materials have relatively high mobility and thus facilitate the design of hole transport properties.

Examples of the n-type semiconductor included in the photoelectric conversion layer 25 include fullerenes such as C60, C70, and C74 (including higher fullerenes and endohedral fullerenes) and fullerene derivatives. The fullerene derivative is, for example, a fullerene fluoride, a PCBM ([6,6]-phenyl-C61-butyric acid methyl ester) fullerene compound, or a fullerene multimer. The fullerene derivative may contain a halogen atom, an alkyl group, a phenyl group, a functional group having an aromatic compound, a functional group having a halide, a partially fluorinated alkyl group, a perfluoroalkyl group, a silylalkyl group, a silylalkoxy group, an arylsilyl group, an arylthioalkyl group, an alkylsulfanyl group, an arylsulfonyl group, an alkylsulfonyl group, an arylthioether group, an amino group, an alkylamino group, an arylamino group, a hydroxyl group, an alkoxy group, an acylamino group, an acyloxy group, a carbonyl group, a carboxyl group, a carboxamide group, a carboalkoxy group, an acyl group, a sulfonyl group, a cyano group, a nitro group, a group having a chalcogenide, a phosphino group, or the like. The alkyl group may be linear or branched, and the fullerene derivative may contain a cyclic alkyl group. The aromatic compound may include a plurality of cyclic structures, which may be bonded by single bonds or may form a fused ring structure.

The n-type semiconductor included in the photoelectric conversion layer 25 may be, for example, an oxazole derivative, an oxadiazole derivative, a triazole derivative, an organic molecule including a heterocyclic compound in a part of the molecular skeleton, an organometallic complex, a subphthalocyanine, or the like. The heterocyclic compound contains a nitrogen atom, an oxygen atom, or a sulfur atom. Examples of the heterocyclic compound include pyridine derivatives, pyrazine derivatives, pyrimidine derivatives, triazine derivatives, quinoline derivatives, quinoxaline derivatives, isoquinoline derivatives, acridine derivatives, phenazine derivatives, phenanthroline derivatives, tetrazole derivatives, pyrazole derivatives, imidazole derivatives, thiazole derivatives, benzimidazole derivatives, benzotriazole derivatives, benzoxazole derivatives, carbazole derivatives, benzofuran derivatives, dibenzofuran derivatives, porphyrin derivatives, polyphenylene vinylene derivatives, polybenzothiazole derivatives, polyfluorene derivatives, and the like. These materials have relatively high mobility and facilitate the design of electron transport properties.

Examples of the dye material contained in the photoelectric conversion layer 25 include phthalocyanine derivatives, subphthalocyanine derivatives, quinacridone derivatives, naphthalocyanine derivatives, and squaraine derivatives. The photoelectric conversion layer 25 may contain a rhodamine-based dye, a cyanine-based dye, a coumarin-based dye, tris(8-hydroxyquinoline)aluminum (Alq3), or the like. The photoelectric conversion layer 25 may contain a plurality of materials, or may have a stacked structure. The photoelectric conversion layer 25 may also contain a material that does not directly contribute to photoelectric conversion.

Other layers may be provided between the photoelectric conversion layer 25 and the first electrode 21A (specifically, between the semiconductor layer 23 and the insulating layer 22) and between the photoelectric conversion layer 25 and the second electrode 26. Specifically, for example, a lower layer film, a hole transport layer, an electron blocking film, the photoelectric conversion layer 25, a hole blocking film, a buffer film, an electron transport layer, a work function adjusting film, and the like may be sequentially stacked from the first electrode 21A side.

The second electrode 26 is opposed to the first electrode 21A and the accumulation electrode 21B via the semiconductor layer 23, the barrier layer 24, and the photoelectric conversion layer 25 therebetween. The second electrode 26 is constituted by a conductive film having light transmissivity, similarly to the first electrode 21A and the accumulation electrode 21B. In the solid-state imaging device 1 using the solid-state imaging element 10 as one pixel, the second electrode 26 may be separated for each pixel, or may be formed as an electrode common to the pixels. The second electrode 26 has a thickness of, for example, 10nm to 200 nm.

The interlayer insulating layer 27s provided between the insulating layer 22 and the first surface 30A of the semiconductor substrate 30 is composed of, for example, a single-layer film of one of silicon oxide, silicon nitride, and silicon oxynitride (SiON), or a stacked-layer film of two or more kinds.

The dielectric layer 27y is provided between the interlayer insulating layer 27s and the first surface 30A of the semiconductor substrate 30. The dielectric layer 27y is formed of, for example, a silicon oxide film, TEOS, a silicon nitride film, a silicon oxynitride film, or the like, but the material of the dielectric layer 27y is not particularly limited.

The fixed charge layer 27k is disposed between the dielectric layer 27y and the first face 30A of the semiconductor substrate 30. The fixed charge layer 27k may be a film having a positive fixed charge, or may be a film having a negative fixed charge. Examples of the material of the film having a negative fixed charge include hafnium oxide, aluminum oxide, zirconium oxide, tantalum oxide, titanium oxide, and the like. As a material other than the above materials, lanthanum oxide, praseodymium oxide, cerium oxide, neodymium oxide, promethium oxide, samarium oxide, europium oxide, gadolinium oxide, terbium oxide, dysprosium oxide, holmium oxide, thulium oxide, ytterbium oxide, lutetium oxide, yttrium oxide, an aluminum nitride film, a hafnium oxynitride film, an aluminum oxynitride film, or the like can be used.

The fixed charge layer 27k may have a configuration in which two or more types of films are stacked. In the case of films having negative fixed charges, for example, this can further enhance the function as a hole accumulation layer.

The protective layer 51 is provided to cover the second electrode 26. The protective layer 51 is made of a material having light transmittance, and is made of a single-layer film of one kind of silicon oxide, silicon nitride, silicon oxynitride, or the like, or a stacked-layer film of two or more kinds thereof. The protective layer 51 has a thickness of, for example, 100nm to 30000 nm.

The on-chip lens 52 is disposed on the protective layer 51. The on-chip lens 52 is, for example, disposed at a region opposed to the accumulation electrode 21B, and condenses incident light onto the photoelectric conversion layer 25 of the portion opposed to the accumulation electrode 21B. The solid-state imaging element 10 may be provided with a pad (not shown).

The semiconductor substrate 30 is composed of, for example, an n-type silicon (Si) substrate, and includes a p-well 31 in a predetermined region. On the second face 30B side of the p-well 31, the above-described transfer transistors Tr2 and Tr3, the amplification transistor AMP, the reset transistor RST, the selection transistor SEL, and the like are provided. In addition, a peripheral circuit (not shown) including a logic circuit and the like is provided at a peripheral portion of the semiconductor substrate 30.

Fig. 7 is an equivalent circuit diagram of the solid-state imaging element 10, and fig. 8 shows the configuration of the transistor and the first electrode 21A and the accumulation electrode 21B. The structure of the semiconductor substrate 30 will be described with reference to fig. 7 and 8 and fig. 1.

The reset transistor RST (reset transistor TR1rst) resets the electric charges transferred from the organic photoelectric conversion portion 20 to the floating diffusion FD1, and is constituted by, for example, a MOS transistor. Specifically, the reset transistor TR1rst is composed of a reset gate Grst, a channel formation region 36A, and source/drain regions 36B and 36C. The reset gate Grst is connected to the reset line RST1, and one source/drain region 36B of the reset transistor TR1rst doubles as the floating diffusion FD1. The other source/drain region 36C constituting the reset transistor TR1rst is connected to the power supply VDD.

The amplification transistor AMP is a modulation element that modulates the amount of charge generated in the organic photoelectric conversion portion 20 into a voltage, and is constituted by, for example, a MOS transistor. Specifically, the amplification transistor AMP is composed of a gate electrode Gamp, a channel formation region 35A, and source/drain regions 35B and 35C. The gate electrode Gamp is connected to the first electrode 21A and to one source/drain region 36B (floating diffusion FD1) of the reset transistor TR1rst via the lower first contact 45, the connection portion 41A, the lower second contact 46, and the through electrode 34. In addition, one source/drain region 35B shares one region with the other source/drain region 36C constituting the reset transistor TR1rst, and is connected to the power supply VDD.

The selection transistor SEL (selection transistor TR1sel) is composed of a gate Gsel, a channel formation region 34A, and source/drain regions 34B and 34C. The gate Gsel is connected to the select line SEL1. In addition, one source/drain region 34B shares one region with the other source/drain region 35C constituting the amplification transistor AMP. The other source/drain region 34C is connected to the signal line (data output line) VSL1.

The inorganic photoelectric conversion portions 32B and 32R each have a pn junction in a predetermined region of the semiconductor substrate 30. The inorganic photoelectric conversion portions 32B and 32R separate light in the vertical direction by exploiting the fact that the wavelength of the light absorbed in the silicon substrate differs with the depth of incidence. The inorganic photoelectric conversion portion 32B selectively detects blue light to accumulate signal charges corresponding to blue, and is provided at a depth at which efficient photoelectric conversion of blue light is possible. The inorganic photoelectric conversion portion 32R selectively detects red light to accumulate signal charges corresponding to red, and is provided at a depth at which efficient photoelectric conversion of red light is possible. It should be noted that blue (B) is, for example, a color corresponding to a wavelength region of 450 nm to 495 nm, and red (R) is, for example, a color corresponding to a wavelength region of 620 nm to 750 nm. It is sufficient that the inorganic photoelectric conversion portions 32B and 32R can detect part or all of the light in the respective wavelength regions.
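The band assignment above can be expressed as a small lookup. In the sketch below, the blue and red bands follow the wavelength regions given in the text, while the green band assigned to the organic photoelectric conversion portion 20 is an assumed illustrative range that this passage does not specify.

```python
# Wavelength bands (nm). Blue and red follow the ranges given in the text;
# the green band for the organic photoelectric conversion portion 20 is an
# assumed, illustrative range.
BANDS = {
    "green (organic portion 20)": (495, 570),
    "blue (inorganic portion 32B)": (450, 495),
    "red (inorganic portion 32R)": (620, 750),
}


def absorbing_portion(wavelength_nm: float) -> str:
    """Return which photoelectric conversion portion detects the wavelength."""
    for name, (lo, hi) in BANDS.items():
        if lo <= wavelength_nm < hi:
            return name
    return "none (outside modeled bands)"
```

Half-open intervals keep the 495 nm boundary between blue and green unambiguous in this toy model.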

The inorganic photoelectric conversion portion 32B includes, for example, a p+ region serving as a hole accumulation layer and an n region serving as an electron accumulation layer. The inorganic photoelectric conversion portion 32R includes, for example, a p+ region serving as a hole accumulation layer and an n region serving as an electron accumulation layer (giving it a p-n-p stacked structure). The n region of the inorganic photoelectric conversion portion 32B is connected to the vertical transfer transistor Tr2. The p+ region of the inorganic photoelectric conversion portion 32B is bent along the transfer transistor Tr2 and leads to the p+ region of the inorganic photoelectric conversion portion 32R.

The transfer transistor Tr2 (transfer transistor TR2trs) is provided for transferring the signal charges (here, electrons) corresponding to blue, generated and accumulated in the inorganic photoelectric conversion portion 32B, to the floating diffusion FD2. Since the inorganic photoelectric conversion portion 32B is formed at a position deep from the second surface 30B of the semiconductor substrate 30, the transfer transistor TR2trs of the inorganic photoelectric conversion portion 32B is preferably formed of a vertical transistor. In addition, the transfer transistor TR2trs is connected to the transfer gate line TG2. Further, the floating diffusion FD2 is provided in a region 37C near the gate Gtrs2 of the transfer transistor TR2trs. The electric charges accumulated in the inorganic photoelectric conversion portion 32B are read out to the floating diffusion FD2 via a transfer channel formed along the gate Gtrs2.

The transfer transistor Tr3 (transfer transistor TR3trs) transfers the signal charges (here, electrons) corresponding to red, generated and accumulated in the inorganic photoelectric conversion portion 32R, to the floating diffusion FD3, and is constituted by, for example, a MOS transistor. In addition, the transfer transistor TR3trs is connected to the transfer gate line TG3. Further, the floating diffusion FD3 is provided in a region 38C near the gate Gtrs3 of the transfer transistor TR3trs. The electric charges accumulated in the inorganic photoelectric conversion portion 32R are read out to the floating diffusion FD3 via a transfer channel formed along the gate Gtrs3.

On the second face 30B side of the semiconductor substrate 30, a reset transistor TR2rst, an amplification transistor TR2amp, and a selection transistor TR2sel which constitute a controller of the inorganic photoelectric conversion portion 32B are also provided. In addition, a reset transistor TR3rst, an amplification transistor TR3amp, and a selection transistor TR3sel which constitute a controller of the inorganic photoelectric conversion portion 32R are provided.

The reset transistor TR2rst is composed of a gate electrode, a channel formation region, and source/drain regions. The gate of the reset transistor TR2rst is connected to the reset line RST2, and one source/drain region of the reset transistor TR2rst is connected to the power supply VDD. The other source/drain region of the reset transistor TR2rst serves as the floating diffusion FD2.

The amplification transistor TR2amp is composed of a gate, a channel formation region, and source/drain regions. The gate is connected to the other source/drain region (floating diffusion FD2) of the reset transistor TR2rst. In addition, one source/drain region constituting the amplification transistor TR2amp shares one region with one source/drain region constituting the reset transistor TR2rst, and is connected to the power supply VDD.

The selection transistor TR2sel is composed of a gate, a channel formation region, and source/drain regions. The gate is connected to the select line SEL2. In addition, one source/drain region constituting the selection transistor TR2sel shares one region with the other source/drain region constituting the amplification transistor TR2amp. The other source/drain region constituting the selection transistor TR2sel is connected to the signal line (data output line) VSL2.

The reset transistor TR3rst is composed of a gate, a channel formation region, and source/drain regions. The gate of the reset transistor TR3rst is connected to the reset line RST3, and one source/drain region constituting the reset transistor TR3rst is connected to the power supply VDD. The other source/drain region constituting the reset transistor TR3rst doubles as the floating diffusion FD3.

The amplification transistor TR3amp is composed of a gate, a channel formation region, and source/drain regions. The gate is connected to the other source/drain region (floating diffusion FD3) constituting the reset transistor TR3rst. In addition, one source/drain region constituting the amplification transistor TR3amp shares one region with one source/drain region constituting the reset transistor TR3rst, and is connected to the power supply VDD.

The selection transistor TR3sel is composed of a gate, a channel formation region, and source/drain regions. The gate is connected to the select line SEL3. In addition, one source/drain region constituting the selection transistor TR3sel shares one region with the other source/drain region constituting the amplification transistor TR3amp. The other source/drain region constituting the selection transistor TR3sel is connected to the signal line (data output line) VSL3.

The reset lines RST1, RST2, and RST3, the select lines SEL1, SEL2, and SEL3, and the transfer gate lines TG2 and TG3 are each connected to the vertical drive circuit 112 constituting a drive circuit. The signal lines (data output lines) VSL1, VSL2, and VSL3 are connected to the column signal processing circuit 113 constituting the drive circuit.

The lower first contact 45, the lower second contact 46, the upper first contact 29A, and the upper second contact 29B are each composed of a doped silicon material such as PDAS (phosphorus-doped amorphous silicon) or a metal material such as aluminum (Al), tungsten (W), titanium (Ti), cobalt (Co), hafnium (Hf), or tantalum (Ta).

(1-2. method for manufacturing solid-state imaging element)

The solid-state imaging element 10 can be manufactured, for example, as follows (figs. 9 to 14).

First, as shown in fig. 9, for example, a p-well 31 is formed as a well of a first conductivity type in the semiconductor substrate 30, and inorganic photoelectric conversion portions 32B and 32R of a second conductivity type (for example, n-type) are formed in the p-well 31. A p+ region is formed near the first surface 30A of the semiconductor substrate 30.

As also shown in fig. 9, for example, n+ regions serving as the floating diffusions FD1 to FD3 are formed on the second surface 30B of the semiconductor substrate 30, and then a gate insulating layer 33 and a gate wiring layer 47, including the respective gates of the transfer transistor Tr2, the transfer transistor Tr3, the selection transistor SEL, the amplification transistor AMP, and the reset transistor RST, are formed. This results in the formation of the transfer transistor Tr2, the transfer transistor Tr3, the selection transistor SEL, the amplification transistor AMP, and the reset transistor RST. Further, the wiring layers 41 to 43 (multilayer wiring 40), including the lower first contact 45, the lower second contact 46, and the connection portion 41A, and the insulating layer 44 are formed on the second surface 30B of the semiconductor substrate 30.

As a base of the semiconductor substrate 30, for example, an SOI (silicon on insulator) substrate in which the semiconductor substrate 30, a buried oxide film (not shown), and a holding substrate (not shown) are laminated is used. Although not shown in fig. 9, the buried oxide film and the holding substrate are bonded to the first face 30A of the semiconductor substrate 30. After the ion implantation, an annealing process is performed.

Next, a support substrate (not shown) or another semiconductor substrate or the like is bonded to the second face 30B side (the multilayer wiring 40 side) of the semiconductor substrate 30, and the substrate is turned upside down. Subsequently, the semiconductor substrate 30 is separated from the buried oxide film of the SOI substrate and the holding substrate to expose the first surface 30A of the semiconductor substrate 30. The above steps may be performed by a technique used in a general CMOS process such as ion implantation and CVD (chemical vapor deposition).

Next, as shown in fig. 10, the semiconductor substrate 30 is processed from the first face 30A side by, for example, dry etching to form an annular opening 34H. As shown in fig. 10, with respect to the depth, the opening 34H penetrates from the first surface 30A to the second surface 30B of the semiconductor substrate 30 and reaches, for example, the connection portion 41A.

Subsequently, for example, the negative fixed charge layer 27k is formed on the first face 30A of the semiconductor substrate 30 and the side face of the opening 34H. Two or more types of films may be stacked as the negative fixed charge layer 27k, which makes it possible to further enhance the function as a hole accumulation layer. After the negative fixed charge layer 27k is formed, the dielectric layer 27y is formed. Next, after the link wirings 39A and 39B are formed at predetermined positions on the dielectric layer 27y, an interlayer insulating layer 27s including, for example, an SiO film, in which the upper first contact 29A and the upper second contact 29B are buried on the link wirings 39A and 39B, is formed using a photolithography method and a CMP (chemical mechanical polishing) method.

Subsequently, as shown in fig. 11, a conductive film 21y is formed on the upper first contact 29A, the upper second contact 29B, and the interlayer insulating layer 27s, and then a photoresist PR is formed at a predetermined position of the conductive film 21y. Thereafter, as shown in fig. 12, the first electrode 21A and the accumulation electrode 21B are formed by etching the conductive film 21y and then removing the photoresist PR.

Then, for example, an SiO film is formed on the interlayer insulating layer 27s, the first electrode 21A, and the accumulation electrode 21B, and is planarized using a CMP method. Subsequently, the insulating layer 22 is formed as a film on the interlayer insulating layer 27s, the first electrode 21A, and the accumulation electrode 21B, using, for example, an ALD (atomic layer deposition) method.

Next, as shown in fig. 13, a photoresist PR is formed on a region facing the first electrode 21A, and the insulating layer 22 is etched using, for example, a dry etching method. This allows the opening 22H to be formed.

Subsequently, as shown in fig. 14, the semiconductor layer 23, the barrier layer 24, the photoelectric conversion layer 25, and the second electrode 26 are sequentially formed on the insulating layer 22 and the first electrode 21A. Then, the protective layer 51 is formed. Thereafter, optical members such as a planarization layer and the on-chip lens 52 are disposed. Thus, the solid-state imaging element 10 shown in fig. 1 is completed.

(1-3. operation of solid-state imaging element)

In the solid-state imaging element 10, when light enters the organic photoelectric conversion portion 20 via the on-chip lens 52, the light passes through the organic photoelectric conversion portion 20 and the inorganic photoelectric conversion portions 32B and 32R in order, and photoelectric conversion is performed for each of green, blue, and red light in the course of the passage. Hereinafter, a description is given of the signal acquisition operation for each color.

(obtaining of Green Signal by organic photoelectric conversion portion 20)

First, green light among light that has been incident on the solid-state imaging element 10 is selectively detected (absorbed) by the organic photoelectric conversion portion 20, and photoelectric conversion is performed.

The organic photoelectric conversion portion 20 is connected to the gate electrode Gamp of the amplification transistor AMP and the floating diffusion FD1 via the through electrode 34. Therefore, electrons of the electron-hole pairs generated in the organic photoelectric conversion portion 20 are extracted from the first electrode 21A and the accumulation electrode 21B side, transported to the second surface 30B side of the semiconductor substrate 30 via the through electrode 34, and accumulated in the floating diffusion portion FD 1. Meanwhile, the amount of charge generated in the organic photoelectric conversion portion 20 is modulated into a voltage by the amplifying transistor AMP.

The reset gate Grst of the reset transistor RST is disposed adjacent to the floating diffusion FD 1. Accordingly, the charge accumulated in the floating diffusion FD1 is reset by the reset transistor RST.

Here, the organic photoelectric conversion portion 20 is connected not only to the amplification transistor AMP via the through electrode 34 but also to the floating diffusion portion FD1, and therefore, the electric charges accumulated in the floating diffusion portion FD1 can be easily reset by the reset transistor RST.

In contrast, in the case where the through electrode 34 and the floating diffusion FD1 are not connected to each other, it is difficult to reset the electric charges accumulated in the floating diffusion FD1, and a large voltage would have to be applied to pull the charges out to the second electrode 26 side. The photoelectric conversion layer 25 may therefore be damaged. In addition, a structure that enables reset in a short time increases dark noise, resulting in a tradeoff; such a structure is therefore difficult to adopt.

Accumulation and transfer of signal charges by the first electrode 21A and the accumulation electrode 21B are explained with reference to fig. 15.

In the solid-state imaging element 10, changing the potential applied to the accumulation electrode 21B causes charge accumulation and transfer. During the accumulation period, a positive potential V1 is applied from the drive circuit to the accumulation electrode 21B. This causes an electric field of a certain magnitude or more to be applied to the barrier layer 24, and thus the charges (here, electrons) generated in the photoelectric conversion layer 25 move from the photoelectric conversion layer 25 to the semiconductor layer 23 via the barrier layer 24 and are accumulated in the semiconductor layer 23 at a portion opposite to the accumulation electrode 21B (accumulation period). The holes generated in the photoelectric conversion layer 25 are discharged via the second electrode 26.

In the latter part of the accumulation period, a reset operation is performed. Specifically, the scanning section changes the voltage of the reset signal RST from a low level to a high level at a predetermined timing. This puts the reset transistor TR1rst into an on state in the unit pixel P; as a result, the voltage of the floating diffusion FD1 is set to the power supply voltage VDD, and the voltage of the floating diffusion FD1 is reset.

After the reset operation is completed, readout of the signal charges is performed. In reading out the signal charges, a potential V2 is applied from the drive circuit to the first electrode 21A. For the potential V2, V2 < V1 holds, and the potential V2 may be a negative potential. Here, the barrier layer 24 serves as an insulating layer. Application of the potential V2 to the first electrode 21A causes the signal charges (here, electrons) accumulated in the portion of the semiconductor layer 23 opposite to the accumulation electrode 21B to be read out to the floating diffusion FD1 via the first electrode 21A. That is, the signal charges accumulated in the semiconductor layer 23 are read out by the controller (transfer period).
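The accumulate/read-out drive described above can be modeled as a toy state machine. Everything in the sketch below is illustrative: the potentials are arbitrary numbers that only respect the ordering V2 < V1 stated in the text, and the charge bookkeeping is idealized (complete transfer, no loss).

```python
class OrganicPixelModel:
    """Toy model of the accumulate/read-out drive of one pixel.

    Potentials are in arbitrary units; the model only enforces the ordering
    relation stated in the text (read-out potential V2 < accumulation
    potential V1) and idealizes the charge transfer as complete.
    """

    def __init__(self) -> None:
        self.semiconductor_charge = 0   # electrons under the accumulation electrode 21B
        self.floating_diffusion = 0     # electrons read out to FD1
        self._v1 = 0.0                  # last accumulation potential applied

    def accumulate(self, v1: float, photogenerated: int) -> None:
        # A positive potential V1 on the accumulation electrode 21B pulls
        # photogenerated electrons across the barrier layer 24 into the
        # semiconductor layer 23.
        assert v1 > 0, "accumulation requires a positive potential V1"
        self._v1 = v1
        self.semiconductor_charge += photogenerated

    def read_out(self, v2: float) -> int:
        # Applying V2 (< V1, possibly negative) to the first electrode 21A
        # transfers the accumulated charge to the floating diffusion FD1.
        assert v2 < self._v1, "read-out requires V2 < V1"
        self.floating_diffusion += self.semiconductor_charge
        self.semiconductor_charge = 0
        return self.floating_diffusion


pixel = OrganicPixelModel()
pixel.accumulate(v1=3.0, photogenerated=100)  # accumulation period
assert pixel.read_out(v2=-1.0) == 100         # transfer period
assert pixel.semiconductor_charge == 0
```

The usage lines at the bottom walk one pixel through a single accumulation period followed by a transfer period.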

In addition, as shown in fig. 16, global shutter driving is also possible in the solid-state imaging element 10.

First, in the accumulation period, a predetermined potential V3 is applied from the drive circuit to the accumulation electrode 21B. Here, the barrier layer 24 functions as an insulating layer, and the charges (here, electrons) generated in the photoelectric conversion layer 25 are accumulated in the photoelectric conversion layer 25 at a portion opposite to the accumulation electrode 21B (accumulation period). The holes generated in the photoelectric conversion layer 25 are discharged via the second electrode 26.

In the subsequent transfer period (first transfer period), a predetermined potential V4 is applied from the drive circuit to the accumulation electrode 21B. For the potential V4, V3 < V4 holds. This applies an electric field of a certain magnitude or more to the barrier layer 24, so that the signal charges accumulated in the photoelectric conversion layer 25 are transferred to the semiconductor layer 23 via the barrier layer 24 at once for all the pixels P (first transfer period).

The signal charges transferred to the semiconductor layer 23 are held for a certain time in the portion of the semiconductor layer 23 opposed to the accumulation electrode 21B (storage period). Thereafter, the signal charges are read out as necessary. In reading out the signal charges, a potential V5 is applied from the drive circuit to the accumulation electrode 21B. For the potential V5, V5 < V3 holds. The potential V5 may be a negative potential. Here, the barrier layer 24 serves as an insulating layer. Application of the potential V5 causes the signal charges (here, electrons) held in the portion of the semiconductor layer 23 opposed to the accumulation electrode 21B to be read out to the floating diffusion FD1 via the first electrode 21A (second transfer period).
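The global shutter sequence above (simultaneous first transfer, storage, then sequential second transfer) can be sketched as a bookkeeping model. The function name and pixel values are invented for illustration; the point is only that every pixel is sampled at the same instant.

```python
# Hypothetical sketch of global-shutter driving: charges accumulate in the
# photoelectric conversion layer, cross the barrier layer for all pixels at
# once, are stored, and are then read out row by row.

def global_shutter_readout(photo_charges):
    """photo_charges: 2-D list of electrons accumulated per pixel under V3."""
    # First transfer period: the higher potential V4 is applied to every
    # accumulation electrode simultaneously, moving all charges across the
    # barrier layer at the same instant.
    stored = [[q for q in row] for row in photo_charges]

    # Storage period: charges wait in the semiconductor layer; the barrier
    # layer (insulating again) keeps later photo-charges from mixing in.

    # Second transfer period: rows are read out to FD1 one at a time.
    return [row[:] for row in stored]

frame = global_shutter_readout([[5, 7], [2, 9]])
print(frame)  # [[5, 7], [2, 9]]
```

Since the snapshot is taken in the simultaneous first transfer, the sequential second transfer cannot introduce row-to-row time skew: all pixels hold the same exposure window.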

(Acquisition of blue and red signals by the inorganic photoelectric conversion portions 32B and 32R)

Of the light transmitted through the organic photoelectric conversion portion 20, blue light and red light are absorbed and photoelectrically converted by the inorganic photoelectric conversion portion 32B and the inorganic photoelectric conversion portion 32R, respectively, in this order. In the inorganic photoelectric conversion portion 32B, electrons corresponding to the incident blue light are accumulated in the n region of the inorganic photoelectric conversion portion 32B, and the accumulated electrons are transferred to the floating diffusion FD2 by the transfer transistor Tr2. Similarly, in the inorganic photoelectric conversion portion 32R, electrons corresponding to the incident red light are accumulated in the n region of the inorganic photoelectric conversion portion 32R, and the accumulated electrons are transferred to the floating diffusion FD3 by the transfer transistor Tr3.

(1-3. Action and effect)

In the present embodiment, the signal charges generated in the photoelectric conversion layer 25 are accumulated in the portion of the semiconductor layer 23 opposed to the accumulation electrode 21B. The accumulated signal charges are transferred to and read out from the first electrode 21A. That is, as in the inorganic photoelectric conversion portions 32B and 32R, the signal charges in the organic photoelectric conversion portion 20 are also accumulated once and then read out to the floating diffusion FD1. This makes it possible to reset the floating diffusion FD1 immediately before the transfer of the signal charges. Therefore, noise components can be removed, thereby improving the quality of the captured image.

In addition, in the present embodiment, the barrier layer 24 is provided between the semiconductor layer 23 and the photoelectric conversion layer 25, thereby suppressing transfer failure of the signal charges accumulated in the semiconductor layer 23. This is described below with reference to a comparative example (comparative example 1).

Fig. 17 illustrates a schematic cross-sectional configuration of a main portion of a solid-state imaging element (solid-state imaging element 100) according to comparative example 1. The solid-state imaging element 100 is not provided with a barrier layer (the barrier layer 24 in fig. 1), and the semiconductor layer 23 and the photoelectric conversion layer 25 are in contact with each other. In such a solid-state imaging element 100, the signal charges accumulated in the semiconductor layer 23 are more likely to flow back into the photoelectric conversion layer 25, which has lower mobility. Such backflow of the signal charges causes transfer failure of the signal charges.

In contrast, in the present embodiment, the barrier layer 24 is provided between the semiconductor layer 23 and the photoelectric conversion layer 25. When the potential V2 is applied to the accumulation electrode 21B (fig. 15), the barrier layer 24 functions as an insulating layer, and movement of signal charges from the semiconductor layer 23 across the barrier layer 24 is therefore suppressed. That is, the signal charges are less likely to flow back from the semiconductor layer 23 into the photoelectric conversion layer 25. This suppresses transfer failure of the signal charges accumulated in the semiconductor layer 23.

Further, in the present embodiment, the barrier layer 24 is provided between the semiconductor layer 23 and the photoelectric conversion layer 25, thereby making it possible to accumulate signal charges in the photoelectric conversion layer 25 during global shutter driving (accumulation period in fig. 16). The signal charges accumulated in the photoelectric conversion layer 25 are transferred to the semiconductor layer 23 at once for all the pixels and temporarily held in the semiconductor layer 23 (storage period in fig. 16). In this way, accumulation, transfer, and holding (storage) of the signal charges are performed along the lamination direction of the semiconductor layer 23, the barrier layer 24, and the photoelectric conversion layer 25, thereby making it possible to realize global shutter driving without reducing the aperture ratio. This is described below with reference to a comparative example (comparative example 2).

Fig. 18 shows a schematic cross-sectional configuration of a main portion of a solid-state imaging element (solid-state imaging element 101) according to comparative example 2. Like the solid-state imaging element 100, the solid-state imaging element 101 is not provided with a barrier layer (the barrier layer 24 in fig. 1). The solid-state imaging element 101 includes a storage electrode (storage electrode 21M) and is configured to enable global shutter driving. For example, the storage electrode 21M is arranged between the first electrode 21A and the accumulation electrode 21B. The portions of the semiconductor layer 23 opposed to the first electrode 21A and the storage electrode 21M are covered with a light-shielding film (light-shielding film 54).

In the solid-state imaging element 101, the signal charges generated in the photoelectric conversion layer 25 move to the semiconductor layer 23 and are accumulated in the portion of the semiconductor layer 23 opposed to the accumulation electrode 21B. Thereafter, the signal charges are transferred within the semiconductor layer 23, at once for all the pixels, from the portion opposed to the accumulation electrode 21B to the portion opposed to the storage electrode 21M, and are held there. That is, accumulation, transfer, and holding (storage) of the signal charges are performed along the planar direction of the semiconductor layer 23, which requires the portion of the semiconductor layer 23 opposed to the storage electrode 21M to be shielded from light. Accordingly, providing the global shutter function reduces the aperture ratio.

In contrast, in the present embodiment, as described above, accumulation, transfer, and holding (storage) of the signal charges are performed along the lamination direction of the semiconductor layer 23, the barrier layer 24, and the photoelectric conversion layer 25, thereby making it possible to realize global shutter driving without reducing the aperture ratio.

As described above, in the solid-state imaging element 10 according to the present embodiment, the barrier layer 24 is provided between the semiconductor layer 23 and the photoelectric conversion layer 25, thereby making it possible to suppress transfer failure of the signal charges accumulated in the semiconductor layer 23. Therefore, the element characteristics can be improved.

In addition, in the present embodiment, providing the barrier layer 24 allows accumulation, transfer, and holding (storage) of the signal charges to be performed along the lamination direction of the semiconductor layer 23, the barrier layer 24, and the photoelectric conversion layer 25, thereby making it possible to realize global shutter driving without reducing the aperture ratio.

Modifications of the foregoing first embodiment and other embodiments are described below. Constituent elements similar to those of the foregoing first embodiment are denoted by the same reference numerals, and descriptions thereof are omitted as appropriate.

<2. modification 1>

Fig. 19 schematically shows a sectional configuration of a main portion of a solid-state imaging element (solid-state imaging element 10A) according to modification 1 of the foregoing first embodiment. The solid-state imaging element 10A is a front-surface illumination type solid-state imaging element; light enters the semiconductor substrate 30 from the second face 30B side. Except for this point, the solid-state imaging element 10A has a similar configuration and effect to the solid-state imaging element 10.

The connection portion 41A, the multilayer wiring 40, the interlayer insulating layer 27s, and the like are provided between the second surface 30B of the semiconductor substrate 30 and the organic photoelectric conversion portion 20 (the first electrode 21A and the accumulation electrode 21B). The first electrode 21A is connected to the connection portion 41A via a first contact 29A and a connection wiring 39A provided in an upper portion in the interlayer insulating layer 27 s. That is, the front-surface illumination type solid-state imaging element 10A eliminates the need for the through-electrode (the through-electrode 34 in fig. 1) of the semiconductor substrate 30.

<3. modification 2>

Fig. 20 schematically shows a sectional configuration of a main portion of a solid-state imaging element (solid-state imaging element 10B) according to modification 2 of the foregoing first embodiment. The solid-state imaging element 10B has one inorganic photoelectric conversion portion (inorganic photoelectric conversion portion 32C) within the semiconductor substrate 30. Except for this point, the solid-state imaging element 10B has a similar configuration and effect to the solid-state imaging element 10.

The inorganic photoelectric conversion portion 32C is a portion that performs photoelectric conversion on light transmitted through the color filter layer 53 and the organic photoelectric conversion portion 20. The solid-state imaging element 10B may be provided with an inorganic photoelectric conversion portion 32C that detects light beams of different colors. The solid-state imaging element 10B includes, for example, a color filter layer 53 between the organic photoelectric conversion section 20 and the on-chip lens 52, and the color filter layer 53 is arranged at a position opposing the inorganic photoelectric conversion section 32C. The color filter layer 53 may be disposed between the semiconductor substrate 30 and the organic photoelectric conversion portion 20 (not shown).

In this way, one inorganic photoelectric conversion portion (inorganic photoelectric conversion portion 32C) may be provided in the semiconductor substrate 30. In the solid-state imaging element 10B, light of any wavelength can be selectively utilized for each pixel.

As shown in fig. 21, the solid-state imaging element 10B may be a front-surface illumination type solid-state imaging element.

<4. modification 3>

Fig. 22 schematically shows a sectional configuration of a main portion of a solid-state imaging element (solid-state imaging element 10C) according to modification 3 of the foregoing first embodiment. In the solid-state imaging element 10C, no inorganic photoelectric conversion portion (the inorganic photoelectric conversion portions 32R and 32B in fig. 1 or the inorganic photoelectric conversion portion 32C in fig. 20) is provided within the semiconductor substrate 30. Except for this point, the solid-state imaging element 10C has a similar configuration and effect to the solid-state imaging element 10.

The solid-state imaging element 10C has a color filter layer 53, similarly to the solid-state imaging element 10B of the foregoing modification 2. The color filter layer 53 may be provided between the organic photoelectric conversion section 20 and the on-chip lens 52 (fig. 20), or may be provided between the semiconductor substrate 30 and the organic photoelectric conversion section 20 (not shown). In such a solid-state imaging element 10C, light of any wavelength can be selectively utilized for each pixel.

In addition, since the inorganic photoelectric conversion portion is not provided in the semiconductor substrate 30, the degree of freedom in selecting the conductive material constituting the first electrode 21A and the accumulation electrode 21B can be improved. Specifically, the conductive material constituting the first electrode 21A and the accumulation electrode 21B is not limited to the light transmissive conductive material; more general metallic materials may be used.

Further, in the back-side illumination type solid-state imaging element 10C, the semiconductor substrate 30 can be used to configure a stacked imaging element.

As shown in fig. 23, the solid-state imaging element 10C may be a front-surface illumination type solid-state imaging element. In the solid-state imaging element 10C, a circuit for enhancing functionality may be provided in the portions of the semiconductor substrate 30 opposed to the first electrode 21A and the accumulation electrode 21B.

<5. modification 4>

Fig. 24 schematically shows a sectional configuration of a main portion of a solid-state imaging element (solid-state imaging element 10D) according to modification 4 of the foregoing first embodiment. The solid-state imaging element 10D includes, in addition to the first electrode 21A and the accumulation electrode 21B, a transfer electrode 21C as an electrode opposed to the second electrode 26 via the semiconductor layer 23 therebetween. The transfer electrode 21C is provided to control the movement of signal charges in the semiconductor layer 23. Except for this point, the solid-state imaging element 10D has a similar configuration and effect to the solid-state imaging element 10.

Fig. 25 shows a planar configuration of the first electrode 21A, the accumulation electrode 21B, and the transfer electrode 21C. The first electrode 21A, the accumulation electrode 21B, and the transfer electrode 21C are provided separately from each other. The transfer electrode 21C has a planar shape of, for example, a quadrangle, and is disposed side by side with the first electrode 21A and the accumulation electrode 21B. The transfer electrode 21C is disposed between the first electrode 21A and the accumulation electrode 21B.

The transfer electrode 21C is provided to improve the efficiency of transferring the signal charges accumulated under the accumulation electrode 21B to the first electrode 21A, and is opposed to the semiconductor layer 23 via the insulating layer 22 therebetween. The transfer electrode 21C is connected to a pixel drive circuit (not shown) constituting the drive circuit, for example, via the third contact 29C and the connection wiring 39C in the upper portion. Voltages can be applied to the first electrode 21A, the accumulation electrode 21B, and the transfer electrode 21C independently of one another. For example, by adjusting the potential applied to the transfer electrode 21C, it is possible to prevent the signal charges accumulated in the portion of the semiconductor layer 23 opposed to the accumulation electrode 21B from unintentionally moving toward the first electrode 21A.
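The gating role of an independently biased transfer electrode can be sketched as follows. The threshold `v_open` and the potential values are assumptions for illustration, not values from the specification.

```python
# Hypothetical sketch: the transfer electrode 21C either blocks or passes
# charge moving from the accumulation electrode toward the first electrode,
# depending on its applied potential.

def step_charge(stored, v_transfer, v_open=1.0):
    """Return (charge reaching the first electrode, charge still stored)."""
    if v_transfer >= v_open:     # 21C biased to open the transfer path
        return stored, 0
    return 0, stored             # 21C blocks unintended movement

moved, kept = step_charge(80, v_transfer=0.0)
print(moved, kept)  # 0 80  (blocking bias: charge stays under 21B)
moved, kept = step_charge(80, v_transfer=1.5)
print(moved, kept)  # 80 0  (open bias: charge transfers to 21A)
```

The sketch shows why independent biasing matters: with 21C held at the blocking bias during accumulation, charge cannot drift to the first electrode prematurely.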

In this way, in the solid-state imaging element 10D, the transfer electrode 21C between the first electrode 21A and the accumulation electrode 21B can further improve the transfer efficiency of the signal charges accumulated in the semiconductor layer 23.

The solid-state imaging element 10D may be of a front-illuminated type (see fig. 19). One inorganic photoelectric conversion portion 32C may be provided in the semiconductor substrate 30 of the solid-state imaging element 10D (see fig. 20 and 21), or no inorganic photoelectric conversion portion may be provided in the semiconductor substrate 30 (see fig. 22 and 23).

<6. modification 5>

Fig. 26 schematically shows a sectional configuration of a main portion of a solid-state imaging element (solid-state imaging element 10E) according to modification 5 of the foregoing first embodiment. The solid-state imaging element 10E includes, in addition to the first electrode 21A and the accumulation electrode 21B, a discharge electrode 21D as an electrode opposed to the second electrode 26 via the semiconductor layer 23 therebetween. Except for this point, the solid-state imaging element 10E has a similar configuration and effect to the solid-state imaging element 10.

Fig. 27 shows a planar configuration of the first electrode 21A, the accumulation electrode 21B, and the discharge electrode 21D. The first electrode 21A, the accumulation electrode 21B, and the discharge electrode 21D are provided separately from each other. The discharge electrode 21D is provided, for example, so as to surround the first electrode 21A and the accumulation electrode 21B. The discharge electrode 21D is provided in common for a plurality of pixels, for example. The discharge electrode 21D may instead be provided separately for each pixel (not shown).

The discharge electrode 21D is provided in an opening of the insulating layer 22 and is electrically connected to the semiconductor layer 23. The discharge electrode 21D is provided to send, to the drive circuit, signal charges that are not sufficiently attracted to the accumulation electrode 21B, or excess signal charges (so-called overflow charges) generated beyond the transfer capability. The discharge electrode 21D is connected to a pixel drive circuit (not shown) constituting the drive circuit, for example, via the fourth contact 29D and the connection wiring 39D in the upper portion. Voltages can be applied to the first electrode 21A, the accumulation electrode 21B, and the discharge electrode 21D independently of one another.
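The overflow role of the discharge electrode can be illustrated with a simple saturation model. The capacity value is hypothetical; the sketch only shows that charge beyond what the storage node can hold is drained rather than retained.

```python
# Hypothetical overflow model for the discharge electrode 21D: charge beyond
# the storage node's capacity drains away instead of remaining in the
# semiconductor layer (and potentially leaking into neighbouring pixels).

def accumulate_with_drain(incoming, stored, capacity=1000):
    """Return (charge retained under the accumulation electrode, charge drained)."""
    total = stored + incoming
    drained = max(0, total - capacity)   # sent out via the discharge electrode
    return min(total, capacity), drained

stored, drained = accumulate_with_drain(1500, 0)
print(stored, drained)  # 1000 500
```

Under normal illumination (incoming charge within capacity) nothing is drained, so the anti-blooming path costs no signal.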

In this way, in the solid-state imaging element 10E, the discharge electrode 21D electrically connected to the semiconductor layer 23 enables excess signal charges generated in the semiconductor layer 23 to be discharged without remaining in the semiconductor layer 23.

The solid-state imaging element 10E may be of a front-illuminated type (see fig. 19). One inorganic photoelectric conversion portion 32C may be provided in the semiconductor substrate 30 of the solid-state imaging element 10E (see fig. 20 and 21), or no inorganic photoelectric conversion portion may be provided in the semiconductor substrate 30 (see fig. 22 and 23). The solid-state imaging element 10E may include the transfer electrode 21C together with the discharge electrode 21D (see fig. 24).

<7. modification 6>

Fig. 28 schematically shows a sectional configuration of a main portion of a solid-state imaging element (solid-state imaging element 10F) according to modification 6 of the foregoing first embodiment. The solid-state imaging element 10F includes, in addition to the first electrode 21A and the accumulation electrode 21B, a shield electrode 21E as an electrode opposed to the second electrode 26 via the semiconductor layer 23 therebetween. Except for this point, the solid-state imaging element 10F has a similar configuration and effect to the solid-state imaging element 10.

The first electrode 21A, the accumulation electrode 21B, and the shield electrode 21E are provided separately from each other. The shield electrode 21E is provided side by side with the first electrode 21A and the accumulation electrode 21B, and is arranged between accumulation electrodes 21B adjacent to each other. The shield electrode 21E is provided to suppress leakage of signal charges between adjacent accumulation electrodes 21B, and is opposed to the semiconductor layer 23 via the insulating layer 22 therebetween. The shield electrode 21E is connected to a pixel drive circuit (not shown) constituting the drive circuit, for example, via the fifth contact 29E and the connection wiring 39E in the upper portion. Voltages can be applied to the first electrode 21A, the accumulation electrode 21B, and the shield electrode 21E independently of one another.

In this way, in the solid-state imaging element 10F, the shield electrode 21E provided between the adjacent accumulation electrodes 21B can suppress leakage of the signal charge between the adjacent accumulation electrodes 21B.

The solid-state imaging element 10F may be of a front-illuminated type (see fig. 19). One inorganic photoelectric conversion portion 32C may be provided in the semiconductor substrate 30 of the solid-state imaging element 10F (see fig. 20 and 21), or no inorganic photoelectric conversion portion may be provided in the semiconductor substrate 30 (see fig. 22 and 23). The solid-state imaging element 10F may include the transfer electrode 21C (see fig. 24) or the discharge electrode 21D (see fig. 26) together with the shield electrode 21E.

<8. modification 7>

Fig. 29 schematically shows a sectional configuration of a main portion of a solid-state imaging element (solid-state imaging element 10G) according to modification 7 of the foregoing first embodiment. The solid-state imaging element 10G includes a light shielding film 54 covering the first electrode 21A via the photoelectric conversion layer 25 therebetween. Except for this point, the solid-state imaging element 10G has a similar configuration and effect to the solid-state imaging element 10.

The light shielding film 54 is provided, for example, between the second electrode 26 and the on-chip lens 52, and covers the portion of the photoelectric conversion layer 25 opposed to the first electrode 21A. This suppresses photoelectric conversion in the region of the photoelectric conversion layer 25 close to the first electrode 21A. Therefore, transfer of excess signal charges to the first electrode 21A can be suppressed. The light-shielding film 54 contains a metal such as tungsten (W) or aluminum (Al), and may be made of a single metal or an alloy. The light-shielding film 54 may contain a constituent material of a color filter layer (for example, the color filter layer 53 in fig. 20), and may have a laminated structure. A part of the accumulation electrode 21B may also be covered with the light shielding film 54.

In this way, in the solid-state imaging element 10G, the photoelectric conversion layer 25 of the portion opposed to the first electrode 21A is covered with the light shielding film 54, so that the transfer of an excessive signal charge to the first electrode 21A is suppressed.

The solid-state imaging element 10G may be of a front-illuminated type (see fig. 19). One inorganic photoelectric conversion portion 32C may be provided in the semiconductor substrate 30 of the solid-state imaging element 10G (see fig. 20 and 21), or no inorganic photoelectric conversion portion may be provided in the semiconductor substrate 30 (see fig. 22 and 23). The solid-state imaging element 10G may include a transfer electrode 21C (see fig. 24), a discharge electrode 21D (see fig. 26), or a shield electrode 21E (see fig. 28). A part of the transfer electrode 21C, the discharge electrode 21D, or the shield electrode 21E may be covered with a light shielding film 54.

<9. second embodiment>

Fig. 30 schematically shows a sectional configuration of a main portion of a solid-state imaging element (solid-state imaging element 60) according to a second embodiment of the present disclosure. In the solid-state imaging element 60, the junction surface (junction surface 20S) between the semiconductor layer 23 and the photoelectric conversion layer 25 functions as a potential barrier. That is, instead of a barrier layer (the barrier layer 24 in fig. 1), the solid-state imaging element 60 is provided with the junction surface 20S serving as a potential barrier. Except for this point, the solid-state imaging element 60 has a configuration and effects similar to those of the solid-state imaging element 10 of the foregoing first embodiment.

Fig. 31 shows an example of the potential energy of the semiconductor layer 23 and the photoelectric conversion layer 25. In this way, a potential barrier is formed at the junction surface 20S between the semiconductor layer 23 and the photoelectric conversion layer 25. The junction surface 20S is configured under the following conditions.

When the signal charges are electrons, the semiconductor layer 23 has a conduction band potential lower than the conduction band potential of the photoelectric conversion layer 25, and has a Fermi level lower than the Fermi level of the photoelectric conversion layer 25 (with the vacuum level as a reference). When the signal charges are holes, the semiconductor layer 23 has a valence band potential higher than the valence band potential of the photoelectric conversion layer 25, and has a Fermi level higher than the Fermi level of the photoelectric conversion layer 25.
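For the electron case, the two conditions can be written as a small check. The energy values (in eV, vacuum-level reference, more negative meaning lower) are invented example numbers, not material data from the specification.

```python
# Illustrative check of the band-alignment conditions for an
# electron-collecting junction surface. Example energies are hypothetical.

def barrier_ok_for_electrons(semi, pcl):
    # Conduction band (Ec) of the semiconductor layer below that of the
    # photoelectric conversion layer, and Fermi level (Ef) also below it.
    return semi["Ec"] < pcl["Ec"] and semi["Ef"] < pcl["Ef"]

semiconductor = {"Ec": -4.5, "Ef": -4.8}   # hypothetical semiconductor layer
photo_layer = {"Ec": -3.9, "Ef": -4.4}     # hypothetical photoelectric layer
print(barrier_ok_for_electrons(semiconductor, photo_layer))  # True
```

When both inequalities hold, electrons fall energetically into the semiconductor layer and the step back up to the photoelectric conversion layer forms the potential barrier of fig. 31.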

In this way, in the solid-state imaging element 60, the junction surface 20S between the semiconductor layer 23 and the photoelectric conversion layer 25 functions as a potential barrier, thereby making it possible to suppress transfer failure of the signal charges accumulated in the semiconductor layer 23, as in the above-described solid-state imaging element 10. Therefore, the element characteristics can be improved.

In addition, the junction surface 20S allows accumulation, transfer, and holding (storage) of the signal charges to be performed along the lamination direction of the semiconductor layer 23 and the photoelectric conversion layer 25, thereby making it possible to realize global shutter driving without reducing the aperture ratio.

The solid-state imaging element 60 may be of a front-illuminated type (see fig. 19). One inorganic photoelectric conversion portion 32C may be provided in the semiconductor substrate 30 of the solid-state imaging element 60 (see fig. 20 and 21), or no inorganic photoelectric conversion portion may be provided in the semiconductor substrate 30 (see fig. 22 and 23). The solid-state imaging element 60 may include the transfer electrode 21C (see fig. 24), the discharge electrode 21D (see fig. 26), or the shield electrode 21E (see fig. 28). The solid-state imaging element 60 may also be provided with the light shielding film 54 (see fig. 29).

< application example 1>

Fig. 32 shows the overall configuration of a solid-state imaging device (solid-state imaging device 1) in which the solid-state imaging element 10 described in the foregoing embodiments and the like (or any of the solid-state imaging elements 10A to 10G and 60; hereinafter collectively referred to as the solid-state imaging element 10) is used for each pixel. The solid-state imaging device 1 is a CMOS image sensor; it includes a pixel section 1a as an imaging region on the semiconductor substrate 30 and, in a peripheral region of the pixel section 1a, a peripheral circuit section 130 configured of, for example, a row scanning section 131, a horizontal selection section 133, a column scanning section 134, and a system control section 132.

The pixel section 1a includes a plurality of pixels P (solid-state imaging elements 10) two-dimensionally arranged in a matrix, for example. For example, a pixel driving line Lread (e.g., a row selection line and a reset control line) is wired to the pixels P in units of pixel rows, and a vertical signal line Lsig is wired to the pixels P in units of pixel columns. The pixel driving line Lread transmits a driving signal for reading out a signal from the pixel P. One end of the pixel drive line Lread is connected to output terminals corresponding to the respective rows in the row scanning section 131.

The row scanning section 131 is configured of a shift register, an address decoder, and the like, and is a pixel driving section that drives the pixels P of the pixel section 1a, for example, in units of rows. The signals output from the pixels P in a pixel row selectively scanned by the row scanning section 131 are supplied to the horizontal selection section 133 via the vertical signal lines Lsig. The horizontal selection section 133 is configured of an amplifier, a horizontal selection switch, and the like provided for each vertical signal line Lsig.

The column scanning section 134 is configured of a shift register, an address decoder, and the like, and sequentially drives the horizontal selection switches in the horizontal selection section 133 while scanning them. Through this selective scanning by the column scanning section 134, the signals of the respective pixels transmitted via the respective vertical signal lines Lsig are sequentially output to a horizontal signal line 135 and are input to a signal processing section or the like, not shown, through the horizontal signal line 135.
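The row-select / column-scan readout order described above can be sketched as two nested loops: the row scanning section selects one pixel row at a time, and the column scanning section then shifts each column's signal onto the horizontal signal line in sequence. A minimal sketch, assuming pixel signals are plain numbers:

```python
# Illustrative model of the readout order: row selection (outer loop) followed
# by column scanning of the horizontal selection switches (inner loop).

def scan(pixel_array):
    horizontal_line = []                  # signals placed on horizontal signal line 135
    for row in pixel_array:               # row scanning section selects one row
        for signal in row:                # column scanner drives each selection switch
            horizontal_line.append(signal)
    return horizontal_line

print(scan([[1, 2], [3, 4]]))  # [1, 2, 3, 4]
```

The output order (row-major) is what reaches the downstream signal processing section, one pixel row at a time.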

The system control section 132 receives a clock supplied from the outside, data indicating an operation mode, and the like, and outputs data such as internal information of the solid-state imaging device 1. The system control section 132 further includes a timing generator that generates various timing signals, and performs drive control of the row scanning section 131, the horizontal selection section 133, and the column scanning section 134 on the basis of the various timing signals generated by the timing generator.

< application example 2>

The above-described solid-state imaging device 1 is applicable to any type of electronic apparatus having an imaging function, for example, a camera system such as a digital still camera or a video camera, or a mobile phone having an imaging function. Fig. 33 shows a schematic configuration of an electronic apparatus 2 (camera) as an example. The electronic apparatus 2 is, for example, a camera capable of capturing still images or moving images, and includes the solid-state imaging device 1, an optical system (optical lens) 310, a shutter device 311, a driving section 313 that drives the solid-state imaging device 1 and the shutter device 311, and a signal processing section 312.

The optical system 310 guides image light (incident light) from a subject to the solid-state imaging device 1. The optical system 310 may be composed of a plurality of optical lenses. The shutter device 311 controls the light irradiation period and the light shielding period for the solid-state imaging device 1. The driving section 313 controls the transfer operation of the solid-state imaging device 1 and the shutter operation of the shutter device 311. The signal processing section 312 performs various types of signal processing on the signals output from the solid-state imaging device 1. The image signal Dout after the signal processing is stored in a storage medium such as a memory, or is output to a monitor or the like.

< application example 3>

< example of application to an in-vivo information acquisition system >

Further, the technology according to an embodiment of the present disclosure (the present technology) is applicable to various products. For example, the technology according to an embodiment of the present disclosure may be applied to an in-vivo information acquisition system that uses a capsule endoscope.

Fig. 34 is a block diagram showing an example of a schematic configuration of an in-vivo information acquisition system for a patient that uses a capsule endoscope, to which the technology according to an embodiment of the present disclosure (the present technology) can be applied.

The in-vivo information acquisition system 10001 includes a capsule endoscope 10100 and an external control device 10200.

At the time of examination, the patient swallows the capsule type endoscope 10100. The capsule-type endoscope 10100 has an imaging function and a wireless communication function, moves inside organs such as the stomach and the intestine due to peristaltic motion or the like until it is naturally excreted from the patient, sequentially captures images inside the organs (hereinafter, also referred to as in-vivo images) at predetermined intervals, and wirelessly sequentially transmits information on the in-vivo images to the external control device 10200 outside the body.

The external control device 10200 comprehensively controls the operation of the in-vivo information acquisition system 10001. In addition, the external control device 10200 receives information on the in-vivo image transmitted from the capsule endoscope 10100 and generates image data for displaying the in-vivo image on a display device (not shown) based on the received information on the in-vivo image.

In this way, the in-vivo information acquisition system 10001 can acquire an in-vivo image obtained by capturing the in-vivo state of the patient at any time from swallowing the capsule-type endoscope 10100 until it is excreted.

The configuration and functions of the capsule endoscope 10100 and the external control device 10200 will be described in more detail.

The capsule endoscope 10100 includes a capsule-type casing 10101, and the casing 10101 houses a light source section 10111, an imaging section 10112, an image processing section 10113, a wireless communication section 10114, a power feeding section 10115, a power supply section 10116, and a control section 10117.

The light source section 10111 includes a light source such as a Light Emitting Diode (LED), and emits light into the imaging field of view of the imaging section 10112.

The imaging section 10112 includes an imaging element and an optical system including a plurality of lenses provided in front of the imaging element. Reflected light (hereinafter referred to as observation light) of the light emitted to the body tissue as an observation target is condensed by the optical system and is incident on the imaging element. In the imaging section 10112, the imaging element photoelectrically converts the observation light incident thereon and generates an image signal corresponding to the observation light. The image signal generated by the imaging section 10112 is supplied to the image processing section 10113.

The image processing section 10113 includes a processor such as a Central Processing Unit (CPU) or a Graphics Processing Unit (GPU), and performs various types of signal processing on the image signal generated by the imaging section 10112. The image processing section 10113 supplies the image signal on which the signal processing is performed to the wireless communication section 10114 as RAW data.

The wireless communication section 10114 performs predetermined processing such as modulation processing on the image signal subjected to the signal processing by the image processing section 10113, and transmits the image signal to the external control device 10200 via the antenna 10114A. The wireless communication unit 10114 receives a control signal related to drive control of the capsule endoscope 10100 from the external control device 10200 via the antenna 10114A. The wireless communication unit 10114 supplies the control signal received from the external control device 10200 to the control unit 10117.

The power feeding section 10115 includes an antenna coil for power reception, a power regeneration circuit that regenerates electric power from a current generated in the antenna coil, a booster circuit, and the like. The power feeding section 10115 generates electric power using the principle of so-called non-contact charging.

The power supply section 10116 includes a secondary battery and stores the electric power generated by the power feeding section 10115. In fig. 34, in order to avoid complicating the drawing, arrows and the like indicating the supply destinations of the electric power from the power supply section 10116 are omitted; however, the electric power stored in the power supply section 10116 is supplied to the light source section 10111, the imaging section 10112, the image processing section 10113, the wireless communication section 10114, and the control section 10117, and can be used to drive these components.

The control section 10117 includes a processor such as a CPU, and appropriately controls the light source section 10111, the imaging section 10112, the image processing section 10113, the wireless communication section 10114, and the power feeding section 10115 according to a control signal transmitted from the external control device 10200.

The external control device 10200 includes a processor such as a CPU or a GPU, or a microprocessor, a control board, or the like on which the processor and a storage element such as a memory are mounted together. The external control device 10200 transmits a control signal to the control section 10117 of the capsule endoscope 10100 through the antenna 10400A, thereby controlling the operation of the capsule endoscope 10100. In the capsule endoscope 10100, for example, the conditions under which the light source section 10111 emits light to the observation target can be changed in accordance with a control signal from the external control device 10200. In addition, imaging conditions (for example, the frame rate, the exposure value, and the like in the imaging section 10112) can be changed according to a control signal from the external control device 10200. In addition, the details of processing in the image processing section 10113 and the conditions under which the wireless communication section 10114 transmits image signals (e.g., the transmission interval, the number of transmitted images, etc.) can be changed according to a control signal from the external control device 10200.

Further, the external control device 10200 performs various types of image processing on the image signal transmitted from the capsule endoscope 10100 and generates image data for displaying the captured in-vivo image on the display device. As the image processing, various types of known signal processing, for example, development processing (demosaicing processing), high image quality processing (band enhancement processing, super-resolution processing, Noise Reduction (NR) processing, image stabilization processing, and/or the like), and/or enlargement processing (electronic zoom processing), and the like, may be performed. The external control device 10200 controls the driving of the display device to display the captured in-vivo image based on the generated image data. Alternatively, the external control device 10200 may cause a recording device (not shown) to record the generated image data, or cause a printing device (not shown) to print out the generated image data.
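The processing chain listed above (development, high-image-quality processing, and enlargement) can be strung together in a toy sketch. The function names and the deliberately simplistic algorithms are assumptions for illustration, not the processing actually used by the external control device 10200.

```python
# Toy versions of three stages named in the text: development
# (demosaicing), noise reduction, and electronic zoom.

def demosaic_nearest(bayer):
    """Toy development: expand each 2x2 RGGB Bayer cell into one RGB
    pixel (quarter-resolution demosaic)."""
    rgb = []
    for y in range(0, len(bayer), 2):
        row = []
        for x in range(0, len(bayer[0]), 2):
            r = bayer[y][x]
            g = (bayer[y][x + 1] + bayer[y + 1][x]) / 2  # average two greens
            b = bayer[y + 1][x + 1]
            row.append((r, g, b))
        rgb.append(row)
    return rgb

def noise_reduce(rgb, strength=0.5):
    """Toy NR: blend each pixel toward the per-channel image mean."""
    flat = [p for row in rgb for p in row]
    mean = tuple(sum(p[c] for p in flat) / len(flat) for c in range(3))
    return [[tuple(p[c] + strength * (mean[c] - p[c]) for c in range(3))
             for p in row] for row in rgb]

def electronic_zoom(rgb, factor=2):
    """Toy enlargement: pixel replication by an integer factor."""
    out = []
    for row in rgb:
        wide = [p for p in row for _ in range(factor)]
        out.extend([wide] * factor)
    return out
```

A real pipeline would use interpolating demosaic and edge-preserving filtering; the point here is only the order of the stages.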

The above has explained an example of an in-vivo information acquisition system to which the technology according to the embodiments of the present disclosure can be applied. In the above-described configuration, the technique according to the embodiment of the present disclosure can be applied to the imaging section 10112. This makes it possible to improve the detection accuracy.

< application example 4>

< example of application of endoscopic surgery System >

The technology according to the embodiments of the present disclosure (present technology) is applicable to various products. For example, techniques according to embodiments of the present disclosure may be applied to endoscopic surgical systems.

Fig. 35 is a diagram showing an example of a schematic configuration of an endoscopic surgery system to which the technique (present technique) according to the embodiment of the present disclosure can be applied.

Fig. 35 shows a state in which an operator (doctor) 11131 is performing an operation on a patient 11132 on a bed 11133 using the endoscopic surgery system 11000. As shown, the endoscopic surgery system 11000 includes an endoscope 11100, other surgical instruments 11110 such as a pneumoperitoneum tube 11111 and an energy treatment instrument 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 on which various devices for endoscopic surgery are mounted.

The endoscope 11100 includes a lens barrel 11101, of which a region of a predetermined length from the distal end is inserted into a body cavity of the patient 11132, and a camera 11102 connected to the proximal end of the lens barrel 11101. In the illustrated example, the endoscope 11100 is formed as a so-called rigid endoscope including the rigid lens barrel 11101; however, the endoscope 11100 may instead be formed as a so-called flexible endoscope including a flexible lens barrel.

The lens barrel 11101 is provided at its distal end with an opening into which an objective lens is fitted. A light source device 11203 is connected to the endoscope 11100; light generated by the light source device 11203 is guided to the distal end of the lens barrel by a light guide extending inside the lens barrel 11101, and is emitted toward an observation target in the body cavity of the patient 11132 through the objective lens. Note that the endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.

An optical system and an imaging element are provided inside the camera 11102, and reflected light (observation light) from an observation target is condensed on the imaging element by the optical system. The observation light is photoelectrically converted by the imaging element, and an electric signal corresponding to the observation light, that is, an image signal corresponding to an observation image is generated. The image signal is transmitted as RAW data to a Camera Control Unit (CCU) 11201.

The CCU11201 includes a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), and the like, and comprehensively controls the operations of the endoscope 11100 and the display device 11202. Further, the CCU11201 receives an image signal from the camera 11102, and performs various types of image processing such as development processing (demosaicing processing) on the image signal to display an image based on the image signal.

The display device 11202 displays an image based on an image signal on which image processing has been performed by the CCU11201, by the control of the CCU 11201.

For example, the light source device 11203 includes a light source such as a Light Emitting Diode (LED) and supplies irradiation light for photographing a surgical site or the like to the endoscope 11100.

The input device 11204 is an input interface for the endoscopic surgical system 11000. A user may input various types of information and instructions to the endoscopic surgical system 11000 via the input device 11204. For example, the user inputs an instruction or the like for changing the imaging conditions (the type of irradiation light, magnification, focal length, and the like) of the endoscope 11100.

The treatment instrument control device 11205 controls driving of the energy treatment instrument 11112 for cauterization, incision, sealing of blood vessels, and the like of tissues. The pneumoperitoneum device 11206 injects gas into a body cavity via the pneumoperitoneum tube 11111 to inflate the body cavity of the patient 11132, so as to secure the field of view of the endoscope 11100 and secure a working space of an operator. The recorder 11207 is a device capable of recording various types of information relating to the procedure. The printer 11208 is a device capable of printing various types of information relating to the operation in various forms such as text, images, graphics, and the like.

Further, the light source device 11203 that supplies illumination light for imaging a surgical site to the endoscope 11100 may include a white light source such as an LED, a laser light source, or a combination thereof. In the case where the white light source includes a combination of red, green, and blue (RGB) laser light sources, the output intensity and the output timing of each color (wavelength) can be controlled with high accuracy, so that adjustment of the white balance of a captured image can be performed in the light source device 11203. Further, in this case, by emitting the laser light from the respective RGB laser light sources onto the observation target time-divisionally and controlling the driving of the imaging element of the camera 11102 in synchronization with the emission timing, it is also possible to capture images corresponding to RGB time-divisionally. According to this method, a color image can be obtained also in the case where no color filter is provided in the imaging element.

Further, the driving of the light source device 11203 may be controlled so that the intensity of the light to be output is changed at predetermined time intervals. By controlling the driving of the imaging element of the camera 11102 in synchronization with the timing of the change in light intensity to acquire images time-divisionally and then synthesizing the images, an image with a high dynamic range, free of blocked-up shadows and blown-out highlights, can be generated.

Further, the light source device 11203 may supply light of a predetermined wavelength band corresponding to special light observation. In special light observation, for example, by utilizing the wavelength dependence of light absorption in body tissue and emitting light of a narrower band than the irradiation light used in ordinary observation (i.e., white light), so-called narrow band imaging is performed, in which a predetermined tissue such as a blood vessel in a mucosal surface layer is imaged with high contrast. Alternatively, in special light observation, fluorescence imaging may be performed, in which an image is obtained from fluorescence generated by emitting excitation light. In fluorescence imaging, for example, excitation light may be emitted to body tissue to observe fluorescence from the body tissue (autofluorescence imaging), or a reagent such as indocyanine green (ICG) may be locally injected into body tissue and excitation light corresponding to the fluorescence wavelength of the reagent may be emitted to obtain a fluorescence image. The light source device 11203 may supply narrow-band light and/or excitation light corresponding to such special light observation.

Fig. 36 is a block diagram showing an example of the functional configuration of the camera 11102 and the CCU11201 shown in fig. 35.

The camera 11102 includes a lens unit 11401, an imaging portion 11402, a driving portion 11403, a communication portion 11404, and a camera control portion 11405. The CCU11201 includes a communication unit 11411, an image processing unit 11412, and a control unit 11413. The camera 11102 and the CCU11201 are connected by a transmission cable 11400 so that communication can be performed therebetween.

The lens unit 11401 is an optical system provided at a connection portion with the lens barrel 11101. Observation light received from the distal end of the lens barrel 11101 is guided to the camera 11102 and incident on the lens unit 11401. The lens unit 11401 includes a combination of a plurality of lenses including a zoom lens and a focus lens.

The imaging portion 11402 is composed of an imaging element. The imaging element constituting the imaging section 11402 may be one element (so-called single plate type) or may be a plurality of elements (so-called multi-plate type). When the imaging section 11402 is a multi-plate type, for example, image signals corresponding to RGB are generated by the respective imaging elements, and a color image can be obtained by synthesizing the image signals. In addition, the imaging section 11402 may include a pair of imaging elements for acquiring image signals for the right and left eyes corresponding to three-dimensional (3D) display. By performing the 3D display, the operator 11131 can grasp the depth of the body tissue in the surgical site more accurately. Further, when the imaging portion 11402 is a multi-plate type, a plurality of lens units 11401 corresponding to the respective imaging elements may be provided.

Further, the imaging portion 11402 is not necessarily provided in the camera 11102. For example, the imaging section 11402 may be disposed right behind the objective lens inside the lens barrel 11101.

The driving part 11403 includes an actuator, and moves the zoom lens and the focus lens of the lens unit 11401 by a predetermined distance along the optical axis by the control of the camera control part 11405. As a result, the magnification and focus of the image captured by the imaging portion 11402 can be appropriately adjusted.

The communication section 11404 includes a communication device for transmitting/receiving various types of information to/from the CCU 11201. The communication section 11404 transmits the image signal acquired from the imaging section 11402 to the CCU11201 as RAW data via the transmission cable 11400.

Further, the communication section 11404 receives a control signal for controlling driving of the camera 11102 from the CCU11201, and supplies the control signal to the camera control section 11405. The control signal includes information relating to imaging conditions, for example, information specifying a frame rate of a captured image, information specifying an exposure value at the time of imaging, and/or information specifying a magnification and a focus of the captured image, and the like.

Further, imaging conditions such as a frame rate, an exposure value, a magnification, and a focus may be appropriately specified by a user, or may be automatically set by the control section 11413 of the CCU11201 based on the acquired image signal. In the latter case, a so-called Auto Exposure (AE) function, an Auto Focus (AF) function, and an Auto White Balance (AWB) function are incorporated in the endoscope 11100.
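The automatic setting mentioned above can be illustrated with a toy auto-exposure (AE) step: the CCU-side controller measures the mean brightness of a received frame and nudges the exposure value in the control signal sent back to the camera head. The target level and gain constants are assumptions for illustration; the AF and AWB functions would follow analogous feedback loops.

```python
TARGET_LEVEL = 128   # desired mean frame brightness (assumed, 8-bit scale)
AE_GAIN = 0.01       # proportional control gain (assumed)

def auto_exposure_step(frame, exposure_value):
    """frame: 2D list of pixel luminances from the acquired image signal.
    Returns the updated exposure value for the next control signal."""
    pixels = [v for row in frame for v in row]
    mean = sum(pixels) / len(pixels)
    # Proportional correction toward the target brightness: an
    # underexposed frame raises EV, an overexposed one lowers it.
    return exposure_value + AE_GAIN * (TARGET_LEVEL - mean)
```

Running this once per received frame converges the mean brightness toward the target without any user input, which is the sense in which the function is "incorporated in the endoscope".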

The camera control unit 11405 controls driving of the camera 11102 based on a control signal from the CCU11201 received via the communication unit 11404.

The communication section 11411 includes a communication device for transmitting/receiving various types of information to/from the camera 11102. The communication section 11411 receives the image signal transmitted from the camera 11102 via the transmission cable 11400.

Further, the communication portion 11411 transmits a control signal for controlling driving of the camera 11102 to the camera 11102. The image signal and the control signal may be transmitted by electrical communication, optical communication, or the like.

The image processing section 11412 performs various types of image processing on the image signal as the RAW data transmitted from the camera 11102.

The control section 11413 performs various types of control regarding imaging of a surgical site or the like by the endoscope 11100 and display of a captured image obtained by imaging of the surgical site or the like. For example, the control unit 11413 generates a control signal for controlling driving of the camera 11102.

Further, the control unit 11413 causes the display device 11202 to display the captured image of the surgical site or the like based on the image signal on which the image processing unit 11412 has performed the image processing. In this case, the control part 11413 may recognize various objects within the captured image by using various image recognition techniques. For example, the control section 11413 detects the edge shape and/or color and the like of the object contained in the captured image, thereby being able to recognize a surgical instrument such as forceps, a specific living body part, bleeding, fog when the energy treatment instrument 11112 is used, and the like. When causing the display device 11202 to display the captured image, the control unit 11413 may cause the display device 11202 to display various types of operation support information related to the image of the operation site in a superimposed manner by using the recognition result. The operation support information is displayed in superposition and presented to the operator 11131, whereby the burden on the operator 11131 can be reduced and the operator 11131 can perform the operation reliably.

The transmission cable 11400 connecting the camera 11102 and the CCU11201 together is an electrical signal cable supporting communication of electrical signals, an optical fiber supporting optical communication, or a composite cable thereof.

Here, in the example shown in the drawing, wired communication is performed by using the transmission cable 11400, but wireless communication may be performed between the camera 11102 and the CCU 11201.

The above has explained an example of an endoscopic surgery system to which the technique according to the embodiment of the present disclosure can be applied. In the above-described configuration, the technique according to the embodiment of the present disclosure can be applied to the imaging portion 11402. Applying the technique according to the embodiment of the present disclosure to the imaging portion 11402 makes it possible to improve the detection accuracy.

Further, although described herein with an endoscopic surgical system as an example, techniques according to embodiments of the present disclosure may be applied to, for example, a microsurgical system.

< application example 5>

< application example of Mobile body >

The technology according to the embodiments of the present disclosure (present technology) can be applied to various products. For example, the technology according to the embodiments of the present disclosure is implemented as a device to be mounted on any type of moving body such as an automobile, an electric automobile, a hybrid electric automobile, a motorcycle, a bicycle, a personal mobile device, an airplane, an unmanned aerial vehicle (drone), a ship, a robot, a construction machine, and an agricultural machine (tractor).

Fig. 37 is a block diagram of a schematic configuration example of a vehicle control system as an example of a mobile body control system to which the technique according to the embodiment of the present disclosure can be applied.

The vehicle control system 12000 includes a plurality of electronic control units connected together via a communication network 12001. In the example shown in fig. 37, the vehicle control system 12000 includes a drive system control unit 12010, a main body system control unit 12020, an exterior information detection unit 12030, an interior information detection unit 12040, and an integrated control unit 12050. Further, as a functional configuration of the integrated control unit 12050, a microcomputer 12051, an audio image output unit 12052, and an in-vehicle network interface (I/F)12053 are shown.

The drive system control unit 12010 controls the operations of devices related to the drive system of the vehicle according to various programs. For example, the drive system control unit 12010 functions as a control device of a driving force generating device for generating a driving force of the vehicle, such as an internal combustion engine or a driving motor; a driving force transmission mechanism for transmitting the driving force to the wheels; a steering mechanism for adjusting the steering angle of the vehicle; a braking device for generating a braking force of the vehicle; and the like.

The main body system control unit 12020 controls the operations of various devices mounted to the vehicle body according to various programs. For example, the main body system control unit 12020 functions as a control device of a keyless entry system, a smart key system, a power window device, or various lights such as a head light, a tail light, a stop light, a turn signal light, or a fog light. In this case, a radio wave transmitted from the portable device or a signal of various switches for replacing the key may be input to the main body system control unit 12020. The main body system control unit 12020 receives input of radio waves or signals and controls the door lock device, power window device, lamp, and the like of the vehicle.

The vehicle exterior information detection unit 12030 detects information on the exterior of the vehicle on which the vehicle control system 12000 is mounted. For example, the imaging section 12031 is connected to the vehicle exterior information detection unit 12030. The vehicle exterior information detection unit 12030 causes the imaging section 12031 to capture an image of the outside of the vehicle and receives the captured image. Based on the received image, the vehicle exterior information detection unit 12030 may perform detection processing of objects such as a person, a vehicle, an obstacle, a sign, or characters on a road surface, or distance detection processing.

The imaging section 12031 is an optical sensor that receives light and outputs an electric signal corresponding to the amount of received light. The imaging section 12031 may output an electrical signal as an image or an electrical signal as distance measurement information. Further, the light received by the imaging section 12031 may be visible light or invisible light such as infrared light.

The in-vehicle information detection unit 12040 detects information about the inside of the vehicle. For example, the in-vehicle information detection unit 12040 is connected to a driver state detection section 12041 that detects the state of the driver. The driver state detection section 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver state detection section 12041, the in-vehicle information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off.

For example, the microcomputer 12051 may calculate control target values of the driving force generation device, the steering mechanism, or the brake device based on the information of the interior and exterior of the vehicle obtained by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and may output a control instruction to the driving system control unit 12010. For example, the microcomputer 12051 may perform coordinated control to realize functions of an Advanced Driver Assistance System (ADAS) including collision avoidance or collision mitigation of vehicles, follow-up running based on a distance between vehicles, vehicle speed maintenance running, vehicle collision warning, lane departure warning of vehicles, and the like.

In addition, the microcomputer 12051 can perform coordinated control by controlling the driving force generation device, the steering mechanism, the brake device, and the like based on the information on the vehicle surroundings obtained by the outside-vehicle information detection unit 12030 or the inside-vehicle information detection unit 12040 to realize automatic driving and the like in which the vehicle autonomously travels without depending on the operation of the driver.

In addition, the microcomputer 12051 can output a control command to the main body system control unit 12020 based on the information outside the vehicle obtained by the vehicle exterior information detection unit 12030. For example, the microcomputer 12051 can control the headlights according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030, and perform cooperative control for glare prevention, such as switching from high beam to low beam.

The audio image output unit 12052 transmits an output signal of at least one of sound and image to an output device capable of visually or aurally notifying an occupant of the vehicle or the outside of the vehicle of information. In the example of fig. 37, an audio speaker 12061, a display section 12062, and an instrument panel 12063 are shown as output devices. For example, the display section 12062 may include at least one of an on-board display and a head-up display.

Fig. 38 is a diagram of an example of the mounting position of the imaging section 12031.

In fig. 38, a vehicle 12100 includes imaging portions 12101, 12102, 12103, 12104, and 12105 as the imaging portion 12031.

Each of the imaging portions 12101, 12102, 12103, 12104, and 12105 is provided at a position such as a head of the vehicle 12100, a side view mirror, a rear bumper, a rear door, an upper side of a windshield in the vehicle, and the like. The imaging section 12101 provided in the vehicle head and the imaging section 12105 provided on the upper side of the windshield in the vehicle mainly obtain an image of the front of the vehicle 12100. The imaging portions 12102 and 12103 provided in the side view mirrors mainly obtain images of the sides of the vehicle 12100. An imaging portion 12104 provided in a rear bumper or a rear door mainly obtains an image of the rear of the vehicle 12100. The imaging portion 12105 provided on the upper side of the windshield in the vehicle is mainly used to detect a preceding vehicle, a pedestrian, an obstacle, a traffic signal, a traffic sign, a lane, and the like.

Further, fig. 38 shows examples of the imaging ranges of the imaging sections 12101 to 12104. The imaging range 12111 represents the imaging range of the imaging portion 12101 provided in the vehicle head, the imaging ranges 12112 and 12113 represent the imaging ranges of the imaging portions 12102 and 12103 provided in the side view mirrors, respectively, and the imaging range 12114 represents the imaging range of the imaging portion 12104 provided in the rear bumper or the rear door. For example, image data captured by the imaging sections 12101 to 12104 are superimposed on each other, thereby obtaining a bird's-eye view image of the vehicle 12100 as seen from above.

At least one of the imaging sections 12101 to 12104 may have a function of acquiring distance information. For example, at least one of the imaging sections 12101 to 12104 may be a stereo camera including a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.

For example, based on the distance information obtained from the imaging sections 12101 to 12104, the microcomputer 12051 obtains the distance to each three-dimensional object in the imaging ranges 12111 to 12114 and the temporal change of the distance (relative speed with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, the nearest three-dimensional object that is located on the traveling path of the vehicle 12100 and travels at a predetermined speed (for example, 0 km/h or higher) in substantially the same direction as the vehicle 12100. Further, the microcomputer 12051 can set an inter-vehicle distance to be secured in advance in front of the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, it is possible to perform cooperative control aiming at automated driving or the like in which the vehicle travels autonomously without depending on the operation of the driver.

For example, based on the distance information obtained from the imaging sections 12101 to 12104, the microcomputer 12051 can extract three-dimensional object data by classifying three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, and other three-dimensional objects such as utility poles, and use the extracted data for automatic avoidance of obstacles. For example, the microcomputer 12051 classifies obstacles around the vehicle 12100 into obstacles that the driver of the vehicle 12100 can see and obstacles that are difficult for the driver to see. Then, the microcomputer 12051 determines a collision risk indicating the degree of risk of collision with each obstacle, and in a situation where the collision risk is equal to or higher than a set value and there is thus a possibility of collision, the microcomputer 12051 can perform driving assistance for collision avoidance by outputting a warning to the driver via the audio speaker 12061 or the display section 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.

At least one of the imaging sections 12101 to 12104 may be an infrared camera that detects infrared rays. For example, the microcomputer 12051 can recognize a pedestrian by determining whether a pedestrian is present in the images captured by the imaging sections 12101 to 12104. Such pedestrian recognition is performed, for example, by a procedure of extracting feature points from the images captured by the imaging sections 12101 to 12104 serving as infrared cameras, and a procedure of performing pattern matching on a series of feature points representing the contour of an object to determine whether or not the object is a pedestrian. When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging sections 12101 to 12104 and recognizes the pedestrian, the audio image output unit 12052 controls the display unit 12062 so as to superimpose a rectangular contour line on the recognized pedestrian for emphasis. The audio image output unit 12052 may also control the display unit 12062 to display an icon or the like indicating a pedestrian at a desired position.
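The two-step pedestrian recognition described above (feature-point extraction followed by pattern matching on a contour) can be sketched in miniature. The point-set matcher below is a hypothetical stand-in for the actual pattern matching process; the template, tolerance, and point representation are illustrative assumptions.

```python
# Hypothetical sketch of the pattern-matching step: a candidate contour
# matches the pedestrian template when every template point has an
# extracted feature point nearby.

def matches_template(feature_points, template, tolerance=0.5):
    """Return True if each template point is within tolerance (per axis)
    of at least one extracted feature point."""
    def near(p, q):
        return abs(p[0] - q[0]) <= tolerance and abs(p[1] - q[1]) <= tolerance
    return all(any(near(t, f) for f in feature_points) for t in template)

# A toy three-point "contour template" and extracted feature points.
template = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)]
features = [(0.2, 0.1), (1.0, 0.0), (0.9, 1.2)]
is_pedestrian = matches_template(features, template)
```

A real system would match against learned pedestrian contour templates over many feature points; this sketch only shows the shape of the decision.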

An example of the vehicle control system to which the technology according to the embodiment of the present disclosure can be applied has been described above. Among the configurations described above, the technology according to the embodiment of the present disclosure may be applied to the imaging section 12031. Specifically, the solid-state imaging device in fig. 32 can be applied to the imaging section 12031. Applying the technology according to the embodiment of the present disclosure to the imaging section 12031 makes it possible to obtain a captured image that is easier to see.

The description has been given above with reference to the embodiments and the like; however, the present disclosure is not limited to the foregoing embodiments and the like, and various modifications may be made. For example, in the foregoing embodiments and the like, the solid-state imaging element 10 and the like have a configuration in which the organic photoelectric conversion portion 20 that detects green light and the inorganic photoelectric conversion portions 32B and 32R that detect blue light and red light, respectively, are stacked. However, the present disclosure is not limited to this structure; for example, red light or blue light may be detected in the organic photoelectric conversion portion, and green light may be detected in the inorganic photoelectric conversion portion.

In addition, the numbers of organic photoelectric conversion portions and inorganic photoelectric conversion portions, and the ratio between them, are not limited; two or more organic photoelectric conversion portions may be provided. For example, the present technology achieves effects similar to those of the foregoing embodiments and the like also in a vertical spectral type solid-state imaging element in which a red photoelectric conversion portion, a green photoelectric conversion portion, and a blue photoelectric conversion portion, each containing an organic semiconductor material capable of selectively absorbing light in the corresponding predetermined wavelength region, are stacked in order on a substrate via insulating layers. The present technology likewise achieves similar effects in a solid-state imaging element in which the organic photoelectric conversion portion and the inorganic photoelectric conversion portion are arranged side by side along the substrate surface.

Further, the configuration of the solid-state imaging element of the present disclosure is not limited to the combinations shown in the foregoing embodiments and the like. For example, the solid-state imaging element may include the transfer electrode 21C, the discharge electrode 21D, and the shield electrode 21E together. In addition, the accumulation electrode 21B may be divided into two or more parts.

In addition, the solid-state imaging element and the solid-state imaging device of the present disclosure need not include all the constituent elements described in the foregoing embodiments and the like, and may conversely include other layers.

The effects described in the foregoing embodiments and the like are merely exemplary; there may be other effects, or further effects may additionally be obtained.

It should be noted that the present disclosure may include the following constitutions.

(1)

A solid-state imaging element comprising:

a photoelectric conversion layer;

a first electrode and a second electrode opposing each other via the photoelectric conversion layer therebetween;

a semiconductor layer provided between the first electrode and the photoelectric conversion layer;

an accumulation electrode opposed to the photoelectric conversion layer via the semiconductor layer therebetween;

an insulating film provided between the accumulation electrode and the semiconductor layer; and

a barrier layer disposed between the semiconductor layer and the photoelectric conversion layer.

(2)

A solid-state imaging element comprising:

a photoelectric conversion layer;

a first electrode and a second electrode opposing each other via the photoelectric conversion layer therebetween;

a semiconductor layer provided between the first electrode and the photoelectric conversion layer, the semiconductor layer having a potential barrier at a junction surface with respect to the photoelectric conversion layer;

an accumulation electrode opposed to the photoelectric conversion layer via the semiconductor layer therebetween; and

an insulating film disposed between the accumulation electrode and the semiconductor layer.

(3)

The solid-state imaging element according to (1) or (2), wherein

the photoelectric conversion layer contains an organic semiconductor material, and

the semiconductor layer includes a semiconductor material having a higher mobility than the organic semiconductor material.

(4)

The solid-state imaging element according to (1), further comprising a semiconductor substrate having a first face and a second face opposed to each other, wherein

the first electrode, the semiconductor layer, the barrier layer, the photoelectric conversion layer, and the second electrode are disposed in this order on the first face of the semiconductor substrate.

(5)

The solid-state imaging element according to any one of (1) to (3), further comprising:

a semiconductor substrate having a first face and a second face opposed to each other; and

a multilayer wiring provided between the second face of the semiconductor substrate and the first electrode.

(6)

The solid-state imaging element according to (4) or (5), further comprising an inorganic photoelectric conversion portion provided in the semiconductor substrate.

(7)

The solid-state imaging element according to any one of (1) to (6), further comprising a transfer electrode opposed to the semiconductor layer via the insulating film therebetween, the transfer electrode controlling movement of signal charges in the semiconductor layer.

(8)

The solid-state imaging element according to any one of (1) to (7), further comprising a discharge electrode provided separately from the first electrode and electrically connected to the semiconductor layer.

(9)

The solid-state imaging element according to any one of (1) to (8), further comprising a light-shielding film covering the first electrode via the photoelectric conversion layer therebetween.

(10)

The solid-state imaging element according to (1), wherein the barrier layer comprises silicon oxide, silicon nitride, silicon oxynitride, or an organic material.

(11)

A solid-state imaging device includes a plurality of solid-state imaging elements each including

a photoelectric conversion layer;

a first electrode and a second electrode opposing each other via the photoelectric conversion layer therebetween;

a semiconductor layer provided between the first electrode and the photoelectric conversion layer;

an accumulation electrode opposed to the photoelectric conversion layer via the semiconductor layer therebetween;

an insulating film provided between the accumulation electrode and the semiconductor layer; and

a barrier layer disposed between the semiconductor layer and the photoelectric conversion layer.

(12)

A solid-state imaging device includes a plurality of solid-state imaging elements each including

a photoelectric conversion layer;

a first electrode and a second electrode opposing each other via the photoelectric conversion layer therebetween;

a semiconductor layer provided between the first electrode and the photoelectric conversion layer, the semiconductor layer having a potential barrier at a junction surface with respect to the photoelectric conversion layer;

an accumulation electrode opposed to the photoelectric conversion layer via the semiconductor layer therebetween; and

an insulating film disposed between the accumulation electrode and the semiconductor layer.

(13)

The solid-state imaging device according to (11) or (12), further comprising a shield electrode opposed to the semiconductor layer via the insulating film therebetween, the shield electrode being disposed between the accumulation electrodes adjacent to each other.

(14)

The solid-state imaging device according to any one of (11) to (13), comprising a plurality of pixels in which the solid-state imaging elements are respectively provided, wherein the semiconductor layer is provided separately for each pixel.

(15)

The solid-state imaging device according to any one of (11) to (14), comprising a plurality of pixels in which the solid-state imaging elements are respectively provided, wherein the photoelectric conversion layer is provided separately for each pixel.

This application claims priority based on Japanese Patent Application No. JP2018-50808 filed with the Japan Patent Office on March 19, 2018, the entire contents of which are incorporated herein by reference.

It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may be made depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
