Color and infrared image sensor

Document No.: 1967015    Publication date: 2021-12-14

Note: This technology, "Color and infrared image sensor" (彩色和红外图像传感器), was designed and created by 卡米尔·杜波伦 and 本杰明·布蒂农 on 2020-02-21. Abstract: The present disclosure relates to a color and infrared image sensor (1) comprising a silicon substrate (10), MOS transistors (16) formed in and on the substrate, a first photodiode (2) formed at least partially in the substrate, a separate photosensitive layer (26) covering the substrate, and a color filter (34) covering the substrate, the image sensor further comprising first and second electrodes (22, 28) located on either side of each photosensitive block and defining a second photodiode (4) in each photosensitive block. The first photodiode is configured to absorb electromagnetic waves of the visible spectrum, and each photosensitive block is configured to absorb electromagnetic waves of the visible spectrum and of a first portion of the infrared spectrum.

1. A color and infrared image sensor (1) comprising a silicon substrate (10), MOS transistors (16) formed in and on the substrate, first photodiodes (2) formed at least partially in the substrate, separate photosensitive blocks (26) covering the substrate, and color filters (34) covering the substrate, the image sensor further comprising first and second electrodes (22, 28) located on either side of each photosensitive block and defining a second photodiode (4) in each photosensitive block, the first photodiodes being configured to absorb electromagnetic waves of the visible spectrum, each photosensitive block being configured to absorb electromagnetic waves of the visible spectrum and of a first portion of the infrared spectrum, wherein the photosensitive blocks (26) are made of an organic material.

2. The image sensor of claim 1, further comprising an infrared filter (42), the color filters (34) being interposed between said substrate (10) and said infrared filter, said infrared filter being configured to allow the passage of electromagnetic waves of said visible spectrum, to allow the passage of electromagnetic waves of said first portion of said infrared spectrum, and to block the passage of electromagnetic waves of at least a second portion of said infrared spectrum located between said visible spectrum and said first portion of said infrared spectrum.

3. The image sensor according to claim 1 or 2, wherein the photosensitive block (26) and the color filter (34) are at the same distance from the substrate (10).

4. The image sensor of claim 1 or 2, wherein the photosensitive block (26) is closer to the substrate (10) than the color filter (34).

5. The image sensor as claimed in claim 4, wherein each photosensitive block (26) is covered with a visible light filter (36) made of an organic material.

6. The image sensor according to any one of claims 1 to 5, comprising a lens array (38) interposed between the substrate (10) and the infrared filter (42).

7. The image sensor according to any of claims 1 to 6, comprising, for each pixel of the color image to be acquired, at least a first, a second and a third sub-pixel (RGB-SPix), each sub-pixel comprising one of the first photodiodes (4) and one of the color filters (34), the color filters of the first, second and third sub-pixels allowing the passage of electromagnetic waves in different frequency ranges of the visible spectrum, and a fourth sub-pixel (IR-Pix) comprising one of the second photodiodes (2).

8. The image sensor of claim 7, comprising, for each first, second and third sub-pixel (RGB-SPix), a first readout circuit (6_R, 6_G, 6_B) coupled to the first photodiode (4), and, for the fourth sub-pixel (IR-Pix), a second readout circuit (6_IR) coupled to the second photodiode (2).

9. The image sensor according to claim 8, wherein for each pixel of the color image to be acquired, the first readout circuit (6_R, 6_G, 6_B) is configured to transfer a first charge generated in the first photodiode (4) to a first conductive track (68), and the second readout circuit (6_IR) is configured to transfer a second charge generated in the second photodiode (2) to the first conductive track (68) or a second conductive track.

10. The image sensor of claim 9, wherein the first photodiodes (4) are arranged in rows and columns, and wherein the first readout circuitry (6_R, 6_G, 6_B) is configured to control the generation of the first electric charge during a first time interval that is simultaneous for all the first photodiodes of the image sensor, or time-shifted from one row of first photodiodes to the next, or, for each pixel of the color image to be acquired, different for the first, second and third sub-pixels (RGB-SPix).

11. The image sensor of claim 9 or 10, wherein the second photodiodes (2) are arranged in rows and columns, and wherein the second readout circuitry (6_IR) is configured to control the generation of the second electric charge during a second time interval that is simultaneous for all of the second photodiodes (2) of the image sensor.

Technical Field

The present disclosure relates to image sensors or electronic imagers.

Background

Owing to their miniaturization, image sensors are used in many fields, particularly in electronic devices. They are found in human-machine interface applications and in image capture applications.

For some applications, it is desirable to have an image sensor that can simultaneously acquire color images and infrared images. Such image sensors are referred to as color and infrared image sensors in the following description. An example of an application of color and infrared image sensors involves acquiring an infrared image of an object having a structured infrared pattern projected thereon. The fields of use of such image sensors are in particular motor vehicles, unmanned aerial vehicles, smart phones, robots and augmented reality systems.

The phase during which a pixel collects electric charge under the effect of incident radiation is called the integration phase of the pixel. The integration phase is typically followed by a readout phase, during which the amount of charge collected by the pixel is measured.

The design of color and infrared image sensors requires consideration of a number of constraints. First, the resolution of the color image should not be less than that obtained with conventional color image sensors.

Secondly, for some applications it may be desirable for the image sensor to be of the global shutter type, i.e., to use an image acquisition method in which the integration phases of all pixels start and end simultaneously. This is particularly useful for acquiring infrared images of objects onto which a structured infrared pattern is projected.

Third, it is desirable that the size of the image sensor pixels be as small as possible. Fourth, it is desirable that the fill factor of each pixel (corresponding to the ratio, in top view, of the area of the pixel that actively takes part in capturing the incident radiation to the total surface area of the pixel) be as large as possible.
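As a minimal illustration of the fill-factor definition above, the sketch below computes it for a hypothetical square pixel; the pitch and active-area values are assumptions for illustration, not figures from this disclosure.

```python
# Hedged illustration: fill factor of a square pixel (hypothetical numbers).
def fill_factor(active_area_um2: float, pixel_pitch_um: float) -> float:
    """Ratio of the photo-active area to the total pixel area, both in top view."""
    total_area_um2 = pixel_pitch_um ** 2
    return active_area_um2 / total_area_um2

# Example: a 3 um x 3 um pixel whose photo-active area is 7.5 um^2 (assumed value).
print(f"fill factor = {fill_factor(7.5, 3.0):.0%}")  # -> 83%, in the range discussed below
```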

It may be difficult to design color and infrared image sensors that meet all of the above constraints.

Disclosure of Invention

One embodiment overcomes all or part of the disadvantages of the previously described color and infrared image sensors.

According to one embodiment, the resolution of the color image acquired by the color and infrared image sensor is greater than 2,560 ppi, preferably greater than 8,530 ppi.

According to one embodiment, the acquisition method of the infrared image is of the global shutter type.

According to one embodiment, the size of the pixels of the color and infrared image sensor is less than 10 μm, preferably less than 3 μm.
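For context, the resolution figures given above in ppi and the pixel sizes given here are related by a simple conversion (ppi ≈ 25,400 μm per inch divided by the pixel pitch in μm); the sketch below is a hedged illustration of that arithmetic, not part of the disclosure.

```python
# Hedged illustration: relating pixel pitch to resolution in pixels per inch (ppi).
UM_PER_INCH = 25_400.0

def pitch_to_ppi(pixel_pitch_um: float) -> float:
    """Pixels per inch for a given pixel pitch in micrometers."""
    return UM_PER_INCH / pixel_pitch_um

# A 10 um pitch gives ~2,540 ppi and a 3 um pitch gives ~8,467 ppi,
# of the same order as the 2,560 ppi and 8,530 ppi values cited above.
for pitch in (10.0, 3.0):
    print(f"{pitch:4.1f} um pitch -> {pitch_to_ppi(pitch):,.0f} ppi")
```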

According to one embodiment, the fill factor of each pixel of the color and infrared image sensor is greater than 50%, preferably greater than 80%.

One embodiment provides a color and infrared image sensor comprising a silicon substrate, MOS transistors formed in and on the substrate, first photodiodes formed at least partially in the substrate, separate photosensitive blocks covering the substrate, and color filters covering the substrate, the image sensor further comprising first and second electrodes located on either side of each photosensitive block and defining a second photodiode in each photosensitive block, the first photodiodes being configured to absorb electromagnetic waves of the visible spectrum, and each photosensitive block being configured to absorb electromagnetic waves of the visible spectrum and of a first portion of the infrared spectrum.

According to one embodiment, the image sensor further comprises an infrared filter, a color filter interposed between the substrate and the infrared filter, the infrared filter configured to allow passage of electromagnetic waves of the visible spectrum, to allow passage of electromagnetic waves of the first portion of the infrared spectrum, and to block electromagnetic waves of at least a second portion of the infrared spectrum between the visible spectrum and the first portion of the infrared spectrum.

According to one embodiment, the photosensitive block and the color filter are at the same distance from the substrate.

According to one embodiment, the photosensitive block is closer to the substrate than the color filter.

According to one embodiment, each photosensitive block is covered with a visible light filter made of an organic material.

According to one embodiment, the image sensor further comprises a lens array interposed between the substrate and the infrared filter.

According to one embodiment, the image sensor further comprises, for each pixel of the color image to be acquired, at least a first, a second and a third sub-pixel, each sub-pixel comprising one of the first photodiodes and one of the color filters, the color filters of the first, second and third sub-pixels allowing the passage of electromagnetic waves in different frequency ranges of the visible spectrum, and a fourth sub-pixel comprising one of the second photodiodes.

According to one embodiment, the image sensor further comprises, for each of the first, second and third sub-pixels, a first readout circuit coupled to the first photodiode, and for the fourth sub-pixel, a second readout circuit coupled to the second photodiode.

According to one embodiment, for each pixel of the color image to be acquired, the first readout circuitry is configured to transfer a first charge generated in the first photodiode to a first conductive track, and the second readout circuitry is configured to transfer a second charge generated in the second photodiode to the first conductive track or a second conductive track.

According to one embodiment, the first photodiodes are arranged in rows and columns, and the first readout circuitry is configured to control the generation of the first charge during a first time interval that is simultaneous for all the first photodiodes of the image sensor, or time-shifted from one row of first photodiodes to the next, or, for each pixel of the color image to be acquired, different for the first, second and third sub-pixels.

According to one embodiment, the second photodiodes are arranged in rows and columns and the second readout circuitry is configured to control generation of the second electrical charge during a second time interval that is simultaneous for all of the second photodiodes of the image sensor.

According to one embodiment, the photosensitive layer is made of an organic material.

Drawings

The above features and advantages, and other features and advantages, are described in detail in the following description of specific embodiments, which is given by way of example and not of limitation with reference to the accompanying drawings, in which:

FIG. 1 is a simplified, partially exploded perspective view of an embodiment of a color and infrared image sensor;

FIG. 2 is a simplified partial cross-sectional view of the image sensor of FIG. 1;

FIG. 3 is a simplified exploded perspective view of a portion of another embodiment of a color and infrared image sensor;

FIG. 4 is a simplified partial cross-sectional view of the image sensor of FIG. 3;

FIG. 5 is an electrical diagram of an embodiment of a readout circuit for a subpixel of the image sensor of FIG. 1; and

FIG. 6 is a timing diagram of signals of an embodiment of a method of operating an image sensor having the readout circuitry of FIG. 5.

Detailed Description

Like features are designated by like reference numerals throughout the various figures. In particular, structural and/or functional features that are common among the various embodiments may have the same reference numerals and may have the same structural, dimensional, and material characteristics. For clarity, only those steps and elements useful for understanding the described embodiments are shown and described in detail. In particular, the use of the image sensor described below is not described in detail.

In the following disclosure, unless otherwise specified, when referring to absolute position qualifiers, such as the terms "front", "back", "top", "bottom", "left", "right", etc., or relative position qualifiers, such as the terms "above", "below", "higher", "lower", etc., or direction qualifiers, such as "horizontal", "vertical", etc., reference is made to the orientation shown in the figures, or to the orientation of the image sensor during normal use. Unless otherwise indicated, the expressions "about", "approximately" and "substantially" mean within 10%, preferably within 5%.

Unless otherwise stated, when two elements are referred to as being connected together, this means that they are directly connected, with no intermediate elements other than conductors; when two elements are referred to as being coupled together, this means that the two elements may be connected, or may be coupled via one or more other elements. Further, a signal that alternates between a first constant state (e.g., a low state labeled "0") and a second constant state (e.g., a high state labeled "1") is referred to as a "binary signal". The high and low states of different binary signals of the same electronic circuit may be different. In particular, a binary signal may correspond to a voltage or a current that may not be perfectly constant in the high or low state. Further, the terms "insulating" and "conductive" are used herein to mean "electrically insulating" and "electrically conductive," respectively.

The transmittance of a layer corresponds to the ratio of the intensity of the radiation exiting the layer to the intensity of the radiation entering the layer. In the following description, a layer or film is considered opaque to a radiation when its transmittance of that radiation is lower than 10%, and transparent to a radiation when its transmittance of that radiation is greater than 10%. In the following description, the refractive index of a material corresponds to its refractive index over the wavelength range of the radiation captured by the image sensor. Unless otherwise stated, the refractive index is considered substantially constant over the wavelength range of the useful radiation, and equal to the average of the refractive index over the wavelength range of the radiation captured by the image sensor.
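As a minimal sketch of the 10% convention above (an illustration, not part of the disclosure), a layer can be classified from its measured transmittance as follows:

```python
# Hedged sketch: classify a layer as opaque or transparent per the 10% transmittance convention.
def transmittance(i_out: float, i_in: float) -> float:
    """Ratio of the radiation intensity exiting the layer to the intensity entering it."""
    return i_out / i_in

def classify_layer(i_out: float, i_in: float) -> str:
    t = transmittance(i_out, i_in)
    return "transparent" if t > 0.10 else "opaque"

print(classify_layer(i_out=0.04, i_in=1.0))  # opaque (4% transmittance, assumed values)
print(classify_layer(i_out=0.85, i_in=1.0))  # transparent (85% transmittance, assumed values)
```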

In the following description, "visible light" means electromagnetic radiation having a wavelength in the range of 400nm to 700nm, and "infrared radiation" means electromagnetic radiation having a wavelength in the range of 700nm to 1 mm. Among infrared radiation, one can particularly distinguish near infrared radiation having a wavelength in the range of 700nm to 1.4 μm.

The pixels of an image correspond to the unit elements of the image captured by the image sensor. When the optoelectronic device is a color image sensor, it generally comprises, for each pixel of the color image to be acquired, at least three components, each of which acquires light radiation in substantially a single color, i.e., over a wavelength range narrower than 100 nm (for example, red, green, or blue). Each component may in particular comprise at least one photodetector.

Fig. 1 is a partially simplified exploded perspective view of an embodiment of a color and infrared image sensor 1, and fig. 2 is a partially simplified cross-sectional view of it. The image sensor 1 comprises an array of first photodetectors 2 capable of capturing an infrared image and an array of second photodetectors 4 capable of capturing a color image. The arrays of photodetectors 2 and 4 are associated with an array of readout circuits 6 that measure the signals captured by the photodetectors 2 and 4. "Readout circuit" designates here the transistor elements used to read out, address and control the pixel or sub-pixel defined by the corresponding photodetector 2 or 4.

For each pixel of the color image and of the infrared image to be acquired, a color sub-pixel RGB-SPix of the image sensor 1 designates the portion of the image sensor 1 comprising a color photodetector 4, which enables the acquisition of the light radiation in a limited portion of the visible spectrum for the pixel of the color image; and an infrared pixel IR-Pix designates the portion of the image sensor 1 comprising an infrared photodetector 2, which enables the acquisition of the infrared radiation for the pixel of the infrared image.

Fig. 1 and 2 show three color sub-pixels RGB-SPix and one infrared pixel IR-Pix associated with one pixel of a color and infrared image. In the present embodiment, the acquired color image and the infrared image have the same resolution so that the infrared pixel IR-Pix can also be considered as another sub-pixel of the acquired color image. For the sake of clarity, only certain elements of the image sensor present in fig. 2 are shown in fig. 1. The image sensor 1 in fig. 2 includes, from bottom to top:

a semiconductor substrate 10 comprising an upper surface 12, preferably planar;

for each color sub-pixel RGB-SPix, at least one doped semiconductor region 14 formed in the substrate 10 and forming part of the color photodiode 4;

electronic components 16 of the readout circuitry 6 located in the substrate 10 and/or on the surface 12, fig. 2 showing a single electronic component 16;

a stack 18 of insulating layers covering the surface 12, and conductive tracks 20 located on the stack 18 and between the insulating layers of the stack 18;

for each infrared pixel IR-Pix, an electrode 22 located on the stack 18 and coupled, via a conductive via 24, to the substrate 10, to the component 16, or to one of the conductive tracks 20;

for each infrared pixel IR-Pix, an active layer 26 covering the electrode 22 and possibly the stack 18 around the electrode 22; in top view, the active layer 26 extends only over the surface of the infrared pixel IR-Pix and not over the surface of the color sub-pixels RGB-SPix;

for the color sub-pixels RGB-SPix, an insulating layer 27 covering the stack 18;

for each infrared pixel IR-Pix, an electrode 28 covering the active layer 26 and possibly the insulating layer 27, and coupled, via a conductive via 30, to the substrate 10, to the component 16, or to one of the conductive tracks 20;

an insulating layer 32 covering the electrode 28;

for each color sub-pixel RGB-SPix, a color filter 34 covering the insulating layer 32, and for the infrared pixel IR-Pix, a block 36 transparent to infrared radiation covering the insulating layer 32;

for each color sub-pixel RGB-SPix and for the infrared pixel IR-Pix, a micro-lens 38 covering the color filter 34 or the transparent block 36;

an insulating layer 40 covering the microlenses 38; and

a filter 42 covering the insulating layer 40.

The color sub-pixels RGB-SPix and the infrared pixels IR-Pix may be distributed in rows and columns. In the present embodiment, each color sub-pixel RGB-SPix and each infrared pixel IR-Pix has, seen along a direction perpendicular to the surface 12, a square or rectangular base with a side length between 0.1 μm and 100 μm, for example equal to approximately 3 μm. However, each sub-pixel may have a base of a different shape, for example hexagonal.

In this embodiment, the active layer 26 is only present at the infrared pixel IR-Pix level of the image sensor 1. The active area of each infrared photodetector 2 corresponds to the area where most of the useful incident infrared radiation is absorbed and converted into electrical signals by the infrared photodetector 2, and substantially corresponds to the portion of the active layer 26 located between the lower electrode 22 and the upper electrode 28.

According to one embodiment, the active layer 26 is capable of capturing electromagnetic radiation having wavelengths in the range of 400nm to 1,100 nm. The infrared photodetector 2 may be made of an organic material. The photodetector may correspond to an Organic Photodiode (OPD) or an organic photoresistor. In the following description, it is considered that the photodetector 2 corresponds to a photodiode.

The filter 42 is capable of allowing visible light to pass, allowing a portion of the infrared radiation over the infrared wavelength range of interest to pass to acquire an infrared image, and of blocking the remainder of the incident radiation, in particular the remainder of the infrared radiation outside the infrared wavelength range of interest. According to one embodiment, the infrared wavelength range of interest may correspond to a 50nm range centered on the expected wavelength of the infrared radiation, e.g., centered on a 940nm wavelength or on a 850nm wavelength. The filter 42 may be an interference filter and/or may include an absorbing layer and/or a reflecting layer.
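To make the dual-band behavior of filter 42 concrete, here is a hedged sketch of an idealized pass/block check over the visible band and a 50 nm band centered on an assumed 940 nm wavelength; the band edges are illustrative assumptions, not measured filter data.

```python
# Hedged sketch: idealized pass/block model of the dual-band filter 42
# (visible band plus a 50 nm near-infrared band centered on 940 nm, assumed here).
VISIBLE_BAND_NM = (400.0, 700.0)
IR_BAND_OF_INTEREST_NM = (940.0 - 25.0, 940.0 + 25.0)

def filter42_passes(wavelength_nm: float) -> bool:
    """True if the idealized filter lets this wavelength through."""
    in_visible = VISIBLE_BAND_NM[0] <= wavelength_nm <= VISIBLE_BAND_NM[1]
    in_ir_band = IR_BAND_OF_INTEREST_NM[0] <= wavelength_nm <= IR_BAND_OF_INTEREST_NM[1]
    return in_visible or in_ir_band

for wl in (550.0, 850.0, 940.0):
    print(wl, "nm ->", "pass" if filter42_passes(wl) else "blocked")
# 550 nm passes, 850 nm is blocked (it lies between the two bands in this assumed configuration),
# and 940 nm passes.
```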

The color filters 34 may correspond to blocks of colored resin. Each color filter 34 is capable of allowing light in a given wavelength range of visible light to pass through. For each pixel of the color image to be acquired, the image sensor may comprise a color sub-pixel RGB-SPix having a color filter 34 capable of allowing only blue light (e.g., in the wavelength range of 430nm to 490nm) to pass through, a sub-pixel RGB-SPix having a color filter 34 capable of allowing only green light (e.g., in the wavelength range of 510nm to 570nm) to pass through, and a sub-pixel RGB-SPix having a color filter 34 capable of allowing only red light (e.g., in the wavelength range of 600nm to 720nm) to pass through. The transparent block 36 is capable of allowing infrared radiation as well as visible light to pass through; it may then correspond to a block of transparent resin. As a variant, the transparent block 36 may allow infrared radiation to pass and block visible light; it may then correspond to a block of black resin or to an active layer, for example having a structure similar to that of the active layer 26 and capable of absorbing only radiation in the target spectrum.
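The example passbands quoted above can be captured in a small lookup, as in the hedged sketch below (a convenience illustration; the band edges are the example values from the text, not normative specifications):

```python
# Hedged sketch: example passbands of the color filters 34 and a simple membership check.
COLOR_FILTER_PASSBANDS_NM = {
    "blue":  (430.0, 490.0),
    "green": (510.0, 570.0),
    "red":   (600.0, 720.0),
}

def filter_passes(color: str, wavelength_nm: float) -> bool:
    low, high = COLOR_FILTER_PASSBANDS_NM[color]
    return low <= wavelength_nm <= high

print(filter_passes("green", 532.0))  # True: 532 nm falls in the example green passband
print(filter_passes("red", 532.0))    # False
```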

Since the filter 42 allows only the useful portion of the near infrared to pass, the active layer 26 receives only the useful portion of the infrared radiation when the transparent block 36 allows infrared radiation to pass and blocks visible light. This advantageously simplifies the design of the active layer 26, whose absorption range may be broad and may in particular include visible light. In the case where the transparent block 36 allows both infrared radiation and visible light to pass, the active layer 26 of the infrared photodiode 2 captures infrared radiation as well as visible light. A signal representative of only the infrared radiation captured by the infrared photodiode 2 can then be determined by a linear combination of the signals delivered by the infrared photodiode 2 and by the color photodiodes 4 of the pixel.
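A minimal sketch of such a linear combination is given below; the weighting coefficients are placeholders that would in practice come from a calibration of the sensor, which this disclosure does not specify.

```python
# Hedged sketch: recovering an IR-only signal by linear combination of the pixel's sub-pixel signals.
# The coefficients are hypothetical calibration values, not values from the disclosure.
def ir_only_signal(s_ir: float, s_r: float, s_g: float, s_b: float,
                   a_r: float = 0.3, a_g: float = 0.4, a_b: float = 0.3) -> float:
    """Subtract an estimate of the visible-light contribution from the IR photodiode signal."""
    visible_estimate = a_r * s_r + a_g * s_g + a_b * s_b
    return s_ir - visible_estimate

# Example with arbitrary signal levels (arbitrary units):
print(ir_only_signal(s_ir=1.20, s_r=0.50, s_g=0.70, s_b=0.40))  # -> 0.65
```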

According to one embodiment, the semiconductor substrate 10 is made of silicon, preferably monocrystalline silicon. According to one embodiment, the electronic components 16 comprise transistors, in particular metal-oxide gate field-effect transistors, also referred to as MOS transistors. The color photodiodes 4 are inorganic photodiodes, preferably made of silicon. Each color photodiode 4 includes at least the doped silicon region 14 extending in the substrate 10 from the surface 12. According to one embodiment, the substrate 10 is undoped or lightly doped with a first conductivity type, e.g., P-type, and each region 14 is a doped region having a conductivity type opposite to that of the substrate 10, e.g., N-type. The depth of each region 14, measured from the surface 12, may be in the range from 500nm to 6 μm. The color photodiodes 4 may correspond to pinned photodiodes. An example of a pinned photodiode is described in detail in U.S. Pat. No. 6,677,656.

The conductive tracks 20, conductive vias 24, 30 and electrodes 22 may be made of a metallic material, such as silver (Ag), aluminum (Al), gold (Au), copper (Cu), nickel (Ni), titanium (Ti) and chromium (Cr). The conductive tracks 20, conductive vias 24, 30 and electrodes 22 may have a single-layer or multilayer structure. Each insulating layer of the stack 18 may be made of an inorganic material, for example silicon oxide (SiO2) or silicon nitride (SiN).

Each electrode 28 is at least partially transparent to the light radiation it receives. Each electrode 28 may be made of a transparent conductive material, for example of a transparent conductive oxide (TCO), of carbon nanotubes, of graphene, of a conductive polymer, of a metal, or of a mixture or alloy of at least two of these compounds. Each electrode 28 may have a single-layer or multilayer structure.

Examples of TCOs capable of forming each electrode 28 are indium tin oxide (ITO), aluminum zinc oxide (AZO), gallium zinc oxide (GZO), titanium nitride (TiN), molybdenum oxide (MoO3) and tungsten oxide (WO3). Examples of conductive polymers capable of forming each electrode 28 are the polymer known as PEDOT:PSS, which is a mixture of poly(3,4-ethylenedioxythiophene) and sodium polystyrene sulfonate, and polyaniline (also called PAni). Examples of metals capable of forming each electrode 28 are silver, aluminum, gold, copper, nickel, titanium and chromium. An example of a multilayer structure capable of forming each electrode 28 is a multilayer AZO and silver structure of the AZO/Ag/AZO type.

The thickness of each electrode 28 may be in the range of 10nm to 5 μm, for example about 30 nm. In the case where the electrode 28 is a metal, the thickness of the electrode 28 is less than or equal to 20nm, preferably less than or equal to 10 nm.

Each insulating layer 27, 32, 40 may be made of a fluorinated polymer, in particular the fluorinated polymer sold by Bellex under the trade name Cytop, or of polyvinylpyrrolidone (PVP), polymethyl methacrylate (PMMA), polystyrene (PS), parylene, polyimide (PI), acrylonitrile butadiene styrene (ABS), polyethylene terephthalate (PET), polyethylene naphthalate (PEN), cyclo olefin polymer (COP), polydimethylsiloxane (PDMS), a photoresist resin, an epoxy resin, an acrylate resin, or a mixture of at least two of these compounds. As a variant, each insulating layer 27, 32, 40 may be made of an inorganic dielectric material, in particular silicon nitride, silicon oxide or aluminum oxide (Al2O3). The aluminum oxide may be deposited by atomic layer deposition (ALD). The maximum thickness of each insulating layer 27, 32, 40 may be in the range from 50nm to 2 μm, for example about 100 nm.

The active layer 26 of each infrared pixel IR-Pix may comprise a small molecule, oligomer or polymer. These may be organic or inorganic materials, in particular quantum dots. The active layer 26 may comprise a bipolar semiconductor material, or a mixture of an N-type semiconductor material and a P-type semiconductor material, for example in the form of stacked layers or an intimate mixture on the order of nanometers to form a bulk heterojunction. The thickness of the active layer 26 may be in the range of 50nm to 2 μm, for example, about 200 nm.

Examples of P-type semiconducting polymers capable of forming the active layer 26 are poly(3-hexylthiophene) (P3HT), poly[N-9'-heptadecyl-2,7-carbazole-alt-5,5-(4,7-di-2-thienyl-2',1',3'-benzothiadiazole)] (PCDTBT), poly[(4,8-bis-(2-ethylhexyloxy)-benzo[1,2-b;4,5-b']dithiophene)-2,6-diyl-alt-(4-(2-ethylhexanoyl)-thieno[3,4-b]thiophene))-2,6-diyl] (PBDTTT-C), poly[2-methoxy-5-(2-ethyl-hexyloxy)-1,4-phenylene-vinylene] (MEH-PPV) or poly[2,6-(4,4-bis-(2-ethylhexyl)-4H-cyclopenta[2,1-b;3,4-b']dithiophene)-alt-4,7(2,1,3-benzothiadiazole)] (PCPDTBT).

Examples of N-type semiconductor materials capable of forming the active layer 26 are fullerenes, in particular C60, [6,6]-phenyl-C61-methyl butyrate ([60]PCBM), [6,6]-phenyl-C71-methyl butyrate ([70]PCBM), perylene diimides, zinc oxide (ZnO), or nanocrystals capable of forming quantum dots.

The active layer 26 of each infrared pixel IR-Pix may be interposed between first and second interface layers (not shown). The interface layers facilitate the collection, injection or blocking of charges between the electrodes and the active layer 26, depending on the polarization mode of the photodiode. The thickness of each interface layer is preferably in the range from 0.1nm to 1 μm. The first interface layer is capable of aligning the work function of the adjacent electrode with the electron affinity of the acceptor material used in the active layer 26. The first interface layer may be made of cesium carbonate (Cs2CO3), of a metal oxide, in particular zinc oxide (ZnO), or of a mixture of at least two of these compounds. The first interface layer may comprise a self-assembled monolayer or a polymer, for example polyethyleneimine, ethoxylated polyethyleneimine, or poly[(9,9-bis(3'-(N,N-dimethylamino)propyl)-2,7-fluorene)-alt-2,7-(9,9-dioctylfluorene)]. The second interface layer is capable of aligning the work function of the other electrode with the ionization potential of the donor material used in the active layer 26. The second interface layer may be made of copper oxide (CuO), nickel oxide (NiO), vanadium oxide (V2O5), magnesium oxide (MgO), tungsten oxide (WO3), molybdenum oxide (MoO3), PEDOT:PSS, or a mixture of at least two of these compounds.

The microlenses 38 have dimensions in the micrometer range. In this embodiment, each color sub-pixel RGB-SPix and each infrared pixel IR-Pix includes a micro-lens 38. As a variant, each microlens 38 can be replaced by another type of micron-range optical element, in particular a micron-range Fresnel lens, a micron-range refractive index gradient lens or a micron-range diffraction grating. The microlenses 38 are converging lenses each having a focal length f in the range of 1 μm to 100 μm, preferably 1 μm to 10 μm. According to one embodiment, all of the microlenses 38 are substantially identical.

The microlenses 38 can be made of silicon dioxide, PMMA, positive photosensitive resin, PET, PEN, COP, PDMS/silicone or epoxy. The microlenses 38 can be formed by the flow of a block of resin. The microlenses 38 can also be formed by molding over a layer of PET, PEN, COP, PDMS/silicone, or epoxy.

According to one embodiment, layer 40 is a layer that follows the shape of the microlenses 38. Layer 40 may be obtained from an optically clear adhesive (OCA), in particular a liquid optically clear adhesive (LOCA), from a material with a low refractive index, from an epoxy/acrylate glue, or from a film of a gas or of a gas mixture, for example air. When layer 40 follows the shape of the microlenses 38, it is preferably made of a material having a refractive index lower than that of the material of the microlenses 38. Layer 40 may be made of a filler material that is a non-adhesive transparent material. According to another embodiment, layer 40 corresponds to a thin film, such as an OCA film, applied against the microlens array 38. In this case, the contact area between layer 40 and the microlenses 38 may be reduced, for example limited to the tops of the microlenses. Layer 40 may then be made of a material having a higher refractive index than in the case where layer 40 follows the shape of the microlenses 38. According to another embodiment, layer 40 corresponds to an OCA film applied against the microlens array 38, the adhesive having properties that enable the film 40 to fully, or substantially fully, conform to the surface of the microlenses.

Depending on the materials considered, the method of forming at least some of the layers of the image sensor 1 may correspond to the so-called additive method, for example by printing the material forming the organic layer directly at the desired location, in particular in the form of a sol-gel, for example by inkjet printing, gravure printing, screen printing, flexography, spraying or drop coating. Depending on the materials considered, the method of forming the layers of the image sensor 1 may correspond to the so-called subtractive method, in which the material forming the organic layers is deposited on the entire structure and then the unused portions are removed, for example by photolithography or laser ablation. Specifically, a method such as spin coating, spray coating, photolithography, slit coating, blade coating, flexography, or screen printing can be used. When the layer is a metal, the metal is deposited on the entire support, for example by evaporation or by cathode sputtering, and the metal layer is defined by etching.

Advantageously, at least some of the layers of the image sensor 1 may be formed by printing techniques. The materials of the aforementioned layers may be deposited in liquid form, for example, by means of an inkjet printer in the form of conductive and semiconductive inks. "material in liquid form" here also refers to a gel material that can be deposited by printing techniques. An annealing step may be provided between the deposition of the different layers, but the annealing temperature cannot exceed 150 ℃, and the deposition and possible annealing may be performed at atmospheric pressure.

In the embodiment shown in fig. 1 and 2, for each pixel of the color and infrared image, the electrode 28 may extend over all color sub-pixels RGB-SPix and infrared pixels IR-Pix, and the via 30 is arranged in an area not corresponding to a sub-pixel, for example at the periphery of the pixel. Furthermore, the electrode 28 may be shared by all pixels of the same row and/or all pixels of the image sensor. In this case, the through-hole 30 may be provided at the periphery of the image sensor 1. According to a variant, the electrode 28 may extend only on the active layer 26 and the vias 30 may be provided on the level of the infrared pixels IR-Pix.

Figs. 3 and 4 are views, similar to figs. 1 and 2 respectively, of another embodiment of an image sensor 50. The image sensor 50 comprises all the elements of the image sensor 1 shown in figs. 1 and 2, with the differences that the insulating layer 32 is interposed between the microlenses 38 and the color filters 34, that the active layer 26 is arranged in the place of the block 36 (which is not present), i.e. at the same level as the color filters 34, and that the insulating layer 27 is not present. Furthermore, the electrode 28 extends only over the active layer 26, and the via 30 is arranged at the level of the infrared pixel IR-Pix. In this case, the active layer 26 of the infrared photodiode 2 captures both infrared radiation and visible light. A signal representative of only the infrared radiation captured by the infrared photodiode 2 can then be determined by a linear combination of the signals delivered by the infrared photodiode 2 and by the color photodiodes 4 of the pixel.

Fig. 5 shows a simplified circuit diagram of an embodiment of the readout circuits 6_R, 6_G, 6_B associated with the color photodiodes 4 of the color sub-pixels RGB-SPix of a pixel of the color image to be acquired, and of the readout circuit 6_IR associated with the infrared photodiode 2 of the infrared pixel IR-Pix.

The readout circuits 6_R, 6_G, 6_B and 6_IR have similar structures. In the following description, the suffix "_R" is added to the reference numerals designating the components of the readout circuit 6_R, the suffix "_G" to those of the readout circuit 6_G, the suffix "_B" to those of the readout circuit 6_B, and the suffix "_IR" to those of the readout circuit 6_IR.

Each readout circuit 6_R, 6_G, 6_B, 6_IR comprises a MOS transistor 60_R, 60_G, 60_B, 60_IR assembled as a follower, in series with a MOS selection transistor 62_R, 62_G, 62_B, 62_IR between a first terminal 64_R, 64_G, 64_B, 64_IR and a second terminal 66_R, 66_G, 66_B, 66_IR. The terminals 64_R, 64_G, 64_B, 64_IR are coupled to a source of a high reference potential VDD in the case where the transistors forming the readout circuit are N-channel MOS transistors, or to a source of a low reference potential (e.g., ground) in the case where the transistors forming the readout circuit are P-channel MOS transistors. The terminals 66_R, 66_G, 66_B, 66_IR are coupled to a conductive track 68. The conductive track 68 may be coupled to all the color sub-pixels and all the infrared pixels of the same column, and to a current source 69 which is not part of the readout circuits 6_R, 6_G, 6_B, 6_IR. The gates of the transistors 62_R, 62_G, 62_B, 62_IR are intended to receive a signal SEL_R, SEL_G, SEL_B, SEL_IR for selecting the color sub-pixel/infrared pixel. The gates of the transistors 60_R, 60_G, 60_B and 60_IR are coupled to nodes FD_R, FD_G, FD_B, FD_IR. Each node FD_R, FD_G, FD_B, FD_IR is coupled, through a reset MOS transistor 70_R, 70_G, 70_B, 70_IR, to a terminal of application of a reset potential Vrst_R, Vrst_G, Vrst_B, Vrst_IR, which may be VDD. The gates of the transistors 70_R, 70_G, 70_B, 70_IR are intended to receive a signal RST_R, RST_G, RST_B, RST_IR for controlling the resetting of the color sub-pixel/infrared pixel, in particular enabling the node FD to be substantially reset to the potential Vrst.
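As an informal way to summarize the charge-to-voltage path just described (sense node, follower, row select), here is a hedged behavioral sketch; the sense-node capacitance, follower drop and signal values are assumptions for illustration, not parameters of the disclosed circuit.

```python
# Hedged behavioral sketch of one readout chain: sense node FD -> source follower -> column track 68.
# All numeric parameters are illustrative assumptions.
ELEMENTARY_CHARGE_C = 1.602e-19

def fd_voltage(v_rst: float, collected_electrons: float, c_fd_farads: float = 2e-15) -> float:
    """Voltage at node FD after 'collected_electrons' are dumped onto a node reset to v_rst."""
    return v_rst - collected_electrons * ELEMENTARY_CHARGE_C / c_fd_farads

def column_voltage(v_fd: float, follower_drop_v: float = 0.6) -> float:
    """Approximate potential seen on conductive track 68 while the select transistor is on."""
    return v_fd - follower_drop_v

v_fd = fd_voltage(v_rst=2.8, collected_electrons=5000)  # ~2.4 V for the assumed 2 fF sense node
print(column_voltage(v_fd))                             # value representative of the stored charge
```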

The node FD _ R, FD _ G, FD _ B is connected to the cathode electrode of the color photodiode 4 of the color sub-pixel. The anode electrode of the color photodiode 4 is connected to a source (e.g., ground) having a low reference potential GND. The node FD _ IR is coupled to the cathode electrode 22 of the infrared photodiode 2. The anode electrode 28 of the infrared photodiode 4 is coupled to a source having a reference potential V _ IR. One electrode may be provided to be coupled to the node FD _ R, FD _ G, FD _ B, FD _ IR and the other electrode to be coupled to a capacitor (not shown) of a source having a low reference potential GND. As a variant, the role of this capacitor can be achieved by the stray capacitance present at the node FD _ R, FD _ G, FD _ B, FD _ IR.

For each row of color sub-pixels associated with the same color, the signals SEL_R, SEL_G, SEL_B, RST_R, RST_G, RST_B may be transmitted to all the color sub-pixels in that row. For each row of infrared pixels, the signals SEL_IR, RST_IR and the potential V_IR may be transmitted to all the infrared pixels in the row. The reset potentials Vrst_R, Vrst_G, Vrst_B, Vrst_IR may be the same or different. According to one embodiment, the potentials Vrst_R, Vrst_G, Vrst_B are the same and the potential Vrst_IR is different from the potentials Vrst_R, Vrst_G, Vrst_B.

Fig. 6 is a timing diagram of the binary signals RST_IR, SEL_IR, RST_R, SEL_R, RST_G, SEL_G, RST_B, SEL_B and of the potential V_IR during an embodiment of the method of operating the readout circuits 6_R, 6_G, 6_B, 6_IR shown in fig. 5. Times t0 to t10 denote successive times of one operation cycle. The timing diagram is established for the case where the MOS transistors of the readout circuits 6_R, 6_G, 6_B, 6_IR are N-channel transistors.

At time t0, the signals SEL_IR, SEL_R, SEL_G and SEL_B are in the low state, so that the selection transistors 62_IR, 62_R, 62_G and 62_B are off. The cycle includes a phase of resetting the infrared pixel and the color sub-pixel associated with red. To this end, the signals RST_IR and RST_R are in the high state, so that the reset transistors 70_IR and 70_R are on. The charges accumulated in the infrared photodiode 2 are then discharged to the source of potential Vrst_IR, and the charges accumulated in the color photodiode 4 of the color sub-pixel associated with red are discharged to the source of potential Vrst_R.

Just before time t1, the potential V_IR is set to the low level. At time t1, which marks the start of a new cycle, the signal RST_IR is set to the low state so that transistor 70_IR turns off, and the signal RST_R is set to the low state so that transistor 70_R turns off. An integration phase then starts for the infrared photodiode 2, during which charges are generated and collected in the photodiode 2, and for the photodiode 4 of the color sub-pixel associated with red, during which charges are generated and collected in the photodiode 4. At time t2, the signal RST_G is set to the low state, so that transistor 70_G turns off; the integration phase then starts for the photodiode 4 of the color sub-pixel associated with green. At time t3, the signal RST_B is set to the low state, so that transistor 70_B turns off; the integration phase then starts for the photodiode 4 of the color sub-pixel associated with blue.

At time t4, the potential V_IR is set to the high level, which stops the charge collection in the infrared photodiode 2. The integration phase of the infrared photodiode 2 is thus stopped.

At time t5, the signal SEL_R is temporarily set to the high state, so that the potential of the conductive track 68 reaches a value representative of the voltage at the node FD_R, and thus of the amount of charge stored in the photodiode 4 of the color sub-pixel associated with red. The integration phase of the photodiode 4 of the color sub-pixel associated with red thus extends from time t1 to time t5. At time t6, the signal SEL_G is temporarily set to the high state, so that the potential of the conductive track 68 reaches a value representative of the voltage at the node FD_G, and thus of the amount of charge stored in the photodiode 4 of the color sub-pixel associated with green. The integration phase of the photodiode 4 associated with green thus extends from time t2 to time t6. At time t7, the signal SEL_B is temporarily set to the high state, so that the potential of the conductive track 68 reaches a value representative of the voltage at the node FD_B, and thus of the amount of charge stored in the photodiode 4 of the color sub-pixel associated with blue. The integration phase of the photodiode 4 of the color sub-pixel associated with blue thus extends from time t3 to time t7. At time t8, the signal SEL_IR is temporarily set to the high state, so that the potential of the conductive track 68 reaches a value representative of the voltage at the node FD_IR, and thus of the amount of charge stored in the infrared photodiode 2. At time t9, the signals RST_IR and RST_R are set to the high state. Time t10 marks the end of the cycle and corresponds to time t1 of the next cycle.
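The integration windows read off the timing diagram above can be tabulated as in the hedged sketch below; the time values are symbolic labels (t1, t2, ...) mapped to arbitrary example instants, since the disclosure gives no absolute timings.

```python
# Hedged sketch: integration windows implied by the timing diagram of fig. 6.
# The instants below are arbitrary example values (in microseconds); only their ordering matters.
t = {"t1": 0.0, "t2": 5.0, "t3": 10.0, "t4": 100.0, "t5": 105.0, "t6": 110.0, "t7": 115.0}

integration_windows = {
    "IR (global)": (t["t1"], t["t4"]),   # ended by V_IR, simultaneously for all IR photodiodes
    "R":           (t["t1"], t["t5"]),   # ended by the readout through SEL_R
    "G":           (t["t2"], t["t6"]),
    "B":           (t["t3"], t["t7"]),
}

for name, (start, end) in integration_windows.items():
    print(f"{name:12s} integrates from {start:6.1f} to {end:6.1f} (duration {end - start:.1f})")
```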

As shown in fig. 6, the integration phases of the color photodiodes of the sub-pixels associated with the same pixel of the color image to be acquired are shifted in time. This corresponds to a rolling shutter type readout method for the color photodiodes, in which the integration phases of the rows of pixels are temporally offset with respect to one another. Furthermore, since the integration phase of the infrared photodiodes 2 is controlled by the potential V_IR, the present embodiment advantageously enables a global shutter type readout method to be implemented for the acquisition of infrared images, in which the integration phases of all the infrared photodiodes take place simultaneously.

In the case where the image sensor has the structure shown in figs. 3 and 4, or the structure shown in figs. 1 and 2 with a block 36 that does not block visible light, the infrared photodiode 2 can absorb both near infrared radiation and visible light. In this case, to determine the amount of charge generated in the infrared photodiode during the integration phase by infrared radiation only, the signals provided by the color photodiodes 4 of the sub-pixels associated with the same image pixel may be subtracted from the signal provided by the infrared photodiode 2. The integration phases of the color sub-pixels are then preferably carried out simultaneously with the integration phase of the infrared photodiode 2. Each readout circuit 6_R, 6_G, 6_B, 6_IR shown in fig. 5 then further includes a MOS transfer transistor located between the node FD_R, FD_G, FD_B, FD_IR and the cathode electrode of the photodiode 4, 2. The transfer transistor enables the start and the end of the integration phase of the color photodiodes to be controlled, so that a global shutter type readout method can also be implemented for the acquisition of the color image.

Various embodiments and variants have been described. Those skilled in the art will understand that certain features of these embodiments may be combined, and other variants will readily occur to those skilled in the art. In particular, the structure shown in fig. 2, in which the electrode 28 also covers the photodiodes 4, may be implemented for the image sensor 50 shown in fig. 4. Further, in the case where each readout circuit 6_R, 6_G, 6_B, 6_IR shown in fig. 5 further includes a MOS transfer transistor between the node FD_R, FD_G, FD_B, FD_IR and the cathode electrode of the photodiode 4, 2, a readout method may be provided in which a first value V1 representative of the potential of the node FD_R, FD_G, FD_B, FD_IR is read immediately after the reset transistor 70_R, 70_G, 70_B, 70_IR is turned on, and a second value V2 representative of the potential of the node FD_R, FD_G, FD_B, FD_IR is read immediately after the transfer transistor is turned on. The difference between the values V2 and V1 is representative of the amount of charge stored in the photodiode while suppressing the thermal noise due to the reset transistor 70_R, 70_G, 70_B, 70_IR. Finally, the practical implementation of the embodiments and variants described herein is within the abilities of those skilled in the art based on the functional indications given above.
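A minimal sketch of that correlated double sampling step (the difference V2 - V1) is given below; the sample voltages and the conversion gain are illustrative assumptions only, not values from this disclosure.

```python
# Hedged sketch: correlated double sampling as described above.
# V1 is sampled right after reset, V2 right after charge transfer; their difference
# suppresses the reset-noise component common to both samples.
def cds_signal(v1_after_reset: float, v2_after_transfer: float) -> float:
    """Signal representative of the stored charge, with reset noise suppressed."""
    return v2_after_transfer - v1_after_reset

def to_electrons(delta_v: float, conversion_gain_uv_per_e: float = 80.0) -> float:
    """Convert the CDS voltage difference to electrons (assumed conversion gain)."""
    return abs(delta_v) * 1e6 / conversion_gain_uv_per_e

dv = cds_signal(v1_after_reset=2.80, v2_after_transfer=2.40)  # assumed sample values
print(f"CDS output: {dv:.2f} V ~ {to_electrons(dv):.0f} e-")  # -> -0.40 V ~ 5000 e-
```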
