Image sensor with multi-pixel detector and partial isolation structure

Document No. 408886 | Publication date: 2021-12-17

Note: This technology, "Image sensor with multi-pixel detector and partial isolation structure," was designed and created by 艾群咏, 渡边一史, 熊志伟, and V·瓦乃兹艾 on 2021-06-15. The main content is as follows: The application relates to an image sensor having a multi-pixel detector and a partial isolation structure. A multi-pixel detector of an image sensor is described. The multi-pixel detector includes a first photodiode region disposed within a semiconductor substrate to form a first pixel, a second photodiode region disposed within the semiconductor substrate to form a second pixel adjacent to the first pixel, and a partial isolation structure extending between the first photodiode region and the second photodiode region from a first side of the semiconductor substrate toward a second side of the semiconductor substrate. A length of a lateral portion of the partial isolation structure between the first photodiode region and the second photodiode region is less than a lateral length of the first photodiode region.

1. A multi-pixel detector of an image sensor, the multi-pixel detector comprising:

a first photodiode region disposed within a semiconductor substrate to form a first pixel;

a second photodiode region disposed within the semiconductor substrate to form a second pixel adjacent to the first pixel;

a partial isolation structure extending between the first photodiode region and the second photodiode region from a first side of the semiconductor substrate toward a second side of the semiconductor substrate, and wherein a length of a lateral portion of the partial isolation structure between the first photodiode region and the second photodiode region is less than a lateral length of the first photodiode region.

2. The multi-pixel detector of claim 1, wherein the first photodiode region, the second photodiode region, and the partial isolation structure are positioned in the semiconductor substrate such that a first cross section of the multi-pixel detector taken along a first direction extends through the first photodiode region, the second photodiode region, and the partial isolation structure, and a second cross section of the multi-pixel detector taken along a second direction parallel to the first direction extends through the first photodiode region and the second photodiode region, but does not extend through the partial isolation structure.

3. The multi-pixel detector of claim 1, further comprising:

a deep trench isolation structure collectively laterally surrounding the first photodiode region and the second photodiode region, and wherein the partial isolation structure is coupled to the deep trench isolation structure.

4. The multi-pixel detector of claim 3, wherein the deep trench isolation structure and the partial isolation structure extend a first depth and a second depth, respectively, from the first side of the semiconductor substrate toward the second side of the semiconductor substrate, and wherein the first depth is substantially equal to the second depth.

5. The multi-pixel detector of claim 1, further comprising:

a third photodiode region disposed within the semiconductor substrate to form a third pixel; and

a fourth photodiode region disposed within the semiconductor substrate to form a fourth pixel adjacent to the third pixel, and wherein the first, second, third, and fourth pixels are arranged in a two-by-two array of pixels to form a phase detection autofocus sensor included in the image sensor.

6. The multi-pixel detector of claim 5, further comprising:

a deep trench isolation structure laterally surrounding the first, second, third, and fourth photodiode regions.

7. The multi-pixel detector of claim 6, further comprising:

a multi-finger isolation structure extending from the first side of the semiconductor substrate toward the second side of the semiconductor substrate, the multi-finger isolation structure comprising:

a first partial isolation structure disposed between the first photodiode region and the second photodiode region, wherein the first partial isolation structure corresponds to the partial isolation structure;

a second partial isolation structure disposed between the second photodiode region and the third photodiode region;

a third partial isolation structure disposed between the third photodiode region and the fourth photodiode region; and

a fourth partial isolation structure disposed between the fourth photodiode region and the first photodiode region.

8. The multi-pixel detector of claim 7, wherein the first, second, third, and fourth partial isolation structures each extend laterally from the deep trench isolation structure toward a central region of the multi-pixel detector located between the first, second, third, and fourth photodiode regions, and wherein each of the first, second, third, and fourth partial isolation structures does not extend into the central region.

9. The multi-pixel detector of claim 7, wherein the first partial isolation structure and the third partial isolation structure are aligned along a first common direction, wherein the second partial isolation structure and the fourth partial isolation structure are aligned along a second common direction, and wherein the first common direction is perpendicular to the second common direction.

10. The multi-pixel detector of claim 7, wherein a first lateral length of the first partial isolation structure is less than a corresponding lateral length of at least one of the second partial isolation structure, the third partial isolation structure, or the fourth partial isolation structure.

11. The multi-pixel detector of claim 5, further comprising:

a common microlens optically aligned with the first, second, third, and fourth photodiode regions.

12. An image sensor comprising a plurality of repeating units, each repeating unit included in the plurality of repeating units comprising:

a first quadrant including a multi-pixel detector to collect phase information for phase detection autofocus, wherein the multi-pixel detector includes:

a deep trench isolation structure forming a perimeter boundary of the first quadrant;

a first photodiode region, a second photodiode region, a third photodiode region, and a fourth photodiode region arranged together in two rows and two columns to form a two-by-two array of pixels laterally surrounded by the deep trench isolation structure; and

a multi-finger isolation structure including four partial isolation structures, each disposed between a respective pair of adjacent photodiode regions included in the two-by-two array of pixels, and wherein each of the four partial isolation structures extends from the deep trench isolation structure toward a center of the two-by-two array of pixels without directly contacting each other.

13. The image sensor of claim 12, wherein the plurality of repeating units comprises a first repeating unit, wherein the four partial isolation structures of the first repeating unit comprise a first partial isolation structure, a second partial isolation structure, a third partial isolation structure, and a fourth partial isolation structure, and wherein a first lateral length of the first partial isolation structure is less than a corresponding lateral length of at least one of the second partial isolation structure, the third partial isolation structure, or the fourth partial isolation structure.

14. The image sensor of claim 13, wherein the plurality of repeating units includes a second repeating unit positioned centrally within the plurality of repeating units, and wherein each of the four partial isolation structures of the second repeating unit has a common lateral length that is greater than the lateral length of the first partial isolation structure included in the first repeating unit.

15. The image sensor of claim 12, wherein lateral lengths of the four partial isolation structures included in each of the plurality of repeating units are adjusted such that the multi-pixel detector of each of the plurality of repeating units has substantially equal angular selectivity to incident light with respect to exposure time throughout the image sensor.

16. The image sensor of claim 12, wherein the first quadrant of each of the plurality of repeating units further includes:

a common microlens optically aligned with the first, second, third, and fourth photodiode regions.

17. The image sensor of claim 16, wherein each of the plurality of repeating units further includes other quadrants, wherein each of the other quadrants includes a two-by-two array of image pixels, and wherein each of the image pixels is optically coupled with an individual microlens.

18. The image sensor of claim 17, wherein the multi-pixel detector of a first repeating unit included in the plurality of repeating units is laterally surrounded by the image pixels of the first repeating unit included in the plurality of repeating units and an adjacent repeating unit, and wherein the common microlens of the first repeating unit is laterally surrounded by the individual microlenses included in the first repeating unit and the adjacent repeating unit.

19. The image sensor of claim 16, wherein the common microlens of a given repeating unit included in the plurality of repeating units is optically offset based on a position of the given repeating unit within the image sensor, and wherein a degree of offset is adjusted such that the multi-pixel detector included in each of the plurality of repeating units has a substantially equal angular selectivity to incident light with respect to exposure time throughout the image sensor.

20. The image sensor of claim 19, wherein the degree of offset increases from a midpoint of the image sensor toward a perimeter of the image sensor.

Technical Field

The present disclosure relates generally to image sensors and, in particular, but not exclusively, to CMOS image sensors and applications thereof.

Background

Image sensors have become ubiquitous and are now widely used in digital cameras, cell phones, surveillance cameras, and medical, automotive and other applications. As image sensors are integrated into a wider range of electronic devices, it is desirable to enhance their functionality, performance metrics, and the like in as many ways as possible (e.g., resolution, power consumption, dynamic range, etc.) through both device architecture design and image acquisition processing.

Typical image sensors operate in response to image light reflected from an external scene being incident on the image sensor. An image sensor includes a pixel array having a photosensitive element (e.g., a photodiode) that absorbs a portion of incident image light and generates image charge after absorbing the image light. The image charge of each of the pixels can be measured as the output voltage of each photosensitive element, which varies as a function of the incident image light. In other words, the amount of image charge generated is proportional to the intensity of the image light used to produce the digital image (i.e., image data) representing the external scene.
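The proportional relationship described above can be illustrated with a toy readout model (a minimal sketch; the quantum efficiency, full-well capacity, and conversion gain values are illustrative assumptions, not parameters of this disclosure):

```python
# Toy model of a pixel's photoresponse: the accumulated image charge is
# proportional to the incident intensity until the full-well capacity
# clips it, and readout measures the charge as a proportional voltage.
def image_charge(intensity_photons_per_s, exposure_s,
                 quantum_efficiency=0.6, full_well_e=10_000):
    electrons = intensity_photons_per_s * exposure_s * quantum_efficiency
    return min(electrons, full_well_e)  # saturates at the full well

def output_voltage(charge_e, conversion_gain_uv_per_e=50.0):
    return charge_e * conversion_gain_uv_per_e * 1e-6  # volts

charge = image_charge(intensity_photons_per_s=50_000, exposure_s=0.01)
print(charge)                           # 300.0 (electrons)
print(round(output_voltage(charge), 6))  # 0.015 (volts)
```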

Disclosure of Invention

In one aspect, the present application provides a multi-pixel detector of an image sensor, the multi-pixel detector comprising: a first photodiode region disposed within a semiconductor substrate to form a first pixel; a second photodiode region disposed within the semiconductor substrate to form a second pixel adjacent to the first pixel; a partial isolation structure extending between the first photodiode region and the second photodiode region from a first side of the semiconductor substrate toward a second side of the semiconductor substrate, and wherein a length of a lateral portion of the partial isolation structure between the first photodiode region and the second photodiode region is less than a lateral length of the first photodiode region.

In another aspect, the present application provides an image sensor including a plurality of repeating units, each repeating unit included in the plurality of repeating units comprising: a first quadrant including a multi-pixel detector to collect phase information for phase detection autofocus, wherein the multi-pixel detector includes: a deep trench isolation structure forming a perimeter boundary of the first quadrant; a first photodiode region, a second photodiode region, a third photodiode region, and a fourth photodiode region arranged together in two rows and two columns to form a two-by-two array of pixels laterally surrounded by the deep trench isolation structure; and a multi-finger isolation structure including four partial isolation structures, each disposed between a respective pair of adjacent photodiode regions included in the two-by-two array of pixels, and wherein each of the four partial isolation structures extends from the deep trench isolation structure toward a center of the two-by-two array of pixels without directly contacting each other.

Drawings

Non-limiting and non-exhaustive embodiments of the present invention are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various views unless otherwise specified. Not every instance of an element is necessarily labeled, where appropriate, to avoid obscuring the figure. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles described.

FIG. 1A illustrates a top view of an image sensor having a multi-pixel detector and partial isolation structures according to the teachings of the present disclosure.

FIG. 1B illustrates a top view of a repeating unit included in an image sensor having a multi-pixel detector and partial isolation structures according to the teachings of the present disclosure.

FIG. 1C illustrates an enlarged top view of a portion of an isolation structure included in a multi-pixel detector according to the teachings of the present disclosure.

FIGS. 1D-1E illustrate partial cross-sectional views of an image sensor having a multi-pixel detector and partial isolation structures according to the teachings of the present disclosure.

FIG. 2A illustrates a partial cross-sectional view of an image sensor having a multi-pixel detector without partial isolation structures according to the teachings of the present disclosure.

FIG. 2B illustrates the angular selectivity with respect to exposure time of a multi-pixel detector without partial isolation structures according to the teachings of the present disclosure.

FIG. 3A illustrates a top view of an image sensor having a multi-pixel detector and partial isolation structures having varying lateral lengths based on relative position within the image sensor according to the teachings of the present disclosure.

FIG. 3B illustrates a top view of an image sensor having a multi-pixel detector, partial isolation structures, and offset common microlenses according to the teachings of the present disclosure.

FIG. 4 is a functional block diagram of an imaging system including an image sensor with a multi-pixel detector and partial isolation structures according to the teachings of the present disclosure.

Detailed Description

Embodiments of apparatus and systems each include or otherwise relate to an image sensor having a multi-pixel detector and a partial isolation structure. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the embodiments. One skilled in the relevant art will recognize, however, that the techniques described herein may be practiced without one or more of the specific details, or with other methods, components, materials, and so forth. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring certain aspects.

Reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.

Throughout this specification, several terms of art are used. These terms will have their ordinary meaning in the art to which they pertain unless explicitly defined herein or otherwise clearly indicated by the context in which they are used. It should be noted that element names and symbols may be used interchangeably throughout this document (e.g., Si vs. silicon); however, both have the same meaning.

Embodiments described herein utilize an image sensor having an architecture that may include a plurality of repeating units, each of which is structured to include a multi-pixel detector to collect phase information representative of incident light of an external scene (e.g., for phase detection autofocus). In some embodiments, the multi-pixel detector can include a plurality of photodiode regions arranged in rows and columns to form a two-by-two array of pixels. Advantageously, the multi-pixel detector includes one or more partial isolation structures disposed between the photodiode regions of the two-by-two array of pixels, which provides improved light sensitivity and angular selectivity. In general, the partial isolation structures of a given two-by-two array of pixels may collectively be referred to as a multi-finger isolation structure. In some embodiments, the lateral length of the fingers (i.e., the individual partial isolation structures) of a given multi-finger isolation structure may be adjusted to homogenize the angular selectivity of the multi-pixel detectors throughout the image sensor, which may be used to improve phase detection autofocus under varying illumination conditions. In the same or other embodiments, each of the two-by-two arrays of pixels may be respectively aligned with a common microlens having a variable degree of offset depending on position within the image sensor to further adjust (i.e., homogenize) the angular selectivity and light sensitivity of the multi-pixel detectors included in the image sensor.
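The position-dependent adjustment of finger lateral lengths can be sketched as a simple model. The linear taper with radial distance and all dimensions below are illustrative assumptions; the disclosure does not prescribe a specific function:

```python
import math

# Illustrative model (not from this disclosure): taper the lateral length
# of each finger linearly with a repeating unit's radial distance from the
# sensor midpoint, so centrally located detectors keep longer fingers than
# detectors near the perimeter.
def finger_length_um(unit_xy, center_xy, half_diagonal,
                     center_len_um=0.5, edge_len_um=0.3):
    r = math.hypot(unit_xy[0] - center_xy[0],
                   unit_xy[1] - center_xy[1]) / half_diagonal
    r = min(r, 1.0)  # 0 at the midpoint, 1 at the sensor corner
    return center_len_um + (edge_len_um - center_len_um) * r

print(finger_length_um((0, 0), (0, 0), 100.0))    # 0.5 at the midpoint
print(finger_length_um((60, 80), (0, 0), 100.0))  # 0.3 at the perimeter
```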

FIGS. 1A-1E illustrate representative views of an image sensor 100 including a multi-pixel detector and partial isolation structures. It should be understood that the views presented in FIGS. 1A-1E may omit certain elements of the image sensor 100 to avoid obscuring the details of the present disclosure. In other words, not all elements of the image sensor 100 may be labeled, illustrated, or otherwise shown within each individual one of FIGS. 1A-1E. It should be further understood that in some embodiments, the image sensor 100 may not necessarily include all of the elements shown in FIGS. 1A-1E. Further, it should also be noted that the various views or partial views of the image sensor 100 in FIGS. 1A-1E are illustrated with respect to a coordinate system 195 formed by axes x, y, and z, where the x-y plane is parallel to the planarized first side 150 of the semiconductor substrate 101 (e.g., as shown in FIGS. 1D-1E).

FIG. 1A illustrates a top view 100-A of an image sensor 100 having a multi-pixel detector and partial isolation structure according to the teachings of the present disclosure. The image sensor 100 includes a plurality of repeating units 103 disposed within a semiconductor substrate 101 (e.g., a silicon wafer). Each repeating unit includes pixels arranged in an array (e.g., four rows and four columns) formed at least in part by photodiode regions 105 disposed within a semiconductor substrate 101. The photodiode region 105 is a region of the semiconductor substrate 101 that has been doped or otherwise modified to facilitate the generation of image charges for light sensing (e.g., accumulated charges generated proportionally in response to the magnitude or intensity of incident light). More particularly, image charge may be extracted or otherwise measured (e.g., by readout circuitry) for collecting phase difference information, generating images representative of external scenes, and the like.

The repeating units 103 of the image sensor 100 are positioned within the semiconductor substrate 101 in a regular, repeating manner (e.g., as an array of repeating units) to collectively form the image sensor 100. It should be understood that each repeating unit of the image sensor 100 may include a multi-pixel detector for collecting phase information and a plurality of image pixels for image generation (see, e.g., FIG. 1B). However, in other embodiments, other repeating units (e.g., repeating units without a multi-pixel detector) may also be included in the image sensor 100. In other words, the number of multi-pixel detectors within the image sensor 100 may be customized according to a target specification. Accordingly, it should be understood that in some embodiments, the illustrated repeating unit 103 may be the smallest repeating unit of the image sensor 100. However, the illustrated embodiments of the image sensor 100 should not be considered limiting, and other configurations or architectures may be utilized in accordance with the teachings of the present disclosure. Further, it should be understood that certain elements may be omitted to avoid obscuring certain aspects of the disclosure. For example, the illustrated embodiment of the image sensor 100 included in FIG. 1A may further include one or more microlenses, color filters, metal grids, circuitry, and the like, which are omitted for clarity.

FIG. 1B illustrates a top view of a repeating unit 103 included in the image sensor 100 of FIG. 1A, according to the teachings of the present disclosure. The illustrated view of the repeating unit 103 in FIG. 1B may represent each repeating unit included in the image sensor 100 along an x-y plane of the coordinate system 195. The repeating unit 103 includes a first quadrant 107, a second quadrant 109, a third quadrant 111, and a fourth quadrant 113, each including a plurality of photodiode regions 105 (e.g., 105-1, 105-2, 105-3, and 105-4) arranged in rows and columns to form a two-by-two array of pixels. It should be understood that each pixel may be referenced with respect to a given photodiode region 105. For example, a first pixel included in the first quadrant 107 may be referenced with respect to the photodiode region 105-1. A given pixel may be defined as an element of the image sensor 100 capable of generating an image charge in response to incident light; the image charge can be measured or otherwise quantified, for example, in the form of voltage and/or current measurements proportional to the intensity or power of light incident on the given pixel.

In the illustrated embodiment, the first quadrant 107 forms a multi-pixel detector included in an image sensor capable of collecting phase information of incident light (e.g., a sensor for phase detection autofocus). The first quadrant 107 includes a first photodiode region 105-1, a second photodiode region 105-2, a third photodiode region 105-3, and a fourth photodiode region 105-4, each forming respective first, second, third, and fourth pixels that collectively form the multi-pixel detector. Based on the arrangement of the photodiode regions 105, the first pixel (i.e., photodiode region 105-1) is laterally adjacent to the second pixel (i.e., photodiode region 105-2) and the fourth pixel (i.e., photodiode region 105-4), the second pixel is laterally adjacent to the first pixel and the third pixel (i.e., photodiode region 105-3), the third pixel is laterally adjacent to the fourth pixel and the second pixel, and the fourth pixel is laterally adjacent to the third pixel and the first pixel. It should be understood that the terms "lateral" or "laterally" refer to a direction substantially parallel to the x-y plane of the coordinate system 195. Further, it should be noted that the first pixel and the third pixel are diagonally adjacent to each other. Similarly, the second and fourth pixels are diagonally adjacent to each other.

As illustrated in FIG. 1B, the photodiode regions 105 of the first quadrant 107 are collectively laterally surrounded by a deep trench isolation structure 115. It should be understood that the deep trench isolation structure 115 forms a perimeter boundary that defines the first quadrant 107. The multi-pixel detector further includes a multi-finger isolation structure including four fingers (e.g., partial isolation structures 117, 119, 121, and 123) coupled to (e.g., in direct contact with) the deep trench isolation structure 115 and disposed between neighboring photodiode regions 105 (e.g., the first partial isolation structure 117 is disposed between the first photodiode region 105-1 and the second photodiode region 105-2 and extends along the z-axis of the coordinate system 195). It is understood that the partial isolation structures 117, 119, 121, and 123 may be part of a commonly connected structure (e.g., one or more of the partial isolation structures 117, 119, 121, and/or 123 extend from the deep trench isolation structure 115). In one embodiment, one or more of the partial isolation structures 117, 119, 121, and 123 extend laterally from the deep trench isolation structure 115 toward, but not into, a center point or region of the multi-pixel detector (e.g., a region centered between the photodiode regions 105-1, 105-2, 105-3, and 105-4). In other words, the partial isolation structures 117, 119, 121, and 123 are not in direct contact with each other. In another embodiment, the partial isolation structures 117, 119, 121, and 123 may not extend from the deep trench isolation structure 115 (e.g., a portion of the semiconductor substrate may be disposed between a partial isolation structure and the deep trench isolation structure).

In one embodiment, the pairs of partial isolation structures may be parallel to each other and/or perpendicular to each other. For example, the partial isolation structures 117 and 121 are oriented along a first common direction (e.g., a direction parallel to the x-axis), while the partial isolation structures 119 and 123 are oriented along a second common direction perpendicular to the first common direction (e.g., a direction parallel to the y-axis). In the illustrated embodiment, each of the partial isolation structures 117, 119, 121, and 123 has a common lateral length (e.g., distance traversed in the x-y plane). However, in other embodiments, one or more of the partial isolation structures 117, 119, 121, and 123 may have different lateral lengths (e.g., the partial isolation structure 117 may have a lateral length that is less than the lateral lengths of the partial isolation structures 119, 121, and 123).
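The geometry described above, four fingers extending inward from the deep trench isolation perimeter without meeting at the center, can be sketched as follows. All dimensions and the square-quadrant layout are illustrative assumptions:

```python
# Illustrative geometry (all dimensions assumed): a square quadrant with
# its deep trench isolation (DTI) perimeter at x, y = +/-1.0 um, and four
# fingers extending inward from the midpoint of each DTI edge along the
# pixel boundaries, stopping short of a central keep-out region so that
# no two fingers touch.
HALF = 1.0          # quadrant half-width (DTI walls at +/-HALF)
FINGER_LEN = 0.6    # lateral length of each finger
KEEPOUT = 0.2       # radius of the central region fingers must not enter

# Each finger: (start point on the DTI, inward tip), lying on the
# boundaries between adjacent photodiode regions (x = 0 or y = 0 lines).
fingers = [
    ((-HALF, 0.0), (-HALF + FINGER_LEN, 0.0)),
    ((0.0, HALF), (0.0, HALF - FINGER_LEN)),
    ((HALF, 0.0), (HALF - FINGER_LEN, 0.0)),
    ((0.0, -HALF), (0.0, -HALF + FINGER_LEN)),
]

# Every tip stays outside the keep-out region, so the fingers never meet.
assert all((tx * tx + ty * ty) ** 0.5 >= KEEPOUT for _, (tx, ty) in fingers)
print("all four fingers stop short of the central region")
```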

The other quadrants (e.g., the second quadrant 109, the third quadrant 111, and the fourth quadrant 113) of the repeating unit 103 each include a two-by-two array of image pixels (e.g., defined by the photodiode regions 105 within the respective quadrant). Each of the image pixels is optically coupled with an individual microlens and a color filter to generate an image of the external scene. For example, the quadrants 109, 111, and 113 may have red, green, and blue color filters, respectively, which may provide color and intensity information for a portion of an external scene.

FIG. 1C illustrates an enlarged top view of a portion of the isolation structures included in the multi-pixel detector of the first quadrant 107 shown in FIG. 1B, according to the teachings of the present disclosure. More specifically, FIG. 1C is a view of the multi-pixel detector oriented in the x-y plane of the coordinate system 195 and includes the first photodiode region 105-1, the second photodiode region 105-2, the deep trench isolation structure 115, the first partial isolation structure 117, the second partial isolation structure 119, and the fourth partial isolation structure 123 disposed within the semiconductor substrate 101.

As illustrated, the first photodiode region 105-1 is disposed adjacent to the second photodiode region 105-2. The first partial isolation structure 117 is disposed between the first photodiode region 105-1 and the second photodiode region 105-2. The length 133 of the lateral portion of the first partial isolation structure 117 disposed between the first photodiode region 105-1 and the second photodiode region 105-2 is less than the lateral length 131 of the first photodiode region 105-1. More specifically, the first photodiode region 105-1, the second photodiode region 105-2, and the first partial isolation structure 117 are positioned in the semiconductor substrate 101 such that a first cross section of the multi-pixel detector taken along a first direction AA ' extends through the first photodiode region 105-1, the second photodiode region 105-2, and the first partial isolation structure 117, while a second cross section of the multi-pixel detector taken along a second direction BB ' parallel to the first direction AA ' extends through the first photodiode region 105-1 and the second photodiode region 105-2, but does not pass through the first partial isolation structure 117.
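The distinction between the A-A' and B-B' cross sections reduces to whether a given cross-section plane falls within the lateral span of the partial isolation structure. A minimal sketch (the 0.6 μm span is an assumed dimension, not taken from this disclosure):

```python
# Minimal sketch (assumed dimension): the first partial isolation structure
# spans x in [0.0, 0.6] um measured from the DTI wall. A cross section
# taken at constant x passes through the structure only if that x falls
# within the span, matching the A-A' versus B-B' distinction.
FINGER_X_SPAN = (0.0, 0.6)

def cuts_partial_isolation(cross_section_x, span=FINGER_X_SPAN):
    lo, hi = span
    return lo <= cross_section_x <= hi

print(cuts_partial_isolation(0.3))  # True: like A-A', cuts the structure
print(cuts_partial_isolation(1.2))  # False: like B-B', misses it
```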

It is understood that the critical dimensions of both the partial isolation structure 117 and the deep trench isolation structure 115 may be the same (e.g., 0.2 μm) or different. The critical dimension may be referred to as a width or a thickness and may correspond to a distance traversed by an element perpendicular to a longitudinal direction of the element. For example, the critical dimension of the partial isolation structure 117 may correspond to a distance traversed by the partial isolation structure 117 in a direction of a y-axis of the coordinate system 195 perpendicular to a longitudinal direction of the partial isolation structure 117 (i.e., a direction of an x-axis of the coordinate system 195).

FIGS. 1D and 1E illustrate partial cross-sectional views 100-AA' and 100-BB' of the image sensor 100 shown in FIG. 1B, taken along lines A-A' and B-B' of FIG. 1C, respectively, according to the teachings of the present disclosure. More particularly, representative cross-sectional views of the first quadrant 107 and the second quadrant 109 of the repeating unit 103 along a y-z plane of the coordinate system 195 are illustrated. The first quadrant 107 includes a multi-pixel detector 151 capable of collecting phase information of the external scene (e.g., for phase detection autofocus), while the second quadrant 109 includes a set of adjacent image pixels (including image pixels 153-1 and 153-2) for imaging the external scene.

The multi-pixel detector 151 of the first quadrant 107 includes a photodiode region 105, a deep trench isolation structure 115, a partial isolation structure 117, a color filter 139, a common microlens 141, and a metal grid 145. The partial isolation structure 117 extends from the first side 135 of the semiconductor substrate 101 toward the second side 137 of the semiconductor substrate (i.e., in the z-axis direction of the coordinate system 195) between the first photodiode region 105-1 and the second photodiode region 105-2. The deep trench isolation structure 115 (e.g., portions of the deep trench isolation structures 115-3 and/or 115-4) and the partial isolation structure 117 extend a first depth and a second depth, respectively, from a first side 135 (e.g., a backside) of the semiconductor substrate 101 toward a second side 137 (e.g., a front side) of the semiconductor substrate 101. As illustrated, the first depth is substantially equal to the second depth because the deep trench isolation structure 115 and the partial isolation structure 117 extend an equal depth into the semiconductor substrate 101 in a direction along the z-axis.

The common microlens 141 of the multi-pixel detector 151 is optically aligned with each of the photodiode regions 105 included in the first quadrant 107 (e.g., the first photodiode regions 105-1, 105-2, 105-3, and 105-4 of the first quadrant 107 in the repeating unit 103 illustrated in fig. 1B) and the individual color filter 139 (e.g., a green color filter). More specifically, the common microlens 141 is shaped to direct incident light through the color filter 139 toward the photodiode region 105 included in the first quadrant 107. Light incident on the photodiode region 105 of the multi-pixel detector 151 generates image charge that can be quantified in the form of voltage or current measurements. Measurements obtained between adjacent pixels in the multi-pixel detector 151 may be compared to collect or otherwise determine phase information (e.g., for auto-focusing).
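The comparison of measurements from adjacent pixels under the common microlens can be sketched as follows. This is an illustrative sketch, not the patent's method; the function name and the simple normalized-difference metric are assumptions for demonstration.

```python
# Illustrative sketch: comparing signals from two adjacent photodiode
# regions under a common microlens to derive a phase (focus-error)
# estimate. The normalized-difference metric is a hypothetical choice.

def phase_signal(left_response, right_response):
    """Return a normalized left/right imbalance in [-1, 1].

    A value near 0 suggests the point of interest is in focus; the sign
    indicates the direction in which the focal plane is offset.
    """
    total = left_response + right_response
    if total == 0:
        return 0.0
    return (left_response - right_response) / total

# In focus: both halves of the common microlens illuminate the two
# photodiode regions equally.
print(phase_signal(100.0, 100.0))  # -> 0.0

# Out of focus: incident light favors one photodiode region.
print(phase_signal(150.0, 50.0))   # -> 0.5
```

A downstream autofocus loop could drive the objective lens until this imbalance approaches zero.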

As illustrated, the partial isolation structure 117 extends laterally (e.g., in a direction parallel to the x-axis of the coordinate system 195) between the first photodiode region 105-1 and the second photodiode region 105-2 of the multi-pixel detector 151. However, as shown in view 100-BB' of FIG. 1E, the partial isolation structure does not extend across the entire length of the first and second photodiode regions 105-1 and 105-2. The length of the partial isolation structure (e.g., the distance traversed in the direction of the x-axis by the partial isolation structure 117) is tailored to provide a balance between optical sensitivity and selectivity. The partial isolation structure may mitigate or reduce electrical crosstalk (e.g., blooming) between adjacent photodiode regions (e.g., between photodiode regions 105-1 and 105-2). In some embodiments, the partial isolation structure may also partially attenuate and/or reflect incident light.

Second quadrant 109 includes image pixels 153-1 and 153-2, which image pixels 153-1 and 153-2 may be substantially similar or identical to other image pixels included in the second quadrant (e.g., second quadrant 109 includes a two-by-two array of image pixels, as illustrated in FIG. 1B). As illustrated, each image pixel 153 includes an individual microlens optically aligned with a photodiode region (e.g., microlens 143-1 is optically aligned with photodiode region 105-1 of second quadrant 109). Each of the microlenses, such as common microlens 141 and/or individual microlenses 143, can be formed of a polymer (e.g., polymethylmethacrylate, polydimethylsiloxane, etc.) or other material, and shaped to have optical power for converging, dispersing, or otherwise directing light incident on the microlens through a corresponding optically aligned one of the plurality of color filters 139 to a respective one of the plurality of photodiode regions 105.

The color filter 139 has a corresponding spectral light response that describes the portion of the electromagnetic spectrum that the color filter transmits (e.g., the color filter 139-R transmits "red" light while reflecting or attenuating portions of the electromagnetic spectrum outside the "red" color). Disposed between the color filters 139 is a metal grid 145 that separates the color filters having different spectral light responses and reflects light incident on the metal grid 145 toward adjacent photodiode regions 105.

It should be understood that, when viewing FIG. 1D or 1E relative to FIGS. 1A-1C, the multi-pixel detector 151 in a given repeating unit 103 is laterally surrounded by the plurality of image pixels 153 included in the given repeating unit 103 and in adjacent repeating units. Thus, the common microlens 141 of a given repeating unit is also laterally surrounded by the individual microlenses 143 included in the given repeating unit and in adjacent repeating units.

Referring back to FIG. 1B, it should be understood that each repeating unit 103 includes different color filters to generate a "full color" image of the external scene. For example, the color filters of the first, second, third, and fourth quadrants 107, 109, 111, 113 may have spectral light responses of green, red, green, and blue, respectively, to generate "full color" information for a portion of the external scene. In some embodiments, image information is generated using only the quadrants of a given repeating unit that include image pixels (i.e., not the multi-pixel detector). However, in other embodiments, all quadrants may be used to collect image information (e.g., the multi-pixel detector may provide image information in addition to the phase information provided during different image acquisition steps). It should be understood that color filter patterns other than the quad Bayer pattern (i.e., BGGR) may be utilized in accordance with embodiments of the present disclosure.
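The quad Bayer layout described above can be sketched programmatically. This is a hedged illustration: the spatial arrangement of the quadrants within the 4x4 repeating unit is an assumption, since FIG. 1B is not reproduced here; only the quadrant colors (green, red, green, blue) come from the text.

```python
# Sketch of one 4x4 quad-Bayer repeating unit: each quadrant is a 2x2
# block sharing one color filter. The quadrant placement (G top-left,
# R top-right, G bottom-left, B bottom-right) is assumed for illustration.

def quad_bayer_unit():
    """Return a 4x4 grid of color-filter labels for one repeating unit."""
    quadrants = {(0, 0): "G", (0, 1): "R", (1, 0): "G", (1, 1): "B"}
    return [[quadrants[(row // 2, col // 2)] for col in range(4)]
            for row in range(4)]

for row in quad_bayer_unit():
    print(" ".join(row))
# G G R R
# G G R R
# G G B B
# G G B B
```

Tiling this unit across the sensor reproduces the repeating color filter pattern; in embodiments with a multi-pixel detector, one green quadrant per unit would additionally carry the common microlens.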

It is understood that the image sensor 100 of FIGS. 1A-1E can be fabricated by semiconductor device processing and microfabrication techniques known to those of ordinary skill in the art. In one embodiment, fabrication of the image sensor 100 may include providing a semiconductor substrate (e.g., a silicon wafer having a front side and a back side), forming a mask or template (e.g., formed from cured photoresist) on the front side of the semiconductor material via photolithography to provide a plurality of exposed regions of the front side of the semiconductor material, doping (e.g., via ion implantation, chemical vapor deposition, physical vapor deposition, and the like) the exposed portions of the semiconductor material to form the plurality of photodiode regions 105 extending into the semiconductor substrate 101 from the front side 137 of the semiconductor substrate 101, removing the mask or template (e.g., by dissolving the cured photoresist with a solvent), and planarizing (e.g., via chemical-mechanical planarization or polishing) the front side of the semiconductor substrate 101. In the same or another embodiment, photolithography can similarly be used to form the plurality of color filters 139 (e.g., via a cured pigmented polymer having a desired spectral light response), the plurality of common microlenses 141, the plurality of individual microlenses 143 (e.g., polymer-based microlenses of a target shape and size formed from a master mold or template), the metal grid 145, the deep trench isolation structures 115, and the partial isolation structures 117. It should be noted that the deep trench isolation structures 115 and the partial isolation structures 117 may be formed by etching trenches into the semiconductor substrate 101 and then filling the trenches with a target material (e.g., one or more dielectric materials, such as silicon dioxide) to form the corresponding structures.
In some embodiments, the trenches formed for the deep trench isolation structures 115 and/or the partial isolation structures 117 may be lined with one or more dielectric materials (e.g., silicon dioxide) and then further filled with another material (e.g., undoped polysilicon). It should be understood that the described techniques are merely illustrative and not exhaustive, and that other techniques may be utilized to fabricate one or more components of the image sensor 100.

Fig. 2A illustrates a partial cross-sectional view of an image sensor having a multi-pixel detector 251 without a partial isolation structure, according to the teachings of the present disclosure. The multi-pixel detector includes a semiconductor substrate 201, a first photodiode region 205-1, a second photodiode region 205-2, a deep trench isolation structure 215, a color filter 239, a common microlens 241, and a metal grid 245. It is noted that the multi-pixel detector 251 may correspond to the multi-pixel detector 151 illustrated in FIGS. 1D and 1E, except that the multi-pixel detector 251 does not include a partial isolation structure (e.g., the partial isolation structure 117 of FIG. 1D). By omitting the partial isolation structure, the multi-pixel detector 251 increases the light sensitivity of the photodiode regions at the expense of angular selectivity with respect to exposure time.

Fig. 2B illustrates the exposure-time-dependent angular selectivity of the multi-pixel detector 251 of FIG. 2A, which does not include a partial isolation structure, according to the teachings of the present disclosure. More particularly, FIG. 2B shows pixel response versus incident light angle (i.e., θ illustrated in FIG. 2A) in arbitrary units for a first exposure time (e.g., plot 270) and a second exposure time (e.g., plot 280). The y-axis of plots 270 and 280 is the pixel response, which corresponds to the measured signal (e.g., a voltage or current measurement) associated with the adjacent photodiode regions (e.g., photodiode regions 205-1 and 205-2). The x-axis of plots 270 and 280 corresponds to the incident light angle shown in FIG. 2A, where θ is the incident light angle measured from the normal of the multi-pixel detector. The solid and dashed lines of plots 270 and 280 correspond to the right photodiode region (e.g., photodiode region 205-2 of FIG. 2A) and the left photodiode region (e.g., photodiode region 205-1 of FIG. 2A), respectively.

Referring back to FIG. 2B, the first exposure time of plot 270 is greater than the second exposure time of plot 280. In the illustrated embodiment, the first and second exposure times correspond to durations of approximately 0.5 seconds and 0.07 seconds, respectively. However, it should be understood that other exposure durations for autofocusing or collecting phase information may be utilized. As shown by comparing plots 270 and 280, the longer exposure time results in less well-defined pixel response peaks, indicating reduced angular selectivity relative to the shorter exposure time. It should be further appreciated that the exposure-time-dependent angular selectivity of adjacent photodiode regions included in a multi-pixel detector may further vary based on the position of the multi-pixel detector within the image sensor. For example, a multi-pixel detector located centrally on the image sensor may have a different degree of angular selectivity than a multi-pixel detector located near an outer edge of the image sensor.
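The loss of angular selectivity at longer exposures can be illustrated with a toy model. This is an assumption-laden sketch, not the patent's data: the Gaussian angular acceptance, the peak angles, and the full-well saturation level are all hypothetical, chosen only to show how saturation flattens the left/right contrast.

```python
# Toy model: pixel response vs. incident angle for left/right photodiode
# regions, with a longer exposure saturating the response and thereby
# reducing the left/right contrast (angular selectivity). All shapes and
# constants below are illustrative assumptions.
import math

FULL_WELL = 1.0  # saturation level, arbitrary units (assumed)

def pixel_response(theta_deg, exposure, peak_deg):
    """Response of one photodiode region, clipped at saturation."""
    # Gaussian-like angular acceptance centered on peak_deg (assumed shape).
    acceptance = math.exp(-((theta_deg - peak_deg) / 15.0) ** 2)
    return min(FULL_WELL, exposure * acceptance)

def lr_separation(theta_deg, exposure):
    """Normalized left/right contrast at one incident angle."""
    left = pixel_response(theta_deg, exposure, peak_deg=-10.0)
    right = pixel_response(theta_deg, exposure, peak_deg=10.0)
    total = left + right
    return abs(left - right) / total if total else 0.0

# With the long exposure both regions saturate, so the left/right
# contrast (and hence the angular selectivity) drops.
print(lr_separation(-10.0, 0.5) > lr_separation(-10.0, 5.0))  # -> True
```

This mirrors the comparison of plots 270 and 280: the longer exposure clips the response peaks, leaving less angular information to compare.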

However, depending on the lighting conditions, it may be desirable to utilize different exposure times for autofocusing. For example, in low-light conditions, it may be desirable to use a longer exposure time for collecting phase information relative to normal lighting conditions. Accordingly, in some embodiments, it may be desirable to mitigate variations in the exposure-time-dependent angular selectivity of the multi-pixel detectors of the image sensor, including variations due to the position of each multi-pixel detector within the image sensor. As shown in embodiments of the present disclosure (see, e.g., FIGS. 3A and 3B), the variations in angular selectivity may be normalized across the image sensor by adjusting the length of one or more partial isolation structures of a multi-pixel detector, by offsetting a common microlens, or a combination thereof.

Fig. 3A illustrates a top view of an image sensor 300 having multi-pixel detectors and partial isolation structures with lateral lengths that vary based on relative position within the image sensor 300, according to the teachings of the present disclosure. Image sensor 300 is a variation of the image sensor 100 illustrated in FIGS. 1A-1E, modified to have substantially uniform angular selectivity to incident light with respect to exposure time over the entire image sensor 300 (e.g., the angular selectivity of each repeating unit 303 may be substantially similar for the first exposure time and the second exposure time).

The image sensor 300 of FIG. 3A includes a plurality of repeating units 303, each including a first quadrant 307 that includes a multi-pixel detector capable of collecting phase information, similar to the first quadrant 107 described in FIGS. 1B-1E. One difference is that the lateral length of one or more partial isolation structures 312 (i.e., the partial isolation structures marked by dashed circles in FIG. 3A) included in each of the plurality of repeating units 303 is adjusted based on position within the image sensor 300 such that the multi-pixel detector of each of the plurality of repeating units 303 has substantially equal angular selectivity to incident light with respect to exposure time throughout the image sensor 300. For example, the partial isolation structures 312 marked by dashed circles have a lateral length that is smaller than the lateral length of the other partial isolation structures included in the corresponding repeating unit 303. Referring to the first quadrant 307 (i.e., the multi-pixel detector) of the repeating unit 303-L, the lateral length of the partial isolation structure 312 is less than the corresponding lateral length of at least one of the other partial isolation structures (e.g., the second, third, and fourth partial isolation structures, not labeled, included in the multi-pixel detector of the repeating unit 303-L).

Further, it should be noted that the lateral length of the partial isolation structures may vary iteratively by row and by column across the entire image sensor 300. In other words, the lateral length of the adjusted partial isolation structure 312 of a given repeating unit may increase as the position of the repeating unit moves from an outermost position to a centermost position within the image sensor 300. For example, repeat unit 303-C is centrally located within the plurality of repeat units 303. Each of the four partial isolation structures of repeat unit 303-C has a common lateral length that is greater than the lateral length of one or more partial isolation structures 312 of the other repeat units (e.g., repeat units 303-L, 303-R, 303-T, and 303-B).
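The position-dependent tuning described above can be sketched as a simple rule. This is a hypothetical interpolation, not the patent's formula: the linear scaling and the `L_MIN`/`L_MAX` values are placeholder assumptions illustrating only that the lateral length grows from the outermost to the centermost repeating unit.

```python
# Sketch: tuned lateral length of the adjusted partial isolation
# structure as a function of the repeating unit's position. The linear
# rule and the length values are illustrative assumptions.

L_MIN = 0.3  # lateral length at the outermost repeating unit (um, assumed)
L_MAX = 0.6  # lateral length at the centermost repeating unit (um, assumed)

def tuned_lateral_length(row, col, n_rows, n_cols):
    """Lateral length for the repeating unit at (row, col)."""
    # Normalized distance from the array center: 0 at center, 1 at an edge.
    cy, cx = (n_rows - 1) / 2, (n_cols - 1) / 2
    d = max(abs(row - cy) / cy, abs(col - cx) / cx) if cy and cx else 0.0
    return L_MAX - (L_MAX - L_MIN) * d

# Centermost repeating unit gets the full length; edge units get less.
print(tuned_lateral_length(2, 2, 5, 5))  # -> 0.6
print(tuned_lateral_length(0, 2, 5, 5))  # -> 0.3
```

Varying the length both by row and by column, as the text notes, corresponds here to taking the distance along both axes into account.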

Figure 3B illustrates a top view of an image sensor 350 having multi-pixel detectors, partial isolation structures, and offset common microlenses 341, according to the teachings of the present disclosure. Image sensor 350 is similar to image sensor 300, but further includes common microlenses 341 that are offset based on the position of a given repeating unit 303 within the image sensor 350 (i.e., the focal point of the microlens is not located at the midpoint of the multi-pixel detector). Advantageously, by offsetting the common microlenses 341, the angular selectivity of the multi-pixel detector included in each of the repeating units 303 may be further tuned to normalize it across the image sensor 350 with respect to exposure time. For example, similar to how the lateral lengths of the different partial isolation structures may be adjusted based on position, the degree of offset of each of the common microlenses 341 may be adjusted such that the multi-pixel detector included in each of the plurality of repeating units 303 has substantially equal angular selectivity to incident light with respect to exposure time throughout the image sensor 350. As illustrated in FIG. 3B, the degree of offset increases from the midpoint of the image sensor (e.g., the non-offset repeating unit 303-C) toward the periphery of the image sensor (e.g., the outermost repeating units 303-T, 303-B, 303-R, and 303-L have a greater degree of offset than repeating units located closer to the central repeating unit 303-C).
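The microlens offset rule can be sketched in the same spirit. This is a hedged illustration under stated assumptions: the linear scaling, the offset direction (toward the sensor center), and the `MAX_OFFSET` value are all hypothetical, standing in for whatever offsets yield uniform angular selectivity.

```python
# Sketch: offset of the common microlens for the repeating unit at
# (row, col), growing from zero at the central unit toward the periphery.
# The linear rule and MAX_OFFSET are illustrative assumptions.

MAX_OFFSET = 0.25  # maximum lens offset at the outermost unit (um, assumed)

def microlens_offset(row, col, n_rows, n_cols):
    """(dy, dx) offset of the common microlens for the unit at (row, col)."""
    cy, cx = (n_rows - 1) / 2, (n_cols - 1) / 2
    # Offset scales linearly with distance from the center and points back
    # toward the center, so off-axis chief rays still land on the detector.
    dy = MAX_OFFSET * (cy - row) / cy if cy else 0.0
    dx = MAX_OFFSET * (cx - col) / cx if cx else 0.0
    return (dy, dx)

print(microlens_offset(2, 2, 5, 5))  # central unit -> (0.0, 0.0)
print(microlens_offset(2, 4, 5, 5))  # right-edge unit -> (0.0, -0.25)
```

In practice, the offsets and the partial-isolation lengths could be co-optimized per position, which is the combination the text contemplates.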

Fig. 4 is a functional block diagram of an imaging system 470 including an image sensor 400 having a multi-pixel detector and partial isolation structure according to the teachings of the present disclosure. The image sensor 400 of the imaging system 470 may be implemented by any of the embodiments described in this disclosure, such as the image sensor 100 of fig. 1A-1E, the image sensor 300 of fig. 3A, and/or the image sensor 350 of fig. 3B.

Imaging system 470 is capable of focusing on a point of interest (POI) within an external scene 495 in response to incident light 492. Imaging system 470 includes an image sensor 400 that generates signals (e.g., phase information obtained via one or more multi-pixel detectors) in response to the incident light 492, an objective lens(es) 475 having adjustable optical power to focus on one or more points of interest within the external scene 495, and a controller 480 for controlling, among other things, the operation of the image sensor 400 and the objective lens(es) 475. The image sensor 400 is one possible implementation of the image sensor 100 illustrated in FIGS. 1A-1E and includes a semiconductor material 401 having a plurality of photodiode regions 405 to form image pixels and/or multi-pixel detectors with partial isolation structures, a plurality of color filters 439, and a plurality of microlenses 440 (e.g., common and/or individual microlenses arranged in a manner similar to any of the embodiments described in FIGS. 1A-3B). The controller 480 illustrated in FIG. 4 includes one or more processors 482, memory 484, control circuitry 486, readout circuitry 488, and functional logic 490.

The controller 480 includes logic and/or circuitry to control the operation of the various components of the imaging system 470 (e.g., during the pre, post, and in-situ phases of image and/or video acquisition). The controller 480 may be implemented as hardware logic (e.g., an application specific integrated circuit, a field programmable gate array, a system-on-a-chip, etc.), software/firmware logic executing on a general purpose microcontroller or microprocessor, or a combination of both hardware and software/firmware logic. In one embodiment, the controller 480 includes the processor 482 coupled to the memory 484, the memory 484 storing instructions that may be executed by the controller 480 or by one or more components of the imaging system 470. When executed by the controller 480, the instructions may cause the imaging system 470 to perform operations associated with various functional modules, logic blocks, or circuitry of the imaging system 470, including any one of, or a combination of, the control circuitry 486, the readout circuitry 488, the functional logic 490, the image sensor 400, the objective lens(es) 475, and any other elements (illustrated or otherwise) of the imaging system 470. The memory 484 is a non-transitory computer readable medium that may include, but is not limited to, volatile (e.g., RAM) or non-volatile (e.g., ROM) storage systems readable by the controller 480. It should be further understood that the controller 480 may be a monolithic integrated circuit, one or more discrete interconnected electrical components, or a combination thereof. Further, in some embodiments, one or more electrical components may be coupled to each other to collectively serve as the controller 480 for programming or otherwise controlling the operation of the imaging system 470.

The control circuitry 486 may control the operating characteristics of the imaging system 470 (e.g., exposure duration, when to capture a digital image or video, and the like). Readout circuitry 488 reads out or otherwise samples analog signals from individual photodiode regions (e.g., reads out electrical signals generated by each of the plurality of photodiode regions 405, which represent image charges generated in response to incident light, to generate phase detection autofocus signals, reads out image signals to capture image frames or video, and the like), and may include amplification circuitry, analog-to-digital (ADC) circuitry, image buffers, or the like. In the illustrated embodiment, the readout circuitry 488 is included in the controller 480, but in other embodiments, the readout circuitry 488 can be separate from the controller 480. The functional logic 490 is coupled to the readout circuitry 488 to receive signals and, in response, generate phase detection autofocus (PDAF) signals, generate images in response to receiving image signals or data, and the like. In some embodiments, the signals may be stored as PDAF signals or image data, respectively, and may be manipulated by the functional logic 490 (e.g., calculating an expected image signal, grading an image signal, demosaicing image data, applying post-image effects such as cropping, rotating, removing red-eye, adjusting brightness, adjusting contrast, or otherwise).

The processes explained above may be implemented using software and/or hardware. The described techniques may constitute machine-executable instructions embodied in a tangible or non-transitory machine (e.g., computer) readable storage medium that, when executed by a machine (e.g., the controller 480 of FIG. 4), will cause the machine to perform the described operations. Further, the processes may be embodied within hardware, such as an application specific integrated circuit ("ASIC"), a field programmable gate array (FPGA), or otherwise.

A tangible machine-readable storage medium includes any mechanism that provides (i.e., stores) information in a non-transitory form that is accessible by a machine (e.g., a computer, network device, personal digital assistant, manufacturing tool, any device with a set of one or more processors, etc.). For example, a machine-readable storage medium includes recordable/non-recordable media (e.g., Read Only Memory (ROM), Random Access Memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, etc.).

The above description of illustrated examples of the invention, including what is described in the abstract, is not intended to be exhaustive or to limit the invention to the precise forms disclosed. While specific examples of the invention are described herein for illustrative purposes, various modifications are possible within the scope of the invention, as those skilled in the relevant art will recognize.

These modifications can be made to the invention in light of the above detailed description. The terms used in the following claims should not be construed to limit the invention to the specific examples disclosed in the specification. Rather, the scope of the invention is to be determined entirely by the following claims, which are to be construed in accordance with established doctrines of claim interpretation.
