Image sensor with a plurality of pixels

Document No.: 1492813    Publication date: 2020-02-04

Reading note: This technology, "Image sensor" (Image sensor with a plurality of pixels), was designed and created by 李景镐 on 2019-06-18. Its main content is as follows. An image sensor includes a pixel array including a plurality of pixels arranged in a first direction and a second direction. Each of the plurality of pixels includes a plurality of photodiodes disposed adjacent to each other in at least one of the first direction and the second direction. The image sensor further includes control logic configured to generate image data by obtaining pixel signals from the plurality of pixels and read pixel voltages corresponding to charges generated by two or more photodiodes of the plurality of photodiodes included in one of the plurality of pixels at substantially the same time.

1. An image sensor, comprising:

a pixel array including a plurality of pixels arranged in a first direction and a second direction, wherein each of the plurality of pixels includes a plurality of photodiodes divided into a first photodiode group and a second photodiode group, and at least one of the first photodiode group and the second photodiode group includes two or more photodiodes of the plurality of photodiodes adjacent to each other in at least one of the first direction and the second direction; and

control logic configured to generate image data by obtaining pixel signals from the plurality of pixels and read pixel voltages corresponding to charges generated by two or more photodiodes of the plurality of photodiodes included in one of the plurality of pixels at the same time.

2. The image sensor of claim 1, wherein each of the plurality of pixels comprises:

a device connection layer disposed under and physically connected to the plurality of photodiodes,

wherein the device connection layer separates the plurality of photodiodes into the first and second photodiode groups by connecting at least a part of the photodiodes to each other; and

a pixel circuit disposed under the device connection layer.

3. The image sensor of claim 2, wherein the pixel circuit comprises:

a first transfer transistor connected to the first photodiode group;

a second transfer transistor connected to the second photodiode group; and

a floating diffusion region connected to the first transfer transistor and the second transfer transistor.

4. The image sensor of claim 3, wherein the pixel circuit further comprises:

a driving transistor generating a voltage corresponding to the charge accumulated in the floating diffusion region;

a selection transistor that outputs the voltage generated by the driving transistor to the control logic; and

a reset transistor that resets the floating diffusion region.

5. The image sensor of claim 3, wherein the control logic performs the following: detecting a first pixel voltage corresponding to the electric charge generated by the first photodiode group by turning on the first transfer transistor; detecting a sum pixel voltage corresponding to a sum of the electric charges generated by the first and second photodiode groups by turning on the second transfer transistor; and detecting a second pixel voltage corresponding to the electric charges generated by the second photodiode group by calculating a difference between the sum pixel voltage and the first pixel voltage.

6. The image sensor of claim 5, wherein the control logic does not reset the floating diffusion region after detecting the first pixel voltage and before detecting the sum pixel voltage.

7. The image sensor of claim 5, wherein the control logic uses the first pixel voltage and the second pixel voltage to provide an autofocus function.

8. The image sensor of claim 2, wherein the device connection layer is a region doped with N-type impurities.

9. The image sensor according to claim 2, wherein the device connection layers included in some of the pixels adjacent to each other among the plurality of pixels have different areas or shapes.

10. The image sensor of claim 2, wherein, in at least one of the plurality of pixels, a light receiving area of the first photodiode group is different from a light receiving area of the second photodiode group.

11. The image sensor of claim 2, wherein a light receiving area of the first photodiode group is the same as a light receiving area of the second photodiode group in at least one of the plurality of pixels.

12. The image sensor of claim 1, wherein each of the plurality of pixels comprises:

a plurality of transfer transistors respectively connected to the plurality of photodiodes; and

a plurality of connection lines that separate the plurality of photodiodes into the first photodiode group and the second photodiode group by connecting at least part of gate electrode layers of the plurality of transfer transistors to each other.

13. The image sensor according to claim 12, wherein the first photodiode groups are disposed at different positions in some of the pixels adjacent to each other among the plurality of pixels.

14. The image sensor of claim 12, wherein the plurality of connection lines comprise:

a first connection line for connecting the photodiodes included in the first photodiode group to each other; and

a second connection line for connecting the photodiodes included in the second photodiode group to each other.

15. The image sensor of claim 14, wherein the control logic performs the following: detecting a first pixel voltage corresponding to the charge generated by the first photodiode group by turning on a transfer transistor connected to the first connection line; detecting a sum pixel voltage corresponding to a sum of the charges generated by the first and second photodiode groups by turning on a transfer transistor connected to the second connection line; and detecting a second pixel voltage corresponding to the electric charges generated by the second photodiode group by calculating a difference between the sum pixel voltage and the first pixel voltage.

16. The image sensor of claim 1, wherein the control logic obtains information for providing autofocus functions in different directions from at least some of the plurality of pixels.

17. An image sensor, comprising:

a pixel array including a plurality of pixels; and

control logic configured to generate image data using the charge generated in each of the plurality of pixels,

wherein each of the plurality of pixels comprises: a plurality of photodiodes formed at the same depth in the semiconductor substrate; a pixel circuit located under the plurality of photodiodes; and a device connection layer physically connecting at least some of the plurality of photodiodes to each other and disposed between the pixel circuit and the plurality of photodiodes.

18. The image sensor of claim 17, wherein each of the plurality of pixels comprises a first photodiode, a second photodiode, a third photodiode, and a fourth photodiode.

19. The image sensor of claim 18, wherein the device connection layer comprises:

a first device connection layer physically connecting the first photodiode and the second photodiode to each other; and

a second device connection layer physically connecting the third photodiode and the fourth photodiode to each other.

20. The image sensor of claim 18, wherein the device connection layer physically connects three of the first to fourth photodiodes to each other.

21. The image sensor of claim 17, wherein each of the plurality of pixels comprises:

a floating diffusion region accumulating at least a portion of the charge generated by the plurality of photodiodes; and

a plurality of transfer transistors connected between the floating diffusion region and the plurality of photodiodes,

wherein the number of the plurality of photodiodes is greater than the number of the plurality of transfer transistors.

22. An image sensor, comprising:

a pixel array including a plurality of pixels; and

a control logic circuit configured to generate image data using the electric charge generated in each of the plurality of pixels,

wherein each of the plurality of pixels comprises:

a plurality of photodiodes formed at the same depth in the semiconductor substrate;

a plurality of transfer transistors respectively connected to the plurality of photodiodes; and

a connection line for connecting gate electrode layers of at least some of the plurality of transfer transistors to each other.

23. The image sensor of claim 22, wherein the connection lines comprise first connection lines and second connection lines.

24. The image sensor according to claim 23, wherein in a first pixel and a second pixel which are adjacent to each other in the plurality of pixels, the first connection line of the first pixel is connected to the first connection line of the second pixel, and the second connection line of the first pixel is connected to the second connection line of the second pixel.

25. The image sensor of claim 22, wherein the plurality of pixels includes a first pixel and a second pixel disposed adjacent to each other, a connection line of the first pixel extends in a first direction, and a connection line of the second pixel extends in a second direction that intersects the first direction.

Technical Field

Example embodiments of the inventive concepts relate to an image sensor.

Background

Image sensors are semiconductor-based sensors that receive light and generate an electrical signal. An image sensor may include a pixel array having a plurality of pixels, and logic circuits that drive the pixel array and generate an image. Each of the plurality of pixels may include a photodiode that generates charges in response to external light, and a pixel circuit that converts the charges generated by the photodiode into an electrical signal. The image sensor may be applied to various devices. For example, image sensors may be used in smart phones, tablet PCs, laptop computers, televisions, vehicles, and the like, in addition to ordinary cameras used to capture photos or videos. Recently, various methods for improving the auto-focus function of an image sensor have been proposed to improve the quality of an image captured by the image sensor.

Disclosure of Invention

Example embodiments of the inventive concepts provide an image sensor capable of providing an improved auto-focus function, reducing the readout time for reading a pixel voltage, reducing power consumption of the readout operation, and improving noise characteristics.

According to an example embodiment of the inventive concepts, an image sensor includes a pixel array and control logic. The pixel array includes a plurality of pixels arranged in a first direction and a second direction. Each of the plurality of pixels includes a plurality of photodiodes divided into a first photodiode group and a second photodiode group, and at least one of the first photodiode group and the second photodiode group includes two or more photodiodes of the plurality of photodiodes that are adjacent to each other in at least one of the first direction and the second direction. The control logic is configured to generate image data by obtaining pixel signals from the plurality of pixels, and read pixel voltages corresponding to charges generated by two or more photodiodes of the plurality of photodiodes included in one of the plurality of pixels at substantially the same time.

According to an example embodiment of the inventive concepts, an image sensor includes a pixel array and control logic. The pixel array includes a plurality of pixels. The control logic is configured to generate image data using the charge generated in each of the plurality of pixels. Each of the plurality of pixels includes: a plurality of photodiodes formed at substantially the same depth in the semiconductor substrate; a pixel circuit located under the plurality of photodiodes; and a device connection layer physically connecting at least some of the plurality of photodiodes to each other and disposed between the pixel circuit and the plurality of photodiodes.

According to an example embodiment of the inventive concepts, an image sensor includes a pixel array and a control logic circuit. The pixel array includes a plurality of pixels. The control logic circuit is configured to generate image data using the charge generated in each of the plurality of pixels. Each of the plurality of pixels includes: a plurality of photodiodes formed at substantially the same depth in the semiconductor substrate; a plurality of transfer transistors connected to the plurality of photodiodes; and a connection line for connecting gate electrode layers of at least some of the plurality of transfer transistors to each other.

Drawings

The above and other features of the present inventive concept will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings, in which:

fig. 1 is a diagram illustrating an image processing apparatus including an image sensor according to an example embodiment of the inventive concepts.

Fig. 2 and 3 are diagrams illustrating an image sensor according to an example embodiment of the inventive concept.

Fig. 4 is a diagram illustrating an operation of an image sensor according to an example embodiment of the inventive concepts.

Fig. 5 is a diagram illustrating a pixel array of an image sensor according to an example embodiment of the inventive concepts.

Fig. 6 and 7 are circuit diagrams illustrating a pixel circuit of an image sensor according to an example embodiment of the inventive concepts.

Fig. 8 to 12 are diagrams illustrating a pixel structure of an image sensor according to an example embodiment of the inventive concepts.

Fig. 13 and 14 are diagrams illustrating a pixel structure of an image sensor according to an example embodiment of the inventive concepts.

Fig. 15 and 16 are diagrams illustrating a pixel structure of an image sensor according to an example embodiment of the inventive concepts.

Fig. 17 is a diagram illustrating an image sensor according to an example embodiment of the inventive concepts.

Fig. 18 is a circuit diagram illustrating a pixel circuit of an image sensor according to an example embodiment of the inventive concepts.

Fig. 19 is a timing diagram illustrating an operation of an image sensor according to an exemplary embodiment of the inventive concept.

Fig. 20 is a block diagram illustrating an electronic device including an image sensor according to an example embodiment of the inventive concepts.

Detailed Description

Example embodiments of the inventive concepts will now be described more fully hereinafter with reference to the accompanying drawings. Like reference numerals may refer to like elements throughout.

Spatially relative terms, such as "beneath," "below," "lower," "above," "upper," and the like, may be used herein to facilitate describing one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as "below" or "beneath" other elements or features would then be oriented "above" the other elements or features. Thus, the exemplary terms "below" and "beneath" can encompass both an orientation of above and below.

Furthermore, it should be understood that the description of features or aspects in each example embodiment should generally be considered applicable to other similar features or aspects in other example embodiments, unless the context clearly dictates otherwise.

Further, one of ordinary skill in the art will understand that when two or more elements or values are described as being substantially the same or approximately equal to each other, it is to be understood that the elements or values are the same as each other, indistinguishable from each other, or distinguishable from each other but functionally identical to each other. It will be further understood that when two components or directions are described as extending substantially parallel or perpendicular to each other, the two components or directions extend exactly parallel or perpendicular to each other, or substantially parallel or perpendicular to each other within measurement error, as will be understood by those of ordinary skill in the art. Further, those of ordinary skill in the art will appreciate that while a parameter may be described herein as having a particular value of "about," it is understood that the parameter may be precisely that particular value or approximately that particular value within measurement error, according to example embodiments.

Further, those of ordinary skill in the art will understand and appreciate that when two or more processes or events are described as being performed or occurring at substantially the same time (or substantially the same time), it is understood that the processes or events can be performed or occurring at exactly the same time or at approximately the same time. For example, one of ordinary skill in the art will appreciate that processes or events may be performed or occur at substantially the same time within measurement error.

Fig. 1 is a diagram illustrating an image processing apparatus including an image sensor according to an example embodiment of the inventive concepts.

Referring to fig. 1, an image processing apparatus 1 according to an exemplary embodiment of the inventive concept may include an image sensor 10 and an image processor 20. The image sensor 10 may include a pixel array 11, a row driver 12, a readout circuit 13, a column driver 15, and a timing controller 14. As described in further detail below, the row driver 12, the readout circuit 13, the column driver 15, and the timing controller 14 are circuits for controlling the pixel array 11, and may be included in control logic. In an example embodiment, additional components may be included in the image sensor 10.

The image sensor 10 may operate according to a control command received from the image processor 20, and may convert light from the object 30 into an electrical signal and output the electrical signal to the image processor 20. The pixel array 11 included in the image sensor 10 may include a plurality of pixels PX, and the plurality of pixels PX may include a photoelectric element that receives light and generates charges. For example, the photoelectric element may be a photodiode PD. In an example embodiment, at least one of the plurality of pixels PX may include two or more photodiodes, and the image sensor 10 may provide an auto-focusing function by using a phase difference of pixel signals generated by each of the two or more photodiodes included in the at least one of the plurality of pixels PX.

In an example embodiment, each of the plurality of pixels PX may include a pixel circuit that generates a pixel signal according to electric charges generated by the photodiode. In example embodiments, the pixel circuit may include, for example, a transfer transistor, a driving transistor, a selection transistor, and a reset transistor. The pixel circuit may obtain a pixel signal by detecting a reset voltage and a pixel voltage from each of the plurality of pixels PX and calculating a difference between them. The pixel voltage may be a voltage in which the charges generated by the photodiode included in each of the plurality of pixels PX are reflected.
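As an illustration only, and not as part of the disclosure, the following Python sketch models this difference-based sampling; the function name and voltage values are hypothetical assumptions.

```python
# Minimal sketch of difference-based sampling (correlated double sampling) for one pixel.
# The voltage values and helper name below are illustrative only.

def correlated_double_sample(reset_voltage: float, pixel_voltage: float) -> float:
    """Return the pixel signal as the difference between the reset level and the
    level after charge transfer, cancelling the reset offset of the pixel."""
    return reset_voltage - pixel_voltage

# Example: charge generated by the photodiode pulls the floating-diffusion
# voltage below the reset level, so the difference reflects the light level.
reset_v = 2.8   # volts, hypothetical reset level
pixel_v = 2.3   # volts, hypothetical level after charge transfer
signal = correlated_double_sample(reset_v, pixel_v)
print(f"pixel signal: {signal:.2f} V")  # 0.50 V
```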

When the plurality of pixels PX have two or more photodiodes, each of the plurality of pixels PX may include a pixel circuit for processing charges generated in each of the two or more photodiodes. For example, according to an example embodiment, the pixel circuit may include two or more of at least one of a transfer transistor, a driving transistor, a selection transistor, and a reset transistor.

The row driver 12 may drive the pixel array 11 by rows. For example, the row driver 12 may generate a transfer control signal for controlling a transfer transistor of the pixel circuit, a reset control signal for controlling a reset transistor of the pixel circuit, and a selection control signal for controlling a selection transistor of the pixel circuit.

For example, the readout circuit 13 may include a correlated double sampler CDS, an analog-to-digital converter ADC, and the like. The correlated double sampler may be connected to pixels PX included in a row selected by a row selection signal supplied from the row driver 12 through column lines, and may perform correlated double sampling to detect a reset voltage and a pixel voltage. The analog-to-digital converter may convert the reset voltage and the pixel voltage detected by the correlated double sampler into digital signals and transmit the converted voltages to the column driver 15.

For example, the column driver 15 may include an amplification circuit, a latch or a buffer circuit capable of temporarily storing a digital signal, or the like, and may temporarily store or amplify a digital signal received from the readout circuit 13 to generate image data. The operation timings of the row driver 12, the readout circuit 13, and the column driver 15 may be determined by the timing controller 14, and the timing controller 14 may operate based on a control command transmitted from the image processor 20. The image processor 20 may perform signal processing on the image data output from the column driver 15 to output the image data to a display device, or may store the image data in a storage device such as a memory. In an example embodiment, the image processing apparatus 1 may be mounted on an autonomous vehicle, and the image processor 20 may perform signal processing on image data and transmit the image data to the main controller to control the autonomous vehicle.

Fig. 2 and 3 are diagrams illustrating an image sensor according to an example embodiment of the inventive concept.

For convenience of explanation, in describing fig. 3, a repeated description of elements and technical aspects described with reference to fig. 2 will be omitted.

First, referring to fig. 2, an image sensor 2 according to an example embodiment of the inventive concept may include a first layer 40, a second layer 50 disposed under the first layer 40, and a third layer 60 disposed under the second layer 50. In an example embodiment, additional layers may be included. The first layer 40, the second layer 50, and the third layer 60 may be stacked in directions substantially perpendicular to each other. In an example embodiment, the first layer 40 and the second layer 50 may be stacked on each other on a wafer level, and the third layer 60 may be attached to a lower portion of the second layer 50 on a chip level. The first layer 40 to the third layer 60 may be provided in one semiconductor package.

The first layer 40 may be a semiconductor substrate including a sensing area SA in which a plurality of pixels PX are disposed and a first pad area PA1 disposed around the sensing area SA. A plurality of upper PADs PAD may be included in the first PAD area PA1. The plurality of upper PADs PAD may be connected to the control logic LC and PADs provided in the second PAD area PA2 of the second layer 50 through, for example, vias.

Each of the plurality of pixels PX may include, for example, a photodiode that receives light and generates electric charges and a pixel circuit that processes the electric charges generated by the photodiode. The pixel circuit may include a plurality of transistors that output a voltage corresponding to the charge generated by the photodiode.

The second layer 50 may include a plurality of devices forming control logic LC. A plurality of devices included in control logic LC may be configured to provide circuitry for driving pixel circuits disposed in first layer 40. The control logic LC herein may also be referred to as a control logic circuit. Accordingly, the terms "control logic" and "control logic circuitry" may be used interchangeably herein. The plurality of devices may include, for example, a row driver 12, a column driver 15, and a timing controller 14, as well as additional devices. A plurality of devices included in the control logic LC may be connected to the pixel circuit through the first and second pad areas PA1 and PA2. The control logic LC may obtain a reset voltage and a pixel voltage from a plurality of pixels PX, and may generate a pixel signal using the reset voltage and the pixel voltage.

In an example embodiment, at least one of the plurality of pixels PX may include a plurality of photodiodes disposed on the same level. The pixel signals generated according to the charge of each of the plurality of photodiodes may have a phase difference from each other, and the control logic LC may provide an auto-focusing function based on the phase difference of the pixel signals generated by the plurality of photodiodes included in one pixel PX.

The third layer 60 disposed under the second layer 50 may include a memory chip MC and a dummy chip DC, and include a protective layer EN sealing the memory chip MC and the dummy chip DC. The memory chip MC may be, for example, a dynamic random access memory DRAM or a static random access memory SRAM. The dummy chip DC may not actually store data. The memory chip MC may be electrically connected to at least a part of the devices included in the control logic LC of the second layer 50 through, for example, bumps, and may store information necessary for providing an auto-focus function. In an example embodiment, the bumps may be, for example, micro-bumps.

Next, referring to fig. 3, the image sensor 3 according to an example embodiment may include a first layer 70 and a second layer 80. The first layer 70 may include: a sensing area SA in which a plurality of pixels PX are disposed, a control logic LC in which devices for driving the plurality of pixels PX are disposed, and a first pad area PA1 disposed around the sensing area SA and the control logic LC. A plurality of upper PADs PAD may be included in the first PAD area PA1. The plurality of upper PADs PAD may be connected to the memory chip MC disposed in the second layer 80 through, for example, vias. The second layer 80 may include, for example, a memory chip MC, a dummy chip DC, and a protection layer EN sealing the memory chip MC and the dummy chip DC.

Fig. 4 is a diagram illustrating an operation of an image sensor according to an example embodiment of the inventive concepts.

Referring to fig. 4, an image sensor 100 according to an example embodiment of the inventive concepts may include, for example, a pixel array 110, a row driver 120, and a readout circuit 130. The ROW driver 120 may input a transfer control signal, a reset control signal, a selection control signal, and the like to each pixel circuit through ROW lines ROW1 to ROWm included in the plurality of ROW lines ROW. The readout circuit 130 may detect a pixel voltage and a reset voltage from a pixel PX connected to a ROW line ROW selected by the ROW driver 120. The readout circuit 130 may include, for example, a sampling circuit 131 and an analog-to-digital converter 132, wherein the sampling circuit 131 includes a plurality of correlated double samplers CDS1 to CDSn, and the analog-to-digital converter 132 converts outputs SOUT1 to SOUTn of the sampling circuit 131 included in the plurality of outputs SOUT into digital data. The digital data may correspond to, for example, DOUT in fig. 4, which may be output by analog-to-digital converter 132.

The pixel array 110 may include a plurality of ROW lines ROW extending in one direction, and column lines COL1 through COLn included in a plurality of column lines COL intersecting the ROW lines ROW. The ROW lines ROW and the column lines COL may be connected to the pixels PX11 to PXmn. Each of the pixels PX11 to PXmn may include a photodiode and a pixel circuit. In an example embodiment, each of the pixels PX11 to PXmn may include a plurality of photodiodes disposed on the same level, wherein the plurality of photodiodes are used to provide an auto-focus function.

Referring to the comparative example, when each of the plurality of pixels PX11 to PXmn includes a plurality of photodiodes for providing an autofocus function, the readout circuit 130 may detect a pixel voltage from each of the plurality of photodiodes. Therefore, since a pixel voltage is detected a plurality of times from each of the pixels PX11 to PXmn, the time and power consumption required for the readout operation may increase. In addition, since the pixel voltage is detected from each photodiode of each of the pixels PX11 to PXmn, a noise component occurring in the readout operation may increase, and the performance of the image sensor 100 may decrease.

In contrast, in example embodiments of the inventive concept, in order to address the above-described problems that may occur in the comparative example, pixel voltages reflecting the charges generated by at least a portion of the plurality of photodiodes included in each of the plurality of pixels PX11 to PXmn may be read substantially simultaneously. As a result, the time and power consumption required for the readout operation can be reduced, the number of times of performing the readout operation can be reduced, and the noise component can be reduced, which can result in improved performance of the image sensor 100.
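For illustration only, the short sketch below counts readout operations for a hypothetical array of four-photodiode pixels, comparing per-photodiode readout (the comparative example) with the grouped readout described above; the array size and group count are assumed values, not figures from the disclosure.

```python
# Illustrative readout-operation count for an array of 4-photodiode pixels.
rows, cols = 1080, 1920          # hypothetical array size
photodiodes_per_pixel = 4

# Comparative example: one pixel-voltage readout per photodiode.
per_photodiode_reads = rows * cols * photodiodes_per_pixel

# Grouped readout: one read per photodiode group (e.g. two groups per pixel).
groups_per_pixel = 2
grouped_reads = rows * cols * groups_per_pixel

print(per_photodiode_reads, grouped_reads)   # 8294400 vs 4147200
print(f"readout operations reduced by {1 - grouped_reads / per_photodiode_reads:.0%}")  # 50%
```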

Fig. 5 is a diagram illustrating a pixel array of an image sensor according to an example embodiment of the inventive concepts.

Referring to fig. 5, a pixel array 200 of an image sensor according to an example embodiment of the inventive concepts may include a plurality of pixels 210, 220, 230, and 240. The plurality of pixels 210 to 240 may be arranged in a first direction (X-axis direction) and a second direction (Y-axis direction). Each of the plurality of pixels 210 to 240 may include a plurality of photodiodes PD1, PD2, PD3, and PD4. In the example embodiment shown in fig. 5, each of the plurality of pixels 210 to 240 includes four photodiodes PD1 to PD4. However, the exemplary embodiments are not limited thereto. For example, according to example embodiments, the number of photodiodes PD1 to PD4 included in each of the plurality of pixels 210 to 240 may be variously modified.

In a typical case, a readout circuit of the image sensor may read a pixel voltage from each of the plurality of photodiodes PD1 to PD4 to obtain a pixel signal. For example, the operation of obtaining the pixel signal from the first pixel 210 may include an operation of reading the pixel voltage from each of the first to fourth photodiodes PD1 to PD4 included in the first pixel 210. Accordingly, in order to obtain a pixel signal from the first pixel 210, a readout operation of reading a pixel voltage may be performed four times, which may result in an increase in time and/or power consumption required for the readout operation. In addition, since a noise component is included in the pixel voltage for each readout operation, image quality may be degraded.

In example embodiments according to the inventive concepts, the readout circuit may substantially simultaneously read pixel voltages corresponding to charges generated by at least a portion of the plurality of photodiodes PD1 through PD4 included in each of the plurality of pixels 210 through 240. At least a portion of the photodiodes PD 1-PD 4 may be connected such that the readout circuit may substantially simultaneously read pixel voltages corresponding to charges generated by at least a portion of the photodiodes PD 1-PD 4.

Fig. 6 and 7 are circuit diagrams illustrating a pixel circuit of an image sensor according to an example embodiment of the inventive concepts. For example, the pixel circuit according to the example embodiments of fig. 6 and 7 may be a pixel circuit applied to the image sensor shown in fig. 5.

For convenience of explanation, in describing fig. 7, a repeated description of elements and technical aspects described with reference to fig. 6 will be omitted.

First, referring to fig. 6, a pixel circuit of an image sensor according to an example embodiment of the inventive concepts may include, for example, a reset transistor RX, a driving transistor DX, a selection transistor SX, a first transfer transistor TX1, and a second transfer transistor TX2. The first transfer transistor TX1 may be connected to the first and second photodiodes PD1 and PD2, and the second transfer transistor TX2 may be connected to the third and fourth photodiodes PD3 and PD4.

The operation of the pixel circuit shown in fig. 6 will be described below.

First, when the reset transistor RX is turned on by the reset control signal RG, the floating diffusion FD may be reset by the power supply voltage VDD. Then, when the selection transistor SX is turned on by a selection control signal SEL, a readout circuit of the image sensor may detect a reset voltage from the floating diffusion FD through the corresponding column line COL.

When the operation of detecting the reset voltage is completed, the first transfer transistor TX1 may be turned on. At this time, the second transfer transistor TX2 may be turned off. When the first transfer transistor TX1 is turned on by the first transfer control signal TG1, charges generated by the first and second photodiodes PD1 and PD2 may be accumulated in the floating diffusion FD. Then, when the selection transistor SX is turned on, the readout circuit may detect the first pixel voltage corresponding to the amount of electric charges generated by the first and second photodiodes PD1 and PD2 through the corresponding column line COL.

When the operation of detecting the first pixel voltage is completed, the second transfer transistor TX2 may be turned on. The second transfer transistor TX2 may be turned on by a second transfer control signal TG2. When the second transfer transistor TX2 is turned on, charges generated by the third and fourth photodiodes PD3 and PD4 may be accumulated in the floating diffusion FD. At this time, the charges accumulated in the floating diffusion FD may be the charges generated by the first to fourth photodiodes PD1 to PD4. The readout circuit may detect the sum pixel voltage corresponding to the total amount of electric charges generated by the first to fourth photodiodes PD1 to PD4 through the corresponding column line COL.

The control logic of the image sensor may obtain the second pixel voltage corresponding to the amount of charges generated by the third and fourth photodiodes PD3 and PD4 by calculating a difference between the sum pixel voltage and the first pixel voltage. The control logic may obtain the first pixel signal and the second pixel signal by using the first pixel voltage and the second pixel voltage, and may provide the auto-focusing function by using a phase difference between the first pixel signal and the second pixel signal. The control logic may generate the image data by using pixel signals obtained from a sum pixel voltage corresponding to a sum of the electric charges generated by the first to fourth photodiodes PD1 to PD4.
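The sequence described above (reset, first transfer, second transfer without an intermediate reset, then subtraction) can be sketched in Python as follows. The linear charge-to-voltage conversion, the gain value, and the charge numbers are hypothetical assumptions used only to make the flow concrete.

```python
# Sketch of the fig. 6 readout sequence under simple assumptions:
# charge-to-voltage conversion is linear and the floating diffusion (FD)
# is not reset between the two reads.

CONVERSION_GAIN = 1e-4  # volts per electron, hypothetical

def readout_sequence(q_pd1, q_pd2, q_pd3, q_pd4):
    # Reset: the floating diffusion is cleared (reset level taken as 0 here).
    fd_charge = 0.0

    # Turn on TX1: charges of PD1 and PD2 accumulate in the floating diffusion.
    fd_charge += q_pd1 + q_pd2
    v1 = fd_charge * CONVERSION_GAIN            # first pixel voltage

    # Turn on TX2 without resetting: PD3 and PD4 charges are added on top.
    fd_charge += q_pd3 + q_pd4
    v_sum = fd_charge * CONVERSION_GAIN         # sum pixel voltage

    # The second pixel voltage is recovered by subtraction rather than a third read.
    v2 = v_sum - v1
    return v1, v2, v_sum

v1, v2, v_sum = readout_sequence(1000, 1100, 900, 950)
# v1 and v2 feed the phase-difference (auto-focus) calculation;
# v_sum feeds image-data generation.
print(v1, v2, v_sum)
```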

In example embodiments of the inventive concepts, the readout circuit need not obtain a pixel voltage from each of the photodiodes PD1 to PD4 individually through the pixel circuit; instead, the pixel voltage may be read from at least a part of the photodiodes PD1 to PD4 substantially simultaneously. Therefore, the number of times the readout operation is performed can be reduced. As a result, the time and/or power consumption required for the readout operation can be reduced, and the increase in noise components due to an increased number of readout operations can be significantly reduced. Therefore, deterioration of image quality can be reduced.

Next, referring to fig. 7, in an example embodiment, the first transfer transistor TX1 may be connected to the first photodiode PD1, and the second transfer transistor TX2 may be connected to the second to fourth photodiodes PD2 to PD4. When the first transfer transistor TX1 is turned on so that the charges of the first photodiode PD1 are accumulated in the floating diffusion region FD, the readout circuit may detect the first pixel voltage corresponding to the charges of the first photodiode PD1. Next, when the second transfer transistor TX2 is turned on so that the charges of the second to fourth photodiodes PD2 to PD4 move to the floating diffusion region FD, the readout circuit may detect a sum pixel voltage corresponding to the sum of the charges generated by the first to fourth photodiodes PD1 to PD4.

The control logic may obtain the second pixel voltage corresponding to the sum of the charges generated by the second to fourth photodiodes PD2 to PD4 by calculating a difference between the sum pixel voltage and the first pixel voltage. The control logic may provide an autofocus function using a phase difference between a first pixel signal and a second pixel signal generated from the first pixel voltage and the second pixel voltage, respectively. In addition, the image data may be generated by using the pixel signal generated from the sum pixel voltage.

In each of the exemplary embodiments shown in fig. 6 and 7, the autofocus function may be provided in a different direction. For example, in the example embodiment shown in fig. 6, the pixel voltage may be detected substantially simultaneously from the charges of the first photodiode PD1 and the second photodiode PD2, and the pixel voltage may be detected substantially simultaneously from the charges of the third photodiode PD3 and the fourth photodiode PD4. Accordingly, assuming that the pixel circuit of fig. 6 is applied to the pixel array 200 of fig. 5, the pixel circuit of fig. 6 can generate information required for focusing in the second direction (Y-axis direction). Similarly, the pixel circuit of fig. 7 can provide information necessary for focusing in a direction rotated counterclockwise by about 45 degrees with respect to the second direction (Y-axis direction).
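One way to see why the two groupings sense focus along different axes is to compare the centroids of the photodiode groups. The sketch below assumes a hypothetical in-pixel layout (PD1 upper-left, PD2 upper-right, PD3 lower-left, PD4 lower-right) and a hypothetical helper; it is an illustration, not part of the disclosure.

```python
import math

# Hypothetical photodiode coordinates within one pixel (x to the right, y upward).
PD = {"PD1": (0.0, 1.0), "PD2": (1.0, 1.0), "PD3": (0.0, 0.0), "PD4": (1.0, 0.0)}

def centroid(names):
    xs, ys = zip(*(PD[n] for n in names))
    return sum(xs) / len(xs), sum(ys) / len(ys)

def separation_axis(group1, group2):
    """Angle (degrees from the x-axis) of the line joining the two group centroids,
    i.e. the direction along which the phase difference carries focus information."""
    (x1, y1), (x2, y2) = centroid(group1), centroid(group2)
    return math.degrees(math.atan2(y2 - y1, x2 - x1))

# Fig. 6 grouping: {PD1, PD2} vs {PD3, PD4} -> separation along the Y axis.
print(separation_axis(["PD1", "PD2"], ["PD3", "PD4"]))   # -90.0 (vertical)

# Fig. 7 grouping: {PD1} vs {PD2, PD3, PD4} -> separation along a diagonal.
print(separation_axis(["PD1"], ["PD2", "PD3", "PD4"]))   # -45.0 (diagonal)
```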

Fig. 8 to 12 are diagrams illustrating a pixel structure of an image sensor according to an example embodiment of the inventive concepts.

Referring to fig. 8, a pixel array 300 of an image sensor according to an example embodiment of the inventive concepts may include a plurality of pixels 310, 320, 330, and 340. It should be understood that fig. 8 shows only a partial region of the pixel array 300 for convenience of explanation.

Each of the plurality of pixels 310 to 340 may include first to fourth photodiodes PD1, PD2, PD3, and PD4. The first to fourth photodiodes PD1 to PD4 may be arranged in the first direction (X-axis direction) and the second direction (Y-axis direction), and may be located on substantially the same level in the third direction (Z-axis direction). The first device isolation film 301 may be disposed between the plurality of pixels 310 to 340, and the second device isolation film 302 may be disposed for each of the plurality of pixels 310 to 340. A plurality of unit regions for forming the first to fourth photodiodes PD1 to PD4 in each of the plurality of pixels 310 to 340 may be defined by the second device isolation film 302.

In the example embodiment shown in fig. 8, at least a portion of the first through fourth photodiodes PD1 through PD4 in each of the plurality of pixels 310 through 340 may be physically connected to each other. For example, in the case of the first pixel 310, the first to third photodiodes PD1 to PD3 may be physically connected to each other to form the first photodiode group PG1. The fourth photodiode PD4 of the first pixel 310 may independently provide the second photodiode group PG2. In the second pixel 320, the first photodiode PD1 and the second photodiode PD2 may be physically connected to each other to provide a first photodiode group PG1, and the third photodiode PD3 and the fourth photodiode PD4 may be physically connected to each other to provide a second photodiode group PG2.

The first photodiode group PG1 of the third pixel 330 may include a first photodiode PD1 and a third photodiode PD3 physically connected to each other, and the second photodiode group PG2 may include a second photodiode PD2 and a fourth photodiode PD4 physically connected to each other. In the case of the fourth pixel 340, the first photodiode PD1 may independently provide the first photodiode group PG1, and the second to fourth photodiodes PD2 to PD4 physically connected to each other may provide the second photodiode group PG2.

In the above description, the expression "physically connected" may be interpreted to mean that two or more of the first to fourth photodiodes PD1 to PD4 are directly connected through a device connection layer. For example, in an example embodiment, in the case of the first pixel 310, the first to third photodiodes PD1 to PD3 may be commonly connected to a device connection layer to form a first photodiode group PG1. In example embodiments of the inventive concept, the device connection layers providing the first or second photodiode groups PG1 or PG2 may have different shapes, areas, or the like, in the pixels 310 to 340 adjacent to each other. For example, the shape, area, number, and the like of each device connection layer of the first and second pixels 310 and 320 may be different from each other.

For example, according to example embodiments, the device connection layer disposed at a lower portion of the plurality of photodiodes PD1 to PD4 may separate the plurality of photodiodes PD1 to PD4 into the first photodiode group PG1 and the second photodiode group PG2 by connecting at least portions of the plurality of photodiodes PD1 to PD4 to each other.

At least a portion of the first to fourth photodiodes PD1 to PD4 may be connected to each other through the device connection layer, so that the pixel circuit may read the pixel voltage substantially simultaneously from the two or more photodiodes PD1 to PD4 connected to the device connection layer. Therefore, according to example embodiments, the number of readout operations performed to read the pixel voltage from each of the plurality of pixels 310 to 340 may be reduced, thereby reducing the time and power consumption required for the readout operation. As a result, according to example embodiments, an increase in noise due to an increase in the number of times a readout operation is performed may be significantly reduced. In addition, one transfer transistor may be connected to each of the first and second photodiode groups PG1 and PG2. For example, the number of photodiodes PD1 to PD4 in each of the plurality of pixels 310 to 340 may be greater than the number of transfer transistors.
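For illustration, the groupings of fig. 8 described above can be summarized as a small configuration table. The dictionary layout and pixel labels below are assumptions introduced only to show that each pixel needs fewer transfer transistors than photodiodes when one transfer transistor serves each group.

```python
# Hypothetical per-pixel grouping derived from the fig. 8 description:
# each value lists which photodiodes the device connection layer joins into PG1 / PG2.
PIXEL_GROUPS = {
    "pixel_310": {"PG1": ["PD1", "PD2", "PD3"], "PG2": ["PD4"]},
    "pixel_320": {"PG1": ["PD1", "PD2"],        "PG2": ["PD3", "PD4"]},
    "pixel_330": {"PG1": ["PD1", "PD3"],        "PG2": ["PD2", "PD4"]},
    "pixel_340": {"PG1": ["PD1"],               "PG2": ["PD2", "PD3", "PD4"]},
}

# One transfer transistor per group, so each pixel uses only two transfer
# transistors for its four photodiodes (fewer transistors than photodiodes).
for name, groups in PIXEL_GROUPS.items():
    n_photodiodes = sum(len(pds) for pds in groups.values())
    n_transfer_transistors = len(groups)
    print(name, n_photodiodes, n_transfer_transistors)
```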

Fig. 9 and 10 are cross-sectional views of the pixel array 300 shown in fig. 8 taken along lines I-I' and II-II', respectively. Referring to fig. 9 and 10, the third and fourth pixels 330 and 340 may be separated from each other by the first device isolation film 301, and the second device isolation film 302 may be formed inside each of the third and fourth pixels 330 and 340. Each of the third and fourth pixels 330 and 340 may have a plurality of unit regions defined by the second device isolation film 302, and a plurality of photodiodes PD1 to PD4 may be formed in the plurality of unit regions.

The third and fourth pixels 330 and 340 may include microlenses 331 and 341, color filters 333 and 343, and pixel circuits 335 and 345. In each of the third and fourth pixels 330 and 340 disposed adjacent to each other, the color filters 333 and 343 may transmit light of different colors. For example, the color filter 333 of the third pixel 330 may be a green color filter transmitting green light, and the color filter 343 of the fourth pixel 340 may be a red color filter transmitting red light. The pixel circuits 335 and 345 may include, for example, a driving transistor, a reset transistor, a selection transistor, a transfer transistor, and the like.

In the third pixel 330, the first and third photodiodes PD1 and PD3 may be connected to each other through the first device connection layer CL1 to provide a first photodiode group PG1, and the second and fourth photodiodes PD2 and PD4 may be connected to each other through the second device connection layer CL2 to provide a second photodiode group PG2. In the fourth pixel 340, the second to fourth photodiodes PD2 to PD4 may be connected to each other through one device connection layer CL to provide a second photodiode group PG2.

As shown in fig. 9 and 10, the device connection layers CL1, CL2, and CL may be disposed between the photodiodes PD1 to PD4 and the pixel circuits 335 and 345. Further, in an example embodiment, the device isolation films 301 and 302 may not extend entirely from the color filters 333 and 343 to the pixel circuits 335 and 345. The device connection layers CL1, CL2, and CL may be formed in regions where the device isolation films 301 and 302 are not formed. For example, the device connection layers CL1, CL2, and CL may be doped with impurities to physically connect some of the photodiodes PD1 to PD4 to each other and to the floating diffusion regions of the pixel circuits 335 and 345. In example embodiments, the device connection layers CL1, CL2, and CL may be doped with N-type impurities.

Fig. 11 is a cross-sectional view of the pixel array 300 shown in fig. 8, taken along line III-III'. Referring to fig. 11, the first pixel 310 and the third pixel 330 may be separated from each other by a first device isolation film 301, and a second device isolation film 302 may be formed inside each of the first pixel 310 and the third pixel 330. The depths of the first and second device isolation films 301 and 302 in the third direction (Z-axis direction) may be less than the depth of the semiconductor substrate in which the photodiodes PD1 to PD4 are formed. The first pixel 310 may include a microlens 311 and a color filter 313.

In the first pixel 310, the first to third photodiodes PD1 to PD3 may be connected by a device connection layer CL. The first to third photodiodes PD1 to PD3 connected through the device connection layer CL may provide a first photodiode group PG1, and the fourth photodiode PD4 may independently provide a second photodiode group PG2. The pixel circuit 315 of the first pixel 310 may obtain the first pixel voltage corresponding to the sum of the charges generated in the first to third photodiodes PD1 to PD3 substantially simultaneously through the device connection layer CL. Accordingly, by reducing the number of readout operations for obtaining the pixel voltage, the operation speed, power consumption, and/or noise characteristics of the image sensor can be improved.

Fig. 12 is a cross-sectional view of the pixel array 300 shown in fig. 8, taken along line IV-IV'. Referring to fig. 12, the second pixel 320 and the fourth pixel 340 may be separated from each other by the first device isolation film 301, and the second device isolation film 302 may be formed inside each of the second pixel 320 and the fourth pixel 340. The second pixel 320 may include a microlens 321, a color filter 323, and a pixel circuit 325. The device connection layers CL1, CL2, and CL may be disposed between the device isolation films 301 and 302 and the pixel circuits 325 and 345 in the third direction (Z-axis direction).

The second pixel 320 may include a first photodiode group PG1 having a first device connection layer CL1 and first and second photodiodes PD1 and PD2, and a second photodiode group PG2 having a second device connection layer CL2 and third and fourth photodiodes PD3 and PD4. In the fourth pixel 340, the second to fourth photodiodes PD2 to PD4 may be physically connected to each other through one device connection layer CL to provide a second photodiode group PG2.

The image sensor may provide an auto-focus function using a phase difference of pixel signals obtained from the photodiode groups PG1 and PG2 of each of the pixels 310 to 340. In the example embodiments described with reference to fig. 8 to 12, the photodiode groups PG1 and PG2 may have different shapes and/or areas in at least a part of the pixels 310 to 340 adjacent to each other. Accordingly, since at least a portion of the pixels 310 to 340 adjacent to each other provide information required for focusing in different directions, the performance of the image sensor can be improved by providing an auto-focusing function in various directions. In addition, according to example embodiments of the inventive concepts, by bundling at least a part of the photodiodes PD1 to PD4 included in each of the pixels 310 to 340 into the photodiode groups PG1 and PG2, power consumption and time required for a readout operation and noise generated in the readout operation may be reduced.

Fig. 13 and 14 are diagrams illustrating a pixel structure of an image sensor according to an example embodiment of the inventive concepts.

Fig. 13 is a plan view illustrating a partial region of a pixel array 400 of an image sensor according to an example embodiment of the inventive concepts. Fig. 14 is a cross-sectional view of the pixel array 400 shown in fig. 13, taken along line V-V'.

Referring to fig. 13 and 14, a plurality of pixels 410, 420, 430, and 440 may be separated from each other by a device isolation film 401. Each of the plurality of pixels 410 to 440 may include first to fourth photodiodes PD1 to PD4. In each of the plurality of pixels 410 to 440, at least a portion of the first to fourth photodiodes PD1 to PD4 may be connected to each other to provide the first or second photodiode group PG1 or PG2. The photodiode groups PG1 and PG2 may be defined by device connection layers CL1, CL2, and CL that connect at least a part of the first to fourth photodiodes PD1 to PD4. The third pixel 430 may include a microlens 431, a color filter 433, and a pixel circuit 435. The fourth pixel 440 may include a microlens 441, a color filter 443, and a pixel circuit 445.

In the example embodiments illustrated in fig. 13 and 14, the device isolation film 401 may be formed only at the boundaries between the plurality of pixels 410 to 440, and the device isolation film 401 is not formed inside each of the plurality of pixels 410 to 440. In addition, referring to fig. 14, device connection layers CL1, CL2, and CL may be disposed between the pixel circuits 435 and 445 and the color filters 433 and 443 to physically connect at least a portion of the first to fourth photodiodes PD1 to PD4 to each other.

The device connection layers CL1, CL2, and CL may be doped with N-type impurities, and the device connection layers CL1, CL2, and CL included in the plurality of pixels 410 to 440 adjacent to each other may have different shapes or areas. For example, the area of the device connection layer CL formed in the fourth pixel 440 may be larger than the areas of the first and second device connection layers CL1 and CL2 formed in the third pixel 430.

The light receiving areas of the photodiode groups PG1 and PG2 included in the plurality of pixels 410 to 440 may be determined by the device connection layers CL1, CL2, and CL. For example, in an example embodiment, the light receiving areas of the photodiode groups PG1 and PG2 of the third pixel 430 may be different from the light receiving areas of the photodiode groups PG1 and PG2 of the fourth pixel 440. In contrast, in example embodiments, the light receiving areas of the photodiode groups PG1 and PG2 of the third pixel 430 may be substantially the same as the light receiving areas of the photodiode groups PG1 and PG2 of the second pixel 420.

Fig. 15 and 16 are diagrams illustrating a pixel structure of an image sensor according to an example embodiment of the inventive concepts.

Fig. 15 is a plan view illustrating a partial region of a pixel array 500 of an image sensor according to an example embodiment of the inventive concepts. Fig. 16 is a cross-sectional view of the pixel array 500 shown in fig. 15 taken along line VI-VI'.

The pixel array 500 may include a plurality of pixels 510, 520, 530, and 540 separated by a device isolation film 501. The plurality of photodiodes PD1 to PD4 may be disposed in each of the plurality of pixels 510 to 540 along the first direction (X-axis direction) and the second direction (Y-axis direction). At least a part of the plurality of photodiodes PD1 to PD4 in each of the plurality of pixels 510 to 540 may be connected to each other through the device connection layers CL1, CL2, and CL to provide the photodiode groups PG1 and PG2. Referring to fig. 16, the third pixel 530 may include a microlens 531, a color filter 533, and a pixel circuit 535. The fourth pixel 540 may include a microlens 541, a color filter 543, and a pixel circuit 545. Device connection layers CL1, CL2, and CL may be disposed between the pixel circuits 535 and 545 and the color filters 533 and 543. The microlenses 531 and 541 may be formed on the upper portions of the color filters 533 and 543.

In the example embodiments shown in fig. 15 and 16, a charge transport layer CM connecting the photodiodes PD1 to PD4 to each other may be formed. For example, in the third pixel 530, when light is excessively introduced into the first photodiode PD1 and saturates it, some of the charges generated in the first photodiode PD1 may be transferred to the second photodiode PD2 through the charge transport layer CM. Therefore, saturation of the photodiodes PD1 to PD4 can be prevented or reduced by the charge transport layer CM. According to example embodiments, the charge transport layer CM may be connected between different photodiode groups PG1 and PG2, or may also be connected between photodiodes PD1 to PD4 belonging to the same photodiode group PG1 or PG2. As shown in fig. 16, the charge transport layer CM may be disposed between the device connection layers CL1, CL2, and CL and the color filters 533 and 543 in the third direction (Z-axis direction).
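The anti-saturation role of the charge transport layer CM can be illustrated with a simple overflow model; the full-well capacity and charge values below are hypothetical assumptions, not parameters from the disclosure.

```python
# Illustrative overflow model for the charge transport layer CM:
# charge beyond a photodiode's full-well capacity spills to a connected photodiode.
FULL_WELL = 6000  # electrons, hypothetical full-well capacity per photodiode

def redistribute(q_source: int, q_neighbor: int) -> tuple[int, int]:
    """Move charge exceeding FULL_WELL from a saturated photodiode to a
    photodiode connected to it through the charge transport layer."""
    overflow = max(0, q_source - FULL_WELL)
    return q_source - overflow, q_neighbor + overflow

# Example: PD1 is over-exposed; part of its charge is transferred to PD2.
pd1, pd2 = redistribute(7500, 3000)
print(pd1, pd2)  # 6000 4500
```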

Fig. 17 is a diagram illustrating an image sensor according to an example embodiment of the inventive concepts.

Fig. 17 is a plan view illustrating a partial region of a pixel array 600 of an image sensor according to an exemplary embodiment of the inventive concept. Referring to fig. 17, the pixel array 600 may include a plurality of pixels 610, 620, 630, and 640 arranged in a first direction (X-axis direction) and a second direction (Y-axis direction). Each of the plurality of pixels 610 to 640 may include a plurality of photodiodes PD1 to PD4. The number of the plurality of photodiodes PD1 to PD4 included in each of the plurality of pixels 610 to 640 may be modified differently.

In the example embodiment shown in fig. 17, each of the plurality of pixels 610 to 640 may include transfer transistors connected to the plurality of photodiodes PD1 to PD4, respectively. In each of the plurality of pixels 610 to 640, the number of photodiodes PD1 to PD4 and the number of transfer transistors may be equal to each other. In addition, in each of the plurality of pixels 610 to 640, at least part of the gate electrode layers of the transfer transistors may be connected to each other through connection lines 611, 612, 621, 622, 631, 632, 641, and 642.

For example, referring to the first pixel 610, gate electrode layers of transfer transistors connected to the first photodiode PD1 and the third photodiode PD3 may be connected to each other through the first connection line 611. The first connection line 611 may be connected to the first transfer control line TL1 through the intermediate line 613. In the first pixel 610, the gate electrode layers of the transfer transistors connected to the second photodiode PD2 and the fourth photodiode PD4 may be connected to each other by a second connection line 612. The second connection line 612 may be connected to the second transfer control line TL2 through an intermediate line 614.

Accordingly, the charges generated in the first and third photodiodes PD1 and PD3 may be moved together to the floating diffusion region by a transfer control signal transmitted through the first transfer control line TL1. In addition, the charges generated in the second and fourth photodiodes PD2 and PD4 may be moved together to the floating diffusion region by a transfer control signal, which is transferred through the second transfer control line TL2. For example, the first and third photodiodes PD1 and PD3 may operate as the first photodiode group PG1, and the second and fourth photodiodes PD2 and PD4 may operate as the second photodiode group PG2.

Next, referring to the second pixel 620, the gate electrode layers of the transfer transistors connected to the first photodiode PD1 and the second photodiode PD2 may be connected to each other through the first connection line 621. The first connection line 621 may be connected to the first transfer control line TL1 through an intermediate line 623. In the second pixel 620, the gate electrode layers of the transfer transistors connected to the third photodiode PD3 and the fourth photodiode PD4 may be connected to each other through a second connection line 622. The second connection line 622 may be connected to the second transfer control line TL2 through an intermediate line 624.

According to example embodiments, at least some of the connection lines 611, 612, 621, 622, 631, 632, 641, and 642 may separate the plurality of photodiodes PD1 through PD4 into the first and second photodiode groups PG1 and PG2 by connecting at least a portion of gate electrode layers of the plurality of transfer transistors to each other.

The charges generated in the first and second photodiodes PD1 and PD2 of the second pixel 620 may be moved together to the floating diffusion region by a transfer control signal transmitted through the first transfer control line TL1. In addition, the charges generated in the third and fourth photodiodes PD3 and PD4 of the second pixel 620 may be moved together to the floating diffusion region by a transfer control signal transmitted through the second transfer control line TL2. For example, the first and second photodiodes PD1 and PD2 may operate as the first photodiode group PG1, and the third and fourth photodiodes PD3 and PD4 may operate as the second photodiode group PG2.

In an example embodiment, the third pixel 630 may have a similar structure to the first pixel 610, and the fourth pixel 640 may have a similar structure to the second pixel 620. Alternatively, in an example embodiment, the third pixel 630 may have a similar structure to the second pixel 620, and the fourth pixel 640 may have a similar structure to the first pixel 610.

Referring to the first and second pixels 610 and 620, the first transfer control line TL1 may be connected to the first photodiode group PG1, and the second transfer control line TL2 may be connected to the second photodiode group PG2. Accordingly, the first photodiode groups PG1 of the first and second pixels 610 and 620 may be activated substantially simultaneously through the first transfer control line TL1, and the second photodiode groups PG2 of the first and second pixels 610 and 620 may be activated substantially simultaneously through the second transfer control line TL2.

Referring to the third pixel 630, the connection line 631 may be connected to the first transfer control line TL1 through an intermediate line 633, and the connection line 632 may be connected to the second transfer control line TL2 through an intermediate line 634.

Referring to the fourth pixel 640, the connection line 641 may be connected to the first transfer control line TL1 through an intermediate line 643, and the connection line 642 may be connected to the second transfer control line TL2 through an intermediate line 644.

The operation of the image sensor will be described in more detail below with reference to fig. 18 and 19.

Fig. 18 is a circuit diagram illustrating a pixel circuit of an image sensor according to an example embodiment of the inventive concepts. Fig. 19 is a timing diagram illustrating an operation of an image sensor according to an example embodiment of the inventive concepts.

Fig. 18 is a circuit diagram showing pixel circuits of the first pixel 610 and the second pixel 620 of the pixel array 600 shown in fig. 17. Referring to fig. 18, each of the first and second pixels 610 and 620 may include first to fourth photodiodes PD1 to PD4, first to fourth transfer transistors TX1 to TX4, a reset transistor RX, a driving transistor DX, and a selection transistor SX. The selection transistor SX of the first pixel 610 may be connected to the first column line COL1, and the selection transistor SX of the second pixel 620 may be connected to the second column line COL2.

The reset transistors RX of the first and second pixels 610 and 620 may be controlled by a reset control signal RG, and the selection transistors SX of the first and second pixels 610 and 620 may be controlled by a selection control signal SEL. The first and third transfer transistors TX1 and TX3 of the first pixel 610 and the first and second transfer transistors TX1 and TX2 of the second pixel 620 may be controlled by a first transfer control signal TG1, wherein the first transfer control signal TG1 is transmitted through a first transfer control line TL1. In addition, the second and fourth transfer transistors TX2 and TX4 of the first pixel 610 and the third and fourth transfer transistors TX3 and TX4 of the second pixel 620 may be controlled by a second transfer control signal TG2, wherein the second transfer control signal TG2 is transmitted through a second transfer control line TL2.
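
For reference, the wiring described above can be summarized as a simple lookup. The sketch below is only an illustrative summary of fig. 18 in Python; the dictionary keys and the helper function are hypothetical names introduced here, not elements of the specification.

```python
# Illustrative summary of the fig. 18 wiring; the names used here are assumptions.
TRANSFER_MAP = {
    "TG1": {"pixel_610": ("TX1", "TX3"),   # first photodiode group PG1 of the first pixel 610
            "pixel_620": ("TX1", "TX2")},  # first photodiode group PG1 of the second pixel 620
    "TG2": {"pixel_610": ("TX2", "TX4"),   # second photodiode group PG2 of the first pixel 610
            "pixel_620": ("TX3", "TX4")},  # second photodiode group PG2 of the second pixel 620
}

def transistors_driven(signal, pixel):
    """Return the transfer transistors that turn on in `pixel` when `signal` goes high."""
    return TRANSFER_MAP[signal][pixel]

print(transistors_driven("TG1", "pixel_610"))  # ('TX1', 'TX3')
```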

Referring to fig. 19, the operation of the image sensor according to an example embodiment of the inventive concepts may begin when the reset transistor RX is turned on by the reset control signal RG. When the reset transistor RX is turned on, the floating diffusion regions FD of the first and second pixels 610 and 620 may be reset by the power supply voltage VDD. After the reset transistor RX is turned off, the readout circuit may sample the reset voltage of each of the first and second pixels 610 and 620 during the reset sampling time TR, in which the reset voltage sampling signal SHR has a high logic value.

When the reset sampling time TR elapses, the first and third transfer transistors TX1 and TX3 of the first pixel 610 and the first and second transfer transistors TX1 and TX2 of the second pixel 620 may be turned on by the first transfer control signal TG1. Accordingly, the charges of the first and third photodiodes PD1 and PD3 of the first pixel 610 may be moved together to the floating diffusion region FD. In addition, the charges of the first and second photodiodes PD1 and PD2 of the second pixel 620 may be moved together to the floating diffusion region FD.

When the first and third transfer transistors TX1 and TX3 of the first pixel 610 and the first and second transfer transistors TX1 and TX2 of the second pixel 620 are turned off, the readout circuit may obtain a first pixel voltage at each of the first and second pixels 610 and 620 in response to the pixel voltage sampling signal SHS. The first pixel voltage obtained by the readout circuit at the first pixel 610 may be a voltage corresponding to the charges of the first photodiode PD1 and the third photodiode PD3 of the first pixel 610. Further, the first pixel voltage obtained by the readout circuit at the second pixel 620 may be a voltage corresponding to the charges of the first photodiode PD1 and the second photodiode PD2 of the second pixel 620. The first pixel voltage obtained by the readout circuit may be stored in, for example, a memory. For example, the memory may be included in one semiconductor package together with the image sensor.

When the first sampling time TS1 elapses, the second and fourth transfer transistors TX2 and TX4 of the first pixel 610 and the third and fourth transfer transistors TX3 and TX4 of the second pixel 620 may be turned on by the second transfer control signal TG2. Accordingly, the charges of the second and fourth photodiodes PD2 and PD4 of the first pixel 610 may be moved together to the floating diffusion region FD. In addition, the charges of the third and fourth photodiodes PD3 and PD4 of the second pixel 620 may be moved together to the floating diffusion region FD.

In an example embodiment, there is no period in which the floating diffusion region FD is reset by the reset control signal RG between a period in which the first transfer control signal TG1 has a high logic value and a period in which the second transfer control signal TG2 has a high logic value. Accordingly, when the second transfer control signal TG2 has a high logic value, the charges generated in the first to fourth photodiodes PD1 to PD4 may be accumulated in the floating diffusion region FD of each of the first and second pixels 610 and 620.

When the second and fourth transfer transistors TX2 and TX4 of the first pixel 610 and the third and fourth transfer transistors TX3 and TX4 of the second pixel 620 are turned off by the second transfer control signal TG2, the readout circuit may detect the sum pixel voltage during the second sampling time TS2. The sum pixel voltage detected from the first pixel 610 by the readout circuit may be a voltage corresponding to the sum of the charges of the first to fourth photodiodes PD1 to PD4 of the first pixel 610. Similarly, the sum pixel voltage detected by the readout circuit from the second pixel 620 may be a voltage corresponding to the sum of the charges of the first to fourth photodiodes PD1 to PD4 of the second pixel 620.
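
The sequence of fig. 19 can be summarized by the following sketch. It is a simplified arithmetic model under assumed conditions (the reset level, the conversion gain, and the function and variable names are hypothetical); it only shows how the first pixel voltage and the sum pixel voltage arise from one reset sample and two signal samples taken without an intermediate reset.

```python
# Simplified model of the fig. 19 readout sequence; v_reset and gain are assumed values.
def read_pixel(q, group1, group2, v_reset=1.0, gain=1e-3):
    """q: photodiode name -> collected charge; group1/group2: photodiode groups PG1/PG2."""
    fd = 0.0                             # floating diffusion region reset by RG
    reset_sample = v_reset               # sampled while SHR is high (time TR)

    fd += sum(q[pd] for pd in group1)    # TG1 high: PG1 charge moves to FD
    first_sample = v_reset - gain * fd   # sampled while SHS is high (time TS1)

    fd += sum(q[pd] for pd in group2)    # TG2 high: PG2 charge added, FD not reset in between
    sum_sample = v_reset - gain * fd     # sampled while SHS is high (time TS2)

    first_pixel_voltage = reset_sample - first_sample  # corresponds to the PG1 charge
    sum_pixel_voltage = reset_sample - sum_sample      # corresponds to the PG1 + PG2 charge
    return first_pixel_voltage, sum_pixel_voltage

q = {"PD1": 100, "PD2": 120, "PD3": 90, "PD4": 110}
# Grouping of the first pixel 610: PG1 = (PD1, PD3), PG2 = (PD2, PD4)
print(read_pixel(q, group1=("PD1", "PD3"), group2=("PD2", "PD4")))
```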

The control logic including the readout circuit may generate image data using the sum pixel voltages detected at each of the first pixel 610 and the second pixel 620. In addition, the control logic may obtain the second pixel voltage by calculating a difference between the sum pixel voltage detected in each of the first and second pixels 610 and 620 and the first pixel voltage. In the case of the first pixel 610, the second pixel voltage may be a voltage corresponding to the charges generated in the second and fourth photodiodes PD2 and PD4. In the case of the second pixel 620, the second pixel voltage may be a voltage corresponding to the charges generated in the third and fourth photodiodes PD3 and PD4.
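
Continuing the sketch above, the second pixel voltage follows from a single subtraction; the numeric values below are only illustrative outputs of the previous snippet, not measured data.

```python
# Illustrative numbers only (taken from the sketch above); not measured values.
first_pixel_voltage = 0.19   # corresponds to the PG1 charge of the first pixel 610
sum_pixel_voltage = 0.42     # corresponds to the PG1 + PG2 charge of the first pixel 610
second_pixel_voltage = sum_pixel_voltage - first_pixel_voltage
print(round(second_pixel_voltage, 3))  # 0.23, corresponds to the PG2 charge
```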

The control logic may calculate the first pixel signal and the second pixel signal in each of the plurality of pixels 610 to 640 using the first pixel voltage and the second pixel voltage obtained in the above-described manner. For example, the first pixel signal may be a signal corresponding to charges generated in the first photodiode group PG1 of each of the plurality of pixels 610 to 640, and the second pixel signal may be a signal corresponding to charges generated in the second photodiode group PG2.

The control logic may calculate a phase difference between the first pixel signal and the second pixel signal to generate information required for focus adjustment of the image sensor. As shown in fig. 17, since the first and second photodiode groups PG1 and PG2 are defined differently in at least some of the pixels 610 to 640, the control logic may generate information required to adjust focusing in various directions. Meanwhile, the information required for focus adjustment and the image data may be obtained with a number of readout operations smaller than the number of photodiodes PD1 to PD4 included in each of the plurality of pixels 610 to 640. Accordingly, the time and power consumption required for the readout operation can be reduced, and the influence of noise generated in the readout operation can be reduced, thereby improving the performance of the image sensor.
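
The specification does not fix a particular phase-difference algorithm. As one possible illustration, the sketch below estimates the shift between the first and second pixel signals gathered along one direction by minimizing the sum of absolute differences; all names and sample values are assumptions introduced here.

```python
# Illustrative phase-difference estimate; the algorithm and values are assumptions,
# not part of the specification.
def phase_difference(first_signals, second_signals, max_shift=4):
    """first_signals/second_signals: first/second pixel signals along one direction."""
    best_shift, best_score = 0, float("inf")
    n = len(first_signals)
    for shift in range(-max_shift, max_shift + 1):
        score = sum(
            abs(first_signals[i] - second_signals[i + shift])
            for i in range(max(0, -shift), min(n, n - shift))
        )
        if score < best_score:
            best_shift, best_score = shift, score
    return best_shift  # sign and magnitude indicate defocus direction and amount

print(phase_difference([1, 2, 5, 9, 5, 2, 1, 0], [0, 1, 2, 5, 9, 5, 2, 1]))  # -> 1
```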

As shown in the figures illustrating cross-sectional views of the plurality of photodiodes PD1 to PD4, according to example embodiments, at least some of the plurality of photodiodes PD1 to PD4 may be formed at substantially the same depth in the semiconductor substrate (e.g., the first layer 40 in fig. 2) in which the plurality of photodiodes PD1 to PD4 are formed. For example, in an example embodiment, the heights of at least some of the plurality of photodiodes PD1 to PD4 may be approximately equal to each other. Further, in example embodiments, the distances between the upper surface of each of at least some of the plurality of photodiodes PD1 to PD4 and the upper surface of the semiconductor substrate in which they are formed may be substantially the same as each other.

Fig. 20 is a block diagram illustrating an electronic device including an image sensor according to an example embodiment of the inventive concepts.

The computer device 1000 according to the example embodiment shown in fig. 20 may include an image sensor 1010, a display 1020, a memory 1030, a processor 1040, and a port 1050. In addition, the computer device 1000 may also include a wired/wireless communication device, a power supply device, and the like. Among the components shown in fig. 20, the port 1050 may be used, for example, to communicate with a video card, a sound card, a memory card, a USB device, and the like. The computer device 1000 may be, for example, a desktop or laptop computer, a smartphone, a tablet PC, a wearable device such as a smart watch, or the like.

The processor 1040 may perform particular operations, commands, tasks, or the like. The processor 1040 may be, for example, a central processing unit CPU, a microcontroller unit MCU, a system on chip SOC, or the like, and may communicate with the image sensor 1010, the display 1020, and the memory 1030, as well as with other devices connected to the port 1050, via the bus 1060.

The memory 1030 may be a storage medium storing data required for the operation of the computer device 1000, multimedia data, and the like. The memory 1030 may include a volatile memory, such as a random access memory RAM, or a non-volatile memory, such as a flash memory. In addition, the memory 1030 may further include at least one of a solid state drive SSD, a hard disk drive HDD, and an optical disk drive ODD as a storage device. Input/output devices may include input devices, such as a keyboard, a mouse, and a touch screen, and output devices, such as a display and an audio output unit.

The image sensor 1010 may be mounted on a package substrate and connected to the processor 1040 by the bus 1060 or other communication means. The image sensor 1010 may be applied to the computer device 1000 in the form of any of the various example embodiments described with reference to fig. 1 to 19.

Example embodiments are described in terms of functional blocks, units and/or modules and are illustrated in the accompanying drawings as would be common in the art of the inventive concept. Those skilled in the art will appreciate that the blocks, units and/or modules are physically implemented via electronic (or optical) circuitry, such as logic circuitry, discrete components, microprocessors, hardwired circuitry, memory elements, wired connections, etc., which may be formed using semiconductor-based or other manufacturing techniques. Where the blocks, units, and/or modules are implemented by a microprocessor or the like, they may be programmed using software (e.g., microcode) to perform the various functions discussed herein, and may optionally be driven by firmware and/or software. Alternatively, each block, unit and/or module may be implemented by dedicated hardware or as a combination of dedicated hardware for performing some functions and a processor (e.g., one or more programmed microprocessors and associated circuitry) for performing other functions.

As described above, according to example embodiments of the inventive concepts, it is possible to reduce a readout time and power consumed in a readout operation by substantially simultaneously reading pixel voltages corresponding to charges generated in at least a portion of a plurality of photodiodes included in each of a plurality of pixels of an image sensor.

While the present inventive concept has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present inventive concept as defined by the following claims.
