Image sensor, electronic device, and method for operating image sensor

Document No.: 1941733  Publication date: 2021-12-07  Views: 21  Language: Chinese

Reading note: This technology, "Image sensor, electronic device, and method for operating image sensor", was designed and created by 李正局, 崔允硕, 安度昶, and 郑小荣 on 2021-06-03. Its main content is as follows: There is provided an image sensor including: a pixel array including a plurality of pixel groups, each of the plurality of pixel groups including first pixels to which a first conversion gain is applied and second pixels to which a second conversion gain is applied; a readout circuit configured to: receive, for each of the plurality of pixel groups, a first pixel signal corresponding to the first pixels and a second pixel signal corresponding to the second pixels through a single readout, generate first image data based on the first pixel signals of the plurality of pixel groups, and generate second image data based on the second pixel signals of the plurality of pixel groups; and an image signal processor configured to generate output image data by merging the first image data and the second image data in units of pixel groups.

1. An image sensor, comprising:

a pixel array including a plurality of pixel groups, each of the plurality of pixel groups including first pixels to which a first conversion gain is applied and second pixels to which a second conversion gain is applied;

a readout circuit configured to: receive, for each of the plurality of pixel groups, a first pixel signal corresponding to the first pixels and a second pixel signal corresponding to the second pixels through a single readout, generate first image data based on the first pixel signals of the plurality of pixel groups, and generate second image data based on the second pixel signals of the plurality of pixel groups; and

an image signal processor configured to generate output image data by combining the first image data and the second image data in units of pixel groups.

2. The image sensor of claim 1, wherein, for each of the plurality of pixel groups, the first pixel signal is read out by summing first pixel values of the first pixels and the second pixel signal is read out by summing second pixel values of the second pixels.

3. The image sensor of claim 1, wherein each of the first conversion gain and the second conversion gain is a high conversion gain or a low conversion gain.

4. The image sensor of claim 1, wherein each of the plurality of pixel groups comprises:

2n first pixels and 2n second pixels, where n is a positive integer.

5. The image sensor of claim 4,

wherein the first pixel of each of the plurality of pixel groups comprises:

a red pixel, a blue pixel, a first green pixel or a second green pixel, and

wherein the second pixel of each of the plurality of pixel groups comprises:

a white pixel or a yellow pixel.

6. The image sensor of claim 4, wherein the first and second pixels of each of the plurality of pixel groups comprise:

a red pixel, a blue pixel, a first green pixel, or a second green pixel.

7. The image sensor of claim 1, wherein each of the plurality of pixel groups comprises:

a third pixel to which a third conversion gain is applied.

8. The image sensor of claim 7, wherein the first, second, and third pixels of each of the plurality of pixel groups comprise:

a red pixel, a blue pixel, a first green pixel, or a second green pixel.

9. The image sensor of claim 7, wherein the readout circuit is further configured to:

receive, for each of the plurality of pixel groups, a third pixel signal corresponding to the third pixel to which the third conversion gain is applied through the single readout, and

generate third image data based on the third pixel signals of the plurality of pixel groups.

10. The image sensor of claim 9, wherein the image signal processor is further configured to: generate the output image data by merging the first image data, the second image data, and the third image data in units of pixel groups.

11. The image sensor of claim 1, wherein each of the plurality of pixel groups comprises:

a third pixel to which a third conversion gain is applied and a fourth pixel to which a fourth conversion gain is applied.

12. The image sensor of claim 11, wherein the readout circuit is further configured to:

receive, for each of the plurality of pixel groups, a third pixel signal corresponding to the third pixel and a fourth pixel signal corresponding to the fourth pixel through the single readout, and

generate third image data based on the third pixel signals of the plurality of pixel groups and fourth image data based on the fourth pixel signals of the plurality of pixel groups.

13. The image sensor of claim 12, wherein the image signal processor is further configured to: generate the output image data by merging the first image data, the second image data, the third image data, and the fourth image data in units of pixel groups.

14. An electronic device, comprising:

an image sensor in which a plurality of pixel groups are arranged, each of the plurality of pixel groups including a plurality of pixels, the image sensor being configured to: generate, for each of the plurality of pixel groups, a plurality of pixel signals corresponding to a plurality of conversion gains through a single readout, generate a plurality of image data corresponding to the plurality of pixel signals based on the plurality of pixel signals, and generate output image data by combining the plurality of image data; and

a processor configured to perform image processing on the output image data.

15. The electronic device of claim 14,

each of the plurality of pixel groups includes:

pixels corresponding to the plurality of conversion gains, and

the image sensor is further configured to: generate the plurality of pixel signals corresponding to the plurality of conversion gains by summing, for each of the plurality of pixel groups, pixel values of the pixels corresponding to the plurality of conversion gains.

16. The electronic device of claim 14,

each of the plurality of pixel groups includes:

a first RGB pixel corresponding to a first conversion gain; and

a white pixel corresponding to the second conversion gain, and

the image sensor is further configured to: generate, for each of the plurality of pixel groups, a first pixel signal corresponding to the first RGB pixel and a second pixel signal corresponding to the white pixel.

17. The electronic device of claim 14,

each of the plurality of pixel groups includes:

a first RGB pixel corresponding to a first conversion gain; and

a second RGB pixel corresponding to a second conversion gain, and

the image sensor is further configured to: generate, for each of the plurality of pixel groups, a first pixel signal corresponding to the first RGB pixel and a second pixel signal corresponding to the second RGB pixel.

18. The electronic device of claim 14,

the image sensor is further configured to: generate first output image data corresponding to a first frame portion and second output image data corresponding to a second frame portion, and

the processor is further configured to: generate multi-frame image data by merging the first output image data and the second output image data.

19. A method of operation of an image sensor, the method of operation comprising:

outputting, through a single readout, a first pixel signal corresponding to a first pixel to which a first conversion gain is applied, from each of a plurality of pixel groups included in a pixel array;

outputting, through the single readout, a second pixel signal corresponding to a second pixel to which a second conversion gain is applied, from each of the plurality of pixel groups included in the pixel array;

generating first image data based on first pixel signals of the plurality of pixel groups;

generating second image data based on second pixel signals of the plurality of pixel groups; and

generating output image data by merging the first image data and the second image data in units of pixel groups.

20. The operating method of claim 19, wherein outputting the first pixel signal comprises: outputting the first pixel signal by summing first pixel values of the first pixels, and

outputting the second pixel signal includes: outputting the second pixel signal by summing second pixel values of the second pixels.

Technical Field

At least some example embodiments of the present inventive concepts relate to an image sensor, and more particularly, to an image sensor that generates, through a single readout, image data to which a plurality of conversion gains are applied, and a method of operating the same.

Background

An image sensor may sense an image of an object by using a photoelectric conversion element that reacts according to the intensity of light reflected from the object, and may generate image data accordingly.

In recent years, a Dual Conversion Gain (DCG) technique has been applied to realize High Dynamic Range (HDR) images in image sensors. In related-art implementations of the DCG technique, an HDR image is obtained by performing a plurality of readouts, with a High Conversion Gain (HCG) applied to some readouts and a Low Conversion Gain (LCG) applied to others. As a result, the frame rate is lowered.

Disclosure of Invention

At least some example embodiments of the inventive concepts provide an image sensor that generates, through a single readout, image data to which a plurality of conversion gains are applied, an electronic device, and an operating method of the image sensor.

According to at least some example embodiments of the inventive concepts, an image sensor includes: a pixel array including a plurality of pixel groups, each of the plurality of pixel groups including first pixels to which a first conversion gain is applied and second pixels to which a second conversion gain is applied; a readout circuit configured to: receive, for each of the plurality of pixel groups, a first pixel signal corresponding to the first pixels and a second pixel signal corresponding to the second pixels through a single readout, generate first image data based on the first pixel signals of the plurality of pixel groups, and generate second image data based on the second pixel signals of the plurality of pixel groups; and an image signal processor configured to generate output image data by merging the first image data and the second image data in units of pixel groups.

According to at least some example embodiments of the inventive concepts, an electronic device includes: an image sensor in which a plurality of pixel groups are arranged, each of the plurality of pixel groups including a plurality of pixels, the image sensor being configured to: generate, for each of the plurality of pixel groups, a plurality of pixel signals corresponding to a plurality of conversion gains through a single readout, generate a plurality of image data corresponding to the plurality of pixel signals based on the plurality of pixel signals, and generate output image data by combining the plurality of image data; and a processor configured to perform image processing on the output image data.

According to at least some example embodiments of the inventive concepts, a method of operating an image sensor includes: outputting, through a single readout, a first pixel signal corresponding to a first pixel to which a first conversion gain is applied, from each of a plurality of pixel groups included in a pixel array; outputting, through the single readout, a second pixel signal corresponding to a second pixel to which a second conversion gain is applied, from each of the plurality of pixel groups; generating first image data based on the first pixel signals of the plurality of pixel groups; generating second image data based on the second pixel signals of the plurality of pixel groups; and generating output image data by merging the first image data and the second image data in units of pixel groups.

Drawings

The above and other features and advantages of the exemplary embodiments of the present inventive concept will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings. The drawings are intended to depict example embodiments of the inventive concept and should not be construed as limiting the intended scope of the claims. The drawings are not to be regarded as being drawn to scale unless specifically indicated.

Fig. 1 is a diagram illustrating an image sensor and an electronic device including the same according to an example embodiment of the inventive concepts;

fig. 2 is a diagram illustrating a pixel array according to an example embodiment of the inventive concepts;

fig. 3 is a diagram illustrating a pixel group of a pixel array according to an example embodiment of the inventive concepts;

fig. 4 is a diagram for explaining a readout operation during a combining operation according to an example embodiment of the inventive concepts;

fig. 5 is a diagram for explaining a method of generating synthetic image data according to an example embodiment of the inventive concepts;

fig. 6 to 8B are diagrams for explaining a method of generating synthetic image data according to a pattern type of a pixel array;

fig. 9 and 10 are diagrams illustrating a method of generating multi-frame High Dynamic Range (HDR) image data according to an example embodiment of the inventive concepts;

fig. 11 is a flowchart of an operating method of an image sensor according to an example embodiment of the inventive concepts;

fig. 12 is a diagram illustrating an electronic device according to an example embodiment of the inventive concepts;

fig. 13 is a diagram illustrating a portion of an electronic device according to an example embodiment of the inventive concepts; and

fig. 14 is a diagram illustrating a specific configuration of a camera module according to an example embodiment of the inventive concepts.

Detailed Description

Embodiments are described in terms of functional blocks, units and/or modules and are illustrated in the accompanying drawings as are common in the art of the inventive concept. Those skilled in the art will appreciate that the blocks, units and/or modules are physically implemented via electronic (or optical) circuitry, such as logic circuitry, discrete components, microprocessors, hardwired circuitry, memory elements, wired connections, etc., which may be formed using semiconductor-based or other manufacturing techniques. Where the blocks, units, and/or modules are implemented by a microprocessor or the like, they may be programmed using software (e.g., microcode) to perform the various functions discussed herein, and may optionally be driven by firmware and/or software. Alternatively, each block, unit and/or module may be implemented by dedicated hardware or as a combination of dedicated hardware for performing some functions and a processor (e.g., one or more programmed microprocessors and associated circuitry) for performing other functions. Furthermore, each block, unit and/or module of an embodiment may be physically separated into two or more interactive and discrete blocks, units and/or modules without departing from the scope of the inventive concept. Furthermore, the blocks, units and/or modules of an embodiment may be physically combined into more complex blocks, units and/or modules without departing from the scope of the inventive concept.

Fig. 1 is a diagram illustrating an image sensor and an electronic device including the same according to an example embodiment of the inventive concepts.

Referring to fig. 1, an electronic device 10 may include an image sensor 100 and a processor 200. The image sensor 100 may convert an optical signal of an object incident through the optical lens LW into image data. The image sensor 100 may be installed in an electronic device having an image or optical sensing function. For example, the image sensor 100 may be installed in an electronic device 10, such as a digital still camera, a digital video camera, a smart phone, a wearable device, an internet of things (IoT) device, a tablet Personal Computer (PC), a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), a navigation device, and so on. The image sensor 100 may also be mounted in an electronic device 10 provided as a component of a vehicle, furniture, a manufacturing facility, a door, various measurement devices, or the like. For example, the processor 200 may be an application processor and/or an image processor. Therefore, in this specification, the processor 200 may also be referred to as an image processor 200.

According to at least some example embodiments of the inventive concepts, the image processor 200 may be or include the following: hardware including logic circuitry; a hardware/software combination that executes software; or a combination thereof. For example, the image processor may more specifically include, but is not limited to, one or more of the following: a Central Processing Unit (CPU), processor core, Arithmetic Logic Unit (ALU), digital signal processor, microcomputer, Field Programmable Gate Array (FPGA), programmable logic unit, microprocessor, Application Specific Integrated Circuit (ASIC), etc. According to at least some example embodiments of the inventive concepts, the image processor 200 may be specifically constructed and/or programmed (e.g., via computer-executable program code) to perform and/or control some or all of the operations described in this specification as being performed by the image processor or elements of the image processor.

Referring to fig. 1, an image sensor 100 may include a pixel array 110, a readout circuit 120, and an image signal processor 130. In an example embodiment, the pixel array 110, the readout circuitry 120, and the image signal processor 130 may be embodied together as a single semiconductor chip or semiconductor module. In example embodiments, the pixel array 110 and the readout circuitry 120 may be embodied together as one semiconductor chip or semiconductor module, and the image signal processor 130 may be embodied as another semiconductor chip or semiconductor module. According to at least some example embodiments of the inventive concepts, the image signal processor 130 may be or include the following: hardware including logic circuitry; a hardware/software combination that executes software; or a combination thereof. For example, the image signal processor 130 may more specifically include, but is not limited to, one or more of the following: a Central Processing Unit (CPU), processor core, Arithmetic Logic Unit (ALU), digital signal processor, microcomputer, Field Programmable Gate Array (FPGA), programmable logic unit, microprocessor, Application Specific Integrated Circuit (ASIC), etc. According to at least some example embodiments of the inventive concepts, the image signal processor 130 may be specifically constructed and/or programmed (e.g., via computer-executable program code) to perform and/or control some or all of the operations described in this specification as being performed by the signal processor or elements of the signal processor.

The pixel array 110 may be embodied with photoelectric conversion elements such as Charge Coupled Device (CCD) or Complementary Metal Oxide Semiconductor (CMOS) elements, or various other photoelectric conversion elements. The pixel array 110 may include a plurality of pixels PX for converting a received light signal (light) into an electrical signal, and the plurality of pixels PX may be arranged in a matrix. The pixel array 110 includes a plurality of row lines and a plurality of column lines connected to the plurality of pixels PX.

Each of the plurality of pixels PX includes an optical sensing element (or a photoelectric conversion element). Examples of the optical sensing element may include a photodiode, a phototransistor, a photogate, a pinned photodiode, a perovskite photodiode, an organic photoconductive film, and the like; various other optical sensing elements may also be applied.

The plurality of pixels PX may sense light by the optical sensing element and convert the sensed light into an electrical signal. Each of the plurality of pixels PX may sense light of a specific spectral region. For example, the plurality of pixels may include a pixel for converting light of a red spectral region into an electric signal (hereinafter referred to as a red pixel), a pixel for converting light of a green spectral region into an electric signal (hereinafter referred to as a green pixel), and a pixel for converting light of a blue spectral region into an electric signal (hereinafter referred to as a blue pixel). However, example embodiments of the inventive concept are not limited thereto, and the plurality of pixels PX may further include a white pixel. As another example, the plurality of pixels PX may include a combination of pixels of different colors (e.g., yellow pixels, cyan pixels, magenta pixels, etc.).

A color filter array for transmitting light of a specific spectral region may be disposed on the plurality of pixels PX, and colors to be sensed by the plurality of pixels PX may be determined by the color filters on the plurality of pixels PX. However, example embodiments of the inventive concept are not limited thereto, and in some example embodiments of a specific optical sensing element, light of a specific wavelength band may be converted into an electrical signal according to a level of the electrical signal provided to the optical sensing element.

The electric charges generated by the photoelectric conversion element of each of the plurality of pixels PX may be accumulated in a floating diffusion node, and the charges accumulated in the floating diffusion node may be read out by being converted into a voltage. The rate at which charge accumulated in the floating diffusion node is converted into a voltage may be referred to as a conversion gain.

The conversion gain of each of the plurality of pixels PX may vary according to the capacitance of the floating diffusion node. Specifically, when the capacitance of the floating diffusion increases, the conversion gain may decrease, and when the capacitance of the floating diffusion decreases, the conversion gain may increase. The conversion gain of each of the plurality of pixels PX may be changed by a conversion gain transistor (not shown) or a capacitor (not shown) connected to the floating diffusion node.

A plurality of conversion gains, for example, a High Conversion Gain (HCG) and a Low Conversion Gain (LCG), may be applied to the plurality of pixels PX. However, the inventive concept is not limited thereto, and the plurality of conversion gains applied to the plurality of pixels PX may include three or more conversion gains. The value of HCG is higher than the value of LCG.
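As an illustrative back-of-the-envelope sketch of the relationship described above, the conversion gain follows V = Q / C_FD, so a larger floating diffusion capacitance yields a lower gain. The capacitance values below are hypothetical and not from this disclosure:

```python
# Conversion gain: voltage step per electron on the floating diffusion (FD) node.
# V = Q / C_fd, so a smaller FD capacitance yields a higher conversion gain.
Q_E = 1.602e-19  # elementary charge in coulombs


def conversion_gain_uV_per_e(c_fd_farads):
    """Return conversion gain in microvolts per electron for a given FD capacitance."""
    return Q_E / c_fd_farads * 1e6


# Hypothetical capacitances: connecting an extra capacitor or a conversion gain
# transistor to the FD node raises its capacitance and lowers the gain (LCG).
hcg = conversion_gain_uV_per_e(1.0e-15)  # small FD capacitance -> high conversion gain
lcg = conversion_gain_uV_per_e(4.0e-15)  # larger FD capacitance -> low conversion gain
assert hcg > lcg
```

With these assumed values the HCG is four times the LCG, matching the qualitative statement that the value of HCG is higher than the value of LCG.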

Fig. 2 is a diagram illustrating a pixel array according to an example embodiment of the inventive concepts.

Referring to fig. 2, the pixel array 110 may include a plurality of pixel groups PG, each of which includes two or more pixels PX adjacent to each other. For example, the pixel array 110 may include a plurality of pixel groups PG, each of which includes pixels PX arranged in a 2n × 2n matrix (n is a positive integer). Meanwhile, the present disclosure is not limited to the example shown in fig. 2. For example, the pixel array 110 may include a plurality of pixel groups PG including pixels PX arranged in a 3n × 3n matrix (n is a positive integer).

The plurality of pixel groups PG are the basic units to which the readout method according to an example embodiment of the inventive concepts is applied when the image sensor 100 operates in a first mode in which a combining operation is performed, and the plurality of pixel groups PG may correspond to a plurality of combining areas of the image data generated based on the readout signals. The pixel array 110 may output the pixel values of the pixels PX included in each of the plurality of pixel groups PG through a single readout. The pixel values of the pixels PX included in one pixel group may be summed during readout and output as at least one pixel signal.

In some example embodiments, the pixel array 110 may output a plurality of pixel signals corresponding to a plurality of conversion gains of the plurality of pixel groups PG. In example embodiments, each of the plurality of pixel groups PG may be divided into a plurality of subgroups corresponding to the plurality of conversion gains, and the pixel values of the pixels PX included in each of the plurality of subgroups may be summed and output as a plurality of pixel signals corresponding to the plurality of conversion gains.

For example, when the pixel group PG includes a first sub-group corresponding to the high conversion gain HCG and a second sub-group corresponding to the low conversion gain LCG, the pixel array 110 may sum up pixel values of the pixels PX included in the first sub-group and output a first pixel signal, and sum up pixel values of the pixels PX included in the second sub-group and output a second pixel signal. This will be described in detail below with reference to fig. 3 and 4.
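The per-subgroup summation described above can be sketched as follows. This is only a digital illustration with hypothetical pixel values and positions; in the sensor itself the summation occurs in the charge/analog domain during the single readout:

```python
# Hypothetical 2x2 pixel group: red pixels on one diagonal, white on the other.
# Each subgroup is summed into one pixel signal during a single readout.

def read_pixel_group(values, subgroup_masks):
    """values: dict mapping (row, col) -> raw pixel value.
    subgroup_masks: dict mapping subgroup name -> list of (row, col) positions.
    Returns one summed signal per subgroup."""
    return {name: sum(values[p] for p in positions)
            for name, positions in subgroup_masks.items()}


group = {(0, 0): 120, (0, 1): 340, (1, 0): 350, (1, 1): 110}  # R, W, W, R
masks = {"LCG": [(0, 0), (1, 1)],   # first subgroup: red pixels (low conversion gain)
         "HCG": [(0, 1), (1, 0)]}   # second subgroup: white pixels (high conversion gain)

signals = read_pixel_group(group, masks)
# signals == {"LCG": 230, "HCG": 690}
```

One readout of the group thus yields two signals, one per conversion gain, instead of requiring two separate readouts.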

When the image sensor 100 operates in a second mode (e.g., a normal mode in which the combining operation is not performed), the pixel array 110 may read out a plurality of pixel signals of the plurality of pixels PX in units of rows.

The readout circuit 120 may receive pixel signals from the pixel array 110 and convert the pixel signals into digital data, thereby generating image data (which may be referred to as an image). For convenience of explanation, the image data generated by the readout circuit 120 will hereinafter be referred to as first image data IDT1.

For example, in the second mode in which the combining operation is not performed, the readout circuit 120 may generate the first image data IDT1 including the pixel values of the plurality of pixels PX based on the pixel signals output from the pixel array 110.

As another example, in the first mode in which the combining operation is performed, the readout circuit 120 may receive a plurality of pixel signals corresponding to a plurality of conversion gains from the plurality of pixel groups PG, and generate a plurality of pieces of first image data IDT1 corresponding to the plurality of conversion gains based on the received plurality of pixel signals. For example, the readout circuit 120 may generate first image data IDT1 corresponding to a first conversion gain (e.g., High Conversion Gain (HCG)) based on a plurality of first pixel signals output from a first sub-group of the plurality of pixel groups PG. The readout circuit 120 may generate first image data IDT1 corresponding to a second conversion gain (e.g., Low Conversion Gain (LCG)) based on a plurality of second pixel signals output from a second sub-group of the plurality of pixel groups PG.
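Assembled over all pixel groups, the per-group signals form one low-resolution image per conversion gain. A sketch with four hypothetical pixel groups arranged 2 × 2 (the function name and values are illustrative, not from this disclosure):

```python
def build_images(group_signals, groups_per_row):
    """group_signals: list of (hcg_signal, lcg_signal) tuples, one per pixel
    group in row-major order. Returns one HCG image and one LCG image, each
    with one pixel value per pixel group (i.e., binned resolution)."""
    hcg_img, lcg_img = [], []
    for i in range(0, len(group_signals), groups_per_row):
        row = group_signals[i:i + groups_per_row]
        hcg_img.append([h for h, _ in row])
        lcg_img.append([l for _, l in row])
    return hcg_img, lcg_img


# Four pixel groups, each already summed into an (HCG, LCG) signal pair.
signals = [(690, 230), (700, 250), (650, 210), (640, 220)]
hcg_img, lcg_img = build_images(signals, groups_per_row=2)
# hcg_img == [[690, 700], [650, 640]]
# lcg_img == [[230, 250], [210, 220]]
```

Each output image has one value per pixel group, which is why the later merge can also proceed in units of pixel groups.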

The image signal processor 130 can perform image processing on the first image data IDT1 output from the readout circuit 120. For example, the image signal processor 130 may perform image processing such as defective pixel correction, color correction, and image quality improvement on the image data (e.g., the first image data IDT 1).

According to an example embodiment of the inventive concepts, in the first mode in which the combining operation is performed, the image signal processor 130 may synthesize a plurality of pieces of first image data IDT1 corresponding to the plurality of conversion gains to generate output image data OIDT. In an example embodiment, the image signal processor 130 may generate the output image data OIDT by synthesizing the plurality of pieces of first image data IDT1 corresponding to the plurality of conversion gains in units of pixel groups PG. This will be described in detail below with reference to fig. 5.
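The disclosure does not fix a particular synthesis rule, so the following is only an illustrative per-pixel-group HDR merge: trust the HCG value in dark regions and fall back to a gain-ratio-scaled LCG value near saturation. All constants here are hypothetical:

```python
SATURATION = 4095   # hypothetical 12-bit full scale
GAIN_RATIO = 4.0    # hypothetical HCG/LCG conversion gain ratio


def merge_pixel_group(hcg_value, lcg_value):
    """Per-pixel-group merge: use the HCG value unless it has saturated,
    in which case rescale the LCG value into the HCG signal domain."""
    if hcg_value < SATURATION:
        return hcg_value
    return lcg_value * GAIN_RATIO


def merge_images(hcg_image, lcg_image):
    """Merge two equally sized images (one value per pixel group) into one
    HDR output image, group by group."""
    return [[merge_pixel_group(h, l) for h, l in zip(hr, lr)]
            for hr, lr in zip(hcg_image, lcg_image)]


hcg = [[100, 4095], [2000, 4095]]   # second column saturated under HCG
lcg = [[30, 1500], [500, 2000]]
out = merge_images(hcg, lcg)
# out == [[100, 6000.0], [2000, 8000.0]]
```

Because both HCG and LCG signals come from the same single readout, this merge does not cost an extra frame, which is the frame-rate advantage over the related-art multi-readout DCG approach.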

In addition, the image signal processor 130 may provide image processed image data (e.g., output image data OIDT) to an image processor 200 (e.g., an application processor, a main processor of the electronic device 10, a graphics processor, etc.).

Fig. 3 is a diagram illustrating a pixel group of a pixel array according to an example embodiment of the inventive concepts. Specifically, fig. 3 is a diagram showing pixel groups PG1, PG2, PG3, and PG4 of the pixel array 110 having an RGBW pattern.

Referring to fig. 3, the pixel array 110 having the RGBW pattern may include first and second rows ROW1 and ROW2, in which a red pixel R, a white pixel W, a first green pixel Gr, and a white pixel W are sequentially disposed, and third and fourth rows ROW3 and ROW4, in which a second green pixel Gb, a white pixel W, a blue pixel B, and a white pixel W are sequentially disposed. In the RGBW pattern, the white pixels W in the first to fourth rows ROW1 to ROW4 may be disposed in a diagonal direction.

The pixel array 110 may include a plurality of pixel groups PG1, PG2, PG3, and PG4, each of which includes four adjacent pixels PX. For example, referring to fig. 3, the pixel array 110 may include: a first pixel group PG1 (including two red pixels R and two white pixels W), a second pixel group PG2 (including two first green pixels Gr and two white pixels W), a third pixel group PG3 (including two second green pixels Gb and two white pixels W), and a fourth pixel group PG4 (including two blue pixels B and two white pixels W). That is, the pixel groups PG1, PG2, PG3, and PG4 may include color pixels of the same color, and white pixels.

The pixels PX included in each of the plurality of pixel groups PG1, PG2, PG3, and PG4 may be divided into a plurality of subgroups. In an example embodiment, the plurality of pixels PX included in each of the plurality of pixel groups PG1, PG2, PG3, and PG4 may be divided into a plurality of sub-groups according to whether the plurality of pixels PX are white pixels. The plurality of subgroups may correspond to a plurality of conversion gains.

For example, referring to fig. 3, among the pixels PX of the first pixel group PG1, the red pixels R (i.e., the pixels other than the white pixels W) may be grouped into a first sub-group, and the white pixels W may be grouped into a second sub-group. In the pixels PX of the second pixel group PG2, the first green pixels Gr may be grouped into a first sub-group, and the white pixels W may be grouped into a second sub-group. In the pixels PX of the third pixel group PG3, the second green pixels Gb may be grouped into a first sub-group, and the white pixels W may be grouped into a second sub-group. In the pixels PX of the fourth pixel group PG4, the blue pixels B may be grouped into a first sub-group, and the white pixels W may be grouped into a second sub-group. The first sub-group of each of the plurality of pixel groups PG1, PG2, PG3, and PG4 may correspond to a first conversion gain (e.g., Low Conversion Gain (LCG)), and the second sub-group of each of the plurality of pixel groups PG1, PG2, PG3, and PG4 may correspond to a second conversion gain (e.g., High Conversion Gain (HCG)). The number of conversion gains may be equal to or less than the number of subgroups.

However, the inventive concept is not limited thereto, and according to example embodiments, the plurality of sub-groups may be divided according to the positions of the pixels PX within a pixel group. For example, referring to fig. 3, pixels of a pixel group located in a first diagonal direction may be grouped into a first sub-group, and pixels located in a second diagonal direction different from the first diagonal direction may be grouped into a second sub-group.

The pixel array 110 may output the pixel signals of the first sub-group and the pixel signals of the second sub-group through a single readout for the plurality of pixel groups PG1, PG2, PG3, and PG4. For example, the pixel array 110 may output a pixel signal of the first sub-group including the red pixels R and a pixel signal of the second sub-group including the white pixels W through a single readout with respect to the first pixel group PG1.

Although fig. 3 illustrates that the pixel array 110 includes a 4 × 4 pixel matrix, the inventive concept is not limited thereto, and the pixel array 110 may include an M × N pixel matrix (M and N are positive integers). Alternatively, the pixel array 110 may have various patterns other than the RGBW pattern. For example, the pixel array 110 may have an RGBY pattern in which a yellow pixel Y is disposed instead of a white pixel W.

Fig. 4 is a diagram for explaining a readout operation during a combining operation according to an example embodiment of the inventive concepts. Specifically, fig. 4 is a diagram for explaining a readout operation of the pixel array 110 of fig. 3 with respect to the pixel groups PG1, PG2, PG3, and PG4.

Referring to fig. 4, when the image sensor 100 operates in the first mode in which the combining operation is performed, the plurality of pixel groups PG1, PG2, PG3, and PG4 may output pixel signals in units of sub-groups. In an example embodiment, in the readout process, the pixel values of the pixels PX included in the same sub-group may be summed and output as one pixel signal. For example, referring to fig. 4, in the readout process, pixel values of the red pixels R included in the first sub-group of the first pixel group PG1 may be summed and output as a first pixel signal, and pixel values of the white pixels W included in the second sub-group of the first pixel group PG1 may be summed and output as a second pixel signal.
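The summing readout described above can be sketched in NumPy. The 2 × 2 group slice, the diagonal placement of the white pixels W, and all digital-number values below are hypothetical illustration values, not taken from the figures.

```python
import numpy as np

# Hypothetical 2 x 2 slice of the first pixel group PG1; the diagonal
# placement of the white pixels W is an assumed layout for illustration.
group = np.array([[100, 200],    # R  W
                  [210, 110]])   # W  R
w_mask = np.array([[False, True],
                   [True,  False]])  # True where a white pixel W sits

# In the binning readout, pixel values of the same sub-group are summed
# and output as one pixel signal per sub-group.
first_pixel_signal = int(group[~w_mask].sum())   # red sub-group
second_pixel_signal = int(group[w_mask].sum())   # white sub-group
print(first_pixel_signal, second_pixel_signal)   # 210 410
```

Both signals come out of the same readout pass; the conversion gain applied to each sub-group is a property of the pixel circuit, not of this summation.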

In some example embodiments, in the readout process, a conversion gain corresponding to a sub-group may be applied to the pixels PX included in the sub-group. For example, referring to fig. 4, when the first sub-group of the first pixel group PG1 corresponds to the first conversion gain (e.g., the low conversion gain), a first pixel signal to which the low conversion gain is applied may be output from the red pixels R included in the first sub-group. When the second sub-group of the first pixel group PG1 corresponds to the second conversion gain (e.g., the high conversion gain), a second pixel signal to which the high conversion gain is applied may be output from the white pixels W included in the second sub-group.

As described above, when the image sensor 100 performs the combining operation, a plurality of pixel signals corresponding to a plurality of conversion gains can be output from one pixel group by a single readout.

The readout circuit 120 may generate a plurality of first image data IDT1_CG1 and IDT1_CG2 corresponding to a plurality of conversion gains based on a plurality of pixel signals of the plurality of pixel groups PG1, PG2, PG3, and PG4. In an example embodiment, the readout circuit 120 may receive a first pixel signal corresponding to the first conversion gain from the first sub-group of each of the plurality of pixel groups PG1, PG2, PG3, and PG4, and generate first image data IDT1_CG1 corresponding to the first conversion gain based on the received first pixel signals.

For example, referring to fig. 4, the readout circuit 120 may receive a first pixel signal from the red pixels R included in the first sub-group of the first pixel group PG1, and calculate a first pixel value R1 corresponding to the first pixel group PG1 based on the received first pixel signal. The readout circuit 120 may receive a first pixel signal from the first green pixels Gr included in the first sub-group of the second pixel group PG2, and calculate a first pixel value Gr1 corresponding to the second pixel group PG2 based on the received first pixel signal. The readout circuit 120 may receive a first pixel signal from the second green pixels Gb included in the first sub-group of the third pixel group PG3, and calculate a first pixel value Gb1 corresponding to the third pixel group PG3 based on the received first pixel signal. The readout circuit 120 may receive a first pixel signal from the blue pixels B included in the first sub-group of the fourth pixel group PG4, and calculate a first pixel value B1 corresponding to the fourth pixel group PG4 based on the received first pixel signal. The readout circuit 120 may generate the first image data IDT1_CG1 corresponding to the first conversion gain based on the calculated first pixel values R1, Gr1, Gb1, and B1 corresponding to the plurality of pixel groups PG1, PG2, PG3, and PG4.

The readout circuit 120 may receive a second pixel signal corresponding to the second conversion gain from the second sub-group of each of the plurality of pixel groups PG1, PG2, PG3, and PG4, and generate first image data IDT1_CG2 corresponding to the second conversion gain based on the received second pixel signals.

For example, referring to fig. 4, the readout circuit 120 may receive a second pixel signal from the white pixels W included in the second sub-group of the first pixel group PG1, and calculate a second pixel value W1 corresponding to the first pixel group PG1 based on the received second pixel signal. The readout circuit 120 may receive a second pixel signal from the white pixels W included in the second sub-group of the second pixel group PG2, and calculate a second pixel value W2 corresponding to the second pixel group PG2 based on the received second pixel signal. The readout circuit 120 may receive a second pixel signal from the white pixels W included in the second sub-group of the third pixel group PG3, and calculate a second pixel value W3 corresponding to the third pixel group PG3 based on the received second pixel signal. The readout circuit 120 may receive a second pixel signal from the white pixels W included in the second sub-group of the fourth pixel group PG4, and calculate a second pixel value W4 corresponding to the fourth pixel group PG4 based on the received second pixel signal. Thereafter, the readout circuit 120 may generate the first image data IDT1_CG2 corresponding to the second conversion gain based on the calculated second pixel values W1, W2, W3, and W4 corresponding to the plurality of pixel groups PG1, PG2, PG3, and PG4.
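Assembling the two first image data from the per-group signals can be sketched as follows; the pixel-value numbers and the 2 × 2 arrangement of the four group outputs are hypothetical illustration choices.

```python
import numpy as np

# Hypothetical per-group signals from one readout: first sub-group (LCG)
# and second sub-group (HCG) values for PG1..PG4.
lcg_values = {"PG1": 410, "PG2": 395, "PG3": 402, "PG4": 380}  # R1, Gr1, Gb1, B1
hcg_values = {"PG1": 950, "PG2": 940, "PG3": 955, "PG4": 930}  # W1, W2, W3, W4

order = ["PG1", "PG2", "PG3", "PG4"]
# One image-data plane per conversion gain, one pixel value per pixel group
idt1_cg1 = np.array([lcg_values[g] for g in order]).reshape(2, 2)
idt1_cg2 = np.array([hcg_values[g] for g in order]).reshape(2, 2)
print(idt1_cg1[0, 0], idt1_cg2[0, 0])  # 410 950
```

The key property sketched here is that both planes share the same pixel-group grid, so they can later be merged element-wise in units of pixel groups.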

Fig. 5 is a diagram for explaining a method of generating synthetic image data according to an example embodiment of the inventive concepts. Specifically, fig. 5 is a diagram for explaining a method of generating output image data OIDT, which is synthetic image data obtained by synthesizing the plurality of first image data IDT1_CG1 and IDT1_CG2.

Referring to fig. 5, the readout circuit 120 may transmit the plurality of first image data IDT1_CG1 and IDT1_CG2 corresponding to the plurality of conversion gains to the image signal processor 130. The image signal processor 130 may generate the output image data OIDT by synthesizing the plurality of first image data IDT1_CG1 and IDT1_CG2 in units of pixel groups.

For example, the image signal processor 130 may calculate a third pixel value R2 by combining the first pixel value R1 and the second pixel value W1 of the plurality of first image data IDT1_CG1 and IDT1_CG2 corresponding to the first pixel group PG1. The image signal processor 130 may calculate a third pixel value Gr2 by combining the first pixel value Gr1 and the second pixel value W2 of the plurality of first image data IDT1_CG1 and IDT1_CG2 corresponding to the second pixel group PG2. The image signal processor 130 may calculate a third pixel value Gb2 by combining the first pixel value Gb1 and the second pixel value W3 of the plurality of first image data IDT1_CG1 and IDT1_CG2 corresponding to the third pixel group PG3. The image signal processor 130 may calculate a third pixel value B2 by combining the first pixel value B1 and the second pixel value W4 of the plurality of first image data IDT1_CG1 and IDT1_CG2 corresponding to the fourth pixel group PG4. The image signal processor 130 may generate the output image data OIDT based on the calculated third pixel values R2, Gr2, Gb2, and B2 corresponding to the plurality of pixel groups PG1, PG2, PG3, and PG4.
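The text does not specify the combining rule. One common dual-conversion-gain approach, sketched here under assumed gain-ratio and saturation values (and ignoring the white-to-color correction an RGBW pipeline would also need), prefers the high-conversion-gain sample until it nears saturation.

```python
def combine_dcg(lcg_value, hcg_value, gain_ratio=4.0, full_scale=1023):
    """Hypothetical combine of one pixel group's LCG/HCG pair (e.g. R1 and
    W1 -> R2): use the low-noise HCG sample unless it is near saturation,
    otherwise fall back to the LCG sample, which saturates later."""
    if hcg_value < 0.9 * full_scale:
        return hcg_value / gain_ratio  # rescale HCG onto the LCG scale
    return float(lcg_value)

print(combine_dcg(400, 800))    # 200.0 (HCG sample usable)
print(combine_dcg(400, 1020))   # 400.0 (HCG saturated -> use LCG)
```

Applied once per pixel group, this yields the third pixel values R2, Gr2, Gb2, and B2 while keeping shadow detail (HCG) and highlight detail (LCG) in one frame.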

As described above, the image sensor 100 according to an example embodiment of the inventive concepts may generate a plurality of image data corresponding to a plurality of conversion gains by a single readout, thereby maintaining a high frame rate while generating an image with a wide dynamic range.

Although it is described above with reference to fig. 3 to 5 that the pixel array 110 has the RGBW pattern, the inventive concept is not limited thereto. For example, the inventive concept may also be applied to an example embodiment in which the pixel array 110 has a pattern different from the RGBW pattern. A case where the pixel array 110 has different patterns according to an exemplary embodiment of the inventive concept will be described below with reference to fig. 6 to 8B.

Fig. 6 to 8B are diagrams for explaining a method of generating synthetic image data according to a pattern type of a pixel array. Fig. 6 to 8B will be described below, in which description of the same portions as those in fig. 3 to 5 is omitted.

First, fig. 6 is a diagram for explaining a method of generating synthetic image data when the pixel array 110a has a TETRA pattern. Referring to fig. 6, the pixel array 110a may have a TETRA pattern in which a red pixel group PG1 including red pixels R arranged in a 2 × 2 matrix, a first green pixel group PG2 including first green pixels Gr arranged in a 2 × 2 matrix, a second green pixel group PG3 including second green pixels Gb arranged in a 2 × 2 matrix, and a blue pixel group PG4 including blue pixels B arranged in a 2 × 2 matrix are repeatedly arranged.

The pixels PX included in each of the pixel groups PG1, PG2, PG3, and PG4 of the pixel array 110a may be divided into a plurality of sub-groups. In an example embodiment, the plurality of sub-groups may be divided according to the positions of the pixels PX in the pixel group. For example, referring to fig. 6, in a pixel group, pixels located in a first diagonal direction (e.g., pixels located at the upper left and lower right of the pixel group) may be grouped into a first sub-group, and pixels located in a second diagonal direction different from the first diagonal direction (e.g., pixels located at the upper right and lower left of the pixel group) may be grouped into a second sub-group.
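The diagonal split can be expressed as boolean masks over the group; this is a minimal sketch of the grouping rule only, with hypothetical pixel values.

```python
import numpy as np

n = 2  # TETRA: each pixel group is an n x n matrix of same-color pixels
rows, cols = np.indices((n, n))
first_sub = rows == cols   # first diagonal: upper left, lower right
second_sub = ~first_sub    # second diagonal: upper right, lower left

group = np.array([[10, 20],
                  [30, 40]])  # hypothetical pixel values of one group
print(int(group[first_sub].sum()), int(group[second_sub].sum()))  # 50 50
```

The same two masks apply unchanged to every color group, since the split depends only on pixel position, not on color.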

In the pixel array 110a, the first conversion gain may be applied to the pixels PX included in the first sub-group, and the second conversion gain may be applied to the pixels PX included in the second sub-group. The pixel array 110a may output the pixel signals of the first sub-group and the pixel signals of the second sub-group by a single readout for the plurality of pixel groups PG1, PG2, PG3, and PG4.

For example, the pixel array 110a may output, by a single readout, a first pixel signal of a first sub-group including the red pixels R located in the first diagonal direction and a second pixel signal of a second sub-group including the red pixels R located in the second diagonal direction with respect to the first pixel group PG1. The pixel array 110a may output pixel signals of the other pixel groups PG2, PG3, and PG4 by the above method.

The readout circuit 120 may generate the first image data IDT1_CG1 corresponding to the first conversion gain based on the first pixel signals of the plurality of pixel groups PG1, PG2, PG3, and PG4. For example, the readout circuit 120 may generate the first image data IDT1_CG1 corresponding to the first conversion gain from the first pixel values R1, Gr1, Gb1, and B1 generated based on the first pixel signals of the plurality of pixel groups PG1, PG2, PG3, and PG4.

In addition, the readout circuit 120 may generate the first image data IDT1_CG2 corresponding to the second conversion gain based on the second pixel signals of the plurality of pixel groups PG1, PG2, PG3, and PG4. For example, the readout circuit 120 may generate the first image data IDT1_CG2 corresponding to the second conversion gain from the second pixel values R2, Gr2, Gb2, and B2 generated based on the second pixel signals of the plurality of pixel groups PG1, PG2, PG3, and PG4.

The image signal processor 130 may generate the output image data OIDT by synthesizing the first image data IDT1_CG1 and IDT1_CG2 in units of pixel groups. For example, the image signal processor 130 may generate the output image data OIDT including the third pixel values R3, Gr3, Gb3, and B3 by synthesizing the first image data IDT1_CG1 and IDT1_CG2 in units of the plurality of pixel groups PG1, PG2, PG3, and PG4.

Fig. 7 is a diagram for explaining a method of generating synthetic image data when the pixel array 110b has a NONA pattern. Referring to fig. 7, the pixel array 110b may have a NONA pattern in which a red pixel group PG1 including red pixels R arranged in a 3 × 3 matrix, a first green pixel group PG2 including first green pixels Gr arranged in a 3 × 3 matrix, a second green pixel group PG3 including second green pixels Gb arranged in a 3 × 3 matrix, and a blue pixel group PG4 including blue pixels B arranged in a 3 × 3 matrix are repeatedly arranged.

The pixels PX included in each of the pixel groups PG1, PG2, PG3, and PG4 of the pixel array 110b may be divided into a plurality of subgroups. In an example embodiment, the plurality of sub-groups may be divided according to the positions of the pixels PX in the pixel group.

For example, referring to fig. 7, in the pixel group, pixels located in a first column may be divided into a first sub-group, pixels located in a second column may be divided into a second sub-group, and pixels located in a third column may be divided into a third sub-group. However, the inventive concept is not limited thereto, and for example, in the pixel group, the pixels located in the first row may be divided into a first sub-group, the pixels located in the second row may be divided into a second sub-group, and the pixels located in the third row may be divided into a third sub-group.

In the pixel array 110b, a first conversion gain may be applied to the pixels PX included in the first sub-group, a second conversion gain may be applied to the pixels PX included in the second sub-group, and a third conversion gain may be applied to the pixels PX included in the third sub-group. The first to third conversion gains may be the same as or different from each other. The pixel array 110b may output pixel signals of the first, second, and third sub-groups by a single readout for the plurality of pixel groups PG1, PG2, PG3, and PG4.

For example, the pixel array 110b may output, by a single readout with respect to the first pixel group PG1, a first pixel signal of a first sub-group including the red pixels R located in the first column, a second pixel signal of a second sub-group including the red pixels R located in the second column, and a third pixel signal of a third sub-group including the red pixels R located in the third column. The pixel array 110b may output pixel signals of the other pixel groups PG2, PG3, and PG4 by the above method.
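The column-wise NONA readout amounts to one summed signal per column of the group; the 3 × 3 pixel values below are hypothetical.

```python
import numpy as np

# Hypothetical 3 x 3 NONA pixel group (e.g. PG1 of red pixels R)
group = np.arange(1, 10).reshape(3, 3)  # [[1 2 3] [4 5 6] [7 8 9]]

# Columns 0..2 are the first, second, and third sub-groups; summing each
# column yields the group's first, second, and third pixel signals.
col_signals = group.sum(axis=0)
print(col_signals.tolist())  # [12, 15, 18]
```

A row-wise split, as the text also permits, would simply use `axis=1` instead.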

The readout circuit 120 may generate the first image data IDT1_CG1 corresponding to the first conversion gain based on the first pixel signals of the plurality of pixel groups PG1, PG2, PG3, and PG4. For example, the readout circuit 120 may generate the first image data IDT1_CG1 corresponding to the first conversion gain from the first pixel values R1, Gr1, Gb1, and B1 generated based on the first pixel signals of the plurality of pixel groups PG1, PG2, PG3, and PG4.

In addition, the readout circuit 120 may generate the first image data IDT1_CG2 corresponding to the second conversion gain based on the second pixel signals of the plurality of pixel groups PG1, PG2, PG3, and PG4. For example, the readout circuit 120 may generate the first image data IDT1_CG2 corresponding to the second conversion gain from the second pixel values R2, Gr2, Gb2, and B2 generated based on the second pixel signals of the plurality of pixel groups PG1, PG2, PG3, and PG4.

In addition, the readout circuit 120 may generate the first image data IDT1_CG3 corresponding to the third conversion gain based on the third pixel signals of the plurality of pixel groups PG1, PG2, PG3, and PG4. For example, the readout circuit 120 may generate the first image data IDT1_CG3 corresponding to the third conversion gain from third pixel values R3, Gr3, Gb3, and B3 generated based on the third pixel signals of the plurality of pixel groups PG1, PG2, PG3, and PG4.

The image signal processor 130 may generate the output image data OIDT by synthesizing the first image data IDT1_CG1, IDT1_CG2, and IDT1_CG3 in units of pixel groups. In an example embodiment, the image signal processor 130 may generate the output image data OIDT including the fourth pixel values R4, Gr4, Gb4, and B4 by synthesizing the first image data IDT1_CG1, IDT1_CG2, and IDT1_CG3 in units of the plurality of pixel groups PG1, PG2, PG3, and PG4.

For example, the image signal processor 130 may calculate a fourth pixel value R4 corresponding to the first pixel group PG1 based on the pixel values R1, R2, and R3 of the first image data IDT1_CG1, IDT1_CG2, and IDT1_CG3 corresponding to the first pixel group PG1. In this way, the image signal processor 130 may calculate fourth pixel values Gr4, Gb4, and B4 corresponding to the other pixel groups PG2, PG3, and PG4. Thereafter, the image signal processor 130 may generate the output image data OIDT based on the fourth pixel values R4, Gr4, Gb4, and B4.

Fig. 8A and 8B are diagrams for explaining a method of generating synthetic image data when the pixel array 110c has a HEXADECA pattern. Referring to fig. 8A, the pixel array 110c may have a HEXADECA pattern in which a red pixel group PG1 including red pixels R arranged in a 4 × 4 matrix, a first green pixel group PG2 including first green pixels Gr arranged in a 4 × 4 matrix, a second green pixel group PG3 including second green pixels Gb arranged in a 4 × 4 matrix, and a blue pixel group PG4 including blue pixels B arranged in a 4 × 4 matrix are repeatedly arranged.

The pixels PX included in each of the plurality of pixel groups PG1, PG2, PG3, and PG4 of the pixel array 110c may be divided into a plurality of subgroups. In an example embodiment, the plurality of sub-groups may be divided according to the positions of the pixels PX in the pixel group.

For example, referring to fig. 8A, each of the plurality of pixel groups PG1, PG2, PG3, and PG4 may be divided into four sub-groups. Specifically, among the pixels included in each of the plurality of pixel groups PG1, PG2, PG3, and PG4, the upper left four pixels may be grouped into a first sub-group, the upper right four pixels may be grouped into a second sub-group, the lower left four pixels may be grouped into a third sub-group, and the lower right four pixels may be grouped into a fourth sub-group. However, the inventive concept is not limited thereto, and for example, in a pixel group, pixels located in a first row (or first column) may be grouped into a first sub-group, pixels located in a second row (or second column) may be grouped into a second sub-group, pixels located in a third row (or third column) may be grouped into a third sub-group, and pixels located in a fourth row (or fourth column) may be grouped into a fourth sub-group.

In the pixel array 110c, a first conversion gain may be applied to the pixels PX included in the first sub-group, a second conversion gain may be applied to the pixels PX included in the second sub-group, a third conversion gain may be applied to the pixels PX included in the third sub-group, and a fourth conversion gain may be applied to the pixels PX included in the fourth sub-group. The first to fourth conversion gains may be the same as or different from each other. The pixel array 110c may output pixel signals of the first, second, third, and fourth sub-groups by a single readout for the plurality of pixel groups PG1, PG2, PG3, and PG4.

For example, referring to fig. 8A, the pixel array 110c may output, through a single readout, a first pixel signal of a first sub-group including the upper left red pixels R, a second pixel signal of a second sub-group including the upper right red pixels R, a third pixel signal of a third sub-group including the lower left red pixels R, and a fourth pixel signal of a fourth sub-group including the lower right red pixels R with respect to the first pixel group PG1. The pixel array 110c may output pixel signals of the other pixel groups PG2, PG3, and PG4 by the above method.
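The quadrant split of a HEXADECA group can be sketched by slicing the 4 × 4 group into four 2 × 2 blocks and summing each; the pixel values are hypothetical.

```python
import numpy as np

# Hypothetical 4 x 4 HEXADECA pixel group (e.g. PG1 of red pixels R)
group = np.arange(16).reshape(4, 4)

# The four 2 x 2 quadrants are the first..fourth sub-groups; each quadrant
# is summed into one pixel signal during the single readout.
quadrants = [group[:2, :2], group[:2, 2:],   # upper left, upper right
             group[2:, :2], group[2:, 2:]]   # lower left, lower right
signals = [int(q.sum()) for q in quadrants]
print(signals)  # [10, 18, 42, 50]
```

Each of the four signals then feeds the image data plane of its own conversion gain (IDT1_CG1 through IDT1_CG4).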

The readout circuit 120 may generate the first image data IDT1_CG1 corresponding to the first conversion gain based on the first pixel signals of the plurality of pixel groups PG1, PG2, PG3, and PG4. For example, the readout circuit 120 may generate the first image data IDT1_CG1 corresponding to the first conversion gain from the first pixel values R1, Gr1, Gb1, and B1 generated based on the first pixel signals of the first sub-groups of the plurality of pixel groups PG1, PG2, PG3, and PG4.

In addition, the readout circuit 120 may generate the first image data IDT1_CG2 corresponding to the second conversion gain based on the second pixel signals of the plurality of pixel groups PG1, PG2, PG3, and PG4. For example, the readout circuit 120 may generate the first image data IDT1_CG2 corresponding to the second conversion gain from the second pixel values R2, Gr2, Gb2, and B2 generated based on the second pixel signals of the second sub-groups of the plurality of pixel groups PG1, PG2, PG3, and PG4.

The readout circuit 120 may generate the first image data IDT1_CG3 corresponding to the third conversion gain based on the third pixel signals of the plurality of pixel groups PG1, PG2, PG3, and PG4. For example, the readout circuit 120 may generate the first image data IDT1_CG3 corresponding to the third conversion gain from third pixel values R3, Gr3, Gb3, and B3 generated based on the third pixel signals of the third sub-groups of the plurality of pixel groups PG1, PG2, PG3, and PG4.

The readout circuit 120 may generate the first image data IDT1_CG4 corresponding to the fourth conversion gain based on the fourth pixel signals of the plurality of pixel groups PG1, PG2, PG3, and PG4. For example, the readout circuit 120 may generate the first image data IDT1_CG4 corresponding to the fourth conversion gain from fourth pixel values R4, Gr4, Gb4, and B4 generated based on the fourth pixel signals of the fourth sub-groups of the plurality of pixel groups PG1, PG2, PG3, and PG4.

The image signal processor 130 may generate the output image data OIDT by synthesizing the first image data IDT1_CG1, IDT1_CG2, IDT1_CG3, and IDT1_CG4 in units of pixel groups. In an example embodiment, the image signal processor 130 may generate the output image data OIDT including the fifth pixel values R5, Gr5, Gb5, and B5 by synthesizing the first image data IDT1_CG1, IDT1_CG2, IDT1_CG3, and IDT1_CG4 in units of the plurality of pixel groups PG1, PG2, PG3, and PG4.

For example, the image signal processor 130 may calculate a fifth pixel value R5 corresponding to the first pixel group PG1 based on the pixel values R1, R2, R3, and R4 of the first image data IDT1_CG1, IDT1_CG2, IDT1_CG3, and IDT1_CG4 corresponding to the first pixel group PG1. In this way, the image signal processor 130 may calculate fifth pixel values Gr5, Gb5, and B5 corresponding to the other pixel groups PG2, PG3, and PG4. Thereafter, the image signal processor 130 may generate the output image data OIDT based on the fifth pixel values R5, Gr5, Gb5, and B5.

For example, referring to fig. 8B, each of the plurality of pixel groups PG1, PG2, PG3, and PG4 may be divided into two sub-groups. Specifically, among the pixels included in each of the plurality of pixel groups PG1, PG2, PG3, and PG4, the eight upper pixels may be grouped into a first sub-group, and the eight lower pixels may be grouped into a second sub-group. However, the inventive concept is not limited thereto, and for example, in a pixel group, the eight left pixels may be grouped into a first sub-group, and the eight right pixels may be grouped into a second sub-group.

In the pixel array 110c, a first conversion gain may be applied to the pixels PX included in the first sub-group, and a second conversion gain may be applied to the pixels PX included in the second sub-group. The first conversion gain and the second conversion gain may be the same as or different from each other. The pixel array 110c may output the pixel signals of the first sub-group and the pixel signals of the second sub-group by a single readout for the plurality of pixel groups PG1, PG2, PG3, and PG4.

For example, referring to fig. 8B, the pixel array 110c may output a first pixel signal of a first sub-group including the upper red pixels R and a second pixel signal of a second sub-group including the lower red pixels R through a single readout with respect to the first pixel group PG1. The pixel array 110c may output pixel signals of the other pixel groups PG2, PG3, and PG4 by the above method.

The readout circuit 120 may generate the first image data IDT1_CG1 corresponding to the first conversion gain based on the first pixel signals of the plurality of pixel groups PG1, PG2, PG3, and PG4. For example, the readout circuit 120 may generate the first image data IDT1_CG1 corresponding to the first conversion gain from the first pixel values R1, Gr1, Gb1, and B1 generated based on the first pixel signals of the first sub-groups of the plurality of pixel groups PG1, PG2, PG3, and PG4.

In addition, the readout circuit 120 may generate the first image data IDT1_CG2 corresponding to the second conversion gain based on the second pixel signals of the plurality of pixel groups PG1, PG2, PG3, and PG4. For example, the readout circuit 120 may generate the first image data IDT1_CG2 corresponding to the second conversion gain from the second pixel values R2, Gr2, Gb2, and B2 generated based on the second pixel signals of the second sub-groups of the plurality of pixel groups PG1, PG2, PG3, and PG4.

The image signal processor 130 may generate the output image data OIDT by synthesizing the first image data IDT1_CG1 and IDT1_CG2 in units of pixel groups. In an example embodiment, the image signal processor 130 may generate the output image data OIDT including the third pixel values R3, Gr3, Gb3, and B3 by synthesizing the first image data IDT1_CG1 and IDT1_CG2 in units of the plurality of pixel groups PG1, PG2, PG3, and PG4.

For example, referring to fig. 8B, the image signal processor 130 may calculate a third pixel value R3 corresponding to the first pixel group PG1 based on the pixel values R1 and R2 of the first image data IDT1_CG1 and IDT1_CG2 corresponding to the first pixel group PG1. In this way, the image signal processor 130 may calculate third pixel values Gr3, Gb3, and B3 corresponding to the other pixel groups PG2, PG3, and PG4. Further, the image signal processor 130 may generate the output image data OIDT based on the third pixel values R3, Gr3, Gb3, and B3.

Fig. 9 and 10 are diagrams illustrating a method of generating multi-frame High Dynamic Range (HDR) image data according to an example embodiment of the inventive concepts. Although it is assumed for convenience of explanation that the pixel array 110 has an RGBW pattern, the inventive concept is not limited thereto, and the following description may equally be applied to an example embodiment in which the pixel array 110 has an RGBY pattern, a TETRA pattern, a NONA pattern, or a HEXADECA pattern. Fig. 9 and 10 will be described below, in which description of the same portions as those in fig. 3 to 8B is omitted.

Referring to fig. 9, the electronic device 10 of fig. 1 may generate a plurality of synthesized image data (e.g., the output image data of fig. 5) for each of a plurality of frame portions by the method described above with reference to fig. 3 to 8B. The electronic device 10 may generate a multi-frame HDR image by synthesizing the plurality of generated synthesized image data.

In an example embodiment, the image sensor 100 may generate the first image data IDT1_CG1 corresponding to the first conversion gain and the first image data IDT1_CG2 corresponding to the second conversion gain in the first frame portion tFRAME1 corresponding to a first exposure by the method described above with reference to fig. 3 to 8B. The image signal processor 130 of the image sensor 100 may generate the first output image data OIDT1 by combining the first image data IDT1_CG1 and IDT1_CG2 corresponding to the first frame portion tFRAME1 by the method described above with reference to fig. 3 to 8B.

The image sensor 100 may generate the second image data IDT2_CG1 corresponding to the first conversion gain and the second image data IDT2_CG2 corresponding to the second conversion gain in the second frame portion tFRAME2 corresponding to a second exposure by the method described above with reference to fig. 3 to 8B. The second exposure may have an exposure period different from that of the first exposure. The image signal processor 130 of the image sensor 100 may combine the second image data IDT2_CG1 and IDT2_CG2 corresponding to the second frame portion tFRAME2 by the method described above with reference to fig. 3 to 8B to generate the second output image data OIDT2.

Referring to fig. 10, the image signal processor 130 may provide the first output image data OIDT1 and the second output image data OIDT2 to the processor 200. The processor 200 may generate the synthesized image data IDT_HDR by merging the first output image data OIDT1 and the second output image data OIDT2 in units of pixel groups.

For example, the processor 200 may calculate a first pixel value R5 of the synthesized image data IDT_HDR based on the pixel values R2 and R4 of the first output image data OIDT1 and the second output image data OIDT2 corresponding to the first pixel group. Similarly, the processor 200 may calculate the other pixel values Gr5, Gb5, and B5 of the synthesized image data IDT_HDR based on the pixel values of the first output image data OIDT1 and the second output image data OIDT2 corresponding to the other pixel groups.
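The multi-frame merge rule itself is not given in the text; a simple exposure-ratio-based sketch is shown below, where the exposure ratio, full-scale value, and saturation policy are all assumptions for illustration.

```python
def merge_hdr(short_px, long_px, exposure_ratio=4.0, full_scale=1023):
    """Hypothetical per-pixel-group merge of two output image data (e.g.
    OIDT1 from a short exposure and OIDT2 from a long exposure) into one
    IDT_HDR value: keep the long-exposure sample while it is unsaturated,
    otherwise use the short-exposure sample, which clips later."""
    if long_px < full_scale:
        return long_px / exposure_ratio  # bring long exposure onto short scale
    return float(short_px)

print(merge_hdr(250, 800))   # 200.0 (long exposure usable)
print(merge_hdr(250, 1023))  # 250.0 (long exposure saturated)
```

Because each frame portion already combines two conversion gains, this second merge stage extends the dynamic range again, now across exposure times rather than across conversion gains.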

As described above, the electronic device 10 according to an example embodiment of the inventive concepts may generate multi-frame HDR image data using a plurality of output image data corresponding to each of a plurality of frame portions by the above-described method, thereby significantly increasing the dynamic range.

Although it is described above with reference to fig. 9 and 10 that the electronic device 10 generates multi-frame HDR image data by using two output image data corresponding to two frame portions, the inventive concept is not limited thereto, and multi-frame HDR image data may be generated using three or more output image data corresponding to three or more frame portions.

Fig. 11 is a flowchart of an operating method of an image sensor according to an example embodiment of the inventive concepts. In particular, fig. 11 is a flow chart of a method of operation of the image sensor 100 of fig. 1.

Referring to fig. 11, the image sensor 100 may output a first pixel signal corresponding to a first pixel to which a first conversion gain is applied through a single readout from each of a plurality of pixel groups included in a pixel array (S110). Specifically, the image sensor 100 may output a first pixel signal by summing first pixel values of first pixels of each of a plurality of pixel groups.

Next, the image sensor 100 may output a second pixel signal corresponding to the second pixel, to which the second conversion gain is applied, through a single readout from each of a plurality of pixel groups included in the pixel array (S120). Specifically, the image sensor 100 may output the second pixel signal by summing the second pixel values of the second pixels of each of the plurality of pixel groups. S110 and S120 may be performed simultaneously by a single readout.

The image sensor 100 may generate first image data based on first pixel signals of a plurality of pixel groups (S130). The image sensor 100 may generate second image data based on the second pixel signals of the plurality of pixel groups (S140). Thereafter, the image sensor 100 may generate output image data by merging the first image data and the second image data in units of pixel groups (S150).
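The flow of fig. 11 (S110 through S150) can be strung together end to end as follows; the diagonal sub-group mask, the pixel values, and the averaging merge used for S150 are all assumptions for illustration, not the claimed combining rule.

```python
import numpy as np

def operate(groups, first_mask):
    """Hypothetical end-to-end run of fig. 11 for a list of pixel groups."""
    # S110/S120: one readout yields both summed sub-group signals per group
    first_signals = [int(g[first_mask].sum()) for g in groups]
    second_signals = [int(g[~first_mask].sum()) for g in groups]
    idt1 = np.array(first_signals)    # S130: first image data
    idt2 = np.array(second_signals)   # S140: second image data
    return (idt1 + idt2) / 2          # S150: per-group merge (assumed rule)

mask = np.eye(2, dtype=bool)          # first sub-group on the diagonal
groups = [np.array([[1, 2], [3, 4]]),
          np.array([[5, 6], [7, 8]])]  # hypothetical pixel groups
print(operate(groups, mask).tolist())  # [5.0, 13.0]
```

Note that S110 and S120 are not two passes over the array: both signal lists come from the same readout, which is what preserves the frame rate.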

Fig. 12 is a diagram illustrating an electronic device according to an example embodiment of the inventive concepts.

Referring to fig. 12, an electronic device 1000 may include an image sensor 1100, an application processor 1200, a display 1300, a memory 1400, a storage device 1500, a user interface 1600, and a wireless transceiver 1700. The image sensor 1100 of fig. 12 may correspond to the image sensor 100 of fig. 1, and the application processor 1200 of fig. 12 may correspond to the processor 200 of fig. 1. Description of the same portions as those in fig. 1 will be omitted.

The application processor 1200 may control the overall operation of the electronic device 1000, and may be provided in the form of a system on chip (SoC) for driving an application program, an operating system, and the like.

The memory 1400 may store programs and/or data to be processed or executed by the application processor 1200. The storage device 1500 may be embodied as a non-volatile storage device such as a NAND flash memory or a resistive memory, and may be provided, for example, in the form of a memory card (MMC, eMMC, SD, or micro SD) or the like. The storage device 1500 may store data related to an execution algorithm for controlling an image processing operation of the application processor 1200 and/or a program, and may load the data and/or the program to the memory 1400 when the image processing operation is executed.

The user interface 1600 may be embodied as various types of devices capable of receiving user input, such as a keyboard, a touch panel, a fingerprint sensor, or a microphone. The user interface 1600 may receive a user input and provide a signal corresponding to the received user input to the application processor 1200. The wireless transceiver 1700 may include a modem 1710, a transceiver 1720, and an antenna 1730.

Fig. 13 is a diagram illustrating a portion of an electronic device according to an example embodiment of the inventive concepts. Fig. 14 is a diagram illustrating a specific configuration of a camera module according to an example embodiment of the inventive concepts. Specifically, fig. 13 is a diagram showing an electronic device 2000 that is a part of the electronic device 1000 of fig. 12, and fig. 14 is a diagram showing a specific configuration of the camera module 2100b of fig. 13.

Referring to fig. 13, the electronic device 2000 may include a multi-camera module 2100, an application processor 2200, and a memory 2300. The memory 2300 may perform the same functions as the memory 1400 of fig. 12, and thus, a description of the same portions of the memory 2300 as those of the memory 1400 will be omitted.

The electronic device 2000 may capture and/or store an image of an object by using a CMOS image sensor, and may be embodied as a mobile phone, a tablet computer, or a portable electronic device. Portable electronic devices may include laptop computers, mobile phones, smart phones, tablet PCs, wearable devices, and the like.

The multi-camera module 2100 may include a first camera module 2100a, a second camera module 2100b, and a third camera module 2100c. The multi-camera module 2100 may perform the same functions as the image sensor 100 of fig. 1. Although fig. 13 illustrates that the multi-camera module 2100 includes three camera modules 2100a to 2100c, the inventive concept is not limited thereto and various numbers of camera modules may be included in the multi-camera module 2100.

The configuration of the camera module 2100b will be described in more detail below with reference to fig. 14, but the following description may also be applied to the other camera modules 2100a and 2100c according to an example embodiment.

Referring to fig. 14, the camera module 2100b may include a prism 2105, an Optical Path Folding Element (OPFE) 2110, an actuator 2130, an image sensing device 2140, and a memory 2150.

The prism 2105 may include a reflective surface 2107 of a light-reflective material and may change the path of light L incident from the outside.

In an example embodiment, the prism 2105 may change the path of the light L incident in the first direction X to a second direction Y perpendicular to the first direction X by rotating the reflective surface 2107 of the light-reflective material about the central axis 2106 in direction A or direction B. In some example embodiments, the OPFE 2110 may be moved in a third direction Z perpendicular to the first direction X and the second direction Y.

The OPFE 2110 may include, for example, m groups of optical lenses (where m is a natural number). The m groups of optical lenses may be moved in the second direction Y to change the optical zoom ratio of the camera module 2100b.

The actuator 2130 may move the OPFE 2110 or an optical lens (hereinafter referred to as the optical lens) to a certain position. For example, for accurate sensing, the actuator 2130 may adjust the position of the optical lens such that the image sensor 2142 is located at the focal length of the optical lens.

The image sensing device 2140 may include an image sensor 2142, control logic 2144, and a memory 2146. The image sensor 2142 may sense an image of an object by using the light L provided through the optical lens. The image sensor 2142 of fig. 14 may be similar in function to the image sensor 100 of fig. 1, and thus redundant description thereof will be omitted. The control logic 2144 may control the overall operation of the second camera module 2100b.

The memory 2146 may store information necessary to operate the second camera module 2100b, such as calibration data 2147. The calibration data 2147 may include information necessary for the second camera module 2100b to generate image data by using light L provided from the outside. The calibration data 2147 may include, for example, information about a degree of rotation, a focal length, an optical axis, and the like. When the second camera module 2100b is in the form of a multi-state camera whose focal length varies according to the position of the optical lens, the calibration data 2147 may include a focal length value for each position (or each state) of the optical lens, and information related to auto-focus.
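For such a multi-state camera, the per-position focal length could be looked up from the calibration data roughly as follows. The dictionary layout and field names below are assumptions for illustration; the disclosure does not specify the actual format of the calibration data 2147.

```python
# Hypothetical layout of calibration data 2147 for a multi-state camera:
# one focal-length value per optical-lens position (state), alongside the
# rotation and optical-axis information mentioned above.
calibration_data = {
    "rotation_deg": 0.0,
    "optical_axis": (0.0, 0.0, 1.0),
    "focal_length_mm_by_state": {"wide": 4.3, "mid": 6.0, "tele": 9.0},
}

def focal_length_for_state(cal, state):
    """Return the focal length for the current lens position (state),
    e.g. so the actuator 2130 can place the image sensor 2142 at the
    focal plane during auto-focus."""
    return cal["focal_length_mm_by_state"][state]
```

With a lookup of this kind, changing the lens position only requires selecting a different state key rather than recomputing optical parameters.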

The memory 2150 may store image data sensed by the image sensor 2142. The memory 2150 may be disposed outside the image sensing device 2140 and may be stacked with a sensor chip of the image sensing device 2140. In an example embodiment, the memory 2150 may be embodied as an electrically erasable programmable read-only memory (EEPROM), but example embodiments are not limited thereto.

Referring to figs. 13 and 14, in an example embodiment, each of the plurality of camera modules 2100a, 2100b, and 2100c may include an actuator 2130. Thus, the camera modules 2100a, 2100b, and 2100c may include the same or different calibration data 2147 depending on the operation of the actuators 2130 included therein.

In an example embodiment, one camera module (e.g., the second camera module 2100b) of the plurality of camera modules 2100a, 2100b, and 2100c may be a folded-lens type camera module including the prism 2105 and the OPFE 2110, and the other camera modules (e.g., the camera modules 2100a and 2100c) may be vertical-type camera modules not including the prism 2105 and the OPFE 2110, but the example embodiments are not limited thereto.

In an example embodiment, a camera module (e.g., the third camera module 2100c) of the plurality of camera modules 2100a, 2100b, and 2100c may be a vertical depth camera that extracts depth information, for example, by using Infrared (IR). In some example embodiments, the application processor 2200 may generate a three-dimensional (3D) depth image by merging image data provided from a depth camera and image data provided from another camera module (e.g., the first camera module 2100a or the second camera module 2100 b).

In an example embodiment, at least two camera modules (e.g., the first camera module 2100a and the second camera module 2100b) of the plurality of camera modules 2100a, 2100b, and 2100c may have different fields of view (FOVs), that is, different viewing angles. In some example embodiments, for example, at least two camera modules (e.g., the camera modules 2100a and 2100b) of the plurality of camera modules 2100a, 2100b, and 2100c may include different optical lenses, but example embodiments are not limited thereto. For example, the FOV of the first camera module 2100a of the plurality of camera modules 2100a, 2100b, and 2100c may be smaller than the FOVs of the second and third camera modules 2100b and 2100c. However, example embodiments are not limited thereto, and the multi-camera module 2100 may further include a camera module having a FOV larger than the FOVs of the camera modules 2100a, 2100b, and 2100c.

In some example embodiments, the viewing angles of the plurality of camera modules 2100a, 2100b, and 2100c may be different from each other. In some example embodiments, optical lenses included in the plurality of camera modules 2100a, 2100b, and 2100c may be different from each other, but example embodiments are not limited thereto.

In some example embodiments, the plurality of camera modules 2100a, 2100b, and 2100c may be arranged to be physically separated from each other. That is, the sensing area of one image sensor 2142 is not divided and shared by the plurality of camera modules 2100a, 2100b, and 2100c; rather, an image sensor 2142 may be independently provided in each of the plurality of camera modules 2100a, 2100b, and 2100c.

The application processor 2200 may include a plurality of sub-processors 2210a, 2210b, and 2210c, an image generator 2220, a camera module controller 2230, a memory controller 2400, and an internal memory 2500. The application processor 2200 may be separated from the plurality of camera modules 2100a, 2100b, and 2100c. For example, the application processor 2200 and the plurality of camera modules 2100a, 2100b, and 2100c may be semiconductor chips separated from each other.

The image data generated by the camera module 2100a, the image data generated by the camera module 2100b, and the image data generated by the camera module 2100c may be supplied to the corresponding sub-processors 2210a, 2210b, and 2210c through image signal lines ISLa, ISLb, and ISLc that are separated from each other. Such image data may be transmitted using, for example, a Camera Serial Interface (CSI) based on the Mobile Industry Processor Interface (MIPI), although example embodiments are not limited thereto.

In an example embodiment, one sub-processor may be arranged to correspond to a plurality of camera modules. For example, unlike the illustration in the drawings, the first and third sub-processors 2210a and 2210c may not be separated from each other but may be integrated as one sub-processor, and the image data supplied from the camera modules 2100a and 2100c may be selected by a selector (e.g., a multiplexer) or the like and supplied to the integrated sub-processor.
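The selector-based sharing described above can be sketched as a simple multiplexer. The function and source names are hypothetical; in the actual device the selection is performed by hardware (e.g., a multiplexer) on the image signal lines, not by software.

```python
def route_to_subprocessor(active_source, frames_by_source):
    """Multiplexer-like selection: forward the image data of exactly one
    camera module (e.g. 2100a or 2100c) to the shared sub-processor."""
    if active_source not in frames_by_source:
        raise ValueError(f"unknown camera module: {active_source}")
    return frames_by_source[active_source]
```

Only one module's data reaches the shared sub-processor at a time, which is why a selector suffices when two camera modules share one sub-processor.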

The sub-processors 2210a, 2210b, and 2210c may perform image processing on the received image data and output the processed image data to the image generator 2220.

The camera module controller 2230 may provide control signals to the camera modules 2100a, 2100b, and 2100 c. The control signals generated by the camera module controller 2230 may be provided to the camera modules 2100a, 2100b, and 2100c through control signal lines CSLa, CSLb, and CSLc that are separated from one another.

Having thus described exemplary embodiments of the inventive concept, it will be apparent that the same may be modified in numerous ways. Such variations are not to be regarded as a departure from the intended spirit and scope of the example embodiments of the inventive concept, and all such modifications as would be obvious to one skilled in the art are intended to be included within the scope of the following claims.
