Method for acquiring depth information and camera module

Document No. 573291 · Published 2021-05-18

Description: This technology, "Method for acquiring depth information and camera module", was created by 朱洋贤, 李昌奕, and 金炯珍 on 2019-10-07. Abstract: According to one embodiment, a method is disclosed by which a camera module capable of acquiring depth information controls the output time point and the reception time point of light. By controlling both the output time point and the reception time point of light, the camera module can acquire light of which adjacent reception pixels are out of phase with each other even if the light source or the reception pixels are controlled in units of lines.

1. A camera module, comprising:

a light source array comprising a plurality of light sources for outputting light to an object;

a receiver for receiving light reflected from the object through a receiving pixel; and

a processor obtaining depth information about the object by using a phase difference between light output from the array of light sources and light received by the receiver,

wherein the array of light sources comprises light sources on a first output line and light sources on a second output line,

wherein a phase difference between light output from the light source on the first output line and light output from the light source on the second output line is a first value,

wherein the receiving pixels include pixels on a first receiving line and pixels on a second receiving line,

wherein a phase difference between a point of time when the pixel on the first reception line receives light and a point of time when the pixel on the second reception line receives light is a second value, and

wherein the first value and the second value are different values.

2. The camera module of claim 1, wherein the difference between the first value and the second value is 90 degrees.

3. The camera module according to claim 1, wherein the first and second output lines are adjacent to each other, and the first and second receiving lines are adjacent to each other.

4. The camera module of claim 1, wherein said first output line and said second output line are parallel to each other, said first receiving line and said second receiving line are parallel to each other, and said first output line and said first receiving line are orthogonal to each other.

5. The camera module according to claim 1, wherein a first pixel, a second pixel, a third pixel, and a fourth pixel are adjacent to each other, the first pixel being a pixel on the first receiving line for receiving light output from the light source on the first output line, the second pixel being a pixel on the second receiving line for receiving light output from the light source on the first output line, the third pixel being a pixel on the first receiving line for receiving light output from the light source on the second output line, and the fourth pixel being a pixel on the second receiving line for receiving light output from the light source on the second output line.

6. The camera module according to claim 5, wherein reception time points of light received by the receiver are all different in the first to fourth pixels.

7. The camera module according to claim 5, wherein reception time points of light received by the receiver in the first to fourth pixels differ by a time corresponding to a phase of 90 degrees.

8. The camera module according to claim 1, wherein the receiver includes a first block and a second block obtained by dividing the reception pixels, and

the processor obtains the depth information using both the light received through the first block and the light received through the second block.

9. A method of obtaining depth information, the method comprising the steps of:

outputting light to an object through light sources on a first output line and light sources on a second output line;

receiving light reflected from the object by pixels on a first receiving line and pixels on a second receiving line; and

obtaining depth information about the object by using a phase difference between the output light and the received light,

wherein a phase difference between light output from the light source on the first output line and light output from the light source on the second output line is a first value, and

wherein a phase difference between a point of time when the pixel on the first reception line receives light and a point of time when the pixel on the second reception line receives light is a second value.

10. The method of obtaining depth information of claim 9, wherein the difference between the first value and the second value is 90 degrees.

Technical Field

The present disclosure relates to a method of obtaining depth information and a camera module.

Background

Devices that obtain information by outputting light and reflecting the light on an object have been used in various fields. For example, from 3D imaging to distance measurement techniques, a technique of obtaining information by outputting light is being used in various ways.

For example, time-of-flight (ToF) refers to the principle of measuring a distance by measuring the time difference between the time point at which light is output and the time point at which the light, reflected from an object, is received back. Because the ToF technology is simple to implement, it is used in various fields such as aviation, shipbuilding, civil engineering, photography, surveying, and the like.

In this respect, there is also an increasing demand for cameras that deliver high performance relative to their hardware.

Disclosure of Invention

Technical problem

According to one or more embodiments, the present disclosure may provide a method of obtaining depth information and a camera module using the same. By controlling both the output time point and the reception time point of light, the camera module can obtain light of different phases at adjacent receiving pixels even if the light source or the receiving pixels are controlled in units of lines. The technical problem to be solved is not limited to the above technical problems but may further include various technical problems within a scope apparent to those skilled in the art.

Technical scheme

The camera module according to the first aspect includes: a light source array including a plurality of light sources outputting light to an object; a receiver for receiving light reflected from the object through the receiving pixels; and a processor obtaining depth information on the object by using a phase difference between light output from the light source array and light received by the receiver, wherein the light source array includes a light source on a first output line and a light source on a second output line, wherein a phase difference between light output from the light source on the first output line and light output from the light source on the second output line is a first value, wherein the reception pixel includes a pixel on the first reception line and a pixel on the second reception line, wherein a phase difference between a point of time when the pixel on the first reception line receives the light and a point of time when the pixel on the second reception line receives the light is a second value, and wherein the first value and the second value may be different values.

In addition, the difference between the first value and the second value may be 90 degrees. In addition, the first output line and the second output line may be adjacent to each other, and the first receiving line and the second receiving line may be adjacent to each other.

In addition, the first output line and the second output line are parallel to each other, the first receiving line and the second receiving line are parallel to each other, and the first output line and the first receiving line may be orthogonal to each other.

In addition, a first pixel, a second pixel, a third pixel, and a fourth pixel may be adjacent to each other, the first pixel being a pixel on the first receiving line that receives the light output from the light source on the first output line, the second pixel being a pixel on the second receiving line that receives the light output from the light source on the first output line, the third pixel being a pixel on the first receiving line that receives the light output from the light source on the second output line, and the fourth pixel being a pixel on the second receiving line that receives the light output from the light source on the second output line.

In addition, the reception time points of the light received by the receiver may all be different in the first to fourth pixels.

In addition, in the first to fourth pixels, the reception time points of light received by the receiver may differ by a time corresponding to a phase of 90 degrees.

In addition, the processor may improve resolution by applying a super-resolution technique.

In addition, the receiver includes a first block and a second block obtained by dividing the received pixels, and the processor may obtain the depth information using both the light received through the first block and the light received through the second block.

In addition, two pixels may be shared between the four pixels included in the first block and the four pixels included in the second block.

In addition, the first value may be 180 degrees and the second value may be 90 degrees.

In addition, the first value may be 90 degrees and the second value may be 180 degrees.

The method of obtaining depth information according to the second aspect comprises the steps of: outputting light to the object through the pixels on the first output line and the pixels on the second output line; receiving light reflected from the object by the pixels on the first receiving line and the pixels on the second receiving line; and obtaining depth information on the object by using a phase difference between the output light and the received light, wherein a phase difference between the light output from the light source on the first output line and the light output from the light source on the second output line is a first value, and wherein a phase difference between a point of time when the pixel on the first receiving line receives the light and a point of time when the pixel on the second receiving line receives the light may be a second value.

In addition, the difference between the first value and the second value may be 90 degrees.

In addition, the first output line and the second output line are adjacent to each other, and the first receiving line and the second receiving line may be adjacent to each other.

In addition, the first output line and the second output line are parallel to each other, the first receiving line and the second receiving line are parallel to each other, and the first output line and the first receiving line may be orthogonal to each other.

The third aspect may provide a computer-readable recording medium in which a program for executing the method according to the second aspect on a computer is recorded.

Advantageous effects

According to one or more embodiments, the present disclosure may provide a method of obtaining depth information and a camera module using the same. By controlling both the output time point of light and the reception time point of light, the camera module can obtain light of different phases from adjacent receiving pixels even if the light source or the receiving pixels are controlled in units of lines.

Drawings

Fig. 1 is a block diagram illustrating the configuration and operation of a camera module according to an embodiment.

Fig. 2 is a cross-sectional view of a camera module according to an embodiment.

Fig. 3 schematically shows an example of a method of obtaining a depth image using four phase images.

Fig. 4 is a diagram showing an example in which a camera module according to the embodiment applies different phase signals to receiving pixels included in a block in each period T by controlling the receiving pixels included in a receiver and a plurality of light sources included in a light source array in units of lines.

Fig. 5 is a timing diagram illustrating operation of the camera module of fig. 4 over time.

Fig. 6 is a diagram showing an example of a case where a 90-degree phase retarder is used for a light source array, a 180-degree phase retarder is used for a receiver, and a reception line connected to the 180-degree phase retarder is adjacent.

Fig. 7 is a diagram showing an example of a case where a 90-degree phase retarder is used for a light source array, a 180-degree phase retarder is used for a receiver, and an output line connected to the 90-degree phase retarder is adjacent.

Fig. 8 is a diagram showing an example of a case where a 90-degree phase retarder is used for a light source array, a 180-degree phase retarder is used for a receiver, and a line connected to the phase retarder in a reception line and an output line is not adjacent to each other.

Fig. 9 is a diagram showing an example in which a camera module controls a receiving pixel included in a receiver and a plurality of light sources included in a light source array in units of lines by using a 180-degree phase retarder for a light source array and a 90-degree phase retarder for a receiver, thereby applying different phase signals to the receiving pixels included in a block in each period T.

Fig. 10 is a timing diagram illustrating the operation of the camera module of fig. 9 over time.

Fig. 11 is a diagram illustrating a method of a camera module for improving resolution of an image using a super resolution technique.

Fig. 12 is a diagram for illustrating an example of improving resolution according to the super-resolution technique according to the embodiment.

Fig. 13 is a flowchart illustrating a method of obtaining depth information about an object according to an embodiment.

Detailed Description

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings.

However, the technical idea of the present invention is not limited to some embodiments to be described, but may be implemented in various forms, and one or more constituent elements may be selectively combined or substituted between the embodiments within the scope of the technical idea of the present invention.

In addition, unless explicitly defined and described, terms (including technical terms and scientific terms) used in the embodiments of the present invention may be interpreted as meanings that can be generally understood by those skilled in the art, and common terms (e.g., terms defined in dictionaries) may be interpreted in consideration of the meanings of the context of the related art.

In addition, the terms used in the present specification are used to describe embodiments, and are not intended to limit the present invention.

In this specification, the singular form may include the plural form unless explicitly stated otherwise in the wording, and when described as "at least one (or more) of A, B, and C", it may include one or more of all combinations that can be formed from A, B, and C.

In addition, in describing the components of embodiments of the present invention, terms such as first, second, A, B, (a) and (b) may be used. These terms are only intended to distinguish one element from another, and do not limit the nature, order, or sequence of the elements.

Also, when an element is described as being "connected", "coupled", or "joined" to another element, it can be directly connected, coupled, or joined to that other element, but it is to be understood that another element may also be "connected", "coupled", or "joined" between the two elements.

In addition, when it is described that "upper (upper)" or "lower (lower)" of each component is formed or arranged, it is meant to include not only a case where two components are in direct contact with each other but also a case where one or more other components are formed or disposed between the two components. In addition, when it is expressed as "upper (upper)" or "lower (lower)", not only a meaning based on an upward direction of one component but also a meaning based on a downward direction of one component may be included.

In addition, the numerical values described below may be construed as values within a reasonable range in terms of error. For example, a number written as "1" may be interpreted as "1.01".

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings. Hereinafter, "light" may be understood as a concept including "optical signal", and "signal" may be understood as a concept including "optical signal", and may be used interchangeably.

Fig. 1 is a block diagram illustrating the configuration and operation of a camera module 100 according to an embodiment.

As shown in fig. 1, camera module 100 can include a light source array 1100, a processor 1000, and a receiver 120.

However, it will be understood by those skilled in the art that other general components than those shown in fig. 1 may be further included in the camera module 100. For example, the camera module 100 may further include: a diffuser through which light output from the light source array passes, a light modulator (not shown) included in the light source array 1100, or a memory (not shown) connected to the processor 1000. The term "memory" may be broadly interpreted to include any electronic component capable of storing electronic information. The term "memory" may refer to various types of processor-readable media, such as Random Access Memory (RAM), Read Only Memory (ROM), non-volatile random access memory (NVRAM), Programmable Read Only Memory (PROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory, magnetic or optical data storage, registers, and so forth. If the processor 1000 can read information from and/or write information to a memory, that memory is said to be in electronic communication with the processor 1000. A memory integrated in the processor 1000 is in electronic communication with the processor.

In addition, the memory may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (e.g., SD or XD memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read Only Memory (ROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a Programmable Read Only Memory (PROM), a magnetic memory, a magnetic disk, and an optical disk.

Alternatively, according to another embodiment, one skilled in the art may understand that some of the components shown in fig. 1 may be omitted.

The light source array 1100 according to the embodiment may output light. The light output from the light source array 1100 may have a wavelength within a preset range.

For example, the light source array 1100 may be a Light Emitting Diode (LED) or a Laser Diode (LD) capable of emitting light in the infrared band, for example light having a Near Infrared (NIR) wavelength of about 850 nm, which is invisible to the human eye and therefore safe, but the wavelength band and the type of the light source array are not limited thereto. For example, the wavelength of light output from the light source array 1100 may be included in a visible light region or an ultraviolet light region.

For example, the light source array 1100 may output light by performing amplitude modulation or phase modulation according to a control signal received from the processor 1000. The light output from the light source array 1100 to the object 130 according to the control signal of the processor 1000 may have the form of a periodic continuous function (having a preset period). For example, the light may have a specifically defined waveform, such as a sine wave, a ramp wave, a square wave or a pulsed wave, but may have an undefined general form.

The receiver 120 may receive light reflected from the object 130. The camera module 100 can obtain various pieces of information by the received light received by the receiver 120.

The camera module 100 according to the embodiment can obtain information about the object 130 through the received light. For example, processor 1000 may obtain information about the object, such as the shape, size, color, depth, etc. of object 130.

The receiver 120 can distinguish, among the various light entering the receiver 120, the received light produced when the light output from the light source array 1100 is reflected from the object 130. For example, when the light source array 1100 outputs light in the range of 750 nm to 950 nm, the receiver 120 may selectively obtain light in the range of 750 nm to 950 nm by filtering. By selectively obtaining the received light corresponding to the output light, the receiver 120 can obtain accurate information about the object 130.

Since the camera module 100 according to the embodiment can extract depth information using the ToF function, it may be referred to interchangeably as a ToF camera module or a ToF module in the present disclosure.

The light source array 1100 may generate light to be output and irradiate the object 130 with the light. In this case, the light source array 1100 may generate and output light in the form of a pulse wave or a continuous wave. The continuous wave may be in the form of a sine wave or a square wave. By generating light in the form of a pulse wave or a continuous wave, the camera module 100 can determine a phase difference between light output from the light source array 1100 and light reflected from an object and then received by the camera module 100.

The light source array 1100 may irradiate the generated light onto the object 130 during a preset exposure period. The exposure period may refer to one frame period. In the case where a plurality of frames are generated, the set exposure period may be repeated. For example, when the camera module 100 photographs an object at 20FPS, the exposure period is 1/20 seconds. In addition, when 100 frames are generated, the exposure period may be repeated 100 times.
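The relationship between the frame rate, the exposure period, and the modulation period can be made concrete with a small worked sketch. The 20 FPS figure comes from the text above; the modulation frequency is an assumed illustrative value, not specified in this disclosure:

```python
# Worked example: relating frame rate, exposure period, and modulation
# periods. 20 FPS is from the text; the 20 MHz modulation frequency is
# an assumed illustrative value.
fps = 20
exposure_period = 1.0 / fps          # 0.05 s per frame
f_mod = 20e6                         # modulation frequency [Hz] (assumed)
periods_per_frame = exposure_period * f_mod
print(f"{periods_per_frame:.0f} modulation periods per exposure")  # 1000000
```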

The light source array 1100 may generate a plurality of lights having different frequencies. The light source array 1100 may sequentially and repeatedly generate a plurality of lights having different frequencies. Alternatively, the light source array 1100 may simultaneously generate a plurality of lights having different frequencies.

The light source array 1100 according to the embodiment may output light to the object 130 through a plurality of light sources. The light source array 1100 may include a plurality of light sources, and each of the plurality of light sources may independently output light. For example, the plurality of light sources may output light of different intensities, different frequencies, different phases, or different delay times. Each of the plurality of light sources may include a light emitting diode.

The receiver 120 according to the embodiment may receive light through the receiving pixel. The receiver 120 may receive reflected light obtained as a result of the light output from the light source array 1100 being reflected from the object 130. The receiver 120 may include receiving pixels, and each receiving pixel may receive light independently of each other. For example, the receiving pixels may receive light at different times and may use different filtering methods to receive the light.

The receiver 120 according to the embodiment may include a lens (not shown) and an image sensor. The lens may collect light reflected from the object 130 and transmit it to an image sensor (not shown). The image sensor may receive light and generate an electrical signal corresponding to the received light.

According to an embodiment, the light source array 1100 may output light of different frequencies over time. For example, the light source array 1100 may output light having a frequency f1 during the first half of the exposure period and light having a frequency f2 during the other half of the exposure period.

According to an embodiment, several of the plurality of light emitting diodes included in the light source array 1100 may output light having a frequency f1, and the other light emitting diodes may output light having a frequency f2.

To control the plurality of light emitting diodes included in the light source array 1100, the light source array 1100 may include a light modulator.

The light source array 1100 may generate light. The light generated by the light source array 1100 may be infrared light having a wavelength of 770 to 3000nm, or visible light having a wavelength of 380 to 770 nm. The light source array 1100 may use Light Emitting Diodes (LEDs), and may have a shape in which a plurality of light emitting diodes are arranged according to a predetermined pattern. The light source array 1100 may include an Organic Light Emitting Diode (OLED) or a Laser Diode (LD). Alternatively, the light source array 1100 may be a Vertical Cavity Surface Emitting Laser (VCSEL). The VCSEL is one of laser diodes that convert an electrical signal into light, and a wavelength of about 800nm to 1000nm (e.g., about 850nm or about 940nm) may be used.

The light source array 1100 may repeatedly blink (on/off) at predetermined time intervals and generate light in the form of a pulse wave or a continuous wave. The predetermined time interval may correspond to the frequency of the light. The blinking of the light source array 1100 may be controlled by a light modulator.

The light modulator may control the blinking of the light source array 1100 to control the light source array 1100 to generate light in a continuous wave form or a pulse wave form. The light modulator may control the light source array 1100 to generate light in a continuous wave form or a pulse wave form by frequency modulation, pulse modulation, or the like.

The processor 1000 according to the embodiment may obtain depth information about the object 130 by using a phase difference between light output from the light source array 1100 and light received by the receiver 120. By using a plurality of reference signals having different phase differences, the receiver 120 may generate an electrical signal corresponding to each reference signal. The frequency of the reference signal may be determined to be equal to the frequency of the light output from the light source array 1100. Accordingly, when the light source array 1100 generates light having a plurality of frequencies, the receiver 120 may generate an electrical signal using a plurality of reference signals corresponding to the respective frequencies. The electrical signal may include information about the amount of charge or voltage corresponding to each reference signal.

The number of reference signals according to an embodiment may be four, i.e., C1 to C4. Each of the reference signals C1 to C4 may have the same frequency as the light output from the light source array 1100, but may have a phase difference of 90 degrees from one another. One of the four reference signals, C1, may have the same phase as the light output from the light source array 1100. The light reflected from the object 130 is delayed in phase according to the distance the light output from the light source array 1100 travels to the object 130 and back. The receiver 120 may generate signals Q1 through Q4 for each reference signal by mixing the received light with each reference signal, respectively.
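To make the mixing step concrete, the following sketch (illustrative only, not part of the claimed embodiments) simulates demodulating a sinusoidally modulated received signal with the four reference signals; the frequency, sample count, and delay value are assumptions:

```python
import numpy as np

# Illustrative sketch: mixing received light with four reference signals
# (0, 90, 180, and 270 degrees) to obtain the charges Q1..Q4.
# All numeric values are assumptions, not taken from the disclosure.
f = 20e6                                              # modulation frequency [Hz]
t = np.linspace(0.0, 1.0 / f, 1000, endpoint=False)  # one modulation period
delay = np.deg2rad(60)                                # round-trip phase delay

received = 1.0 + np.cos(2 * np.pi * f * t - delay)    # reflected light

def mix(phase_deg):
    """Multiply-and-integrate the received light with one reference signal."""
    ref = np.cos(2 * np.pi * f * t - np.deg2rad(phase_deg))
    return np.mean(received * ref)

# Q1: same phase as the output light; Q2: 180-degree lag;
# Q3: 90-degree lag; Q4: 270-degree lag (matching the convention below).
Q1, Q2, Q3, Q4 = mix(0), mix(180), mix(90), mix(270)
```

With this convention, Q3 − Q4 is proportional to the sine of the delay and Q1 − Q2 to its cosine, which is what Equation 1 below exploits.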

The receiver 120 may include an image sensor configured in a structure in which a plurality of pixels are arranged in a grid form. The image sensor may be a Complementary Metal Oxide Semiconductor (CMOS) image sensor or may be a Charge Coupled Device (CCD) image sensor. In addition, the image sensor may include a ToF sensor that receives infrared light reflected from an object and measures a distance using a time difference or a phase difference.

Specifically, the processor 1000 may calculate a phase difference between the output light and the input light using information on the charge amount of the electrical signal.

As described above, four electrical signals may be generated for each frequency of light output from the light source array 1100. Accordingly, the processor 1000 may determine the phase difference t_d between the light output from the light source array 1100 and the light received by the receiver 120 by using Equation 1 below.

[Equation 1]

t_d = arctan( (Q3 − Q4) / (Q1 − Q2) )

Here, Q1 to Q4 are the charge amounts of the four electrical signals. Q1 is the charge amount of the electrical signal corresponding to the reference signal having the same phase as the light output from the light source array 1100, Q2 is the charge amount of the electrical signal corresponding to the reference signal whose phase lags the light output from the light source array 1100 by 180 degrees, Q3 is the charge amount of the electrical signal corresponding to the reference signal whose phase lags the light output from the light source array 1100 by 90 degrees, and Q4 is the charge amount of the electrical signal corresponding to the reference signal whose phase lags the light output from the light source array 1100 by 270 degrees.

The processor 1000 may then use the phase difference between the light output from the light source array 1100 and the light received by the receiver 120 to determine the distance between the object 130 and the camera module 100. In this case, the processor 1000 according to the embodiment may determine the distance d between the object 130 and the camera module 100 using equation 2.

[Equation 2]

d = (c × t_d) / (4πf)

Here, c is the speed of light, and f may be the frequency of the output light.
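As an illustrative numerical check (with assumed values, not from the disclosure), Equations 1 and 2 can be evaluated as follows; the charge values correspond to a 60-degree delay:

```python
import math

# Equation 1: phase delay from the four charges; Equation 2: distance.
# The charge values are illustrative and correspond to a 60-degree delay.
Q1, Q2, Q3, Q4 = 0.25, -0.25, 0.433, -0.433

t_d = math.atan2(Q3 - Q4, Q1 - Q2)           # Equation 1 [rad]

c = 3.0e8                                    # speed of light [m/s]
f = 20e6                                     # modulation frequency [Hz] (assumed)
d = (c * t_d) / (4 * math.pi * f)            # Equation 2

print(f"t_d = {math.degrees(t_d):.1f} deg, d = {d:.2f} m")  # 60.0 deg, 1.25 m
```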

According to an embodiment, ToF IR images and depth images may be obtained from the camera module 100.

The processor 1000 according to the embodiment may obtain depth information about the object 130 by using the difference between the time point at which the light source array 1100 outputs light and the time point at which the receiver 120 receives light. The camera module 100 may obtain depth information by outputting light (e.g., laser light or infrared light) from the light source array 1100 to the object 130, receiving the light reflected back from the object, and calculating the time difference.
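For this time-difference variant, the distance follows directly from half the round-trip time; a minimal sketch with an assumed measured value:

```python
# Illustrative sketch of the time-difference (direct ToF) computation:
# the light travels to the object and back, so the distance is half the
# round-trip time multiplied by the speed of light. Values are assumed.
c = 3.0e8             # speed of light [m/s]
round_trip = 10e-9    # measured time difference [s] (illustrative)
distance = c * round_trip / 2
print(f"distance: {distance} m")   # 1.5 m
```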

Fig. 2 is a cross-sectional view of a camera module 100 according to an embodiment.

Referring to fig. 2, camera module 100 includes a lens assembly 200, an image sensor 250, and a printed circuit board 260. The processor 1000 of fig. 1, etc. may be implemented within the printed circuit board 260. Although not shown, the light source array 1100 of fig. 1 may be disposed beside the image sensor 250 on the printed circuit board 260, or outside the camera module 100, for example on a side of the camera module 100, without being limited thereto.

Lens assembly 200 may include a lens 230, a barrel 210, lens holders 221 and 222, and an IR filter 240.

The lens 230 may be composed of a plurality of elements, or may be composed of one element. When lens 230 is formed of multiple elements, each lens may be aligned with respect to a central axis to form an optical system. Here, the central axis may be the same as the optical axis of the optical system.

The lens barrel 210 is coupled to the lens holders 221 and 222, and a space for accommodating the lenses may be provided therein. The lens barrel 210 may be rotatably coupled with one or more lenses, but this is exemplary, and it may be coupled in other ways, for example, by using an adhesive (e.g., an adhesive resin such as epoxy).

The lens holders 221 and 222 may be coupled to the lens barrel 210 to support the lens barrel 210, and may be coupled to the printed circuit board 260 on which the image sensor 250 is mounted. A space to which the IR filter 240 can be attached may be formed under the lens barrel 210 by the lens holders 221 and 222. A spiral pattern may be formed on the inner circumferential surfaces of the lens holders 221 and 222 and, similarly, on the outer circumferential surface of the lens barrel 210, so that the two may be rotatably coupled. However, this is exemplary, and the lens holders 221 and 222 and the lens barrel 210 may be coupled by an adhesive, or the lens holders 221 and 222 and the lens barrel 210 may be integrally formed.

The lens holders 221 and 222 may be divided into an upper holder 221 coupled to the lens barrel 210 and a lower holder 222 coupled to the printed circuit board 260 on which the image sensor 250 is mounted. The upper bracket 221 and the lower bracket 222 may be integrally formed, formed as structures separated from each other and then fastened or combined, or may have structures separated from each other and spaced apart from each other. In this case, the diameter of the upper bracket 221 may be formed to be smaller than that of the lower bracket 222, but is not limited thereto.

The above examples are merely embodiments, and the lens 230 may be configured with another structure capable of collimating and transferring light incident on the camera module 100 to the image sensor 250.

The image sensor 250 may generate an electrical signal by using light collimated by the lens 230.

The image sensor 250 may detect the input light in synchronization with the blinking period of the light source array 1100. Specifically, the image sensor 250 may detect the light output from the light source array 1100 both in phase and out of phase with it. That is, the image sensor 250 may repeatedly perform the step of absorbing light when the light source array 1100 is turned on and the step of absorbing light when the light source array 1100 is turned off.

The image sensor 250 may generate an electric signal corresponding to each reference signal by using a plurality of reference signals having different phase differences. The frequency of the reference signal may be determined to be equal to the frequency of the light output from the light source array 1100. Accordingly, when the light source array 1100 generates light having a plurality of frequencies, the image sensor 250 may generate an electrical signal using a plurality of reference signals corresponding to the respective frequencies. The electrical signal may include information about the amount of charge or voltage corresponding to each reference signal.

The processor 1000 according to the embodiment may control a delay time of light output from each of the plurality of light sources, and may determine a direction of the light output through the plurality of light sources. In the following, an embodiment is shown in which the processor 1000 determines the direction of the light by controlling the delay time.

Fig. 3 briefly shows an example of a method of obtaining a depth image using four phase images, and fig. 4 shows the method of fig. 3 in detail.

Referring to fig. 3, the camera module may sequentially obtain a first depth image (1), a second depth image (2), and a third depth image (3). Specifically, the camera module 100 may obtain the first depth image (1) by obtaining a 0-degree phase image, a 90-degree phase image, a 180-degree phase image, and a 270-degree phase image in period 1-1, obtain the second depth image (2) by obtaining the same four phase images in period 2-1, and obtain the third depth image (3) by obtaining the same four phase images in period 3-1.

Specifically, the first to fourth pixels included in the block 300 may obtain a 0-degree phase image, a 90-degree phase image, a 180-degree phase image, and a 270-degree phase image, respectively, in one period. Which of the first to fourth pixels is to obtain which phase image may be determined according to a predetermined setting. The first to fourth pixels may be reception pixels.

Since the signal a pixel receives during a single on-to-off interval is weak, the camera module 100 according to the embodiment may repeat the same process several times to obtain a depth image. For example, the block 300 may repeat the process of obtaining the phase images several times (e.g., 100 times or more) and obtain the depth image by combining or accumulating the signals.

Referring to fig. 3, a different phase signal may be applied to each pixel included in the block 300 during each period T. For example, the block 300 may include a first pixel, a second pixel, a third pixel, and a fourth pixel, and a 0-degree phase signal may be applied to the first pixel, a 90-degree phase signal may be applied to the second pixel, a 180-degree phase signal may be applied to the third pixel, and a 270-degree phase signal may be applied to the fourth pixel during each period T, but is not limited thereto.

Since the intensity of the signal received by each pixel during one period T is weak, the same process may be repeated several times. The camera module 100 may combine or accumulate the signals by repeating the period T, in which a different phase signal is applied to each pixel, several times, for example, 100 times or more. Thereafter, information on the 0-degree phase is read out from the first pixel, information on the 90-degree phase is read out from the second pixel, information on the 180-degree phase is read out from the third pixel, and information on the 270-degree phase is read out from the fourth pixel. In addition, the first depth image (1) may be obtained using the information on the 0-degree phase obtained from the first pixel, the information on the 90-degree phase obtained from the second pixel, the information on the 180-degree phase obtained from the third pixel, and the information on the 270-degree phase obtained from the fourth pixel.
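The accumulation step described above can be sketched numerically as follows (illustrative only; the per-period signal values and noise level are assumptions, chosen to match the 60-degree example used earlier):

```python
import numpy as np

# Illustrative sketch: the weak per-period signal of each of the four
# pixels is accumulated over many repetitions of the period T before
# readout, averaging out noise. All values are assumptions.
rng = np.random.default_rng(0)
true_signal = np.array([0.25, 0.433, -0.25, -0.433])  # 0/90/180/270 deg pixels
periods = 100                                          # repetitions of period T

# Each period yields the true signal plus noise; accumulation recovers it.
samples = true_signal + rng.normal(0.0, 0.2, size=(periods, 4))
accumulated = samples.mean(axis=0)
print(accumulated)   # close to true_signal
```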

In this way, a different phase signal is applied to each pixel included in one block 300 during each period T, and when a depth image is extracted using the information on each phase obtained from each pixel, the time required to obtain the depth image can be reduced.

During each period T, different phase signals may be applied to adjacent pixels, and during each period T, at least a portion of at least two of the segment in which the 0-degree phase signal is applied to the first pixel, the segment in which the 90-degree phase signal is applied to the second pixel, the segment in which the 180-degree phase signal is applied to the third pixel, and the segment in which the 270-degree phase signal is applied to the fourth pixel may overlap each other. Therefore, the time required to obtain one depth image can be reduced as compared with the case where the segment to which the 0-degree phase signal is applied, the segment to which the 90-degree phase signal is applied, the segment to which the 180-degree phase signal is applied, and the segment to which the 270-degree phase signal is applied do not overlap with each other.

Fig. 4 is a diagram showing an example in which the camera module 100 according to the embodiment applies different phase signals to the receiving pixels included in the block 400 in each period T by controlling the receiving pixels included in the receiver 120 and the plurality of light sources included in the light source array 1100 in units of lines.

The plurality of light sources include a light source on the first output line 411 and a light source on the second output line 412, and a phase difference of light output from the light source on the first output line 411 and light output from the light source on the second output line 412 may be a first value. In addition, the reception pixels include a pixel on the first reception line 421 and a pixel on the second reception line 422, and a phase difference between a point of time when the pixel on the first reception line 421 receives light and a point of time when the pixel on the second reception line 422 receives light may be a second value. At this time, the first value and the second value may be different. For example, the difference between the first value and the second value may be 90 degrees. For example, as shown in fig. 4, the first value may be 90 degrees and the second value may be 180 degrees. As another example, the first value may be 180 degrees and the second value may be 90 degrees. As yet another example, the first value may be 270 degrees and the second value may be 180 degrees. As yet another example, the first value may be 180 degrees and the second value may be 270 degrees, without being limited thereto.
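The listed combinations of the first and second values can be checked with a small sketch (illustrative; it assumes, as in the Figs. 4 and 9 examples described below, that a pixel's applied phase is the sum of its output-line delay and its receive-line delay, modulo 360 degrees):

```python
# Illustrative check: each (first value, second value) combination above
# yields four distinct phase signals (0, 90, 180, 270 degrees) in a 2x2
# block. Assumption: a pixel's applied phase is the sum of its
# output-line delay and its receive-line delay, modulo 360 degrees.
def block_phases(first_value, second_value):
    output_delays = (0, first_value)        # first and second output lines
    receive_delays = (0, second_value)      # first and second receiving lines
    return sorted((od + rd) % 360
                  for od in output_delays for rd in receive_delays)

for first, second in [(90, 180), (180, 90), (270, 180), (180, 270)]:
    assert block_phases(first, second) == [0, 90, 180, 270]

# Equal first and second values would not give four distinct phases:
assert block_phases(180, 180) != [0, 90, 180, 270]   # 0, 180, 180, 0
```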

Referring to fig. 4, the first output line 411 and the second output line 412 may be adjacent to each other, and the first receiving line 421 and the second receiving line 422 may be adjacent to each other.

Referring to fig. 4, the first output line 411 and the second output line 412 are in a horizontal direction, and the first receiving line 421 and the second receiving line 422 are in a vertical direction, but is not limited thereto. In addition, the first and second output lines 411 and 412 may be parallel to each other, and the first and second receiving lines 421 and 422 may be parallel to each other. In addition, the first output line 411 and the first receiving line 421 or the second receiving line 422 may be orthogonal to each other.

Referring to fig. 4, the first output line 411 is disposed above the second output line 412, and the first receiving line 421 is disposed at the left side of the second receiving line 422, but is not limited thereto. For example, the first output line 411 may be disposed below the second output line 412, or the first receiving line 421 may be disposed at the right side of the second receiving line 422.

Referring to fig. 4, a block 400 may include a first pixel 401, a second pixel 402, a third pixel 403, and a fourth pixel 404. The first pixel 401, the second pixel 402, the third pixel 403, and the fourth pixel 404 may be receiving pixels.

The first pixel 401 is a pixel on the first receiving line 421 that receives light output from the light source on the first output line 411, the second pixel 402 is a pixel on the second receiving line 422 that receives light output from the light source on the first output line 411, the third pixel 403 is a pixel on the first receiving line 421 that receives light output from the light source on the second output line 412, and the fourth pixel 404 may be a pixel on the second receiving line 422 that receives light output from the light source on the second output line 412.

The processor 1000 according to the embodiment may control a plurality of light sources included in the light source array 1100 in units of lines. The phase retarder 430 may delay the phase of each line. Referring to fig. 4, a phase retarder 430 may be connected to the second output line 412 and the second receiving line 422. The phase retarder 430 may delay the phase by 90 degrees or 180 degrees.

Accordingly, the phase of the light output from the second output line 412 may be delayed by 90 degrees from the phase of the light output from the first output line 411. Alternatively, the time at which light is output from the second output line 412 may be delayed from the time at which light is output from the first output line 411 by a time corresponding to a phase of 90 degrees.

In addition, the time point at which the light reception starts at the second reception line 422 may be delayed by a time corresponding to a phase of 180 degrees from the time point at which the light reception starts at the first reception line 421.

The light output from the first output line 411 may be received at the first pixel 401 included in the first receiving line 421. Since the light output from the first output line 411 is delayed by 0 degrees and the reception time point of the light at the first receiving line 421 is delayed by 0 degrees compared to the light output time point, a 0-degree phase signal may be applied to the first pixel 401.

The light output from the first output line 411 may be received at the second pixel 402 included in the second receiving line 422. Since the light output from the first output line 411 is delayed by 0 degrees and the receiving time point of the light at the second receiving line 422 is delayed by 180 degrees compared to the light output time point, a 180-degree phase signal may be applied to the second pixel 402.

The light output from the second output line 412 may be received at the third pixel 403 included in the first receiving line 421. Since the light output from the second output line 412 is delayed by 90 degrees and the receiving time point of the light at the first receiving line 421 is delayed by 0 degrees compared to the light output time point, a 90-degree phase signal may be applied to the third pixel 403.

The light output from the second output line 412 may be received at the fourth pixel 404 included in the second receiving line 422. Since the light output from the second output line 412 is delayed by 90 degrees and the receiving time point of the light at the second receiving line 422 is delayed by 180 degrees compared to the light output time point, a 270-degree phase signal may be applied to the fourth pixel 404.

The reception time points of light received by the block 400 may be different in all of the first, second, third, and fourth pixels 401, 402, 403, and 404. For example, phase signals (or reception time points of light) applied to the first pixel 401, the second pixel 402, the third pixel 403, and the fourth pixel 404 may be different by 90 degrees. Referring to fig. 4, a 0-degree phase signal is applied to the first pixel 401, a 180-degree phase signal is applied to the second pixel 402, a 90-degree phase signal is applied to the third pixel 403, and a 270-degree phase signal is applied to the fourth pixel 404, without being limited thereto, and the type of the phase signal applied to each pixel may vary according to which line the phase retarder 430 is connected to.

Referring to fig. 4, the third output line 413 and the fourth output line 414 may correspond to the first output line 411 and the second output line 412, respectively. Specifically, the third output line 413 may output light delayed by 0 degrees, and the fourth output line 414 may output light delayed by 90 degrees. In addition, the light receiving time point of the third receiving line 423 may be delayed by 0 degree, and the light receiving time point of the fourth receiving line 424 may be delayed by 180 degrees.

Unlike the arrangement shown in fig. 4, the arrangement of the first and second output lines 411 and 412 and the arrangement of the first and second receiving lines 421 and 422 may be changed. In addition, the phase delay that the phase retarder 430 applies to the plurality of light sources and the phase delay that the phase retarder 430 applies to the receiving pixels may be different. However, the phase signals applied to the first, second, third, and fourth pixels 401, 402, 403, and 404 included in the block 400 are different from each other and may correspond to any one of 0 degrees, 90 degrees, 180 degrees, and 270 degrees, respectively.

Fig. 5 is a timing diagram illustrating the operation of the camera module 100 of fig. 4 over time.

As shown in fig. 4, since the phase retarder 430 is connected to the second output line 412 in the light source array 1100, light output from the first output line 411 is delayed by 0 degrees, while light output from the second output line 412 may be delayed by 90 degrees.

As shown in fig. 4, since the phase retarder 430 is connected to the second receiving line 422 in the receiver 120, the time point at which light reception starts at the first receiving line 421 is delayed by 0 degrees, and the time point at which light reception starts at the second receiving line 422 may be delayed by 180 degrees.

Accordingly, referring to fig. 5, the light output from the first output line 411 is output after being delayed by 0 degrees, and since the first pixel 401, which is a pixel on the first receiving line 421, receives the light output from the first output line 411 delayed by 0 degrees, the first pixel 401 may receive a 0-degree phase signal.

In addition, the light output from the first output line 411 is output after being delayed by 0 degree, and since the second pixel 402, which is a pixel on the second receiving line 422, receives the light output from the first output line 411 delayed by 180 degrees, the second pixel 402 can receive a 180-degree phase signal.

In addition, the light output from the second output line 412 is output after being delayed by 90 degrees, and since the third pixel 403, which is a pixel on the first receiving line 421, receives the light output from the second output line 412 delayed by 0 degrees, the third pixel 403 can receive a 90-degree phase signal.

In addition, the light output from the second output line 412 is output after being delayed by 90 degrees, and since the fourth pixel 404, which is a pixel on the second receiving line 422, receives the light output from the second output line 412 delayed by 180 degrees, the fourth pixel 404 can receive a 270-degree phase signal.
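The phase delays in the timing diagram of fig. 5 can be related to actual time delays: a phase of 90 degrees corresponds to one quarter of the modulation period. A short sketch with an assumed modulation frequency (the frequency is not specified in this disclosure):

```python
# Illustrative sketch: converting the phase delays of Figs. 4 and 5 into
# time delays. The modulation frequency is an assumed example value.
f = 20e6                 # modulation frequency [Hz] (assumed)
T = 1.0 / f              # one modulation period [s]

def delay_ns(phase_deg):
    return (phase_deg / 360.0) * T * 1e9   # delay in nanoseconds

for pixel, phase in [("first", 0), ("second", 180), ("third", 90), ("fourth", 270)]:
    print(f"{pixel} pixel: {phase:3d} deg -> {delay_ns(phase):5.1f} ns")
# 0 deg -> 0.0 ns, 90 deg -> 12.5 ns, 180 deg -> 25.0 ns, 270 deg -> 37.5 ns
```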

Figs. 6 to 8 are diagrams showing examples of cases where a 90-degree phase retarder is used for the light source array 1100 and a 180-degree phase retarder is used for the receiver 120. Referring to fig. 6, an example is shown in which the reception lines connected to the 180-degree phase retarder are adjacent to each other.

Fig. 6 may be understood with reference to fig. 4. In fig. 6, unlike fig. 4, the third receiving line 423 may be located on the right side and the fourth receiving line 424 on the left side. In fig. 6, the positions of the third receiving line 423 and the fourth receiving line 424 differ from those in fig. 4, but since the phase signals applied to the four pixels included in a block are all different, depth information may be obtained using the four pieces of phase information in each block.

Referring to fig. 7, an example is shown in which the output lines connected to the 90-degree phase retarder are adjacent to each other. Fig. 7 may be understood with reference to fig. 6. In fig. 7, unlike fig. 6, the third output line 413 may be positioned below and the fourth output line 414 above. In fig. 7, the positions of the third output line 413 and the fourth output line 414 differ from those in fig. 6, but since the phase signals applied to the four pixels included in a block are all different, depth information can be obtained using the four pieces of phase information in each block.

Referring to fig. 8, an example is shown in which, among the reception lines and the output lines, the lines connected to the phase retarders are not adjacent to each other. Fig. 8 can be understood with reference to fig. 4. In fig. 8, unlike fig. 4, the first receiving line 421 is on the right side, the second receiving line 422 on the left side, the third receiving line 423 on the right side, and the fourth receiving line 424 may be on the left side.

In addition, in fig. 8, unlike fig. 4, the first output line 411 is located below the second output line 412, and the third output line 413 may be located below the fourth output line 414.

In fig. 8, the positions of the first receiving line 421, the second receiving line 422, the third receiving line 423, and the fourth receiving line 424 differ from the case of fig. 4, but since the phase signals applied to the four pixels included in a block are all different, depth information may be obtained using the four pieces of phase information in each block.

Fig. 9 is a diagram showing an example in which the camera module 100 controls a receiving pixel included in the receiver 120 and a plurality of light sources included in the light source array 1100 in units of lines by using a 180-degree phase retarder for the light source array 1100 and a 90-degree phase retarder for the receiver 120, thereby applying different phase signals to the receiving pixels included in the block 400 in each period T. Fig. 9 may be understood with reference to fig. 4.

Referring to fig. 9, the first output line 411 is disposed above the second output line 412, and the first receiving line 421 is disposed at the left side of the second receiving line 422, without being limited thereto.

Referring to fig. 9, a block 400 may include a first pixel 401, a second pixel 402, a third pixel 403, and a fourth pixel 404. The first pixel 401, the second pixel 402, the third pixel 403, and the fourth pixel 404 may be receiving pixels.

The first pixel 401 is a pixel on the first receiving line 421 that receives light output from the light source on the first output line 411, the second pixel 402 is a pixel on the second receiving line 422 that receives light output from the light source on the first output line 411, the third pixel 403 is a pixel on the first receiving line 421 that receives light output from the light source on the second output line 412, and the fourth pixel 404 may be a pixel on the second receiving line 422 that receives light output from the light source on the second output line 412.

The processor 1000 according to the embodiment may control a plurality of light sources included in the light source array 1100 in units of lines. The phase retarder 430 may delay the phase of each line. Referring to fig. 9, a phase retarder 430 may be connected to the second output line 412 and the second receiving line 422. The phase retarder 430 may delay the phase by 90 degrees or 180 degrees.

In fig. 9, unlike the case of fig. 4, the phase of light output from the second output line 412 may be delayed by 180 degrees from the phase of light output from the first output line 411. Alternatively, the point in time at which light is output from the second output line 412 may be delayed by a time corresponding to a phase of 180 degrees from the point in time at which light is output from the first output line 411.

In addition, the time point at which the light reception starts at the second reception line 422 may be delayed by a time corresponding to a phase of 90 degrees from the time point at which the light reception starts at the first reception line 421.

The light output from the first output line 411 may be received at the first pixel 401 included in the first receiving line 421. Since the light output from the first output line 411 is delayed by 0 degrees and the reception time point of the light at the first receiving line 421 is delayed by 0 degrees compared to the light output time point, a 0-degree phase signal may be applied to the first pixel 401.

The light output from the first output line 411 may be received at the second pixel 402 included in the second receiving line 422. Since the light output from the first output line 411 is delayed by 0 degrees and the receiving time point of the light at the second receiving line 422 is delayed by 90 degrees compared to the light output time point, a 90-degree phase signal may be applied to the second pixel 402.

The light output from the second output line 412 may be received at the third pixel 403 included in the first receiving line 421. Since the light output from the second output line 412 is delayed by 180 degrees and the receiving time point of the light at the first receiving line 421 is delayed by 0 degrees compared to the light output time point, a 180-degree phase signal may be applied to the third pixel 403.

The light output from the second output line 412 may be received at the fourth pixel 404 included in the second receiving line 422. Since the light output from the second output line 412 is delayed by 180 degrees and the receiving time point of the light at the second receiving line 422 is delayed by 90 degrees compared to the light output time point, a 270-degree phase signal may be applied to the fourth pixel 404.

The reception time points of light received by the block 400 may be different in all of the first, second, third, and fourth pixels 401, 402, 403, and 404. For example, the phase signals (or reception time points of light) applied to the first pixel 401, the second pixel 402, the third pixel 403, and the fourth pixel 404 may differ by 90 degrees. Referring to fig. 9, a 0-degree phase signal is applied to the first pixel 401, a 90-degree phase signal is applied to the second pixel 402, a 180-degree phase signal is applied to the third pixel 403, and a 270-degree phase signal is applied to the fourth pixel 404, without being limited thereto, and the type of the phase signal applied to each pixel may vary according to which line the phase retarder 430 is connected to.

Fig. 10 is a timing chart showing the operation of the camera module 100 of Fig. 9 over time.

As shown in Fig. 9, since the phase retarder 430 is connected to the second output line 412 of the light source array 1100, light output from the first output line 411 is delayed by 0 degrees, and light output from the second output line 412 may be delayed by 180 degrees.

As shown in Fig. 9, since the phase retarder 430 is connected to the second receiving line 422 of the receiver 120, the time point at which light reception starts at the first receiving line 421 is delayed by 0 degrees, and the time point at which light reception starts at the second receiving line 422 may be delayed by 90 degrees.

Therefore, referring to Fig. 10, the light output from the first output line 411 is output with a 0-degree delay, and the first pixel 401, which is a pixel on the first receiving line 421 that receives this light, starts reception with a 0-degree delay; the first pixel 401 may therefore receive a 0-degree phase signal.

In addition, the light output from the first output line 411 is output with a 0-degree delay, and the second pixel 402, which is a pixel on the second receiving line 422 that receives this light, starts reception with a 90-degree delay; the second pixel 402 may therefore receive a 90-degree phase signal.

In addition, the light output from the second output line 412 is output with a 180-degree delay, and the third pixel 403, which is a pixel on the first receiving line 421 that receives this light, starts reception with a 0-degree delay; the third pixel 403 may therefore receive a 180-degree phase signal.

In addition, the light output from the second output line 412 is output with a 180-degree delay, and the fourth pixel 404, which is a pixel on the second receiving line 422 that receives this light, starts reception with a 90-degree delay; the fourth pixel 404 may therefore receive a 270-degree phase signal.
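As a rough illustration of the waveforms in the Fig. 10 timing chart, the sketch below models each output signal and reception window as a square wave shifted by that line's delay. This is an assumption about what the chart depicts, not code from the patent, and the names are hypothetical.

```python
import numpy as np

def square_wave(n_samples: int, phase_deg: float) -> np.ndarray:
    """One modulation period sampled n_samples times, delayed by phase_deg degrees."""
    t = (np.arange(n_samples) / n_samples * 360.0 - phase_deg) % 360.0
    return (t < 180.0).astype(int)  # high for the first half of the (shifted) period

n = 360
out_line_1 = square_wave(n, 0)     # light output, first output line 411
out_line_2 = square_wave(n, 180)   # light output, second output line 412
recv_line_1 = square_wave(n, 0)    # reception window, first receiving line 421
recv_line_2 = square_wave(n, 90)   # reception window, second receiving line 422
```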

Fig. 11 is a diagram illustrating a method by which the camera module 100 improves the resolution of an image using a super-resolution technique.

Meanwhile, according to an embodiment, the camera module 100 may improve the resolution of a depth image using a super-resolution (SR) technique. The SR technique broadly refers to a method of obtaining a high-resolution image from a plurality of low-resolution images.

Specifically, the processor 1000 may obtain one piece of depth information per block. If one piece of depth information could be obtained per pixel, 16 pixels would yield 16 pieces of depth information. When depth information is instead obtained per block, the amount of acquirable information is reduced: since one piece of depth information is obtained by combining the information of four pixels, the obtainable information is, in principle, reduced to one quarter. For example, the processor 1000 may obtain four pieces of depth information from the first block 1110, the second block 1120, the third block 1130, and the fourth block 1140.

However, when one piece of depth information is obtained by combining the information of four pixels, more information can be obtained if pixels are reused across blocks. For example, in addition to the first through fourth blocks 1110 to 1140, the processor 1000 may further use the fifth, sixth, seventh, and eighth blocks 1150, 1160, 1170, and 1180, which overlap them. In some cases, one piece of depth information may also be obtained from four non-adjacent pixels. The sketch below illustrates how overlapping blocks increase the number of obtainable depth values.
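A minimal sketch, assuming one depth value per 2x2 block: with stride 2 no pixel is reused, while with stride 1 adjacent blocks share pixels (as blocks 1110 to 1180 in Fig. 11 do), so more depth values are obtained from the same pixel array. The function and variable names are hypothetical.

```python
def block_origins(height: int, width: int, stride: int, block: int = 2):
    """Top-left coordinates of every block x block window that fits in the array."""
    return [(r, c)
            for r in range(0, height - block + 1, stride)
            for c in range(0, width - block + 1, stride)]

# On a 4x4 pixel array:
print(len(block_origins(4, 4, stride=2)))  # 4 -> resolution reduced to one quarter
print(len(block_origins(4, 4, stride=1)))  # 9 -> overlapping blocks recover resolution
```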

In order to obtain one piece of depth information from four pixels, the light applied to each of the four pixels may carry a phase signal of a different phase. For example, a 0-degree phase signal, a 90-degree phase signal, a 180-degree phase signal, and a 270-degree phase signal may be applied to the four pixels included in the fifth block 1150, respectively.
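The patent text does not spell out how the four phase samples are combined into one depth value, but the conventional four-phase time-of-flight estimator is the natural fit: an arctangent over the four samples recovers the phase shift of the reflected light, and depth follows from the modulation frequency. The sketch below shows that standard formula under this assumption; Q0 through Q270 and f_mod are hypothetical names.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def depth_from_phases(q0, q90, q180, q270, f_mod):
    """Depth in meters from four phase samples and the modulation frequency f_mod (Hz)."""
    # Standard four-phase estimator; the sign convention inside atan2 depends on
    # how the per-phase samples are defined by the sensor.
    phase = math.atan2(q270 - q90, q0 - q180)
    phase %= 2 * math.pi                      # fold into [0, 2*pi)
    return C * phase / (4 * math.pi * f_mod)  # d = c * phi / (4 * pi * f)
```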

Fig. 11 illustrates, as an exemplary embodiment, the case where the number of pixels included in a block is four and the number of pixels shared between overlapping blocks is two; however, the embodiment is not limited thereto.

Fig. 12 is a diagram illustrating an example of resolution improvement by the super-resolution technique according to the embodiment.

Referring to the first resolution map 1210, when information is obtained in units of pixels, a resolution corresponding to the number of pixels may be obtained. In contrast, when information is obtained in units of blocks and each pixel is used only once, the resolution is reduced by a factor equal to the number of pixels per block. For example, the resolution of the second resolution map 1220 is reduced to one quarter of that of the first resolution map 1210. When the SR technique described above is used, however, the resolution may be significantly improved, and a resolution higher than that shown in the third resolution map 1230 may be achieved with an additional algorithm.

Fig. 13 is a flowchart illustrating a method of obtaining depth information about an object according to an embodiment. Fig. 13 may be understood with reference to Figs. 1 to 12 described above.

In step S1310, the camera module 100 according to the embodiment outputs light to the object through the light sources on the first output line and the light sources on the second output line. In step S1320, the camera module 100 receives light reflected from the object through the pixels on the first receiving line and the pixels on the second receiving line.

Here, the phase difference between the light output from the light sources on the first output line and the light output from the light sources on the second output line is a first value, the phase difference between the time point at which the pixels on the first receiving line receive light and the time point at which the pixels on the second receiving line receive light is a second value, and the first value and the second value may be different from each other.

In step S1330, the camera module 100 according to the embodiment obtains depth information about the object by using the phase difference between the output light and the received light. Alternatively, the camera module 100 may obtain depth information about the object by comparing the time point at which the light is output with the time point at which the light is received.
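For the time-based alternative, the usual round-trip relations apply: a measured delay dt corresponds to a depth of c*dt/2, and a phase shift phi at modulation frequency f_mod corresponds to dt = phi / (2*pi*f_mod). A minimal sketch assuming these standard relations (the patent does not give explicit formulas; names are hypothetical):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def delay_from_phase(phi_rad: float, f_mod: float) -> float:
    """Round-trip delay (s) from a phase shift phi_rad (radians) at f_mod (Hz)."""
    return phi_rad / (2.0 * math.pi * f_mod)

def depth_from_delay(dt: float) -> float:
    """Depth in meters: light travels to the object and back, so divide by two."""
    return C * dt / 2.0
```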

Meanwhile, the above-described method may be written as a program executable on a computer and may be implemented in a general-purpose digital computer that runs the program using a computer-readable recording medium. In addition, the data structures used in the above-described method may be recorded on a computer-readable recording medium in various ways. Such computer-readable recording media include storage media such as magnetic storage media (e.g., ROM, RAM, USB memory, floppy disks, and hard disks) and optically readable media (e.g., CD-ROMs and DVDs).

The embodiments of the present invention have been described above with reference to the accompanying drawings, but those skilled in the art will understand that the present invention may be embodied in other specific forms without departing from its technical idea or essential features. The above-described embodiments are therefore to be understood as illustrative and not restrictive in all respects.
