Imaging system and related electronic device

Document No.: 1874796  Publication date: 2021-11-23

Note: This invention, "Imaging system and related electronic device", was created by 李宗德, 王浩任 and 张继宗 on 2021-07-07. Abstract: The application discloses an imaging system, a related electronic device, and an operating method of the imaging system. The imaging system includes: an image sensor comprising a pixel array; a trigger unit for controlling the image sensor to switch between an energy-saving mode and a non-energy-saving mode; a time-of-flight analysis unit, configured to read the pixel array and accordingly generate position information of a plurality of read light spots on the pixel array and the time of flight of those spots, and accordingly obtain depth information of the target; and an area dividing unit, configured to divide, in the non-energy-saving mode, the pixel array into a plurality of first type areas and a plurality of second type areas according to the position information of the plurality of read light spots on the pixel array; wherein, in the energy-saving mode, the time-of-flight analysis unit reads only the sensing results of pixels in the plurality of first type areas of the pixel array.

1. An imaging system, comprising:

an image sensor, comprising:

a pixel array having a plurality of pixel rows extending in a first predetermined direction and a plurality of pixel columns extending in a second predetermined direction, the first predetermined direction being perpendicular to the second predetermined direction, the pixel array being configured to sense a light reflection signal reflected by a target to the pixel array, the light reflection signal including a plurality of reflection light spots;

a trigger unit for controlling the image sensor to switch between an energy-saving mode and a non-energy-saving mode;

a time-of-flight analysis unit, configured to read the pixel array and accordingly generate position information of a plurality of read light spots on the pixel array and the time of flight of the plurality of read light spots, and accordingly obtain depth information of the target, wherein the read light spots are part or all of the plurality of reflection light spots;

an area dividing unit, configured to divide, in the non-energy-saving mode, the pixel array into a plurality of first type areas and a plurality of second type areas according to the position information of the plurality of read light spots on the pixel array; and

a storage, coupled to the time-of-flight analysis unit and the area dividing unit, for storing position information of the plurality of first type areas;

wherein the time-of-flight analysis unit reads only the sensing results of pixels in the plurality of first type areas of the pixel array in the energy-saving mode, and reads the sensing results of all pixels in the pixel array in the non-energy-saving mode.

2. The imaging system of claim 1, wherein, in the energy-saving mode, the trigger unit is further configured to find, according to the position information of the plurality of read light spots on the pixel array, a reflection light spot of the plurality of reflection light spots that does not fall on any first type area.

3. The imaging system of claim 2, wherein, in the energy-saving mode, the pixel array performs sensing a plurality of times, the trigger unit is further configured to count, for each reflection light spot, the number of consecutive sensings in which that reflection light spot is not read, and when the number of reflection light spots whose consecutive-unread count is greater than a first threshold exceeds a second threshold, the trigger unit controls the image sensor to switch to the non-energy-saving mode.

4. The imaging system of claim 2, wherein, in the energy-saving mode, the pixel array performs sensing a plurality of times, the trigger unit is further configured to count, for each reflection light spot, the probability that the reflection light spot is not read, and when the number of reflection light spots whose unread probability is greater than a third threshold exceeds a fourth threshold, the trigger unit controls the image sensor to switch to the non-energy-saving mode.

5. The imaging system of claim 1, wherein, in the energy-saving mode, every time the pixel array performs sensing M times, the trigger unit controls the image sensor to switch to the non-energy-saving mode for N sensings, where M and N are positive integers.

6. The imaging system according to claim 1, wherein in the non-energy-saving mode, the area dividing unit is further configured to determine whether each of a plurality of unit areas of the pixel array is hit by any one of the plurality of reflected light spots according to the position information of the plurality of read light spots on the pixel array.

7. The imaging system according to claim 6, wherein, in the non-energy-saving mode, the pixel array performs sensing a plurality of times, and the area dividing unit is further configured to count the probability that each of the plurality of unit areas of the pixel array is hit by any of the plurality of reflection light spots, and to classify the plurality of unit areas as first type areas or second type areas according to the probability.

8. The imaging system of claim 7, wherein the plurality of unit areas are a plurality of pixel rows of the pixel array.

9. The imaging system of claim 7, wherein the area dividing unit is further configured to update the position information of the plurality of first type areas stored in the storage.

10. The imaging system of claim 1, wherein the storage is further configured to store position information of the plurality of first type areas obtained by the area dividing unit at different temperatures of the image sensor.

11. The imaging system of claim 10, wherein, in the energy-saving mode, the time-of-flight analysis unit is further configured to obtain from the storage the position information of the plurality of first type areas corresponding to the temperature of the image sensor, and accordingly read the sensing results of the pixels in the plurality of first type areas of the pixel array.

12. The imaging system of claim 1, further comprising:

a light emitting module, configured to emit an optical signal to the target to produce the light reflection signal.

13. The imaging system of claim 12, wherein the storage is further configured to store the position information of the plurality of first type areas obtained by the area dividing unit at different temperatures of the light emitting module.

14. The imaging system of claim 13, wherein, in the energy-saving mode, the time-of-flight analysis unit is further configured to obtain from the storage the position information of the plurality of first type areas corresponding to the temperature of the light emitting module, and accordingly read the sensing results of the pixels in the plurality of first type areas of the pixel array.

15. The imaging system of claim 1, wherein the plurality of reflection light spots form a plurality of light spot rows extending in the first predetermined direction.

16. The imaging system of claim 12, wherein the light emitting module is arranged at one side of the pixel array along the first predetermined direction.

17. The imaging system of claim 16, wherein a line connecting a center of the light emitting module and a center of the pixel array is parallel to the first predetermined direction.

18. The imaging system of claim 12, wherein the light emitting module comprises: a light source for outputting a light signal; and

an optical element for changing the traveling path of the light signal output by the light source to generate the optical signal.

19. An electronic device, comprising:

the imaging system of any of claims 1 to 18.

Technical Field

The present disclosure relates to sensing systems, and more particularly to an imaging system and related electronic device.

Background

The time of flight (TOF) ranging technique continuously transmits an optical signal from a transmitting end to a target object and receives the optical signal returned from the target object at a receiving end, thereby calculating the time of flight of the optical signal from the transmitting end to the receiving end and obtaining the distance between the target object and the transmitting/receiving end. Time-of-flight ranging can be roughly divided into two schemes, point light source and surface light source; the point light source scheme concentrates energy on a limited number of light spots and is therefore suitable for long-distance applications. The present application improves on the point light source scheme to meet low-power-consumption requirements without affecting the accuracy of the time-of-flight ranging technique.
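For illustration only (not part of the claimed subject matter), the basic distance relation described above can be sketched as follows; the helper name is an assumption, and the one-way distance is half the round-trip path:

```python
# Sketch of the basic time-of-flight distance relation described above.
# Hypothetical helper; not part of the disclosed imaging system.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def tof_distance_m(time_of_flight_s: float) -> float:
    """Distance to the target from a measured round-trip time of flight.

    The optical signal travels to the target and back, so the one-way
    distance is half the round-trip path length.
    """
    return SPEED_OF_LIGHT_M_PER_S * time_of_flight_s / 2.0

# Example: a round trip of ~6.67 ns corresponds to a target ~1 m away.
print(tof_distance_m(6.67e-9))
```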

Disclosure of Invention

An objective of the present application is to disclose an imaging system, an electronic device, and an operating method of the imaging system that address the above problem.

An embodiment of the present application discloses an imaging system, including: an image sensor, comprising: a pixel array having a plurality of pixel rows extending in a first predetermined direction and a plurality of pixel columns extending in a second predetermined direction, the first predetermined direction being perpendicular to the second predetermined direction, the pixel array being configured to sense a light reflection signal reflected by a target to the pixel array, the light reflection signal including a plurality of reflection light spots; a trigger unit for controlling the image sensor to switch between an energy-saving mode and a non-energy-saving mode; a time-of-flight analysis unit, configured to read the pixel array and accordingly generate position information of a plurality of read light spots on the pixel array and the time of flight of the plurality of read light spots, and accordingly obtain depth information of the target, wherein the read light spots are part or all of the plurality of reflection light spots; an area dividing unit, configured to divide, in the non-energy-saving mode, the pixel array into a plurality of first type areas and a plurality of second type areas according to the position information of the plurality of read light spots on the pixel array; and a storage, coupled to the time-of-flight analysis unit and the area dividing unit, for storing position information of the plurality of first type areas; wherein the time-of-flight analysis unit reads only the sensing results of pixels in the plurality of first type areas of the pixel array in the energy-saving mode, and reads the sensing results of all pixels in the pixel array in the non-energy-saving mode.

An embodiment of the present application discloses an electronic device, which includes the aforementioned imaging system.

The imaging system, the related electronic device, and the operating method of the imaging system can reduce power consumption and increase speed without compromising accuracy.

Drawings

Fig. 1 is a schematic view of a first embodiment of an imaging system of the present application.

Fig. 2 is a schematic view of a second embodiment of the imaging system of the present application.

Fig. 3 is a schematic diagram of an embodiment in which a plurality of light spots are irradiated on a pixel array.

FIG. 4 is a diagram illustrating an embodiment of multiple light spots illuminating a pixel array after imaging system variation.

Fig. 5 is a schematic diagram of an embodiment of an electronic device of the present application.

Detailed Description

The following disclosure provides various embodiments or illustrations that can be used to implement various features of the disclosure. The embodiments of components and arrangements described below serve to simplify the present disclosure. It is to be understood that such descriptions are merely illustrative and are not intended to limit the present disclosure. For example, in the description that follows, forming a first feature on or over a second feature may include certain embodiments in which the first and second features are in direct contact with each other; and may also include embodiments in which additional elements are formed between the first and second features described above, such that the first and second features may not be in direct contact. In addition, the present disclosure may repeat reference numerals and/or characters in the various embodiments. This repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed.

Moreover, spatially relative terms, such as "under," "below," "over," "above," and the like, may be used herein to facilitate describing a relationship between one element or feature relative to another element or feature as illustrated in the figures. These spatially relative terms are intended to encompass a variety of different orientations of the device in use or operation in addition to the orientation depicted in the figures. The device may be otherwise oriented (e.g., rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.

Although numerical ranges and parameters setting forth the broad scope of the application are approximations, the numerical values set forth in the specific examples are reported as precisely as possible. Any numerical value, however, inherently contains certain standard deviations found in its respective testing measurement. As used herein, "substantially the same" generally means that the actual value is within plus or minus 10%, 5%, 1%, or 0.5% of a particular value or range; alternatively, it means that the actual value falls within the acceptable standard error of the mean, as considered by those of ordinary skill in the art to which this application pertains. Unless otherwise specifically indicated, all ranges, amounts, values, and percentages used herein (e.g., to describe amounts of materials, lengths of time, temperatures, operating conditions, quantitative ratios, and the like) are to be understood as approximate. Accordingly, unless indicated to the contrary, the numerical parameters set forth in the specification and attached claims are approximations that may vary depending upon the desired properties sought to be obtained. At the very least, each numerical parameter is to be understood in light of the number of significant digits recited and by applying ordinary rounding techniques. Herein, numerical ranges are expressed from one endpoint to the other or between the two endpoints; unless otherwise indicated, all numerical ranges set forth herein are inclusive of the endpoints.

Fig. 1 is a functional block diagram schematic of a first embodiment of an imaging system 100 of the present disclosure. The imaging system 100 may be implemented by a three-dimensional imaging system for obtaining depth information (or depth image) of surrounding objects. By way of example, but not limiting of the present disclosure, the imaging system 100 may be a time-of-flight imaging system that may obtain depth information of the target object 102 by measuring a distance between the target object 102 and the imaging system 100. It is noted that in some embodiments, the imaging system 100 may be a three-dimensional imaging system, which can determine the depth information of the target object 102 according to the pattern deformation of the light reflection signal received by the receiving end. For the sake of brevity, the imaging scheme of the present disclosure is described below in terms of an embodiment in which the imaging system 100 is implemented as a time-of-flight imaging system. However, those skilled in the art will appreciate that the imaging scheme of the present disclosure can be applied to other three-dimensional imaging systems that obtain depth images from optical signals at the transmitting end and the receiving end.

The imaging system 100 employs a point light source scheme and includes (but is not limited to) a light emitting module 110 and an image sensor 120. The light emitting module 110 is configured to generate an optical signal LS, wherein the optical signal LS may have a predetermined pattern such that energy is concentrated in the predetermined pattern; for example, the predetermined pattern may be a speckle array, with the optical energy concentrated in each scattered spot of the array. The light emitting module 110 may include a light source 112 and an optical element 114. The optical element 114 may be used to change the traveling path, illumination range, and the like of the light signal LI output by the light source 112, thereby generating the light signal LS having the predetermined pattern. In this embodiment, the projection of the light signal LS on the target object 102 may form a plurality of light spots separated from each other, to reduce the influence of background noise on the measurement result.

By way of example, and not limitation, the optical source 112 may include a Vertical-Cavity Surface-Emitting Laser (VCSEL) array, and the optical element 114 may include a Diffractive Optical Element (DOE) or a Refractive Optical Element (ROE) for cone-diffracting (or cone-refracting) the optical signal LI to generate the optical signal LS, such that the optical signal LS may be projected on the target 102 to form a plurality of spots separated from each other. In some embodiments, a collimating lens is further included between the optical source 112 and the optical element 114 for shaping the optical signal LI into parallel light.

The image sensor 120 is configured to sense a light reflection signal LR returned from the target object 102 to obtain image information of the target object 102, wherein the light reflection signal LR is generated by the light signal LS being reflected by the target object 102. In this embodiment, the image sensor 120 includes, but is not limited to, a pixel array 122, a time-of-flight analysis unit 124, a trigger unit 125, a region division unit 126, and a storage 128. The present application does not limit how the time-of-flight analysis unit 124, the trigger unit 125, and the area dividing unit 126 are implemented; for example, in some embodiments each may be implemented by a dedicated circuit, while in other embodiments they may be software modules executed by a computing unit. Referring to fig. 3, the pixel array 122 has a plurality of pixel rows extending in the first predetermined direction X and a plurality of pixel columns extending in the second predetermined direction Y, such as a first pixel row composed of pixels PD00 through PD09, a second pixel row composed of pixels PD10 through PD19, a third pixel row composed of pixels PD20 through PD29, and so on; and a first pixel column composed of pixels PD00 through PD90, a second pixel column composed of pixels PD01 through PD91, a third pixel column composed of pixels PD02 through PD92, and so on. The first predetermined direction X is perpendicular to the second predetermined direction Y, and the pixel array 122 is used for sensing the light reflection signal LR.
It is noted that the optical signal LS may form a plurality of light spots separated from each other on the surface of the target object 102; these light spots are reflected to the pixel array 122 and form a plurality of reflected light spots separated from each other on the pixel array 122, such as the black dots in fig. 3 (in an actual image the spots appear brighter; the black dots in the figure are only schematic), and each reflected light spot may irradiate at least one pixel.

In the present embodiment, the light emitting module 110 is disposed adjacent to the image sensor 120, arranged side by side at one side of the pixel array 122 along the first predetermined direction X, and a line connecting the center of the light emitting module 110 (i.e., the center of the light source 112 and/or the optical element 114) and the center of the pixel array 122 is parallel to the first predetermined direction X. The time-of-flight analyzing unit 124 is coupled to the pixel array 122 for reading data of pixel units in the pixel array 122 and obtaining position information of a plurality of read light spots on the pixel array 122. Since the imaging system 100 adopts a point light source scheme, although each light spot is displaced to a different degree depending on depth, on the premise that the distance between the target 102 and the imaging system 100 is within an allowable range, the time-of-flight analysis unit 124 can still determine which light spot of the optical signal LS emitted by the light emitting module 110 each read light spot corresponds to, by comparing against the light spot pattern of the optical signal LS. Therefore, the time-of-flight analysis unit 124 can obtain the time of flight of the plurality of read light spots according to the light sensing signals of the pixel units irradiated by the read light spots, and accordingly obtain the depth information of the target object 102. In order to reduce the power consumption of the image sensor 120, the time-of-flight analysis unit 124 selectively decides which pixels in the pixel array 122 to read, and thus the read light spots are selectively part or all of the plurality of reflected light spots.
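For illustration only, the correspondence between read light spots and the emitted pattern described above might be sketched as a nearest-neighbour comparison within an allowable displacement; all names and the data layout are assumptions, not part of the disclosure:

```python
# Sketch: match each read light spot to the closest spot of the emitted
# pattern, assuming displacement stays within an allowable range.
# Hypothetical illustration only; not the claimed implementation.

def match_spots(read_spots, pattern_spots, max_offset):
    """Map each read spot (x, y) to the index of the closest pattern spot.

    Spots farther than max_offset from every pattern spot are left
    unmatched (None), e.g. when the target is out of the allowed range.
    """
    matches = []
    for rx, ry in read_spots:
        best_i, best_d2 = None, max_offset ** 2
        for i, (px, py) in enumerate(pattern_spots):
            d2 = (rx - px) ** 2 + (ry - py) ** 2
            if d2 <= best_d2:
                best_i, best_d2 = i, d2
        matches.append(best_i)
    return matches

pattern = [(0, 0), (10, 0), (20, 0)]
reads = [(1, 1), (21, -1)]
print(match_spots(reads, pattern, max_offset=3))  # [0, 2]
```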

Specifically, the storage 128 is coupled to the time-of-flight analysis unit 124 and the region dividing unit 126, and stores position information of a plurality of first type regions, recording which pixels in the pixel array 122 need to be read. When the image sensor 120 operates in the energy-saving mode (which in some cases is the default mode), the time-of-flight analysis unit 124 reads only the sensing results of the pixels in the plurality of first type regions of the pixel array 122, according to the position information stored in the storage 128; that is, only a partial scan of the pixel array 122 is performed, saving power, and the speed can also be increased since less information needs to be processed. When the image sensor 120 operates in the non-energy-saving mode, the time-of-flight analysis unit 124 reads the sensing results of all pixels in the pixel array 122; that is, the pixel array 122 is scanned completely to establish a database of the pixels hit by the reflected light spots. For example, fig. 3 includes a first type region 302 and a first type region 306, each enclosed by dashed lines.
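For illustration only, the partial-scan versus full-scan behaviour described above can be sketched as follows; function and variable names are assumptions, and unit areas are taken to be pixel rows:

```python
# Sketch of the partial-scan idea: in the energy-saving mode only the
# rows recorded as first type regions are read, while the
# non-energy-saving mode reads every row. Names are illustrative.

def read_pixel_array(pixel_rows, first_type_rows, energy_saving):
    """Return the sensing results the time-of-flight analysis unit reads.

    pixel_rows:      list of per-row sensing results (one entry per row).
    first_type_rows: row indices stored as first type regions.
    energy_saving:   True -> partial scan, False -> full scan.
    """
    if energy_saving:
        return {r: pixel_rows[r] for r in first_type_rows}
    return {r: row for r, row in enumerate(pixel_rows)}

rows = ["row0", "row1", "row2", "row3"]
print(read_pixel_array(rows, first_type_rows=[1, 3], energy_saving=True))
# {1: 'row1', 3: 'row3'}
print(len(read_pixel_array(rows, [1, 3], energy_saving=False)))  # 4
```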

Therefore, the image sensor 120 can switch between the energy-saving mode and the non-energy-saving mode to reduce the power consumption of the time-of-flight analysis unit 124 during a general operation phase (e.g., when a user takes a picture with the device). In some embodiments, the image sensor 120 may further include a pre-operation phase performed before the general operation phase, in which the time-of-flight analysis unit 124 scans the pixel array 122 so that the region division unit 126 can establish a database of useful pixels. Specifically, the pre-operation phase may take place before the imaging system 100 is actually used. For example, in the pre-operation phase, a flat reference target 102 without concave-convex depth returns the light reflection signal LR to the pixel array 122, and the time-of-flight analysis unit 124 reads the sensing results of all pixels in the pixel array 122 to detect all positions at which the reflection spots irradiate the pixel array 122; that is, the plurality of read spots includes all of the plurality of reflection spots. The area dividing unit 126 then divides the pixels of the pixel array 122 into the plurality of first type areas (areas to be read under energy-saving operation) and the plurality of second type areas (areas not to be read under energy-saving operation) according to the position information of the read spots, and stores the position information of the plurality of first type areas in the storage 128 for use in the general operation phase that follows. For example, the areas outside the first type areas 302 and 306 in fig. 3 are the second type areas. The pre-operation phase may be completed before the imaging system leaves the factory and is generally not open to the end user.

When dividing the pixel array 122 into the plurality of first type regions and the plurality of second type regions, the region dividing unit 126 may first divide the pixel array 122 into a plurality of unit areas; for example, each unit area may comprise one pixel row of the pixel array 122. The area dividing unit 126 determines whether each unit area of the pixel array is hit by any of the reflective light spots according to the position information of the read light spots on the pixel array 122. In some embodiments, a unit area may be a single pixel of the pixel array 122 or another size. Generally, the image sensor 120 performs sensing multiple times to accumulate statistical data, and the objects targeted by these sensings may have different distances and depths, making the result obtained by the region dividing unit 126 more representative. For example, the area dividing unit 126 may determine the probability that each unit area of the pixel array 122 is hit by any of the plurality of reflective light spots, divide the unit areas into the first type areas and the second type areas accordingly, and store the obtained position information of the first type areas in the storage 128. For example, a unit area hit by any of the plurality of reflection light spots with a probability of more than 20% is classified as a first type area, and the remaining unit areas are classified as second type areas; that is, each unit area classified as a first type area is hit by a reflected light spot at least 20 times per hundred sensings of the image sensor 120. In some embodiments, the trend of the hit probability of each unit area may further be calculated; for example, a unit area whose hit probability trends downward may be reclassified as a second type area.
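For illustration only, the probability-based division described above can be sketched as follows, using the 20% example threshold from the text; the function name and data layout are assumptions:

```python
# Sketch of dividing unit areas (e.g. pixel rows) into first and second
# type regions by hit probability over many sensings. Illustrative only.

def divide_regions(hit_counts, num_sensings, threshold=0.2):
    """Classify each unit area by the probability it was hit by a spot.

    hit_counts[i] is how many of num_sensings sensings hit unit area i.
    Returns (first_type, second_type) lists of unit-area indices.
    """
    first, second = [], []
    for i, hits in enumerate(hit_counts):
        if hits / num_sensings > threshold:
            first.append(i)
        else:
            second.append(i)
    return first, second

# Over 100 sensings: rows hit 90, 25, 10 and 0 times respectively.
print(divide_regions([90, 25, 10, 0], num_sensings=100))
# ([0, 1], [2, 3])
```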

In the general operation phase, the image sensor 120 may enter the energy-saving mode to reduce the power consumption of the time-of-flight analysis unit 124. On the premise that the distance between the target object 102 and the imaging system 100 is within the allowable range, the plurality of read light spots ideally includes all of the plurality of reflection light spots; that is, no reflection light spot falls outside the plurality of first type areas. However, if the elements of the imaging system 100 change in shape or relative position due to an external force (e.g., impact) or other factors (hereinafter referred to as variation of the imaging system 100), reflected light spots may fall outside the plurality of first type areas. In response, the present application further proposes a detection mechanism for determining when the region partitioning unit 126 should be triggered to update the position information of the plurality of first type regions stored in the storage 128.

In the embodiment of fig. 1, the triggering unit 125 is coupled to the time-of-flight analyzing unit 124 and the area dividing unit 126. In the general operation phase, the image sensor 120 may first enter the energy-saving mode (i.e., the default energy-saving mode), in which the time-of-flight analysis unit 124 reads the pixel units of the first type areas according to the first type area information stored during the pre-operation phase. The triggering unit 125 determines which of the plurality of reflected light spots are not among the plurality of read light spots according to the position information of the read light spots on the pixel array 122 provided by the time-of-flight analyzing unit 124. As mentioned above, after the imaging system 100 varies, the physical displacement of elements in the imaging system 100 may shift the positions at which the reflected light spots irradiate the pixel array 122, changing the situation from fig. 3 to fig. 4: reflected light spots 312 and 314 in fig. 4 leave the originally planned first type areas 302 and 306, so these two reflected light spots are not read by the time-of-flight analysis unit 124 in the energy-saving mode. In other words, the triggering unit 125 finds out which of the plurality of reflected light spots are not read by the time-of-flight analyzing unit 124.

In the energy-saving mode, the pixel array 122 performs sensing multiple times. When the number of sensings is large enough to cover different situations (for example, different object depths), the trigger unit 125 has sufficiently rich information to count, for each reflected light spot, the number of consecutive sensings in which that spot is not read. When the number of reflected light spots whose consecutive-unread count exceeds the first threshold TH1 is greater than the second threshold TH2, it indicates that more than TH2 reflected light spots likely no longer fall within the plurality of first type regions because of variation of the imaging system 100; in other words, the degree of variation of the imaging system 100 exceeds the allowable threshold. At this time, the triggering unit 125 controls the time-of-flight analyzing unit 124 to switch to the non-energy-saving mode to re-estimate the position information of the first type areas.
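For illustration only, the trigger condition just described can be sketched as follows; the counters and function names are assumptions, not the claimed circuit:

```python
# Sketch of the trigger condition: switch to the non-energy-saving mode
# when more than TH2 spots have gone unread for more than TH1
# consecutive sensings. Names are illustrative.

def update_unread_streaks(streaks, read_spot_ids):
    """Update per-spot consecutive-unread counters after one sensing."""
    return {spot: 0 if spot in read_spot_ids else streak + 1
            for spot, streak in streaks.items()}

def should_switch(streaks, th1, th2):
    """True when the number of spots unread for more than th1
    consecutive sensings exceeds th2."""
    return sum(1 for s in streaks.values() if s > th1) > th2

streaks = {"A": 0, "B": 0, "C": 0}
for read in [{"A"}, {"A"}, {"A"}]:     # B and C unread 3 times in a row
    streaks = update_unread_streaks(streaks, read)
print(should_switch(streaks, th1=2, th2=1))  # True: B and C both exceed 2
```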

In some embodiments, the trigger unit 125 instead counts the probability that each reflected light spot is not read. When the number of reflected light spots whose unread probability is greater than the third threshold TH3 exceeds the fourth threshold TH4, it indicates that more than TH4 reflected light spots likely no longer fall within the plurality of first type regions because of variation of the imaging system 100; in other words, the severity of the variation exceeds the allowable threshold. At this time, the triggering unit 125 controls the time-of-flight analyzing unit 124 to switch to the non-energy-saving mode to perform a complete scan of the pixel array 122, that is, to read the sensing results of all pixels in the pixel array 122; the triggering unit 125 also controls the area dividing unit 126 to switch to the non-energy-saving mode to re-estimate the position information of the plurality of first type areas and update the storage 128 once the estimation is complete.
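For illustration only, the alternative probability-based trigger can be sketched as follows; how the unread counts are kept is an assumption:

```python
# Sketch of the probability-based trigger: switch modes when more than
# TH4 reflected light spots have an unread probability above TH3.
# Illustrative only.

def unread_probability(unread_count, num_sensings):
    return unread_count / num_sensings

def should_switch_by_probability(unread_counts, num_sensings, th3, th4):
    """unread_counts[spot] = sensings in which that spot was not read."""
    over = sum(1 for c in unread_counts.values()
               if unread_probability(c, num_sensings) > th3)
    return over > th4

counts = {"A": 2, "B": 60, "C": 75}      # out of 100 sensings
print(should_switch_by_probability(counts, 100, th3=0.5, th4=1))  # True
```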

In the non-energy-saving mode, the pixel array 122 likewise performs sensing multiple times; in the embodiment of fig. 1, once the non-energy-saving mode is entered, the pixel array 122 keeps sensing until the update of the first type regions is completed. In the non-energy-saving mode, the region dividing unit 126 determines, according to the position information of the plurality of read light spots on the pixel array 122, whether each of a plurality of unit regions of the pixel array 122 is hit by any of the plurality of reflected light spots, counts the probability that each unit region is hit by any of the plurality of reflected light spots, and classifies each unit region as a first type region or a second type region according to that probability. The storage 128 then updates the position information of the plurality of first type regions stored therein. In the non-energy-saving mode, the operation of the region dividing unit 126 may be substantially the same as, or similar to, its operation in the pre-operation stage.
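The classification step can be sketched as a threshold on each unit region's hit ratio. This is an illustrative reconstruction; the dictionary representation of unit regions and the `hit_ratio_threshold` tuning parameter are assumptions not specified by the patent.

```python
def classify_regions(hit_counts, num_frames, hit_ratio_threshold):
    """Classify each unit region of the pixel array as first type
    (frequently hit by a reflected light spot) or second type, based on
    per-region hit counts accumulated over num_frames full scans.

    hit_counts maps a unit-region identifier to the number of frames in
    which that region was hit by any reflected light spot.
    """
    first_type, second_type = [], []
    for region, hits in hit_counts.items():
        if hits / num_frames >= hit_ratio_threshold:
            first_type.append(region)
        else:
            second_type.append(region)
    return first_type, second_type
```

In the energy-saving mode, only the pixels inside the returned first type regions would then be read.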

The embodiment of fig. 2 differs from the embodiment of fig. 1 in that the trigger unit 125 does not use the information obtained by the time-of-flight analyzing unit 124 to decide when to leave the energy-saving mode; instead, the trigger unit 125 of fig. 2 enters the non-energy-saving mode at a fixed period. In other words, the trigger unit 125 does not need to judge whether the variation of the imaging system 100 is severe, but simply updates the storage 128 automatically after a fixed number of sensing operations. For example, in the energy-saving mode, every time the pixel array 122 performs sensing M times, i.e., performs partial scans on M consecutive frames, the trigger unit 125 controls the image sensor 120 to switch to the non-energy-saving mode, so that the pixel array 122 performs sensing N times, i.e., performs full scans on N consecutive frames, where M and N are positive integers. After k such cycles (k is a positive integer), the region dividing unit 126 counts, from the k × N complete scans obtained from the time-of-flight analyzing unit 124, the probability that each unit region of the pixel array 122 is hit by any of the reflected light spots, and classifies the unit regions into first type regions or second type regions accordingly. The storage 128 then updates the position information of the plurality of first type regions stored therein.
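The fixed-period schedule of fig. 2 alternates M partial scans with N full scans. The helper below only illustrates that frame pattern; the function name and the string labels are assumptions.

```python
def scan_schedule(m, n, num_frames):
    """Generate the fixed-period scan pattern of the fig. 2 embodiment:
    M partial scans in the energy-saving mode followed by N full scans
    in the non-energy-saving mode, repeating indefinitely."""
    pattern = ["partial"] * m + ["full"] * n
    return [pattern[i % (m + n)] for i in range(num_frames)]
```

For M = 2 and N = 1, the first six frames follow the pattern partial, partial, full, partial, partial, full.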

Since the imaging system 100 may exhibit different variations at different temperatures, in some embodiments the storage 128 may hold separate position information of the first type regions for different temperature ranges, for example established separately for different temperatures of the image sensor 120 or of the light emitting module 110. For example, the region dividing unit 126 may establish the position information of the plurality of first type regions for the image sensor 120 at 20 degrees Celsius or below, and again above 20 degrees Celsius, yielding two sets of position information; in the energy-saving mode, the time-of-flight analyzing unit 124 then selects one of the two sets in the storage 128 according to the temperature of the image sensor 120 and reads the pixel array 122 accordingly. As another example, the region dividing unit 126 may establish the position information separately for the light emitting module 110 at 0 degrees Celsius or below, between 0 and 20 degrees Celsius, and above 20 degrees Celsius, yielding three sets of position information; in the energy-saving mode, the time-of-flight analyzing unit 124 then selects one of the three sets in the storage 128 according to the temperature of the light emitting module 110 and reads the pixel array 122 accordingly.
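The temperature-based lookup amounts to selecting the stored region map whose range covers the current temperature. The sketch below is illustrative; the list-of-ranges representation and the half-open interval convention are assumptions, with boundaries matching the 0 °C / 20 °C example above.

```python
def select_region_map(temp_c, maps_by_range):
    """Pick the stored first-type-region position information whose
    temperature range covers temp_c.

    maps_by_range is a list of ((low, high), region_map) pairs; a range
    covers temperatures low < t <= high.
    """
    for (low, high), region_map in maps_by_range:
        if low < temp_c <= high:
            return region_map
    raise ValueError("no region map covers this temperature")
```

In the three-set example, the time-of-flight analyzing unit would consult this lookup with the light emitting module's temperature before each partial scan.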

Fig. 5 is a schematic diagram of an embodiment in which the imaging system 100 is applied in an electronic device 500. In some embodiments, the electronic device 500 may be any electronic device, such as a smart phone, a personal digital assistant, a handheld computer system, or a tablet computer.

The foregoing description has set forth briefly the features of certain embodiments of the present application so that those skilled in the art may more fully appreciate the various aspects of the present disclosure. Those skilled in the art should appreciate that they may readily use the present disclosure as a basis for designing or modifying other processes and structures for carrying out the same purposes and/or achieving the same advantages of the embodiments introduced herein. Those skilled in the art should understand that they can still make various changes, substitutions and alterations herein without departing from the spirit and scope of the present disclosure.
