Sensor system architecture with feedback loop and multiple power states

Document No.: 1836569 · Publication date: 2021-11-12

Reading note: This technology, "Sensor system architecture with feedback loop and multiple power states," was designed and created by W·尼斯迪克 on 2020-03-20. Abstract: In one implementation, a system includes an image pipeline and an event sensor having an array of pixels. The pixel array is configured to operate a first subset of pixels in an enabled state and a second subset of pixels in a non-enabled state. The event sensor is configured to output pixel events. Each respective pixel event is generated in response to a particular pixel within the first subset of pixels detecting a change in light intensity that exceeds a comparator threshold. The image pipeline is configured to consume image data derived from the pixel events and to communicate feedback information to the event sensor based on the image data. The feedback information causes pixels within the first subset of pixels to transition from the enabled state to another state.

1. A system, comprising:

an event sensor having an array of pixels configured to operate a first subset of pixels in an enabled state and a second subset of pixels in a non-enabled state, the event sensor configured to output pixel events, each respective pixel event generated in response to a particular pixel within the first subset of pixels detecting a change in light intensity that exceeds a comparator threshold; and

an image pipeline configured to consume image data derived from the pixel events and communicate feedback information to the event sensor based on the image data, the feedback information causing pixels within the first subset of pixels to transition from the enabled state to another state.

2. The system of claim 1, wherein the feedback information corresponds to a region of interest tracked by the image pipeline within the image data.

3. The system of any of claims 1-2, wherein each pixel in the second subset of pixels operates in an off power mode.

4. The system of any of claims 1-3, wherein the pixel array further comprises a third subset of pixels in a standby state, each pixel in the third subset of pixels operating in a low power mode.

5. The system of claim 4, wherein the feedback information causes the pixels within the first subset of pixels to transition to the standby state.

6. A system, comprising:

an event sensor having a pixel array configured to have an enabled pixel region operating in a full power mode and a non-enabled pixel region operating in an off power mode or a low power mode;

a processor; and

a computer-readable storage medium comprising instructions that, when executed by the processor, cause the system to perform operations comprising:

outputting pixel events from the event sensor to an image pipeline, each respective pixel event being generated in response to a particular pixel within the enabled pixel region detecting a change in light intensity that exceeds a comparator threshold;

receiving feedback information from the image pipeline based on image data derived from the pixel events; and

directing pixels within the enabled pixel region to operate in the off power mode or the low power mode in response to receiving the feedback information.

7. The system of claim 6, wherein each pixel within the non-enabled pixel region operates in the off power mode, and wherein the pixel array further comprises a standby pixel region operating in the low power mode.

8. The system of claim 7, wherein the standby pixel region is interposed between the enabled pixel region and the non-enabled pixel region within the pixel array.

9. The system of any of claims 6-8, wherein a subset of pixels within the enabled pixel region operate in the off power mode or the low power mode.

10. The system of any of claims 6 to 9, wherein outputting the pixel events comprises:

outputting pixel event bins from the event sensor to the image pipeline.

11. The system of claim 10, wherein the pixel event bins are output from the event sensor on a periodic basis.

12. The system of claim 11, wherein the periodic basis is synchronized with a global read operation or a global reset operation of the event sensor.

13. The system of claim 10, wherein the event sensor outputs each pixel event bin after generating a predefined number of pixel events.

14. The system of claim 10, wherein each pixel event bin is output as a two-dimensional tile of pixel events.

15. The system of claim 10, wherein each pixel event bin is output from the event sensor as a pixel event list.

16. A system, comprising:

a processor;

an image pipeline; and

a computer-readable storage medium comprising instructions that, when executed by the processor, cause the system to perform operations comprising:

receiving, by the image pipeline, pixel events from an event sensor having a pixel array including a first subset of pixels in an enabled state and a second subset of pixels in a non-enabled state, each respective pixel event being generated in response to a particular pixel within the first subset of pixels detecting a change in light intensity that exceeds a comparator threshold;

deriving image data from the pixel events using the image pipeline; and

generating, with the image pipeline, feedback information based on the image data, the feedback information causing the event sensor to direct the pixels within the first subset of pixels to transition from the enabled state to another state.

17. The system of claim 16, wherein generating the feedback information comprises:

tracking a region of interest within the image data using the image pipeline.

18. The system of any of claims 16 to 17, further comprising:

a light source configured to emit light toward a scene disposed within a field of view of the event sensor.

19. The system of claim 18, wherein the instructions, when executed, further cause the system to perform additional operations comprising:

pulsing the light source at a defined frequency such that pixels within the first subset of pixels generate event data at a rate proportional to the defined frequency.

20. The system of any of claims 16 to 19, wherein the feedback information is an enabled region offset that defines a region of the pixel array that corresponds to a region of interest tracked by the image pipeline within the image data.

Technical Field

The present disclosure relates generally to the field of image processing, and in particular, to techniques for implementing a sensor system architecture having a feedback loop and a sensor configured to support multiple power states.

Background

An event camera may include an image sensor referred to as a dynamic vision sensor ("DVS"), a silicon retina, an event-based sensor, or a frameless sensor. In contrast to a frame-based camera, which outputs data about the absolute light intensity at each pixel, an event camera generates (and transmits) data only about changes in light intensity at each pixel sensor. In other words, while the illumination level of a scene disposed within the field of view remains stable, a frame-based camera continues to generate (and transmit) data regarding the absolute light intensity at each pixel, whereas an event camera refrains from generating or transmitting data until a change in illumination level is detected.

Some image processing operations utilize incomplete image data sets derived from pixel events output by event-driven sensors. Such operations may, for example, crop the image data and process only the cropped portion to improve computational efficiency and save power. However, the pixels of the event-driven sensor that correspond to image data outside of the cropped image data continue to operate and thus continue to consume power. It is therefore desirable to address this inefficiency that arises when image processing operations utilize incomplete image data sets derived from pixel events output by event-driven sensors.

Disclosure of Invention

Various implementations disclosed herein relate to techniques for implementing an event camera system architecture having a feedback loop and event driven sensors configured to support multiple power states. In one implementation, a system includes an image pipeline and an event sensor having an array of pixels. The pixel array is configured to operate a first subset of pixels in an enabled state and a second subset of pixels in a non-enabled state. The event sensor is configured to output pixel events. Each respective pixel event is generated in response to a particular pixel within the first subset of pixels detecting a change in light intensity that exceeds a comparator threshold. The image pipeline is configured to consume image data derived from the pixel events and communicate feedback information to the event sensor based on the image data. The feedback information causes pixels within the first subset of pixels to transition from the enabled state to another state.

In another implementation, a system includes an event sensor, a processor, and a computer-readable storage medium. The event sensor includes a pixel array configured to have an enabled pixel region operating in a full power mode and a non-enabled pixel region operating in an off power mode or a low power mode. The computer-readable storage medium includes instructions that, when executed by the processor, cause the system to perform operations. The operations include outputting pixel events from the event sensor to an image pipeline. Each respective pixel event is generated in response to a particular pixel within the enabled pixel region detecting a change in light intensity that exceeds a comparator threshold. The operations also include receiving feedback information from the image pipeline based on image data derived from the pixel events and, in response to receiving the feedback information, directing pixels within the enabled pixel region to operate in the off power mode or the low power mode.

In another implementation, a system includes a processor, an image pipeline, and a computer-readable storage medium including instructions that, when executed by the processor, cause the system to perform operations. The operations include receiving, by the image pipeline, pixel events from an event sensor having a pixel array including a first subset of pixels in an enabled state and a second subset of pixels in a non-enabled state. Each respective pixel event is generated in response to a particular pixel within the first subset of pixels detecting a change in light intensity that exceeds a comparator threshold. The operations also include deriving image data from the pixel events using the image pipeline and generating, with the image pipeline, feedback information based on the image data. The feedback information causes the event sensor to direct the pixels within the first subset of pixels to transition from the enabled state to another state.

Drawings

So that the present disclosure can be understood by those of ordinary skill in the art, a more particular description may be had by reference to certain illustrative implementations, some of which are illustrated in the accompanying drawings.

FIG. 1 illustrates a functional block diagram of an event sensor, according to some implementations.

FIG. 2 is a block diagram of an exemplary system for implementing an event driven sensor having a hardware architecture configured to support enabled, standby, and non-enabled operating states.

FIG. 3 shows an example of a complete image data set derived by the image pipeline from pixel events output by the event sensor.

FIG. 4 illustrates an example of cropped image data derived by the image pipeline from pixel events output by the event sensor.

FIG. 5 shows an example of different pixels within a pixel array of an event sensor having different operating states based on feedback information received from an image pipeline.

FIG. 6 shows an example of a pixel array having different pixels with different operating states, some of the pixels having operating states modified as feedback information received from the image pipeline is updated between a first time and a second time.

FIG. 7 illustrates a subset of pixels within the pixel array of FIG. 6 transitioning from one operating state to another as feedback information received from the image pipeline is updated between a first time and a second time.

FIG. 8 illustrates an exemplary two-dimensional ("2D") tile of pixel events that an event sensor outputs for further processing, according to some implementations.

FIG. 9 is a block diagram of an example head-mounted device (HMD), according to some implementations.

FIG. 10 is a flow diagram illustrating an example of a method of implementing an event camera system architecture having a feedback loop and event driven sensors configured to support multiple power states.

FIG. 11 is a flow diagram illustrating another example of a method of implementing an event camera system architecture having a feedback loop and event driven sensors configured to support multiple power states.

In accordance with common practice, the various features shown in the drawings may not be drawn to scale. Accordingly, the dimensions of the various features may be arbitrarily expanded or reduced for clarity. Additionally, some of the figures may not depict all of the components of a given system, method, or apparatus. Finally, throughout the specification and drawings, like reference numerals may be used to refer to like features.

Detailed Description

Numerous details are described in order to provide a thorough understanding of example implementations shown in the drawings. The drawings, however, illustrate only some example aspects of the disclosure and therefore should not be considered limiting. It will be apparent to one of ordinary skill in the art that other effective aspects or variations do not include all of the specific details described herein. Moreover, well-known systems, methods, components, devices, and circuits have not been described in detail so as not to obscure more pertinent aspects of the example implementations described herein.

A functional block diagram of an exemplary event sensor 100 is shown in FIG. 1. Event sensor 100 includes a plurality of pixels 105 positioned to receive light from a scene disposed within a field of view of event sensor 100. In FIG. 1, the plurality of pixels 105 are arranged in a matrix 107 of rows and columns, and thus, each pixel of the plurality of pixels 105 is associated with a row value and a column value. Each pixel of the plurality of pixels 105 includes a photodetector circuit 110 and an event circuit 180.

The photodetector circuit 110 is configured to generate a signal indicative of the intensity of light incident on the respective pixel 105 ("incident illumination"). To this end, the photodetector circuit 110 includes a photodiode 112 configured to generate a photocurrent proportional to the incident illumination intensity. The photocurrent generated by the photodiode 112 flows into a logarithmic amplifier 120 formed by transistors 121, 123, 125, and 127. The logarithmic amplifier 120 is configured to convert the photocurrent into a voltage at node A having a value that is a logarithm of the photocurrent value. The voltage at node A is then amplified by a buffer amplifier 130, formed by transistors 131 and 133, before being applied to the input side of the difference circuit 140 of the event circuit 180.

The pixel 105 also includes an event circuit 180 comprising the difference circuit 140, a comparator 160, and a controller 170. The difference circuit 140 comprises an alternating current ("AC") coupling capacitor 145 and a switched capacitor amplifier 150, and is configured to remove a direct current ("DC") voltage component from the voltage at node A to produce pixel data at sampling node B. Because the DC voltage component is removed from the voltage at node A, the pixel data at sampling node B provides a differential value of the incident illumination intensity detected by the photodiode 112. The gain provided by amplifier 151 corresponds to a ratio defined by the respective capacitance values of the AC coupling capacitor 145 and capacitor 153. A reset switch 155 is activated (i.e., transitions from an open state to a closed state) when a reset signal is received from the controller 170. Activating the reset switch 155 resets the operating point of the amplifier 151 to a reference voltage associated with the threshold of the comparator 160.

The comparator 160 is configured to provide pixel-level processing of the pixel data received from sampling node B. To this end, the comparator 160 outputs an electrical response (e.g., a voltage) when the pixel data received from sampling node B indicates that the photodiode 112 detected a change in incident illumination intensity that breaches a threshold. Conversely, the comparator 160 suppresses output of the electrical response when the pixel data received from sampling node B indicates that the photodiode 112 has not detected a change in incident illumination intensity that breaches the threshold. In some cases, the electrical response output by the comparator 160 is referred to as event data.

In one implementation, the comparator 160 is implemented using a plurality of comparators including a first comparator configured to output an electrical response indicative of a positive event (e.g., an event having a positive polarity) and a second comparator configured to output an electrical response indicative of a negative event (e.g., an event having a negative polarity). In one implementation, the first comparator outputs an electrical response when the pixel data received from sample node B indicates that the photodiode 112 detects a change in the incident illumination intensity that breaches a positive threshold. In one implementation, the second comparator outputs an electrical response when the pixel data received from sample node B indicates that the photodiode 112 detects a change in the incident illumination intensity that breaches the negative threshold.

The controller 170 is configured to cooperate with other components of the event sensor 100 (e.g., controllers within other pixels) to communicate an event signal (e.g., a sample of event data) to the event compiler 190 for each electrical response output by the comparator 160. In one implementation, the reset switch 155 receives a reset signal from the controller 170 each time the comparator 160 obtains pixel data at the sampling node B that breaches the threshold.

Event compiler 190 receives event signals (e.g., samples of event data) from the plurality of pixels 105, each indicating a change in incident illumination intensity that breaches a threshold. In response to receiving a sample of event data from a particular pixel of the plurality of pixels 105, the event compiler 190 generates a pixel event. A pixel event generated by event compiler 190 may be referred to as a "positive" pixel event when the event signal is associated with pixel data indicating a change in incident illumination intensity that breaches a positive threshold (or voltage). In one implementation, a positive pixel event is a pixel event of positive polarity whose magnitude is defined by a net increase in incident illumination intensity that exceeds an upper threshold or voltage ("Vth"). When the event signal is associated with pixel data indicating a change in incident illumination intensity that breaches a negative threshold (or voltage), the pixel event generated by the event compiler may be referred to as a "negative" pixel event. In one implementation, a negative pixel event is a pixel event of negative polarity whose magnitude is defined by a net decrease in incident illumination intensity that exceeds a lower threshold or voltage ("-Vth").

In addition, event compiler 190 populates the pixel event with information indicative of the electrical response included in the event signal (e.g., the value or polarity of the electrical response). In one implementation, event compiler 190 also populates pixel events with one or more of: timestamp information corresponding to a point in time at which a pixel event is generated and an address identifier corresponding to a particular pixel that sent an event signal that triggered the pixel event. The pixel event stream including each pixel event generated by the event compiler 190 may then be passed to an image pipeline (e.g., image or video processing circuitry) (not shown) associated with the event sensor 100 for further processing.
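Concretely, a pixel event of this kind can be modeled as a small record. The following is a minimal sketch in Python with illustrative field names; the disclosure does not prescribe any particular layout.

```python
from dataclasses import dataclass

@dataclass
class PixelEvent:
    """One pixel event as described above.

    Illustrative fields: the address identifier of the triggering pixel,
    the polarity of the electrical response, and a timestamp.
    """
    x: int            # column of the pixel that sent the event signal
    y: int            # row of the pixel that sent the event signal
    polarity: int     # +1 for a positive event, -1 for a negative event
    timestamp: float  # time at which the event compiler generated the event
```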

By way of example, the pixel event streams generated by event compiler 190 may be accumulated or otherwise combined to produce image data. In some implementations, the pixel event streams are combined to provide an intensity reconstructed image. In this implementation, an intensity reconstruction image generator (not shown) may accumulate pixel events over time to reconstruct/estimate absolute intensity values. As additional pixel events accumulate, the intensity reconstructed image generator changes the corresponding values in the reconstructed image. In this way, it generates and maintains an updated image of values for all pixels of the image, even though only some pixels may have recently received an event.
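As a rough illustration of this accumulation, the sketch below nudges a running per-pixel estimate by each event's polarity. Field names follow the PixelEvent sketch above; a real reconstruction would typically weight updates by the comparator threshold and apply filtering or decay.

```python
import numpy as np

def update_intensity_image(image: np.ndarray, events, step: float = 1.0) -> np.ndarray:
    """Accumulate pixel events into a reconstructed intensity image.

    Each event moves the estimate at its pixel up or down by a fixed
    step; pixels without recent events keep their previous values.
    """
    for ev in events:
        image[ev.y, ev.x] += step * ev.polarity
    return image
```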

In various implementations, event driven sensors are implemented with a hardware architecture configured to support enabled, standby, and non-enabled operating states. Generally, this involves the event sensor 210 outputting pixel events to the image pipeline 220 and, in response, receiving feedback information from the image pipeline 220, as shown in the exemplary system 200 of FIG. 2. Image pipeline 220 is configured to consume image data derived from pixel events output by event sensor 210. To this end, image pipeline 220 includes one or more components, such as the intensity reconstruction image generator discussed above with reference to FIG. 1, to derive image data from pixel events. One or more components of image pipeline 220 may be implemented using various combinations of hardware components (e.g., application specific integrated circuits, digital signal processors, etc.) and software components (e.g., noise reduction processes, image scaling processes, color space conversion processes, etc.).

In various implementations, the image pipeline 220 implements some functionality that utilizes incomplete image datasets derived from pixel events output by the event sensor 210. By way of example, the image pipeline 220 may further include a feature tracker configured to detect a feature depicted in the image data derived from the pixel events (e.g., using a technique such as SIFT, KAZE, etc.), and track the feature over time (e.g., using a technique such as a Kanade-Lucas-Tomasi tracker, a Shi-Tomasi tracker, etc.). In this example, the feature tracker of the image pipeline 220 may implement an eye tracking function by detecting and tracking gaze characteristics (e.g., pupil center, pupil contour, glint position, gaze direction, etc.) using image data depicting the user's eye derived from pixel events output by the event sensor 210.

FIG. 3 shows an example of a complete image data set 300 depicting a user's eye that the image pipeline 220 may derive from pixel events output by the event sensor 210. To implement the eye tracking functionality, the feature tracker of the image pipeline 220 has estimated the location of the pupil center ("estimated pupil center") 310 within the eye using a subset of the image data 300 residing in a region of interest 320. Processing the full image data set 300 to implement the eye tracking functionality may be computationally intensive for the feature tracker of the image pipeline 220 and may consume excessive power and computational resources. To improve computational efficiency and reduce power consumption, the feature tracker of the image pipeline 220 can instead process the subset of the image data residing in the region of interest 320. Image data residing outside of the region of interest 320 may be cropped to form cropped image data 400, as shown in FIG. 4.

Image pipeline 220 may be used to implement a technique for cropping image data residing outside of the region of interest 320. In accordance with this technique, image pipeline 220 may receive pixel events corresponding to the full field of view of event sensor 210. To form cropped image data 400, image pipeline 220 may ignore pixel events corresponding to image data residing outside of the region of interest 320, or may crop image data residing outside of the region of interest 320 after deriving the complete image data set 300. In either case, however, the pixels of event sensor 210 that generate pixel events corresponding to image data outside of the region of interest 320 continue to operate and thus continue to consume power. In addition, pixel events corresponding to image data residing outside of the region of interest 320 continue to consume bandwidth on the communication path between the event sensor 210 and the image pipeline 220. Thus, techniques that involve the event sensor 210 in cropping image data residing outside of the region of interest 320 may further reduce power and bandwidth consumption.
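As an illustration of the pipeline-side cropping described above, the sketch below ignores pixel events whose addresses fall outside the region of interest. The event fields follow the PixelEvent sketch above, and the (x0, y0, width, height) ROI convention is an assumption.

```python
def crop_events(events, roi):
    """Keep only pixel events whose address falls inside the region of
    interest; roi = (x0, y0, width, height) is an assumed convention.
    Events outside the ROI are simply ignored, not processed further."""
    x0, y0, w, h = roi
    return [ev for ev in events
            if x0 <= ev.x < x0 + w and y0 <= ev.y < y0 + h]
```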

To do so, the image pipeline 220 passes feedback information to the event sensor 210, as shown in FIG. 2. In various implementations, such feedback information represents a feedback loop between an event sensor (e.g., event sensor 210) and an image pipeline (e.g., image pipeline 220). As discussed in more detail below, the image pipeline consumes image data derived from pixel events output by the event sensor. Based on the image data, the image pipeline generates feedback information corresponding to a subset of the image data (e.g., a region of interest), which may be more useful for a particular image processing operation than other portions of the image data. That is, the feedback information corresponds to a subset of the image data on which processing is performed for a particular image processing operation. In response to the feedback information, the operating state of each pixel within the pixel array of the event sensor may be modified accordingly. In particular, different pixels within the pixel array of the event sensor may have different operating states based on feedback information received from the image pipeline.

FIG. 5 shows an example of a pixel array 500 of an event sensor, where the pixels are configured to support different operating states. Pixel array 500 includes a plurality of pixels positioned to receive light from a scene disposed within a field of view of the event sensor. Thus, when the operating state of each of the plurality of pixels is an enabled state, the image data derived from the pixel events output by the event sensor generally depicts the field of view of the event sensor. As used herein, "enabled state" refers to an operational state of a pixel in which the photodetector circuitry and event circuitry of the pixel are each activated (or fully functional). In one implementation, a pixel having an event circuit and a photodetector circuit that are each activated (or fully functional) is defined to operate in a full power mode.

When the event sensor receives feedback information from the image pipeline indicating that a particular image processing operation is processing an incomplete image data set, some pixels of the event sensor may transition from the enabled state to another operating state. For example, some pixels of the event sensor may transition to a non-enabled state. As used herein, a "non-enabled state" refers to an operating state in which the functionality of the pixel is incomplete. In one implementation, the photodetector circuit and the event circuit of a pixel in the non-enabled state are each deactivated (or non-functional). In one implementation, a pixel having an event circuit and a photodetector circuit that are each deactivated (or non-functional) is defined to operate in an off power mode.

In some cases, a pixel of the event sensor may be unable to transition immediately from the non-enabled state to the enabled state. To mitigate such latency issues, some pixels of the event sensor may transition from the enabled state to a standby state. As used herein, "standby state" refers to an operating state in which the functionality of the pixel is incomplete but greater than that of a pixel in the non-enabled state. In one implementation, the event circuit of the pixel is deactivated (or non-functional) when the pixel transitions to the standby state, while the photodetector circuit of the pixel remains activated (or fully functional). In one implementation, a pixel having an event circuit that is deactivated (or non-functional) and a photodetector circuit that is activated (or fully functional) is defined to operate in a low power mode.

By way of example, the image pipeline may communicate feedback information based on the image data 300 of FIG. 3. In response to the feedback information, a first subset of pixels within region 520 of pixel array 500 are in an enabled state, a second subset of pixels within region 510 are in a standby state, and a third subset of pixels outside of regions 510 and 520 are in a non-enabled state. In this example, the first subset of pixels within the region 520 may be associated with pixel events corresponding to the region of interest 320 of FIGS. 3 and 4. In one implementation, the region 520 defines an enabled region of the pixel array 500. In one implementation, a subset of pixels within an enabled region (e.g., region 520) operate in an off power mode or a low power mode.

In one implementation, the feedback information includes parameters defining the location of one or more regions within the pixel array 500. For example, the parameters defining the location of the region 510 may include an offset value, such as an x-offset 512, a y-offset 514, or a combination thereof, specified relative to the boundary of the pixel array 500. As another example, the parameters defining the location of the region 520 may include offset values specified relative to the boundaries of the pixel array 500, such as some combination of x-offset 512, x-offset 522, y-offset 514, and y-offset 524.

In one implementation, one or more regions of the pixel array 500 have a predefined size. For example, the region 510 may have predefined dimensions specified as a width 516 and a height 518. As another example, the region 520 may have predefined dimensions specified as a width 526 and a height 528. In one implementation, the feedback information includes parameters that define the size of one or more regions within the pixel array 500. For example, the parameters of the feedback information may define one or more of width 516, width 526, height 518, and height 528.
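To make these offset and size parameters concrete, here is a minimal sketch assuming each region is described by an (x_offset, y_offset, width, height) tuple relative to the pixel-array boundary, as in FIG. 5. The state encoding and names are illustrative, not part of the disclosure.

```python
import numpy as np

NON_ENABLED, STANDBY, ENABLED = 0, 1, 2

def state_map(array_w, array_h, standby_region, enabled_region):
    """Expand region parameters into a per-pixel operating-state map.

    Pixels inside the enabled region are enabled, pixels inside the
    standby region but outside the enabled region are in standby, and
    all remaining pixels are non-enabled.
    """
    states = np.full((array_h, array_w), NON_ENABLED, dtype=np.uint8)
    for (x0, y0, w, h), state in ((standby_region, STANDBY),
                                  (enabled_region, ENABLED)):
        states[y0:y0 + h, x0:x0 + w] = state
    return states
```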

FIG. 6 shows an example of a pixel array 600 of an event sensor having different pixels with different operating states, some of the pixels having operating states that are modified as feedback information received from the image pipeline is updated between a first time and a second time. At the first time, the image pipeline may generate feedback information based on image data derived from pixel events output by the event sensor. In response to receiving the feedback information generated by the image pipeline at the first time, a first subset of pixels within region 620A of pixel array 600 are in an enabled state, a second subset of pixels within region 610A are in a standby state, and a third subset of pixels outside of regions 610A and 620A are in a non-enabled state.

After the first time, the image pipeline may receive additional pixel events from the event sensor that alter the image data being processed by the image pipeline. For example, the location of a feature of interest (e.g., pupil center 310 of FIG. 3) within the image data may change as the image data is updated by the additional pixel events. At the second time, the image pipeline may generate feedback information that accounts for the change in the image data caused by the additional pixel events. In response to receiving the feedback information generated by the image pipeline at the second time, a first subset of pixels within region 620B of pixel array 600 are in an enabled state, a second subset of pixels within region 610B are in a standby state, and a third subset of pixels outside of regions 610B and 620B are in a non-enabled state.

Between the first time and the second time, some pixels within the pixel array 600 transition from one operating state to another in response to feedback information received from the image pipeline. For example, as shown in FIG. 7, pixels within sub-region 710 that are in a standby state at a first time will transition to a non-enabled state at a second time. Pixels within the sub-region 720 of the pixel array 600 that are in the enabled state at the first time will transition to the standby state at the second time. Similarly, pixels within sub-region 730 that are in the standby state at a first time will transition to the enabled state at a second time, and pixels within sub-region 740 that are in the non-enabled state at the first time will transition to the standby state at the second time.

In various implementations, an event sensor (e.g., event sensor 210 of FIG. 2) may be configured to output pixel events to an image pipeline (e.g., image pipeline 220). As discussed above with respect to FIG. 1, in various implementations, an event compiler (e.g., event compiler 190) of the event sensor may populate each pixel event with some combination of: (i) an address identifier corresponding to the particular pixel that sent the event signal triggering the pixel event (e.g., the x/y coordinates of the particular pixel, [x, y]); (ii) information indicative of the electrical response included in the event signal (e.g., the value or polarity of the electrical response, "intensity"); and (iii) timestamp information corresponding to a point in time ("T") at which the corresponding pixel event was generated. If the event compiler of the event sensor generates a certain number ("N") of pixel events and populates each pixel event with all three data points, these N pixel events (pixel events 1..N) can be represented as the following pixel event list: ([x, y], intensity_1, T_1), ([x, y], intensity_2, T_2), ..., ([x, y], intensity_N, T_N). In one implementation, the event sensor is configured to output such pixel events to an image pipeline (e.g., image pipeline 220) as pixel event bins. Generally, a pixel event bin is a collection of pixel events. Those skilled in the art will recognize that binning involves grouping individual data values (e.g., pixel events) into defined intervals (or bins).

In one implementation, such intervals may be defined based on event counts. For example, the event sensor may output each pixel event bin after generating a predefined number of pixel events (e.g., 10 pixel events). In this implementation, continuing with the previous example and assuming N = 40, the 40 pixel events generated by the event compiler may be grouped into 4 pixel event bins for output to the image pipeline. The four bins of this example would include: a first pixel event bin comprising pixel events 1..10; a second pixel event bin comprising pixel events 11..20; a third pixel event bin comprising pixel events 21..30; and a fourth pixel event bin comprising pixel events 31..40. In one implementation, a hardware/software based event counter of the event sensor may monitor the number of pixel events generated by one or more event compilers and cause the event sensor to output a pixel event bin when that number reaches the predefined number of pixel events.

In one implementation, such intervals may be defined on a periodic basis, e.g., every 0.5 milliseconds ("ms"). In this implementation, continuing with the previous example and assuming regularly spaced timestamps among the 40 pixel events over a 4 ms time period, the 40 pixel events may be grouped into 8 pixel event bins. The eight bins of this example would include: a first pixel event bin comprising pixel events generated between 0 ms and 0.5 ms of the 4 ms time period (pixel events 1..5); a second pixel event bin comprising pixel events generated between 0.5 ms and 1.0 ms (pixel events 6..10); a third pixel event bin comprising pixel events generated between 1.0 ms and 1.5 ms (pixel events 11..15); a fourth pixel event bin comprising pixel events generated between 1.5 ms and 2.0 ms (pixel events 16..20); a fifth pixel event bin comprising pixel events generated between 2.0 ms and 2.5 ms (pixel events 21..25); a sixth pixel event bin comprising pixel events generated between 2.5 ms and 3.0 ms (pixel events 26..30); a seventh pixel event bin comprising pixel events generated between 3.0 ms and 3.5 ms (pixel events 31..35); and an eighth pixel event bin comprising pixel events generated between 3.5 ms and 4.0 ms (pixel events 36..40).
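Both binning policies can be summarized in a short sketch, assuming events carry a millisecond timestamp field as in the earlier PixelEvent sketch; neither function is a prescribed implementation.

```python
def bin_by_count(events, bin_size=10):
    """Count-based bins: with 40 events and bin_size=10, this yields the
    4 bins of the first example above."""
    return [events[i:i + bin_size] for i in range(0, len(events), bin_size)]

def bin_by_period(events, period_ms=0.5):
    """Periodic bins: with 40 evenly spaced events in [0 ms, 4 ms) and a
    0.5 ms period, this yields the 8 bins of the second example above."""
    bins = {}
    for ev in events:
        bins.setdefault(int(ev.timestamp // period_ms), []).append(ev)
    return [bins[k] for k in sorted(bins)]
```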

In one implementation, the periodic basis is synchronized with a global read operation or a global reset operation of the event sensor. In one implementation, the global readout operation includes, for each pixel within a particular subset of pixels (e.g., a particular row or a particular column), the respective controller triggering the respective comparator to process pixel data at a common (or substantially common) time. In one implementation, the global reset operation includes resetting the pixel data value (or voltage) to a reference value (or voltage) ("Vref") each time a sample of pixel data is processed by a respective comparator.

In one implementation, each pixel event bin is output as a pixel event list (e.g., similar to the pixel event list presented above). In one implementation, each pixel event bin is output as a two-dimensional ("2D") tile of pixel events. FIG. 8 depicts an example of such a 2D tile of pixel events that an event sensor may generate for output to an image pipeline. In one implementation, each pixel event is mapped to a particular location of the corresponding 2D tile of pixel events using its address identifier information. In one implementation, each 2D tile encodes the value or polarity of the electrical response provided by each pixel event included in the corresponding pixel event bin. Upon receiving a particular 2D tile (e.g., tile 810), the image pipeline may identify a particular pixel that detected both a positive incident illumination change within a given interval (e.g., pixel event 812) and a negative incident illumination change within the given interval (e.g., pixel event 814). In one implementation, the image pipeline may update an intensity reconstructed image using the values or polarities of the electrical responses encoded in the 2D tiles.
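The 2D tile encoding can be sketched as follows, assuming each event carries the address identifier and polarity fields used in the earlier sketches; a production encoding might instead pack positive and negative events into separate planes or bit fields.

```python
import numpy as np

def events_to_tile(events, tile_w, tile_h, x0=0, y0=0):
    """Encode one pixel event bin as a 2D tile.

    Each event's address identifier maps it to a tile location, and the
    tile stores the polarity of its electrical response; locations with
    no event remain 0. (x0, y0) is the tile's origin in the pixel array.
    """
    tile = np.zeros((tile_h, tile_w), dtype=np.int8)
    for ev in events:
        tile[ev.y - y0, ev.x - x0] = ev.polarity
    return tile
```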

FIG. 9 illustrates a block diagram of a head-mounted device 900 according to some implementations. The head-mounted device 900 includes a housing 901 (or casing) that houses various components of the head-mounted device 900. The housing 901 includes (or is coupled to) an eye pad 905 disposed at a proximal end of the housing 901 relative to the user 10 of the head-mounted device 900. In various implementations, the eye pad 905 is a plastic or rubber piece that comfortably and snugly holds the head-mounted device 900 in place on the face of the user 10 (e.g., around the eyes of the user 10).

In some implementations, image data is presented to the user 10 of the head-mounted device 900 via a display 910 disposed within the housing 901. Although FIG. 9 shows the head-mounted device 900 including the display 910 and the eye pad 905, in various implementations, the head-mounted device 900 does not include the display 910 or includes an optical see-through display without the eye pad 905.

The head-mounted device 900 also includes a gaze tracking system disposed within the housing 901 that includes an event sensor 924, a controller 980, and optionally one or more light sources 922. Generally, the controller 980 is configured to interact with the event sensor 924 and a feature tracker of an image pipeline (e.g., image pipeline 220 of FIG. 2) to detect and track gaze characteristics of the user 10. In one implementation, the system includes one or more light sources 922 that emit light that reflects off the eye of the user 10 in a light pattern (e.g., a ring of glints) and is detected by the event sensor 924. To this end, the controller 980 is configured to activate the one or more light sources 922 in response to information (e.g., feedback information) received from the image pipeline. Based on the light pattern, the feature tracker of the image pipeline may determine gaze tracking characteristics (e.g., gaze direction, pupil center, pupil size, etc.) of the user 10.

In one implementation, the controller 980 is configured to activate the light source 922 by pulsing the light source 922 at a defined frequency (e.g., 300 hertz). In one implementation, pulsing the light source 922 at a defined frequency causes at least a subset of pixels (e.g., pixels in an enabled state) within the event sensor 924 to generate event data at a rate proportional to the defined frequency.

In one implementation, no light source is used and light present in the environment passively illuminates the eye. Gaze tracking characteristics (e.g., gaze direction, pupil center, pupil size, etc.) may be determined by analyzing the image and extracting features such as pupil position, appearance, and shape (e.g., using template matching, combining corner or feature detectors with classifiers, or using trained neural networks) and correlating them with the position and appearance of additional features of the eye, such as the iris contour (limbus) or the eyelid shape and eyelid corner positions.

FIG. 10 is a flow diagram illustrating an example of a method 1000 of implementing an event camera system architecture having a feedback loop and event driven sensors configured to support multiple power states. In one implementation, the method 1000 is implemented by the event sensor 210 of FIG. 2. At block 1002, method 1000 includes outputting pixel events from an event sensor having a pixel array to an image pipeline. The pixel array is configured to have an enabled pixel region operating in a full power mode and a non-enabled pixel region operating in an off power mode or a low power mode. Each respective pixel event is generated in response to a particular pixel within the enabled pixel region detecting a change in light intensity that exceeds a comparator threshold.

In one implementation, the pixel events are output from the event sensor as pixel event bins. In one implementation, the event sensor is configured to output pixel event bins on a periodic basis. In one implementation, the periodic basis is synchronized with a global read operation or a global reset operation of the event sensor. In one implementation, a pixel event bin is output from the event sensor after a predefined number of pixel events are generated. In one implementation, the pixel event bins are output as 2D tiles of pixel events (e.g., the 2D tile of pixel events depicted in FIG. 8). In one implementation, the pixel events are output from the event sensor as a pixel event list.

At block 1004, the method 1000 includes receiving feedback information from the image pipeline based on image data derived from the pixel events. In one implementation, the feedback information is an enabled region offset that defines a region of the pixel array that corresponds to a region of interest that the image pipeline tracks within the image data. At block 1006, method 1000 includes directing pixels within the enabled pixel region to operate in the off power mode or the low power mode in response to receiving the feedback information.

In one implementation, the feedback information may include a bitmask that encodes a target pixel state for each individual pixel. For example, the bitmask may represent a circular area in the enabled state while the rest of the sensor is in a standby or non-enabled state. More generally, such a mask may represent any arbitrarily shaped region, or group of regions, of the sensor in one of the mentioned states, with the smallest possible region being an individual pixel.
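A minimal sketch of such a bitmask follows, assuming NumPy and a circular enabled region; the mask convention (1 = enabled) is illustrative.

```python
import numpy as np

def circular_enable_mask(array_w, array_h, cx, cy, radius):
    """Feedback bitmask: 1 marks pixels to place in the enabled state;
    the remaining pixels stay in a standby or non-enabled state. Any
    arbitrarily shaped region can be encoded the same way, down to a
    single pixel."""
    yy, xx = np.mgrid[0:array_h, 0:array_w]
    return ((xx - cx) ** 2 + (yy - cy) ** 2 <= radius ** 2).astype(np.uint8)
```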

FIG. 11 is a flow diagram illustrating another example of a method 1100 of implementing an event camera system architecture having a feedback loop and event driven sensors configured to support multiple power states. In one implementation, method 1100 is implemented by image pipeline 220 of FIG. 2. At block 1102, method 1100 includes receiving, by an image pipeline, pixel events from an event sensor having a pixel array including a first subset of pixels in an enabled state and a second subset of pixels in a non-enabled state. Each respective pixel event is generated in response to a particular pixel within the first subset of pixels detecting a change in light intensity that exceeds a comparator threshold. In one implementation, the pixel events are received from the event sensor as pixel event bins. In one implementation, the pixel events are received from the event sensor as a pixel event list.

At block 1104, the method 1100 includes deriving image data from the pixel events using the image pipeline. At block 1106, the method 1100 includes generating feedback information based on the image data using the image pipeline. The feedback information causes the event sensor to direct the pixels within the first subset of pixels to transition from the enabled state to another operating state. In one implementation, the feedback information causes the event sensor to direct the pixels within the first subset of pixels to transition from the enabled state to a non-enabled state. In one implementation, the feedback information causes the event sensor to direct the pixels within the first subset of pixels to transition from the enabled state to a standby state. In one implementation, generating the feedback information includes tracking a region of interest within the image data using the image pipeline. In one implementation, the feedback information is an enabled region offset that defines a region of the pixel array that corresponds to the region of interest that the image pipeline tracks within the image data.
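Tying blocks 1104 and 1106 together, here is a minimal pipeline-side sketch under assumed interfaces: `track_roi` is a hypothetical callback standing in for the feature tracker, and the returned dictionary is one possible shape for an enabled-region offset.

```python
import numpy as np

def generate_feedback(events, track_roi, array_w, array_h, margin=16):
    """Pipeline-side sketch of method 1100 under assumed interfaces.

    Derives image data from the pixel events (here, a simple event
    accumulation), tracks the region of interest via the hypothetical
    `track_roi(image) -> (x0, y0, w, h)` callback, and returns an
    enabled-region offset for the event sensor.
    """
    image = np.zeros((array_h, array_w), dtype=np.float32)
    for ev in events:                    # derive image data (block 1104)
        image[ev.y, ev.x] += ev.polarity
    x0, y0, w, h = track_roi(image)      # track the region of interest
    # Pad the ROI so a standby ring (as in FIG. 5) can surround the
    # enabled region; clamp to the pixel-array boundary.
    x0, y0 = max(0, x0 - margin), max(0, y0 - margin)
    return {"x_offset": x0, "y_offset": y0,
            "width": min(array_w - x0, w + 2 * margin),
            "height": min(array_h - y0, h + 2 * margin)}
```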

In one implementation, the method 1100 further includes pulsing, at a defined frequency, a light source configured to emit light toward a scene disposed within a field of view of the event sensor. In one implementation, the light source is pulsed at the defined frequency such that pixels within the first subset of pixels generate event data at a rate proportional to the defined frequency.

The use of "adapted to" or "configured to" herein is meant to be an open and inclusive language that does not exclude devices adapted to or configured to perform additional tasks or steps. Additionally, the use of "based on" means open and inclusive, as a process, step, calculation, or other action that is "based on" one or more stated conditions or values may in practice be based on additional conditions or values beyond those stated. The headings, lists, and numbers included herein are for ease of explanation only and are not intended to be limiting.

It will also be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first node could be termed a second node, and, similarly, a second node could be termed a first node, without changing the meaning of the description, so long as all occurrences of the "first node" are renamed consistently and all occurrences of the "second node" are renamed consistently. The first node and the second node are both nodes, but they are not the same node.

The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the claims. As used in the description of this particular implementation and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, or groups thereof.

As used herein, the term "if" may be interpreted to mean "when the prerequisite is true" or "in response to a determination" or "according to a determination" or "in response to a detection" that the prerequisite is true, depending on the context. Similarly, the phrase "if it is determined that [ the prerequisite is true ]" or "if [ the prerequisite is true ]" or "when [ the prerequisite is true ]" is interpreted to mean "upon determining that the prerequisite is true" or "in response to determining" or "according to determining that the prerequisite is true" or "upon detecting that the prerequisite is true" or "in response to detecting" that the prerequisite is true, depending on context.

The foregoing description and summary of the invention are to be understood as being in every respect illustrative and exemplary, but not restrictive, and the scope of the invention disclosed herein is to be determined not solely from the detailed description of the exemplary implementations but according to the full breadth permitted by patent laws. It is to be understood that the specific implementations shown and described herein are merely illustrative of the principles of the invention and that various modifications may be implemented by those skilled in the art without departing from the scope and spirit of the invention.
