Image sensor architecture

Document No.: 1836570 | Publication date: 2021-11-12

Reading note: This technology, "Image sensor architecture", was designed and created by L.埃恩, V.C.卡尔代伊, and C.W.克拉多克 on 2020-03-30. Its main content is as follows: An image sensor, comprising: a first integrated circuit layer comprising pixel sensors grouped into groups of pixel sensors based on location; a second integrated circuit layer in electrical communication with the first integrated circuit layer, the second integrated circuit layer including a group of image processing circuits configured to each receive pixel information from a corresponding group of pixel sensors, the group of image processing circuits further configured to perform image processing operations on the pixel information during operation of the image sensor to provide processed pixel information; and a third integrated circuit layer in electrical communication with the second integrated circuit layer, the third integrated circuit layer including a group of neural network circuits configured to each receive processed pixel information from a corresponding group of image processing circuits and perform analysis of object detection on the processed pixel information during operation of the image sensor.

1. An image sensor, comprising:

a first integrated circuit layer comprising pixel sensors grouped into groups of pixel sensors based on location;

a second integrated circuit layer in electrical communication with the first integrated circuit layer, the second integrated circuit layer comprising a group of image processing circuits configured to each receive pixel information from a corresponding group of pixel sensors, the group of image processing circuits further configured to perform image processing operations on the pixel information during operation of the image sensor to provide processed pixel information;

a third integrated circuit layer in electrical communication with the second integrated circuit layer, the third integrated circuit layer including a group of neural network circuits configured to each receive processed pixel information from a corresponding group of image processing circuits, the group of neural network circuits further configured to perform analysis of object detection on the processed pixel information during operation of the image sensor; and

a circuit to output information indicative of a result of the analysis of object detection by the group of neural network circuits,

wherein the first integrated circuit layer is stacked on the second integrated circuit layer, and the second integrated circuit layer is stacked on the third integrated circuit layer.

2. The image sensor of claim 1, wherein the results of the analysis of object detection comprise at least one of the group consisting of:

a selected region of interest representing the detected pixels;

metadata containing time and geometric location information;

intermediate calculation results prior to object detection;

statistical information about the level of network certainty; and

a classification of a detected object.

3. The image sensor of claim 1, wherein the neural network circuit groups each comprise a circuit configured to implement a convolutional neural network.

4. The image sensor of claim 3, wherein the convolutional neural networks each detect an object sensed by a group of pixel sensors corresponding to a group of image processing circuits corresponding to a group of neural network circuits.

5. The image sensor of claim 1, wherein the third integrated circuit layer comprises circuitry configured to implement a recurrent neural network.

6. The image sensor of claim 5, wherein the recurrent neural network receives information about objects detected by all neural network circuit groups and detects objects sensed across multiple ones of the pixel sensor groups.

7. The image sensor of claim 1, wherein each of the neural network circuit groups is positioned directly below an image processing circuit group that provides processed pixel information to the neural network circuit group.

8. The image sensor of claim 1, wherein each of the groups of image processing circuits is positioned directly beneath or immediately adjacent a group of pixel sensors that provide pixel information to the group of image processing circuits.

9. The image sensor of claim 1, wherein the first, second, and third integrated circuit layers are integrated in a single integrated chip.

10. The image sensor of claim 1, wherein each of the pixel sensor groups comprises a same number of pixel sensors.

11. The image sensor of claim 1, wherein the image processing operations performed on the pixel information include high dynamic range fusion prior to further processing.

12. The image sensor of claim 1, wherein each of the group of image processing circuits comprises an analog-to-digital converter.

13. A method, comprising:

obtaining, by pixel sensors in a first integrated circuit layer and grouped into groups of pixel sensors based on location, pixel information;

performing, by a group of image processing circuitry in a second integrated circuit layer in electrical communication with the first integrated circuit layer, an image processing operation on pixel information from a corresponding group of pixel sensors to provide processed pixel information;

performing, by a group of neural network circuits in a third integrated circuit layer in electrical communication with the second integrated circuit layer, analysis of object detection on processed pixel information from a corresponding group of image processing circuits; and

outputting information indicative of a result of the analysis of object detection by the group of neural network circuits,

wherein the first integrated circuit layer is stacked on the second integrated circuit layer, and the second integrated circuit layer is stacked on the third integrated circuit layer.

14. The method of claim 13, wherein the results of the analysis of object detection include at least one of the group consisting of:

a selected region of interest representing the detected pixels;

metadata containing time and geometric location information;

intermediate calculation results prior to object detection;

statistical information about the level of network certainty; and

a classification of a detected object.

15. The method of claim 13, wherein the groups of neural network circuits each comprise a circuit configured to implement a convolutional neural network.

16. The method of claim 15, wherein the convolutional neural networks each detect an object sensed by a group of pixel sensors corresponding to a group of image processing circuits corresponding to a group of neural network circuits.

17. The method of claim 13, wherein the third integrated circuit layer comprises circuitry configured to implement a recurrent neural network.

18. The method of claim 17, wherein the recurrent neural network receives information about objects detected by all of the neural network circuit groups and detects objects sensed across multiple ones of the pixel sensor groups.

19. The method of claim 13, wherein each of the groups of neural network circuits is positioned directly below a group of image processing circuits that provides processed pixel information to the group of neural network circuits.

20. A method of forming an image sensor, comprising:

forming a first integrated circuit layer comprising pixel sensors grouped into pixel sensor groups based on location;

electrically coupling a second integrated circuit layer with the first integrated circuit layer, the second integrated circuit layer including a group of image processing circuits configured to each receive pixel information from a corresponding group of pixel sensors, the group of image processing circuits further configured to perform image processing operations on the pixel information during operation of the image sensor to provide processed pixel information;

electrically coupling a third integrated circuit layer with the second integrated circuit layer, the third integrated circuit layer including a group of neural network circuits configured to each receive processed pixel information from a corresponding group of image processing circuits, the group of neural network circuits further configured to perform analysis of object detection during operation of the image sensor to detect objects from the processed pixel information; and

electrically coupling a circuit to the group of neural network circuits, the circuit outputting information indicative of a result of an analysis of object detection by the group of neural network circuits,

wherein the first integrated circuit layer is stacked on the second integrated circuit layer, and the second integrated circuit layer is stacked on the third integrated circuit layer.

21. A method, comprising:

obtaining a plurality of images captured by a pixel sensor of an image sensor;

analyzing the plurality of images for object detection using a neural network circuit integrated in the image sensor;

generating, for each of the plurality of images, neural network output data relating to results of analysis of the plurality of images for object detection using the neural network circuit integrated in the image sensor; and

transmitting, from the image sensor, neural network output data for each of the plurality of images and image data for a subset of the plurality of images instead of image data for each of the plurality of images.

22. The method of claim 21, wherein transmitting the neural network output data for each of the plurality of images and the image data for the subset of the plurality of images instead of the image data for each of the plurality of images comprises:

transmitting image data of a first image of the plurality of images and neural network output data of the first image; transmitting neural network output data of a second image of the plurality of images; and transmitting image data of a third image of the plurality of images and neural network output data of the third image, before and without transmitting the image data of the second image.

23. The method of claim 22, comprising:

analyzing a second image of the plurality of images for object detection while transmitting image data of a first image of the plurality of images and neural network output data of the first image.

24. The method of claim 22, comprising:

generating neural network output data related to results of the analysis of the plurality of images for object detection while capturing the third image.

25. The method of claim 21, wherein transmitting, from an image sensor, the neural network output data for each of the plurality of images and the image data for a subset of the plurality of images but not the image data for each of the plurality of images comprises:

transmitting image data for a subset of the plurality of images at a particular number of frames per second, and transmitting neural network output data for an image when image data for the subset of the plurality of images is not transmitted.

26. The method of claim 21, wherein the subset of the plurality of images includes every Nth image captured by the image sensor, where N is an integer greater than 1.

27. The method of claim 21, wherein the image data for a particular one of the images comprises data indicative of a value for each pixel within the image.

28. The method of claim 21, wherein the neural network output data for a particular one of the images comprises one or more of:

a selected region of interest representing the detected pixels;

metadata containing time and geometric location information;

intermediate calculation results prior to object detection;

statistical information about the level of network certainty; and

a classification of a detected object.

29. The method of claim 21, wherein the neural network output data for a particular one of the images represents partially processed data and is provided to a processor for further processing.

30. The method of claim 21, wherein transmitting the neural network output data and the image data comprises transmitting the neural network output data and the image data to a central processing unit.

31. An image sensor having circuitry configured to:

obtain a plurality of images captured by a pixel sensor of the image sensor;

analyze the plurality of images for object detection using a neural network circuit integrated in the image sensor;

generate, for each of the plurality of images, neural network output data relating to results of analysis of the plurality of images for object detection using the neural network circuit integrated in the image sensor; and

transmit, from the image sensor, neural network output data for each of the plurality of images and image data for a subset of the plurality of images instead of image data for each of the plurality of images.

32. The image sensor of claim 31, wherein transmitting the neural network output data for each of the plurality of images and the image data for the subset of the plurality of images instead of the image data for each of the plurality of images comprises:

transmitting image data of a first image of the plurality of images and neural network output data of the first image; transmitting neural network output data of a second image of the plurality of images; and transmitting image data of a third image of the plurality of images and neural network output data of the third image, before and without transmitting the image data of the second image.

33. The image sensor of claim 32, wherein the circuitry is further configured to:

analyze a second image of the plurality of images for object detection while transmitting image data of a first image of the plurality of images and neural network output data of the first image.

34. The image sensor of claim 33, wherein the circuitry is further configured to:

generate neural network output data related to results of the analysis of the plurality of images for object detection while capturing the third image.

35. The image sensor of claim 31, wherein transmitting, from the image sensor, the neural network output data for each of the plurality of images and the image data for the subset of the plurality of images but not the image data for each of the plurality of images comprises:

transmitting image data for a subset of the plurality of images at a particular number of frames per second and transmitting neural network output data for an image when image data for the subset of the plurality of images is not transmitted.

36. The image sensor of claim 31, wherein the subset of the plurality of images comprises every Nth image captured by the image sensor, wherein N is an integer greater than 1.

37. The image sensor of claim 31, wherein the image data for a particular one of the images comprises data indicative of a value for each pixel within the image.

38. The image sensor of claim 31, wherein the neural network output data for a particular one of the images comprises one or more of:

a selected region of interest representing the detected pixels;

metadata containing time and geometric location information;

intermediate calculation results prior to object detection;

statistical information about the level of network certainty; and

a classification of a detected object.

39. The image sensor of claim 31, wherein the neural network output data for a particular one of the images represents partially processed data and is provided to a processor for further processing.

40. The image sensor of claim 31, wherein transmitting the neural network output data and the image data comprises transmitting the neural network output data and the image data to a central processing unit.

Background

This description relates to image sensors and systems including image sensors.

Disclosure of Invention

This specification describes technologies relating to an image sensor that captures an image and performs processing on the image sensor to detect an object. This specification further describes a data transfer protocol for transferring data from the image sensor to a remote processor.

In general, one novel aspect of the subject matter described in this specification can be embodied in an image sensor comprising: a first integrated circuit layer comprising pixel sensors grouped into groups of pixel sensors based on location; a second integrated circuit layer in electrical communication with the first integrated circuit layer, the second integrated circuit layer including a group of image processing circuits configured to each receive pixel information from a corresponding group of pixel sensors, the group of image processing circuits further configured to perform image processing operations on the pixel information during operation of the image sensor to provide processed pixel information; a third integrated circuit layer in electrical communication with the second integrated circuit layer, the third integrated circuit layer including a group of neural network circuits configured to each receive processed pixel information from a corresponding group of image processing circuits, the group of neural network circuits further configured to perform analysis of object detection on the processed pixel information during operation of the image sensor; and circuitry to output information indicative of a result of the analysis of the object detection by the neural network circuitry group, wherein the first integrated circuit layer is stacked on the second integrated circuit layer, and the second integrated circuit layer is stacked on the third integrated circuit layer.

Other embodiments of this aspect include corresponding methods of performing the actions of the image sensor and methods of forming the image sensor.

These and other embodiments may each optionally include one or more of the following features. In some aspects, the results of the analysis of the object detection include at least one of the group consisting of: a selected region of interest representing the detected pixels; metadata containing time and geometric location information; intermediate calculation results prior to object detection; statistical information about the level of network certainty; and a classification of a detected object. In certain aspects, the group of neural network circuits each include a circuit configured to implement a convolutional neural network.

In some aspects, the convolutional neural networks each detect an object sensed by a group of pixel sensors corresponding to a group of image processing circuits corresponding to a group of neural network circuits. In some implementations, the third integrated circuit layer includes circuitry configured to implement a recurrent neural network. In certain aspects, the recurrent neural network receives information about objects detected by all of the neural network circuit groups and detects objects sensed across multiple ones of the pixel sensor groups.

In some aspects, each of the group of neural network circuits is positioned directly below a group of image processing circuits that provide processed pixel information to the group of neural network circuits. In certain aspects, each of the groups of image processing circuits is positioned directly beneath or immediately adjacent a group of pixel sensors that provide pixel information to the group of image processing circuits. In some embodiments, the first integrated circuit layer, the second integrated circuit layer, and the third integrated circuit layer are integrated in a single integrated chip.

In some aspects, each of the pixel sensor groups includes the same number of pixel sensors. In certain aspects, the image processing operations performed on the pixel information include high dynamic range fusion prior to further processing. In some implementations, each of the groups of image processing circuits includes an analog-to-digital converter.

In general, one novel aspect of the subject matter described in this specification can be embodied in a method that includes the acts of: obtaining a plurality of images captured by a pixel sensor of an image sensor; analyzing the plurality of images for object detection using a neural network circuit integrated in the image sensor; generating, for each of the plurality of images, neural network output data relating to results of the analysis of the plurality of images for object detection using a neural network circuit integrated in the image sensor; and transmitting, from the image sensor, the neural network output data for each of the plurality of images and image data for a subset of the plurality of images instead of the image data for each of the plurality of images.

Other embodiments of this aspect include corresponding image sensors configured to perform the actions of the methods.

These and other embodiments may each optionally include one or more of the following features. In some aspects, transmitting the neural network output data for each of the plurality of images and the image data for the subset of the plurality of images instead of the image data for each of the plurality of images comprises: transmitting image data of a first image of the plurality of images and neural network output data of the first image; transmitting neural network output data of a second image; and transmitting image data of a third image of the plurality of images and neural network output data of the third image, before and without transmitting the image data of the second image.

In certain aspects, the actions include analyzing a second image of the plurality of images for object detection while transmitting image data of a first image of the plurality of images and neural network output data of the first image. In some embodiments, the actions include generating neural network output data related to results of the analysis of the plurality of images for object detection while the third image is captured. In some aspects, transmitting, from the image sensor, the neural network output data for each of the plurality of images and the image data for the subset of the plurality of images instead of the image data for each of the plurality of images comprises: image data for a subset of the plurality of images is transmitted at a particular number of frames per second and neural network output data for the images is transmitted when image data for the subset of the plurality of images is not transmitted.

In certain aspects, the subset of the plurality of images includes every Nth image captured by the image sensor, where N is an integer greater than 1. In some implementations, the image data for a particular one of the images includes data indicating a value for each pixel within the image. In some aspects, the neural network output data for a particular one of the images includes one or more of: a selected region of interest representing the detected pixels; metadata containing time and geometric location information; intermediate calculation results prior to object detection; statistical information about the level of network certainty; and a classification of a detected object.

In certain aspects, the neural network output data for a particular one of the images represents partially processed data and is provided to a processor for further processing. In some embodiments, transmitting the neural network output data and the image data includes transmitting the neural network output data and the image data to a central processing unit.

Particular embodiments of the subject matter described in this specification can be implemented to realize one or more of the following advantages. An advantage of this technique may be that including multiple stacked integrated circuit layers in an image sensor allows processing to be distributed among the layers, which may enable parallel processing across the layers and/or more processing on the image sensor itself, relative to relying solely on a processor remote from the image sensor to process image data.

Using layers of the image sensor rather than performing additional processing outside the image sensor may remove the need to process particular data outside the image sensor, which may reduce the amount of bandwidth used to output information from the image sensor. For example, only a subset of the frames captured by the image sensor may need to be output, as some processing may already have been done on the image sensor for those frames. Another advantage may be that the multiple layers may be configured such that information may be transferred over short distances between layers, which may allow for faster transmission of data.
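As a rough, back-of-the-envelope illustration of that bandwidth saving: the sketch below compares streaming every frame against streaming every Nth frame plus compact per-frame neural network output. The 6400 x 4800 array and the every-Nth-frame scheme come from later in this document; the bytes-per-pixel, frame rate, and neural network output size are purely my assumptions.

```python
# Rough bandwidth comparison: streaming every frame vs. streaming every
# Nth frame plus compact neural network output for every frame.
# All sizes except the pixel count are illustrative assumptions.

PIXELS = 6400 * 4800          # pixel array size described in this document
BYTES_PER_PIXEL = 2           # assumed raw sample size
FPS = 100                     # one frame every ten milliseconds (timing diagrams)
N = 10                        # transmit image data for every Nth frame only
NN_OUTPUT_BYTES = 64 * 1024   # assumed size of per-frame NN output

full_stream = PIXELS * BYTES_PER_PIXEL * FPS
reduced_stream = PIXELS * BYTES_PER_PIXEL * (FPS // N) + NN_OUTPUT_BYTES * FPS

print(f"full image stream:    {full_stream / 1e9:.2f} GB/s")
print(f"reduced image stream: {reduced_stream / 1e9:.2f} GB/s")
```

Under these assumptions the reduced stream needs roughly a tenth of the full-stream bandwidth, which is the kind of saving the paragraph above describes.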

The details of one or more embodiments of the subject matter described in this specification are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages of the subject matter will become apparent from the description, the drawings, and the claims.

Drawings

FIG. 1 is a block diagram of an example image sensor having three integrated circuit layers.

FIG. 2 is a flow diagram of an example process for detecting an object using three integrated circuit layers of an image sensor.

FIG. 3 is a block diagram of an example integrated circuit layer with a pixel sensor group.

FIG. 4 is a block diagram of an example integrated circuit layer with a group of image processing circuits.

FIG. 5 is a block diagram of an example integrated circuit layer with a neural network circuit group.

FIG. 6 is a diagram of timing of an example image sensor without neural network circuitry.

FIG. 7 is a diagram of timing for an example image sensor with neural network circuitry.

FIG. 8 is a flow chart of an example process for transmitting neural network output data and image data from an image sensor having neural network circuitry.

FIG. 9 is a block diagram of an example system including an image sensor for use by an autonomous vehicle.

FIG. 10 is a block diagram of an example system including a group of pixel sensors and a neural network circuit on a separate chip.

Like reference numbers and designations in the various drawings indicate like elements.

Detailed Description

Fig. 1 is a block diagram of an example image sensor 100 having three integrated circuit layers. The image sensor 100 may use the three integrated circuit layers to detect an object. For example, the image sensor 100 may capture an image including a person and output a "person detected" indication. In another example, the image sensor 100 may capture an image and output the portion of the image that includes a vehicle detected by the image sensor 100.

The three integrated circuit layers include a first integrated circuit layer 110, a second integrated circuit layer 120, and a third integrated circuit layer 130. The first integrated circuit layer 110 is stacked on the second integrated circuit layer 120, and the second integrated circuit layer 120 is stacked on the third integrated circuit layer 130. For example, the first integrated circuit layer 110 is in direct contact with the top of the second integrated circuit layer 120, and the third integrated circuit layer 130 is in direct contact with the bottom of the second integrated circuit layer 120.

The first integrated circuit layer 110 may be in electrical communication with the second integrated circuit layer 120. For example, the first integrated circuit layer 110 and the second integrated circuit layer 120 may be physically connected to each other using an interconnect. The second integrated circuit layer 120 may be in electrical communication with the third integrated circuit layer 130. For example, the second integrated circuit layer 120 and the third integrated circuit layer 130 may be physically connected to each other using an interconnect.

The first integrated circuit layer 110 may have the same area as the second integrated circuit layer 120. For example, the first integrated circuit layer 110 and the second integrated circuit layer 120 may be the same length and width and different heights. The third integrated circuit layer 130 may have an area larger than the first integrated circuit layer 110 and the second integrated circuit layer 120. For example, the third integrated circuit layer 130 may have a length and width that are 20% greater than both the length and width of the first integrated circuit layer 110 and the second integrated circuit layer 120.

The first integrated circuit layer 110 may include an array of pixel sensors grouped into pixel sensor groups (each pixel sensor group referred to in fig. 1 as a "pixel group") 112A-112C (collectively 112) according to location. For example, the first integrated circuit layer 110 may include an array of 6400 x 4800 pixel sensors grouped into 320 x 240 pixel sensor groups, where each pixel sensor group includes an array of 20 x 20 pixel sensors.

Each of the pixel sensor groups 112 may include 2 x 2 pixel sensor sub-groups. For example, each 20 x 20 pixel sensor group may include a 10 x 10 array of pixel sensor sub-groups, where each sub-group includes a red pixel sensor at the top left, a green pixel sensor at the bottom right, a first clear pixel sensor at the bottom left, and a second clear pixel sensor at the top right; each such sub-group is also referred to as a red-clear-clear-green (RCCG) sub-group.
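A minimal sketch of this location-based grouping, assuming the 6400 x 4800 array, 20 x 20 groups, and RCCG layout described above; the function names are illustrative, not from the patent.

```python
# Index math for the location-based grouping: a 6400 x 4800 pixel array
# divided into 320 x 240 groups of 20 x 20 pixels, each group built from
# 2 x 2 RCCG sub-groups.

GROUP = 20  # pixels per group edge

def group_of(row: int, col: int) -> tuple[int, int]:
    """Which pixel sensor group a pixel at (row, col) belongs to."""
    return row // GROUP, col // GROUP

def rccg_color(row: int, col: int) -> str:
    """Color within a 2 x 2 RCCG sub-group: red top-left, clear top-right,
    clear bottom-left, green bottom-right."""
    return [["R", "C"], ["C", "G"]][row % 2][col % 2]

assert group_of(0, 0) == (0, 0)
assert group_of(4799, 6399) == (239, 319)   # 240 x 320 groups in total
assert rccg_color(0, 0) == "R" and rccg_color(1, 1) == "G"
```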

In some embodiments, the size of the pixel sensor groups can be selected to increase silicon utilization. For example, the pixel sensor group size may be chosen so that more of the silicon is covered by pixel sensor groups having the same pattern of pixel sensors.

The second integrated circuit layer 120 may include image processing circuit groups (each referred to in fig. 1 as a "processing group") 122A-122C (collectively referred to as 122). For example, the second integrated circuit layer 120 may include 320 x 240 image processing circuit groups.

The groups of image processing circuits 122 may be configured to each receive pixel information from a corresponding group of pixel sensors and further configured to perform image processing operations on the pixel information to provide processed pixel information during operation of the image sensor 100.

In some implementations, each group of image processing circuits 122 can receive pixel information from a single corresponding group of pixel sensors 112. For example, image processing circuit group 122A may receive pixel information from pixel sensor group 112A and not from any other pixel group, and image processing circuit group 122B may receive pixel information from pixel sensor group 112B and not from any other pixel group.

In some embodiments, each group of image processing circuits 122 may receive pixel information from a plurality of corresponding groups of pixel sensors 112. For example, image processing circuit group 122A may receive pixel information from both pixel sensor groups 112A and 112B and not from the other pixel groups, and image processing circuit group 122B may receive pixel information from pixel group 112C and another pixel group and not from the other pixel groups.

Having the groups of image processing circuits 122 receive pixel information from corresponding pixel groups may result in fast transfer of pixel information from the first integrated circuit layer 110 to the second integrated circuit layer 120, because each group of image processing circuits 122 may be physically close to its corresponding group of pixel sensors 112. The longer the distance over which information is transferred, the longer the transfer takes. For example, pixel sensor group 112A may be directly above image processing circuit group 122A but not directly above image processing circuit group 122C; even if there were an interconnect between pixel sensor group 112A and image processing circuit group 122C, transferring pixel information from pixel sensor group 112A to image processing circuit group 122A would be faster than transferring pixel information from pixel sensor group 112A to image processing circuit group 122C.

The image processing circuitry group 122 may be configured to perform image processing operations on pixel information received by the image processing circuitry group 122 from the pixel group. For example, image processing circuit group 122A may perform high dynamic range fusion on pixel information from pixel sensor group 112A, and image processing circuit group 122B may perform high dynamic range fusion on pixel information from pixel sensor group 112B. Other image processing operations may include, for example, analog to digital signal conversion and demosaicing.
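As a toy illustration of one such operation, here is a sketch of high dynamic range fusion for a single pixel across two exposures; the selection rule and the gain value are my assumptions, not the patent's method.

```python
# Toy high dynamic range fusion for one pixel across two exposures, the
# kind of per-group operation named above. The rule (use the long
# exposure unless it is saturated) is an illustrative assumption.

def hdr_fuse_pixel(short_exp: int, long_exp: int, long_gain: float = 16.0) -> float:
    # Long exposure saturates in bright regions; fall back to the short
    # exposure scaled to the long exposure's range.
    if long_exp >= 255:
        return short_exp * long_gain
    return float(long_exp)

print(hdr_fuse_pixel(20, 255))  # bright region: 320.0, from the short exposure
print(hdr_fuse_pixel(3, 48))    # dark region: 48.0, from the long exposure
```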

Having the image processing circuit groups 122 perform image processing operations on pixel information from the corresponding pixel sensor groups 112 may enable image processing operations to be performed in parallel by the image processing circuit groups 122 in a distributed manner. For example, image processing circuitry group 122A may perform image processing operations on pixel information from pixel sensor group 112A while image processing circuitry group 122B performs image processing operations on pixel information from pixel sensor group 112B.
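A minimal software analogue of this distribution, with a thread pool standing in for the per-group hardware and a clipping function as a placeholder image processing operation; both are illustrative assumptions.

```python
# Each image processing circuit group works only on its own pixel sensor
# group's data, so the groups can run in parallel. A thread pool stands
# in for the hardware; hdr_fuse is a placeholder operation.

from concurrent.futures import ThreadPoolExecutor

def hdr_fuse(pixel_info: list[int]) -> list[int]:
    # Placeholder for per-group processing such as HDR fusion.
    return [min(v, 255) for v in pixel_info]

def process_all(groups: list[list[int]]) -> list[list[int]]:
    # One worker per group mirrors one image processing circuit group
    # per pixel sensor group.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(hdr_fuse, groups))

print(process_all([[100, 300], [50, 260]]))  # [[100, 255], [50, 255]]
```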

The third integrated circuit layer 130 may include neural network circuit groups (each referred to as an "NN group" in fig. 1) 132A-132C (collectively referred to as 132) and a full image neural network circuit 134. For example, the third integrated circuit layer 130 may include 320 x 240 neural network circuit groups.

The neural network circuit groups 132 may be configured to each receive processed pixel information from a corresponding image processing circuit group and further configured to perform analysis of object detection on the processed pixel information during operation of the image sensor 100. In some implementations, the neural network circuit groups 132 can each implement a Convolutional Neural Network (CNN).

In some implementations, each neural network circuit group 132 can receive processed pixel information from a single corresponding image processing circuit group 122. For example, the neural network circuit group 132A may receive processed pixel information from the image processing circuit group 122A and not from any other image processing circuit group, and the neural network circuit group 132B may receive processed pixel information from the image processing circuit group 122B and not from any other image processing circuit group.

In some implementations, each neural network circuit group 132 can receive processed pixel information from a plurality of corresponding image processing circuit groups 122. For example, the neural network circuit group 132A may receive processed pixel information from both the image processing circuit groups 122A and 122B and not from any other image processing circuit groups, and the neural network circuit group 132B may receive processed pixel information from both the image processing circuit group 122C and another image processing circuit group and not from any other image processing circuit groups.

Having the neural network circuit groups 132 receive processed pixel information from corresponding image processing circuit groups may result in fast transfer of the processed pixel information from the second integrated circuit layer 120 to the third integrated circuit layer 130, because each neural network circuit group 132 may be physically close to its corresponding image processing circuit group 122. The longer the distance over which information is transferred, the longer the transfer takes. For example, image processing circuit group 122A may be directly above neural network circuit group 132A but not directly above neural network circuit group 132C; even if there were an interconnect between image processing circuit group 122A and neural network circuit group 132C, transferring processed pixel information from image processing circuit group 122A to neural network circuit group 132A would be faster than transferring processed pixel information from image processing circuit group 122A to neural network circuit group 132C.

The neural network circuit group 132 may be configured to detect objects from the processed pixel information received by the neural network circuit group 132 from the image processing circuit group 122. For example, the neural network circuit group 132A may detect objects from processed pixel information from the image processing circuit group 122A, and the neural network circuit group 132B may detect objects from processed pixel information from the image processing circuit group 122B.

Having the neural network circuit group 132 detect objects from the processed pixel information from the corresponding image processing circuit group 122 enables detection to be performed in parallel in a distributed manner by each of the neural network circuit groups 132. For example, the neural network circuit group 132A may detect objects from processed pixel information from the image processing circuit group 122A, while the neural network circuit group 132B may detect objects from processed pixel information from the image processing circuit group 122B.

In some embodiments, the neural network circuit group 132 may perform intermediate processing. Thus, the image sensor 100 may use three integrated circuit layers 110, 120, and 130 to perform some intermediate processing and output only intermediate results. For example, the image sensor 100 may capture an image including a person and output an indication of "an area of interest in a certain area of the image" without classifying the object of interest (person). Other processing performed outside of the image sensor 100 may classify the region of interest as a person.

Thus, the output from the image sensor 100 may comprise data representing the output of convolutional neural networks. This data may be difficult to interpret by itself, but once it is further processed outside the image sensor 100, it may be used to classify the region as including a person. This hybrid approach may have the advantage of reducing the required bandwidth. Thus, the output from the neural network circuit groups 132 may include one or more of a selected region of interest representing detected pixels, metadata including temporal and geometric location information, intermediate calculation results prior to object detection, statistical information about the network certainty level, and a classification of detected objects.
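One possible container for these outputs, sketched as a Python dataclass; the field names and types are my assumptions, while the list of output kinds comes from the passage above.

```python
# Sketch of a per-group neural network output record. Field names and
# types are assumptions; the categories mirror the text above.

from dataclasses import dataclass, field

@dataclass
class NeuralNetworkOutput:
    region_of_interest: tuple[int, int, int, int] | None = None  # x, y, w, h
    timestamp_ms: int | None = None           # temporal metadata
    location: tuple[int, int] | None = None   # geometric location metadata
    intermediate: list[float] = field(default_factory=list)  # pre-detection results
    certainty: float | None = None            # network certainty level
    classification: str | None = None         # e.g. "person", if classified

out = NeuralNetworkOutput(region_of_interest=(120, 80, 40, 90), certainty=0.7)
print(out.classification is None)  # region flagged but not yet classified
```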

In some implementations, the neural network circuit groups 132 may be configured to implement CNNs with high recall and low precision. The neural network circuit groups 132 may each output a list of detected objects and the time at which each object was detected.

The full image neural network circuit 134 may be configured to receive, from each of the neural network circuit groups 132, data indicative of objects detected by that neural network circuit group 132, and to detect objects from the data. For example, an individual neural network circuit group may not be able to detect an object captured by multiple pixel groups, because each individual neural network circuit group may only receive a portion of the processed pixel information corresponding to the object, but the full image neural network circuit 134 may receive data from multiple neural network circuit groups 132 and so is able to detect objects sensed by multiple pixel groups. In some implementations, the full image neural network circuit 134 can implement a Recurrent Neural Network (RNN). The neural networks may be configurable both with respect to their architecture (number and type of layers, activation functions, etc.) and with respect to the actual values of the neural network components (e.g., weights, biases, etc.).
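A simplified sketch of this cross-group aggregation; the merge rule here (identical labels reported by adjacent groups are combined into one object) is a deliberate simplification standing in for the full image RNN, not the patent's method.

```python
# Combine per-group partial detections into whole-image detections. Each
# key is a (row, col) group index; the value is that group's label.

def merge_detections(per_group: dict[tuple[int, int], str]) -> list[str]:
    merged, seen = [], set()
    for (gy, gx), label in per_group.items():
        if (gy, gx) in seen:
            continue
        seen.add((gy, gx))
        # Absorb the same label reported by the four adjacent groups.
        for dy, dx in ((0, 1), (1, 0), (0, -1), (-1, 0)):
            if per_group.get((gy + dy, gx + dx)) == label:
                seen.add((gy + dy, gx + dx))
        merged.append(label)
    return merged

# A vehicle spanning two adjacent pixel sensor groups becomes one object.
print(merge_detections({(0, 0): "vehicle", (0, 1): "vehicle"}))  # ['vehicle']
```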

In some embodiments, having the image sensor 100 perform processing may simplify the processing pipeline architecture, provide higher bandwidth and lower latency, allow for selective frame rate operation, reduce cost using stacked architectures, provide higher system reliability (as integrated circuits may have fewer potential failure points), and provide significant cost and power savings of computing resources.

In some embodiments, the third integrated circuit layer 130 may be bonded to the second integrated circuit layer 120 using a silicon-to-silicon direct bond method or a "flip chip" method in which a substrate is interposed between the layers and connections are made using wire bonds from silicon to the substrate.

FIG. 2 is a flow diagram of an example process 200 for detecting an object using three integrated circuit layers of an image sensor. Process 200 may be performed by image sensor 100 of fig. 1 or some other image sensor.

Process 200 may include obtaining pixel information by pixel sensors that are in a first integrated circuit layer and grouped into pixel sensor groups based on location (210). For example, each of the pixel sensors in the pixel sensor groups 112 in the first integrated circuit layer 110 can generate pixel information (an analog signal representative of the intensity of light sensed by the pixel sensor). In some implementations, each of the pixel sensor groups includes the same number of pixel sensors. For example, each of the pixel sensor groups 112 may include four hundred pixel sensors arranged as a 10 x 10 array of 2 x 2 RCCG sub-groups (20 x 20 pixel sensors in total).

Process 200 may include performing, by a group of image processing circuits in the second integrated circuit layer, an image processing operation on pixel information from a corresponding group of pixel sensors to provide processed pixel information (220). For example, image processing circuitry group 122A may receive pixel information from pixel sensor group 112A and perform image processing operations on the pixel information from pixel sensor group 112A, while image processing circuitry group 122B receives pixel information from pixel sensor group 112B and performs image processing operations on the pixel information from pixel sensor group 112B.

In some implementations, each of the groups of image processing circuits is positioned directly below a group of pixel sensors that provide pixel information to the groups of image processing circuits. For example, image processing circuit group 122A may be positioned directly below pixel sensor group 112A and image processing circuit group 122B may be positioned directly below pixel sensor group 112B. In some implementations, each of the groups of image processing circuits can be located proximate to a group of pixel sensors that provide pixel information to the group of image processing circuits. In some embodiments, the second integrated circuit layer may be in electrical communication with the first integrated circuit layer. For example, the second integrated circuit layer 120 may be connected to the first integrated circuit layer 110 through an interconnect formed of a conductive material.

Process 200 may include performing, by a group of neural network circuits in a third integrated circuit layer, analysis of object detection on processed pixel information from a corresponding group of image processing circuits (230). For example, the neural network circuit group 132A may receive processed pixel information from the image processing circuit group 122A and detect an object from the processed pixel information from the image processing circuit group 122A, while the neural network circuit group 132B receives pixel information from the image processing circuit group 122B and detects an object from the processed pixel information from the image processing circuit group 122B. In another example, the neural network circuit group 132 may receive the processed pixel information from the corresponding image processing circuit group 122 and output one or more of a selected region of interest representing the detected pixels, metadata including temporal and geometric location information, intermediate calculation results prior to object detection, statistical information about the level of network certainty, and classification of the detected objects.

In some implementations, the groups of neural network circuits each include a circuit configured to implement a CNN. For example, neural network circuit group 132A may include circuitry for implementing a first CNN and neural network circuit group 132B may include circuitry for implementing a second, different CNN.

The convolutional neural networks may each detect an object sensed by a group of pixel sensors corresponding to a group of image processing circuits (which corresponds to a group of neural network circuits). For example, the CNN of the neural network circuit group 132A may detect a first object captured by a pixel sensor in the pixel sensor group 112A, and the CNN of the neural network circuit group 132B may detect a second object captured by a pixel sensor in the pixel sensor group 112B.

In some implementations, the third integrated circuit layer includes circuitry configured to implement an RNN. For example, the third integrated circuit layer 130 may include circuitry configured to implement an RNN that receives information about objects detected by all of the neural network circuit groups and detects objects sensed across multiple ones of the pixel sensor groups.

In some implementations, each of the group of neural network circuits is positioned directly below a group of image processing circuits that provide processed pixel information to the group of neural network circuits. For example, neural network circuit group 132A may be positioned directly below image processing circuit group 122A and neural network circuit group 132B may be positioned directly below image processing circuit group 122B.

In some embodiments, the third integrated circuit layer may be in electrical communication with the second integrated circuit layer. For example, the third integrated circuit layer 130 may be connected to the second integrated circuit layer 120 through an interconnection formed of a conductive material.

Process 200 may include outputting information indicative of a result of the analysis of the object detection by the neural network circuit group (240). For example, the third integrated circuit layer 130 may include circuitry configured to output metadata from the image sensor 100 to a central processing unit external to the image sensor 100, where the metadata specifies an object detected by the neural network circuitry group 132.

In some embodiments, process 200 may be performed with a first integrated circuit layer stacked on a second integrated circuit layer and the second integrated circuit layer stacked on a third integrated circuit layer. For example, process 200 may be performed by image sensor 100 where the bottom of first integrated circuit layer 110 directly contacts the top of second integrated circuit layer 120 and the bottom of second integrated circuit layer 120 directly contacts the top of third integrated circuit layer 130.

In some embodiments, process 200 may be performed with the first, second, and third integrated circuit layers integrated into a single integrated chip. For example, process 200 may be performed by image sensor 100 as a single integrated chip.

Fig. 3 is a block diagram of an example integrated circuit layer 300 with pixel sensor groups. In some embodiments, the integrated circuit layer 300 may be the first integrated circuit layer 110 shown in fig. 1. The integrated circuit layer 300 includes pixel sensor groups 310, circuitry 320 for row drivers, timing, and automotive safety integrity level functions, and interconnects 330. As shown in fig. 3, the pixel sensor groups 310 each include multiple RCCG sub-groups (each a 2 x 2 pixel array), and each of the pixel sensor groups 310 includes the same number of RCCG sub-groups. However, other implementations of the integrated circuit layer 300 may include different pixel sensor groups. For example, each pixel group may include a 3 x 3 array of RCCG sub-groups.

Fig. 4 is a block diagram of an example integrated circuit layer 400 having groups of image processing circuits. In some embodiments, the integrated circuit layer 400 may be the second integrated circuit layer 120 shown in fig. 1. The integrated circuit layer 400 may include image processing circuit groups 410, circuits for logic and automotive safety integrity levels 420, and interconnects 430. As shown in fig. 4, each of the image processing circuit groups 410 may include circuits for high/low analog-to-digital converters (ADCs), circuits for high dynamic range fusion, circuits for automotive safety integrity levels, circuits for pixel memory, and circuits for a multiplexer. The multiplexer may allow flexibility in routing information between layers as a single pixel (multiple bits per pixel) or a group of pixels serialized over a single link (connection) towards the integrated circuit layer 500 with the neural network circuit groups. Each of the image processing circuit groups 410 may cover the same area as the pixel sensor group 310 that provides pixel information to that image processing circuit group 410.

FIG. 5 is a block diagram of an example integrated circuit layer 500 with neural network circuit groups. In some embodiments, the integrated circuit layer 500 may be the third integrated circuit layer 130 shown in fig. 1. The integrated circuit layer 500 may include neural network circuit groups 510, circuitry for logic and automotive safety integrity levels 520, interconnects 530, and an RNN.

As shown in fig. 5, each of the neural network circuit groups 510 may include circuits for memory, a CNN, and a demultiplexer. The demultiplexer may deserialize the bits and pixels to recover the captured pixel configuration. Each of the neural network circuit groups 510 may cover the same area as the image processing circuit group 410 that provides processed pixel information to that neural network circuit group 510.
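A small sketch of the multiplex/demultiplex pair described for layers 400 and 500: a group of multi-bit pixels is serialized over one link, then deserialized to recover the captured pixel configuration. The bit width and the framing are my assumptions.

```python
# Serialize a pixel group over a single link (layer 400's multiplexer),
# then recover it (layer 500's demultiplexer). 10 bits per pixel is an
# assumed sample width.

BITS = 10

def multiplex(pixels: list[int]) -> int:
    """Pack a pixel group into one bitstream for a single serial link."""
    stream = 0
    for p in pixels:
        stream = (stream << BITS) | (p & ((1 << BITS) - 1))
    return stream

def demultiplex(stream: int, count: int) -> list[int]:
    """Unpack the bitstream to recover the captured pixel configuration."""
    mask = (1 << BITS) - 1
    pixels = [(stream >> (BITS * i)) & mask for i in range(count)]
    return list(reversed(pixels))

group = [1023, 0, 512, 7]
assert demultiplex(multiplex(group), len(group)) == group
```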

FIG. 6 is a diagram 600 of the timing of an example image sensor without neural network circuitry. The diagram 600 may include a first row 610 representing a timing of a transmit frame, a second row 620 representing a timing of a capture frame, and a third row 630 indicating an example time frame related to the above acts.

The diagram 600 shows how an image sensor may capture a frame (also referred to as an image) for ten milliseconds every one hundred milliseconds and then transmit the frame for twenty milliseconds. The image sensor may be idle for the remaining eighty milliseconds of each one hundred milliseconds. The image sensor may transmit only a single image every one hundred milliseconds because processing of each frame is done outside the image sensor and takes eighty milliseconds per frame. For example, object detection on a full image by a central processing unit external to the image sensor may take eighty milliseconds. Thus, the diagram 600 may show how only a single frame is used every one hundred milliseconds.

Fig. 7 is a diagram 700 of the timing of an example image sensor with neural network circuitry. For example, graph 700 may illustrate a timing sequence for image sensor 100 of fig. 1.

Diagram 700 may include a first row 710 representing a timing of transmitting frames, a second row 720 representing a timing of generating neural network output data related to objects detected within a frame, a third row 730 representing a timing of capturing frames, and a fourth row 740 indicating an example time frame related to the above acts.

Diagram 700 may show how an image sensor captures a frame every ten milliseconds, with the image sensor transmitting one frame for twenty milliseconds out of every one hundred milliseconds and, during the remaining eighty milliseconds, transmitting the neural network output data generated for the ten frames captured by the image sensor. Thus, the image sensor may provide information for up to ten times as many frames as in diagram 600: in diagram 700, neural network output data is provided for ten frames every one hundred milliseconds, although image data is still transmitted for only a single frame every one hundred milliseconds.
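The arithmetic behind the two diagrams, using only the intervals stated in the text, as a small check:

```python
# Diagrams 600 and 700 compared. Without on-sensor networks, off-sensor
# processing at 80 ms per frame limits useful output to one frame per
# 100 ms window; with on-sensor networks, NN output is produced for
# every 10 ms capture in the same window.

WINDOW_MS = 100
CAPTURE_MS = 10

frames_info_without_nn = 1                      # diagram 600: one usable frame
frames_info_with_nn = WINDOW_MS // CAPTURE_MS   # diagram 700: NN output per capture

print(frames_info_without_nn, "frame of information per window (diagram 600)")
print(frames_info_with_nn, "frames of information per window (diagram 700)")
print(f"gain: {frames_info_with_nn // frames_info_without_nn}x")
```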

Since the neural network output data from the image sensor may already indicate whether an object is detected in this and other frames, having the image sensor transmit the frame for further processing along with the neural network output data may result in less processing outside the image sensor.

As shown in diagram 700, once a frame is captured, neural network output data for the frame is generated. For example, once frame a is captured, neural network output data may then be generated from frame a. In another example, once frame B is captured, neural network output data may then be generated from frame B.

One frame is captured while the neural network output data for another frame is generated. For example, frame B may be captured while the neural network output data for frame A is generated. In the image sensor 100, the first integrated circuit layer 110 captures frame B while the third integrated circuit layer 130 detects objects and generates neural network output data indicative of the detected objects.

As shown in diagram 700, the neural network output data of a frame is transmitted after being generated. For example, the neural network output data for frame A may be transmitted once it has been generated and transmission of frame A has been completed. In some embodiments, transfer of neural network output data may begin as soon as a group completes processing. This may be effective for rolling shutter sensors.

In some embodiments, image sensor 100 may also multiplex image grayscale or color data with the neural network output data for transmission. For example, full image grayscale or color data may be multiplexed with the preprocessed object and time information into a single data stream. The multiplexed data stream may have a far lower (e.g., more than three times smaller) output bandwidth requirement than full image stream information. Using a single serial link carrying the multiplexed information as the camera output, rather than multiple links, can greatly simplify the vehicle-level architecture because the number of physical links is reduced.
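A sketch of what such a multiplexed single stream could look like; the "NN"/"IMG" tagging scheme and the dictionary layout are illustrative assumptions.

```python
# Interleave per-frame NN output with image data for a subset of frames
# into one data stream, as on a single camera serial link.

def multiplex_stream(frames: list[dict]) -> list[tuple[str, object]]:
    stream: list[tuple[str, object]] = []
    for f in frames:
        stream.append(("NN", f["nn_output"]))   # NN output for every frame
        if f.get("image") is not None:          # image data only for a subset
            stream.append(("IMG", f["image"]))
    return stream

frames = [
    {"nn_output": "person@A", "image": "frame A pixels"},
    {"nn_output": "none@B", "image": None},     # image data skipped for frame B
]
print(multiplex_stream(frames))
```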

FIG. 8 is a flow diagram of an example process 800 for transmitting neural network output data and image data from an image sensor having neural network circuitry. Process 800 may be performed by image sensor 100 of fig. 1 or some other image sensor.

Process 800 may include obtaining a plurality of images captured by a pixel sensor of an image sensor (810). For example, a pixel sensor of pixel group 112C in first integrated circuit layer 110 of image sensor 100 may capture a different frame every ten milliseconds.

Process 800 may include analyzing the plurality of images for object detection using neural network circuitry integrated in the image sensor (820). For example, the neural network circuit groups 132 in the third integrated circuit layer 130 of the image sensor 100 may detect an object within each of the portions of the frame for which the neural network circuit group receives processed pixel information.

Process 800 may include generating, for each of the plurality of images, neural network output data related to results of the analysis of the plurality of images for object detection using neural network circuitry integrated in the image sensor (830). For example, a first neural network circuit group 132A may generate metadata indicating that a first object is detected in a portion of frame a, and a second neural network circuit group 132B may generate metadata indicating that no object is detected in another portion of frame a.

Process 800 may include transmitting, from the image sensor, the neural network output data for each of the plurality of images and image data for a subset of the plurality of images instead of the image data for each of the plurality of images (840). For example, the image sensor 100 may transmit data corresponding to the metadata generated by each of the neural network circuit groups indicating whether the neural network circuit groups detected an object in a respective portion of each frame captured by the image sensor 100, and image data for only every tenth frame captured by the image sensor.

In some embodiments, transmitting the neural network output data for each of the plurality of images and the image data for the subset of the plurality of images instead of the image data for each of the plurality of images comprises: transmitting image data of a first image of the plurality of images and neural network output data of the first image; transmitting neural network output data of a second image; and transmitting image data of a third image of the plurality of images and neural network output data of the third image, before and without transmitting the image data of the second image. For example, the image sensor 100 may transmit image data of frame A, then transmit neural network output data of frames A through I, then transmit image data of frame K, and then transmit neural network output data of frames J through T.
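One way to realize this transmission pattern, sketched with N = 10 as in the frame A / frame K example; the function name and exact interleaving are illustrative.

```python
# NN output is emitted for every frame; image data only for every Nth
# frame, so most frames' image data is never transmitted.

def transmissions(num_frames: int, n: int = 10) -> list[str]:
    out = []
    for i in range(num_frames):
        if i % n == 0:
            out.append(f"image data for frame {i}")
        out.append(f"NN output for frame {i}")
    return out

for item in transmissions(12):
    print(item)
# image data is sent for frames 0 and 10 only; NN output for all twelve
```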

In some embodiments, transmitting, from the image sensor, the neural network output data for each of the plurality of images and the image data for the subset of the plurality of images instead of the image data for each of the plurality of images comprises transmitting the image data for the subset of the plurality of images at a particular number of frames per second, and transmitting the neural network output data for the images while image data is not being transmitted. For example, the image sensor 100 may transmit image data at ten frames per second (with the image data for each frame occupying the link for ten milliseconds) and transmit neural network output data for the other images during the remaining nine hundred milliseconds of each second, when image data is not being transmitted.
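
A quick arithmetic check of this time-division example, using only the figures given above (the variable names are illustrative):

    # One frame captured every 10 ms -> 100 frames per second, of which
    # every tenth frame's image data is transmitted, each taking 10 ms.
    frames_per_second = 100
    image_frames_per_second = 10
    image_tx_ms = 10

    image_time_ms = image_frames_per_second * image_tx_ms  # 100 ms of each second
    nn_time_ms = 1000 - image_time_ms                      # 900 ms left for NN output
    print(image_time_ms, nn_time_ms)                       # -> 100 900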

In some embodiments, transmitting the neural network output data and the image data includes transmitting the neural network output data and the image data to a central processing unit. For example, the image sensor 100 may transmit the image data of frame A and the neural network output data of frames A through I to a computer so that the computer may perform additional object detection using the image data of frame A and the neural network output data of frames A through I.

In some implementations, the process 800 includes analyzing a second image of the plurality of images for object detection while transmitting image data of a first image of the plurality of images and neural network output data of the first image. For example, the neural network circuit group 132 of the third integrated circuit layer 130 may detect an object in frame B while outputting neural network output data generated for frame a from the image sensor 100.

In some implementations, the process 800 includes generating neural network output data related to results of the analysis of the plurality of images for object detection while capturing a third image. For example, the neural network circuit groups 132 of the third integrated circuit layer 130 may generate neural network output data related to object detection within frame B while a pixel group in the first integrated circuit layer 110 captures frame C.
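
Together, the two implementations above describe a pipeline in which capture, analysis, and transmission overlap on successive frames. The following sketch models that overlap (the capture, analyze, and transmit callables are hypothetical placeholders; in the image sensor the three stages run concurrently in separate circuit layers rather than in one loop):

    # Three-stage pipeline model: while frame N is captured, frame N-1 is
    # analyzed and the neural network output for frame N-2 is transmitted.
    def run_pipeline(capture, analyze, transmit, num_frames):
        frames, outputs = {}, {}
        for n in range(num_frames + 2):  # two extra steps to drain the pipeline
            if n < num_frames:
                frames[n] = capture(n)                    # stage 1: capture frame N
            if n - 1 in frames:
                outputs[n - 1] = analyze(frames[n - 1])   # stage 2: analyze frame N-1
            if n - 2 in outputs:
                transmit(n - 2, outputs[n - 2])           # stage 3: transmit frame N-2 results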

In some implementations, in process 800, the subset of the plurality of images for which image data is transmitted includes every Nth image captured by the image sensor, where N is an integer greater than 1. For example, the subset of the plurality of images may be every tenth image captured by the image sensor 100. In some implementations, in process 800, image data of a particular one of the images includes data indicating a value of each pixel within the image. For example, the image data may be red-green-blue (RGB) intensity values for each pixel in the image.

In some implementations, in process 800, the neural network output data for a particular one of the images includes one or more of: a selected region of interest representing detected pixels, metadata including temporal and geometric location information, intermediate calculation results prior to object detection, statistical information about a level of network certainty, and a classification of the detected objects. For example, the neural network output data for frame A may indicate that a person was detected at particular coordinates of frame A. In some embodiments, in process 800, the neural network output data for a particular one of the images represents partially processed data that is provided to a processor for further processing. For example, instead of representing a detected object, the neural network output data may represent partially processed data that is to be further processed outside of the image sensor 100 to detect the object.
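
The kinds of neural network output enumerated above can be pictured as a record such as the following (the field names and types are illustrative assumptions, not a disclosed data format):

    from dataclasses import dataclass
    from typing import List, Optional, Tuple

    @dataclass
    class NNOutput:
        regions_of_interest: List[Tuple[int, int, int, int]]  # (x, y, w, h) boxes of detected pixels
        timestamp_us: int                   # temporal information
        location: Tuple[int, int]           # geometric location information
        intermediate: Optional[bytes]       # partially processed data for off-sensor processing
        certainty: float                    # statistical level of network certainty
        classification: Optional[str]       # e.g., "person" at particular coordinates of frame A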

FIG. 9 is a block diagram of an example system 900 including an image sensor 100 used by an autonomous vehicle 910. The autonomous vehicle 910 may be a vehicle that captures images, detects objects in the images, and then drives based on the objects it detects. For example, the autonomous vehicle 910 may detect another vehicle moving in front of the autonomous vehicle 910 and follow that vehicle. In another example, the autonomous vehicle 910 may detect a person in front of the autonomous vehicle 910 and stop.

Autonomous vehicle 910 may include camera optics 920, image sensor 100, and computer processing module 940. Camera optics 920 may include a lens that alters incoming light. For example, camera optics 920 may include a panoramic lens. The image sensor 100 may receive the light altered by the camera optics 920 and capture an image based on the light falling on the image sensor 100.

The image sensor 100 may then transmit data to the computer processing module 940 across a data transmission cable that electrically couples the image sensor 100 and the computer processing module 940. For example, image sensor 100 may capture one frame every ten milliseconds and transmit, across the data transmission cable, the image data of every tenth captured frame (each image transmission taking ten milliseconds) together with the neural network output data for all frames captured by image sensor 100. The computer processing module 940 may receive the image data and neural network output data from the image sensor 100, optionally perform further object detection using the image data and the neural network output data, and determine how the autonomous vehicle 910 should move based on the detected objects.

Thus, because at least some of the processing for object detection is performed on the image sensor 100 itself, the computer processing module 940 may determine how the vehicle should move based on results for images captured every ten milliseconds, whereas with another image sensor the computer processing module 940 could determine how the vehicle should move based only on images obtained every one hundred milliseconds. Accordingly, by using the image sensor 100 instead of another image sensor, the autonomous vehicle 910 may more quickly detect objects and move in response to those objects.

FIG. 10 is a block diagram of an example system 1000 including pixel sensor groups and neural network circuitry on separate chips. The system 1000 may be a camera that includes a first chip 1110 (which includes the pixel sensor groups) and a second chip 1120 (which includes the neural network circuits). The pixel sensor groups and neural network circuits may be similar to those described above, except that, rather than being included in different layers within a single integrated chip, they are included in the separate first chip 1110 and second chip 1120. For example, the first chip 1110 and the second chip 1120 may be positioned on top of each other in the system 1000 and connected by a conductive material so that data may be transferred between corresponding pixel sensor groups and neural network circuits.

Embodiments of the operations and subject matter described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Embodiments of the subject matter described in this specification can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions encoded on a computer storage medium, for execution by, or to control the operation of, data processing apparatus.

The computer storage medium may be or be included in a computer-readable storage device, a computer-readable storage substrate, a random or serial access memory array or device, or a combination of one or more of them. Further, while a computer storage medium is not a propagated signal, a computer storage medium can be a source or destination of computer program instructions encoded in an artificially generated propagated signal. The computer storage medium may also be or be included in one or more separate physical components or media (e.g., multiple CDs, disks, or other storage devices).

The operations described in this specification may be implemented as operations performed by data processing apparatus on data stored on one or more computer-readable storage devices or received from other sources.

The term "data processing apparatus" encompasses all kinds of apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, a system on a chip, or multiple or combinations of the foregoing. An apparatus may comprise special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit). In addition to hardware, an apparatus can also include code that creates an execution environment for the computer program of interest, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, a cross-platform runtime environment, a virtual machine, or a combination of one or more of them. The apparatus and execution environment may implement a variety of different computing model infrastructures, such as web services, distributed computing, and grid computing infrastructures.

A computer program (also known as a program, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, object, or other unit suitable for use in a computing environment. A computer program may (but need not) correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, subprograms, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.

The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform actions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).

Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for performing actions in accordance with the instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. However, the computer need not have such devices. Also, the computer may be embedded in another device, e.g., a mobile telephone, a Personal Digital Assistant (PDA), a mobile audio or video player, a game player, a Global Positioning System (GPS) receiver, or a portable storage device (e.g., a Universal Serial Bus (USB) flash drive), to name a few. Devices suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example: semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.

To provide for interaction with a user, embodiments of the subject matter described in this specification can be implemented on a computer having: a display device for displaying information to the user, for example, a CRT (cathode ray tube) or LCD (liquid crystal display) monitor; and a keyboard and pointing device, such as a mouse or trackball, by which the user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input. In addition, the computer may interact with the user by sending documents to and receiving documents from a device used by the user; for example, by sending a web page to a web browser on the user's device in response to a request received from that web browser.

Embodiments of the subject matter described in this specification can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the subject matter described in this specification), or any combination of one or more such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include local area networks ("LANs"), wide area networks ("WANs"), internetworks (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks).

The computing system may include a user and a server. A user and server are generally remote from each other and typically interact through a communication network. The relationship of user and server arises by virtue of computer programs running on the respective computers and having a user-server relationship to each other. In some embodiments, the server transmits data (e.g., HTML pages) to the user device (e.g., for purposes of displaying data to a user interacting with the user device and receiving user input from the user). Data generated at the user device (e.g., as a result of the user interaction) may be received at the server from the user device.

While this specification contains many specific implementation details, such details should not be construed as limitations on the scope of any invention or of what may be claimed, but rather as descriptions of features specific to particular embodiments. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Furthermore, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.

Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described processing components and systems can generally be integrated together in a single software product or packaged into multiple software products.

Thus, particular embodiments of the subject matter have been described. Other embodiments are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may be advantageous.
