Display assembly with electronically emulated transparency
Reader's note: This technology, "Display assembly with electronically emulated transparency," was created by M. A. Lamkin, K. M. Ringgenberg, and J. D. Lamkin on 2019-01-23. Its main content is as follows: In one embodiment, an electronic display assembly comprises: a circuit board; a first microlens layer on a first side of the circuit board; and a second microlens layer on a side of the circuit board opposite the first microlens layer. The first microlens layer includes a first plurality of microlenses, and the second microlens layer includes a second plurality of microlenses. The electronic display assembly also includes an image sensor layer adjacent to the first microlens layer and a display layer adjacent to the second microlens layer. The image sensor layer includes sensor pixels for detecting incoming light through the first microlenses, and the display layer includes display pixels for emitting light through the second microlenses. The electronic display assembly emulates transparency by emitting light from the second microlenses at an angle corresponding to the angle of the detected incoming light through the first microlenses.
1. An electronic display assembly comprising:
a circuit board;
a first microlens layer on a first side of the circuit board, the first microlens layer including a first plurality of microlenses;
a second microlens layer on a side of the circuit board opposite the first microlens layer, the second microlens layer comprising a second plurality of microlenses;
an image sensor layer adjacent to the first microlens layer, the image sensor layer comprising a plurality of sensor pixels configured to detect incoming light through the first plurality of microlenses;
a display layer adjacent to the second microlens layer, the display layer comprising a plurality of display pixels configured to emit light through the second plurality of microlenses; and
a logic cell layer coupled to the circuit board, the logic cell layer comprising one or more logic cells configured to simulate transparency by directing signals from the plurality of sensor pixels to the plurality of display pixels to emit light from the second plurality of microlenses at an angle corresponding to the angle of the detected incoming light through the first plurality of microlenses.
2. The electronic display assembly of claim 1, wherein:
the first plurality of microlenses are oriented in a first direction; and
the second plurality of microlenses are oriented in a second direction 180 degrees from the first direction.
3. The electronic display assembly of claim 1, wherein:
the image sensor layer is disposed within the first microlens layer; and
the display layer is disposed within the second microlens layer.
4. The electronic display assembly of claim 1, wherein the circuit board is flexible.
5. The electronic display assembly of claim 1, wherein simulating transparency comprises emitting light from the second plurality of microlenses such that an image is displayed that matches an image that would be seen in the absence of the electronic display assembly.
6. The electronic display assembly of claim 1, wherein the logic cell layer is located between the image sensor layer and the circuit board.
7. The electronic display assembly of claim 1, wherein the logic cell layer is located between the display layer and the circuit board.
8. The electronic display assembly of claim 1, wherein each of the first and second plurality of microlenses comprises a three-dimensional shape with a collimating lens located at one end of the three-dimensional shape, the three-dimensional shape comprising:
a triangular polyhedron;
a rectangular cuboid;
a pentagonal polyhedron;
a hexagonal polyhedron;
a heptagonal polyhedron; or
an octagonal polyhedron.
9. The electronic display assembly of claim 8, wherein each of the first and second plurality of microlenses further comprises a plurality of opaque walls configured to prevent light from leaking into adjacent microlenses.
10. An electronic display assembly comprising:
a circuit board;
a first microlens layer on a first side of the circuit board, the first microlens layer including a first plurality of microlenses;
a second microlens layer on a side of the circuit board opposite the first microlens layer, the second microlens layer comprising a second plurality of microlenses;
an image sensor layer adjacent to the first microlens layer, the image sensor layer comprising a plurality of sensor pixels configured to detect incoming light through the first plurality of microlenses; and
a display layer adjacent to the second microlens layer, the display layer comprising a plurality of display pixels configured to emit light through the second plurality of microlenses;
wherein the electronic display assembly is configured to simulate transparency by emitting light from the second plurality of microlenses at an angle corresponding to the angle of detected incoming light through the first plurality of microlenses.
11. The electronic display assembly of claim 10, wherein:
the first plurality of microlenses are oriented in a first direction; and
the second plurality of microlenses are oriented in a second direction 180 degrees from the first direction.
12. The electronic display assembly of claim 10, wherein:
the image sensor layer is disposed within the first microlens layer; and
the display layer is disposed within the second microlens layer.
13. The electronic display assembly of claim 10, wherein the circuit board is flexible.
14. The electronic display assembly of claim 10, wherein simulating transparency comprises emitting light from the second plurality of microlenses such that an image is displayed that matches an image that would be seen in the absence of the electronic display assembly.
15. The electronic display assembly of claim 10, wherein each of the first and second plurality of microlenses comprises a three-dimensional shape with a collimating lens located at one end of the three-dimensional shape, the three-dimensional shape comprising:
a triangular polyhedron;
a rectangular cuboid;
a pentagonal polyhedron;
a hexagonal polyhedron;
a heptagonal polyhedron; or
an octagonal polyhedron.
16. The electronic display assembly of claim 15, wherein each of the first and second plurality of microlenses further comprises a plurality of opaque walls configured to prevent light from leaking into adjacent microlenses.
17. A method of manufacturing an electronic display, the method comprising:
forming a plurality of unit attachment locations on a circuit board, each unit attachment location corresponding to one of a plurality of display units and one of a plurality of sensor units;
coupling the plurality of sensor units to a first side of the circuit board, each sensor unit coupled to a respective one of the unit attachment locations;
coupling the plurality of display units to a second side of the circuit board opposite the first side, each display unit coupled to a respective one of the unit attachment locations;
coupling a first plurality of microlenses to the plurality of sensor units; and
coupling a second plurality of microlenses to the plurality of display units.
18. The method of manufacturing an electronic display of claim 17, further comprising coupling a plurality of logic units between the circuit board and the plurality of display units.
19. The method of manufacturing an electronic display of claim 17, further comprising coupling a plurality of logic units between the circuit board and the plurality of sensor units.
20. The method of manufacturing an electronic display of claim 17, wherein each microlens in the first and second plurality of microlenses comprises:
a three-dimensional shape with a collimating lens located at one end of the three-dimensional shape, the three-dimensional shape comprising:
a triangular polyhedron;
a rectangular cuboid;
a pentagonal polyhedron;
a hexagonal polyhedron;
a heptagonal polyhedron; or
an octagonal polyhedron; and
a plurality of opaque walls configured to prevent light from leaking into adjacent microlenses.
Technical Field
The present disclosure relates generally to light field displays and cameras, and more particularly, to display assemblies having electronically emulated transparency.
Background
Electronic displays are used in a variety of applications. For example, displays are used in smart phones, laptops, and digital cameras. In addition to electronic displays, some devices (such as smartphones and digital cameras) may also include image sensors. Although some cameras and electronic displays capture and reproduce light fields separately, light field displays and light field cameras are typically not integrated with each other.
Disclosure of Invention
In one embodiment, an electronic display assembly comprises: a circuit board; a first microlens layer on a first side of the circuit board; and a second microlens layer on a side of the circuit board opposite the first microlens layer. The first microlens layer includes a first plurality of microlenses, and the second microlens layer includes a second plurality of microlenses. The electronic display assembly also includes an image sensor layer adjacent to the first microlens layer. The image sensor layer includes a plurality of sensor pixels configured to detect incoming light through the first plurality of microlenses. The electronic display assembly also includes a display layer adjacent to the second microlens layer. The display layer includes a plurality of display pixels configured to emit light through the second plurality of microlenses. The electronic display assembly also includes a logic cell layer coupled to the circuit board. The logic cell layer includes one or more logic cells configured to simulate transparency by directing signals from the plurality of sensor pixels to the plurality of display pixels so as to emit light from the second plurality of microlenses at an angle corresponding to the angle of the detected incoming light through the first plurality of microlenses.
In another embodiment, an electronic display assembly includes a circuit board and a first microlens layer on a first side of the circuit board. The first microlens layer includes a first plurality of microlenses. The electronic display assembly also includes a second microlens layer located on an opposite side of the circuit board from the first microlens layer. The second microlens layer includes a second plurality of microlenses. The electronic display assembly also includes an image sensor layer adjacent to the first microlens layer. The image sensor layer includes a plurality of sensor pixels configured to detect incoming light through the first plurality of microlenses. The electronic display assembly also includes a display layer adjacent to the second microlens layer. The display layer includes a plurality of display pixels configured to emit light through the second plurality of microlenses. The electronic display assembly is configured to simulate transparency by emitting light from the second plurality of microlenses at an angle corresponding to the angle of detected incoming light through the first plurality of microlenses.
In another embodiment, a method of manufacturing an electronic display comprises forming a plurality of unit attachment locations on a circuit board, coupling a plurality of sensor units to a first side of the circuit board, and coupling a plurality of display units to a second side of the circuit board opposite the first side. Each unit attachment location corresponds to one of the plurality of display units and one of the plurality of sensor units. Each sensor unit is coupled to a respective one of the unit attachment locations, and each display unit is coupled to a respective one of the unit attachment locations. The method also includes coupling a first plurality of microlenses to the plurality of sensor units and coupling a second plurality of microlenses to the plurality of display units.
The present disclosure provides several technical advantages. Some embodiments provide a complete and accurate re-creation of a target light field while remaining lightweight and comfortable for the user to wear. Some embodiments provide a thin electronic system that offers both opacity and controllable one-way emulated transparency, as well as digital display capabilities such as Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR). Some embodiments provide a direct sensor-to-display system that uses a direct association of input pixels to corresponding output pixels to avoid the need for image transformation. For some systems, this reduces complexity, cost, and power requirements. Some embodiments provide an intra-layer signal processing architecture that provides locally distributed processing of large amounts of data (e.g., 160k of image data or more), thereby avoiding the bottlenecks and the performance, power, and transmission-line issues associated with existing solutions. Some embodiments use microlens layers with arrays of plenoptic cells to accurately capture and display a volume of light to a viewer. The plenoptic cells include opaque cell walls to eliminate optical crosstalk between cells, thereby improving the accuracy of the reproduced light field.
Some embodiments provide three-dimensional electronics distribution through a geodesic facet. In such embodiments, a flexible circuit board with an array of small rigid surfaces (e.g., display and/or sensor facets) may be formed into any 3D shape, which is particularly helpful for accommodating the narrow radius of curvature (e.g., 30-60 mm) required for a head-mounted near-eye wrapped display. Some embodiments provide a distributed multi-screen array for high-density displays. In such embodiments, an array of custom-sized and custom-shaped small high-resolution microdisplays (e.g., display facets) is formed and then assembled on a larger flexible circuit board, which can then be formed into a 3D shape (e.g., a hemispherical surface). Each microdisplay can operate independently of any other display, thereby providing a large array of many high-resolution displays with unique content on each, such that the entire assembly together forms essentially a single very high resolution display. Some embodiments provide a distributed multi-aperture camera array. Such embodiments provide an array of custom-sized and custom-shaped small image sensors (e.g., sensor facets), all assembled on a larger flexible circuit board that is then formed into a 3D (e.g., hemispherical) shape. Each discrete image sensor can operate independently of any other image sensor so as to provide a large array of many apertures, capturing unique content on each aperture, such that the entire assembly becomes essentially a seamless, extremely high resolution multi-node camera.
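The distributed multi-screen idea above, in which many independently driven facets together form essentially a single very high resolution display, can be sketched as a coordinate mapping from a global pixel address to a (facet, local pixel) pair. This sketch assumes a flat, regular rectangular grid of identical facets; the function name and layout are illustrative simplifications, since the disclosed facets are custom-sized, custom-shaped, and formed into curved 3D surfaces.

```python
def facet_for_pixel(gx, gy, facet_w, facet_h, facets_per_row):
    """Map a global pixel address (gx, gy) on the assembled display to
    the index of the facet that owns it, plus the pixel's local
    coordinates on that facet. Assumes a flat rectangular grid of
    identical facets, a simplification of the disclosed geometry."""
    col, lx = divmod(gx, facet_w)   # which column of facets, offset within it
    row, ly = divmod(gy, facet_h)   # which row of facets, offset within it
    return row * facets_per_row + col, (lx, ly)
```

Because each facet renders only its own pixel range under such a mapping, any facet can operate independently of the others, which is the property the description above relies on.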
Other technical advantages will be readily apparent to one skilled in the art from FIGS. 1A through 42, their descriptions, and the claims. Moreover, while specific advantages have been enumerated above, various embodiments may include all, some, or none of the enumerated advantages.
Drawings
For a more complete understanding of the present disclosure and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
FIGS. 1A-1C illustrate reference scenes with various three-dimensional (3D) objects and various viewing positions, in accordance with certain embodiments;
FIGS. 2A-2C illustrate viewing the 3D objects of FIGS. 1A-1C through a transparent panel according to some embodiments;
FIGS. 3A-3C illustrate viewing of the 3D objects of FIGS. 1A-1C through a camera image panel according to some embodiments;
FIGS. 4A-4C illustrate viewing the 3D objects of FIGS. 1A-1C through a simulated transparency electronic panel according to some embodiments;
FIGS. 5A-5C illustrate viewing the 3D objects of FIGS. 1A-1C from an alternative angle through the camera image panel of FIGS. 3A-3C, in accordance with certain embodiments;
FIGS. 6A-6C illustrate viewing the 3D objects of FIGS. 1A-1C from an alternative angle through the simulated transparency electronic panel of FIGS. 4A-4C, in accordance with certain embodiments;
FIG. 7 illustrates a cross-sectional view of a simulated transparency assembly in accordance with certain embodiments;
FIG. 8 illustrates an exploded view of the simulated transparency component of FIG. 7 in accordance with certain embodiments;
FIG. 9 illustrates a method of manufacturing the simulated transparency assembly of FIG. 7 in accordance with certain embodiments;
FIG. 10 illustrates a direct sensor to display system that may be used by the simulated transparency component of FIG. 7 in accordance with certain embodiments;
FIG. 11 illustrates a method of manufacturing the direct sensor to display system of FIG. 10, in accordance with certain embodiments;
FIGS. 12-13 illustrate various intra-layer signal processing structures that may be used by the simulated transparency component of FIG. 7, in accordance with certain embodiments;
FIG. 14 illustrates a method of manufacturing the intralayer signal processing system of FIGS. 12-13 in accordance with certain embodiments;
FIG. 15 illustrates a plenoptic cell assembly that may be used by the simulated transparency component of FIG. 7, in accordance with certain embodiments;
FIG. 16 shows a cross-section of a portion of the plenoptic cell assembly of FIG. 15 in accordance with certain embodiments;
FIGS. 17A-17C show cross-sections of a portion of the plenoptic cell assembly of FIG. 15 with various incoming light fields, in accordance with certain embodiments;
FIGS. 18A-18B illustrate a method of manufacturing the plenoptic cell assembly of FIG. 15, in accordance with certain embodiments;
FIGS. 19A-19B illustrate another method of manufacturing the plenoptic cell assembly of FIG. 15 in accordance with certain embodiments;
FIGS. 20-21 illustrate plenoptic cell assemblies that can be manufactured by the methods of FIGS. 18A-19B, in accordance with certain embodiments;
FIG. 22 illustrates a flexible circuit board that may be used by the simulated transparency assembly of FIG. 7 in accordance with certain embodiments;
FIG. 23 shows additional details of the flexible circuit board of FIG. 22 in accordance with certain embodiments;
FIG. 24 illustrates data flow through the flexible circuit board of FIG. 22 in accordance with certain embodiments;
FIG. 25 illustrates a method of manufacturing an electronic assembly using the flexible circuit board of FIG. 22, in accordance with certain embodiments;
FIG. 26 shows a cross-sectional view of a curved multi-display array according to some embodiments;
FIG. 27 shows an exploded view of the curved multi-display array of FIG. 26 according to some embodiments;
FIGS. 28-29 illustrate logic and display facets of the curved multi-display array of FIG. 26 in accordance with certain embodiments;
FIG. 30 illustrates a back side of the flexible circuit board of FIG. 22 in accordance with certain embodiments;
FIG. 31 illustrates data flow through the flexible circuit board of FIG. 30 in accordance with certain embodiments;
FIG. 32 illustrates the flexible circuit board of FIG. 30 having been formed into a hemispherical shape in accordance with certain embodiments;
FIG. 33 shows data flow through the flexible circuit board of FIG. 32 in accordance with certain embodiments;
FIG. 34 illustrates an array of logic facets that have been formed into a hemispherical shape in accordance with certain embodiments;
FIG. 35 illustrates communication between the logical facets of FIG. 34 in accordance with certain embodiments;
FIG. 36 illustrates a method of fabricating the curved multi-display array of FIG. 26 according to some embodiments;
FIG. 37 shows a cross-sectional view of a curved multi-camera array according to some embodiments;
FIGS. 38-39 illustrate exploded views of the curved multi-camera array of FIG. 37, in accordance with certain embodiments;
FIG. 40 illustrates a rear view of the flexible circuit board of FIG. 32 in accordance with certain embodiments;
FIG. 41 illustrates data flow through the flexible circuit board of FIG. 40 in accordance with certain embodiments; and
FIG. 42 illustrates a method of fabricating the curved multi-camera array of FIG. 37, in accordance with certain embodiments.
Detailed Description
Electronic displays are used in a variety of applications. For example, displays are used in smart phones, laptops, and digital cameras. In addition to electronic displays, some devices (such as smartphones and digital cameras) may also include image sensors. However, devices with displays and image sensors are typically limited in their ability to accurately capture and display a plenoptic environment.
To address the problems and limitations associated with existing electronic displays, embodiments of the present disclosure provide various electronic assemblies for capturing and displaying light fields. FIGS. 1A-9 relate to display assemblies with electronically emulated transparency, FIGS. 10-11 relate to direct camera-to-display systems, FIGS. 12-14 relate to intra-layer signal processing, FIGS. 15-21 relate to plenoptic imaging systems, FIGS. 22-25 relate to three-dimensional (3D) electronics distribution through a geodesic facet, FIGS. 26-36 relate to distributed multi-screen arrays for high-density displays, and FIGS. 37-42 relate to distributed multi-aperture camera arrays.
To facilitate a better understanding of the present disclosure, the following examples of certain embodiments are given. The following examples should not be read to limit or define the scope of the present disclosure. Embodiments of the present disclosure and their advantages are best understood by referring to fig. 1A-42, wherein like numerals are used for like and corresponding parts.
FIGS. 1A-9 illustrate various aspects of an assembly having electronically emulated transparency, in accordance with certain embodiments. In general, the electronic assemblies shown in detail in FIGS. 7-8 may be used in different applications to provide features such as Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR). For VR applications, a digital display is needed that can completely replace the real-world field of view, similar to how a standard computer monitor blocks the view of the scene behind it. For AR applications, however, a digital display is needed that can overlay data on a real-world field of view, such as a heads-up display for pilots in modern cockpits. MR applications require a combination of both. Typical systems for providing some or all of these features are unsatisfactory for a number of reasons. For example, typical solutions do not provide an accurate or complete re-creation of the target light field. As another example, existing solutions are often bulky and uncomfortable for the user.
To address the problems and limitations associated with existing electronic displays, embodiments of the present disclosure provide a thin electronic system that offers both opacity and controllable one-way emulated transparency, as well as digital display capabilities. From one side the surface appears opaque, but from the opposite side the surface can appear completely transparent, appear completely opaque, function as a digital display, or any combination of these. In some embodiments, simultaneous plenoptic sensing and display technologies are combined within a single layered structure to form a surface that appears visually transparent in one direction. The system may include multiple layers of electronics and optics to artificially recreate transparency that can be augmented and/or digitally controlled. The individual image-sensor pixels on one side may be spatially arranged to match the positions of the display pixels on the opposite side of the assembly. In some embodiments, all of the electronic drive circuitry and some of the display logic circuitry may be sandwiched between the sensor layer and the display layer, and the output signal of each sensor pixel may be routed by this circuitry to the corresponding display pixel on the opposite side. In some embodiments, such centrally processed signals are aggregated with the incoming signals from an array of plenoptic imaging sensors on the opposite side and processed according to the following modes of operation. In VR mode, an external video source takes priority over the camera data, completely replacing the user's view of the outside world with the incoming video view. In AR mode, an external video source is overlaid on the camera data, resulting in a combined view of the outside world and the view from the video (e.g., the video data is simply added to the scene).
In MR mode, an external video source is blended with the camera data, allowing virtual objects to appear to interact with real objects in the real world; the virtual content is altered, through object occlusion, lighting, and the like, so that it appears integrated with the real environment.
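The three modes of operation described above can be sketched as a simple per-pixel compositing rule. The function and parameter names, and the alpha-blend formulation of MR mode, are illustrative assumptions rather than the disclosed hardware design.

```python
def compose_pixel(camera, video, mode, alpha=0.0):
    """Illustrative per-pixel compositing for the VR/AR/MR modes of
    operation. Inputs are color components in [0.0, 1.0]; `alpha` is the
    virtual content's per-pixel opacity (an assumed blending control)."""
    if mode == "VR":
        # The external video source replaces the camera's view entirely.
        return video
    if mode == "AR":
        # The video data is simply added on top of the camera view.
        return min(camera + video, 1.0)
    if mode == "MR":
        # Virtual content is blended with the camera view so virtual
        # objects can appear integrated with (and occlude) real ones.
        return alpha * video + (1.0 - alpha) * camera
    raise ValueError(f"unknown mode: {mode!r}")
```

In a real assembly this selection would run per display pixel on locally distributed logic rather than as a central software loop; the sketch only captures the data flow of the three modes.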
With sensor pixels on one side of the assembly and display pixels on the other, and with pixel-to-pixel alignment between the camera and the display, some embodiments combine stacked transparent High Dynamic Range (HDR) sensor and display pixels into a single structure. Both the sensor and display pixel arrays may be focused through multiple sets of microlenses to capture and display a four-dimensional light field. This means that a full field of view of the real world is captured on one side of the assembly and electronically reproduced on the other side, allowing partial or complete alteration of the incoming image while maintaining image sharpness, brightness, and sufficient angular resolution so that the display side appears transparent even when viewed at oblique angles.
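The angle-preserving relationship described above can be illustrated with a simple thin-lens sketch: an incoming ray's angle determines a lateral offset on the sensor plane behind a microlens, and a display pixel at the matching offset behind the emitting microlens reproduces that angle. The linear thin-lens model and the function names are illustrative assumptions, not the disclosed optical design.

```python
import math

def angle_to_offset(theta, focal_len):
    """Thin-lens sketch: an incoming ray at angle theta (radians off the
    lens axis) focuses at this lateral offset on the sensor plane one
    focal length behind the microlens."""
    return -focal_len * math.tan(theta)

def offset_to_angle(offset, focal_len):
    """Inverse relation: a display pixel at this lateral offset behind
    the emitting microlens sends its light out at the matching angle,
    which is how the display side replays the captured direction."""
    return -math.atan(offset / focal_len)
```

Round-tripping an angle through both functions returns the original angle, which is the sense in which each captured ray direction is reproduced on the display side, direction by direction.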
FIGS. 1A-6C are provided to illustrate the difference between the electronically emulated transparency provided by embodiments of the present disclosure and a typical camera image (such as a camera image displayed through a camera viewfinder or on a smartphone). FIGS. 1A-1C illustrate reference scenes with various 3D objects 110 (i.e., 110A-C) and a front viewing position, according to some embodiments. FIG. 1A is a top view of the arrangement of the 3D objects 110 and the front viewing direction of the 3D objects 110. FIG. 1B is a perspective view of the same arrangement and front viewing direction of the 3D objects 110 as FIG. 1A. FIG. 1C is a front view of the 3D objects 110 taken from the position shown in FIGS. 1A and 1B. As can be seen, the view of the 3D objects 110 in FIG. 1C is the normal, expected view of the 3D objects 110 (i.e., the view of the 3D objects 110 is completely unchanged because there is nothing between the observer and the 3D objects 110).
FIGS. 2A-2C illustrate viewing the 3D objects 110 of FIGS. 1A-1C through a transparent panel 210, according to some embodiments. The transparent panel 210 may be, for example, a piece of transparent glass. FIG. 2A is a top view of the front viewing direction of the 3D objects 110 through the transparent panel 210, and FIG. 2B is a perspective view of the same arrangement and front viewing direction of the 3D objects 110 as FIG. 2A. FIG. 2C is a front view of the 3D objects 110 obtained through the transparent panel 210 from the position shown in FIGS. 2A and 2B. As can be seen, the view of the 3D objects 110 through the transparent panel 210 in FIG. 2C is the normal, expected view of the 3D objects 110 (i.e., the view of the 3D objects 110 is not changed at all because the viewer is looking through the transparent panel 210). In other words, the view of the 3D objects 110 through the transparent panel 210 in FIG. 2C is the same as the view in FIG. 1C, where no object lies between the viewer and the 3D objects 110 (i.e., "perceived" transparency). That is, the edges of the projected image on the transparent panel 210 are aligned with the view of the actual 3D objects 110 behind the transparent panel 210 to create a field of view aligned
FIGS. 3A-3C illustrate viewing of the 3D objects 110 of FIGS. 1A-1C through a
FIGS. 4A-4C illustrate viewing of the 3D objects 110 of FIGS. 1A-1C through a simulated transparency
FIGS. 5A-5C show the 3D objects 110 of FIGS. 1A-1C viewed through the
FIGS. 6A-6C show the 3D objects 110 of FIGS. 1A-1C viewed through the simulated transparency
As shown above in FIGS. 4A-4C and 6A-6C, the
In some embodiments,
Microlens array 720 (i.e., sensor-
In some embodiments, the microlenses of sensor-
In general,
In general,
In some embodiments, the sensor pixels of the
Although FIGS. 7-8 describe the emulated
FIG. 9 illustrates a method 900 of manufacturing the
At step 920, a plurality of sensor units are coupled to a first side of the circuit board. In some embodiments, the sensor units are sensor units 735. In some embodiments, each sensor unit is coupled in step 920 to a respective one of the unit attachment locations of step 910. In some embodiments, the sensor units are first formed into an image sensor layer (such as image sensor layer 730), and the image sensor layer is coupled to the first side of the circuit board in this step.
At step 930, a plurality of display units are coupled to a second side of the circuit board opposite the first side. In some embodiments, the display units are display units 765. In some embodiments, each display unit is coupled to a respective one of the unit attachment locations. In some embodiments, the display units are first formed into a display layer (such as electronic display layer 760), and the display layer is coupled to the second side of the circuit board in this step.
At step 940, a first plurality of microlenses is coupled to the plurality of sensor units of step 920. In some embodiments, the microlenses are
At step 950, a second plurality of microlenses is coupled to the plurality of display units of step 930. In some embodiments, the microlenses are
In some embodiments, method 900 may additionally include coupling a plurality of logic units between the circuit board of step 910 and the plurality of display units of step 930. In some embodiments, the logic units are logic units 755. In some embodiments, the plurality of logic units are coupled between the circuit board and the plurality of sensor units of step 920.
Particular embodiments may repeat one or more steps of method 900, where appropriate. Although this disclosure describes and illustrates particular steps of method 900 as occurring in a particular order, this disclosure contemplates any suitable steps of method 900 occurring in any suitable order (e.g., any temporal order). Further, while this disclosure describes and illustrates an exemplary simulated transparency assembly manufacturing method including particular steps of method 900, this disclosure contemplates any suitable simulated transparency assembly manufacturing method including any suitable steps, which may include all, some, or none of the steps of method 900, where appropriate. Additionally, although this disclosure describes and illustrates particular components, devices, or systems performing particular steps of method 900, this disclosure contemplates any suitable combination of any suitable components, devices, or systems performing any suitable steps of method 900.
FIG. 10 illustrates a direct sensor to display system 1000 that may be implemented by the simulated transparency component of FIG. 7, in accordance with certain embodiments. In general, FIG. 10 represents how an embodiment of the
As shown in FIG. 10, each sensor unit 735 is directly coupled to a corresponding display unit 765. For example, the sensor unit 735A may be directly coupled to the display unit 765A, the sensor unit 735B may be directly coupled to the display unit 765B, and so on. In some embodiments, the signaling between the sensor unit 735 and the display unit 765 may be any suitable differential signaling, such as Low Voltage Differential Signaling (LVDS). More specifically, each sensor unit 735 may output a first signal in a particular format (e.g., LVDS) corresponding to the incoming light detected by its sensor pixels, and the corresponding display unit 765 may accept that signal in the same format and emit light accordingly.
Because no conversion is required in the signaling between the sensor unit 735 and the display unit 765, the emulated transparency assembly may operate with reduced power consumption, cost, and latency.
In some embodiments, each particular sensor pixel of the sensor unit 735 is mapped to a single display pixel of the corresponding display unit 765, and the display pixel displays light corresponding to light captured by its mapped sensor pixel. This is best shown in FIGS. 17A-17B. As one example, each central sensing pixel 1725 of a particular sensor unit may be mapped to the central display pixel of its corresponding display unit.
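The one-to-one sensor-to-display pixel mapping described above can be sketched as follows; the frame representation and function names are hypothetical, chosen only to illustrate the mapping, and are not part of the disclosure.

```python
# Sketch of the one-to-one sensor-to-display pixel mapping described above.
# All names and data structures are illustrative assumptions.

def make_identity_mapping(width, height):
    """Map each sensor pixel (x, y) to the display pixel at the same (x, y)."""
    return {(x, y): (x, y) for x in range(width) for y in range(height)}

def refresh(sensor_frame, mapping):
    """Build a display frame by copying each sensor pixel to its mapped display pixel."""
    display_frame = {}
    for sensor_xy, display_xy in mapping.items():
        display_frame[display_xy] = sensor_frame[sensor_xy]
    return display_frame

mapping = make_identity_mapping(4, 4)
# Toy 4x4 sensor readout: each pixel holds an RGB triple.
sensor_frame = {(x, y): (x * 16, y * 16, 0) for x in range(4) for y in range(4)}
display_frame = refresh(sensor_frame, mapping)
# Each display pixel now shows exactly what its mapped sensor pixel captured.
```

Because the mapping is fixed and one-to-one, no resampling or format conversion occurs between capture and display, consistent with the pass-through signaling described above.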
In some embodiments, the sensor unit 735 is coupled directly to the corresponding display unit 765 without any intervening components.
FIG. 11 illustrates a method 1100 of manufacturing the direct sensor-to-display system 1000 of FIG. 10, in accordance with certain embodiments. The method 1100 may begin at step 1110, where a plurality of unit attachment locations are formed on a circuit board. In some embodiments, the circuit board is the circuit board described above with respect to FIG. 7.
At step 1120, a plurality of sensor units are coupled to a first side of the circuit board. In some embodiments, each sensor unit is coupled to a respective one of the unit attachment locations of step 1110. At step 1130, a plurality of display units are coupled to a second side of the circuit board opposite the first side. In some embodiments, each display unit is coupled to a respective one of the unit attachment locations of step 1110 such that each particular one of the plurality of sensor units is mapped to a corresponding one of the plurality of display units. By mapping each particular sensor unit to one of the display units, the display pixels of each particular display unit are configured to display light corresponding to the light captured by the sensor pixels of its mapped sensor unit. After step 1130, the method 1100 may end.
Particular embodiments may repeat one or more steps of method 1100 where appropriate. Although this disclosure describes and illustrates particular steps of method 1100 as occurring in a particular order, this disclosure contemplates any suitable steps of method 1100 occurring in any suitable order (e.g., any temporal order). Moreover, while this disclosure describes and represents an exemplary direct sensor-to-display system manufacturing method that includes particular steps of method 1100, this disclosure contemplates any suitable direct sensor-to-display system manufacturing method including any suitable steps, which may include all, some, or none of the steps of method 1100, where appropriate. Additionally, although this disclosure describes and illustrates particular components, devices, or systems performing particular steps of method 1100, this disclosure contemplates any suitable combination of any suitable components, devices, or systems performing any suitable steps of method 1100.
FIGS. 12-13 represent various intra-layer signal processing configurations that may be used by certain embodiments of the electronic display assembly.
As shown in FIGS. 12-13, some embodiments of the assembly include logic units 755 coupled between the sensor units 735 and the display units 765 to provide in-layer signal processing.
In some embodiments, the
In some embodiments, the
In some embodiments, the logic unit 755 is configured to communicate using the same protocol as the sensor unit 735 and the display unit 765. For example, in embodiments where the logic unit 755 is a discrete IC, the IC may be configured to communicate with the sensor and display facets using the same protocol (e.g., LVDS or Inter-Integrated Circuit (I2C)). This eliminates any need for protocol conversion between the sensor and the display facets, thereby reducing power and cost.
In some embodiments,
FIG. 14 illustrates a method 1400 of fabricating the intra-layer signal processing system of FIGS. 12-13, in accordance with certain embodiments. The method 1400 may begin at step 1410, where a plurality of sensor units are coupled to a first side of a circuit board. In some embodiments, the sensor unit is a sensor unit 735, and the circuit board is a circuit board as described above with respect to FIG. 7.
At step 1420, a plurality of display units are formed. In some embodiments, each display unit is a combination of a display unit 765 and a logic unit 755. Each display unit may be formed by combining an electronic display and a logic unit into a single 3D integrated circuit using through-silicon vias. Each display unit includes a plurality of display pixels.
At step 1430, the plurality of display units of step 1420 are coupled to a second side of the circuit board opposite the first side. In some embodiments, each logic unit is coupled to a respective one of the unit attachment locations. After step 1430, the method 1400 may end.
Particular embodiments may repeat one or more steps of method 1400 where appropriate. Although this disclosure describes and illustrates particular steps of the method 1400 as occurring in a particular order, this disclosure contemplates any suitable steps of the method 1400 occurring in any suitable order (e.g., any temporal order). Moreover, while this disclosure describes and represents an exemplary intra-layer signal processing system fabrication method including particular steps of method 1400, this disclosure contemplates any suitable intra-layer signal processing system fabrication method including any suitable steps, which may include all, some, or none of the steps of method 1400, where appropriate. Additionally, although this disclosure describes and illustrates particular components, devices, or systems performing particular steps of method 1400, this disclosure contemplates any suitable combination of any suitable components, devices, or systems performing any suitable steps of method 1400.
FIGS. 15-17C represent various views of an array 1500 of plenoptic cells 1510.
Standard electronic displays typically include a planar arrangement of pixels that forms a two-dimensional rasterized image, conveying inherently two-dimensional data. One limitation is that the planar image cannot be rotated in order to perceive different perspectives within the scene being conveyed. To view such an image clearly, the viewer's eye or the camera's lens must focus on the screen, regardless of what is depicted within the image itself. In contrast, a volume of light entering the eye from the real world allows the eye to focus naturally on any point within that volume. This plenoptic "field" of light contains rays from the scene as they naturally enter the eye, rather than a virtual image focused by an external lens at a single focal plane. Although existing light field displays may be able to replicate this phenomenon, they impose a significant tradeoff between spatial and angular resolution, resulting in a perceived light field that appears blurry or lacking in detail.
To overcome the problems and limitations associated with existing light field displays, embodiments of the present disclosure provide a coupled light field capture and display system that is capable of recording and subsequently electronically recreating the incoming light field. Both capture and display are accomplished by an arrangement of plenoptic cells 1510.
In some embodiments, the plenoptic cell 1510 includes transparent lenslets 1512 and associated portions of an image sensor 1520 and a display 1530.
Transparent lenslets 1512 may be formed of any suitable transparent optical material. For example, transparent lenslets 1512 may be formed of a polymer, silica glass, or sapphire. In some embodiments, transparent lenslets 1512 may be formed of a polymer, such as polycarbonate or acrylic. In some embodiments, transparent lenslets 1512 may be replaced with waveguides and/or photonic crystals in order to capture and/or generate optical fields.
Typically,
In some embodiments, the image sensor 1520 includes or is coupled to backplane circuitry 1630A, and the display 1530 includes or is coupled to backplane circuitry 1630B. Typically, backplane circuitry 1630A-B provides electrical connections that allow image data to flow from the image sensor 1520 to the display 1530. In some embodiments, backplane circuitry 1630A and backplane circuitry 1630B are opposing sides of a single backplane. In some embodiments, backplane circuitry 1630A and backplane circuitry 1630B are separate backplanes.
In some embodiments, filter layers 1640 may be included at one or both ends of transparent lenslets 1512 to limit light entry or exit to particular angles of incidence. For example, a first filter layer 1640A may be included at the protruding ends of transparent lenslets 1512, and/or a second filter layer 1640B may be included at the opposite ends of transparent lenslets 1512. Similar to
Each of FIGS. 17A-17C represents a cross-sectional view of seven adjacent plenoptic cells 1510.
In FIG. 17B, the incoming light field 1720 from an object fourteen degrees off the axis of the plenoptic cells 1510 is focused onto off-center sensor pixels, and the corresponding display pixels emit light at the same fourteen-degree angle on the opposite side.
In fig. 17C, the
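The angle-preserving behavior described for FIGS. 17A-17C can be sketched with a toy pinhole-lens model. This is an assumption for illustration only (the cells use real lenslets, not pinholes): an incoming ray's off-axis angle selects a sensor pixel by its lateral offset under the lens, and emitting from the display pixel at the same offset reproduces that angle on the far side.

```python
import math

# Toy pinhole model of a plenoptic cell (an illustrative assumption, not the
# disclosure's lenslet optics): a lens of focal length F sits above a row of
# N sensor pixels of pitch PITCH. An incoming ray at angle theta lands on the
# pixel laterally offset by F*tan(theta); emitting from the pixel at the same
# offset under the display-side lens reproduces the angle on the far side.

F = 1.0       # lens focal length (arbitrary units)
PITCH = 0.05  # pixel pitch (same units)
N = 11        # pixels per cell; index N // 2 = 5 is the central pixel

def sensor_pixel_for_angle(theta_deg):
    """Index of the sensor pixel that an incoming ray at theta_deg strikes."""
    offset = F * math.tan(math.radians(theta_deg))
    return N // 2 + round(offset / PITCH)

def emitted_angle_for_pixel(idx):
    """Angle at which light from display pixel idx leaves the cell."""
    offset = (idx - N // 2) * PITCH
    return math.degrees(math.atan2(offset, F))

pixel = sensor_pixel_for_angle(14.0)        # off-axis ray, as in FIG. 17B
theta_out = emitted_angle_for_pixel(pixel)  # angle recreated on the display side
# theta_out matches the incoming 14 degrees to within one pixel's worth of angle.
```

The residual error (about 0.04 degrees here) comes from quantizing the ray to a discrete pixel, which is the spatial-versus-angular resolution tradeoff discussed earlier.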
FIGS. 18A-18B illustrate a method of fabricating the plenoptic cell assembly of FIG. 15, in accordance with certain embodiments. In FIG. 18A, a microlens array (MLA) plate 1810 is formed or obtained. The MLA plate 1810 includes a plurality of lenslets, as shown in the figure. In FIG. 18B, a plurality of grooves 1820 are cut to a predetermined depth around each of the plurality of lenslets of the MLA plate 1810. In some embodiments, the grooves 1820 may be cut in multiple passes to achieve the desired depth. In some embodiments, the grooves 1820 may be cut using laser ablation, etching, photolithographic processing, or any other suitable method. After the grooves 1820 are cut to the desired depth, they are filled with a material configured to prevent light from leaking through the grooves 1820. In some embodiments, the material is any material that is light-absorbing (e.g., carbon nanotubes) or opaque when hardened (e.g., a non-reflective opaque material or a colored polymer). The resulting plenoptic cell assembly, after the grooves 1820 are filled and allowed to harden, is shown in FIGS. 20-21.
FIGS. 19A-19B illustrate another method of fabricating the plenoptic cell assembly of FIG. 15, in accordance with certain embodiments. In FIG. 19A, a pre-formed grid structure 1830 with gaps 1840 is obtained or formed. The grid structure 1830 is made of any suitable light-blocking material, as described above for the material used to fill the grooves 1820.
In FIG. 19B, the gaps 1840 are filled with an optical polymer 1850. The optical polymer 1850 can be any suitable material, as described above for the transparent lenslets 1512. After the gaps 1840 are filled with the optical polymer 1850, the final lens profile is created by molding or ablation. Examples of the resulting plenoptic cell assemblies are shown in FIGS. 20-21.
FIGS. 22-23 illustrate a flexible circuit board 2210 that may be used as a circuit board in certain embodiments of the assemblies described herein.
To address the problems and limitations of current solutions, embodiments of the present disclosure provide a 3D (e.g., spherical or hemispherical) electronic device manufacturing method using a geodesic faceting approach, in which an array of small rigid surfaces is arranged on a single flexible circuit. In some embodiments, the flexible circuit is cut into a particular mesh shape, then wrapped into a 3D shape (e.g., a spherical or hemispherical shape) and locked into place to prevent wear from repeated bending. The method is particularly useful for accommodating the narrow radii of curvature (e.g., 30-60 mm) required for head-mounted near-eye wrapped displays. In some embodiments, the assembly includes a single base flexible printed circuit layer, with rigid sensor and display arrays disposed on opposite sides of the flexible circuit. The entire assembly, including the sensor and display layers, can be fabricated by standard planar semiconductor processes (e.g., spin coating, photolithography). The rigid electronics layers may be etched to form individual sensor and display units (i.e., "facets"), then connected to the flexible circuitry by connection pads and bonded by patterned conductive and non-conductive adhesives. This allows the flexible circuitry to fold slightly at the edges between the rigid facets. In some embodiments, after planar fabrication, the fully cured functional electronic stack is formed into the desired final 3D shape using one side of the final rigid polymer housing as a mold. In this way, the array of rigid electronic facets is not deformed; the facets simply fall into place in their mold, with the flexible circuitry bending at defined creases/gaps to match the interior of the housing facets. The assembly may finally be covered and sealed with the opposing mating side of the rigid housing.
Embodiments of the present disclosure are not limited to only spherical or hemispherical shapes, but such shapes are certainly contemplated. The disclosed embodiments may be formed in any compound curvature or any other rotational shape. Additionally, the disclosed embodiments can be formed with any non-uniform curvature as well as non-curved (i.e., flat) surfaces.
FIG. 22 shows the flexible circuit board 2210 in two different states: a flat state prior to forming, and a 3D-formed state.
In general,
In some embodiments, the
Fig. 23 shows additional details of a flexible circuit board 2210 according to some embodiments. In some embodiments, each
In general, each
In some embodiments, wiring traces 2230 are included on the flexible circuit board 2210 to electrically connect the facet locations to one another.
Fig. 25 illustrates a method 2500 of manufacturing an electronic assembly using the flexible circuit board 2210 of fig. 22, in accordance with some embodiments. At step 2510, a plurality of facet locations are formed on the flexible circuit board. In some embodiments, the facet locations are
At step 2520, the flexible circuit board of step 2510 is cut or otherwise formed into a pattern that allows the flexible circuit board to be later formed into a 3D shape, such as a spherical or hemispherical shape. When the flexible circuit board is flat, at least some of the facet locations are separated from one or more adjacent facet locations by a plurality of gaps (such as gap 2215). The plurality of gaps are substantially eliminated when the flexible circuit board is formed into a 3D shape.
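As a rough geometric check on the forming step (my own arithmetic, not from the disclosure), the crease angle needed between adjacent rigid facets follows from the radius of curvature and the facet width:

```python
import math

# Rough geometry (an assumption, not from the disclosure): rigid facets of
# width w approximating a sphere of radius R must fold relative to their
# neighbors by roughly the arc angle each facet subtends:
#   alpha = 2 * asin(w / (2 * R))

def fold_angle_deg(facet_width_mm, radius_mm):
    """Crease angle (degrees) between adjacent rigid facets on a sphere."""
    return math.degrees(2 * math.asin(facet_width_mm / (2 * radius_mm)))

# Hypothetical 10 mm facets at the 30-60 mm radii of curvature cited above:
tight = fold_angle_deg(10, 30)   # ~19.2 degrees per crease
loose = fold_angle_deg(10, 60)   # ~9.6 degrees per crease
```

The tighter the radius of curvature, the sharper each crease must be, which is why the gaps between facet locations are sized to close up only once the board is formed into its final 3D shape.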
At step 2530, the electronic assembly is assembled by coupling a first plurality of rigid facets to a first side of a flexible circuit board. The first plurality of rigid facets may be
In some embodiments, the first plurality of rigid facets of step 2530 are rigid sensor facets (such as sensor facet 3735), and method 2500 further includes coupling a plurality of rigid display facets (such as display facet 2665) to a second side of the flexible circuit board opposite the first side. In this case, each particular facet location is configured to send a signal between a particular rigid sensor facet electrically coupled to the particular facet location and a particular rigid display facet electrically coupled to the same particular facet location. This allows light from the particular rigid display facet corresponding to the light captured by the corresponding rigid sensor facet to be displayed.
At step 2540, the assembled electronic component is formed into a desired 3D shape. In some embodiments, this step comprises: the flexible circuit board with its coupled rigid facets is placed on one side of a rigid housing having the desired shape. This allows the rigid facets to fall into defined spaces in the housing and the flexible circuit board to bend at defined creases/gaps between the rigid facets. After placing the flexible circuit board with its coupled rigid facets on one side of the rigid housing, the opposing mating side of the rigid housing may be attached to the first side, thereby sealing the assembly into a desired shape.
Particular embodiments may repeat one or more steps of method 2500 where appropriate. Although this disclosure describes and illustrates particular steps of method 2500 as occurring in a particular order, this disclosure contemplates any suitable steps of method 2500 occurring in any suitable order (e.g., any temporal order). Further, while this disclosure describes and represents an exemplary method of manufacturing an electronic assembly using a flexible circuit board, this disclosure contemplates any suitable method of manufacturing an electronic assembly using a flexible circuit board, which may include all, some, or none of the steps of method 2500, where appropriate. Additionally, although this disclosure describes and illustrates particular components, devices, or systems performing particular steps of method 2500, this disclosure contemplates any suitable combination of any suitable components, devices, or systems performing any suitable steps of method 2500.
FIGS. 26-36 illustrate a distributed multi-screen array for high-density displays according to some embodiments. In general, to provide a near-eye display capable of simulating the entire field of view of a single human eye, a high dynamic range image display with a resolution orders of magnitude greater than that of today's common display screens is required. Such a display should be able to provide a light field display with angular and spatial resolution sufficient to accommodate 20/20 human visual acuity. This is a large amount of information, equivalent to a total horizontal pixel count of 100K to 200K. These displays should also wrap around the entire field of view of a human eye (approximately 160° horizontally and 130° vertically). To present binocular vision, a pair of such displays spanning the entire curved surface around each eye would be required. However, typical displays available today do not meet these requirements.
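As a back-of-envelope check on the figures above (my reconstruction, not the disclosure's derivation): 20/20 acuity corresponds to roughly one arcminute, and a light field display needs several angular samples (views) per spatial sample. Assumed view counts of 10-20 per sample reproduce the quoted 100K-200K range:

```python
# Reconstruction of the 100K-200K total horizontal pixel figure quoted above.
# The 10-20 views-per-sample values are assumptions chosen for illustration.

FOV_HORIZONTAL_DEG = 160   # horizontal field of view of a human eye
ARCMIN_PER_DEG = 60        # 20/20 acuity resolves roughly one arcminute

spatial_samples = FOV_HORIZONTAL_DEG * ARCMIN_PER_DEG  # 9600 spatial samples

low = spatial_samples * 10    # ~10 angular views per sample ->  96,000 pixels
high = spatial_samples * 20   # ~20 angular views per sample -> 192,000 pixels
print(f"{low:,} to {high:,} horizontal pixels")
```

This also makes concrete the spatial-versus-angular tradeoff noted earlier: for a fixed pixel budget, more views per sample means fewer spatial samples, and vice versa.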
To address these and other limitations of current displays, embodiments of the present disclosure provide an array of custom sized and shaped small high-resolution microdisplays (e.g., display facets 2665), all of which are formed and then assembled on a larger flexible circuit board 2210, which flexible circuit board 2210 may be formed into a 3D shape (e.g., a hemispherical surface). The microdisplay can be mounted inside the hemispherical circuitry where another layer containing an array of TFT logic cells (e.g., logic cells 755) can be included to handle all power and signal management. Typically, one logic unit 755 may be included for each microdisplay. Each microdisplay acts as a separate unit, displaying data from the logic unit behind it. Any additional information (e.g., such as external video for AR, VR, or MR applications) may be passed to the entire array via the central control processor. In some embodiments, the external data signals are advanced from one microdisplay to the next in sequence as a packetized multiplexed stream, while the TFT logic for each display determines the source and portion of the read signal. This allows each cell to operate independently of any other display, providing a large array of many high resolution displays with unique content on each high resolution display, such that the entire assembly together forms essentially a single very high resolution display.
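The packetized, daisy-chained distribution scheme described above can be sketched as follows; the packet format and facet addresses are hypothetical, since the disclosure does not specify them. Each facet's TFT logic keeps the packets addressed to it and forwards the rest downstream.

```python
# Hypothetical sketch of the serial packetized multiplexed stream described
# above: (facet_id, payload) packets travel from one microdisplay's logic unit
# to the next; each logic unit consumes the packets addressed to it and
# forwards the remainder. Packet format and addressing are assumptions.

def run_chain(packets, facet_ids):
    """Deliver (facet_id, payload) packets along a serial chain of facets."""
    delivered = {fid: [] for fid in facet_ids}
    for fid in facet_ids:                 # packets pass through facets in order
        remaining = []
        for dest, payload in packets:
            if dest == fid:
                delivered[fid].append(payload)     # consumed by this facet's TFT logic
            else:
                remaining.append((dest, payload))  # forwarded downstream
        packets = remaining
    return delivered

stream = [("B", "frame-0"), ("A", "frame-0"), ("C", "frame-0"), ("A", "frame-1")]
out = run_chain(stream, ["A", "B", "C"])
# out["A"] holds both packets addressed to facet A, in arrival order.
```

Because each facet filters the stream independently, any one microdisplay can show unique content without coordination with its neighbors, which is what lets the array behave as a single very high resolution display.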
To meet the requirements of resolution, color definition, and luminance output, each microdisplay may have a unique high-performance pixel architecture. For example, each microdisplay screen may include an array of display pixels 100 as described in FIGS. 1-4 and their associated description in U.S. patent application No. 15/724,004 entitled "Stacked Transparent Pixel structures for Electronic Displays," which is hereby incorporated by reference in its entirety. The microdisplay screen can be assembled on the same substrate using any suitable method. Such simultaneous manufacturing using standard semiconductor layering and photolithography processes virtually eliminates the overhead and costs associated with the production and packaging of many individual screens, greatly increasing affordability.
FIG. 26 illustrates a cross-sectional view of a curved multi-display array 2600 in accordance with certain embodiments. FIG. 26 is essentially the back side of the distributed multi-aperture camera array 3700 of FIG. 37.
In some embodiments, each
In general, each
The array of
The
In addition to having a selectable/controllable display resolution, each
Fig. 27 shows an exploded view of the curved multi-display array 2600 of fig. 26, and fig. 28-29 show additional details of the
FIGS. 30 and 32 show the back side of the flexible circuit board 2210 of FIG. 22, with similar details as described with reference to FIG. 23. FIGS. 31 and 33 represent serial data flow through the flexible circuit board 2210, with similar details as described with reference to FIG. 24. FIG. 34 illustrates an array of display facets 2665.
FIG. 36 illustrates a method of manufacturing the curved multi-display array 2600 of FIG. 26, in accordance with certain embodiments.
At
At
Particular embodiments may repeat one or more steps of the method of FIG. 36, where appropriate.
FIGS. 37-42 illustrate a distributed multi-aperture camera array 3700 according to some embodiments. In general, to capture the entire light field of the full field of view of a single human eye, a large high dynamic range image sensor with a resolution much higher than is currently available is required. Such an image sensor would enable a light field camera with angular and spatial resolution sufficient to accommodate 20/20 human visual acuity. This is a large amount of information, equivalent to a total horizontal pixel count of 100K to 200K. Such a multi-aperture image sensor must also encompass the entire field of view of a human eye (approximately 160° horizontally and 130° vertically). To image binocular vision, a pair of such cameras spanning the entire curved surface around each eye would be required. Typical image sensor assemblies available today do not meet these requirements.
To overcome these and other limitations of typical image sensors, embodiments of the present disclosure provide an array of custom-sized and custom-shaped small image sensors, all assembled on a larger flexible circuit board 2210 that is formed into a 3D (e.g., hemispherical) shape. The image sensors (e.g., sensor facets 3735) are mounted to the outside of the flexible circuit board 2210, where another layer containing an array of TFT logic units (e.g., logic units 755) may be provided to handle all power and signal management, with one logic unit provided for each image sensor. Each image sensor acts as a discrete unit, passing its readout data to the logic unit behind it (in embodiments that include logic units), where the data is processed and routed accordingly (e.g., in some embodiments, to the corresponding display facet 2665). This allows each image sensor to operate independently of the others.
To meet the requirements of resolution, color definition, and luminance output, each microsensor may have a unique high-performance pixel architecture. For example, each microsensor may comprise an array of sensor pixels 1800 as described in FIGS. 18-20 and their associated description in U.S. patent application No. 15/724,027 entitled "Stacked Transparent Pixel Structures for Image Sensors," which is hereby incorporated by reference in its entirety. The microsensors can be assembled on the same substrate using any suitable method. Such simultaneous manufacturing using standard semiconductor layering and photolithography processes virtually eliminates the overhead and costs associated with the production and packaging of many individual sensors, greatly increasing affordability.
Another characteristic of some embodiments of the distributed multi-aperture camera array 3700 is built-in depth perception based on the disparity between different plenoptic cells. The images produced by cells on opposite sides of a given sensor can be used to calculate an offset in image detail, where the offset distance is directly related to the proximity of that detail to the sensor surface. This scene information can be used by the central processor when overlaying any augmented video signal, allowing AR/MR content to be placed in front of the viewer at the appropriate depth. The information can also be used for various artificial focus blurring and depth-sensing tasks, including simulating depth of field, spatial edge detection, and other visual effects.
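A minimal sketch of the disparity-to-depth relationship described above, using the classic pinhole-stereo formula depth = f × B / d. The baseline, focal length, and disparity values are illustrative assumptions, not figures from the disclosure, which computes offsets between plenoptic cells on opposite sides of the sensor rather than between two conventional cameras.

```python
# Illustrative stereo-depth calculation for the disparity-based depth sensing
# described above. The pinhole model and all numbers are assumptions.

def depth_from_disparity(baseline_mm, focal_px, disparity_px):
    """Classic pinhole-stereo relation: depth = focal * baseline / disparity."""
    if disparity_px <= 0:
        # No measurable offset: the detail is effectively at infinity.
        return float("inf")
    return focal_px * baseline_mm / disparity_px

# Two cells 50 mm apart with a 500-pixel focal length: a 10-pixel offset in
# image detail places that detail 2500 mm from the sensor surface, while a
# 25-pixel offset indicates a nearer detail.
far = depth_from_disparity(50, 500, 10)
near = depth_from_disparity(50, 500, 25)
```

The inverse relationship (larger offset, nearer detail) is exactly the "offset distance is directly related to proximity" behavior described above, and is what lets the central processor place AR/MR content at a consistent depth.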
FIG. 37 shows a cross-sectional view of a distributed multi-aperture camera array 3700 according to some embodiments. FIG. 37 is essentially the opposite side of the curved multi-display array 2600 of FIG. 26.
In some embodiments, each
In general, each
The array of
The
In addition to having a selectable/controllable resolution, each
Fig. 38-39 show exploded views of the distributed multi-aperture camera array 3700 of fig. 37, in accordance with certain embodiments. As shown in these figures, each
FIG. 42 illustrates a method of manufacturing the distributed multi-aperture camera array 3700 of FIG. 37, in accordance with certain embodiments.
At
At
Particular embodiments may repeat one or more steps of the method of FIG. 42, where appropriate.
Herein, "or" is inclusive and not exclusive, unless expressly indicated otherwise or indicated otherwise by context. Thus, herein, "A or B" means "A, B, or both," unless expressly indicated otherwise or indicated otherwise by context. Moreover, "and" is both joint and several, unless expressly indicated otherwise or indicated otherwise by context. Thus, herein, "A and B" means "A and B, jointly or severally," unless expressly indicated otherwise or indicated otherwise by context.
The scope of the present disclosure includes all changes, substitutions, variations, alterations, and modifications to the exemplary embodiments described or illustrated herein that a person having ordinary skill in the art would comprehend. The scope of the present disclosure is not limited to the exemplary embodiments described or illustrated herein. Moreover, although the present disclosure describes and illustrates various embodiments herein as including particular components, elements, functions, operations, or steps, any of these embodiments may include any combination or permutation of any of the components, elements, functions, operations, or steps described or illustrated anywhere herein that a person having ordinary skill in the art would understand. Further, reference in the appended claims to an apparatus or system or a component of an apparatus or system adapted to, arranged to, capable of, configured to, enabled to, operable to, or operative to perform a particular function includes the apparatus, system, component, whether or not it or the particular function is activated, enabled, or unlocked, so long as the apparatus, system, or component is so adapted, arranged, capable, configured, enabled, operable, or operative.