Observation system for interpupillary distance compensation based on head movement

Document No.: 690444  Publication date: 2021-04-30

Reading note: This technology, "Observation system for interpupillary distance compensation based on head movement," was created by S. A. Miller on 2019-08-02. Its main content: The present invention provides a viewing system, comprising: an augmented reality system that generates a visual presentation to a user based at least in part on the user's IPD; and an IPD compensator to adjust the visual presentation based on an IPD compensation factor.

1. A viewing system, comprising:

an interpupillary distance (IPD) detector positionable to detect an IPD of a user and generate IPD data;

a head motion detector device that generates head motion data based on motion of the user's head;

a correlator connected to the IPD detector and the head motion detector device to generate a correlation between the IPD data and the head motion data; and

a storage system connected to the correlator to store the correlation.

2. The viewing system of claim 1, further comprising:

an apparatus frame securable to the user's head, the IPD detector and the head motion detector device being secured to the apparatus frame.

3. The viewing system of claim 2, wherein the IPD detector is a camera having a field of capture oriented toward the user's eye.

4. The viewing system of claim 2, wherein the head motion detector device comprises one or more accelerometers, gyroscopes, Inertial Measurement Units (IMUs), or cameras.

5. The viewing system of claim 2, wherein the head motion detector device determines at least one of a rotation and a position of the user's head.

6. The viewing system of claim 2, further comprising:

a bite interface for the user to bite on to securely attach the apparatus frame to the user's head.

7. The viewing system of claim 6, wherein the user can accelerate their head while the IPD data is collected.

8. The viewing system of claim 1, further comprising:

an IPD compensation factor calculator that calculates an IPD compensation factor based on the correlation.

9. The viewing system of claim 1, further comprising:

an augmented reality system that generates a visual presentation to the user based at least in part on the user's IPD; and

an IPD compensator to adjust the visual presentation based on an IPD compensation factor.

10. A viewing system, comprising:

an augmented reality system that generates a visual presentation to a user based at least in part on the user's IPD; and

an IPD compensator to adjust the visual presentation based on an IPD compensation factor.

11. The viewing system of claim 10, further comprising:

a pitch angle detector that detects a pitch angle of the user's head, wherein the IPD compensation factor depends on the pitch angle detected by the pitch angle detector.

12. The viewing system of claim 10, further comprising:

an observation calibration system that directs the user to conduct a series of observation exercises to determine one or more IPD compensation factors.

13. The viewing system of claim 12, further comprising:

an IPD detector positionable to detect an IPD of a user and generate IPD data;

a head motion detector device that generates head motion data based on motion of the user's head;

a correlator connected to the IPD detector and the head motion detector device to generate a correlation between the IPD data and the head motion data; and

a storage system connected to the correlator to store the correlation.

14. The viewing system of claim 13, further comprising:

an apparatus frame securable to the user's head, the IPD detector and the head motion detector device being secured to the apparatus frame.

15. The viewing system of claim 14, wherein the IPD detector is a camera having a field of capture oriented toward the user's eye.

16. The viewing system of claim 14, wherein the head motion detector device comprises one or more accelerometers, gyroscopes, Inertial Measurement Units (IMUs), or cameras.

17. The viewing system of claim 14, wherein the head motion detector device determines at least one of a rotation and a position of the user's head.

18. The viewing system of claim 14, further comprising:

a bite interface for the user to bite on to securely attach the apparatus frame to the user's head.

19. The viewing system of claim 18, wherein the user can accelerate their head while the IPD data is collected.

Technical Field

The present invention relates to connected mobile computing systems, methods, and configurations, and more particularly to mobile computing systems, methods, and configurations featuring at least one wearable component that may be used for virtual and/or augmented reality operations.

Background

Mixed reality or augmented reality near-eye displays are desired to be lightweight, low cost, have a small form factor, have a wide virtual image field of view, and be as transparent as possible. In addition, it is desirable to have a configuration that presents virtual image information in multiple focal planes (e.g., two or more) in order to accommodate a wide variety of use cases without exceeding an acceptable vergence-accommodation mismatch tolerance.

Disclosure of Invention

The invention provides a viewing system. It includes an interpupillary distance (IPD) detector positionable to detect an IPD of a user and generate IPD data, a head motion detector device that generates head motion data based on motion of the user's head, a correlator connected to the IPD detector and the head motion detector device to generate a correlation between the IPD data and the head motion data, and a storage system connected to the correlator to store the correlation.

The viewing system may further include: an apparatus frame securable to the user's head, the IPD detector and the head motion detector device being secured to the apparatus frame.

The IPD detector may be a camera having a field of capture oriented toward an eye of the user.

The head motion detector device may comprise one or more accelerometers, gyroscopes, inertial measurement units (IMUs), or cameras.

The head motion detector device may determine at least one of a rotation and a position of the user's head.

The viewing system may further include: a bite interface for the user to bite on to securely attach the apparatus frame to the user's head.

The user may accelerate their head while the IPD data is collected.

The viewing system may further include: an IPD compensation factor calculator that calculates an IPD compensation factor based on the correlation.

The viewing system may further include: an augmented reality system that generates a visual presentation to the user based at least in part on the user's IPD, and an IPD compensator that adjusts the visual presentation based on the IPD compensation factor.

The present invention also provides a viewing system comprising: an augmented reality system that generates a visual presentation to a user based at least in part on the user's IPD, and an IPD compensator that adjusts the visual presentation based on an IPD compensation factor.

The viewing system may further include: a pitch angle detector that detects a pitch angle of the user's head, wherein the IPD compensation factor depends on the pitch angle detected by the pitch angle detector.

The viewing system may further include: an observation calibration system that directs the user to conduct a series of observation exercises to determine one or more IPD compensation factors.

The viewing system may further include: an IPD detector positionable to detect an IPD of the user and generate IPD data, a head motion detector device that generates head motion data based on motion of the user's head, a correlator connected to the IPD detector and the head motion detector device to generate a correlation between the IPD data and the head motion data, and a storage system connected to the correlator to store the correlation.

The viewing system may further include: an apparatus frame securable to the user's head, the IPD detector and the head motion detector device being secured to the apparatus frame.

The IPD detector may be a camera having a field of capture oriented toward an eye of the user.

The head motion detector device may comprise one or more accelerometers, gyroscopes, inertial measurement units (IMUs), or cameras.

The head motion detector device may determine at least one of a rotation and a position of the user's head.

The viewing system may further include: a bite interface for the user to bite on to securely attach the apparatus frame to the user's head.

The user may accelerate their head while the IPD data is collected.

Drawings

The invention is further described, by way of example, with reference to the accompanying drawings, in which:

FIG. 1 is a schematic diagram illustrating an augmented reality viewing system;

FIG. 2 is a schematic diagram of a user showing various head movements of the user and changes in the user's interpupillary distance (IPD);

FIG. 3 is a view similar to FIG. 2, with the user tilting their head in an upward direction;

FIGS. 4A and 4B are perspective views showing a user with an experimental apparatus for detecting IPD compensation based on head motion;

FIG. 5 is a graph showing IPD compensation factors with respect to head pitch;

FIG. 6A is a flow chart illustrating operation without IPD head rotation compensation;

FIG. 6B is a flow chart illustrating operation with IPD head rotation compensation;

FIG. 6C is a flow chart illustrating development of user-specific IPD compensation relationships;

FIG. 7 is a partial top view and partial block diagram of an augmented reality system; and

FIG. 8 is a top view of an augmented reality system showing its IPD compensation feature.

Detailed Description

Referring to FIG. 1, an augmented reality system is shown featuring a head-mounted viewing assembly (2), a handheld controller assembly (4), and an interconnected auxiliary computing or controller assembly (6) that may be configured to be worn on the user, for example as a belt pack. Each of these components may be operatively coupled (10, 12, 14, 16, 17, 18) to each other and to other connected resources (8), such as cloud computing or cloud storage resources, via wired or wireless communication configurations, such as those specified by IEEE 802.11, Bluetooth (RTM), and other connectivity standards and configurations. Various aspects of such components are described, for example, in U.S. patent application serial nos. 14/555,585, 14/690,401, 14/331,218, 15/481,255, and 62/518,539, each of which is incorporated herein by reference in its entirety, including various embodiments of the two depicted optical elements (20) through which the user may see the world around them, along with visual components that may be produced by associated system components, for an augmented reality experience.

In various embodiments, such as many of those described in the aforementioned patent applications, one or more components may feature devices or sub-components, such as accelerometers, gyroscopes, potentiometers, integrated inertial measurement units ("IMUs"), and cameras, which are used to determine or estimate the position and/or orientation of mutually coupled user body parts, such as the position or orientation of a user's head when coupled to an instrumented head-mounted viewing component (2), and to facilitate determination of their linear and/or angular velocity and/or acceleration. In various embodiments, it may be valuable for the system to utilize an individual user's interpupillary distance ("IPD") as at least one input in presenting visual information related to an augmented or virtual reality experience to that user. In various embodiments, it is convenient to simply measure the user's IPD and provide this information as user input to the system prior to use; in other embodiments, the system may be configured to utilize an inward-facing (i.e., toward the user's eyes) device such as a camera to automatically determine the user's IPD information before and/or during the runtime of various applications or presented information.

As discussed in further detail below, while utilizing various embodiments of an augmented reality system and associated testing apparatus, we have determined that various users may benefit from compensation or adjustment of the positioning of the presented augmented reality information as they rotate or reorient their heads relative to the rest of their body and their surrounding environment. For example, in one embodiment, it may be valuable to have a compensation factor that slightly changes the z-axis position of the presented augmented reality information (i.e., straight out of the plane of the user's face) as the pitch position of the user's head changes. This may be related to at least some of these users experiencing an actual or functional change in IPD as they change the pitch of their heads, yaw their heads sideways, or even roll their heads (i.e., such as around a z-axis extending straight out from their noses).
In one embodiment, changes in the IPD that are correlated with head orientation (such as in the form of an equation or lookup table relating IPD adjustment or compensation factors to head orientation) may be utilized as compensation variables when generating the augmented reality information presented to the user in such configurations.
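
By way of illustration only, the following minimal Python sketch shows one form such a compensation variable could take, as a lookup table interpolated over head pitch; the table values, the millimeter units, and the function names are invented for this sketch and are not taken from the described system.

```python
import numpy as np

# Illustrative sketch only: a calibration lookup table mapping head pitch
# (degrees) to an IPD adjustment (mm). All values below are invented.
PITCH_TABLE_DEG = np.array([-90.0, -45.0, 0.0, 45.0, 90.0])
IPD_ADJUST_MM = np.array([-0.6, -0.3, 0.0, 0.4, 0.8])

def ipd_compensation_mm(pitch_deg: float) -> float:
    """Interpolate the IPD adjustment for the current head pitch."""
    return float(np.interp(pitch_deg, PITCH_TABLE_DEG, IPD_ADJUST_MM))

def compensated_ipd_mm(base_ipd_mm: float, pitch_deg: float) -> float:
    """Functional IPD to feed the presentation pipeline at this head pitch."""
    return base_ipd_mm + ipd_compensation_mm(pitch_deg)
```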

Referring to FIG. 2, there is shown a representation of a user's body (30) with attached head (32), the user being located within a room comprising fixed walls (22, 24), a floor (28), and a ceiling (26); these may be associated with a global coordinate system for the room featuring X, Y, and Z Cartesian axes (40, 42, and 44, respectively). Another coordinate system may be associated with the user's head (32) such that the Z-axis (38) extends approximately straight out from the face, and the X (34) and Y (36) axes are orthogonal to the Z-axis (38), as shown in FIG. 2. The user's head (32) is oriented such that the Z-axis (38) is substantially parallel to the Z-axis (44) of the room's global coordinate system, and the gaze vectors (52, 54) from the user's eyes are focused on a target (46), which may be virtual or actual, located on the left wall (22) at a position that produces a horizontal eye gaze (52, 54) substantially parallel to the Z-axis (38) of the user's head, which in this example is also substantially parallel to the floor and to the Z-axis (44) of the room coordinate system. From such a position, the user may tilt their head down toward the floor or up toward the ceiling. The depicted position may be considered a zero-rotation position; a typical person can pitch down to about -90 degrees toward the floor and up to about +90 degrees toward the ceiling. In the zero-rotation position, the IPD (50) may be measured manually and/or automatically using aspects of the augmented reality system wearable assembly (2).
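
As a bookkeeping aid for the axis conventions just described, the short Python sketch below recovers head pitch from a head pose matrix; it additionally assumes (the text does not specify) that the room's Y-axis (42) is vertical, and the function name and matrix layout are hypothetical.

```python
import numpy as np

# Hypothetical helper, assuming the FIG. 2 conventions: head Z points straight
# out of the face, room Z is horizontal at zero rotation, and (an added
# assumption) room Y is vertical. Pitch: 0 = level gaze, +90 = ceiling,
# -90 = floor.
def head_pitch_degrees(world_from_head: np.ndarray) -> float:
    forward = world_from_head[:3, 2]               # head Z axis in room axes
    horizontal = np.hypot(forward[0], forward[2])  # room X/Z horizontal plane
    return float(np.degrees(np.arctan2(forward[1], horizontal)))
```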

Referring to FIG. 3, the same user (30, 32) is shown with the head rotated (56) up to a pitch of about +50 degrees relative to the plane of the floor (28; or the Z-axis (44) of the room coordinate system), and with the gaze (52, 54) of the user's eyes, through the augmented reality system wearable assembly (2), directed at a second target (48). In such a rotational configuration, aspects of the augmented reality system wearable assembly (2) may be used to manually and/or automatically measure the IPD. In laboratory experiments using such a configuration with different subject users, we found that the IPD (50) varies as the head pitch angle varies.

Using experimental apparatus such as that shown in FIGS. 4A and 4B, we have collected data on linear and/or rotational position (i.e., relative to the room or ambient environment), linear and/or rotational velocity, and linear and/or rotational acceleration from repositioning of the heads of various users. The depicted apparatus includes a high-resolution camera (62) having a capture field oriented toward a user's eye (70) such that the user's IPD may be measured from video information captured by a mutually coupled (64) computing system; it may also include one or more angular or linear motion measurement devices, such as accelerometers, gyroscopes, IMUs, or cameras, operatively coupled to the apparatus frame (68) and configured to determine rotation/position based on images captured from the surrounding environment (i.e., "head pose" determination based on computer vision techniques). The apparatus frame (68), to which the camera device (62) is fixedly coupled, is removably coupled to the user's head using a bite interface (66) for the user to bite on, so that the user can relatively easily move and accelerate their head while data related to their eyes and IPD is acquired. Referring to FIG. 5, a graph (72) of sample data from a set of user subjects is shown, featuring a plot (76) of diopter error versus head pitch angle; also shown is a polynomial equation (74) that can be mathematically fit to the sample data and used as an IPD compensation factor (zero pitch as shown in FIG. 2; pitch in degrees; -90 is the user looking generally vertically down at the floor; +90 is the user looking generally straight up at the ceiling). It can be seen that, in the sample experimental data shown in FIG. 5, the diopter error generally increases slightly as the user's head pitches from -90, gradually through 0, and then up to +90. An associated IPD compensation factor (74) developed from the sample experimental data can be used as an input to the augmented reality system such that focus is maintained, for example, during pitch rotation of the user's head.
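
A minimal sketch of producing such a polynomial fit is given below using numpy; the sample pitch angles and diopter-error values are invented placeholders rather than the FIG. 5 data, and the cubic degree is one arbitrary choice.

```python
import numpy as np

# Invented placeholder samples of diopter error at several head pitch angles;
# these stand in for the FIG. 5 measurements, which are not reproduced here.
pitch_deg = np.array([-90.0, -60.0, -30.0, 0.0, 30.0, 60.0, 90.0])
diopter_error = np.array([0.02, 0.05, 0.08, 0.10, 0.13, 0.16, 0.18])

# Fit a low-order polynomial to serve as the IPD compensation factor curve.
coeffs = np.polyfit(pitch_deg, diopter_error, deg=3)
compensation_curve = np.poly1d(coeffs)

print(compensation_curve(50.0))  # evaluate at +50 degrees, as in FIG. 3
```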

An apparatus such as that shown in FIGS. 4A and 4B, or a virtual or augmented reality system with appropriate components, such as the system shown in FIGS. 1 and 2, may be used to obtain not only information on the relationship between the measured IPD and the head pitch angle position, but also the linear and/or angular velocity of the pitch relative to the surroundings, and the linear and/or angular acceleration of the pitch relative to the surroundings. Further, such relationships may be determined for other axes, such as the orthogonal yaw and roll axes. We have experimentally observed changes in eye positioning associated with changes in position, velocity, and acceleration about each of these axes.

Referring to FIG. 6A, a configuration without IPD head rotation compensation is shown, in which, for example, a user wears a calibrated (i.e., with an initial input or determination of IPD) augmented reality system (80). When the user gazes at a first target in space (82), the system is configured to generate a visual presentation, or portion thereof, related to the user gazing at the first target based at least in part on the IPD of the user (84). The user may change gaze to a second target (86), and the system may be similarly configured to generate a visual presentation, or portion thereof, related to the user gazing at the second target (88), again based at least in part on the user's IPD (i.e., variables related to head position or rotation are not compensated for).

Referring to FIG. 6B, a compensation configuration is shown in which, after initial calibration (80) and gazing at the first target (82), the system is configured to generate a visual presentation, or portion thereof, related to the user gazing at the first target based at least in part on an IPD compensated for head orientation, such as the head pitch angle when viewing the first target, as determined by the system. Then, if the user changes gaze to a second target (86), the system is configured to generate a visual presentation, or portion thereof, related to the user gazing at the second target based at least in part on an IPD compensated for head orientation (such as the head pitch angle when viewing the second target, as determined by the system).
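
A hypothetical render-loop fragment for this flow is sketched below; the render_engine and head_tracker objects, their method names, and the compensation callable are assumptions made for illustration, not parts of the described system.

```python
# Hypothetical fragment of the FIG. 6B flow: before generating the presentation
# for each gaze target, the functional IPD is adjusted for the head pitch
# reported by the head tracker. All object and method names are illustrative.
def render_for_target(render_engine, head_tracker, base_ipd_mm, compensation, target):
    pitch_deg = head_tracker.pitch_degrees()        # current head orientation
    ipd_mm = base_ipd_mm + compensation(pitch_deg)  # e.g. a fitted polynomial
    render_engine.set_ipd(ipd_mm)                   # adjusted IPD for rendering
    render_engine.draw(target)                      # presentation for target
```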

Referring to FIG. 6C, the system itself may be used to develop one or more compensation relationships for a particular user. As shown in FIG. 6C, the user may wear a calibrated augmented reality system (e.g., with the IPD determined by the system at a horizontal head pitch angle, such as the IPD associated with gazing to infinity along a substantially horizontal line of sight) (100). To determine any change in the actual or functional IPD of the user's visual system with various positions, angular or linear velocities, or angular or linear accelerations, the observation calibration system may direct the user to conduct a series of observation exercises (i.e., in which the user positions and accelerates their head while the system captures data (102) related to the actual and/or functional IPD). The system may be configured to determine a user IPD compensation configuration (such as a look-up table or one or more mathematical relationships) that may vary (104) with the various positions, angular or linear velocities, or angular or linear accelerations, to complete the user's IPD compensation configuration (106). Then, when the user gazes at a first target in space (82), the system may be configured to generate a visual presentation, or portion thereof, related to the user gazing at the first target (108), based at least in part on the user IPD compensation configuration for the position, velocity, and/or acceleration (angular and/or Cartesian) of the user's head with respect to the first target. Then, while the user gazes at a second target (86), the system may be configured to generate a visual presentation, or portion thereof, related to the user gazing at the second target (110), based at least in part on the user IPD compensation configuration for the position, velocity, and/or acceleration (angular and/or Cartesian) of the user's head gazing at the second target.
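
One way such a calibration pass might look in code is sketched below, under the simplifying assumption that the compensation configuration is a single polynomial in head pitch; the guide, head_tracker, and ipd_camera interfaces are hypothetical, and a fuller implementation would also record velocities and accelerations as described above.

```python
import numpy as np

# Hypothetical sketch of the FIG. 6C calibration pass: record (pitch, IPD)
# pairs while the user performs guided observation exercises, then fit a
# compensation configuration. Interfaces are invented for illustration.
def run_calibration(guide, head_tracker, ipd_camera, samples_per_exercise=200):
    pitches, ipds = [], []
    for exercise in guide.observation_exercises():
        guide.prompt(exercise)                     # direct the user's movement
        for _ in range(samples_per_exercise):
            pitches.append(head_tracker.pitch_degrees())
            ipds.append(ipd_camera.measure_ipd_mm())
    # Reduce the recordings to a mathematical relationship (here, a cubic).
    coeffs = np.polyfit(np.asarray(pitches), np.asarray(ipds), deg=3)
    return np.poly1d(coeffs)                       # the compensation config
```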

FIG. 7 illustrates the augmented reality system 142 in more detail. The system 142 includes a stereo analyzer 144 that is connected to the rendering engine 130 and forms part of the visual data and algorithms.

System 142 further includes left and right projectors 166A and 166B and left and right waveguides 170A and 170B. The left projector 166A and the right projector 166B are connected to a power supply. Each projector 166A and 166B has a respective input for image data to be provided to the respective projector 166A or 166B. The respective projector 166A or 166B generates and emits light in a two-dimensional pattern when energized. Left waveguide 170A and right waveguide 170B are positioned to receive light from left projector 166A and right projector 166B, respectively. The left waveguide 170A and the right waveguide 170B are transparent waveguides.

In use, a user mounts the head-mounted frame 140 to their head. The components of the head-mounted frame 140 may, for example, include straps (not shown) that wrap around the back of the user's head. Left and right waveguides 170A and 170B are then positioned in front of the user's left and right eyes 220A and 220B.

The rendering engine 130 inputs the image data it receives into the stereo analyzer 144. The image data is projected onto a plurality of virtual planes. The stereo analyzer 144 analyzes the image data to determine a left image data set and a right image data set for projection onto each depth plane. The left and right image data sets represent two-dimensional images that give the user a perception of depth in the form of a three-dimensional projection.
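
To illustrate where the (possibly compensated) IPD enters this left/right split, the sketch below derives per-eye poses from a single head pose by offsetting half the IPD along the head's X-axis; it illustrates the geometric role of the IPD only, not the actual projection math of the system.

```python
import numpy as np

# Minimal geometric sketch: one head pose becomes left/right eye poses by a
# half-IPD translation along head X (sign convention: +X toward the user's
# right). This is an illustration, not the system's projection pipeline.
def eye_view_matrices(world_from_head: np.ndarray, ipd_mm: float):
    """world_from_head: 4x4 head pose. Returns (left, right) 4x4 eye poses."""
    half_m = ipd_mm / 2000.0          # half the IPD, converted mm -> meters
    left = np.eye(4)
    left[0, 3] = -half_m              # left eye sits at -X in head coordinates
    right = np.eye(4)
    right[0, 3] = +half_m             # right eye sits at +X
    return world_from_head @ left, world_from_head @ right
```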

The stereo analyzer 144 inputs the left image data set and the right image data set into the left projector 166A and the right projector 166B. The left projector 166A and the right projector 166B then create left and right light patterns. The components of the system 142 are shown in plan view, but it should be understood that the left and right patterns are two-dimensional patterns when shown in elevation view. Each light pattern includes a plurality of pixels. For purposes of illustration, light rays 224A and 226A from two pixels are shown exiting the left projector 166A and entering the left waveguide 170A. Light rays 224A and 226A reflect from the sides of the left waveguide 170A. Rays 224A and 226A are shown propagating from left to right within the left waveguide 170A by internal reflection, but it should be understood that rays 224A and 226A also propagate in directions into the page by means of refraction and reflection.

Rays 224A and 226A exit the left waveguide 170A through pupil 228A and then enter the left eye 220A through pupil 230A of the left eye 220A. The light rays 224A and 226A then fall on the retina 232A of the left eye 220A. In this manner, the left light pattern falls on the retina 232A of the left eye 220A. The user perceives the pixels formed on retina 232A as pixels 234A and 236A located at some distance on the side of the left waveguide 170A opposite the left eye 220A. Depth perception is created by manipulating the focal length of the light.

In a similar manner, the stereo analyzer 144 inputs the right image data set into the right projector 166B. The right projector 166B transmits a right light pattern represented by pixels in the form of rays 224B and 226B. Rays 224B and 226B reflect within the right waveguide 170B and exit through pupil 228B. Light rays 224B and 226B then enter through pupil 230B of the right eye 220B and fall on the retina 232B of the right eye 220B. The pixels of light rays 224B and 226B are perceived as pixels 234B and 236B behind the right waveguide 170B.

The patterns created on retinas 232A and 232B are perceived as left and right images, respectively. The left and right images are slightly different from each other due to the function of the stereo analyzer 144. The left and right images are perceived as a three-dimensional rendering in the user's mind.

As described above, the left waveguide 170A and the right waveguide 170B are transparent. Light from a real object, such as a table 116 on a side of the left and right waveguides 170A and 170B opposite the eyes 220A and 220B, may be projected through the left and right waveguides 170A and 170B and fall on the retinas 232A and 232B.

FIG. 8 shows more details of the system 142 as they relate to the IPD compensation described previously. The system further includes an IPD camera 302 serving as an IPD detector, a world camera 304 and an IMU 306 that detect head motion, a correlator 308, a storage system 310, an IPD compensation factor calculator 312, an IPD compensator 314, and an observation calibration system 316. The correlator 308 is connected to the IPD camera 302, the world camera 304, and the IMU 306. The correlator 308 correlates the head motion data from the world camera 304 and the IMU 306 with the IPD data from the IPD camera 302. The storage system 310 is connected to the correlator 308 and stores the correlations generated by the correlator 308. The IPD compensation factor calculator 312 calculates an IPD compensation factor based on the stored correlations. The IPD compensator 314 is connected to the IPD compensation factor calculator 312, and the rendering engine 130 is connected to the IPD compensator 314. The IPD compensator 314 modifies the visualization created by the rendering engine 130 based on the IPD compensation factor from the IPD compensation factor calculator 312.
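
The data path from detectors through storage might be sketched as follows; the class and method names are hypothetical, and the correlation is reduced to simple paired samples for illustration.

```python
# Hypothetical sketch of the FIG. 8 data path: the correlator pairs head-motion
# samples with IPD samples and persists the result for the compensation factor
# calculator. Class, method, and key names are invented for illustration.
class Correlator:
    def __init__(self, storage):
        self.storage = storage

    def correlate(self, head_motion_samples, ipd_samples):
        # Pair each head-motion sample (e.g. a pitch angle) with the IPD
        # measured at the same instant, then store the pairs.
        correlation = list(zip(head_motion_samples, ipd_samples))
        self.storage.save("ipd_vs_head_motion", correlation)
        return correlation
```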

The observation calibration system 316 prompts the user through a series of visual tests to generate data from which one or more IPD compensation factors are calculated by the IPD compensation factor calculator 312.

While certain exemplary embodiments have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not restrictive on the broad invention, and that this invention not be limited to the specific constructions and arrangements shown and described, since modifications may occur to those ordinarily skilled in the art.
