Head-mounted visual equipment and eyeball tracking system for same

Document No.: 1390374    Publication date: 2020-02-28

Description: This technology, "Head-mounted visual equipment and eyeball tracking system for same", was created by 张扣文, 郭美杉, 宋立通, 唐溦 and 谢剑 on 2018-08-20. Its main content is as follows: the head-mounted visual device comprises a screen for projecting a virtual scene image to a user's eyeball; a VR (Virtual Reality) optical lens for establishing an optical path between the screen and the user's eyeball; and an eyeball tracking system for detecting the sight line direction of the user's eyeball. The eyeball tracking system comprises at least one light source for projecting a detection light to the user's eyeball, and a receiving module for receiving the detection light reflected from the user's eyeball, wherein the receiving module is located at a side of the VR optical lens and faces the user's eyeball, so that the detection light reflected from the user's eyeball is received directly by the receiving module. In this way, the optical path of the eyeball tracking system does not pass through the VR optical lens, which simplifies the overall optical-path design of the head-mounted visual device and facilitates implementation.

1. A head-mounted visualization device, comprising:

a screen for projecting a virtual scene image to the eyes of a user,

a VR (virtual reality) optical lens for establishing an optical path between the screen and the eyeball of the user so that the virtual scene image projected by the screen can reach the eyeball of the user through the VR optical lens; and

an eye tracking system for detecting a visual direction of an eye of the user to adjust a display position of the virtual scene image on the screen based on the visual direction, wherein the eye tracking system comprises:

at least one light source for projecting a detection light to the eyeball of the user; and

a receiving module for receiving the detection light reflected from the eyeball of the user so as to detect the sight line direction of the eyeball of the user, wherein the receiving module is located at a side of the VR optical lens and faces the eyeball of the user, so that the detection light reflected from the eyeball of the user is directly received by the receiving module.

2. The head-mounted visual device according to claim 1, wherein the receiving module comprises an optical lens and a photosensitive chip, wherein the optical lens comprises at least one optical lens for receiving the detection light reflected from the eyeball of the user, and an included angle is formed between a plane defined by the photosensitive chip and a plane defined by the at least one optical lens.

3. The head-mounted visual device according to claim 2, wherein the size of the included angle between the plane defined by the photosensitive chip and the plane defined by the at least one optical lens is determined by the included angle between an object-side optical axis defined by the eyeball of the user and a photosensitive optical axis defined by the receiving module, and by the optical parameters of the at least one optical lens.

4. The head-mounted visual device according to claim 3, wherein the included angle between the object-side optical axis defined by the eyeball of the user and the photosensitive optical axis defined by the receiving module is determined by the distance between the eyeball of the user and the VR optical lens, a preset diameter of the eyeball of the user, and the distance between the VR optical lens and the receiving module.

5. The head-mounted visual device according to claim 4, wherein the included angle between the object-side optical axis defined by the eyeball of the user and the photosensitive optical axis defined by the receiving module is in the range of 25.0° to 40.0°.

6. The head-mounted visual device according to claim 5, wherein the included angle between the object-side optical axis defined by the eyeball of the user and the photosensitive optical axis defined by the receiving module is set to 32°.

7. The head-mounted visual device according to claim 6, wherein the included angle between the plane defined by the photosensitive chip and the plane defined by the at least one optical lens is set to 20°.

8. The head-mounted visual apparatus of any one of claims 1-7, wherein said at least one optical lens of said receiving module is implemented as a monolithic aspheric optical lens.

9. The head-mounted visualization device of claim 8, wherein the eye tracking system is integrated with the VR optical lens such that the VR optical lens and the eye tracking system have a unitary structure.

10. The head-mounted visualization device of claim 8, wherein the at least one light source comprises 8 light sources, wherein the 8 light sources are circumferentially arranged around the VR optical lens for projecting the detection light to the user's eyeball.

11. The head-mounted visual device according to claim 10, wherein each of the light sources comprises a plurality of optical fibers and a non-visible light source, the plurality of optical fibers being respectively connected to the non-visible light source, so that the detection light is generated at each of the plurality of optical fibers after the non-visible light source is turned on.

12. The head-mounted visual apparatus of any one of claims 1-7, wherein the screen is movable relative to the VR optical lens to adjust the diopter of the user's eyeball by changing the distance between the screen and the VR optical lens.

13. An eye tracking system for a head-mounted visual device, comprising:

at least one light source for projecting a detection light to an eyeball of a user; and

a receiving module for receiving the detection light reflected from the eyeball of the user so as to detect the sight line direction of the eyeball of the user, wherein the receiving module is located at a side of a VR optical lens of the head-mounted visual device and faces the eyeball of the user, so that the detection light reflected from the eyeball of the user is directly received by the receiving module.

14. The eye tracking system according to claim 13, wherein the receiving module comprises an optical lens and a photosensitive chip, wherein the optical lens comprises at least one optical lens for receiving the detection light reflected from the eyeball of the user, and an included angle is formed between a plane defined by the photosensitive chip and a plane defined by the at least one optical lens.

15. The eye tracking system according to claim 14, wherein the size of the included angle between the plane defined by the photosensitive chip and the plane defined by the at least one optical lens is determined by the included angle between an object-side optical axis defined by the eyeball of the user and a photosensitive optical axis defined by the receiving module, and by the optical parameters of the at least one optical lens.

16. The eye tracking system according to claim 15, wherein the included angle between the object-side optical axis defined by the eyeball of the user and the photosensitive optical axis defined by the receiving module is determined by the distance between the eyeball of the user and the VR optical lens, a preset diameter of the eyeball of the user, and the distance between the VR optical lens and the receiving module.

17. The eye tracking system according to claim 16, wherein the included angle between the object-side optical axis defined by the eyeball of the user and the photosensitive optical axis defined by the receiving module is set to 32°, and the included angle between the plane defined by the photosensitive chip and the plane defined by the at least one optical lens is set to 20°.

18. The eye tracking system according to any one of claims 13-17, wherein said at least one optical lens of said receiving module is implemented as a single-piece aspheric optical lens.

19. The eye tracking system according to claim 18, wherein the at least one light source comprises 8 light sources, wherein 8 light sources are circumferentially disposed around the VR optical lens of the head mounted vision device for projecting the detection light to the user's eye, wherein each light source comprises a plurality of optical fibers and a non-visible light source, and the plurality of optical fibers are respectively connected to the non-visible light source, so that the detection light is generated at each optical fiber of the plurality of optical fibers after the non-visible light source is turned on.

20. An eye tracking method for a head-mounted visual device, comprising:

projecting a detection light to an eyeball of a user; and

the detection light reflected from the user's eyeball is received by a receiving module to detect the sight direction of the user's eyeball, wherein the receiving module is located at the side of the VR optical lens of the head-mounted visual equipment and faces the user's eyeball, so that the detection light reflected from the user's eyeball is directly received by the receiving module.

21. A diopter adjustment method for a head-mounted visual device, comprising:

moving the screen to adjust the distance between the screen and the VR optical lens, so as to change the diopter of the user's eyeball.

22. The diopter adjustment method of claim 21, wherein an eyeball tracking system is integrated with the VR optical lens such that the VR optical lens and the eyeball tracking system have a unitary structure.

Technical Field

The present invention relates to the field of virtual reality, and in particular, to a head-mounted visual device for implementing virtual reality and an eyeball tracking system for the same.

Background

In recent years, Virtual Reality (VR) and Augmented Reality (AR) have created unique sensory experiences for humans. Virtual reality is an interactive experience with computer-generated virtual scenes in a simulated environment. Immersive simulated environments can resemble or depart from the real world, creating sensory experiences that are not available in the ordinary physical world. Currently, numerous VR-related products are emerging on the market, through which users can immerse themselves in, and interact with, three-dimensional virtual space.

The most common VR product is the head-mounted visual device (Head-Mounted Display), which is shaped like a pair of eyeglasses. In use, a user wears the device on the head for a virtual reality experience. The mainstream supporting technologies for head-mounted visual devices include the SLAM algorithm and eye tracking technology. The SLAM (Simultaneous Localization and Mapping) algorithm mainly serves to construct the immersive virtual environment; its technical core lies in simultaneous localization and map building. For the virtual environment constructed by the SLAM algorithm, it must be ensured that the virtual environment can be observed by the human eye at the display position of the head-mounted visual device. This is the core goal of eye tracking technology: the display position of the virtual environment is adjusted by detecting the sight line direction of the human eye, thereby ensuring that the human eye can observe the virtual environment image.

However, there are still a number of technical challenges in the implementation of head-mounted visual devices, especially in the design of their optical systems. These technical difficulties seriously affect the user experience and restrict the wider industrial application of head-mounted visual devices.

Disclosure of Invention

The main objective of the present invention is to provide a head-mounted visual device and an eye tracking system for the same, wherein the optical path of the eye tracking system does not pass through a VR optical lens, so as to simplify the design of the whole optical path system of the head-mounted visual device, which is beneficial to implementation.

Another objective of the present invention is to provide a head-mounted vision device and an eye tracking system for the same, wherein the optical path of the eye tracking system does not pass through the VR optical lens, so that the VR optical lens only needs to optically process the image projected by the screen without considering the influence on the optical path of the eye tracking system. In other words, the optical parameters of the VR optical lens are relatively uniform, which facilitates the optical design and processing of the VR lens.

Another objective of the present invention is to provide a head-mounted vision apparatus and an eye tracking system for the same, wherein the optical path of the eye tracking system does not pass through the VR optical lens. In other words, the VR optical lens and the eye tracking system are independent from each other, so as to facilitate stability of the overall performance of the head-mounted visual device.

Another objective of the present invention is to provide a head-mounted visual device and an eye tracking system for the same, wherein the detection light for implementing eye tracking is directly sensed by the receiving module for implementing eye tracking after being diffusely reflected at the human eye, so that compared with the existing eye tracking system for the head-mounted visual device, a reflector for reflecting the detection light is omitted, thereby simplifying the optical system of the eye tracking system and saving the cost.

Another objective of the present invention is to provide a head-mounted visual device and an eyeball tracking system for the same, wherein an included angle is formed between a plane of a photosensitive chip of the receiving module and a plane of an optical lens of the receiving module. In other words, the plane where the photosensitive chip is located and the plane where the optical lens is located are obliquely arranged, so that the optical design requirement of the receiving module is simplified in an 'imaging plane inclination' mode, and the imaging quality is improved.

Another objective of the present invention is to provide a head-mounted visual device and an eye tracking system for the same, wherein the "relative illuminance" can be compensated by the "inclination of the imaging plane".

Another object of the present invention is to provide a head-mounted vision device and an eye tracking system for the same, wherein the light source for projecting the detection light has a relatively small size in the eye tracking system, so as to prevent the size of the light source from adversely affecting the observation of the human eye. In other words, the range of the blind zone of the human eye is reduced by the light source with smaller size.

Another objective of the present invention is to provide a head-mounted vision device and an eye tracking system for the head-mounted vision device, wherein in an embodiment of the present invention, the eye tracking system is integrated with the VR optical lens, so as to ensure the optical stability of the eye tracking system and the VR optical lens through the stable relationship between the eye tracking system and the VR optical lens.

Another object of the present invention is to provide a head-mounted vision device and an eye tracking system for the same, wherein in an embodiment of the present invention, the eye tracking system is integrated with the VR optical lens, so as to eliminate errors caused during the assembly process, reduce the weight and facilitate the later maintenance.

Another object of the present invention is to provide a head-mounted visual device and an eye tracking system for the same, wherein, in an embodiment of the present invention, the screen of the head-mounted visual device is movable relative to the VR optical lens to adjust the diopter of the human eye by adjusting the distance between the screen and the VR optical lens, so as to ensure the user experience.

Other advantages and features of the invention will become apparent from the following description and may be realized by means of the instrumentalities and combinations particularly pointed out in the appended claims.

To achieve at least one of the above objects or advantages, the present invention provides a head-mounted visual device, including:

a screen for projecting a virtual scene image to the eyes of a user,

a VR (virtual reality) optical lens for establishing an optical path between the screen and the eyeball of the user so that the virtual scene image projected by the screen can reach the eyeball of the user through the VR optical lens; and

an eye tracking system for detecting a visual direction of an eye of the user to adjust a display position of the virtual scene image on the screen based on the visual direction, wherein the eye tracking system comprises:

at least one light source for projecting a detection light to the eyeball of the user; and

a receiving module for receiving the detection light reflected from the eyeball of the user so as to detect the sight line direction of the eyeball of the user, wherein the receiving module is located at a side of the VR optical lens and faces the eyeball of the user, so that the detection light reflected from the eyeball of the user is directly received by the receiving module.

In an embodiment of the present invention, the receiving module includes an optical lens and a photosensitive chip, wherein the optical lens includes at least one optical lens for receiving the detection light reflected from the eyeball of the user, and an included angle is formed between the plane defined by the photosensitive chip and the plane defined by the at least one optical lens.

In an embodiment of the invention, the size of the included angle between the plane defined by the photosensitive chip and the plane defined by the at least one optical lens depends on the included angle between the object-side optical axis defined by the eyeball of the user and the photosensitive optical axis defined by the receiving module, and on the optical parameters of the at least one optical lens.

In an embodiment of the invention, an included angle between the object optical axis set by the user's eyeball and the photosensitive optical axis set by the receiving module is determined by a distance between the user's eyeball and the VR optical lens, a preset diameter of the user's eyeball, and a distance between the VR optical lens and the receiving module.

In an embodiment of the present invention, the range of the included angle between the object optical axis set by the user's eyeball and the photosensitive optical axis set by the receiving module is: 25.0 to 40.0 degrees.

In an embodiment of the present invention, an included angle between the optical axis set by the user's eyeball and the optical axis set by the receiving module is set to be 32 °.

In an embodiment of the invention, an included angle between a plane defined by the photo sensor chip and a plane defined by the at least one optical lens is set to be 20 °.

In an embodiment of the invention, the at least one optical lens of the receiving module is implemented as a single-piece aspheric optical lens.

In an embodiment of the invention, the eye tracking system is integrated with the VR optical lens, so that the VR optical lens and the eye tracking system have an integrated structure.

In an embodiment of the invention, the at least one light source includes 8 light sources, wherein the 8 light sources are circumferentially disposed on a periphery of the VR optical lens for projecting the detection light to an eyeball of the user.

In an embodiment of the invention, each of the light sources includes a plurality of optical fibers and a non-visible light source, and the plurality of optical fibers are respectively connected to the non-visible light source, so that the detection light is generated at each of the plurality of optical fibers after the non-visible light source is turned on.

In an embodiment of the invention, the screen is movable relative to the VR optical lens to adjust the diopter of the user's eyeball by changing the distance between the screen and the VR optical lens.

According to another aspect of the present invention, there is also provided an eye tracking system for a head-mounted visual device, comprising:

at least one light source for projecting a detection light to the eyeball of the user; and

the receiving module is used for receiving the detection light reflected from the eyeball of the user so as to detect the sight line direction of the eyeball of the user, wherein the receiving module is positioned on the side part of the VR optical lens of the head-mounted visual equipment and faces the eyeball of the user, so that the detection light reflected from the eyeball of the user is directly received by the receiving module.

In an embodiment of the invention, the receiving module includes an optical lens and a light sensing chip, wherein the optical lens includes at least one optical lens for receiving the detecting light reflected by the eyeball of the user, and an included angle is formed between a plane set by the light sensing chip and the plane set by the at least one optical lens.

In an embodiment of the invention, a size of an included angle between the plane set by the photosensitive chip and the plane set by the at least one optical lens depends on an included angle between an object optical axis set by the eyeball of the user and the photosensitive optical axis set by the receiving module and an optical parameter of the at least one optical lens.

In an embodiment of the invention, an included angle between the object optical axis set by the user's eyeball and the photosensitive optical axis set by the receiving module is determined by a distance between the user's eyeball and the VR optical lens, a preset diameter of the user's eyeball, and a distance between the VR optical lens and the receiving module.

In an embodiment of the invention, an included angle between the optical axis set by the eyeball of the user and the optical axis set by the receiving module is set to be 32 °, and an included angle between the plane set by the photosensitive chip and the plane set by the at least one optical lens is set to be 20 °.

In an embodiment of the invention, the at least one optical lens of the receiving module is implemented as a single-piece aspheric optical lens.

In an embodiment of the invention, the at least one light source includes 8 light sources, wherein the 8 light sources are circumferentially disposed around the periphery of the VR optical lens of the head-mounted visual device and are used for projecting the detection light to the eyeball of the user, and each light source includes a plurality of optical fibers and a non-visible light source, the plurality of optical fibers being respectively connected to the non-visible light source, so that the detection light is generated at each of the plurality of optical fibers after the non-visible light source is turned on.

According to another aspect of the present invention, the present invention also provides an eye tracking method for a head-mounted visual device, comprising:

projecting a detection light to an eyeball of a user; and

the detection light reflected from the user's eyeball is received by a receiving module to detect the sight direction of the user's eyeball, wherein the receiving module is located at the side of the VR optical lens of the head-mounted visual equipment and faces the user's eyeball, so that the detection light reflected from the user's eyeball is directly received by the receiving module.

According to another aspect of the present invention, the present invention also provides a diopter adjustment method for a head-mounted visual device, including:

the screen is moved to adjust the distance between the screen and the VR optical lens to change the diopter of the user's eyeball.

In an embodiment of the invention, the eye tracking system is integrated with the VR optical lens, so that the VR optical lens and the eye tracking system have an integrated structure.

Further objects and advantages of the invention will be fully apparent from the ensuing description and drawings.

These and other objects, features and advantages of the present invention will become more fully apparent from the following detailed description, the accompanying drawings and the claims.

Drawings

Fig. 1 is a schematic view of an optical path system of a conventional head-mounted visual device.

Fig. 2 is a schematic diagram of a conventional head-mounted visual apparatus in which an LED light source in an eye tracking light path projects detection light to an eye of a user.

Fig. 3 is a schematic diagram of a process of adjusting eye diopter of a conventional head-mounted visual device.

Fig. 4 is a schematic diagram illustrating an optical system of the head-mounted visual device according to a preferred embodiment of the invention.

Fig. 5 is a schematic diagram illustrating a relative position relationship between an ideal imaging surface of the receiving module and a plane where the photosensitive chip is located when the photosensitive chip of the receiving module and the photosensitive optical axis Y are arranged in a perpendicular relationship.

Fig. 6 and 7 are schematic diagrams illustrating a specific optical system design of the eye tracking system according to the preferred embodiment of the invention.

Fig. 8 and 9 are schematic diagrams illustrating the at least one light source of the eye tracking system according to the preferred embodiment of the invention.

Fig. 10 is a schematic view illustrating the process of adjusting the eyeball diopter of the head-mounted visual device according to the preferred embodiment of the invention.

Detailed Description

The following description is presented to disclose the invention so as to enable any person skilled in the art to practice the invention. The preferred embodiments in the following description are given by way of example only, and other obvious variations will occur to those skilled in the art. The basic principles of the invention, as defined in the following description, may be applied to other embodiments, variations, modifications, equivalents, and other technical solutions without departing from the spirit and scope of the invention.

It will be understood by those skilled in the art that in the present disclosure, the terms "longitudinal," "lateral," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like are used in an orientation or positional relationship indicated in the drawings for ease of description and simplicity of description, and do not indicate or imply that the referenced devices or components must be constructed and operated in a particular orientation and thus are not to be considered limiting.

It should be understood that the terms "a" and "an" are to be interpreted as "at least one" or "one or more"; that is, in one embodiment the number of an element may be one, while in another embodiment the number of that element may be plural, and the terms "a" and "an" are not to be interpreted as limiting the number.

Summary of the application

As described above, the main supporting technologies of the head-mounted visual device are the SLAM (Simultaneous Localization and Mapping) algorithm and eye tracking technology. However, there are technical difficulties in the specific implementation of the head-mounted visual device, especially in the design of its optical system.

Fig. 1 is a schematic view of the optical path system of a conventional head-mounted visual device. As shown in fig. 1, the optical path system of the head-mounted visual device is mainly composed of a virtual scene imaging optical path and an eyeball tracking optical path. The virtual scene imaging optical path runs: screen 1P → VR optical lens 2P → human eye 3P. In operation, the virtual scene image projected by the screen 1P passes through the VR optical lens 2P and reaches the eyeball 3P of the user, allowing the user to observe and interact with the virtual scene image. The eyeball tracking optical path runs: light source 4P → human eye 3P → VR optical lens 2P → reflector 5P → receiving module 6P. The function of the eyeball tracking optical path is to detect the direction of the user's sight line, so that the display position of the virtual scene image on the screen 1P can be adjusted based on that direction, ensuring that the virtual scene image lies within the region observable by the human eye. However, such an arrangement of the optical system has many drawbacks.

First, both the virtual scene imaging optical path and the eye tracking optical path pass through the VR optical lens 2P. In other words, for the VR optical lens 2P, it is not only necessary to modulate visible light but also to modulate non-visible light (the light of the virtual scene projection optical path is visible light, and the light of the eyeball tracking optical path is non-visible light). Here, those skilled in the art should understand that when the optical element needs to modulate light waves of different wavelength bands to different degrees, the optical design is difficult and the structure is complex. Even if the VR optical lens 2P can be prepared by a complicated process, the stability of the performance of an optical system constituted by the VR optical lens 2P is relatively poor. It is not difficult to imagine that when the position of the VR optical lens 2P is shifted due to vibration or other unexpected factors, the performance of both the virtual scene imaging optical path and the eye tracking optical path is affected.

Next, the light source 4P of the conventional eye tracking system is usually an LED (Light Emitting Diode) light source, circumferentially installed on the periphery of the VR optical lens 2P for projecting the detection light to the user's eyes. Since the LED light sources are mounted on the periphery of the VR optical lens 2P, i.e. in the viewing direction of the human eye, their arrangement on the VR optical lens 2P limits the viewing range of the human eye. In other words, a visual blind zone is formed for the human eye. Fig. 2 is a schematic diagram of the conventional head-mounted visual device in which the LED light source in the eye tracking light path projects the detection light to the user's eye 3P. As shown in fig. 2, the visual range of the user is limited to the shadow area formed by the LED light sources, and the area outside the shadow cannot be observed by the human eye. Because the LED light sources are relatively large, their adverse effect of interfering with the visual experience is correspondingly greater.

Furthermore, the eye tracking optical path of the existing head-mounted visual device is: the light source 4P-human eye 3P-VR optical lens 2P-reflector 5P-receiving module 6P. In other words, in order to ensure the detection performance of the eye tracking system, the arrangement position of the light source 4P, the relative position relationship between the VR optical lens 2P and the reflector 5P, and the relative position relationship between the reflector 5P and the receiving module 6P need to be maintained with relatively high precision. This undoubtedly results in increased design difficulty, assembly difficulty, and structural complexity of the eye tracking system.

In addition, the head-mounted visual device needs to be adjusted according to the degree of myopia or hyperopia of the human eye, so as to meet the user's experience and demand. As shown in fig. 3, the existing technical solution for meeting this requirement is as follows: the reflector 5P and the receiving module 6P are fixed so that their relative positions remain unchanged, and the VR optical lens 2P is moved to change the focal position at which the virtual scene image on the screen 1P is imaged in the human eye. However, such an adjustment (moving the VR optical lens 2P) changes the object distance between the eyeball 3P and the receiving module 6P in the eyeball tracking optical path, so that the imaging quality of the receiving module 6P is affected, which in turn decreases the gaze direction detection accuracy of the eyeball tracking optical path.

In view of the above technical problems, the basic concept of the present invention is to change the light path design of the eye tracking system, so that the detection light for detecting the direction of the eye line of the human eye is directly received by the receiving module without passing through the VR optical lens, and thus the light path of the eye tracking system and the imaging light path of the virtual scene are relatively kept independent, thereby reducing the design difficulty of the eye tracking system, simplifying the structure thereof, and facilitating the improvement of the overall performance stability of the head-mounted visual device.

Based on this, the invention proposes a head-mounted visual device comprising: a screen for projecting a virtual scene image to the user's eyeball, a VR (virtual reality) optical lens for constructing an optical path between the screen and the user's eyeball, so that the virtual scene image projected by the screen can reach the user's eyeball through the VR optical lens; and an eyeball tracking system for detecting the sight direction of the eyeballs of the user so as to adjust the display position of the virtual scene image on the screen based on the sight direction, wherein the eyeball tracking system comprises: at least one light source for projecting a detection light to the eyeball of the user; and a receiving module, for receiving the detecting light reflected from the user's eyeball to detect the sight direction of the user's eyeball, wherein the receiving module is located at the side of the VR optical lens and faces the user's eyeball, so that the detecting light reflected from the user's eyeball is directly received by the receiving module.

Having described the general principles of the present invention, various non-limiting embodiments of the present invention will now be described in detail with reference to the accompanying drawings.

Exemplary head-mounted visual device

Referring to fig. 4 to 10, a head-mounted visual device according to a preferred embodiment of the invention is illustrated, wherein a user can wear the head-mounted visual device on the head for a virtual reality experience.

Fig. 4 is a schematic diagram illustrating an optical system of the head-mounted visual device according to the preferred embodiment of the invention. As shown in fig. 4, the optical system of the head-mounted visual device is mainly composed of two parts: a virtual scene imaging system 10 and an eye tracking system 20.

As shown in fig. 4, the virtual scene imaging system 10 includes a screen 11 and a VR (virtual reality) optical lens 12, wherein the screen 11 is used for projecting a virtual scene image to the eyeball of the user, and the VR optical lens 12 is located between the screen 11 and the eyeball 30 of the user for constructing an optical path therebetween, so that the virtual scene image projected by the screen 11 can reach the eyeball 30 of the user through the VR optical lens 12.

The eye tracking system 20 includes at least one light source 21 and a receiving module 22, wherein the at least one light source 21 is configured to project a detecting light 200 to the user's eyes 30, and the receiving module 22 is configured to receive the detecting light 200 reflected from the user's eyes 30 to detect the visual direction of the user's eyes 30. After the eye tracking system 20 obtains the eye 30 sight direction of the user, the display position of the image on the screen 11 is adjusted accordingly to ensure that the image projected by the screen 11 is always within the visual field of the human eye. In other words, in the head-mounted visual device, the virtual scene imaging system 10 and the eye tracking system 20 cooperate with each other to ensure that the user can always observe and interact with the virtual scene image generated by the screen 11, so as to enhance the user experience.
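The patent specifies the optical arrangement of the eye tracking system 20 (light sources illuminating the eyeball, a receiving module imaging the reflected detection light) but not the algorithm that converts the captured eye image into a sight line direction. Purely as an illustrative, non-authoritative sketch, the snippet below shows one widely used approach, pupil-centre/corneal-reflection (PCCR), in which the vector between the detected pupil centre and a glint produced by one of the light sources is mapped to gaze angles through calibrated polynomial weights; the function name, the polynomial model and the calibration coefficients are all assumptions introduced for this example.

```python
import numpy as np

def estimate_gaze_pccr(pupil_center_px, glint_center_px, calib_coeffs):
    """Simplified pupil-centre/corneal-reflection (PCCR) gaze estimate.

    calib_coeffs has shape (2, 6): one row of polynomial weights per gaze
    angle (horizontal, vertical), typically obtained from a per-user
    calibration routine.  This is an assumed model for illustration; the
    patent does not describe its gaze-estimation algorithm.
    """
    dx, dy = np.subtract(pupil_center_px, glint_center_px)
    # Second-order polynomial feature vector of the pupil-glint vector.
    features = np.array([1.0, dx, dy, dx * dy, dx * dx, dy * dy])
    return calib_coeffs @ features

# Example with dummy calibration weights (illustrative values only):
coeffs = np.zeros((2, 6))
coeffs[0, 1] = 0.5   # horizontal gaze angle driven by dx
coeffs[1, 2] = 0.5   # vertical gaze angle driven by dy
print(estimate_gaze_pccr((320, 240), (300, 250), coeffs))  # -> [10. -5.]
```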

In an implementation, the virtual scene image of the screen 11 may be generated by a SLAM algorithm. Those skilled in the art will appreciate that the SLAM algorithm is a frontier technique for spatial localization in the visual domain, and its main role is to construct an immersive virtual scene. By means of the SLAM algorithm, the spatial localization of the human eye within the head-mounted visual device can be solved and a map of the environment created. After the immersive virtual scene is constructed by the SLAM algorithm, the virtual scene image may be displayed directly on the screen 11 or projected onto the screen 11 by a projection device. In other words, in the preferred embodiment of the present invention, the screen 11 can be implemented as an active screen 11 or a passive screen 11.

When the screen 11 is implemented as an active screen 11 (e.g., a liquid crystal display), the virtual scene image constructed by the SLAM algorithm can be directly displayed at a specific position of the screen 11. At this time, the user can observe the virtual scene image formed on the screen 11 through the VR optical lens 12. Accordingly, when the screen 11 is implemented as a passive screen 11, the head-mounted visual device further includes a projection device for projecting the virtual scene image constructed by the SLAM algorithm at a specific position of the screen 11. Likewise, the virtual scene image formed on the screen 11 is viewable by the user through the VR optical lens 12. The difference between the two is that when the screen 11 is an active screen 11, the virtual scene image is directly displayed at a specific position of the screen 11, that is, the screen 11 actively projects the virtual scene image to the eyes of the user. When the screen 11 is a passive screen 11, the screen 11 is used to receive the virtual scene image projected by the projection device, that is, the screen 11 passively projects the virtual scene image to the eyes of the user.

As mentioned above, after the eye tracking system 20 detects the direction of the eye 30 of the user, the display position of the virtual scene image on the screen 11 needs to be adjusted accordingly to ensure that the image projected by the screen 11 is always within the field of view of the eye. Accordingly, when the screen 11 is an active screen 11, after obtaining the information of the eye sight direction of the user, the screen 11 can actively adjust the display position of the image on the screen 11, so as to ensure that the image projected by the screen 11 can be observed by the human eyes through the VR optical lens 12. When the screen 11 is a passive screen 11, the display position of the image on the screen 11 needs to be adjusted by the projection device. Accordingly, after obtaining the information of the sight line direction of the eyeballs of the user, the projection device can change the projection direction thereof based on the information, so as to adjust the display position of the virtual scene image projected by the projection device on the screen 11, thereby ensuring that the display position of the image on the screen 11 is always within the range of the field of vision of the eyeballs.
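Once the sight line direction is known, the display position of the virtual scene image must be shifted so that it stays within the observable field, as described above. The patent does not give the mapping it uses; the minimal sketch below assumes a simple geometric model in which the image centre is translated on the virtual image plane by the tangent of the gaze angles. The function name and the virtual-image-distance parameter are hypothetical.

```python
import math

def display_offset_mm(gaze_yaw_deg, gaze_pitch_deg, virtual_image_distance_mm):
    """Offset (dx, dy) of the image centre on the virtual image plane so that
    it stays on the user's line of sight.  virtual_image_distance_mm is the
    apparent distance of the screen as seen through the VR optical lens; the
    whole mapping is an assumed illustration, not the patent's method."""
    dx = virtual_image_distance_mm * math.tan(math.radians(gaze_yaw_deg))
    dy = virtual_image_distance_mm * math.tan(math.radians(gaze_pitch_deg))
    return dx, dy

# A 10 deg rightward gaze with the virtual image at 1 m shifts the image
# centre roughly 176 mm to the right on that virtual plane.
print(display_offset_mm(10.0, 0.0, 1000.0))
```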

Further, in a specific implementation, the VR optical lens 12 is typically implemented as a Fresnel lens. Those skilled in the art will appreciate that a Fresnel lens, also known as a "threaded lens", has a series of saw-tooth grooves on one side, and these grooves can be designed to achieve a band-pass (refractive or reflective) effect on light of a given spectrum. Fresnel lenses have a relatively low cost compared with other optical lenses. Of course, those skilled in the art will appreciate that the VR optical lens 12 may also be implemented as other types of optical lenses in the preferred embodiment of the present invention. The invention is not limited in this respect.

For the head-mounted visual device, its performance depends mainly on the virtual scene imaging system 10, the eye tracking system 20, and the cooperation between the two. As previously described, in existing head-mounted visual devices, the virtual scene imaging optical path and the eye tracking optical path share the same VR optical lens. In other words, in the existing head-mounted visual device, the virtual scene imaging system and the eyeball tracking system are optical systems that are closely coupled in structure. Such an optical path arrangement causes a series of technical problems (described in detail in the summary of the application and therefore not repeated here).

Accordingly, as shown in fig. 4, in the preferred embodiment of the present application, the optical path design of the eye tracking system 20 is adjusted such that the detection light 200 for detecting the direction of the human eye line is directly received by the receiving module 22 without passing through the VR optical lens 12. In this way, the eyeball tracking system 20 and the virtual scene imaging system 10 are kept independent relatively, so as to achieve the technical purposes of reducing the design difficulty of the eyeball tracking system 20, simplifying the structure thereof, and being beneficial to improving the overall performance stability of the head-mounted visual device.

More specifically, in the preferred embodiment of the present invention, the receiving module 22 of the eye tracking system 20 is disposed toward the user's eyeball 30, so that the detection light 200 reflected from the user's eyeball 30 can be received directly by the receiving module 22, without passing through the VR optical lens 12 and a reflector as in the prior art. Thus, in the present invention, the VR optical lens 12 only needs to process the visible light band projected by the screen 11 and does not need to process the non-visible light band used for eye tracking. In other words, in the preferred embodiment of the present invention, the VR optical lens 12 has a simplified structure owing to the reduced difficulty of its optical design.

Further, by means of this special arrangement of the receiving module 22, the optical path of the eye tracking system 20 is simplified from light source → human eye → VR optical lens → reflector → receiving module to the at least one light source 21 → the user's eyeball 30 → the receiving module 22. As can be seen from this simplification, the eye tracking system 20 involves fewer components. Those skilled in the art will readily appreciate that, for a single system, the fewer its components, the more readily the precision of fit between those components is assured and the more stable the system is. In other words, the design difficulty, the assembly difficulty and the complexity of the overall structure of the eye tracking system 20 are all reduced.

More importantly, through the adjustment of the optical path of the eye tracking system 20, the virtual scene imaging system 10 becomes structurally independent of the eye tracking system 20. In other words, from a structural point of view, the virtual scene imaging system 10 and the eye tracking system 20 are two completely independent systems. Those skilled in the art will appreciate that, for multiple systems, the lower the coupling between the systems, the higher their overall stability. In the present invention, the eyeball tracking system 20 and the virtual scene imaging system 10 share no common component (such as the VR optical lens 12). That is, the degree of association between the eye tracking system 20 and the virtual scene imaging system 10 is low, and therefore the head-mounted visual device constituted by the eye tracking system 20 and the virtual scene imaging system 10 has high stability.

Further, as shown in Fig. 4, in the preferred embodiment of the present invention, the receiving module 22 is disposed toward the user's eyeball 30 and at the side (top or bottom) of the VR optical lens 12, a position from which it directly receives the detection light 200 reflected from the user's eyeball. Here, the axis defined by the user's eyeball 30 is denoted as the object-side optical axis X, and the axis defined by the optical lens 222 of the receiving module 22 is denoted as the photosensitive optical axis Y; an included angle α exists between the object-side optical axis X and the photosensitive optical axis Y. In other words, the user's eyeball 30 is tilted relative to the receiving module 22.

Those skilled in the art will appreciate that, as shown in fig. 5, when there is an included angle between the object-side optical axis X and the photosensitive optical axis Y, if the photosensitive chip 221 of the receiving module 22 is arranged perpendicular to the photosensitive optical axis Y (the arrangement of a conventional camera module), a certain included angle exists between the ideal imaging surface of the receiving module 22 and the plane defined by the photosensitive chip 221. In terms of the resulting image, the eye image formed on the photosensitive chip 221 then has low sharpness.

In order to improve the imaging quality of the receiving module 22, a first scheme that can be adopted is as follows: the field curvature parameter of the receiving module 22 is increased, so that the ideal imaging surface of the receiving module 22 can coincide with the plane set by the photosensitive chip 221 under the action of the field curvature. However, the technical solution of improving the receiving module 22 by changing the field curvature parameter increases the difficulty of the optical design of the receiving module 22, and has little effect on improving the imaging quality.

Preferably, in the preferred embodiment of the present invention, a second scheme is adopted: the relative position of the plane defined by the photosensitive chip 221 with respect to the photosensitive optical axis Y, i.e. the relative position between the photosensitive chip 221 and the optical lens 222, is changed so that the plane defined by the photosensitive chip 221 coincides as closely as possible with the ideal imaging plane. In other words, in the second scheme, the position of the photosensitive chip 221 relative to the optical lens 222 is adjusted so that the plane defined by the photosensitive chip 221 and the plane defined by the optical lens 222 are no longer parallel; that is, an included angle β exists between the plane defined by the photosensitive chip 221 and the plane defined by the optical lens 222.

It will be appreciated that, with this "imaging plane tilt" scheme, the difficulty of designing the receiving module 22 is greatly reduced relative to the first scheme. Meanwhile, as the field angle of the receiving module 22 increases, the image point positions become increasingly dense, so that the relative illuminance of the receiving module 22 is compensated to some extent.

In a specific optical design, the included angle β between the plane defined by the photosensitive chip 221 and the plane defined by the optical lens 222 depends on the included angle α between the object-side optical axis X and the photosensitive optical axis Y, as well as on the parameters of the optical lens 222. In other words, to solve for the included angle β between the plane defined by the photosensitive chip 221 and the plane defined by the optical lens 222, the included angle α between the object-side optical axis X and the photosensitive optical axis Y and the optical parameters of the optical lens 222 must be determined in advance.

Here, when the "imaging plane tilt" scheme is adopted, the optical design of the optical lens 222 of the receiving module 22 can be simplified. In particular, in the preferred embodiment of the present invention, the optical lens 222 of the receiving module 22 can be implemented as a single-piece aspheric optical lens 2221 having specific optical parameters such as optical power. Here, it should be appreciated that the plane defined by the optical lens 222 is the plane defined by the aspheric optical lens 2221.

In addition, in the preferred embodiment of the present invention, the included angle α between the object-side optical axis X and the photosensitive optical axis Y depends on the distance between the user's eyeball 30 and the VR optical lens 12, the preset diameter of the user's eyeball 30, and the distance between the VR optical lens 12 and the receiving module 22. In general, the included angle α between the object-side optical axis X and the photosensitive optical axis Y ranges from 25.0° to 40.0°. Figs. 6 and 7 are schematic diagrams illustrating the specific optical system design of the eyeball tracking system 20 according to the preferred embodiment of the present invention. As shown in Fig. 6, the distance between the user's eyeball 30 and the VR optical lens 12 is set to 15 mm, the preset diameter of the user's eyeball is set to 35 mm, and the distance between the VR optical lens 12 and the receiving module 22 is set to 30 mm; with these three parameters fixed and the receiving module 22's own optical parameters taken into account, the included angle α between the object-side optical axis X and the photosensitive optical axis Y is determined (32° in this embodiment, as described below).

Accordingly, with the included angle α between the object-side optical axis X and the photosensitive optical axis Y set to 32° and the optical lens 222 of the receiving module 22 implemented as the single-piece aspheric optical lens 2221, testing and optimization show that when the included angle β between the plane defined by the photosensitive chip 221 and the plane defined by the optical lens 222 is 20°, the imaging performance of the receiving module 22 is good and meets the design requirements, as shown in fig. 7.
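How the sensor tilt β follows from the axis angle α and the lens parameters is not spelled out in the text. One classical way to relate an object-plane tilt to the tilt of the corresponding sharp image plane is the Scheimpflug condition, tan β = m · tan α, where m is the lateral magnification of the imaging lens; the sketch below uses it only as a plausible model of that dependence, not as the patent's actual derivation, and the magnification value is an assumption chosen so that α = 32° reproduces the embodiment's β = 20°.

```python
import math

def sensor_tilt_deg(object_tilt_deg: float, magnification: float) -> float:
    """Scheimpflug-style relation between the object-plane tilt (angle alpha
    between the eye's axis and the module's photosensitive axis) and the tilt
    of the sharp image plane relative to the lens plane: tan(beta) = m * tan(alpha).
    Shown as one plausible model; the patent only states that beta is derived
    from alpha and the lens parameters."""
    return math.degrees(math.atan(magnification * math.tan(math.radians(object_tilt_deg))))

# With alpha = 32 deg, an assumed lateral magnification of ~0.583 reproduces
# the embodiment's 20 deg chip tilt.
print(round(sensor_tilt_deg(32.0, 0.583), 1))  # -> 20.0
```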

Here, it should be readily understood by those skilled in the art that, in the preferred embodiment of the present invention, the included angle β between the plane defined by the photosensitive chip 221 and the plane defined by the optical lens 222 is a variable whose specific value depends on the included angle α between the object-side optical axis X and the photosensitive optical axis Y and on the optical parameters of the optical lens 222 itself. Meanwhile, in a specific implementation, the imaging performance of the receiving module 22 can be tuned and optimized by adjusting the included angle β, so the final value of the included angle β can be fixed once an imaging result that meets the design requirements is obtained.

To further optimize the performance of the eye tracking system 20, the type and arrangement of the light sources of the eye tracking system 20 are also adjusted. In the preferred embodiment of the present invention, the eye tracking system 20 comprises 8 light sources 21, wherein the 8 light sources 21 are circumferentially disposed around the VR optical lens 12 for projecting the detection light 200 to the user's eyeball 30. Specifically, as shown in fig. 8 and fig. 9, each of the light sources 21 includes a plurality of optical fibers 211 and a non-visible light source 212 (e.g., a near-infrared or infrared light source), and the plurality of optical fibers 211 are respectively connected to the non-visible light source 212, so that after the non-visible light source 212 is turned on, the detection light 200 is generated at each of the plurality of optical fibers 211, thereby implementing multi-point illumination.

It is worth mentioning that, compared with the conventional LED (Light Emitting Diode) light source, the light spots formed by the multiple optical fibers 211 are relatively small, so that the limitation imposed by the light source on the visual range of the human eye is effectively reduced and the visual blind zone is narrowed. In addition, replacing the LED light source group with the multiple optical fibers 211 effectively reduces cost and makes the product lighter and more attractive.

As mentioned above, the head-mounted visual device needs to be adjusted according to the degree of myopia or hyperopia of the human eye to meet the user's experience and demand. In particular, in the preferred embodiment of the present invention, as shown in fig. 10, the screen 11 is movable relative to the VR optical lens 12, so that the diopter adjustment for the user's eyeball 30 can be achieved by changing the distance between the screen 11 and the VR optical lens 12. It should be appreciated that, compared with the prior-art solution in which the reflector and the receiving module are held fixed and the VR optical lens is moved to adjust the diopter of the eye, in the solution of moving the screen 11 the relative positional relationship among the VR optical lens 12, the receiving module 22 and the user's eyeball remains unchanged, that is, the optical path of the eyeball tracking system 20 remains unchanged, so the overall stability of the head-mounted visual device is increased.
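To make the screen-movement idea concrete, the sketch below applies the ordinary thin-lens equation to estimate the screen-to-lens distance for a given refractive error: placing the screen at the focal plane sends the virtual image to infinity (suiting an emmetropic eye), while moving it slightly closer pulls the virtual image in to the far point of a myopic eye. The 40 mm focal length and the function itself are illustrative assumptions; the patent does not state the optical formula of its Fresnel lens.

```python
def screen_distance_mm(focal_length_mm: float, refractive_error_D: float) -> float:
    """Screen-to-VR-lens distance that places the screen's virtual image at the
    vergence the eye needs, using the thin-lens relation 1/s_o + 1/s_i = 1/f
    with the image distance s_i = 1000 / refractive_error_D (mm).
    refractive_error_D: 0 for an emmetropic eye, negative for myopia
    (e.g. -2.0), positive for hyperopia.  A sketch under the thin-lens
    approximation, not the patent's exact Fresnel-lens design."""
    if refractive_error_D == 0.0:
        return focal_length_mm  # screen at the focal plane -> image at infinity
    return 1.0 / (1.0 / focal_length_mm - refractive_error_D / 1000.0)

# Assumed 40 mm focal length VR lens:
print(screen_distance_mm(40.0, 0.0))   # 40.0 mm for a normal-sighted user
print(screen_distance_mm(40.0, -2.0))  # ~37.0 mm: screen moves closer for a -2 D myope
```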

Accordingly, according to yet another aspect of the present invention, the present invention also provides a diopter adjustment method for a head-mounted visual device, which includes: the screen 11 is moved to adjust a distance between the screen 11 and the VR optical lens 12 to change the diopter of the user's eyeball. In an embodiment of the present invention, the eye tracking system 20 is integrated with the VR optical lens 12, so that the VR optical lens 12 and the eye tracking system 20 have an integrated structure.

Further, the eye tracking system 20 is integrally provided with the VR optical lens 12 to ensure stability of the optical performance of the eye tracking system 20 and the VR optical lens 12 through a structurally stable relationship between the eye tracking system 20 and the VR optical lens 12. Meanwhile, the integration mode is favorable for eliminating errors caused in the assembly process, reducing the weight and facilitating later maintenance.

In summary, by changing the optical path design of the eye tracking system 20 so that the detection light 200 for detecting the sight line direction of the human eye is received directly by the receiving module 22 without passing through the VR optical lens 12, the optical path of the eye tracking system 20 is kept relatively independent of the virtual scene projection optical path; this is the technical principle and technical effect described above.

Exemplary eye tracking System

As shown in fig. 4 to 10, according to another aspect of the present invention, the present invention further provides an eye tracking system 20 for a head-mounted visual device, wherein the eye tracking system 20 is used for detecting a visual line direction of an eyeball of a user. As shown in fig. 4, the eye tracking system 20 includes at least one light source 21 and a receiving module 22, wherein the at least one light source 21 is configured to project a detecting light 200 to the user's eye 30, and the receiving module 22 is configured to receive the detecting light 200 reflected from the user's eye 30 to detect the visual direction of the user's eye 30. After the eye tracking system 20 obtains the eye 30 sight direction of the user, the display position of the virtual scene image is correspondingly adjusted based on the eye sight direction, so as to ensure that the virtual scene image is always located within the field of view of the eyes.

As shown in fig. 4, in the preferred embodiment of the present application, the optical path design of the eye tracking system 20 is adjusted such that the detection light 200 for detecting the eye line direction is directly received by the receiving module 22 without passing through the VR optical lens 12. In this way, the eyeball tracking system 20 and the virtual scene imaging system 10 are kept independent relatively, so as to achieve the technical purposes of reducing the design difficulty of the eyeball tracking system 20, simplifying the structure thereof, and being beneficial to improving the overall performance stability of the head-mounted visual device.

More specifically, in the preferred embodiment of the present invention, the receiving module 22 of the eye tracking system 20 is disposed toward the user's eyeball 30, so that the detection light 200 reflected from the user's eyeball 30 can be received directly by the receiving module 22, rather than passing through the VR optical lens 12 and a reflector as in the prior art. Thus, in the present invention the VR optical lens 12 only needs to handle the visible light band projected by the screen 11 and does not need to handle the non-visible light band used for eye tracking. In other words, in the preferred embodiment of the present invention, the optical design of the VR optical lens 12 is less demanding, and its structure can accordingly be simplified.

Further, by virtue of this special arrangement of the receiving module 22, the optical path of the eye tracking system 20 is simplified from light source–eye–VR optical lens–reflector–receiving module to the at least one light source 21–the user's eyeball 30–the receiving module 22. As can be seen from this simplification, the eye tracking system 20 involves fewer components. Those skilled in the art will readily appreciate that, for a single system, the fewer its components, the easier it is to guarantee the precision of fit between those components and the more stable the system is. In other words, the design difficulty, the assembly difficulty and the complexity of the overall structure of the eye tracking system 20 are all reduced.

More importantly, through the adjustment of the optical path of the eye tracking system 20, the virtual scene imaging system 10 is made structurally independent of the eye tracking system 20. In other words, from a structural point of view, the virtual scene imaging system 10 and the eye tracking system 20 are two completely independent systems. Those skilled in the art will appreciate that, for multiple systems, the lower the coupling between the systems, the higher their overall stability. In the present invention, the eye tracking system 20 and the virtual scene imaging system 10 no longer share any component (in particular, the VR optical lens 12). That is, the degree of coupling between the eye tracking system 20 and the virtual scene imaging system 10 is low, and therefore the head-mounted visual device constituted by them has high stability.

Further, as shown in fig. 4, in the preferred embodiment of the present invention, the receiving module 22 is disposed toward the user's eyeball 30 and at the side (top or bottom) of the VR optical lens 12, in a position where it can directly receive the detection light 200 reflected from the user's eyeball. Here, the axis defined by the user's eyeball 30 is taken as the object optical axis X, and the axis defined by the optical lens 222 of the receiving module 22 is taken as the photosensitive optical axis Y; an included angle α exists between the object optical axis X and the photosensitive optical axis Y. In other words, the user's eyeball 30 is in a tilted state relative to the receiving module 22.

It should be known to those skilled in the art that, as shown in fig. 5, when there is an included angle between the object optical axis X and the photosensitive optical axis Y, if the photosensitive chip 221 of the receiving module 22 is arranged perpendicular to the photosensitive optical axis Y (the arrangement of a conventional camera module), a certain included angle exists between the ideal imaging surface of the receiving module 22 and the plane defined by the photosensitive chip 221. In terms of image quality, the human eye image formed on the photosensitive chip 221 therefore has low definition.

In order to improve the imaging quality of the receiving module 22, a first scheme that can be adopted is as follows: the field curvature parameter of the receiving module 22 is increased, so that under the action of the field curvature the ideal imaging surface of the receiving module 22 can coincide with the plane defined by the photosensitive chip 221. However, improving the receiving module 22 by changing the field curvature parameter increases the difficulty of its optical design and contributes little to the imaging quality.

Preferably, in the preferred embodiment of the present invention, a second scheme is adopted, in which the relative position of the plane defined by the photosensitive chip 221 with respect to the photosensitive optical axis Y is changed, that is, the relative position between the photosensitive chip 221 and the optical lens 222 is changed, so that the plane defined by the photosensitive chip 221 coincides as closely as possible with the ideal imaging plane. In other words, in the second scheme the position of the photosensitive chip 221 relative to the optical lens 222 is adjusted so that the plane defined by the photosensitive chip 221 and the plane defined by the optical lens 222 are no longer parallel, that is, an included angle β exists between the plane defined by the photosensitive chip 221 and the plane defined by the optical lens 222.

It will be appreciated that, with this "image plane tilt" scheme, the design difficulty of the receiving module 22 is greatly reduced relative to the first scheme. Meanwhile, as the field angle of the receiving module 22 increases, the image points become increasingly dense, so that the relative illuminance of the receiving module 22 is compensated to some extent.

In a specific optical design, the included angle β between the plane defined by the photosensitive chip 221 and the plane defined by the optical lens 222 depends on the included angle α between the object optical axis X and the photosensitive optical axis Y and on the optical parameters of the optical lens 222. In other words, to determine the included angle β, the included angle α and the optical parameters of the optical lens 222 must be determined in advance.
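
The dependence of β on α and on the lens parameters can be illustrated with a paraxial stand-in for the optical lens 222: an ideal thin lens maps the tilted eye plane onto a tilted best-focus plane (the Scheimpflug relationship), and it is this tilted plane that the photosensitive chip 221 should follow. The focal length and working distance below are hypothetical, and the value of β finally adopted in the embodiment is obtained by optimizing the actual single-piece aspheric lens, so the number printed here only illustrates the trend.

```python
import numpy as np

# Paraxial stand-in for the receiving lens: an ideal thin lens imaging a tilted
# object (eye) plane. All numbers are hypothetical except alpha = 32 degrees.
f = 8.0                      # assumed focal length of the receiving lens, mm
s0 = 40.0                    # assumed eye-to-lens distance along the photosensitive axis Y, mm
alpha = np.deg2rad(32.0)     # tilt of the object plane relative to the lens plane

# Sample points on the tilted object plane (lens at z = 0, object side z < 0).
h = np.linspace(-5.0, 5.0, 11)          # offsets along the object plane, mm
x_obj = h * np.cos(alpha)
z_obj = -s0 + h * np.sin(alpha)

# Ideal image of each point: 1/s' = 1/f + 1/s with s = z_obj < 0 (Cartesian signs).
s_img = 1.0 / (1.0 / f + 1.0 / z_obj)   # image distances behind the lens
x_img = (s_img / z_obj) * x_obj         # transverse magnification times object height

# The image points lie on a plane; its slope relative to the lens plane is tan(beta).
slope, _ = np.polyfit(x_img, s_img, 1)
print(f"image-plane tilt beta ≈ {np.degrees(np.arctan(abs(slope))):.1f} degrees")
```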

Here, when the "image plane tilt" scheme is adopted, the optical design of the optical lens 222 of the receiving module 22 can be simplified. In particular, in the preferred embodiment of the present invention, the optical lens 222 of the receiving module 22 can be implemented as a single-piece aspheric optical lens 2221 with specific optical parameters such as its optical power. Here, it should be appreciated that the plane defined by the optical lens 222 is the plane defined by the aspheric optical lens 2221.

In addition, in the preferred embodiment of the present invention, the included angle α between the object optical axis X and the photosensitive optical axis Y depends on the distance between the user's eyeball 30 and the VR optical lens 12, the preset diameter of the user's eyeball 30, and the distance between the VR optical lens 12 and the receiving module 22. Generally, the included angle α between the object optical axis X and the photosensitive optical axis Y ranges from 25.0 to 40.0 degrees. Figs. 6 and 7 are schematic diagrams illustrating the specific optical system design of the eye tracking system 20 according to the preferred embodiment of the present invention. As shown in fig. 6, the distance between the user's eyeball 30 and the VR optical lens 12 is set to 15 mm, the preset diameter of the user's eyeball is set to 35 mm, and the distance between the VR optical lens 12 and the receiving module 22 is set to 30 mm; based on these three parameters, and taking into account the optical parameters of the receiving module 22 itself, the included angle α between the object optical axis X and the photosensitive optical axis Y is finally set to 32 degrees.
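
As a rough cross-check, a purely geometric reading of these three distances already gives a value close to the one adopted, if the receiving module 22 is assumed to sit roughly in the plane of the VR optical lens 12, offset sideways by the lens-to-module distance, while viewing the eye region at a depth of about the eye-to-lens distance plus the preset eye diameter. This geometric reading is an assumption for illustration only, since the final value also folds in the receiving module's own optical parameters.

```python
import math

# First-order geometric estimate of alpha from the three distances of the embodiment.
# The interpretation of how the distances combine is hypothetical; the patent states
# the final value also depends on the receiving module's own optical parameters.
eye_to_lens = 15.0      # mm, distance between the user's eyeball and the VR optical lens
eye_diameter = 35.0     # mm, preset diameter of the user's eyeball
lens_to_module = 30.0   # mm, distance between the VR optical lens and the receiving module

alpha = math.degrees(math.atan2(lens_to_module, eye_to_lens + eye_diameter))
print(f"first-order estimate of alpha ≈ {alpha:.1f} degrees")   # ~31 degrees vs. 32 in the embodiment
```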

Therefore, on the premise that the included angle α between the object optical axis X and the photosensitive optical axis Y is set to 32°, and with the optical lens 222 of the receiving module 22 implemented as a single aspheric optical lens, it is found through testing and optimization that when the included angle β between the plane defined by the photosensitive chip 221 and the plane defined by the optical lens 222 is 20°, the imaging performance of the receiving module 22 is good and meets the design requirements, as shown in fig. 7.

Here, it should be readily understood by those skilled in the art that, in the preferred embodiment of the present invention, the included angle β between the plane defined by the photosensitive chip 221 and the plane defined by the optical lens 222 is a variable whose specific value depends on the included angle α between the object optical axis X and the photosensitive optical axis Y and on the optical parameters of the optical lens 222 itself. Meanwhile, in a specific implementation, the imaging performance of the receiving module 22 can be tuned by adjusting the included angle β, so that the final value of β is determined once an imaging result meeting the design requirements is obtained.
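
The "adjust β and check the imaging result" procedure can be sketched with the same paraxial stand-in used above: sweep candidate sensor tilts, pick the best axial sensor position for each, and keep the tilt with the smallest residual focus error. The focal length and distances are again hypothetical, and a paraxial model cannot reproduce the 20° obtained for the real aspheric design; the sketch only shows the shape of the tuning loop.

```python
import numpy as np

# Toy version of the tune-and-check loop for beta (hypothetical focal length and
# distances; not the actual aspheric design of the receiving module).
f, s0 = 8.0, 40.0
alpha = np.deg2rad(32.0)
h = np.linspace(-5.0, 5.0, 21)
x_obj, z_obj = h * np.cos(alpha), -s0 + h * np.sin(alpha)
s_img = 1.0 / (1.0 / f + 1.0 / z_obj)      # ideal image distances behind the lens
x_img = (s_img / z_obj) * x_obj            # ideal image heights

best_rms, best_beta = None, None
for beta_deg in np.arange(0.0, 45.0, 0.5):
    t = np.tan(np.deg2rad(beta_deg))
    piston = np.mean(s_img - t * x_img)    # best axial sensor position for this tilt
    rms = np.sqrt(np.mean((s_img - (piston + t * x_img)) ** 2))
    if best_rms is None or rms < best_rms:
        best_rms, best_beta = rms, beta_deg
print(f"best sensor tilt in this toy model: {best_beta:.1f} degrees "
      f"(rms focus error {best_rms:.4f} mm)")
```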

To further optimize the performance of the eye tracking system 20, the type and arrangement of its light sources are also adjusted. In the preferred embodiment of the present invention, the eye tracking system 20 comprises 8 light sources 21, wherein the 8 light sources 21 are arranged circumferentially around the VR optical lens 12 for projecting the detection light 200 to the user's eyeball 30. Specifically, as shown in fig. 8 and fig. 9, each of the light sources 21 includes a plurality of optical fibers 211 and a non-visible light source 212 (e.g., a near-infrared light source or an infrared light source), and the optical fibers 211 are respectively connected to the non-visible light source 212, so that when the light of the non-visible light source 212 is guided through them, the detection light 200 exits from each of the optical fibers 211, thereby implementing multi-point illumination.
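
A minimal sketch of the circumferential layout follows, assuming the eight fiber exits are spaced evenly on a circle just outside the lens aperture; the patent specifies the count and the circumferential arrangement, but the ring radius and the equal angular spacing below are assumptions.

```python
import math

# Hypothetical circumferential layout of the 8 fiber exits around the VR optical lens.
lens_radius_mm = 20.0                    # assumed VR lens semi-aperture
ring_radius_mm = lens_radius_mm + 2.0    # assumed radius of the fiber-exit ring
num_sources = 8

positions = [
    (ring_radius_mm * math.cos(2 * math.pi * k / num_sources),
     ring_radius_mm * math.sin(2 * math.pi * k / num_sources))
    for k in range(num_sources)
]
for k, (x, y) in enumerate(positions):
    print(f"fiber exit {k}: x = {x:+.1f} mm, y = {y:+.1f} mm")
```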

It is worth mentioning that, compared with a conventional LED (Light Emitting Diode) light source, the light spots formed by the multiple optical fibers 211 are relatively small, so the restriction of the light sources on the user's visual range is effectively reduced and the visual blind area is decreased. In addition, replacing the LED light source group with the multiple optical fibers 211 can effectively reduce cost and makes the product lighter and more attractive.

According to still another aspect of the present invention, there is also provided an eye tracking method for a head-mounted visual device, comprising:

projecting a detecting light 200 to the eyeball of the user; and

the detecting light 200 reflected from the user's eyeball 30 is received by a receiving module 22 to detect the visual direction of the user's eyeball 30, wherein the receiving module 22 is located at the side of the VR optical lens 12 and faces the user's eyeball 30, so that the detecting light 200 reflected from the user's eyeball 30 is directly received by the receiving module 22.
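
For context only, one common way to turn the image captured by the receiving module 22 into a sight line direction is to locate the pupil center and the glints (the corneal reflections of the light sources 21) and map their offset to a gaze point through a per-user calibration. The patent itself does not prescribe any particular algorithm, so the sketch below, including the affine calibration model and the numbers, is purely an assumption.

```python
import numpy as np

# Minimal pupil-glint gaze mapping sketch (hypothetical algorithm, not specified
# by the patent): the pupil-glint offset in pixels is mapped to a normalized
# screen coordinate by a calibration matrix obtained per user.

def estimate_gaze(pupil_center_px, glint_center_px, calibration_matrix):
    """Map the pupil-glint offset (pixels) to a normalized screen coordinate (u, v)."""
    dx, dy = (np.asarray(pupil_center_px, dtype=float)
              - np.asarray(glint_center_px, dtype=float))
    v = np.array([dx, dy, 1.0])
    return calibration_matrix @ v

# Example with made-up numbers: a 2x3 affine calibration scaled to the sensor.
calib = np.array([[0.01, 0.0, 0.5],
                  [0.0, 0.01, 0.5]])
print(estimate_gaze((330.0, 240.0), (320.0, 236.0), calib))
```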

Here, although the above description takes the application of the eye tracking system 20 to a head-mounted visual device as an example, it should be understood by those skilled in the art that the eye tracking system 20 according to the embodiments of the present invention can also be applied to other VR products and even to other fields, and the above examples are not intended to limit the scope of the present invention.

It will be appreciated by persons skilled in the art that the embodiments of the invention described above and shown in the drawings are given by way of example only and are not limiting of the invention. The objects of the invention have been fully and effectively accomplished. The functional and structural principles of the present invention have been shown and described in the embodiments, and the embodiments may be varied or modified in any way without departing from those principles.
