Vision detection method and device and wearable device


Reading note: This technology, a vision detection method, device and wearable device, was designed and created by Huang Guiping, Wang Yuechao and Shang Chunli on 2021-08-31. Its main content is as follows: The application provides a vision detection method, device and wearable device, applicable to the technical field of vision detection. The method comprises the following steps: the wearable device receives a confirmation operation triggered by a user in the process of moving the position of the lens group, wherein the confirmation operation is triggered when the eyes of the user perceive a clear image; the wearable device obtains a target distance between an imaging plane and a target plane when the confirmation operation is received, wherein the target plane is the plane where the first lens on the near-eye side of the lens group is located; the wearable device determines the image distance of the lens group according to the target distance, and determines the myopia degree corresponding to the vision of the user according to the image distance. The embodiments of the application solve the problems of cumbersome vision detection operation and low detection efficiency, improve the efficiency of vision detection, and provide great convenience for detection at any time.

1. A vision detection method, applied to a wearable device, comprising the following steps:

the wearable device receives a confirmation operation triggered by a user in the process of moving the position of the lens group, wherein the confirmation operation is an operation triggered when the eyes of the user perceive a clear image;

acquiring a target distance between an imaging plane and a target plane when the confirmation operation is received, wherein the target plane is a plane where a first lens on the near-eye side of the lens group is located;

determining the image distance of the lens group according to the target distance;

and determining the myopia degree corresponding to the vision of the user according to the image distance.

2. The method of claim 1, wherein moving the position of the lens group by the wearable device comprises:

and in response to a vision detection instruction, driving the lens group to move towards the image display device from the initial position.

3. The method of claim 1, wherein said obtaining a target distance of an imaging plane from a target plane at the time said confirmation operation is received comprises:

acquiring a moving distance of the lens group relative to an initial position when the confirmation operation is received;

and determining the target distance according to the moving distance and the distance between the initial position and the imaging plane.

4. The method of claim 3, wherein the movement distance is obtained by a displacement sensor.

5. The method of any of claims 1 to 4, wherein after said obtaining a target distance of an imaging plane from a target plane at a time of receiving the confirmation operation, the method further comprises:

acquiring the object distance between the plane of the first lens on the near-eye side in the lens group and the plane of the image display device when the confirmation operation is received;

determining the focal length of the lens group according to the object distance and the target distance;

determining focal power corresponding to the lens group according to the focal length;

and determining the myopia degree corresponding to the vision of the user according to the focal power.

6. The method of any of claims 1 to 4, wherein after said obtaining a target distance of an imaging plane from a target plane at a time of receiving the confirmation operation, the method further comprises:

and according to the corresponding relation between the distance and the vision, taking the target vision corresponding to the target distance as the vision of the user.

7. The method of claim 6, wherein the vision or myopia degree of the user is output by voice broadcast or displayed on a display screen.

8. A vision detection device, comprising a lens group, an image display device and a motor driving device;

the motor driving device is used for driving the lens group to move towards the image display device and acquiring the moving distance of the lens group when receiving confirmation operation triggered by a user;

the image display device is used for displaying an image when receiving a vision detection instruction;

the lens group is used for adjusting the target distance between the imaging plane and the target plane in the moving process;

the moving distance is used for determining a target distance between an imaging plane and a target plane, the target plane is a plane where a first lens on the near-eye side of the lens group is located, and the confirming operation is an operation triggered when eyes of a user perceive a clear image; the lens group comprises N concave lenses and N convex lenses which are arranged at intervals, wherein N is an integer larger than or equal to 3.

9. A vision testing device, comprising:

a first acquisition module, configured to receive a confirmation operation triggered by a user in the process of the wearable device moving the position of the lens group, wherein the confirmation operation is triggered when the eyes of the user perceive a clear image;

the second acquisition module is used for acquiring a target distance between an imaging plane and a target plane when the confirmation operation is received, wherein the target plane is a plane where a first lens on the near-eye side in the lens group is located;

the first determining module is used for determining a first focal length of the lens group according to the target distance;

and the second determining module is used for determining the myopia degree corresponding to the vision of the user according to the first focal length.

10. A wearable device, comprising a memory and a processor, a computer program being stored on the memory and executable on the processor, wherein the processor implements the steps of the method according to any one of claims 1 to 7 when executing the computer program.

Technical Field

The application belongs to the technical field of vision detection, and particularly relates to a vision detection method and device, and a wearable device.

Background

At present, the traditional vision testing methods include visual chart examination, medical examination, and lens switching examination.

However, the visual chart examination requires the body to be kept upright with the head neither raised nor lowered while the symbols are identified from top to bottom; the medical examination requires travelling to a designated medical institution, where a doctor obtains identification results with specific instruments such as a pointer and then determines the vision from those results; and the lens switching inspection requires confirming the vision by switching between different lenses many times. The vision detection operation is therefore complicated, and the detection efficiency is low.

Disclosure of Invention

The embodiment of the application provides a vision detection method, a vision detection device and wearable equipment, and can solve the problems of complex operation of vision detection and low detection efficiency.

A first aspect of the application provides a vision detection method, which comprises the following steps:

the wearable device receives a confirmation operation triggered by a user in the process of moving the position of the lens group, wherein the confirmation operation is triggered when eyes of the user perceive a clear image; the wearable device obtains a target distance between an imaging plane and a target plane when the confirmation operation is received, wherein the target plane is a plane where a first lens on the near-eye side of the lens group is located; the wearable device determines the image distance of the lens group according to the target distance, and determines the myopia degree corresponding to the vision of the user according to the image distance.

In one possible implementation manner of the first aspect, the moving the position of the lens group of the wearable device includes:

the wearable device starts driving the lens group to move towards the image display device from the initial position in response to the vision detection instruction.

In one possible implementation manner of the first aspect, the obtaining, by the wearable device, a target distance between the imaging plane and the target plane when the confirmation operation is received includes:

acquiring the movement distance of the lens group relative to the initial position when the confirmation operation is received; and determining the target distance according to the moving distance and the distance between the initial position and the imaging plane.

In one possible implementation manner of the first aspect, the movement distance is acquired by a displacement sensor.

In one possible implementation manner of the first aspect, after obtaining the target distance between the imaging plane and the target plane when the confirmation operation is received, the method further includes:

acquiring the object distance between the plane of the first lens on the near-eye side in the lens group and the plane of the image display device when the confirmation operation is received; determining the focal length of the lens group according to the object distance and the target distance; determining the focal power corresponding to the lens group according to the focal length; and determining the myopia degree corresponding to the vision of the user according to the focal power.

In one possible implementation manner of the first aspect, after obtaining the target distance between the imaging plane and the target plane when the confirmation operation is received, the method further includes:

and according to the corresponding relation between the distance and the vision, taking the target vision corresponding to the target distance as the vision of the user.

In one possible implementation of the first aspect, the wearable device displays the vision or myopia degree of the user through voice broadcast or a display screen.

For example, the wearable device can create account information of a user and record historical data of the user, and can realize detection and comparison of the vision of the user at any time so as to remind the user of vision change within a period of time.

For example, through the above implementation manner, when the wearable device is worn in front of the eyes of the user, the state of the user's eyes wearing glasses can be better simulated; and the motor driving device in the wearable device can drive the lens group back to the initial position upon receiving a reset key or a close key triggered by the user.

The second aspect of the present application provides a vision testing device, which comprises a lens set, an image display device, and a motor driving device;

the motor driving device is used for driving the lens group to move towards the image display device and acquiring the moving distance of the lens group when receiving the confirmation operation triggered by the user; the image display device is used for displaying an image when receiving the vision detection instruction; the lens group is used for adjusting the target distance between the imaging plane and the target plane in the moving process; the moving distance is used for determining a target distance between an imaging plane and a target plane, the target plane is a plane where a first lens on the near-eye side of the lens group is located, and the confirming operation is an operation triggered when eyes of a user perceive a clear image; the lens group comprises N concave lenses and N convex lenses which are arranged at intervals, wherein N is an integer larger than or equal to 3.

A third aspect of the present application provides a vision testing apparatus comprising:

a first acquisition module, configured to receive a confirmation operation triggered by a user in the process of the wearable device moving the position of the lens group, wherein the confirmation operation is triggered when the eyes of the user perceive a clear image;

the second acquisition module is used for acquiring a target distance between an imaging plane and a target plane when the confirmation operation is received, wherein the target plane is a plane where a first lens on the near-eye side in the lens group is located;

the first determining module is used for determining a first focal length of the lens group according to the target distance;

and the second determining module is used for determining the myopia degree corresponding to the vision of the user according to the first focal length.

A fourth aspect of the present application provides a wearable device, which includes a memory and a processor, where the memory stores a computer program operable on the processor, and the processor, when executing the computer program, implements the steps of the vision detection method according to any one of the first aspect.

A fifth aspect of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the vision detection method according to any one of the above first aspect.

A sixth aspect of embodiments of the present application provides a computer program product for causing a computer to perform the vision testing method of any one of the above first aspects when the computer program product runs on the computer.

Compared with the prior art, the application has the following beneficial effects: the wearable device receives a confirmation operation triggered by a user in the process of moving the position of the lens group, wherein the confirmation operation is triggered when the eyes of the user perceive a clear image; the wearable device obtains a target distance between an imaging plane and a target plane when the confirmation operation is received, wherein the target plane is the plane where the first lens on the near-eye side of the lens group is located; the wearable device determines the image distance of the lens group according to the target distance, and determines the myopia degree corresponding to the vision of the user according to the image distance. The embodiment of the application can solve the problems of complicated vision detection operation and low detection efficiency, improves the vision detection efficiency, provides great convenience for detection at any time, and has strong usability and practicability.

Drawings

In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings based on these drawings without inventive effort.

Fig. 1 is a schematic diagram of a wearable device provided by an embodiment of the present application;

FIG. 2 is a schematic diagram of a flow chart of an implementation of a vision testing method provided in an embodiment of the present application;

FIG. 3 is a schematic diagram of an internal implementation architecture of a vision testing apparatus provided in the embodiments of the present application;

FIG. 4 is a schematic diagram of an internal implementation architecture in a vision testing process according to an embodiment of the present application;

FIG. 5 is a schematic diagram of a binocular vision testing process provided by an embodiment of the present application;

FIG. 6 is a schematic structural diagram of a vision testing apparatus provided in an embodiment of the present application;

fig. 7 is a schematic diagram of a wearable device provided in an embodiment of the present application.

Detailed Description

In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.

In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.

Referring to fig. 1, fig. 1 is a schematic view of a wearable device provided in an embodiment of the present application. As shown in fig. 1, the wearable device may be a head-mounted terminal device, for example a Virtual Reality (VR) device or an Augmented Reality (AR) device. As shown in fig. 1, the wearable device may further include an on/off key, a pause key, a microphone, a display screen, and the like. When a user starts to detect vision, the wearable device can receive a vision detection instruction input by the user clicking the on/off key, and in response to the vision detection instruction the wearable device starts to drive the inner lens group to move; during the movement, the wearable device receives a confirmation operation triggered by the user and determines the moving distance of the current lens group. The confirmation operation may be a trigger operation input by the user clicking the pause key upon seeing a clear image, or may be triggered by other means, for example through a control terminal such as a remote controller associated with the wearable device. The trigger forms of the detection start instruction and the confirmation operation are not specifically limited here.

For example, when receiving a confirmation operation triggered by the user, the wearable device determines a target distance between the imaging plane and the lens group, further determines the image distance of the current lens group, and determines the myopia degree or the vision of the user according to the image distance. The myopia degree and the vision can also be broadcast through a speaker or displayed on the display screen. As shown in fig. 1, the vision of the left eye is displayed on the left display screen: 0.3, with a required glasses degree of 300 to 350 degrees; the vision of the right eye is displayed on the right display screen: 0.25, with a required glasses degree of 400 degrees.

Through the embodiment of the application, the detection of the vision of the user can be realized at any time, so that the operation is convenient and fast, and the detection efficiency is greatly improved.

The following further describes the specific implementation process and implementation principle of the present application by means of specific embodiments.

Referring to fig. 2, fig. 2 is a schematic flow chart illustrating an implementation of the vision testing method according to the embodiment of the present application. As shown in fig. 2, the implementation flow of the eyesight detecting method may include the following steps:

s201, in the process of moving the position of the lens group, the wearable device receives a confirmation operation triggered by the user, where the confirmation operation is triggered when the eyes of the user perceive a clear image.

In some embodiments, the wearable device may display an image through the image display apparatus, and a user needing to detect vision may wear the wearable device to view the image displayed by the image display apparatus through the movement of the lens group, so as to detect the vision.

S202, the wearable device obtains a target distance between the imaging plane and a target plane when receiving the confirmation operation, where the target plane is a plane where a first lens on the near-eye side of the lens group is located.

Fig. 3 shows a schematic diagram of the internal implementation architecture of the vision detection device provided in the embodiment of the present application. The wearable device can serve as a vision detection device, and comprises a lens group, an image display device and a motor driving device.

The motor driving device further comprises a motor and a sensor, which may be a displacement sensor. The wearable device drives the lens group to move towards the image display device through the motor, and obtains the moving distance of the lens group through the sensor when receiving the confirmation operation triggered by the user. The image display device is used for displaying an image when receiving a vision detection instruction; the image may be a graphic or a number, for example the "E" of the international eye chart.
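As an illustration of this control loop, the following is a minimal Python sketch. The motor, sensor and confirmation interfaces are hypothetical stand-ins for the motor driving device, the displacement sensor and the pause-key event described above, not an API disclosed by the application.

```python
def run_detection(motor, sensor, confirmation_received, step_mm=0.5):
    """Drive the lens group towards the image display device until the user
    confirms a clear image, then return the moving distance of the lens group.

    motor, sensor and confirmation_received are hypothetical hardware
    interfaces standing in for the motor, the displacement sensor and the
    user-triggered confirmation operation described in the text.
    """
    start = sensor.position_mm()         # initial position, e.g. coordinate 0
    while not confirmation_received():   # user has not yet seen a clear image
        motor.step_towards_display(step_mm)
    motor.stop()
    stop = sensor.position_mm()          # position coordinate when movement stops
    return stop - start                  # moving distance d1 of the lens group
```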

For example, the moving distance is used to determine a target distance between the imaging plane and a target plane, where the target plane is the plane where the first lens on the near-eye side of the lens group is located, and the confirmation operation is an operation triggered when the eyes of the user perceive a clear image. As shown in fig. 3, the lens group may include N concave lenses and N convex lenses arranged at intervals, where N is an integer greater than or equal to 3.

Illustratively, the lens group is used for adjusting the target distance between the imaging plane and the target plane during the movement process, so that the position of the focus is adjusted by changing the image distance, and finally the focus is made to fall on the retina of the human eye, namely the eye of the user perceives a clear image.

For example, the first lens of the lens group on the side near the user's eye may be a concave lens, and the Nth lens on the other side near the image display device may be a convex lens. By setting the spacing between the concave lenses and the convex lenses, the lens group can fully diverge the parallel light emitted by the image display device.

It is to be understood that, as shown in fig. 3, when the vision of the user's eye is normal, the imaging point (or focal point) of the parallel light rays through the lens of the user's eye may fall on the retina of the user's eye.

As shown in fig. 3, assume an image distance v between the imaging plane and the plane where the first lens is located when the lens group is at the initial position, and an object distance u between the plane where the first lens is located and the image display plane. The focal length f is related to the image distance v and the object distance u by the following formula (1):

1/f = 1/u + 1/v (1).

By driving the lens group to move, the image distance v and the object distance u of the lens group are changed, realizing the automatic focusing of the wearable device. For example, when the lens group is moved from the initial position towards the image display device, the image distance v increases and the object distance u decreases within a certain range, so that the imaging point position, i.e. the focal position (the size of the focal length) of the wearable device, can be adjusted.
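To make formula (1) concrete, here is a minimal sketch; the function name is hypothetical, and any length unit may be used as long as u and v share it.

```python
def focal_length(u, v):
    """Focal length from the thin-lens relation 1/f = 1/u + 1/v.

    u: object distance (plane of the first lens to the image display plane).
    v: image distance (plane of the first lens to the imaging plane).
    """
    return (u * v) / (u + v)

# Moving the lens group towards the display increases v and decreases u,
# which shifts the focal point: the auto-focusing behaviour described above.
```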

Based on the above principle, a specific process of detecting eyes of a user with a myopia problem is described below.

Referring to fig. 4, fig. 4 is a schematic diagram of an internal implementation architecture in a vision detection process according to an embodiment of the present application. As shown in fig. 4 (a), when the user's eye has a myopia problem, the imaging point (or focal point) of the parallel light passing through the lens of the user's eye falls in front of the retina, for example at the plane where the first imaging plane is located, and the image seen by the user is blurred and unclear. Thus, when performing vision detection on the user's eyes, the wearable device drives the lens group by the motor to start moving from the initial position towards the image display device.

As shown in fig. 4 (b), during the movement of the lens group from the initial position towards the image display device, the focal length of the lens group is changed, so that the lens group and the lens of the user's eye image onto the same focal point, i.e. the retina of the user's eye, and the user's eye perceives a clear image. When the user perceives a clear image, the wearable device can receive the confirmation operation triggered by the user, the motor stops driving, and the sensor acquires the moving distance of the lens group.

For example, when the lens group is at the initial position, the corresponding position coordinate is 0; the sensor collects the corresponding position coordinate when the movement stops, so that the moving distance is determined from the two position coordinates, such as d1 shown in diagram (b) of fig. 4; the target distance at this time may be d2, and the object distance may be u1.

It should be noted that the distances identified in the above drawings are only schematic; the lengths or sizes shown in the drawings do not represent the actual distance relationships in practical applications, and are used only to clearly illustrate the relationships between the parameters, not to define their actual magnitudes.

In some embodiments, as shown in fig. 4 (b), after the wearable device drives the lens set to move from the initial position to the image display device by the motor, the focal length of the lens set is adjusted by increasing the image distance and decreasing the object distance within a certain range, so that the image of the lens set and the lens of the user's eye changes from the first image plane to the second image plane, and is imaged on the retina of the user's eye, so that the user's eye sees a clear image.

S203, the wearable device determines the image distance of the lens group according to the target distance.

And S204, determining the myopia degree corresponding to the vision of the user according to the image distance.

In one possible implementation, as shown in fig. 4 (b), the target distance may be the distance between the retina (the second imaging plane) and the first lens on the near-eye side of the lens group, i.e. d2.

In the first aspect, based on the distance Δd between the retina and the front end of a typical human eye being 17 mm, after the sensor acquires the moving distance d1 of the lens group, the actual value of d2 is obtained by adding Δd to d1, and d2 may be determined as the image distance of the lens group at the time the confirmation operation triggered by the user is received.

On the other hand, since Δd (the distance between the retina of the human eye and the front end of the human eye) shown in diagram (b) of fig. 4 is relatively small compared with d1 in practical applications (for example, Δd is 17 mm while d1 may be 70 mm to 100 mm), the target distance d2 may be directly approximated by the d1 acquired by the sensor, and d1 may be determined as the image distance of the lens group at the time the confirmation operation triggered by the user is received.

Illustratively, for the image distance in both the first aspect and the other aspect, as shown in fig. 4 (b), the object distance u1 at this time may be the distance between the first lens on the near-eye side of the lens group and the image display plane of the image display device.

Based on the above embodiment, in a possible implementation manner, the wearable device may further acquire, through the sensor, the object distance u1 of the lens group when the confirmation operation triggered by the user is received, so that the focal length of the lens group and the lens at this time can be obtained according to the relationship between the image distance, the object distance, and the focal length in formula (1).

For example, if the image distance is d2 as in the first aspect, the combined focal length f of the lens group and the lens of the eye at this time is calculated by formula (2), formula (3) and formula (4), which are respectively expressed as follows:

1/f = 1/u1 + 1/d2 (2);

d2 = d1 + Δd (3);

f = (u1 × d2)/(u1 + d2) (4).

further, according to the formula (5), calculating the focal power D of the lens group; and (4) calculating the degree A of glasses required by the eyes of the user according to the formula (6). Wherein, the formula (5) and the formula (6) are respectively expressed as follows:

for example, if the image distance d1 is the other aspect, the focal length f between the lens group and the lens at this time is calculated by the formula (7) and the formula (8). Further, the focal power D of the lens group is calculated according to the formula (9), and the number A of glasses required for the eyes of the user is calculated according to the formula (10). Wherein, formula (7), formula (8), formula (9) and formula (10) are respectively expressed as follows:

based on the above embodiment, in another realizable way, for equation (7), since the magnitude of the object distance u1 is large in practical application, it can be also ignored in equation (7)Thereby to obtainFurther, the method can be used for preparing a novel material

Finally, the number of glasses required by the eyes of the user is calculated

It should be noted that the other realizable manner above can also be applied to the case of the image distance d2 in the first aspect.
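As a hedged illustration of the computation chain in formulas (2) to (10), the sketch below takes Δd = 17 mm and the degree convention A = 100 × D from the text above; the function and parameter names are hypothetical.

```python
DELTA_D_MM = 17.0  # typical retina-to-front-of-eye distance cited above


def myopia_degree(d1_mm, u1_mm=None, add_delta=True):
    """Glasses degree A from the moving distance d1 acquired by the sensor.

    add_delta=True follows the first aspect, d2 = d1 + delta_d (formula (3));
    add_delta=False uses the approximation d2 ~ d1 of the other aspect.
    u1_mm=None further ignores the 1/u1 term, as when the object distance
    is treated as large.
    """
    v_mm = d1_mm + DELTA_D_MM if add_delta else d1_mm  # image distance
    if u1_mm is None:
        f_mm = v_mm                                    # f ~ v once 1/u1 ~ 0
    else:
        f_mm = (u1_mm * v_mm) / (u1_mm + v_mm)         # 1/f = 1/u1 + 1/v
    diopters = 1000.0 / f_mm                           # D = 1/f, f in metres
    return 100.0 * diopters                            # A = 100 x D


# Example usage with a hypothetical moving distance of 85 mm:
print(myopia_degree(85.0))
```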

In some embodiments, after the wearable device obtains the target distance of the imaging plane from the target plane when the confirmation operation is received, the method further comprises:

and the wearable equipment takes the target vision corresponding to the target distance as the vision of the user according to the corresponding relation between the distance and the vision.

For example, a distance-to-vision comparison table is stored in the wearable device; the distance may be the moving distance of the lens group during the detection process, or the image distance from the first lens on the near-eye side of the lens group to the retina of the user's eye.
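A minimal sketch of such a table lookup follows; the table values are placeholders for illustration only, not data from the application.

```python
# Hypothetical distance-to-vision comparison table: each entry maps an upper
# bound on the measured distance (mm) to a vision value. Placeholder numbers.
DISTANCE_TO_VISION = [
    (70.0, 0.5),
    (80.0, 0.4),
    (90.0, 0.3),
    (100.0, 0.25),
]


def vision_from_distance(distance_mm):
    """Return the target vision corresponding to the measured distance."""
    for upper_bound, vision in DISTANCE_TO_VISION:
        if distance_mm <= upper_bound:
            return vision
    return DISTANCE_TO_VISION[-1][1]  # beyond the table: use the last entry
```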

It should be noted that the detection process of the above embodiment is applicable to the detection of the eyesight of the left eye and the right eye of the user. Fig. 5 is a schematic diagram of a binocular vision testing process provided in the embodiment of the present application.

After the user wears the wearable device, the wearable device may receive a start detection instruction triggered by the user, whereupon the wearable device turns on the image display device to display an image, such as the international eye chart; the motor driving device on the left side of the wearable device drives the left lens group, so that when the user's left eye perceives a clear image, the confirmation operation triggered by the user is received, and the target distance corresponding to the left lens group is acquired through the sensor and output.

Similarly, the lens group on the right side is driven by the motor driving device on the right side of the wearable device, so that when the user's right eye perceives a clear image, the confirmation operation triggered by the user is received, and the target distance corresponding to the right lens group is acquired through the sensor and output.

Based on the target distances corresponding to the left and right lens groups output by the above embodiments, the detection result is calculated and output; the detection result may include the required glasses degrees corresponding to the left eye and the right eye respectively, or the vision corresponding to the left eye and the right eye respectively.
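Putting the pieces together for both eyes, the following sketch reuses the hypothetical helpers above (run_detection, myopia_degree and vision_from_distance); the per-eye assembly objects are likewise hypothetical.

```python
def binocular_detection(left, right):
    """Run the detection for each eye and assemble the detection result.

    left and right are hypothetical per-eye assemblies, each bundling a
    motor, a displacement sensor and a confirmation source, as used by
    run_detection above.
    """
    results = {}
    for name, eye in (("left", left), ("right", right)):
        d1 = run_detection(eye.motor, eye.sensor, eye.confirmation_received)
        results[name] = {
            "degree": myopia_degree(d1),         # required glasses degree
            "vision": vision_from_distance(d1),  # from the comparison table
        }
    return results
```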

In addition, it should be noted that the above embodiments do not specifically limit the number of lenses in the lens group (greater than or equal to 3): the greater the number of lenses, the more fully the light emitted by the image display device is diverged, and the more accurate the measurement; the number of lenses in the lens group can be determined by comprehensively considering the size and cost of the wearable device.

According to the embodiment of the application, the wearable device receives the confirmation operation triggered by the user in the process of moving the position of the lens group, wherein the confirmation operation is the operation triggered when the eyes of the user perceive a clear image; the wearable device obtains a target distance between an imaging plane and a target plane when the confirmation operation is received, wherein the target plane is the plane where the first lens on the near-eye side of the lens group is located; the wearable device determines the image distance of the lens group according to the target distance, and determines the myopia degree corresponding to the vision of the user according to the image distance. The embodiment of the application can solve the problems of complicated vision detection operation and low detection efficiency, improves the vision detection efficiency, and provides great convenience for detection at any time. The test is simple and convenient to implement and use; during use, the vision can be monitored and compared with previous historical data to judge fluctuations of the user's vision degree and changes of eye parameters, so as to remind the user; and the detection can be carried out at any time without complex instruments or professional operators.

It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.

Fig. 6 shows a block diagram of a vision testing apparatus provided in the embodiment of the present application, corresponding to the method in the above embodiment, and only shows the relevant parts in the embodiment of the present application for convenience of description. The vision testing apparatus illustrated in fig. 6 may be an executing subject of the vision testing method provided in the foregoing embodiment.

Referring to fig. 6, the vision inspection apparatus may include:

a first obtaining module 61, configured to receive a confirmation operation triggered by a user when the wearable device moves the position of the lens group, where the confirmation operation is triggered when the eyes of the user perceive a clear image;

a second obtaining module 62, configured to obtain a target distance between an imaging plane and a target plane when the confirmation operation is received, where the target plane is a plane where a first lens on a near-to-eye side of the lens group is located;

a first determining module 63, configured to determine a first focal length of the lens group according to the target distance;

and a second determining module 64, configured to determine a myopia degree corresponding to the eyesight of the user according to the first focal length.

The process of implementing each function by each module in the vision detecting device provided in the embodiment of the present application may specifically refer to the description of the embodiment shown in fig. 2, and is not described herein again.

It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.

It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.

As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon" or "in response to" determining "or" in response to detecting ". Similarly, the phrase "if it is determined" or "if a [ described condition or event ] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [ described condition or event ]" or "in response to detecting [ described condition or event ]".

Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance. It will also be understood that, although the terms first, second, etc. may be used herein to describe various elements in some embodiments of the application, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first table may be named a second table, and similarly, a second table may be named a first table, without departing from the scope of various described embodiments. The first table and the second table are both tables, but they are not the same table.

Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.

The vision detection method provided by the embodiment of the application can be applied to terminal devices such as Augmented Reality (AR) or Virtual Reality (VR) devices, and the specific type of the terminal device is not limited here.

By way of example and not limitation, when the terminal device is a wearable device, the wearable device may be a generic term for devices, such as glasses, developed by applying wearable technology to the intelligent design of daily wear. A wearable device is a portable device that is worn directly on the body or integrated into the clothing or accessories of the user. A wearable device is not only a hardware device, but also realizes powerful functions through software support, data interaction and cloud interaction. Generalized wearable intelligent devices are full-featured and large in size, and can realize complete or partial functions without relying on a smartphone, such as smart glasses.

Fig. 7 is a schematic structural diagram of a wearable device provided in an embodiment of the present application. As shown in fig. 7, the wearable device 7 of this embodiment includes: at least one processor 70 (only one shown in fig. 7), a memory 71, said memory 71 having stored therein a computer program 72 executable on said processor 70. The processor 70, when executing the computer program 72, implements the steps in the above-described embodiments of the eyesight detecting methods, such as the steps S201 to S204 shown in fig. 2. Alternatively, the processor 70, when executing the computer program 72, implements the functions of the modules/units in the above-described device embodiments, such as the functions of the modules 61 to 64 shown in fig. 6.

The wearable device 7 may be an Augmented Reality (AR) or Virtual Reality (VR) device or the like. The wearable device may include, but is not limited to, a processor 70 and a memory 71. It will be appreciated by those skilled in the art that fig. 7 is merely an example of the wearable device 7 and does not constitute a limitation of the wearable device 7, which may include more or fewer components than shown, or combine some components, or different components; for example, the terminal device may also include input and output devices, a network access device, a bus, and the like.

The Processor 70 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.

The memory 71 may in some embodiments be an internal storage unit of the wearable device 7, such as a hard disk or a memory of the wearable device 7. The memory 71 may also be an external storage device of the wearable device 7, such as a plug-in hard disk provided on the wearable device 7, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like. Further, the memory 71 may also include both an internal storage unit and an external storage device of the wearable device 7. The memory 71 is used for storing an operating system, an application program, a BootLoader (BootLoader), data, and other programs, such as program codes of the computer program. The memory 71 may also be used to temporarily store data that has been transmitted or is to be transmitted.

In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.

The embodiment of the present application further provides a wearable device, where the wearable device includes at least one memory, at least one processor, and a computer program stored in the at least one memory and executable on the at least one processor, and when the processor executes the computer program, the terminal device is enabled to implement the steps in any of the above method embodiments.

The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps in the above-mentioned method embodiments.

The embodiments of the present application provide a computer program product, which when executed on a computer, enables the computer to implement the steps in the above method embodiments.

The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow in the method of the embodiments described above can be realized by a computer program, which can be stored in a computer-readable storage medium and can realize the steps of the embodiments of the methods described above when the computer program is executed by a processor. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, recording medium, usb disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), electrical carrier wave signals, telecommunications signals, software distribution medium, and the like.

In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.

Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.

The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.

The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application, and are intended to be included within the scope of the present application.
