View display method of AR (augmented reality) glasses and AR glasses

Document No.: 377570 | Publication date: 2021-12-10

Description: This technology, "View display method of AR glasses and AR glasses", was designed and created by Miao Shunping (苗顺平) on 2021-09-14. The application discloses a view display method of AR glasses and the AR glasses. The view display method comprises: calibrating a reference position of the AR glasses; acquiring a rotation angle of the AR glasses relative to the reference position; controlling the canvas to move in a direction opposite to the rotation angle based on the rotation angle; determining the selected functional module according to the position, on the canvas, of an anchor point on the display screen; detecting in real time, by an interaction module arranged on the AR glasses, whether an interactive operation on the functional module and/or the canvas is triggered; if the interactive operation is triggered, performing interactive control on the functional module and/or the canvas based on the operation type indicated by the interactive operation; and re-calibrating the position of the AR glasses after the interactive control as the reference position. The application solves the problems in the related art that AR glasses are limited by their field of view, which greatly restricts the display area, and that their interaction modes are relatively limited.

1. A view display method for AR glasses, comprising:

calibrating the reference position of the AR glasses;

acquiring a rotation angle of the AR glasses based on the reference position;

controlling the canvas to move towards a direction opposite to the rotation angle based on the rotation angle;

determining a selected functional module according to the position, on the canvas, of an anchor point on the display screen;

detecting in real time, by an interaction module arranged on the AR glasses, whether an interactive operation on the functional module and/or the canvas is triggered;

if the interactive operation is triggered, performing interactive control on the functional module and/or the canvas based on the operation type indicated by the interactive operation;

and re-calibrating, as the reference position, the position of the AR glasses after the interactive control is carried out.

2. The view display method of AR glasses according to claim 1, wherein the interaction module arranged on the AR glasses detecting in real time whether the interactive operation is triggered specifically comprises:

the touch panel arranged on the AR glasses detects in real time whether the interactive operation is triggered; and/or,

the controller arranged on the AR glasses detects in real time whether the interactive operation is triggered; and/or,

the voice interaction module arranged on the AR glasses detects in real time whether the interactive operation is triggered; and/or,

and an external control module in remote communication connection with the AR glasses detects whether the interactive operation is triggered in real time.

3. The view display method of AR glasses according to claim 1, wherein the interaction module arranged on the AR glasses detecting in real time whether an interactive operation is triggered and, if the interactive operation is triggered, performing interactive control on the AR glasses based on the operation type indicated by the interactive operation specifically comprises:

the interaction module arranged on the AR glasses acquires the displacement acceleration of the AR glasses in real time and determines whether the displacement acceleration is within a preset range;

and when the displacement acceleration is within the preset range, executing the functional module selected by the anchor point to complete the interactive control.

4. The view display method of the AR glasses according to any one of claims 1 to 3, wherein the canvas is provided with a view boundary in a rotation direction of the AR glasses;

when the AR glasses rotate until the current interface reaches the view boundary, a reminder is issued, and the current position of the AR glasses is re-calibrated as the reference position.

5. The view display method of AR glasses according to claim 4, wherein the rotation angle specifically comprises: a horizontal rotation angle, a vertical rotation angle, and a deflection angle.

6. The view display method of AR glasses according to claim 5, wherein controlling the canvas to move in a direction opposite to the rotation angle based on the rotation angle specifically comprises:

based on the horizontal rotation angle, controlling the canvas to move towards the direction opposite to the horizontal rotation angle, and realizing the movement control of the canvas in the horizontal direction;

based on the vertical rotation angle, controlling the canvas to move towards the direction opposite to the vertical rotation angle, and realizing the movement control of the canvas in the vertical direction;

and controlling the canvas to rotate towards the direction opposite to the deflection angle based on the deflection angle, so as to realize the movement control of the canvas in the deflection direction.

7. The view display method of AR glasses according to claim 6, further comprising:

and acquiring a display area of the canvas in the vertical direction, and activating the movement control of the canvas in the vertical direction when the display area exceeds the display range of the AR glasses.

8. The view display method of AR glasses according to claim 1, wherein, when the canvas is controlled to move in a direction opposite to the rotation angle based on the rotation angle, a filtering algorithm is used to filter out rotation angles generated by fine jitter of the AR glasses and by signal interference.

9. The method for displaying the view of the AR glasses according to claim 8, wherein the filtering algorithm is specifically:

setting a minimum rotation angle threshold, wherein when the rotation angle of the AR glasses is smaller than the minimum rotation angle threshold, the canvas is fixed;

and setting a minimum rotation time threshold, wherein when the rotation time of the AR glasses is less than the minimum rotation time threshold, the canvas is fixed.

10. AR glasses, comprising:

the gyroscope sensor is used for calibrating the reference position of the AR glasses and acquiring the rotation angle of the AR glasses based on the reference position in real time;

the displacement calculation module is used for determining the displacement distance and the moving direction of the canvas according to the rotation angle;

the canvas moving module is used for moving the canvas according to the displacement distance and the moving direction;

and the interaction module is used for carrying out interaction control on the AR glasses according to the interaction operation.

Technical Field

The application relates to the technical field of AR glasses, in particular to a view display method of the AR glasses and the AR glasses.

Background

With the development of human-computer interaction technology, AR glasses are increasingly used in daily life, for example for watching movies, playing games, and road navigation. AR glasses on the market are limited by the field of view (FOV) of their AR display, and most of them try to make full use of the limited space within the available FOV; as a result, the display area is greatly restricted and the interaction modes remain limited.

For the problems in the related art that AR glasses are limited by the field of view, that the display area is consequently greatly restricted, and that the interaction mode is limited, no effective solution has been proposed so far.

Disclosure of Invention

The present application mainly aims to provide a view display method of AR glasses and AR glasses, so as to solve the problems that AR glasses in the related art are limited by the field of view, which greatly restricts the display area, and that their interaction mode is relatively limited.

In order to achieve the above object, the present application provides a view display method of AR glasses, the view display method of the AR glasses including:

calibrating the reference position of the AR glasses;

acquiring a rotation angle of the AR glasses based on the reference position;

controlling the canvas to move towards a direction opposite to the rotation angle based on the rotation angle;

determining a selected functional module according to the position, on the canvas, of an anchor point on the display screen;

detecting in real time, by an interaction module arranged on the AR glasses, whether an interactive operation on the functional module and/or the canvas is triggered;

if the interactive operation is triggered, performing interactive control on the functional module and/or the canvas based on the operation type indicated by the interactive operation;

and re-calibrating, as the reference position, the position of the AR glasses after the interactive control is carried out.

Further, the interaction module arranged on the AR glasses detecting in real time whether the interactive operation is triggered specifically comprises:

the touch panel arranged on the AR glasses detects in real time whether the interactive operation is triggered; and/or,

the controller arranged on the AR glasses detects in real time whether the interactive operation is triggered; and/or, a voice interaction module arranged on the AR glasses detects in real time whether the interactive operation is triggered; and/or,

and an external control module in remote communication connection with the AR glasses detects whether the interactive operation is triggered in real time.

Further, the interaction module arranged on the AR glasses detecting in real time whether an interactive operation is triggered and, if the interactive operation is triggered, performing interactive control on the AR glasses based on the operation type indicated by the interactive operation specifically comprises:

the interaction module arranged on the AR glasses acquires the displacement acceleration of the AR glasses in real time and determines whether the displacement acceleration is within a preset range;

and when the displacement acceleration is within the preset range, executing the functional module selected by the anchor point to complete the interactive control.

Further, the canvas is provided with a view boundary in the rotation direction of the AR glasses;

when the AR glasses rotate until the current interface reaches the view boundary, a reminder is issued, and the current position of the AR glasses is re-calibrated as the reference position.

Further, the rotation angle specifically comprises: a horizontal rotation angle, a vertical rotation angle, and a deflection angle.

Further, controlling the canvas to move in a direction opposite to the rotation angle based on the rotation angle specifically comprises:

based on the horizontal rotation angle, controlling the canvas to move towards the direction opposite to the horizontal rotation angle, and realizing the movement control of the canvas in the horizontal direction;

based on the vertical rotation angle, controlling the canvas to move towards the direction opposite to the vertical rotation angle, and realizing the movement control of the canvas in the vertical direction;

and controlling the canvas to rotate towards the direction opposite to the deflection angle based on the deflection angle, so as to realize the movement control of the canvas in the deflection direction.

Further, the method further comprises: acquiring the display area of the canvas in the vertical direction, and activating movement control of the canvas in the vertical direction when the display area exceeds the display range of the AR glasses.
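This activation condition can be sketched as a simple size comparison. The pixel-based check and the function name are assumptions for illustration; the application does not specify how the display area is measured.

```python
def vertical_scroll_enabled(canvas_height_px, display_height_px):
    """Activate vertical movement control of the canvas only when its
    vertical display area exceeds the display range of the AR glasses;
    otherwise vertical head motion leaves the canvas fixed."""
    return canvas_height_px > display_height_px
```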

Further, when the canvas is controlled to move towards the direction opposite to the rotation angle based on the rotation angle, a filtering algorithm is adopted to filter out the rotation angle generated by the fine jitter and the signal interference of the AR glasses.

Further, the filtering algorithm specifically includes:

setting a minimum rotation angle threshold, wherein when the rotation angle of the AR glasses is smaller than the minimum rotation angle threshold, the canvas is fixed;

and setting a minimum rotation time threshold, wherein when the rotation time of the AR glasses is less than the minimum rotation time threshold, the canvas is fixed.
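The two thresholds above can be sketched as a simple gating filter. This is an illustrative sketch only: the `RotationSample` structure and the concrete threshold values are assumptions, since the application does not specify them.

```python
from dataclasses import dataclass

@dataclass
class RotationSample:
    angle_deg: float   # rotation angle relative to the reference position
    duration_s: float  # how long the rotation has persisted

def should_move_canvas(sample: RotationSample,
                       min_angle_deg: float = 2.0,
                       min_duration_s: float = 0.15) -> bool:
    """Filter out fine jitter and signal interference: the canvas stays
    fixed unless both the minimum rotation angle threshold and the
    minimum rotation time threshold are exceeded."""
    if abs(sample.angle_deg) < min_angle_deg:
        return False  # below the minimum rotation angle threshold
    if sample.duration_s < min_duration_s:
        return False  # below the minimum rotation time threshold
    return True
```

A deliberate head turn (large angle, sustained) passes the gate, while brief tremors or sensor spikes do not.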

According to another aspect of the present application, there is provided AR glasses including:

the gyroscope sensor is used for calibrating the reference position of the AR glasses and acquiring the rotation angle of the AR glasses based on the reference position in real time;

the displacement calculation module is used for determining the displacement distance and the moving direction of the canvas according to the rotation angle;

the canvas moving module is used for moving the canvas according to the displacement distance and the moving direction;

and the interaction module is used for carrying out interaction control on the AR glasses according to the interaction operation.

In the embodiments of the application, the reference position of the AR glasses is calibrated; the rotation angle of the AR glasses relative to the reference position is acquired; based on the rotation angle, the canvas is controlled to move in a direction opposite to the rotation angle; and the selected functional module is determined according to the position, on the canvas, of the anchor point on the display screen. In this way, the canvas in the AR glasses moves with the rotation of the wearer's head, providing, through a more natural action (turning the head), an interaction mode that better matches user habits;

in addition, whether an interactive operation on the functional module and/or the canvas is triggered is detected in real time by the interaction module arranged on the AR glasses; if the interactive operation is triggered, interactive control is performed on the functional module and/or the canvas based on the operation type indicated by the interactive operation; and the position of the AR glasses after the interactive control is re-calibrated as the reference position. Thus, while the canvas is moved by head rotation for interaction, an additional interaction module can be used for further interactive operations. This enriches the interaction modes of the AR glasses without increasing hardware cost, enlarges the AR display area, and solves the problems in the related art that AR glasses are limited by the field of view, that the display area is greatly restricted, and that the interaction mode is limited.

Drawings

The accompanying drawings, which are incorporated in and constitute a part of this application, serve to provide a further understanding of the application and to enable other features, objects, and advantages of the application to be more apparent. The drawings and their description illustrate the embodiments of the invention and do not limit it. In the drawings:

fig. 1 is a schematic flowchart of a view display method of AR glasses according to an embodiment of the present application;

FIG. 2 is a schematic view of AR glasses according to an embodiment of the present application;

FIG. 3 is a diagram illustrating display of AR glasses views after canvas movement according to an embodiment of the present application;

FIG. 4 is a schematic diagram of the structure of AR glasses according to an embodiment of the present application;

the device comprises a control unit 1, a canvas moving module 2, a displacement calculating module 3, a gyroscope sensor 4, a motion filtering module 5, a touch pad 6, a voice interaction module 7, a controller 8, a remote controller 9, an acceleration sensor 10, an acceleration comparing module 11, a comparison result output module 12, a size comparing module 13, a canvas size acquiring module 14 and a view angle size acquiring module 15.

Detailed Description

In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.

It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances, such that the embodiments of the application described herein can be implemented in orders other than those illustrated or described herein.

In this application, the terms "upper", "lower", "inside", and the like indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings. These terms are used primarily to better describe the present application and its embodiments, and are not used to limit the indicated devices, elements or components to a particular orientation or to be constructed and operated in a particular orientation.

Moreover, some of the above terms may be used to indicate other meanings besides the orientation or positional relationship, for example, the term "on" may also be used to indicate some kind of attachment or connection relationship in some cases. The specific meaning of these terms in this application will be understood by those of ordinary skill in the art as appropriate.

Furthermore, the terms "disposed," "provided," "connected," "secured," and the like are to be construed broadly. For example, "connected" may be a fixed connection, a detachable connection, or a unitary construction; can be a mechanical connection, or an electrical connection; may be directly connected, or indirectly connected through intervening media, or may be in internal communication between two devices, elements or components. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art as appropriate.

In addition, the term "plurality" shall mean two as well as more than two.

It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.

As shown in fig. 1 to 3, an embodiment of the present application provides a view display method for AR glasses, where the view display method for AR glasses includes the following steps:

step 101: calibrating the reference position of the AR glasses;

In this embodiment, an angle sensor such as a gyroscope (for example, a three-axis gyroscope sensor) may be disposed in the AR glasses, and the reference position of the AR glasses may be calibrated by the three-axis gyroscope sensor. For example, when the AR glasses are in the initial state, the three-axis gyroscope sensor establishes a three-dimensional coordinate system with the current position of the AR glasses as the origin of coordinates, and that position of the AR glasses is calibrated as the reference position.

Step 102: acquiring a rotation angle of the AR glasses based on the reference position;

when the AR glasses are worn on the head of the wearer, an initialization setting may be performed to calibrate the current position of the AR glasses as a reference position. And when the head of the wearer moves, the three-axis gyroscope sensor acquires the motion data of the head of the wearer, wherein the motion data comprises a motion direction, a motion track and motion acceleration. In the present embodiment, the rotation angle and the rotation direction of the head of the wearer with respect to the reference position are mainly acquired by the three-axis gyro sensor. In this embodiment, the rotation angle and the rotation direction are mainly a horizontal rotation angle, a vertical rotation angle, and a deflection angle.

Step 103: controlling the canvas to move towards a direction opposite to the rotation angle based on the rotation angle;

In this embodiment, the canvas is a virtual display area formed by the AR glasses; after the wearer puts on the AR glasses, the content on the canvas can be viewed through the display screen projected onto the display module. The canvas can display a plurality of functional modules, such as a camera, an address book, a telephone, music, settings, navigation, and time, and can of course also display the current battery level, mode, and other status of the AR glasses.

It is understood that the functional modules may be arranged according to preset partitions, and may be arranged transversely and longitudinally.

The display module comprises one or two lens modules for AR display, and may also comprise display components other than the AR display module, such as, but not limited to, an indicator light. The display module realizes display of the AR video signal and display of the working state of the AR glasses.

In this embodiment, after the three-axis gyroscope sensor acquires the rotation angle of the wearer's head, the displacement distance and moving direction of the canvas can be determined from that angle, and the canvas is moved accordingly. For example, when the wearer's head rotates to the right by a certain angle, the canvas is controlled, according to the rotation angle acquired by the three-axis gyroscope sensor, to move to the left by a corresponding distance, so that the view to the right of the original viewing angle moves directly in front of the wearer's eyes, where it can be viewed conveniently. Similarly, when the wearer's head turns or deflects in other directions, the canvas moves in the reverse direction, bringing the corresponding view directly in front of the wearer's eyes.
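The opposite-direction mapping can be sketched as a linear conversion from rotation angle to canvas displacement. The pixels-per-degree gain is an assumed tuning constant, not a value from the application.

```python
def canvas_offset(rotation_deg, pixels_per_degree=20.0):
    """Map a head rotation (yaw, pitch, roll in degrees) to a canvas
    transform in the opposite direction: turning right moves the canvas
    left, looking up moves it down, and deflecting counter-rotates it."""
    yaw, pitch, roll = rotation_deg
    dx = -yaw * pixels_per_degree    # horizontal canvas movement, px
    dy = -pitch * pixels_per_degree  # vertical canvas movement, px
    dtheta = -roll                   # canvas counter-rotation, degrees
    return dx, dy, dtheta
```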

Step 104: determining a selected functional module according to the position of an anchor point on a display screen on the canvas;

In this embodiment, the anchor point is a fixed point on the display screen of the AR glasses; for a better user experience, it is located at the center of the display screen, always directly in front of the wearer's eyes. When the wearer's head rotates in the horizontal direction, the canvas moves in the horizontal direction and drives the view on it, for example a plurality of functional modules, to move with it. When a functional module moves to the position corresponding to the anchor point, that functional module is selected by the wearer, and step 105 may then be executed.
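Anchor-based selection amounts to a hit test of the screen-fixed anchor point against the functional modules' bounding boxes in canvas coordinates. A sketch under that assumption (the rectangle layout of `modules` is hypothetical):

```python
def module_at_anchor(anchor_xy, modules, canvas_offset_xy):
    """Return the name of the functional module whose bounding box
    contains the anchor point, or None if the anchor is over empty
    canvas. `modules` maps a name to an (x, y, w, h) rectangle in
    canvas coordinates; the canvas offset is its current displacement."""
    # Convert the screen-fixed anchor into canvas coordinates.
    ax = anchor_xy[0] - canvas_offset_xy[0]
    ay = anchor_xy[1] - canvas_offset_xy[1]
    for name, (x, y, w, h) in modules.items():
        if x <= ax <= x + w and y <= ay <= y + h:
            return name
    return None
```

As the canvas slides left, the module that moves under the central anchor becomes the selected one.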

Step 105: the interaction module arranged on the AR glasses detects whether the interaction operation on the functional module and/or the canvas is triggered in real time;

if the interactive operation is triggered, performing interactive control on the functional module and/or the canvas based on the operation type indicated by the interactive operation;

In this embodiment, the interaction modes implemented by the interaction module are interaction modes other than controlling the canvas movement with the head: head movement of the wearer only selects a functional module via the anchor point, and the selected functional module is then executed through the other interaction modules.

Specifically, the interaction module arranged on the AR glasses detects in real time whether an interactive operation on the functional module is triggered. When the functional module to be used is selected by the anchor point after the wearer's head rotates, the wearer inputs a corresponding command to the interaction module according to the configured interactive operation, so that the functional module selected by the anchor point executes the next action, such as opening, closing, deleting, or moving. If the functional module involves further interactive functions after an action, the same logic can be applied.

The interaction modes supported by the interaction module of the AR glasses are not limited to opening or closing the corresponding functional module; the movement of the canvas can also be controlled. When the wearer does not use head movement to control canvas movement, an interactive operation for canvas movement can be input to the interaction module, including controlling horizontal movement, vertical movement, and rotational movement of the canvas, so that more views can still be presented directly in front of the wearer's eyes as the canvas moves.

By combining the interaction module with head control, this embodiment enriches the interaction modes of the AR glasses, makes the user's interaction choices more flexible, and improves the user's AR experience.

Step 106: and re-calibrating the position of the AR glasses after the interactive control is carried out as a reference position.

Specifically, it should be noted that, after the wearer uses the interaction module to realize the interaction control on the AR glasses, the reference position of the AR glasses needs to be calibrated again. For example, when the wearer needs to control the canvas by head movement after sliding the canvas by the interaction module or changing the positions of the functional modules in the canvas, the three-axis gyroscope sensor can be used to calibrate the current position of the AR glasses as a new reference position, and then the rotation angle of the AR glasses based on the new reference position is acquired along with the rotation of the wearer's head, so that the new canvas moves again in response to the rotation angle.

When the reference position of the AR glasses is re-calibrated, the wearer's head may be either in a rotated state or facing straight ahead. After the wearer rotates the head to select the required view, the interaction module intervenes to perform interactive control on the AR glasses; at this moment the canvas no longer moves with head rotation, so the head can rotate back to the forward-facing position, allowing the wearer to view the current view more comfortably. In this state, head control of canvas movement can be resumed by re-calibrating the reference position of the AR glasses. It can be understood that after the wearer rotates the head to select the required view and the interaction module intervenes to perform interactive control, for example opening a new menu, the reference position of the AR glasses can be re-calibrated immediately, so that the canvas in the new menu can again be controlled by head movement.

Further, the real-time detection of whether the interactive operation is triggered by the interactive module arranged on the AR glasses specifically includes:

the touch panel arranged on the AR glasses detects in real time whether the interactive operation is triggered; and/or,

the controller arranged on the AR glasses detects in real time whether the interactive operation is triggered; and/or,

the voice interaction module arranged on the AR glasses detects in real time whether the interactive operation is triggered; and/or,

and an external control module in remote communication connection with the AR glasses detects whether the interactive operation is triggered in real time.

Specifically, the wearer may trigger the interactive operation through the touch panel, which can detect click operations (single click, double click, a preset number of clicks, and the like), sliding operations, long-press operations, and so on. The interactive operation can also be triggered through the controller, such as control keys arranged on the AR glasses; the wearer can likewise trigger it by inputting voice to the voice interaction module, or through a remote controller. It can be understood that the above trigger modes may coexist, and the wearer may select a suitable one, further improving the flexibility of interaction.

Further, the interaction module arranged on the AR glasses detecting in real time whether an interactive operation is triggered and, if the interactive operation is triggered, performing interactive control on the AR glasses based on the operation type indicated by the interactive operation specifically comprises:

the interaction module arranged on the AR glasses acquires the displacement acceleration of the AR glasses in real time and determines whether the displacement acceleration is within a preset range;

and when the displacement acceleration is within the preset range, executing the functional module selected by the anchor point to complete the interactive control.

Specifically, in the present embodiment, interaction with a functional module is performed using the head motion of the wearer. When the wearer has selected a functional module, a quick nodding action can be performed; the interaction module then obtains the displacement acceleration of the AR glasses on the wearer's head, and when the displacement acceleration is within a preset range, the selected functional module is considered confirmed and executed. That is, this embodiment executes the selected functional module through head movement: both canvas movement and module execution are controlled by the wearer's head, freeing both hands. To improve the wearer's experience and reduce action conflicts, this interaction mode can be limited to environments in which the canvas moves only in the horizontal direction, avoiding conflicts between head pitching and nodding actions in a vertical-motion environment.
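The nod-to-confirm check reduces to testing whether the peak displacement acceleration falls inside the preset range. A sketch with illustrative range endpoints (the application gives no concrete values):

```python
def nod_confirms(acceleration_samples, lower=3.0, upper=15.0):
    """Treat a quick nod as confirmation when the peak absolute
    displacement acceleration (m/s^2) lies within the preset range;
    gentle drift falls below it and violent motion exceeds it."""
    peak = max(abs(a) for a in acceleration_samples)
    return lower <= peak <= upper
```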

Further, the canvas is provided with a view boundary in the rotation direction of the AR glasses;

when the AR glasses rotate until the current interface reaches the view boundary, a reminder is issued, and the current position of the AR glasses is re-calibrated as the reference position.

Specifically, it should be noted that when the wearer rotates the AR glasses too far and reaches a view boundary of the canvas, the view content on the canvas has been fully displayed; at this moment an explicit prompt notifies the wearer that there is no content beyond the current angle. If the wearer then rotates back toward the direction with content, the current position of the AR glasses is taken as the new reference point, the reference position is re-calibrated, and the view content is displayed again as the wearer's head rotates.

For horizontal movement and deflection movement of the canvas, the view boundaries are arranged at the left and right ends of the canvas; for vertical movement of the canvas, the view boundaries are arranged at the upper and lower ends.
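The boundary check and re-calibration described above can be sketched as follows; the class name, angle units (degrees) and edge values are assumptions for illustration:

```python
# Minimal sketch of the view-boundary check with re-calibration.
# Angles are in degrees relative to the current reference position;
# the boundary values passed in are illustrative assumptions.
class BoundaryTracker:
    def __init__(self, min_angle: float, max_angle: float):
        self.min_angle = min_angle  # left (or lower) canvas edge
        self.max_angle = max_angle  # right (or upper) canvas edge
        self.reference = 0.0        # current calibrated reference

    def update(self, rotation: float) -> str:
        """On hitting a canvas edge, remind the wearer and re-calibrate
        the reference position to the edge, so rotating back toward
        content immediately moves the view again."""
        absolute = self.reference + rotation
        if absolute <= self.min_angle or absolute >= self.max_angle:
            self.reference = max(self.min_angle,
                                 min(absolute, self.max_angle))
            return "boundary: no further content in this direction"
        return "ok"
```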

Further, the rotation angle specifically comprises: a horizontal rotation angle, a vertical rotation angle and a deflection angle. It can be understood that in a Cartesian coordinate system, in this embodiment the horizontal rotation angle is the angle of rotation about the Z axis, the vertical rotation angle is the angle of rotation about the Y axis, and the deflection angle is the angle of rotation about the X axis.

Controlling the canvas to move in the direction opposite to the rotation angle, based on the rotation angle, specifically comprises:

based on the horizontal rotation angle, controlling the canvas to move towards the direction opposite to the horizontal rotation angle, and realizing the movement control of the canvas in the horizontal direction;

based on the vertical rotation angle, controlling the canvas to move towards the direction opposite to the vertical rotation angle, and realizing the movement control of the canvas in the vertical direction;

and controlling the canvas to rotate towards the direction opposite to the deflection angle based on the deflection angle, so as to realize the movement control of the canvas in the deflection direction.

Specifically, it should be noted that head movement comprises rotation in three directions: rotation about the Z axis (the horizontal rotation angle), rotation about the Y axis (the vertical rotation angle) and rotation about the X axis (the deflection angle). To make full use of head movement for multiple interaction modes, this embodiment therefore uses the horizontal rotation angle, the vertical rotation angle and the deflection angle of the head.
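The opposite-direction mapping for the three axes can be sketched as below; the degrees-to-pixels scale factor is an assumed parameter, not a value from this disclosure:

```python
# Illustrative mapping from head rotation to opposite canvas motion.
# px_per_deg is an assumed scale factor for the sketch.
def canvas_motion(horizontal_deg: float, vertical_deg: float,
                  deflection_deg: float, px_per_deg: float = 10.0):
    """Return (dx, dy, roll): the canvas shifts opposite to the head's
    horizontal/vertical rotation and counter-rotates against deflection,
    keeping the viewed content stable in front of the wearer's eyes."""
    dx = -horizontal_deg * px_per_deg   # head right -> canvas left
    dy = -vertical_deg * px_per_deg     # head up    -> canvas down
    roll = -deflection_deg              # counter-rotate, in degrees
    return dx, dy, roll
```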

Further, the method also comprises the following steps: and acquiring a display area of the canvas in the vertical direction, and activating the movement control of the canvas in the vertical direction when the display area exceeds the display range of the AR glasses.

Specifically, this describes a mode in which movement control of the canvas in the vertical direction is not permanently resident on the AR glasses. When the display area of the canvas in the vertical direction does not exceed the display range of the AR glasses (that is, when the width of the canvas is no greater than the width of the field angle), the wearer can browse the entire view on the canvas through horizontal movement alone, without moving the canvas vertically. In addition, in this case the data that the three-axis gyroscope sensor needs to acquire and compute is reduced, so energy consumption can be lowered and processing efficiency improved.

The AR glasses activate movement control of the canvas in the vertical direction only when the width of the canvas is greater than the width of the field angle, at which time the canvas may be controlled to move in the vertical direction by the head pitch motion.

It will be appreciated that for a length of the canvas that is less than or equal to the length of the field angle, the AR glasses may also not enable movement control of the canvas in the horizontal direction, but only activate movement control of the canvas in the vertical direction.

By comparing the size of the canvas with the field angle at an early stage and activating different canvas control modes accordingly, this embodiment can effectively improve the speed of later data acquisition and processing, improve the response speed, and thus improve the user experience.
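The early size comparison that decides which movement axes to activate can be sketched as follows; the function and field names are assumptions for illustration:

```python
# Sketch of the canvas-vs-field-angle comparison. An axis is enabled
# only when the canvas exceeds the field of view in that dimension,
# reducing gyroscope sampling and computation otherwise.
def active_axes(canvas_len: float, canvas_wid: float,
                fov_len: float, fov_wid: float) -> dict:
    """Return which canvas movement controls should be activated,
    comparing canvas length/width against the field-angle length/width."""
    return {
        "horizontal": canvas_len > fov_len,
        "vertical": canvas_wid > fov_wid,
    }
```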

Further, when the canvas is controlled to move towards the direction opposite to the rotation angle based on the rotation angle, a filtering algorithm is adopted to filter out the rotation angle generated by the fine jitter and the signal interference of the AR glasses.

Specifically, it should be noted that when the wearer uses the AR glasses, the head may shake slightly, and the three-axis gyroscope sensor may also suffer signal interference, either of which can cause unintended operations. To reduce such misoperation, this embodiment adopts a filtering algorithm to filter out rotation angles produced by slight shaking of the AR glasses and by signal interference.

Further, the filtering algorithm specifically includes:

setting a minimum rotation angle threshold, wherein when the rotation angle of the AR glasses is smaller than the minimum rotation angle threshold, the rotation angle cannot control the canvas to move, namely the canvas remains fixed;

and setting a minimum rotation time threshold, wherein when the rotation time of the AR glasses is less than the minimum rotation time threshold, the rotation angle cannot control the canvas to move, namely the canvas remains fixed.

In this way, misoperation and the possibility of mistaken triggering are reduced, and the wearer's experience is improved.
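The two-threshold filter above can be sketched as follows; the threshold values are illustrative assumptions, not values from this disclosure:

```python
# Sketch of the jitter filter: a rotation moves the canvas only when it
# exceeds both a minimum-angle and a minimum-duration threshold.
# These values are illustrative assumptions.
MIN_ANGLE_DEG = 2.0   # below this, treat the motion as head jitter
MIN_TIME_S = 0.05     # below this, treat the reading as sensor noise

def passes_filter(angle_deg: float, duration_s: float) -> bool:
    """Return True when the rotation is large and sustained enough to
    be a deliberate head movement; otherwise the canvas stays fixed."""
    return abs(angle_deg) >= MIN_ANGLE_DEG and duration_s >= MIN_TIME_S
```

Requiring both conditions at once matches the "and/or" structure of the claims in its strictest form; a looser variant could treat either threshold alone as sufficient to reject a motion.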

As shown in fig. 4, according to another aspect of the present application, there is provided AR glasses including: the gyroscope sensor 4 is used for calibrating the reference position of the AR glasses and acquiring the rotation angle of the AR glasses based on the reference position in real time;

the displacement calculation module 3 determines the displacement distance and the moving direction of the canvas according to the rotation angle;

the canvas moving module 2 moves the canvas reversely according to the displacement distance and the moving direction;

and the interaction module is used for carrying out interaction control on the AR glasses according to the interaction operation.

In this embodiment, the gyro sensor 4 is disposed inside the AR glasses, and the reference position of the AR glasses can be calibrated by the gyro sensor 4. For example, when the AR glasses are in the initial state, the gyro sensor 4 establishes a three-dimensional coordinate system with the current position of the AR glasses as the origin of coordinates, and the position of the AR glasses is calibrated as the reference position.

When the AR glasses are worn on the head of the wearer, an initialization setting may be performed to calibrate the current position of the AR glasses as the reference position. As the wearer's head moves, the gyroscope sensor 4 acquires the motion data of the head, including the motion direction, motion track and motion acceleration. In this embodiment, the rotation angle and rotation direction of the head relative to the reference position are mainly obtained through the three-axis gyroscope; these are mainly the horizontal rotation angle, the vertical rotation angle and the deflection angle. The gyro sensor 4 may be a three-axis gyroscope sensor.

In this embodiment, the canvas is a virtual display area formed on the AR glasses, and the content on the canvas can be viewed through the display screen projected on the display module by the AR glasses after the wearer wears the AR glasses. On the canvas, a plurality of functional modules can be displayed, such as a camera, an address book, a telephone, music, settings, navigation, time and the like, and of course, the current electric quantity, mode and the like of the AR glasses can also be displayed.

It is understood that the functional modules may be arranged according to preset partitions, and may be arranged transversely and longitudinally.

In this embodiment, after gyroscope sensor 4 acquires the head rotation angle of the wearer, displacement calculation module 3 determines the displacement distance and moving direction of the canvas from that rotation angle, and canvas moving module 2 then moves the canvas by that distance in that direction. For example, when the head of the wearer rotates to the right by a certain angle, the canvas is controlled to move to the left by a corresponding distance according to the rotation angle, so that the view to the right of the original viewing angle is brought directly in front of the wearer's eyes, which is very convenient to experience. Similarly, when the wearer's head turns left, up or down, or deflects, the canvas moves in the reverse direction, bringing the corresponding view directly in front of the wearer's eyes.

Determining the selected functional module according to the position of the anchor point on the display screen relative to the canvas: in this embodiment, the anchor point is a fixed point on the display screen of the AR glasses; for a better user experience, the anchor point is located at the centre of the display screen, always directly in front of the wearer's eyes. When the wearer's head rotates in the horizontal direction, the canvas moves horizontally and carries the view on the canvas with it, for example moving the functional modules; when a functional module moves to the position corresponding to the anchor point, that functional module is selected by the wearer.
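The anchor-point hit-test can be sketched as below; the coordinate layout and module tuples are assumptions for illustration:

```python
# Sketch of anchor-point hit-testing: the anchor is fixed at the screen
# centre, and a module is selected when the anchor falls inside its
# bounds after the canvas has moved. Layout values are assumptions.
def module_at_anchor(anchor_x: float, anchor_y: float, modules):
    """modules: iterable of (name, x, y, w, h) rectangles, already
    offset by the current canvas position. Returns the selected name,
    or None when the anchor lies between modules."""
    for name, x, y, w, h in modules:
        if x <= anchor_x <= x + w and y <= anchor_y <= y + h:
            return name
    return None
```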

The interaction module arranged on the AR glasses detects whether the interaction operation on the functional module and/or the canvas is triggered or not in real time;

if the interactive operation is triggered, performing interactive control on the functional module and/or the canvas based on the operation type indicated by the interactive operation;

in this embodiment, the interaction mode to be implemented by the interaction module is an interaction mode other than the mode in which the canvas is controlled to move by using the head. The interaction by using the head movement of the wearer is only executed to the anchor point selected function module in the embodiment, and the function module is executed after the selected function module is executed by other interaction modules.

Specifically, the interaction module arranged on the AR glasses detects in real time whether an interaction operation on the functional module is triggered. When the wearer's head rotates and the anchor point selects the functional module to be used, a corresponding command is input to the interaction module according to the set interaction operation, so that the functional module selected by the anchor point executes the next action, such as starting, closing, deleting or moving. If the functional module involves further interactive functions after an action, the same logic can be applied.

The interaction module of the AR glasses is not limited to opening or closing the corresponding functional module; it can also control the movement of the canvas. When the wearer does not use head movement to control canvas movement, a canvas-movement interaction operation can be input to the interaction module instead, including controlling horizontal movement, vertical movement and rotational movement of the canvas, so that as the canvas moves, more views can still be presented directly in front of the wearer's eyes.

This implementation combines the interaction module with head control, enriching the interaction modes of the AR glasses, making the user's choice of interaction more flexible, and improving the user's AR experience.

Specifically, it should be noted that, after the wearer uses the interaction module to realize the interaction control on the AR glasses, the reference position of the AR glasses needs to be calibrated again. For example, when the wearer needs to control the canvas by head movement after sliding the canvas with the interaction module or changing the positions of the functional modules in the canvas, the gyroscope sensor 4 may be used to calibrate the current position of the AR glasses as a new reference position, and then as the wearer's head rotates, the rotation angle of the AR glasses based on the new reference position is acquired, so that the new canvas moves again in response to the rotation angle.

When re-calibrating the reference position of the AR glasses, the wearer's head may be in a rotated state or facing straight ahead. After the wearer rotates the head to select the required view, the interaction module intervenes to perform interactive control on the AR glasses; at this moment the canvas does not move with head rotation, so the wearer can rotate the head back to the forward-facing position and experience the current view more comfortably. In this state, re-calibrating the reference position of the AR glasses allows head control of canvas movement to continue. It can also be understood that after the wearer rotates the head to select the required view and the interaction module performs interactive control, for example opening a new menu, the reference position of the AR glasses can be re-calibrated immediately, so that the canvas of the new menu can again be moved by head control.
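The re-calibration flow can be sketched as follows; the pose representation (a single angle) and class name are assumptions chosen to keep the sketch minimal:

```python
# Sketch of reference re-calibration: after interactive control
# completes, the current pose becomes the new reference, and later
# rotations are measured relative to it. A single angle stands in
# for the full three-axis pose here.
class ReferenceCalibrator:
    def __init__(self):
        self.reference = 0.0  # calibrated at initialization

    def recalibrate(self, current_pose: float):
        """Called after the interaction module finishes its control,
        e.g. after a new menu is opened."""
        self.reference = current_pose

    def rotation_since_reference(self, current_pose: float) -> float:
        """Rotation angle driving canvas movement in the new context."""
        return current_pose - self.reference
```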

As shown in fig. 4, the interaction module includes:

the touch panel 6 is arranged on the AR glasses and used for acquiring the interactive operation of the user so as to enable the control unit 1 of the AR glasses to carry out interactive control on the display interface of the AR glasses based on the type of the interactive operation;

the voice interaction module 7 is arranged on the AR glasses, and when the interaction operation acquired by the touch panel 6 accords with the preset type, the control unit 1 starts the voice interaction module 7 to realize voice interaction control on the display interface of the AR glasses through the voice interaction module 7.

As shown in fig. 4, the interaction module further includes:

the controller 8 is arranged on the AR glasses, and the controller 8 is connected with the control unit 1;

and the remote controller 9 establishes remote communication connection with the control unit 1 of the AR glasses.

Specifically, the wearer may trigger an interaction operation through the touch panel 6, which can detect clicks (single click, double click, a preset number of clicks, and the like), sliding operations, long presses and so on. An interaction may also be triggered through the controller 8, such as control buttons mounted on the AR glasses; by inputting speech to the voice interaction module 7; or by using the remote controller 9. It can be understood that these trigger modes may coexist, and the wearer can choose an appropriate one, further improving the flexibility of interaction.
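One way the parallel trigger paths (touch panel, controller, voice, remote) could funnel into a single interaction handler is sketched below; the event shape and names are illustrative assumptions, not part of this disclosure:

```python
# Sketch of routing trigger events from any input source to a common
# handler table. Event fields and handler keys are assumptions.
def dispatch(event: dict, handlers: dict):
    """Route a trigger event, e.g. {"source": "touch", "type":
    "double_tap"}, to its registered handler; ignore unknown events."""
    key = (event["source"], event["type"])
    handler = handlers.get(key)
    return handler(event) if handler else None
```

Keeping one handler table lets the coexisting trigger modes map onto the same functional-module actions without duplicating control logic.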

As shown in fig. 4, the interaction module further includes:

an acceleration sensor 10 for acquiring a displacement acceleration of the AR glasses;

the acceleration comparison module 11 is connected with the control unit 1 and used for comparing the acquired displacement acceleration with a preset displacement acceleration range; and if the displacement acceleration is within the displacement acceleration range, the control unit 1 is used for carrying out interactive control on the display interface of the AR glasses.

Specifically, in this embodiment, an interaction with the functional module is performed through the head motion of the wearer. When a functional module has been selected, the wearer can perform a rapid nodding action; the interaction module then acquires the displacement acceleration of the AR glasses on the wearer's head, and when this acceleration is within a preset range, the selected functional module is considered to be executed. That is, this embodiment executes the selected functional module through head movement alone: both canvas movement and module execution are controlled by the wearer's head, freeing both hands. To improve the wearing experience and reduce action conflicts, this interaction mode may be restricted to environments where the canvas moves only in the horizontal direction, avoiding the conflict between head pitching and nodding that would make the glasses inconvenient to use if vertical canvas motion were also enabled.

The rotation angles acquired by the gyro sensor 4 are a horizontal rotation angle, a vertical rotation angle and a deflection angle. It can be understood that in a Cartesian coordinate system, in this embodiment the horizontal rotation angle is the angle of rotation about the Z axis, the vertical rotation angle is the angle of rotation about the Y axis, and the deflection angle is the angle of rotation about the X axis.

As shown in fig. 4, the displacement calculation module 3 is configured to:

determining the displacement distance and the moving direction of the canvas in the horizontal direction according to the horizontal rotating angle;

determining the displacement distance and the moving direction of the canvas in the vertical direction according to the vertical rotating angle;

the rotation angle and the rotation direction of the canvas in the deflection direction are determined according to the deflection angle.

Specifically, it should be noted that head movement comprises rotation in three directions: rotation about the Z axis (the horizontal rotation angle), rotation about the Y axis (the vertical rotation angle) and rotation about the X axis (the deflection angle). To make full use of head movement for multiple interaction modes, this embodiment therefore uses the horizontal rotation angle, the vertical rotation angle and the deflection angle of the head.

As shown in FIG. 4, the canvas movement module 2 is used to:

according to the horizontal rotation angle, the canvas is controlled to move towards the direction opposite to the horizontal rotation angle, and the movement control of the canvas in the horizontal direction is realized;

controlling the canvas to move towards the direction opposite to the vertical rotation angle according to the vertical rotation angle, so as to realize the movement control of the canvas in the vertical direction;

and controlling the canvas to rotate towards the direction opposite to the deflection angle according to the deflection angle, so as to realize the movement control of the canvas in the deflection direction.

As shown in fig. 4, the AR glasses further include:

a canvas size obtaining module 14, configured to obtain a length and a height of a currently displayed canvas;

a field angle size obtaining module 15, configured to obtain the length and height of the field angle of the AR glasses;

the size comparison module 13 is used for comparing the length and the height of the canvas with the length and the height of the field angle of the AR glasses respectively;

a comparison result output module 12, configured to output a comparison result between the length of the canvas and the length of the AR glasses field angle, and output a comparison result between the width of the canvas and the width of the AR glasses field angle;

the control unit 1 selects whether to activate the movement control of the canvas in the horizontal direction according to the length comparison result and selects whether to activate the movement control of the canvas in the vertical direction according to the width comparison result.

Specifically, this describes a mode in which movement control of the canvas in the vertical direction is not permanently resident on the AR glasses. When the display area of the canvas in the vertical direction does not exceed the display range of the AR glasses (that is, when the width of the canvas is no greater than the width of the field angle), the wearer can browse the entire view on the canvas through horizontal movement alone, without moving the canvas vertically. And because in this case the data that the gyro sensor 4 needs to acquire and compute is reduced, energy consumption can be lowered and processing efficiency improved.

The AR glasses activate movement control of the canvas in the vertical direction only when the width of the canvas is greater than the width of the field angle, at which time the canvas may be controlled to move in the vertical direction by the head pitch motion.

It will be appreciated that for a length of the canvas that is less than or equal to the length of the field angle, the AR glasses may also not enable movement control of the canvas in the horizontal direction, but only activate movement control of the canvas in the vertical direction.

As shown in fig. 4, the AR glasses further include:

the motion filtering module 5 is used for acquiring the rotation angle and the rotation time of the AR glasses and comparing the rotation angle and the rotation time with a preset minimum rotation angle threshold and a preset minimum rotation time threshold respectively;

when the rotation angle is smaller than the minimum rotation angle threshold, the canvas moving module 2 stops moving the canvas; and/or,

the canvas movement module 2 stops moving the canvas when the turn time is less than the minimum turn time threshold.

Specifically, it should be noted that when the wearer uses the AR glasses, the head may shake slightly, and the gyro sensor 4 may also suffer signal interference, either of which can cause unintended operations. To reduce such misoperation, this embodiment adopts a filtering algorithm to filter out rotation angles produced by slight shaking of the AR glasses and by signal interference. In this way, misoperation and the possibility of mistaken triggering are reduced, and the wearer's experience is improved.

The AR glasses further comprise a display module, a transmission module, sensors, a battery module and a control module. The display module comprises one or two lens modules for AR display, and also includes, without limitation, indicator lamps and displays other than the AR display module. The display module realizes display of the AR video signal and of the working state of the AR glasses, such as battery level and the current working mode.

The transmission module enables wireless network transmission including, without limitation, one or more of Wi-Fi, Bluetooth, RF, mobile network, and the like.

The sensors include, in addition to the gyro sensor 4, one or more of, but not limited to, an RGB camera, a TOF camera, a laser radar, a microphone, a gravity accelerometer, a geomagnetic sensor, a distance sensor, a speaker, and the like.

The battery module comprises a battery and a power management part, and realizes power supply and charging and discharging management of the AR glasses.

The control module includes the computing functions (including but not limited to CPU, memory and storage) and the user interaction controls of the AR glasses. The user interaction controls comprise keys, the touch panel 6, vibration sensing, the remote controller 9 and the like, and realize control of the working state of the AR glasses, such as switching on and off and modifying settings.

The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.
