Human-computer interaction method and device for augmented reality

Document No.: 1860754 · Publication date: 2021-11-19

Reading note: this technology, "Human-computer interaction method and device for augmented reality", was designed and created by Chen Gang, Xu Xin, and Qian Guangpu on 2021-08-20. Summary: The invention discloses a human-computer interaction method and device for augmented reality, used to integrate gesture operation from AR technology into intraoperative navigation; it enables more convenient and intuitive human-computer interaction during surgery and ensures the sterility of that interaction. The method comprises: determining a target object operated on by a user of an Augmented Reality (AR) device; in response to a gesture instruction of the user, displaying a virtual object of the target object and a ring menu corresponding to the virtual object in a virtual space constructed by the AR device, wherein the ring menu is used to display command items related to the virtual object; and, in response to an opening instruction triggered by the user interacting with a command item via a gesture in the virtual space, performing, on the virtual object, the operation of the command item corresponding to the opening instruction.

1. A human-computer interaction method for augmented reality, characterized by comprising the following steps:

determining a target object operated by a user using an Augmented Reality (AR) device;

in response to the gesture instruction of the user, displaying a virtual object of the target object and a ring menu corresponding to the virtual object in a virtual space constructed by the AR device, wherein the ring menu is used for displaying a command item related to the virtual object;

in response to an opening instruction triggered by the user interacting with the command item via the gesture in the virtual space, performing, on the virtual object, the operation of the command item corresponding to the opening instruction.

2. The method of claim 1, wherein an area inside the ring of the ring menu is used to display part of the virtual object or all of the virtual object of the target object.

3. The method of claim 1, wherein displaying a ring-shaped menu corresponding to the virtual object in the virtual space constructed by the AR device comprises:

determining a first position of the user's eyes and a second position of the virtual object in the virtual space;

displaying the ring-shaped menu corresponding to the virtual object between the first position and the second position.

4. The method of claim 3, wherein displaying the ring menu corresponding to the virtual object between the first location and the second location comprises:

displaying the ring menu at a third position on the line segment determined by the first position and the second position, the third position being at a preset distance from the first position;

wherein a distance between the third position and the first position is less than a distance between the third position and the second position.

5. The method of claim 1, wherein the opening instruction comprises:

an opening instruction triggered after the user touches the command item in the virtual space with the gesture; and/or

an opening instruction triggered by the user interacting with the command item in the virtual space using a movement gesture, wherein the movement direction of the movement gesture points from the center of the ring menu toward the command item on the ring menu.

6. The method according to any one of claims 1 to 5, wherein after the virtual object of the target object and the ring menu corresponding to the virtual object are displayed in the virtual space constructed by the AR device in response to the gesture instruction of the user, the method further comprises:

determining a gesture position of the gesture instruction in the virtual space;

determining an alternative position corresponding to the gesture instruction according to the gesture position;

displaying a thumbnail menu corresponding to the ring menu at the alternative position, wherein the thumbnail menu hides all the command items displayed on the ring menu.

7. The method of claim 6, wherein after the thumbnail menu corresponding to the ring menu is displayed at the alternative position, the method further comprises:

in response to an opening instruction triggered by the user interacting with the command item in the virtual space using a movement gesture, displaying, on the thumbnail menu, the movement direction of the movement gesture and the command item on the ring menu to which it points; and/or

in response to an opening instruction triggered by the user interacting with the command item in the virtual space using a movement gesture, displaying the movement direction of the movement gesture on the ring menu.

8. An augmented reality human-computer interaction device comprising a processor and a memory, the memory storing a program executable by the processor, the processor being configured to read the program in the memory and perform the steps of:

determining a target object operated by a user using an Augmented Reality (AR) device;

in response to the gesture instruction of the user, displaying a virtual object of the target object and a ring menu corresponding to the virtual object in a virtual space constructed by the AR device, wherein the ring menu is used for displaying a command item related to the virtual object;

in response to an opening instruction triggered by the user interacting with the command item via the gesture in the virtual space, performing, on the virtual object, the operation of the command item corresponding to the opening instruction.

9. The human-computer interaction device according to claim 8, wherein the area inside the ring of the ring menu is used to display part or all of the virtual object of the target object.

10. A computer storage medium having a computer program stored thereon, the program, when executed by a processor, implementing the steps of the method according to any one of claims 1 to 7.

Technical Field

The invention relates to the technical field of augmented reality, and in particular to a human-computer interaction method and device for augmented reality.

Background

At present, human-computer interaction during surgery is mainly realized through terminal-based interface interaction logic. Because surgery imposes relatively strict requirements on the environment, this interaction mode is too cumbersome and makes operation inconvenient for the surgeon.

Gesture operation is one of the main human-computer interface interaction modes in Augmented Reality (AR) technology, with the advantages of convenience, intuitiveness, and sterility. However, conventional interface interaction logic is relatively cumbersome, and its tree structure is ill-suited to AR applications: on the one hand, a large menu occupying the terminal screen may block the AR user's field of view; on the other hand, conventional interface interaction logic requires the operation to be executed on an object to be found layer by layer in a multilevel menu, and this search process is unrelated to the object itself, which hinders direct interaction between the AR user and the real object.

Disclosure of Invention

The invention provides a human-computer interaction method and device for augmented reality, which integrate gesture operation from AR technology into intraoperative navigation, enable more convenient and intuitive human-computer interaction during surgery, and ensure the sterility of that interaction.

In a first aspect, a method for human-computer interaction for augmented reality provided in an embodiment of the present invention includes:

determining a target object operated by a user using an Augmented Reality (AR) device;

in response to the gesture instruction of the user, displaying a virtual object of the target object and a ring menu corresponding to the virtual object in a virtual space constructed by the AR device, wherein the ring menu is used for displaying a command item related to the virtual object;

in response to an opening instruction triggered by the user interacting with the command item via the gesture in the virtual space, performing, on the virtual object, the operation of the command item corresponding to the opening instruction.

The core idea of this embodiment is to display, in the AR scene, a virtual object of the target object to be operated on together with a ring menu related to that virtual object. Command items related to the target object are distributed on the ring menu, so the user can interact with them directly in the AR scene and open a command item by triggering it after a gesture interacts with it, thereby performing the corresponding operation on the virtual object. The way of opening command items provided in this embodiment is better suited to the AR scene: it replaces the traditional tree menu, is simpler and more convenient, and greatly improves the user's operating experience.

By displaying the command items related to the virtual object directly on the ring menu, this embodiment adapts the menu to the AR scene: the user interacts directly with the command items and intuitively controls the operations performed on the virtual object. The interaction mode is simpler and more convenient, effectively improving the user's experience.

As an alternative embodiment, the area inside the ring of the ring menu is used to display part of the virtual object or all of the virtual object of the target object.

As an optional implementation, the displaying, in the virtual space constructed by the AR device, a ring-shaped menu corresponding to the virtual object includes:

determining a first position of the user's eyes and a second position of the virtual object in the virtual space;

displaying the ring-shaped menu corresponding to the virtual object between the first position and the second position.

As an optional implementation manner, the displaying the ring-shaped menu corresponding to the virtual object between the first position and the second position includes:

displaying the ring menu at a third position on the line segment determined by the first position and the second position, the third position being at a preset distance from the first position;

wherein a distance between the third position and the first position is less than a distance between the third position and the second position.
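The placement rule above reduces to placing the menu a fixed distance along the eye-to-object segment, closer to the eye than to the object. A minimal sketch in Python, assuming metre units and a hypothetical preset offset (neither is specified by the source):

```python
import math

def menu_anchor(eye_pos, object_pos, preset_offset=0.45):
    """Place the ring menu on the eye-to-object segment at a preset
    distance from the eye (hypothetical helper; positions in metres)."""
    direction = [o - e for e, o in zip(eye_pos, object_pos)]
    length = math.sqrt(sum(d * d for d in direction))
    # Keep the menu closer to the eye than to the object, as required:
    # cap the offset just under the midpoint of the segment.
    offset = min(preset_offset, 0.49 * length)
    return [e + d / length * offset for e, d in zip(eye_pos, direction)]
```

With the eye at the origin and the object 2 m straight ahead, the menu lands 0.45 m in front of the eye; for a very close object the cap keeps it on the near half of the segment.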

As an optional implementation, the opening instruction includes:

an opening instruction triggered after the user touches the command item in the virtual space with the gesture; and/or

an opening instruction triggered by the user interacting with the command item in the virtual space using a movement gesture, wherein the movement direction of the movement gesture points from the center of the ring menu toward the command item on the ring menu.
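The movement-gesture trigger amounts to mapping a direction vector measured from the ring centre onto one of the command items. A sketch under assumed geometry (an evenly divided ring, clockwise from the top; the source fixes no concrete layout, and the command names are illustrative):

```python
import math

def command_from_direction(dx, dy, commands):
    """Map a movement-gesture direction (measured from the ring centre)
    to the command item lying in that direction. Items are assumed to be
    evenly spaced clockwise starting at the top of the ring."""
    # Angle measured clockwise from "up", with +y pointing down (screen coords).
    angle = math.atan2(dx, -dy) % (2 * math.pi)
    sector = 2 * math.pi / len(commands)
    # Snap to the nearest sector centre.
    return commands[int((angle + sector / 2) // sector) % len(commands)]
```

For a four-item ring, a gesture moving straight up from the centre selects the top item, a rightward gesture the right-hand item, and so on.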

As an optional implementation, after the virtual object of the target object and the ring menu corresponding to the virtual object are displayed in the virtual space constructed by the AR device in response to the gesture instruction of the user, the method further includes:

determining a gesture position of the gesture instruction in the virtual space;

determining an alternative position corresponding to the gesture instruction according to the gesture position;

displaying a thumbnail menu corresponding to the ring menu at the alternative position, wherein the thumbnail menu hides all the command items displayed on the ring menu.

As an optional implementation manner, after the thumbnail menu corresponding to the ring menu is displayed at the alternative position, the method further includes:

in response to an opening instruction triggered by the user interacting with the command item in the virtual space using a movement gesture, displaying, on the thumbnail menu, the movement direction of the movement gesture and the command item on the ring menu to which it points; and/or

in response to an opening instruction triggered by the user interacting with the command item in the virtual space using a movement gesture, displaying the movement direction of the movement gesture on the ring menu.

In a second aspect, an augmented reality human-computer interaction device provided in an embodiment of the present invention includes a processor and a memory, where the memory is used to store a program executable by the processor, and the processor is used to read the program in the memory and execute the following steps:

determining a target object operated by a user using an Augmented Reality (AR) device;

in response to the gesture instruction of the user, displaying a virtual object of the target object and a ring menu corresponding to the virtual object in a virtual space constructed by the AR device, wherein the ring menu is used for displaying a command item related to the virtual object;

in response to an opening instruction triggered by the user interacting with the command item via the gesture in the virtual space, performing, on the virtual object, the operation of the command item corresponding to the opening instruction.

As an alternative embodiment, the area inside the ring of the ring menu is used to display part of the virtual object or all of the virtual object of the target object.

As an alternative embodiment, the processor is configured to perform:

determining a first position of the user's eyes and a second position of the virtual object in the virtual space;

displaying the ring-shaped menu corresponding to the virtual object between the first position and the second position.

As an alternative embodiment, the processor is configured to perform:

displaying the ring menu at a third position on the line segment determined by the first position and the second position, the third position being at a preset distance from the first position;

wherein a distance between the third position and the first position is less than a distance between the third position and the second position.

As an optional implementation, the opening instruction includes:

an opening instruction triggered after the user touches the command item in the virtual space with the gesture; and/or

an opening instruction triggered by the user interacting with the command item in the virtual space using a movement gesture, wherein the movement direction of the movement gesture points from the center of the ring menu toward the command item on the ring menu.

As an optional implementation, after the virtual object of the target object and the ring menu corresponding to the virtual object are displayed in the virtual space constructed by the AR device in response to the gesture instruction of the user, the processor is further configured to perform:

determining a gesture position of the gesture instruction in the virtual space;

determining an alternative position corresponding to the gesture instruction according to the gesture position;

displaying a thumbnail menu corresponding to the ring menu at the alternative position, wherein the thumbnail menu hides all the command items displayed on the ring menu.

As an optional implementation manner, after the thumbnail menu corresponding to the ring menu is displayed at the alternative position, the processor is further configured to perform:

in response to an opening instruction triggered by the user interacting with the command item in the virtual space using a movement gesture, displaying, on the thumbnail menu, the movement direction of the movement gesture and the command item on the ring menu to which it points; and/or

in response to an opening instruction triggered by the user interacting with the command item in the virtual space using a movement gesture, displaying the movement direction of the movement gesture on the ring menu.

In a third aspect, an embodiment of the present invention further provides a human-computer interaction device for augmented reality, where the device includes:

a determination unit configured to determine a target object operated by a user using an Augmented Reality (AR) device;

a display unit, configured to display, in response to a gesture instruction of the user, a virtual object of the target object and a ring-shaped menu corresponding to the virtual object in a virtual space constructed by the AR device, where the ring-shaped menu is used to display a command item related to the virtual object;

an operation unit, configured to, in response to an opening instruction triggered by the user interacting with the command item via the gesture in the virtual space, perform, on the virtual object, the operation of the command item corresponding to the opening instruction.

As an alternative embodiment, the area inside the ring of the ring menu is used to display part of the virtual object or all of the virtual object of the target object.

As an optional implementation manner, the display unit is specifically configured to:

determining a first position of the user's eyes and a second position of the virtual object in the virtual space;

displaying the ring-shaped menu corresponding to the virtual object between the first position and the second position.

As an optional implementation manner, the display unit is specifically configured to:

displaying the ring menu at a third position on the line segment determined by the first position and the second position, the third position being at a preset distance from the first position;

wherein a distance between the third position and the first position is less than a distance between the third position and the second position.

As an optional implementation, the opening instruction includes:

an opening instruction triggered after the user touches the command item in the virtual space with the gesture; and/or

an opening instruction triggered by the user interacting with the command item in the virtual space using a movement gesture, wherein the movement direction of the movement gesture points from the center of the ring menu toward the command item on the ring menu.

As an optional implementation manner, the device further includes a thumbnail display unit configured, after the virtual object of the target object and the ring menu corresponding to the virtual object are displayed in the virtual space constructed by the AR device in response to the gesture instruction of the user, to:

determining a gesture position of the gesture instruction in the virtual space;

determining an alternative position corresponding to the gesture instruction according to the gesture position;

displaying a thumbnail menu corresponding to the ring menu at the alternative position, wherein the thumbnail menu hides all the command items displayed on the ring menu.

As an optional implementation manner, after the thumbnail menu corresponding to the ring menu is displayed at the alternative position, the thumbnail display unit is further configured to:

in response to an opening instruction triggered by the user interacting with the command item in the virtual space using a movement gesture, displaying, on the thumbnail menu, the movement direction of the movement gesture and the command item on the ring menu to which it points; and/or

in response to an opening instruction triggered by the user interacting with the command item in the virtual space using a movement gesture, displaying the movement direction of the movement gesture on the ring menu.

In a fourth aspect, an embodiment of the present invention further provides a computer storage medium, on which a computer program is stored, where the computer program is used to implement the steps of the method in the first aspect when the computer program is executed by a processor.

These and other aspects of the present application will be more readily apparent from the following description of the embodiments.

Drawings

In order to illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; those skilled in the art can obtain other drawings from them without inventive effort.

FIG. 1 is a flowchart of an implementation of a human-computer interaction method for augmented reality according to an embodiment of the present invention;

FIG. 2 is a schematic view of an AR scene for issuing a gesture instruction according to an embodiment of the present invention;

FIG. 3 is a diagram illustrating a ring menu display command item according to an embodiment of the present invention;

FIG. 4A is a diagram illustrating a ring menu displaying a portion of a virtual object according to an embodiment of the present invention;

FIG. 4B is a diagram illustrating a ring menu for displaying all virtual objects according to an embodiment of the present invention;

FIG. 5 is a diagram illustrating a first open command item according to an embodiment of the present invention;

FIG. 6 is a diagram illustrating a second opening command item according to an embodiment of the present invention;

FIG. 7 is a diagram illustrating a thumbnail menu according to an embodiment of the present invention;

FIG. 8 is a schematic diagram illustrating a moving direction according to an embodiment of the present invention;

FIG. 9 is a flowchart illustrating an implementation of a detailed AR human-computer interaction method according to an embodiment of the present invention;

FIG. 10 is a schematic diagram of an augmented reality human-computer interaction device according to an embodiment of the present invention;

FIG. 11 is a schematic diagram of a human-computer interaction device for augmented reality according to an embodiment of the present invention.

Detailed Description

In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art from the embodiments herein without creative effort shall fall within the protection scope of the present invention.

The term "and/or" in the embodiments of the present invention describes an association between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate that A exists alone, that A and B exist simultaneously, or that B exists alone. The character "/" generally indicates an "or" relationship between the associated objects before and after it.

The application scenarios described in the embodiments of the present invention are intended to illustrate the technical solutions more clearly and do not limit them; as a person skilled in the art will appreciate, with the emergence of new application scenarios the technical solutions provided in the embodiments remain applicable to similar technical problems. In the description of the present invention, the term "plurality" means two or more unless otherwise specified.

Embodiment 1. Conventional interface interaction logic cannot realize direct interaction with an object: the process of finding an operation is unrelated to the object itself, and for any object the required operation must be searched for in a multilevel menu, which makes it hard for an AR user to interact directly with real objects. For example, when a user sees two interactive virtual objects within hand's reach on a desktop in an AR scene, the user inherently wants to interact with the virtual objects directly rather than select the operation to be executed from a menu completely detached from them. The conventional menu is therefore unsuitable for AR applications and for an AR human-computer interaction interface.

At present, human-computer interaction during surgery is mainly realized through terminal-based interface interaction logic; because surgery imposes relatively strict requirements on the environment, this interaction mode is too cumbersome and makes operation inconvenient for the surgeon. To overcome these defects, the present application builds, on the basis of gesture operation and human-computer interaction in the AR scene, a ring menu for the target object being operated on, so that interaction between the target object and the command items on the ring menu in the AR scene realizes the relevant command operations on the virtual object of the target object.

As shown in FIG. 1, this embodiment provides a human-computer interaction method for augmented reality; the implementation flow of the method is as follows:

Step 100, determining a target object operated by a user using an Augmented Reality (AR) device;

the target object in this embodiment includes, but is not limited to, a human body part, a medical instrument, an electronic device, and the like, and after the target object is determined in this embodiment, it is necessary to acquire a virtual object of the target object, where the virtual object is determined according to a structure, a volume, a weight, and the like of the target object, and the virtual object is used to present an object that is consistent with an internal and external structure of the target object and has the same volume and weight in a virtual space. And the virtual object can implement operations of scaling, rotating, moving, etc.

The AR device in this embodiment includes, but is not limited to, at least one of AR glasses, an AR processor, a sensor, and an identification body with AR markers.

In some examples, the manner in which the present embodiment determines the target object is as follows:

detecting, through a sensor in the AR device for tracking eyeballs, the line-of-sight direction of a user wearing the AR glasses of the AR device;

along the line-of-sight direction, determining the object that the user is viewing in the user's environment, and taking that object as the target object.

In implementation, after the user puts on the AR glasses, the rotation of the eyeballs can be detected by the sensor arranged on the AR glasses, so that the line-of-sight direction of the eyes is sensed; the object the user is viewing in the surrounding environment is determined along that direction and taken as the target object to be operated on.
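One way to picture this gaze-based selection is an angular nearest-object test. This is an illustrative sketch only: a real AR device would ray-cast against its reconstructed scene meshes, and the object names and angular threshold below are assumptions.

```python
import math

def pick_target(eye_pos, gaze_dir, objects, max_angle_deg=5.0):
    """Pick the scene object whose centre lies closest to the detected
    line of sight. `objects` maps a name to a 3-D centre point; returns
    None when nothing falls within the angular threshold."""
    glen = math.sqrt(sum(g * g for g in gaze_dir))
    gaze = [g / glen for g in gaze_dir]
    best, best_angle = None, math.radians(max_angle_deg)
    for name, centre in objects.items():
        to_obj = [c - e for e, c in zip(eye_pos, centre)]
        olen = math.sqrt(sum(d * d for d in to_obj))
        cos_ang = sum(g * d / olen for g, d in zip(gaze, to_obj))
        angle = math.acos(max(-1.0, min(1.0, cos_ang)))
        if angle < best_angle:
            best, best_angle = name, angle
    return best
```

An object straight ahead along the gaze is selected; an object well off-axis is ignored, which is the "object the user is viewing" test in miniature.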

It should be noted that the AR device in this embodiment can scan the environment where the user is located and obtain information such as the position, size, and structure of each object in it, so as to construct a virtual space of that environment and a virtual model of each object within it. The constructed virtual space may be three-dimensional, and the virtual models may be three-dimensional models. The user can interact directly with each virtual model in the virtual space constructed by the AR device.

Step 101, in response to a gesture instruction of the user, displaying a virtual object of the target object and a ring menu corresponding to the virtual object in the virtual space constructed by the AR device, wherein the ring menu is used to display command items related to the virtual object;

In implementation, after putting on the AR glasses, the user can intuitively see the virtual space corresponding to the real environment; after the user issues a gesture instruction, the virtual object of the target object the user is currently viewing and the ring menu corresponding to that virtual object can be displayed.

It should be noted that the gesture instructions in this embodiment denote different instructions triggered by different gestures of the user; for example, the user may trigger the display of a target object's virtual object and its ring menu through a three-finger gesture. The specific gestures can be designed according to user requirements, and this embodiment places no particular limitation on them.

As shown in FIG. 2, in the AR scene for issuing a gesture instruction provided in this embodiment, after the user issues the gesture instruction, a virtual object of the target object is displayed in the virtual space constructed by the AR device at the same position as the target object, and a ring menu is displayed at the position of the virtual object. This way of displaying the virtual object and the ring menu suits the user's direct interaction with objects in the AR scene: during the interaction the user only needs to issue a gesture instruction, and the virtual object and ring menu of the target object appear at the object on which the line of sight rests.

As shown in FIG. 3, command items related to the virtual object, such as zooming, moving, panning, opening, and closing, are displayed directly on the ring menu in this embodiment. The command items are evenly distributed along the ring of the menu in a clockwise or counterclockwise direction. In implementation, the command items on the ring menu can also be rotated clockwise or counterclockwise along the ring under gesture control.

In some examples, in response to a rotation gesture of the user, the command items on the ring menu are moved clockwise or counterclockwise; the command items displayed before the movement are hidden, and other command items are displayed in turn in a preset order.
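The even clockwise distribution and gesture-controlled rotation described above can be sketched as follows; the radius, the phase parameter, and the command names are illustrative assumptions rather than anything fixed by the embodiment:

```python
import math

def ring_layout(commands, radius=0.15, phase_deg=0.0):
    """Lay out command items evenly on the menu ring, clockwise from
    the top. Changing `phase_deg` shifts every item along the ring,
    which is one way to realise the rotation gesture described above."""
    n = len(commands)
    layout = {}
    for i, cmd in enumerate(commands):
        theta = math.radians(phase_deg) + 2 * math.pi * i / n
        # Clockwise placement in an x-right / y-up menu plane.
        layout[cmd] = (radius * math.sin(theta), radius * math.cos(theta))
    return layout
```

With four items and phase 0, the first item sits at the top of the ring and the second a quarter-turn clockwise; animating `phase_deg` rotates all items together.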

It should be noted that the command items corresponding to different virtual objects may differ. To facilitate operation, this embodiment displays the command items relevant to the virtual object on the ring menu in a targeted manner, so the user does not need to locate the corresponding command items step by step in a general menu and can interact with the virtual object directly.

As shown in fig. 4A and 4B, the area inside the ring of the ring menu in this embodiment is used to display part or all of the virtual object of the target object. The ring menu does not occlude the virtual object, so the user can visually follow the changes of the virtual object while interacting with it through the command items, which improves the operation experience.

Step 102, responding to an opening instruction triggered by the interaction of the user in the virtual space by utilizing the gesture and the command item, and performing the operation of the command item corresponding to the opening instruction on the virtual object.

After the ring-shaped menu of the virtual object is displayed in this embodiment, the user can interact with a command item directly by gesture in the virtual space, thereby triggering the opening of that command item and performing the corresponding operation on the virtual object. Throughout the process, the user interacts with the virtual object through simple gestures on the command items alone, and the virtual object, the ring menu, and the command items are all visible at the same time, so the user interacts with the virtual object intuitively and the interaction experience is effectively improved.

In some examples, the ring menu in this embodiment is displayed near the user along the user's line of sight, so that the user can directly touch the ring menu by hand to trigger a command item. This is implemented as follows:

Step 1, determining a first position of the user's eyes and a second position of the virtual object in the virtual space;

Step 2, displaying the ring-shaped menu corresponding to the virtual object between the first position and the second position.

In implementation, the ring menu is displayed at a third position on the line segment determined by the first position and the second position, at a preset distance from the first position;

wherein a distance between the third position and the first position is less than a distance between the third position and the second position.

In some examples, the coordinates of the ring menu in the virtual space are obtained by interpolating along the segment from the eye to the virtual object. Let the coordinates of the eye (first position) be A(x1, y1, z1), the coordinates of the virtual object (second position) be B(x2, y2, z2), and E be the preset value, and let Δt = E / |AB| be the fraction of the segment corresponding to the preset distance. The coordinates of the third position are then

C(x3, y3, z3) = (x1 + (x2 - x1)·Δt, y1 + (y2 - y1)·Δt, z1 + (z2 - z1)·Δt).
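Assuming Δt is the fraction E / |AB| of the eye-object segment as above, the placement rule can be sketched in Python; the clamp to 0.5 is an added guard, not from the source, that keeps the third position closer to the eye than to the object, as the preceding passage requires:

```python
import math

def menu_position(eye, obj, preset_distance):
    """Place the ring menu on the eye-object segment, preset_distance from the eye.

    eye and obj are (x, y, z) tuples: A (first position) and B (second position).
    dt = E / |AB| is clamped so the menu never ends up nearer the object
    than the eye.
    """
    ax, ay, az = eye
    bx, by, bz = obj
    length = math.dist(eye, obj)
    dt = min(preset_distance / length, 0.5)  # keep C closer to the eye than to the object
    return (ax + (bx - ax) * dt,
            ay + (by - ay) * dt,
            az + (bz - az) * dt)
```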

In some examples, the present embodiment may trigger the opening instruction for opening the command item by interacting with the command item through a gesture, and specifically includes any one or any multiple of the following interaction manners:

Mode 1, receiving an opening instruction triggered after the user touches the command item with the gesture in the virtual space;

in implementation, as shown in fig. 5, in this embodiment, a user may directly touch a command item on a ring menu in a virtual space, so as to trigger opening of the command item. For example, the user directly touches a moving command on the ring menu, and the virtual object is controlled to move to the designated position.

Mode 2, receiving an opening instruction interactively triggered by the user in the virtual space by using the moving gesture and the command item, wherein the moving direction of the moving gesture points to the command item on the annular menu from the center of the annular menu.

In implementation, as shown in fig. 6, the user can trigger a command item on the ring menu by the direction of a moving gesture: after making the gesture, the user moves the hand toward the command item to be opened, which triggers the opening of that command item.
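Mode 2 can be sketched by mapping the direction from the ring centre to the hand onto one of the evenly spaced command items; working in the menu's 2-D plane with item 0 on the positive x axis is an assumption for illustration:

```python
import math

def pointed_item(center, hand_pos, items, clockwise=True):
    """Select the command item a moving gesture points at.

    center and hand_pos are (x, y) points in the menu plane; items are
    assumed evenly spaced around the ring, item 0 on the +x axis, laid
    out clockwise by default. The direction centre -> hand is snapped
    to the nearest item sector.
    """
    dx = hand_pos[0] - center[0]
    dy = hand_pos[1] - center[1]
    angle = math.atan2(dy, dx)
    if clockwise:
        angle = -angle  # measure in the layout direction
    n = len(items)
    sector = 2 * math.pi / n
    index = int(((angle + sector / 2) % (2 * math.pi)) // sector)
    return items[index]
```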

In some examples, to improve the interaction experience, a thumbnail menu is also displayed at the position where the user makes the gesture. The specific implementation steps are as follows:

Step 1, in response to the gesture instruction of the user, after the virtual object of the target object and the ring-shaped menu corresponding to the virtual object are displayed in the virtual space constructed by the AR device, determining a gesture position of the gesture instruction in the virtual space;

Step 2, determining an alternative position corresponding to the gesture instruction according to the gesture position;

In implementation, the alternative position may be the gesture position itself, or a position close to the gesture position, determined according to the operation requirements of the user; this embodiment does not limit this.

Step 3, displaying a thumbnail menu corresponding to the ring-shaped menu at the alternative position, wherein the thumbnail menu hides all the command items displayed on the ring menu.
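The choice of alternative position in Step 2 can be as simple as offsetting the gesture position slightly; the small upward offset used here is purely illustrative, since the source leaves the exact choice to the operation requirements of the user:

```python
def candidate_position(gesture_pos, offset=(0.0, 0.05, 0.0)):
    """Pick where to display the thumbnail menu.

    Returns the gesture position itself when offset is zero, or a point
    close to it; the default 5 cm upward shift is an assumed value that
    keeps the thumbnail menu from being occluded by the hand.
    """
    return tuple(g + o for g, o in zip(gesture_pos, offset))
```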

In implementation, as shown in fig. 7, at the candidate position corresponding to the gesture position, a thumbnail menu corresponding to the ring menu is displayed, wherein the thumbnail menu is spherical in shape.

In some examples, after the thumbnail menu corresponding to the ring menu is displayed at the alternative position, in response to an opening instruction triggered by the user interacting with the gesture and the command item in the virtual space, the method further includes any one of the following display modes:

mode 1, displaying the moving direction of the moving gesture and the command item on the ring menu pointed to on the thumbnail menu.

Mode 2, the movement direction of the moving gesture is displayed on the ring menu.

Mode 3, displaying the movement direction of the moving gesture and the command item on the ring menu pointed to on the thumbnail menu, and displaying the movement direction of the moving gesture on the ring menu.

As shown in fig. 8, after the user makes the gesture and moves it in a direction from the center of the ring menu toward a command item on the ring menu, the movement direction and the pointed-to command item can be displayed on the thumbnail menu, and the movement direction can be displayed on the ring menu.

In some examples, as shown in fig. 9, the present embodiment further provides a detailed AR human-computer interaction method, where an implementation flow of the method is as follows:

step 900, determining a target object operated by a user using the AR equipment;

step 901, responding to a gesture instruction of a user, and displaying a virtual object of a target object in a virtual space constructed by the AR equipment;

step 902, determining a first position of an eye of a user and a second position of a virtual object in a virtual space; displaying a ring-shaped menu of the virtual object at a third position which is away from the first position by a preset value on the line segment determined by the first position and the second position;

wherein the distance between the third position and the first position is smaller than the distance between the third position and the second position.

Step 903, determining a gesture position of the gesture instruction in a virtual space; determining an alternative position corresponding to the gesture instruction according to the gesture position; displaying a thumbnail menu corresponding to the annular menu at the alternative position;

wherein the thumbnail menu hides all of the command items displayed on the ring menu.

Step 904, responding to an opening instruction triggered after the user touches the command item in the virtual space by using a gesture, and performing an operation of the command item corresponding to the opening instruction on the virtual object;

step 905, responding to an opening instruction which is interactively triggered by a user in a virtual space by using a moving gesture and a command item;

wherein a moving direction of the moving gesture points from a center of the ring menu to the command item on the ring menu.

Step 906, displaying the moving direction of the moving gesture and the command item on the ring menu pointed to on the thumbnail menu, and displaying the moving direction of the moving gesture on the ring menu.

Embodiment 2

Based on the same inventive concept, an embodiment of the present invention further provides an augmented reality human-computer interaction device. Since this device is the device that performs the method in the embodiments of the present invention and solves the problem on a similar principle, the implementation of the device may refer to the implementation of the method, and repeated details are omitted.

As shown in fig. 10, the apparatus includes a processor 1000 and a memory 1001, where the memory 1001 is configured to store a program executable by the processor 1000, and the processor 1000 is configured to read the program in the memory 1001 and perform the following steps:

determining a target object operated by a user using the Augmented Reality (AR) device;

in response to the gesture instruction of the user, displaying a virtual object of the target object and a ring menu corresponding to the virtual object in a virtual space constructed by the AR device, wherein the ring menu is used for displaying a command item related to the virtual object;

responding to an opening instruction which is interactively triggered by the user in the virtual space by using the gesture and the command item, and operating the command item corresponding to the opening instruction on the virtual object.

As an alternative embodiment, the inner area of the ring-shaped menu is used to display part of the virtual object or all of the virtual object of the target object.

As an alternative embodiment, the processor 1000 is specifically configured to perform:

determining a first position of the user's eyes and a second position of the virtual object in the virtual space;

displaying the ring-shaped menu corresponding to the virtual object between the first position and the second position.

As an alternative embodiment, the processor 1000 is specifically configured to perform:

displaying the annular menu at a third position which is away from the first position by a preset value on a line segment determined by the first position and the second position;

wherein a distance between the third position and the first position is less than a distance between the third position and the second position.

As an optional implementation, the opening instruction includes:

an opening instruction triggered after the user touches the command item with the gesture in the virtual space; and/or,

an opening instruction interactively triggered by the user in the virtual space using the moving gesture and the command item, wherein the moving direction of the moving gesture points from the center of the ring menu to the command item on the ring menu.

As an optional implementation manner, after the responding to the gesture instruction of the user, displaying a virtual object of the target object and a ring menu corresponding to the virtual object in a virtual space constructed by the AR device, the processor 1000 is specifically further configured to perform:

determining a gesture position of the gesture instruction in the virtual space;

determining an alternative position corresponding to the gesture instruction according to the gesture position;

displaying a thumbnail menu corresponding to the ring menu at the alternative position, wherein the thumbnail menu hides all the command items displayed on the ring menu.

As an optional implementation manner, after displaying the thumbnail menu corresponding to the ring-shaped menu at the alternative position, the processor 1000 is further specifically configured to perform:

in response to an opening instruction interactively triggered by the user in the virtual space with the moving gesture and the command item, displaying, on the thumbnail menu, the movement direction of the moving gesture and the command item on the ring menu to which it points; and/or,

in response to an opening instruction interactively triggered by the user in the virtual space with the moving gesture and the command item, displaying the movement direction of the moving gesture on the ring menu.

Embodiment 3

Based on the same inventive concept, an embodiment of the present invention further provides an augmented reality human-computer interaction device. Since this device is the device that performs the method in the embodiments of the present invention and solves the problem on a similar principle, the implementation of the device may refer to the implementation of the method, and repeated details are omitted.

As shown in fig. 11, the apparatus includes:

a determining unit 1100, configured to determine a target object operated by a user using an Augmented Reality (AR) device;

a display unit 1101, configured to display, in response to a gesture instruction of the user, a virtual object of the target object and a ring menu corresponding to the virtual object in a virtual space constructed by the AR device, where the ring menu is used to display a command item related to the virtual object;

an operation unit 1102, configured to perform, in response to an opening instruction triggered by the user interacting with the command item in the virtual space by using a gesture, an operation on the command item corresponding to the opening instruction on the virtual object.

As an alternative embodiment, the inner area of the ring-shaped menu is used to display part of the virtual object or all of the virtual object of the target object.

As an optional implementation manner, the display unit 1101 is specifically configured to:

determining a first position of the user's eyes and a second position of the virtual object in the virtual space;

displaying the ring-shaped menu corresponding to the virtual object between the first position and the second position.

As an optional implementation manner, the display unit 1101 is specifically configured to:

displaying the annular menu at a third position which is away from the first position by a preset value on a line segment determined by the first position and the second position;

wherein a distance between the third position and the first position is less than a distance between the third position and the second position.

As an optional implementation, the opening instruction includes:

an opening instruction triggered after the user touches the command item with the gesture in the virtual space; and/or,

an opening instruction interactively triggered by the user in the virtual space using the moving gesture and the command item, wherein the moving direction of the moving gesture points from the center of the ring menu to the command item on the ring menu.

As an optional implementation manner, the apparatus further includes a thumbnail display unit configured to, after the virtual object of the target object and the ring-shaped menu corresponding to the virtual object are displayed in the virtual space constructed by the AR device in response to the gesture instruction of the user:

determining a gesture position of the gesture instruction in the virtual space;

determining an alternative position corresponding to the gesture instruction according to the gesture position;

displaying a thumbnail menu corresponding to the ring menu at the alternative position, wherein the thumbnail menu hides all the command items displayed on the ring menu.

As an optional implementation manner, after displaying the thumbnail menu corresponding to the ring menu at the alternative position, the thumbnail display unit is further configured to:

in response to an opening instruction interactively triggered by the user in the virtual space with the moving gesture and the command item, display, on the thumbnail menu, the movement direction of the moving gesture and the command item on the ring menu to which it points; and/or,

in response to an opening instruction interactively triggered by the user in the virtual space with the moving gesture and the command item, display the movement direction of the moving gesture on the ring menu.

Based on the same inventive concept, an embodiment of the present invention further provides a computer storage medium, on which a computer program is stored, which when executed by a processor implements the following steps:

determining a target object operated by a user using the Augmented Reality (AR) device;

in response to the gesture instruction of the user, displaying a virtual object of the target object and a ring menu corresponding to the virtual object in a virtual space constructed by the AR device, wherein the ring menu is used for displaying a command item related to the virtual object;

responding to an opening instruction which is interactively triggered by the user in the virtual space by using the gesture and the command item, and operating the command item corresponding to the opening instruction on the virtual object.

As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.

The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.

These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.

These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.

It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.
