User terminal for operating virtual key based on motion sensor

Document No.: 1566975    Publication date: 2020-01-24

Reading note: This technology, "User terminal for operating a virtual key based on a motion sensor", was designed and created by Lei Yang (雷飏) on 2016-11-02. Its main content is as follows: A portable user terminal equipped with a motion sensor, comprising: means for monitoring, via the motion sensor, the direction and speed with which the user moves the user terminal along the vertical, horizontal, and front-back dimensions, and for triggering a predefined virtual-key-related operation when a predefined action is matched. The virtual-key-related operations include: unlocking with a virtual key, locking with a virtual key, displaying a contact interface for the object associated with a nearby access control device, and displaying a control center interface for the object associated with the current virtual key package. Compared with the shake-to-unlock schemes of the prior art, the actions in this scheme are easier for users to learn and remember, and the functionality is richer.

1. A user terminal with a motion sensor, comprising:

means for monitoring, via the motion sensor, the direction and speed with which the user moves the user terminal along the vertical, horizontal, and front-back dimensions, and for triggering a predefined virtual-key-related operation when a predefined action is matched; the virtual-key-related operations comprising: unlocking with a virtual key, locking with a virtual key, displaying a contact interface for the object associated with a nearby access control device, and displaying a control center interface for the object associated with the current virtual key package;

further comprising:

means for recognizing a predefined action in which the front of the user terminal body faces the user's viewing angle and the body then reciprocates vertically more than once, and for then displaying, in two-dimensional-code form, a virtual key for unlocking a nearby access control device;

means for recognizing a predefined action in which the front of the user terminal body faces the user's viewing angle, the body then rotates clockwise by more than 60 degrees about the front-back axis, rotates back to the initial position, and reciprocates in this way more than once, and for then displaying, in two-dimensional-code form, a virtual key for locking a nearby access control device; and

means for recognizing a predefined action in which the front of the user terminal body faces the user's viewing angle, the body then rotates counterclockwise by more than 60 degrees about the front-back axis, rotates back to the initial position, and reciprocates in this way more than once, and for then displaying, in two-dimensional-code form, a virtual key for back-locking a nearby access control device.

2. The user terminal of claim 1, further comprising means for allowing a user to teach and set the range thresholds, in different dimensions, used for matching the predefined actions.

3. The user terminal of claim 1 or 2, further comprising means for recognizing a predefined action in which:

the front of the user terminal faces the user's viewing angle with the body pointing upward, and the body then tilts forward and downward by more than 60 degrees; and

means for performing an unlock operation using the virtual key after the predefined action is recognized.

4. The user terminal of claim 1 or 2, further comprising means for recognizing a predefined action in which:

the body of the user terminal is held approximately horizontal with its front facing upward, and the body is then tilted upward by more than 60 degrees toward the user's viewing angle; and

means for performing a lock operation using the virtual key after the predefined action is recognized.

5. The user terminal of claim 1 or 2, further comprising means for recognizing a predefined action in which:

the front of the user terminal body faces the user's viewing angle, and the body then rotates forward by more than 120 degrees about the vertical axis and rotates back to the initial orientation of the user terminal;

the front of the user terminal faces the user's viewing angle with the body close to vertical, the body then tilts forward and downward by more than 60 degrees, returns to the near-vertical orientation, and this motion is repeated two or more times; and

means for displaying a contact interface for the object associated with a nearby access control device after the predefined action is recognized.

6. The user terminal of claim 5, further comprising means for displaying a contact interface for the object associated with the most recently used virtual key package if no available access control device is found nearby.

7. The user terminal of claim 5, further comprising means for obtaining a temporary virtual key package from the background system if an access control device is found nearby but the user terminal holds no matching virtual key package, and for then displaying a contact interface for the object associated with that virtual key package.

8. The user terminal of claim 1 or 2, further comprising means for recognizing a predefined action in which: the front of the user terminal faces the user's viewing angle, the body then moves away from the user's body in a throwing-out-like motion, and the motion then stops; and

means for displaying a control center interface for the object associated with the current virtual key package after the predefined action is recognized.

9. The user terminal of claim 1 or 2, further comprising means for recognizing a predefined action in which: the front of the user terminal faces the user's viewing angle, the body then moves toward the user's body in a withdrawing-like motion, and the motion then stops; and

means for displaying, in icon form, the icon array of the virtual key package category to which the current virtual key package belongs after the predefined action is recognized; and

means for determining, if a virtual key package group is already displayed in icon form in the virtual key application, whether an icon array of the next virtual key package group is available for display, and for displaying the next group if so.

10. The user terminal according to any one of claims 1 to 3, further comprising:

means for recognizing a predefined action in which: the front of the user terminal faces the user's viewing angle, the body then rotates clockwise by about 90 degrees about the vertical axis, rapidly rotates counterclockwise back to the original position, and then again rapidly rotates clockwise by about 90 degrees and counterclockwise back to the original position; and

means for performing a back-lock operation using the virtual key after the predefined action is recognized.

Technical Field

The present invention relates generally to portable user terminals, and more particularly to a portable user terminal that can perform interactive operations using a motion sensor.

Background

A motion sensor typically comprises a multi-axis accelerometer and a gyroscope. Many smart devices are equipped with such sensors to record and monitor the motion of the device or to enable gesture-based interactive operation.

The previous patent application 201610914471.2 discloses an object access right management method based on virtual key and virtual key package technology, together with a corresponding background system, access control device, and user terminal. The subsequent application 201610937762.3 discloses a method, and a corresponding user terminal, for viewing, selecting, and managing virtual key packages and virtual key data on a compact user terminal with a touch display screen using finger gestures. However, neither provides a solution in which a motion sensor is used in conjunction with a virtual key application.

Note that some prior-art solutions allow a user to shake a cell phone to trigger a Bluetooth unlock/open operation, but their functionality is limited, and the shake gesture is easily confused with gestures used by other applications.

Disclosure of Invention

It is a first object of the present invention to provide a method, and a corresponding user terminal, for triggering virtual key operations according to the motion vector produced when a user moves the user terminal. The specific method comprises the following steps:

monitoring, with a motion sensor, the direction and speed with which the user moves the user terminal along the vertical, horizontal, and front-back dimensions, and triggering the corresponding predefined virtual-key-related operation when a predefined action is matched;

the virtual-key-related operations comprise: unlocking with a virtual key, locking with a virtual key, back-locking with a virtual key, displaying a contact interface for the object associated with a nearby access control device, and displaying a control center interface for the object associated with the current virtual key package.
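
By way of illustration only, the dispatch from a matched predefined action to its virtual-key-related operation might be sketched as follows in Python; the enum members and the handler method names on the terminal object are assumptions of this sketch and do not appear in the disclosure.

    from enum import Enum, auto

    class PredefinedAction(Enum):
        KEY_TURN_CLOCKWISE = auto()          # simulated clockwise key turn -> unlock
        KEY_TURN_COUNTERCLOCKWISE = auto()   # simulated counterclockwise key turn -> lock
        DOUBLE_KEY_TURN = auto()             # two rapid 90-degree turns -> back-lock
        WRIST_FLIP_OUT_AND_BACK = auto()     # rotate outward and back -> contact interface
        THROW_OUT = auto()                   # throwing-out motion -> control center interface

    def dispatch(action, terminal):
        # Trigger the predefined virtual-key-related operation for a matched action.
        handlers = {
            PredefinedAction.KEY_TURN_CLOCKWISE: terminal.unlock_with_virtual_key,
            PredefinedAction.KEY_TURN_COUNTERCLOCKWISE: terminal.lock_with_virtual_key,
            PredefinedAction.DOUBLE_KEY_TURN: terminal.back_lock_with_virtual_key,
            PredefinedAction.WRIST_FLIP_OUT_AND_BACK: terminal.show_contact_interface,
            PredefinedAction.THROW_OUT: terminal.show_control_center,
        }
        handler = handlers.get(action)
        if handler is not None:
            handler()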

Preferably, the predefined actions corresponding to the unlock operation using the virtual key include:

holding the body of the user terminal nearly extended straight forward and then rotating it clockwise by more than 60 degrees (this action simulates inserting a physical key into a keyhole and turning it clockwise to unlock); and

holding the user terminal with its front facing the user's viewing angle and the body pointing upward, and then tilting the body forward and downward by more than 60 degrees (this action simulates lowering a drawbridge over a river).

Preferably, the predefined actions corresponding to the lock operation using the virtual key include:

holding the body of the user terminal nearly extended straight forward and then rotating it counterclockwise by more than 60 degrees (this action simulates inserting a physical key into a keyhole and turning it counterclockwise to lock); and

holding the body of the user terminal approximately horizontal with its front facing upward, and then tilting the body upward by more than 60 degrees toward the user's viewing angle (this action simulates raising a drawbridge over a river).

Preferably, the predefined action corresponding to the back-lock operation using the virtual key comprises:

holding the user terminal with its front facing the user's viewing angle, then rotating the body clockwise by about 90 degrees about the vertical axis, rapidly rotating it counterclockwise back to the original position, and again rapidly rotating it clockwise by about 90 degrees and counterclockwise back to the original position (this action simulates the multiple key turns made when back-locking with an ordinary physical key).

Defining the unlock, lock, and back-lock actions in this way lets users memorize them through vivid physical imagery and reduces the learning burden. Unlike other unlocking schemes that rely on a shaking motion, the unlock, lock, and back-lock actions of the invention are more distinctive, easier to learn and remember, and cover locking and back-locking as well as unlocking.
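
As an illustration only, the three rotation gestures above could be represented as parameterized templates; the field names and numeric values in this sketch are assumptions made for illustration, not definitions from the disclosure.

    from dataclasses import dataclass

    @dataclass
    class RotationGesture:
        axis: str             # "front-back" or "vertical" rotation axis
        direction: str        # "clockwise" or "counterclockwise"
        min_angle_deg: float  # minimum rotation angle required to count as a match
        repetitions: int      # number of out-and-back cycles required

    # Hypothetical templates for the three lock-command gestures described above.
    UNLOCK_TURN = RotationGesture(axis="front-back", direction="clockwise",
                                  min_angle_deg=60.0, repetitions=1)
    LOCK_TURN = RotationGesture(axis="front-back", direction="counterclockwise",
                                min_angle_deg=60.0, repetitions=1)
    BACK_LOCK_TURN = RotationGesture(axis="vertical", direction="clockwise",
                                     min_angle_deg=90.0, repetitions=2)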

Preferably, the predefined actions corresponding to displaying the contact interface for the object associated with a nearby access control device include:

holding the user terminal with its front facing the user's viewing angle, then rotating the body forward by more than 120 degrees about the vertical axis and rotating it back to the original orientation (in practice, the user only needs to rotate the wrist outward and then bring it back); and

holding the user terminal with its front facing the user's viewing angle and the body close to vertical, then tilting the body forward and downward by more than 60 degrees, returning it to near vertical, and repeating this two or more times (in practice, the user only needs to face the palm toward the viewing angle, rotate the wrist downward, bring it back to the original position, and repeat).

These actions are chosen because the user only needs to rotate the wrist to complete the operation and does not need to swing the arm.

Preferably, in some embodiments, if no available access control device is found nearby, a contact interface for the object associated with the most recently used virtual key package is displayed. In some embodiments, if an access control device is found nearby but the user terminal holds no matching virtual key package, the user terminal obtains a temporary virtual key package from the background system and then displays a contact interface for the object associated with that virtual key package.
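
A minimal sketch of this fallback logic follows, assuming hypothetical helpers (scan_nearby_access_control, most_recently_used_package, find_matching_package, request_temporary_package, display_contact_interface) that are not named in the disclosure.

    def show_contact_interface(terminal, background_system):
        # Decide which virtual key package's contact interface to display.
        device = terminal.scan_nearby_access_control()
        if device is None:
            # No available access control device nearby: fall back to the most
            # recently used virtual key package.
            package = terminal.most_recently_used_package()
        else:
            package = terminal.find_matching_package(device)
            if package is None:
                # Nearby device found but no matching package: fetch a temporary
                # virtual key package from the background system.
                package = background_system.request_temporary_package(device)
        terminal.display_contact_interface(package.associated_object)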

Preferably, the predefined action corresponding to displaying the control center interface for the object associated with the current virtual key package comprises: holding the user terminal with its front facing the user's viewing angle, then moving the body away from the user's body in a throwing-out-like motion, and then stopping. This action is somewhat like throwing a bunch of keys out of a physical key pouch and can also be completed simply by turning the wrist.

Preferably, in some embodiments, the matching of the predefined actions is controlled by range thresholds in different dimensions, and these thresholds can be taught and set by the user. The virtual key application on the user terminal provides a user interface through which the user can learn the actions and teach them to the terminal, so that more accurate range-threshold parameters are obtained and recognition accuracy is improved.
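
One simple way such teaching could work is to derive per-dimension ranges from a few user-recorded samples. This is only a sketch under that assumption; it is not the matching procedure specified later in this disclosure, and the margin value is arbitrary.

    def learn_range_thresholds(samples, margin=0.15):
        # samples: list of recorded gestures, each a list of (x, y, z) sensor
        # readings covering the horizontal, vertical, and front-back dimensions.
        # Returns per-dimension (low, high) ranges widened by a tolerance margin.
        thresholds = []
        for dim in range(3):
            values = [reading[dim] for gesture in samples for reading in gesture]
            low, high = min(values), max(values)
            span = high - low
            thresholds.append((low - margin * span, high + margin * span))
        return thresholds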

A second object of the present invention is to provide a method, and a corresponding user terminal, for using the motion sensor to recognize the motion vector of the user terminal and then displaying virtual key data on the display screen in two-dimensional-code form, so that an access control device can carry out the corresponding lock command. The specific method comprises the following steps:

Preferably, the predefined actions corresponding to displaying, in two-dimensional-code form, a virtual key for a nearby access control device comprise:

holding the user terminal with the front of the body facing the user's viewing angle and then reciprocating the body vertically more than once, in which case the displayed two-dimensional code is used for unlocking;

holding the user terminal with the front of the body facing the user's viewing angle, then rotating the body clockwise by more than 60 degrees about the front-back axis, rotating it back to the initial position, and reciprocating in this way more than once, in which case the displayed two-dimensional code is used for locking; and

holding the user terminal with the front of the body facing the user's viewing angle, then rotating the body counterclockwise by more than 60 degrees about the front-back axis, rotating it back to the initial position, and reciprocating in this way more than once, in which case the displayed two-dimensional code is used for back-locking.

A third object of the present invention is to provide a method, and a corresponding user terminal, for using the motion sensor to recognize the motion vector of the user terminal so that virtual key package browsing can be performed quickly with one hand. The specific method comprises the following steps:

Preferably, the predefined action corresponding to displaying, in icon form, the icon array of the virtual key package category to which the current virtual key package belongs comprises: holding the user terminal with its front facing the user's viewing angle, then moving the body toward the user's body in a withdrawing-like motion, and then stopping. If a virtual key package group is already displayed in icon form in the virtual key application, the terminal first determines whether an icon array of the next virtual key package group is available for display and, if so, displays the next group.
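
A minimal sketch of this group-paging behavior, assuming a hypothetical list of virtual key package groups held by the application (the names below are illustrative, not from the disclosure):

    def next_group_to_display(groups, current_index):
        # groups: list of virtual key package groups, each a list of package icons.
        # Returns the index of the group to display after the withdrawing gesture.
        if not groups:
            return None
        next_index = current_index + 1
        if next_index < len(groups):
            # An icon array for the next group is available: display it.
            return next_index
        # No further group is available: keep displaying the current one.
        return current_index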

For most users, only one or two virtual key packages are displayed in card form in the virtual key application on the user terminal, but for some users it is cumbersome to page through the virtual key package cards one by one. The method is particularly suitable for management and service personnel who hold access rights for many access control devices and need to switch quickly among the selected virtual key packages. In some embodiments, such personnel serve different buildings in the same residential compound and can thereby access the access control equipment of different building units.

It should be noted that, in embodiments of the user terminal, the user terminal should be activated/unlocked by the user before these predefined actions are used. In particular, in some embodiments the user terminal has a fingerprint recognition unit, and the user terminal can be unlocked with the user's fingerprint.

In this way, the user can conveniently and quickly perform operations such as unlocking, locking, contacting, and viewing the control center by moving the user terminal, without having to look at the display screen before acting, and the operations can be completed with one hand. Once a user is familiar with these action-based interactions, operating efficiency is greatly improved.

In summary, the interactive operation method designed by the present invention makes full use of different wrist rotations and simple arm movements, and realizes common virtual key operations such as unlocking and locking by operating the portable user terminal with one hand in a manner that is easy to learn and remember.

Drawings

The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification; they illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention without limiting it.

FIG. 1 is a system block diagram of a user terminal in one embodiment;

FIG. 2 is a diagram illustrating a user moving the user terminal to perform an unlock operation in one embodiment;

FIG. 3 is a diagram illustrating a user moving the user terminal to perform a lock operation in one embodiment;

FIG. 4 is a diagram illustrating a user moving the user terminal to perform a back-lock operation in one embodiment;

FIGS. 5 and 6 are diagrams illustrating a user moving the user terminal to display a contact interface for objects associated with nearby access control devices in some embodiments;

FIG. 7 is a diagram illustrating a user moving the user terminal to display a control center interface for the object associated with the current virtual key package in one embodiment;

FIG. 8 is a diagram illustrating a user moving the user terminal to display, in two-dimensional-code form, a virtual key for unlocking a nearby access control device in one embodiment;

FIG. 9 is a diagram illustrating a user moving the user terminal to display, in two-dimensional-code form, a virtual key for locking a nearby access control device in one embodiment;

FIG. 10 is a diagram illustrating a user moving the user terminal to display, in two-dimensional-code form, a virtual key for back-locking a nearby access control device in one embodiment;

FIG. 11 is a diagram illustrating a user moving the user terminal to display, in icon form, the icon array of the virtual key package category to which the current virtual key package belongs in one embodiment;

FIG. 12 is a flowchart illustrating the detection and recognition of motion sensor data by the motion action processing unit in one embodiment;

FIG. 13 is a flowchart of the initialization process for predefined actions in the motion action processing unit in one embodiment;

FIG. 14 is a flowchart of the process of matching motion sensor data against a predefined action in the motion action processing unit in one embodiment;

FIG. 15 is a flowchart illustrating how the motion action processing unit processes received motion sensor data in one embodiment.

Detailed Description

The following description is presented to enable any person skilled in the art to make and use the embodiments, and is provided in the context of a particular application and its requirements. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present disclosure. Thus, the present invention is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features disclosed herein.

The data structures and code described in the detailed description are typically stored on a computer-readable storage medium, which can be any device or medium that can store code and/or data for use by a computer system. Computer-readable storage media include, but are not limited to, volatile memory, non-volatile memory, magnetic storage devices, and optical storage devices (e.g., disk drives, magnetic tape, CDs (compact discs), DVDs (digital versatile discs or digital video discs)), or other media capable of storing code and/or data now known or later developed.

The methods and processes described in the detailed description section can be implemented as code and/or data, which can be stored in a computer-readable storage medium as described above. When a computer system reads and executes the code and/or data stored on the computer-readable storage medium, the computer system performs the methods and processes embodied as data structures and code and stored within the computer-readable storage medium.

Also, the methods and processes described herein can be embodied within hardware modules or devices. These modules or devices may include, but are not limited to, an Application Specific Integrated Circuit (ASIC) chip, a Field Programmable Gate Array (FPGA), a dedicated or shared processor that executes a particular software module or piece of code at a particular time, and/or other programmable logic devices now known or later developed. When activated, the hardware modules or devices perform the methods and processes contained within them.

Fig. 1 illustrates a user terminal 300 according to an embodiment. The user terminal 300 may be any of various mobile terminals, smart phones, tablet computers, notebook computers, smart watches, smart glasses, vehicle-mounted computers, and the like. Referring to fig. 1, the central processing unit 313 is responsible for controlling and managing the operation of all the processing units of the processor 301. The network module 303 is used by the user terminal 300 to connect to the background system 100; after completing login to the background system 100 through the login registration processing unit 316, the user terminal can access the services of the background system 100 and receive virtual key update messages from it. When a virtual key update message from the background system 100 is received, the message processing unit 314 delivers it to the virtual key processing unit 311, which first verifies the message and, after successful verification, updates the encrypted virtual key library stored in the device's local storage 302. The input module 305 receives operation input from the user, including via a touch-sensitive processing device in the touch-screen display. The output module 304 outputs feedback to the user. The user interaction processing unit 315 completes interaction with the user through the input module 305 and the output module 304, such as selecting and viewing virtual key packages, managing members, managing virtual keys, and adding authorizations, and then sends virtual key requests to the background system 100 through the virtual key processing unit 311, the virtual key request unit 312, the network connection processing unit 317, and the network module 303. After logging in to the background system 100, the saved virtual key package data are decrypted from the local storage 302; if they are not found, a request for the virtual key package is sent to the background system 100. The short-range communication module 306 can transmit the virtual key data to the access control device 200 to carry out lock command operations such as unlocking, locking, and back-locking.
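
As an illustration of the update-message handling just described, the following sketch assumes hypothetical verification and storage helpers (verify, load_encrypted_key_library, apply_update, save_encrypted_key_library); none of these names come from the disclosure.

    def handle_virtual_key_update(message, virtual_key_processor, local_storage):
        # Deliver an update message from the background system to the virtual key
        # processing unit: verify it first, then merge it into the encrypted
        # virtual key library held in local storage.
        if not virtual_key_processor.verify(message):
            return False
        library = local_storage.load_encrypted_key_library()
        library.apply_update(message)
        local_storage.save_encrypted_key_library(library)
        return True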

In this embodiment, the input module 305 further comprises a motion sensor unit 330 that detects three-axis accelerometer and gyroscope data. The detected motion sensor data are passed to the motion action processing unit 310 for recognition and matching. The central processing unit 313 then performs the corresponding predefined operation according to the predefined action recognized by the motion action processing unit 310.

In some embodiments, the short-range communication module 306 includes an NFC near-field communication unit and a Bluetooth Low Energy communication unit. The short-range communication processing unit 319 is responsible for the connection and communication of these short-range links. The previous patent application 201610932849.1 discloses a method of carrying out virtual key lock commands over NFC near-field communication and Bluetooth communication.

In some embodiments, the output module 304 includes a display screen on which a virtual key is output in two-dimensional-code form and then recognized and processed by the access control device 200. The prior patent application 201610936846.5 discloses a method of carrying out virtual key lock commands by means of a two-dimensional code.

Fig. 2 presents a schematic diagram illustrating predefined actions for the virtual key unlock operation according to one embodiment. Referring to fig. 2, fig. 2A shows the motion simulating the clockwise turn of a physical key, and fig. 2B shows the motion simulating the lowering of a drawbridge over a river.

Fig. 3 presents a schematic diagram illustrating predefined actions for the virtual key lock operation according to one embodiment. Referring to fig. 3, fig. 3A shows the motion simulating the counterclockwise turn of a physical key, and fig. 3B shows the motion simulating the raising of a drawbridge over a river.

Fig. 4 presents a schematic diagram illustrating the predefined action for the virtual key back-lock operation according to one embodiment. Referring to fig. 4, the user terminal starts in the state shown in fig. 4A and is rotated clockwise by about 90 degrees about the vertical axis into the state shown in fig. 4B; it is then rotated counterclockwise about the vertical axis back to the initial state of fig. 4A; it is then rotated clockwise by about 90 degrees about the vertical axis into the state of fig. 4B again; finally it is rotated counterclockwise about the vertical axis back to the initial state of fig. 4A, and the action ends. The action imitates the multiple key turns made when back-locking with an ordinary physical key.

Fig. 5 presents a schematic diagram illustrating a predefined action for displaying the contact interface of the object associated with a nearby access control device according to one embodiment. Referring to fig. 5, the user terminal starts in the state shown in fig. 5A and is rotated clockwise by about 180 degrees about the vertical axis so that the back of the body faces the user's viewing angle, as shown in fig. 5B; it is then rotated counterclockwise about the vertical axis back to the initial state of fig. 5A, and the action ends.

Fig. 6 presents a schematic diagram illustrating a predefined action for displaying the contact interface of the object associated with a nearby access control device according to another embodiment. Referring to fig. 6, the user terminal starts in the state shown in fig. 6A and the body is then tilted forward and downward by 60 degrees or more into the state shown in fig. 6B; the body is then lifted back up to the initial state shown in fig. 6A, and the action ends.

Fig. 7 presents a schematic diagram illustrating the predefined action for displaying, in two-dimensional-code form, a virtual key for unlocking a nearby access control device according to one embodiment. Referring to fig. 7, the user terminal starts in the state shown in fig. 7A and is moved upward in the vertical direction to the position and state shown in fig. 7B; it then moves back down in the vertical direction to the initial state of fig. 7A; it then moves upward again to the position and state of fig. 7B; finally it moves back down to the initial state of fig. 7A, and the action ends.

Fig. 8 presents a schematic diagram illustrating the predefined action for displaying, in two-dimensional-code form, a virtual key for locking a nearby access control device according to one embodiment. Referring to fig. 8, the user terminal starts in the state shown in fig. 8A and is rotated clockwise by more than 60 degrees about the front-back axis into the position and state shown in fig. 8B; it is then rotated counterclockwise about the front-back axis back to the initial state of fig. 8A; it is then rotated clockwise by more than 60 degrees about the front-back axis into the position and state of fig. 8B again; finally it is rotated counterclockwise about the front-back axis back to the initial state of fig. 8A, and the action ends.

Fig. 9 presents a schematic diagram illustrating the predefined action for displaying, in two-dimensional-code form, a virtual key for back-locking a nearby access control device according to one embodiment. Referring to fig. 9, the user terminal starts in the state shown in fig. 9A and is rotated counterclockwise by more than 60 degrees about the front-back axis into the position and state shown in fig. 9B; it is then rotated clockwise about the front-back axis back to the initial state of fig. 9A; it is then rotated counterclockwise by more than 60 degrees about the front-back axis into the position and state of fig. 9B again; finally it is rotated clockwise about the front-back axis back to the initial state of fig. 9A, and the action ends.

Fig. 10 presents a schematic diagram illustrating the predefined action for displaying the control center interface of the object associated with the current virtual key package according to another embodiment. Referring to fig. 10, the user terminal starts in the position and state shown on the left side of fig. 10; the body is then moved away from the user's body in a throwing-out-like motion until the user terminal reaches the position and state shown on the right side, and the action ends.

Fig. 11 presents a schematic diagram illustrating the predefined action for displaying, in icon form, the icon array of the virtual key package category to which the current virtual key package belongs according to another embodiment. Referring to fig. 11, the user terminal starts in the position and state shown on the right side of fig. 11; the body is then moved toward the user's body in a withdrawing-like motion until the user terminal reaches the position and state shown on the left side, and the action ends.

Fig. 12 presents a flowchart illustrating the detection and recognition of motion sensor data in the motion action processing unit according to one embodiment. The recognition is based on a Bayesian classification algorithm: the received motion sensor data are match-calculated against the predefined actions one by one, the action with the highest match probability is selected and returned, and failure is returned if no action matches. Referring to fig. 12, a match calculation is first performed for all predefined actions to accumulate the Bayesian denominator. In the loop body formed by steps 1200, 1202, 1204 and 1206, a match calculation is performed for each predefined action, and the resulting match likelihood is multiplied by the action's default (prior) likelihood (step 1204) and accumulated into the Bayesian denominator (step 1206). Then, starting at step 1208, all predefined actions are traversed once more; the match likelihood computed for each predefined action is multiplied by its default likelihood and divided by the Bayesian denominator (step 1210), and the predefined action with the highest resulting probability is selected. If no matching predefined action is found, failure is returned; otherwise the matched predefined action object is returned.
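
A minimal sketch of this Bayesian selection is given below, assuming a hypothetical match_likelihood function that returns the per-action likelihood computed as in fig. 14, a .prior attribute holding each action's default likelihood, and a match threshold that is an assumption of this sketch.

    def recognize(actions, observation_seq, match_likelihood, threshold=1e-6):
        # actions: predefined action objects, each with a .prior (default likelihood).
        # observation_seq: quantized discrete sequence from the motion sensor data.
        # First pass: accumulate the Bayesian denominator (steps 1200-1206).
        denominator = sum(match_likelihood(a, observation_seq) * a.prior for a in actions)
        if denominator == 0:
            return None  # no predefined action matches at all
        # Second pass: compute each posterior and keep the best (steps 1208-1210).
        best_action, best_posterior = None, 0.0
        for a in actions:
            posterior = match_likelihood(a, observation_seq) * a.prior / denominator
            if posterior > best_posterior:
                best_action, best_posterior = a, posterior
        return best_action if best_posterior > threshold else None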

After the motion action processing unit 310 recognizes a predefined action, the central processing unit 313 of the user terminal 300 performs the predefined operation corresponding to that action.

Fig. 13 presents a flowchart illustrating the initialization of predefined actions in the motion action processing unit according to one embodiment. Referring to fig. 13, before matching, the data for each predefined action are loaded from the predefined action library file and the action is trained with those data. The loop body formed by steps 1300, 1302, 1304 and 1306 loads the predefined action data from storage one by one and trains each action. The training method is as follows: the motion sensor data are quantized (using the k-means clustering algorithm) to obtain a discrete sequence (step 1304); the discrete sequence is then passed to a one-way (left-to-right) hidden Markov model algorithm for machine learning to obtain the action's matching-model data and its default likelihood value (step 1306).
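
The quantization step of this training can be sketched with scikit-learn's KMeans; the resulting discrete sequence would then be fed to a left-to-right hidden Markov model trainer (e.g., Baum-Welch), which is not reproduced here. The function name and the number of symbols are assumptions of this sketch.

    import numpy as np
    from sklearn.cluster import KMeans

    def quantize(readings, n_symbols=8):
        # readings: array of shape (n_samples, 3) holding accelerometer/gyroscope
        # values along the vertical, horizontal, and front-back dimensions.
        # Returns the discrete symbol sequence and the fitted quantizer, which is
        # reused when matching newly received sensor data (fig. 14).
        readings = np.asarray(readings, dtype=float)
        quantizer = KMeans(n_clusters=n_symbols, n_init=10, random_state=0)
        symbols = quantizer.fit_predict(readings)
        return symbols, quantizer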

Fig. 14 presents a flowchart illustrating the process of matching motion sensor data against a predefined action in the motion action processing unit according to one embodiment. Before this process is executed, the predefined actions should have been initialized and trained according to the steps described for fig. 13. Referring to fig. 14, the motion sensor data are first quantized (using the k-means clustering algorithm) to obtain a discrete sequence (step 1400); the discrete sequence is then passed to the one-way (left-to-right) hidden Markov model algorithm for a forward-sequence match calculation, which yields the match likelihood value (step 1402).
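
The forward-sequence match calculation of step 1402 can be sketched as the standard forward algorithm over the trained model parameters; the variable names here are generic and are not taken from the disclosure.

    import numpy as np

    def forward_likelihood(pi, A, B, symbols):
        # pi: initial state distribution, shape (n_states,)
        # A:  state transition matrix, shape (n_states, n_states); for a one-way
        #     (left-to-right) model with ordered states, A is upper triangular.
        # B:  emission matrix, shape (n_states, n_symbols)
        # symbols: quantized discrete sequence produced as in fig. 13 / fig. 14
        # Returns P(symbols | model), i.e. the match likelihood value.
        alpha = pi * B[:, symbols[0]]
        for s in symbols[1:]:
            alpha = (alpha @ A) * B[:, s]
        return float(alpha.sum())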

For the k-means clustering algorithm, the hidden Markov model algorithm, and the Bayesian classification algorithm, reference may be made to the relevant literature; these are merely the algorithms used in one embodiment and are not explained in detail here.

Fig. 15 presents a flowchart illustrating how the motion action processing unit processes received motion sensor data according to one embodiment. When the user terminal is activated and receives data from the motion sensor, this process is invoked to pre-process the data and to decide whether to enter the recognition state, to run recognition, or to record the newly received data. Referring to fig. 15, the loop body formed by steps 1500, 1502 and 1504 first filters the received sensor data. In this embodiment an extensible filter class is used to process the sensor data; depending on the purpose of the filtering, there may be a filter that discards data recorded while the device is idle, a filter that discards minor variations within the same direction of movement, and so on. After filtering is completed, step 1506 checks whether any data remain. If no data remain, the process proceeds directly to step 1530 and ends. Otherwise, step 1508 determines whether the unit is currently in the training (teaching) state; if so, step 1528 adds the data to the action training data and the process then ends at step 1530. If not, step 1510 checks, based on the filtered sensor data, whether the device is in motion. The process then proceeds to step 1512 to determine whether the unit is currently in the recognition state. If it is not yet in the recognition state, the process goes to step 1520; otherwise it goes to step 1514. In step 1520, the unit first determines whether the device is currently in motion; if not, the process proceeds to step 1530 and ends; otherwise it goes to step 1522, starts recognition and enters the recognition state, and then adds the new data to the action recognition data (step 1524). In step 1514, the unit determines whether the device is still in motion; if so, the new data are added to the action recognition data (step 1516); otherwise the process proceeds to step 1518, where the recognition state is exited and the accumulated action recognition data are recognized (i.e., action recognition and processing are performed according to the flow shown in fig. 12). After step 1516 or 1518 is completed, the process proceeds to step 1530 and ends.
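
A condensed sketch of this state machine follows, with a hypothetical unit object standing in for the extensible filter class, the state flags, and the data stores described above; none of these attribute names come from the disclosure.

    def process_sensor_data(unit, data):
        # unit holds: filters, training flag, recognizing flag, training_data,
        # recognition_data, and a recognize() method implementing fig. 12.
        for f in unit.filters:                    # steps 1500-1504: filter chain
            data = f.apply(data)
        if not data:                              # step 1506: nothing left after filtering
            return
        if unit.training:                         # step 1508: teaching state
            unit.training_data.extend(data)       # step 1528
            return
        moving = unit.is_in_motion(data)          # step 1510
        if not unit.recognizing:                  # step 1512 -> step 1520
            if moving:                            # step 1522: start recognition
                unit.recognizing = True
                unit.recognition_data = list(data)  # step 1524
            return
        if moving:                                # step 1514 -> step 1516
            unit.recognition_data.extend(data)
        else:                                     # step 1518: motion ended, run recognition
            unit.recognizing = False
            unit.recognize(unit.recognition_data)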

In some embodiments, the user selects a predefined action to teach through the user interface and then enters the training state. After the training and a subsequent test pass, the action data taught by the user are stored.

It will be appreciated by those skilled in the art that the components of the apparatus and the steps of the method provided in the embodiments described above may be concentrated on a single computing device or distributed across a network of multiple computing devices. Alternatively, they may be implemented in program code executable by a computing device, so that they can be stored in a storage device and executed by the computing device; they may also be fabricated as individual integrated circuit modules, or multiple modules or steps among them may be fabricated as a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.

The above description presents only preferred embodiments of the present invention and should not be taken as limiting the scope of the invention, which is defined by the appended claims.
