Emotion analysis method based on intelligent wearable device

Document No. 865454 · Published 2021-03-19

Abstract: This technique, "Emotion analysis method based on intelligent wearable device", was designed and created by Ge Chang, Xue Yanmin, Yu Suihuai, and Xu Shiyi on 2020-12-14. The invention discloses an emotion analysis method based on an intelligent wearable device, which specifically comprises: first, selecting motion-perception signal data types, original emotion labels, and daily activity labels, with the user entering emotion data and daily activity data on the intelligent wearable device over 1-2 months; then using the device to run a preset algorithm over the correspondence among the motion-perception signal data, the emotion data, and the daily activity data, determining the association of the user's emotional state with daily activities and human posture actions, and establishing a user emotion prediction model; and finally analyzing the user's actual emotion with the user emotion prediction model and outputting the user's emotion information. The method solves two problems of the prior art: poor recognition of moderate-intensity emotions, and the difficulty of accurately inferring emotion from a user's actions or activities.

1. An emotion analysis method based on an intelligent wearable device, characterized by comprising the following steps:

step 1: selecting the motion-perception signal data types, original emotion label categories, and daily activity label categories; the user wears the intelligent wearable device for 1-2 months, during which the device implicitly collects motion-perception signal data and the user periodically enters emotion data and daily activity data on the device;

step 2: using the intelligent wearable device to run a preset algorithm over the correspondence among the motion-perception signal data, the emotion data, and the daily activity data, determining the association of the user's emotional state with daily activities and human posture actions, and establishing a user emotion prediction model;

step 3: analyzing the user's actual emotion with the user emotion prediction model and outputting the user's emotion information.

2. The emotion analysis method based on an intelligent wearable device according to claim 1, wherein in step 2 the motion-perception signal data categories include triaxial acceleration, velocity, angular velocity, magnetic field data, and time; the original emotion label categories include the pairs tension-tired, excited-bored, happy-low, and stressed-relaxed; and the daily activity label categories include communication, dining, learning, work, home, personal care, receiving services, shopping, social contact, and sports.

3. The emotion analysis method based on an intelligent wearable device according to claim 1, wherein the human posture actions include walking, jogging, sitting, standing, going upstairs, and going downstairs.

4. The emotion analysis method based on an intelligent wearable device according to claim 1, wherein the preset algorithm in step 2 specifically comprises:

step 2.1: extracting the motion-perception signal data, fusing it with the emotion data and daily activity data, and performing cluster analysis to obtain human posture actions;

step 2.2: applying a classifier to the human posture actions to perform emotion recognition and daily activity recognition respectively, and establishing the user emotion prediction model.

5. The emotion analysis method based on an intelligent wearable device according to claim 4, wherein step 3 specifically comprises:

step 3.1: judging, through the user emotion prediction model and according to the motion-perception signal data, whether the user's motion falls within a first preset action range; if so, reminding the user to record an emotion label, and if not, giving no prompt;

step 3.2: deriving an emotion angle from the arousal level and the valence (positive or negative character) of the user's emotion, and estimating the user's emotion intensity and pleasantness to obtain an emotion angle value;

step 3.3: after the intelligent wearable device outputs the emotion angle value, inviting the user to judge its accuracy; if it is accurate, continuing prediction, and if it is inaccurate, inviting the user to give the actual emotional state and substituting that data into the user emotion prediction model for correction;

step 3.4: repeating steps 3.1-3.3 until the user's emotion information is output.

6. The emotion analysis method based on an intelligent wearable device according to claim 4, wherein in step 2 the preset algorithm uses the interval from the start of a daily activity to the moment the user records an emotion as the time window for predicting the user's emotional state.

7. The emotion analysis method based on an intelligent wearable device according to claim 5, wherein in step 3.1 the first preset action is an action sequence within daily activities and an action sequence within posture actions.

Technical Field

The invention belongs to the technical field of intelligent health management, and particularly relates to an emotion analysis method based on an intelligent wearable device.

Background

A person's emotional state is closely related to the body's immune response. At present, emotions are mainly recognized by acquiring signals such as heart rate, EEG, facial expressions, and voice; however, acquiring these signals requires additional equipment mounted on the user's head, which interferes strongly with the user and is unsuitable for daily wear. Although wearable devices can estimate a user's emotion from signals such as skin conductance, EEG, and heart rate, such estimation only works when the emotion fluctuates strongly. In daily life, user emotions are mostly of moderate intensity, such as calmness, joy, and tension, while intense joy or sadness is uncommon; how to recognize moderate-intensity emotions therefore remains a difficult problem in the field of emotion recognition.

In addition, emotion data recognized from heart rate, voice, skin conductance, EEG, and similar signals are often fixed values: the emotion data merely correspond to heart rate, voice intonation, and the like. The user thus cannot learn the relationship between his or her own actions or activities and emotion, so the resulting emotion data offer very limited guidance for health management.

Disclosure of Invention

The invention aims to provide an emotion analysis method based on an intelligent wearable device that solves two problems of the prior art: poor recognition of moderate-intensity emotions, and the difficulty of accurately inferring emotion from a user's actions or activities.

The technical scheme adopted by the invention is an emotion analysis method based on an intelligent wearable device, implemented according to the following steps:

step 1: selecting the motion-perception signal data types, original emotion label categories, and daily activity label categories; the user wears the intelligent wearable device for 1-2 months, during which the device implicitly collects motion-perception signal data and the user periodically enters emotion data and daily activity data on the device;

step 2: using the intelligent wearable device to run a preset algorithm over the correspondence among the motion-perception signal data, the emotion data, and the daily activity data, determining the association of the user's emotional state with daily activities and human posture actions, and establishing a user emotion prediction model;

step 3: analyzing the user's actual emotion with the user emotion prediction model and outputting the user's emotion information.

The invention is also characterized in that:

In step 2, the motion-perception signal data types comprise triaxial acceleration, velocity, angular velocity, magnetic field data, and time; the original emotion label categories include the pairs tension-tired, excited-bored, happy-low, and stressed-relaxed; and the daily activity label categories include communication, dining, learning, work, home, personal care, receiving services, shopping, social contact, and sports.

The human posture actions include walking, jogging, sitting, standing, going upstairs, and going downstairs.

The preset algorithm in step 2 specifically comprises the following steps:

step 2.1: extracting the motion-perception signal data, fusing it with the emotion data and daily activity data, and performing cluster analysis to obtain human posture actions;

step 2.2: applying a classifier to the human posture actions to perform emotion recognition and daily activity recognition respectively, and establishing the user emotion prediction model.

Step 3 specifically comprises the following steps:

step 3.1: judging, through the user emotion prediction model and according to the motion-perception signal data, whether the user's motion falls within a first preset action range; if so, reminding the user to record an emotion label, and if not, giving no prompt;

step 3.2: deriving an emotion angle from the arousal level and the valence (positive or negative character) of the user's emotion, and estimating the user's emotion intensity and pleasantness to obtain an emotion angle value;

step 3.3: after the intelligent wearable device outputs the emotion angle value, inviting the user to judge its accuracy; if it is accurate, continuing prediction, and if it is inaccurate, inviting the user to give the actual emotional state and substituting that data into the user emotion prediction model for correction;

step 3.4: repeating steps 3.1-3.3 until the user's emotion information is output.

In step 2, the preset algorithm uses the interval from the start of a daily activity to the moment the user records an emotion as the time window for predicting the user's emotional state.

In step 3.1, the first preset action is an action sequence within daily activities and an action sequence within posture actions.

The invention has the beneficial effects that:

According to the emotion analysis method based on the intelligent wearable device, the user's motion is analyzed from recorded motion-perception data to infer the user's emotion angle, so that the user can know the current emotional state and emotion intensity, and can trace the actions that preceded an emotion in order to improve it. The method is quick and effective: the user only needs to wear the device for data to be recorded and results to be analyzed, offering a new approach to intelligent health management and improvement.

Drawings

Fig. 1 is a flow chart of the emotion analysis method based on a smart wearable device according to the present invention;

Fig. 2 is a schematic diagram of the smart wearable device adopted in the method.

Detailed Description

The present invention will be described in detail below with reference to the accompanying drawings and specific embodiments.

As shown in Fig. 1, the emotion analysis method based on an intelligent wearable device according to the invention is implemented according to the following steps:

step 1: selecting the motion-perception signal data types, original emotion label categories, and daily activity label categories; the user wears the intelligent wearable device for 1-2 months, during which the device implicitly collects motion-perception signal data and the user periodically enters emotion data and daily activity data on the device;

step 2: using the intelligent wearable device to run a preset algorithm over the correspondence among the motion-perception signal data, the emotion data, and the daily activity data, determining the association of the user's emotional state with daily activities and human posture actions, and establishing a user emotion prediction model;

the preset method operation process in the step 2 specifically comprises the following steps:

step 2.1: extracting the motion-perception signal data, fusing it with the emotion data and daily activity data, and performing cluster analysis to obtain human posture actions;

step 2.2: applying a classifier to the human posture actions to perform emotion recognition and daily activity recognition respectively, and establishing the user emotion prediction model;
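The patent does not spell out steps 2.1 and 2.2. Purely as an illustration, the windowing-plus-clustering stage of step 2.1 might look like the sketch below, where `window_features` and the tiny `kmeans` are hypothetical helpers; a production system would then feed the clustered features, together with the user's emotion and activity labels, to a trained classifier as in step 2.2.

```python
import math

def window_features(acc_samples):
    """Mean and standard deviation of acceleration magnitude over one
    window of triaxial accelerometer samples (x, y, z)."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in acc_samples]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    return (mean, math.sqrt(var))

def kmeans(points, k=2, iters=20):
    """Tiny k-means used to group windows into candidate posture actions.
    Deterministic init: centroids spread over the sorted unique points."""
    uniq = sorted(set(points))
    step = max(1, (len(uniq) - 1) // max(1, k - 1))
    centroids = [uniq[min(i * step, len(uniq) - 1)] for i in range(k)]

    def nearest(p):
        return min(range(k),
                   key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))

    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[nearest(p)].append(p)
        # Recompute each centroid; keep the old one if its cluster emptied.
        centroids = [tuple(sum(v) / len(c) for v in zip(*c)) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return [nearest(p) for p in points], centroids
```

For example, a window of near-constant readings (sitting) yields a near-zero standard deviation, while a walking window shows a large one, so the two kinds of window land in different clusters.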

In step 2, the motion-perception signal data types comprise triaxial acceleration, velocity, angular velocity, magnetic field data, and time; the original emotion label categories include the pairs tension-tired, excited-bored, happy-low, and stressed-relaxed; and the daily activity label categories include communication, dining, learning, work, home, personal care, receiving services, shopping, social contact, and sports.

the human body posture actions comprise walking, jogging, sitting, standing, going upstairs and going downstairs;

In step 2, the preset algorithm uses the interval from the start of a daily activity to the moment the user records an emotion as the time window for predicting the user's emotional state.
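The time window described above amounts to a simple filter over timestamped samples; `emotion_time_window` below is an illustrative helper, not a function named in the patent:

```python
def emotion_time_window(samples, activity_start, emotion_logged):
    """Keep only the motion samples recorded between the start of a daily
    activity and the moment the user logged an emotion label; the emotion
    prediction model is then run on this slice only."""
    return [(t, v) for t, v in samples if activity_start <= t <= emotion_logged]
```

A usage example: if work starts at t = 2.0 and the user records an emotion at t = 5.0, only the samples stamped in [2.0, 5.0] feed the prediction.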

step 3: analyzing the user's actual emotion with the user emotion prediction model and outputting the user's emotion information; this specifically comprises the following steps:

step 3.1: judging, through the user emotion prediction model and according to the motion-perception signal data, whether the user's motion falls within a first preset action range; if so, reminding the user to record an emotion label, and if not, giving no prompt;

step 3.2: deriving an emotion angle from the arousal level and the valence (positive or negative character) of the user's emotion, and estimating the user's emotion intensity and pleasantness to obtain an emotion angle value;
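The patent does not give the formula behind the emotion angle. One plausible reading, based on the standard valence-arousal circumplex rather than on anything the patent states, treats the (valence, arousal) estimate as a vector whose direction is the emotion angle and whose length is the emotion intensity:

```python
import math

def emotion_angle(valence, arousal):
    """Map a (valence, arousal) estimate onto the circumplex:
    0 deg = pleasant, 90 deg = high arousal, 180 deg = unpleasant,
    270 deg = low arousal. The vector length serves as a rough
    emotion-intensity value. This mapping is an assumption."""
    angle = math.degrees(math.atan2(arousal, valence)) % 360.0
    intensity = math.hypot(valence, arousal)
    return angle, intensity
```

Under this mapping, a tense state (negative valence, high arousal) falls in the 90-180 degree quadrant, while a relaxed state (positive valence, low arousal) falls in the 270-360 degree quadrant.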

step 3.3: after the intelligent wearable device outputs the emotion angle value, inviting the user to judge its accuracy; if it is accurate, continuing prediction, and if it is inaccurate, inviting the user to give the actual emotional state and substituting that data into the user emotion prediction model for correction;

step 3.4: repeating steps 3.1-3.3 until the user's emotion information is output;
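Steps 3.1-3.3 form a predict / confirm / correct loop. The patent does not specify how an inaccurate prediction is substituted back into the model, so the sketch below assumes the simplest possible rule, a running bias correction; `EmotionPredictor` and its method names are hypothetical:

```python
class EmotionPredictor:
    """Predict an emotion-angle value, invite the user to confirm it,
    and fold any correction back in (here as a running bias term --
    an assumption, since the patent leaves the correction rule open)."""

    def __init__(self, predict_fn):
        self.predict_fn = predict_fn  # base model: features -> angle
        self.bias = 0.0
        self.n_corrections = 0

    def predict(self, features):
        return self.predict_fn(features) + self.bias

    def feedback(self, features, actual_angle=None):
        """actual_angle=None means the user confirmed the prediction;
        otherwise the prediction error is averaged into the bias."""
        if actual_angle is None:
            return
        error = actual_angle - self.predict(features)
        self.n_corrections += 1
        self.bias += error / self.n_corrections
```

Repeating predict-then-feedback drives the bias toward the user's reported angles, mirroring the loop of steps 3.1-3.4 in spirit.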

In step 3.1, the first preset action is an action sequence within daily activities and an action sequence within posture actions.

As shown in Fig. 2, the intelligent wearable device adopted in the emotion analysis method comprises a central processing unit, which is connected to a motion perception module, an emotion angle output module, an emotion prediction module, and a memory; the memory is in turn connected to a power supply part, an emotion label recording module, a daily activity label recording module, and a time setting module.

The motion perception module records the user's motion-perception signal data; the emotion label recording module records the user's emotion labels and their timestamps; the daily activity label recording module records the user's daily activity labels and their timestamps; the central processing unit runs the preset algorithm on the recorded motion-perception signal data, emotion labels, and daily activity labels to determine the user's emotional state; the time setting module sets the data collection period and the time window for the preset algorithm; the power supply part continuously powers the device; and the emotion prediction module analyzes daily activities, human posture actions, and emotion classification, and counts and analyzes emotion angles by category.
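The recording modules above boil down to three timestamped streams written into the device memory. A minimal in-memory sketch follows; the class and field names are illustrative, not taken from the patent:

```python
import time
from dataclasses import dataclass, field

@dataclass
class LabelRecord:
    t: float      # timestamp supplied by the time setting module
    label: str    # emotion label or daily activity label

@dataclass
class WearableLog:
    """Stand-in for the device memory that the recording modules share."""
    motion: list = field(default_factory=list)      # (t, acc, gyro, mag) tuples
    emotions: list = field(default_factory=list)    # LabelRecord entries
    activities: list = field(default_factory=list)  # LabelRecord entries

    def record_emotion(self, label, t=None):
        self.emotions.append(LabelRecord(time.time() if t is None else t, label))

    def record_activity(self, label, t=None):
        self.activities.append(LabelRecord(time.time() if t is None else t, label))
```

The central processing unit would then read these three lists when running the preset algorithm over a given time window.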
