Personalized man-machine interaction method and device for robot and terminal equipment

Document No.: 905892 · Publication date: 2021-02-26

Reading note: This technology, "A personalized human-computer interaction method and device for a robot, and terminal device", was created by Xia Yanhui (夏严辉) and Xiong Youjun (熊友军) on 2019-08-22. Abstract: The invention applies to the technical field of intelligent robots and provides a personalized human-computer interaction method for a robot, together with a corresponding device and terminal device. The method comprises: acquiring environment information sent by a robot together with characteristic information and behavior information of a user; forming a personalized system corresponding to the user according to the environment information, the characteristic information and the behavior information; and, when a power-on instruction is received, controlling the robot to output the personalized system corresponding to the user. Because the personalized system is built from the acquired environment, characteristic and behavior information and output at power-on, the robot's human-computer interaction system can be customized and upgraded per user, providing the user with a better interaction experience.

1. A personalized human-computer interaction method for a robot, characterized by comprising the following steps:

acquiring environmental information sent by a robot and characteristic information and behavior information of a user;

forming a personalized system corresponding to the user according to the environment information, the feature information and the behavior information;

and when a power-on instruction is received, controlling the robot to output a personalized system corresponding to the user.

2. The method for personalized human-computer interaction of a robot according to claim 1, wherein forming a personalized system based on the environment information, the feature information, and the behavior information comprises:

acquiring the use frequency of a default function;

identifying a default function with a use frequency higher than or equal to a first preset use frequency as a function with a high user preference degree;

identifying a default function with a use frequency lower than or equal to a second preset use frequency as a function with a low user preference degree, and deleting that function;

identifying a default function with a use frequency lower than the first preset use frequency and higher than the second preset use frequency as a function with a general user preference degree; and

improving and strengthening the functions with a high user preference degree, and pushing functions related to the functions with a high user preference degree.

3. The method for personalized human-computer interaction of a robot according to claim 1, wherein forming a personalized system based on the environment information, the feature information, and the behavior information comprises:

and when a function adding request is received, adding a new function according to the function adding request.

4. The method for personalized human-computer interaction of a robot according to claim 1, wherein forming a personalized system based on the environment information, the feature information, and the behavior information comprises:

and acquiring the processing opinions of the user on the unidentified functions, and processing the unidentified functions according to the processing opinions.

5. The method of claim 1, wherein the characteristic information includes at least one of gender, age, family member information, education level, occupation, height, income level, consumption level, and consumption preference; the environment information includes environmental sound; and the behavior information includes at least one of a type of information acquired, a type of music played, a type of stories listened to, and a type of information consulted.

6. The method for personalized human-computer interaction of the robot according to claim 1, wherein after acquiring the environment information sent by the robot and the characteristic information and the behavior information of the user, the method comprises the following steps:

and identifying the gender and age of the user according to the characteristic information and the behavior information.

7. The method for personalized human-computer interaction of the robot according to claim 1, wherein after acquiring the environment information sent by the robot and the characteristic information and the behavior information of the user, the method comprises the following steps:

identifying the type and the level of the noise according to the environmental sound;

acquiring the system volume; and

identifying the environment type according to the noise type, the noise level, the system volume and voice interaction information, wherein the environment type includes a bedroom, a dining room, a kitchen, an office, a conference room, or outdoors.

8. The method for personalized human-computer interaction with a robot of claim 1, wherein the characteristic information further comprises a consumption record;

after acquiring environmental information sent by a robot and characteristic information and behavior information of a user, the method comprises the following steps:

from the consumption record, an income level, a consumption level, and a consumption preference of the user are identified.

9. The method for personalized human-computer interaction of the robot according to claim 1, wherein before controlling the robot to output the personalized system corresponding to the user when the power-on instruction is received, the method comprises:

acquiring biological information of a user, which is sent by the robot;

and binding the biological information with a personalized system corresponding to the user.

10. A personalized human-computer interaction device of a robot, comprising:

the first acquisition module is used for acquiring environmental information sent by the robot and characteristic information and behavior information of a user;

the forming module is used for forming a personalized system corresponding to the user according to the environment information, the feature information and the behavior information;

and the receiving module is used for controlling the robot to output the personalized system corresponding to the user when a power-on instruction is received.

11. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 9 when executing the computer program.

12. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 9.

Technical Field

The invention belongs to the technical field of intelligent robots, and particularly relates to a personalized human-computer interaction method and device for a robot and terminal equipment.

Background

In recent years, due to the advantages of comprehensive functions, high intelligence and the like, intelligent robots are widely applied to daily life.

However, existing intelligent robots can only be upgraded uniformly, or only a few local functions can be customized according to the user's needs; this cannot meet users' individual requirements.

Disclosure of Invention

In view of this, embodiments of the present invention provide a personalized human-computer interaction method and device for a robot, and a terminal device, so as to solve the problem that an intelligent robot in the prior art can only be upgraded uniformly, or can only have a few local functions customized according to the user's requirements, which cannot meet the user's needs.

The first aspect of the embodiment of the invention provides a personalized man-machine interaction method for a robot, which comprises the following steps:

acquiring environmental information sent by a robot and characteristic information and behavior information of a user;

forming a personalized system corresponding to the user according to the environment information, the feature information and the behavior information;

and when a power-on instruction is received, controlling the robot to output a personalized system corresponding to the user.

Optionally, forming a personalized system according to the environment information, the feature information, and the behavior information includes:

acquiring the use frequency of a default function;

identifying a default function with a use frequency higher than or equal to a first preset use frequency as a function with a high user preference degree;

identifying a default function with a use frequency lower than or equal to a second preset use frequency as a function with a low user preference degree, and deleting that function;

identifying a default function with a use frequency lower than the first preset use frequency and higher than the second preset use frequency as a function with a general user preference degree; and

improving and strengthening the functions with a high user preference degree, and pushing functions related to the functions with a high user preference degree.

Optionally, forming a personalized system according to the environment information, the feature information, and the behavior information includes:

and when a function adding request is received, adding a new function according to the function adding request.

Optionally, forming a personalized system according to the environment information, the feature information, and the behavior information includes:

and acquiring the processing opinions of the user on the unidentified functions, and processing the unidentified functions according to the processing opinions.

Optionally, the characteristic information includes at least one of gender, age, family member information, education level, occupation, height, income level, consumption level, and consumption preference; the environment information includes environmental sound; and the behavior information includes at least one of a type of information acquired, a type of music played, a type of stories listened to, and a type of information consulted.

Optionally, after acquiring the environmental information sent by the robot and the feature information and the behavior information of the user, the method includes:

and identifying the gender and age of the user according to the characteristic information and the behavior information.

Optionally, after acquiring the environmental information sent by the robot and the feature information and the behavior information of the user, the method includes:

identifying the type and the level of the noise according to the environmental sound;

acquiring the system volume; and

identifying the environment type according to the noise type, the noise level, the system volume and voice interaction information, wherein the environment type includes a bedroom, a dining room, a kitchen, an office, a conference room, or outdoors.

Optionally, the characteristic information further includes a consumption record;

after acquiring environmental information sent by a robot and characteristic information and behavior information of a user, the method comprises the following steps:

from the consumption record, an income level, a consumption level, and a consumption preference of the user are identified.

Optionally, when the power-on instruction is received, before controlling the robot to output the personalized system corresponding to the user, the method includes:

acquiring biological information of a user, which is sent by the robot;

and binding the biological information with a personalized system corresponding to the user.

A second aspect of an embodiment of the present invention provides a personalized human-computer interaction device for a robot, including:

the first acquisition module is used for acquiring environmental information sent by the robot and characteristic information and behavior information of a user;

the forming module is used for forming a personalized system corresponding to the user according to the environment information, the feature information and the behavior information;

and the receiving module is used for controlling the robot to output the personalized system corresponding to the user when a power-on instruction is received.

Optionally, the forming module includes:

a first acquisition unit configured to acquire a frequency of use of a default function;

the first identification unit is used for identifying a default function with a use frequency higher than or equal to the first preset use frequency as a function with a high user preference degree;

the second identification unit is used for identifying a default function with a use frequency lower than or equal to a second preset use frequency as a function with a low user preference degree, and deleting that function;

the third identification unit is used for identifying a default function with a use frequency lower than the first preset use frequency and higher than the second preset use frequency as a function with a general user preference degree; and

the pushing unit is used for improving and strengthening the functions with a high user preference degree, and pushing functions related to the functions with a high user preference degree.

optionally, the forming module includes:

and the receiving unit is used for adding a new function according to the function adding request when the function adding request is received.

Optionally, the forming module includes:

and the second acquisition unit is used for acquiring the processing opinions of the user on the unidentified functions and processing the unidentified functions according to the processing opinions.

Optionally, the characteristic information includes at least one of gender, age, family member information, education level, occupation, height, income level, consumption level, and consumption preference; the environment information includes environmental sound; and the behavior information includes at least one of a type of information acquired, a type of music played, a type of stories listened to, and a type of information consulted.

Optionally, the apparatus further includes:

and the first identification module is used for identifying the gender and the age of the user according to the characteristic information and the behavior information.

Optionally, the apparatus further includes:

the second identification module is used for identifying the type and the size of the noise according to the environmental sound;

the second acquisition module is used for acquiring the volume of the system;

the third identification module is used for identifying the environment type according to the noise type, the noise size, the volume size and the voice interaction information; the type of environment includes a bedroom, a dining room, a kitchen, an office, a conference room, or outdoors.

Optionally, the characteristic information further includes a consumption record;

the device, still include:

and the fourth identification module is used for identifying the income level, the consumption level and the consumption preference of the user according to the consumption record.

Optionally, the apparatus further includes:

the third acquisition module is used for acquiring the biological information of the user sent by the robot;

and the binding module is used for binding the biological information with the personalized system corresponding to the user.

A third aspect of an embodiment of the present invention provides a terminal device, including: a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the method as described above when executing the computer program.

A fourth aspect of embodiments of the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method as described above.

According to the embodiment of the invention, environment information sent by the robot and characteristic information and behavior information of the user are acquired; a personalized system corresponding to the user is formed according to the environment information, the characteristic information and the behavior information; and when a power-on instruction is received, the robot is controlled to output that personalized system. The personalized human-computer interaction system of the robot can thus be customized and upgraded, providing the user with a better interaction experience.

Drawings

In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.

Fig. 1 is a schematic flowchart of a personalized human-computer interaction method for a robot according to an embodiment of the present invention;

fig. 2 is a flowchart of a personalized human-computer interaction method for a robot according to a second embodiment of the present invention;

fig. 3 is a schematic flowchart of a personalized human-computer interaction method for a robot according to a third embodiment of the present invention;

fig. 4 is a schematic structural diagram of a personalized human-computer interaction device of a robot according to a fourth embodiment of the present invention;

fig. 5 is a schematic structural diagram of a personalized human-computer interaction device of a robot according to a fifth embodiment of the present invention;

fig. 6 is a schematic structural diagram of a personalized human-computer interaction device of a robot according to a sixth embodiment of the present invention;

fig. 7 is a schematic diagram of a terminal device according to a seventh embodiment of the present invention.

Detailed Description

In order to make the technical solutions of the present invention better understood by those skilled in the art, the technical solutions in the embodiments of the present invention will be clearly described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some embodiments of the present invention, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.

The terms "comprises" and "comprising," and any variations thereof, in the description and claims of this invention and the above-described drawings are intended to cover non-exclusive inclusions. For example, a process, method, or system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus. Furthermore, the terms "first," "second," and "third," etc. are used to distinguish between different objects and are not used to describe a particular order.

In order to explain the technical means of the present invention, the following description will be given by way of specific examples.

Example one

As shown in fig. 1, the present embodiment provides a personalized human-computer interaction method for a robot, which may be applied to terminal devices such as a server of an intelligent robot or an intelligent-robot control device. The personalized human-computer interaction method provided by this embodiment comprises the following steps:

s101, acquiring environment information sent by the robot and characteristic information and behavior information of a user.

In specific application, the environment information sent by the robot is acquired so that the environment type of the robot's location can be judged from it, and the characteristic information and behavior information of the user are acquired. The user includes, but is not limited to, an individual user or a family of users (two or more users). The characteristic information includes, but is not limited to, at least one of gender, age, family member information, education level, occupation, height, income level, consumption level, and consumption preference; the environment information includes, but is not limited to, environmental sound; the environment type includes, but is not limited to, a bedroom, a dining room, a kitchen, an office, a conference room, or outdoors; and the behavior information includes, but is not limited to, at least one of a type of information acquired, a type of music played, a type of stories listened to, and a type of information consulted. In this embodiment, the acquisition manners include: (1) active acquisition: acquiring information directly, or acquiring the required basic information through voice-interactive inquiry; (2) passive acquisition: comprehensively judging and processing the basic information already acquired (such as the characteristic information and behavior information of the user); and (3) combined software and hardware acquisition: capturing images with a camera and performing face recognition with a face recognition algorithm, or recording with a microphone and performing speech recognition combined with voiceprint recognition.
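The three information categories described above can be sketched as a simple record that the passive-acquisition path fills in incrementally. The field names below are illustrative assumptions, not terms from the patent:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class UserProfile:
    """Sketch of the three information categories gathered in step S101."""
    # characteristic information (illustrative subset)
    gender: Optional[str] = None
    age: Optional[int] = None
    occupation: Optional[str] = None
    # behavior information
    music_types: List[str] = field(default_factory=list)
    story_types: List[str] = field(default_factory=list)
    # environment information
    ambient_sound_db: Optional[float] = None

def merge_passive(profile: UserProfile, observed: dict) -> UserProfile:
    """Passive acquisition: fold newly observed attributes into the profile."""
    for key, value in observed.items():
        if hasattr(profile, key):  # ignore attributes the profile does not model
            setattr(profile, key, value)
    return profile
```

Active acquisition and the software/hardware path (face or voiceprint recognition) would feed the same record through `merge_passive` once their results are decoded.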

And S102, forming a personalized system corresponding to the user according to the environment information, the feature information and the behavior information.

In specific application, the system of the robot is personalized and customized according to the environmental information, the characteristic information and the behavior information to form a personalized system corresponding to the user. For example, the gender, age and preference of the user are identified according to the characteristic information and the behavior information, the position of the user is identified according to the environment information, the consumption level and the consumption preference of the user are identified according to the consumption record of the user, the system is personalized, and a personalized system corresponding to the user is formed.

And S103, controlling the robot to output the personalized system corresponding to the user when a power-on instruction is received.

In specific application, when a power-on instruction is received, the robot is controlled to output the personalized system corresponding to the user.

In one embodiment, when a power-on instruction is received, the biological information of the user sent by the robot is acquired, the identity of the user is recognized from this biological information, and the personalized system corresponding to that user is output, so that each user corresponds to one personalized system.
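The one-user-one-system correspondence can be sketched as a lookup keyed by a biometric identifier; the class and identifier format below are illustrative assumptions, not an API from the patent:

```python
# Sketch: bind a biometric identifier (e.g. a voiceprint or face-image hash)
# to a personalized system, then select that system at power-on.
class SystemSelector:
    def __init__(self, default_system: str) -> None:
        self.default_system = default_system
        self.bindings: dict = {}  # biometric id -> personalized system id

    def bind(self, biometric_id: str, system_id: str) -> None:
        """Bind the user's biological information to their personalized system."""
        self.bindings[biometric_id] = system_id

    def on_power_on(self, biometric_id: str) -> str:
        """Return the bound personalized system, or the default system
        when the user has not been seen before."""
        return self.bindings.get(biometric_id, self.default_system)
```

Falling back to a default system for an unrecognized user is a design choice assumed here; the patent only states that recognized users receive their own system.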

In one embodiment, when a power-on instruction is received, it is detected whether an update package exists; if so, the update package is downloaded and the system version of the robot is updated according to it.

In specific application, when a power-on instruction is received, it is detected whether an update package exists in a local database (or on another terminal device or service cloud communicatively connected to the current terminal); if so, the update package is downloaded and the system version of the robot is updated accordingly.
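The power-on update check can be sketched as a version comparison against whatever packages are visible locally or from the cloud. The dotted-version format is an illustrative assumption:

```python
# Sketch of the power-on update check: run the newest available system
# version if it is newer than the installed one, otherwise keep the
# installed version. Version strings are assumed to be dotted integers.
def parse_version(v: str) -> tuple:
    return tuple(int(part) for part in v.split("."))

def needs_update(installed: str, available: str) -> bool:
    return parse_version(available) > parse_version(installed)

def check_on_power_on(installed: str, packages: list) -> str:
    """Return the version to run after the check."""
    newest = max(packages, key=parse_version, default=installed)
    return newest if needs_update(installed, newest) else installed
```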

In one embodiment, the characteristic information includes, but is not limited to, at least one of gender, age, family member information, education level, occupation, height, income level, consumption level, and consumption preference; the environment information includes, but is not limited to, environmental sound; the environment type includes, but is not limited to, a bedroom, a dining room, a kitchen, an office, a conference room, or outdoors; and the behavior information includes, but is not limited to, at least one of a type of information acquired, a type of music played, a type of stories listened to, and a type of information consulted.

In this embodiment, by combining algorithms such as face recognition and voiceprint recognition, the environment information, personal information, other characteristic information and behavior information of the user can be comprehensively judged, and information such as the number of family members, their genders, and the user's education level, occupation and height can be recognized; alternatively, personal information input by the user may be acquired directly.

In one embodiment, after step S101, the method includes:

and identifying the gender and age of the user according to the characteristic information and the behavior information.

In a specific application, the gender and age of the user are identified according to the characteristic information and behavior information of the user; alternatively, the gender and age input by the user are acquired. For example, the user's degree of preference for types of information acquired, music played, stories listened to, or information consulted is obtained, together with the responses from voice interactions and the user's voiceprint recognition result; the acquired information is judged comprehensively, and the gender and age of the user are recognized.

In one embodiment, after step S101, the method includes:

identifying the type and the level of the noise according to the environmental sound;

acquiring the system volume; and

identifying the environment type according to the noise type, the noise level, the system volume and voice interaction information, wherein the environment type includes a bedroom, a dining room, a kitchen, an office, a conference room, or outdoors.

In specific application, the noise type and noise level of the surrounding environment are identified from the environmental sound, and the volume set in the system is acquired. The environment type is then identified from the noise type and noise level, the set system volume, and the acquired voice interaction information; the environment type includes, but is not limited to, a bedroom, a dining room, a kitchen, an office, a conference room, or outdoors. In one embodiment, an environment type input by the user may also be acquired directly.
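The combination of noise type, noise level and system volume can be sketched as a rule-based classifier. Every rule and threshold below is an illustrative assumption; the patent does not specify the decision logic:

```python
# Rule-based sketch of environment-type identification. Noise types,
# dB thresholds and volume levels are illustrative, not from the patent.
def identify_environment(noise_type: str, noise_db: float, volume: int) -> str:
    if noise_type == "traffic" or noise_db > 70:
        return "outdoors"
    if noise_type == "cooking":
        return "kitchen"
    if noise_type == "tableware":
        return "dining room"
    if noise_db < 35 and volume <= 3:
        # a quiet room with the volume turned down suggests rest
        return "bedroom"
    if noise_type == "keyboard":
        return "office"
    # moderate noise, several voices, normal volume: treat as a meeting
    return "conference room"
```

A production system would presumably also weigh the voice interaction information mentioned in the text, e.g. explicit user statements about their location.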

In one embodiment, the characteristic information further comprises a consumption record;

after step S101, the method includes:

and identifying the income level, the consumption level and the consumption preference of the user according to the consumption record.

In specific application, the consumption record of the user sent by the robot is acquired, the income level, consumption level and consumption preference of the user are identified according to the consumption record, and detailed personalized customization is performed for the user. In one embodiment, the user's income level, consumption level and consumption preference may be identified by combining the above consumption record with other characteristic information and behavior information of the user.
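A minimal sketch of deriving a consumption level and preference from a list of purchase records follows; the record shape and the monthly-spend thresholds are illustrative assumptions:

```python
from collections import Counter

# Sketch: derive a coarse consumption level and a consumption preference
# from (category, amount) purchase records. Thresholds are illustrative.
def profile_consumption(records: list) -> dict:
    total = sum(amount for _, amount in records)
    by_category = Counter()
    for category, amount in records:
        by_category[category] += amount
    level = "high" if total > 5000 else "medium" if total > 1000 else "low"
    # the preference is the category with the largest total spend, if any
    preference = by_category.most_common(1)[0][0] if by_category else None
    return {"consumption_level": level, "consumption_preference": preference}
```

Income level, which the text also mentions, would likely need additional signals (e.g. occupation) rather than purchase totals alone.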

According to this embodiment, environment information sent by the robot and characteristic information and behavior information of the user are acquired; a personalized system corresponding to the user is formed according to the environment information, the characteristic information and the behavior information; and when a power-on instruction is received, the robot is controlled to output that personalized system. The personalized human-computer interaction system of the robot can thus be customized and upgraded, providing the user with a better interaction experience.

Example two

As shown in fig. 2, this embodiment is a further description of the method steps in the first embodiment. In this embodiment, step S102 includes:

and S1021, acquiring the use frequency of the default function.

In a specific application, the frequency of use of a default function in the system is obtained.

And S1022, identifying a default function with a use frequency higher than or equal to a first preset use frequency as a function with a high user preference degree.

In a specific application, a default function with a use frequency higher than or equal to the first preset use frequency is identified as a function with a high user preference degree, and that function is optimized. The first preset use frequency may be set according to the actual situation; for example, it may be preset to three uses per week. Alternatively, a first preset use frequency set by the user may be acquired.

And S1023, identifying the default function with the use frequency lower than or equal to the second preset use frequency as the function with low user preference degree, and deleting the function.

In a specific application, a default function with a use frequency lower than or equal to the second preset use frequency is identified as a function with a low user preference degree, and that function is deleted. The second preset use frequency may be set according to the actual situation, for example preset to one use per week; alternatively, a second preset use frequency set by the user may be acquired.

S1024, identifying the default function with the use frequency lower than the first preset use frequency and higher than the second preset use frequency as a function with general user preference degree.

In specific application, a default function with a use frequency lower than the first preset use frequency and higher than the second preset use frequency is marked as a function with a general user preference degree. When the customized personalized system is upgraded, the functions with a general user preference degree and the functions with a high user preference degree are upgraded and used normally, so that functions are not over-processed and then abandoned because the user has come to dislike them. Both preset use frequencies may be set according to the actual situation, or the values set by the user may be acquired.

S1025, perfecting and strengthening the functions with high user preference degrees, and pushing the functions related to the functions with high user preference degrees.

In specific application, the functions marked as having a high user preference degree are improved and strengthened, and functions associated with them are pushed. For example, if news broadcasting is identified as a function with a high user preference degree, news flashes related to news broadcasting can be pushed; the news-flash function is then itself classified and further processed according to its own use frequency.
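Steps S1021 to S1025 amount to a two-threshold classification of each default function. The sketch below uses hypothetical weekly-use thresholds and function names:

```python
# Sketch of steps S1021-S1025: classify default functions by use frequency
# against two preset thresholds. Threshold values are illustrative.
FIRST_PRESET = 3.0   # uses per week; at or above -> high preference
SECOND_PRESET = 1.0  # uses per week; at or below -> low preference (delete)

def classify_function(uses_per_week: float) -> str:
    """Return the preference label the method assigns to a default function."""
    if uses_per_week >= FIRST_PRESET:
        return "high"      # improve and strengthen; push related functions
    if uses_per_week <= SECOND_PRESET:
        return "low"       # delete the function
    return "general"       # keep and upgrade normally

def personalize(usage: dict) -> dict:
    """Group default functions by preference label."""
    groups = {"high": [], "general": [], "low": []}
    for name, freq in usage.items():
        groups[classify_function(freq)].append(name)
    return groups
```

For example, `personalize({"news broadcast": 5, "weather": 2, "alarm": 0.5})` places news broadcast in the high-preference group, weather in the general group, and marks the alarm function for deletion.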

In one embodiment, step S102 includes:

when a function-add request is received, adding a new function according to the function-add request.

In a specific application, if a function-add request sent by the user is received, the function information the user wants to add (including but not limited to the name and definition of the function) is obtained, and the corresponding new function is added according to the request and the function information, so as to meet the user's personalized needs.
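A minimal sketch of how such a function-add request might be handled, assuming a simple in-memory function table and a request carrying `name` and `definition` fields (both structures are illustrative assumptions):

```python
# Hypothetical sketch: handle a user's function-add request by
# registering the requested name and definition in the system's
# function table. The registry structure is an assumption.

class FunctionRegistry:
    def __init__(self):
        self.functions = {}

    def handle_add_request(self, request):
        """request: dict with the function information the user supplies,
        including (but not limited to) 'name' and 'definition'."""
        name = request["name"]
        self.functions[name] = request.get("definition", "")
        return name

registry = FunctionRegistry()
registry.handle_add_request({"name": "bedtime_story",
                             "definition": "tell a story at 21:00"})
print("bedtime_story" in registry.functions)  # True
```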

In one embodiment, step S102 includes:

acquiring the user's processing opinion on an unidentified function, and processing the unidentified function according to the processing opinion.

In a specific application, the user's processing opinion on an unidentified function is acquired, and the function is processed accordingly. In this embodiment, an unidentified function includes, but is not limited to, a function for which no accurate user-preference information could be acquired, or one for which only a small amount of such information was acquired. Alternatively, an unidentified function may be identified according to step S1022 or S1023 above and further processed according to that identification.
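One possible way to apply the user's processing opinions, assuming a small vocabulary of opinions (`keep`, `delete`, `observe`); the vocabulary and the default are illustrative assumptions, not part of the embodiment:

```python
# Hypothetical sketch: apply the user's processing opinion to functions
# whose preference degree could not be identified. Functions the user
# did not comment on default to 'observe' (kept for further tracking).

def process_unidentified(functions, opinions):
    """functions: set of unidentified function names.
    opinions: dict of function name -> 'keep' | 'delete' | 'observe'.
    Returns the set of functions retained."""
    retained = set()
    for name in functions:
        opinion = opinions.get(name, "observe")
        if opinion != "delete":
            retained.add(name)
    return retained

print(process_unidentified({"joke_mode", "poetry"}, {"joke_mode": "delete"}))
# {'poetry'}
```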

In this embodiment, the system functions of the robot are optimized, upgraded, or otherwise processed according to the usage frequency of each function and the user's preference for it, thereby meeting the user's needs in various respects and improving the user experience.

Example Three

As shown in fig. 3, this embodiment is a further description of the method steps in the first embodiment. In this embodiment, before step S103, the method includes:

and S104, acquiring the biological information of the user sent by the robot.

In a specific application, the biological information of the user sent by the robot is acquired; the biological information includes, but is not limited to, at least one of the user's face image, fingerprint, iris, palm print, and voiceprint.

S105, binding the biological information to the personalized system corresponding to the user.

In a specific application, the biological information is bound to the personalized system corresponding to the user, thereby realizing a one-to-one personalized system.
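A minimal sketch of the binding in steps S104 and S105, assuming a digest of the raw biological information is used as the lookup key (the digest scheme is an illustrative assumption, not part of the embodiment):

```python
# Hypothetical sketch of S104-S105: bind a hash of the user's biological
# information (e.g. a voiceprint sample or face embedding) to that
# user's personalized system, giving a one-to-one mapping.

import hashlib

bindings = {}

def bind(biological_info: bytes, personalized_system: dict) -> str:
    """Store the personalized system under a digest of the biometric data."""
    key = hashlib.sha256(biological_info).hexdigest()
    bindings[key] = personalized_system
    return key

def lookup(biological_info: bytes):
    """Retrieve the personalized system bound to this biometric data, if any."""
    return bindings.get(hashlib.sha256(biological_info).hexdigest())

bind(b"voiceprint-sample-user-A", {"volume": 3, "news": True})
print(lookup(b"voiceprint-sample-user-A") == {"volume": 3, "news": True})  # True
```

In practice biometric matching is fuzzy rather than exact, so a real system would compare embeddings under a similarity threshold; the exact-hash lookup here only illustrates the one-to-one binding.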

In this embodiment, the biological information of the user sent by the robot is acquired and bound to the customized personalized system, so that a personalized system tailored to the user's preferences is realized and the different needs of multiple users are met.

It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an execution order; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present invention.

Example Four

As shown in fig. 4, the present embodiment provides a robot personalized human-computer interaction device 100 for performing the method steps of the first embodiment. The personalized human-computer interaction device 100 for the robot provided by the embodiment comprises:

the first acquisition module 101 is used for acquiring environmental information sent by the robot and characteristic information and behavior information of a user;

a forming module 102, configured to form a personalized system corresponding to the user according to the environment information, the feature information, and the behavior information;

and the receiving module 103 is configured to control the robot to output the personalized system corresponding to the user when a starting-up instruction is received.

In one embodiment, the characteristic information includes at least one of gender, age, family member information, education level, occupation, height, income level, consumption level, and consumption preference, the environment information includes environment sound, and the behavior information includes at least one of acquisition information type, playing music type, listening story type, and counseling information type.
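For illustration, the three kinds of information could be carried in a container like the following; the field names are assumptions drawn from the enumeration above, and every field is optional to match the "at least one of" wording:

```python
# Hypothetical sketch: one container for the characteristic, environment,
# and behavior information the embodiment enumerates. Field names and
# types are illustrative assumptions.

from dataclasses import dataclass, field
from typing import Optional, List

@dataclass
class UserProfile:
    # characteristic information
    gender: Optional[str] = None
    age: Optional[int] = None
    occupation: Optional[str] = None
    consumption_level: Optional[str] = None
    # environment information
    environment_sound: Optional[bytes] = None
    # behavior information
    played_music_types: List[str] = field(default_factory=list)
    consulted_info_types: List[str] = field(default_factory=list)

profile = UserProfile(gender="female", age=8,
                      played_music_types=["children's songs"])
print(profile.age)  # 8
```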

In one embodiment, the personalized human-computer interaction device 100 for the robot further includes:

and the first identification module is used for identifying the gender and the age of the user according to the characteristic information and the behavior information.

In one embodiment, the personalized human-computer interaction device 100 for the robot further includes:

the second identification module is used for identifying the type and the size of the noise according to the environmental sound;

the second acquisition module is used for acquiring the volume of the system;

the third identification module is used to identify the environment type according to the noise type, the noise level, the volume level, and the voice interaction information; the environment type includes a bedroom, a dining room, a kitchen, an office, a conference room, or outdoors.
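A rule-based sketch of how the second and third identification modules might infer the environment type; every rule, keyword, and threshold here is an illustrative assumption rather than the embodiment's actual logic:

```python
# Hypothetical sketch: infer an environment type from the noise type,
# noise level (dB), system volume, and keywords in the user's voice
# interaction. All rules and thresholds are illustrative assumptions.

def identify_environment(noise_type: str, noise_db: float,
                         volume: int, utterance: str) -> str:
    text = utterance.lower()
    if noise_type == "traffic" or noise_db > 70:
        return "outdoors"            # loud or traffic noise
    if "meeting" in text or "agenda" in text:
        return "conference room"     # interaction content hints at meetings
    if noise_type == "appliance" and "cook" in text:
        return "kitchen"             # appliance noise plus cooking requests
    if noise_db < 35 and volume <= 2:
        return "bedroom"             # quiet room, low system volume
    return "office"                  # fallback

print(identify_environment("quiet", 30, 1, "play a lullaby"))  # bedroom
```

A production system would more likely use a trained classifier over audio features; the rule table only makes the inputs and output of the module concrete.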

In one embodiment, the characteristic information further comprises a consumption record;

the personalized human-computer interaction device 100 of the robot further comprises:

and the fourth identification module is used for identifying the income level, the consumption level and the consumption preference of the user according to the consumption record.

In this embodiment, the environment information sent by the robot and the characteristic information and behavior information of the user are obtained; a personalized system corresponding to the user is formed according to the environment information, the characteristic information, and the behavior information; and when a starting-up instruction is received, the robot is controlled to output the personalized system corresponding to the user. A personalized robot human-machine interaction system can thus be customized and upgraded, giving the user a better interaction experience.

Example Five

As shown in fig. 5, in the present embodiment, the forming module 102 in the fourth embodiment further includes the following structure for executing the method steps in the second embodiment:

a first obtaining unit 1021 for obtaining a frequency of use of a default function;

a first identification unit 1022, configured to identify a function with a frequency higher than or equal to a first preset frequency as a function with a high user preference;

a second identification unit 1023, configured to identify a default function whose usage frequency is lower than or equal to a second preset usage frequency as a function whose user's preference degree is low, and delete the function;

the third identifying unit 1024 is configured to identify a default function with a usage frequency lower than the first preset usage frequency and higher than the second preset usage frequency as a function with a general user preference.

The pushing unit 1025 is used for perfecting and strengthening the function with high user preference degree and pushing the function related to the function with high user preference degree;

in one embodiment, forming module 102 includes:

and the receiving unit is used for adding a new function according to the function adding request when the function adding request is received.

In one embodiment, forming module 102 includes:

and the second acquisition unit is used for acquiring the processing opinions of the user on the unidentified functions and processing the unidentified functions according to the processing opinions.

In this embodiment, the system functions of the robot are optimized, upgraded, or otherwise processed according to the usage frequency of each function and the user's preference for it, thereby meeting the user's needs in various respects and improving the user experience.

Example Six

As shown in fig. 6, in the present embodiment, the personalized human-computer interaction device 100 for a robot in the fourth embodiment further includes the following structure for executing the steps of the method in the third embodiment:

a third obtaining module 104, configured to obtain biological information of the user sent by the robot;

a binding module 105, configured to bind the biological information with a personalized system corresponding to the user.

In this embodiment, the biological information of the user sent by the robot is acquired and bound to the customized personalized system, so that a personalized system tailored to the user's preferences is realized and the different needs of multiple users are met.

Example Seven

Fig. 7 is a schematic diagram of the terminal device provided in this embodiment. As shown in fig. 7, the terminal device 7 of this embodiment includes: a processor 70, a memory 71 and a computer program 72, such as a personalized human-machine interaction program for a robot, stored in said memory 71 and executable on said processor 70. The processor 70, when executing the computer program 72, implements the steps in the embodiments of the personalized human-machine interaction method for each robot described above, such as the steps S101 to S103 shown in fig. 1. Alternatively, the processor 70, when executing the computer program 72, implements the functions of each module/unit in each device embodiment described above, for example, the functions of the modules 101 to 103 shown in fig. 4.

Illustratively, the computer program 72 may be partitioned into one or more modules/units that are stored in the memory 71 and executed by the processor 70 to implement the present invention. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program 72 in the terminal device 7. For example, the computer program 72 may be divided into a first acquiring module, a forming module and a receiving module, and each module has the following specific functions:

the first acquisition module is used for acquiring environmental information sent by the robot and characteristic information and behavior information of a user;

the forming module is used for forming a personalized system corresponding to the user according to the environment information, the feature information and the behavior information;

and the receiving module is used for controlling the robot to output the personalized system corresponding to the user when a starting-up instruction is received.
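Putting the three modules together, an end-to-end sketch might look like the following; the module behavior and data shapes are illustrative assumptions, not the program's actual implementation:

```python
# Hypothetical end-to-end sketch of the three modules the computer
# program is divided into: acquire information, form the personalized
# system, and output it when a starting-up instruction is received.

class InteractionDevice:
    def __init__(self):
        self.personalized_system = None

    def acquire(self, environment: dict, features: dict, behavior: dict) -> dict:
        # first acquiring module: gather the three kinds of information
        return {"environment": environment, "features": features,
                "behavior": behavior}

    def form(self, info: dict) -> dict:
        # forming module: derive settings from the gathered information
        name = info["features"].get("name", "user")
        self.personalized_system = {"greeting": f"Hello, {name}"}
        return self.personalized_system

    def on_instruction(self, instruction: str):
        # receiving module: output the system on a starting-up instruction
        if instruction == "start_up" and self.personalized_system is not None:
            return self.personalized_system
        return None

device = InteractionDevice()
info = device.acquire({"sound": "quiet"}, {"name": "Ann"}, {"music": "jazz"})
device.form(info)
print(device.on_instruction("start_up"))  # {'greeting': 'Hello, Ann'}
```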

The terminal device 7 may be a desktop computer, a notebook computer, a robot, a palmtop computer, a cloud server, or another computing device. The terminal device may include, but is not limited to, the processor 70 and the memory 71. Those skilled in the art will appreciate that fig. 7 is merely an example of the terminal device 7 and does not constitute a limitation on it; the device may comprise more or fewer components than shown, combine certain components, or use different components. For example, the terminal device may further comprise input/output devices, network access devices, buses, and the like.

The Processor 70 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.

The memory 71 may be an internal storage unit of the terminal device 7, such as a hard disk or memory of the terminal device 7. The memory 71 may also be an external storage device of the terminal device 7, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card equipped on the terminal device 7. Further, the memory 71 may include both an internal storage unit and an external storage device of the terminal device 7. The memory 71 is used to store the computer program and other programs and data required by the terminal device, and may also be used to temporarily store data that has been or is to be output.

It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division of functional units and modules is illustrated. In practical applications, the above functions may be assigned to different functional units and modules as needed; that is, the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the functions described above. The functional units and modules in the embodiments may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for convenience of distinguishing them from each other and are not intended to limit the protection scope of the present application. For the specific working processes of the units and modules in the above system, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.

In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.

Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.

In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.

The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.

In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.

The integrated modules/units, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow of the methods of the above embodiments may also be implemented by a computer program, which may be stored in a computer-readable storage medium; when the computer program is executed by a processor, the steps of the method embodiments may be implemented. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals, in accordance with legislation and patent practice.

The above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention and are intended to be included within the scope of the present invention.
