Information processing apparatus, information processing method, and information processing program

Document number: 174689. Publication date: 2021-10-29. Views: 39. Original language: Chinese.

Reading note: This technique, "Information processing apparatus, information processing method, and information processing program," was designed and created on 2020-03-11 by 小川浩明, 镰田智惠, 角尾衣未留, 前田幸徳, 高桥晃, 户塚典子, 立石和也, 小山裕一郎, and 武. Its main content is as follows: An information processing apparatus is provided with a control unit that performs processing of detecting, as sensed information, information indicating an operation state of a device, and, when the sensed information is detected, processing of determining whether to notify the user of the operation state of the device by referring to a storage unit in which response content associated with the sensed information is stored.

1. An information processing apparatus comprising:

a control unit that performs:

a process of detecting information indicating an operation state of a device as sensed information; and

a process of determining, when the sensed information is detected, whether to notify a user of the operation state of the device by referring to a storage unit in which response content associated with the sensed information is stored.

2. The information processing apparatus according to claim 1, wherein

the control unit:

detecting, as the sensed information, a notification sound emitted by the device to notify the user of the operation state; and

upon detecting the notification sound, determining whether to notify the user of the operation state of the device by referring to the storage unit in which the response content associated with the notification sound is stored.

3. The information processing apparatus according to claim 1, wherein the control unit updates the response content associated with the sensed information stored in the storage unit based on a reaction received from the user after notifying the user of the operation state of the device.

4. The information processing apparatus according to claim 3, wherein the control unit updates, based on the reaction received from the user, a setting indicating whether to notify the user of the operation state of the device associated with the detected sensed information.

5. The information processing apparatus according to claim 3, wherein

the control unit:

recognizing a voice received from the user; and

updating, based on the reaction of the user obtained from a result of the voice recognition, a setting indicating whether to notify the user of the operation state of the device associated with the detected sensed information.

6. The information processing apparatus according to claim 1, wherein, when notifying the user of the operation state of the device, the control unit notifies the user of information relating to an installation position of the device together with the operation state.

7. The information processing apparatus according to claim 1, wherein

the control unit:

identifying a device associated with the sensed information based on image recognition; and

notifying the user of the operation state of the identified device together with information about the identified device.

8. The information processing apparatus according to claim 7, wherein the control unit notifies the user of at least one of a type of the device, a name of the device, and an installation position of the device together with the operation state of the device.

9. The information processing apparatus according to claim 1, wherein

the control unit:

detecting a position of the user; and

determining whether to notify the user of the operation state of the device based on a positional relationship between the position of the user and the device associated with the sensed information.

10. The information processing apparatus according to claim 9, wherein

the control unit:

detecting a position of the user; and

determining whether to notify the user of the operation state of the device based on a distance between the position of the user and an installation position of the device associated with the sensed information.

11. The information processing apparatus according to claim 9, wherein the control unit determines whether to notify the user of the operation state of the device in accordance with an orientation of the face or body of the user at the time when the device emits information indicating the operation state or when the information indicating the operation state of the device is detected as the sensed information.

12. The information processing apparatus according to claim 9, wherein

the control unit:

detecting an attribute of the user in the vicinity of the information processing apparatus; and

determining whether to notify the user of the operation state of the device in accordance with the detected attribute of the user.

13. The information processing apparatus according to claim 1, wherein, when notifying the user of the operation state of the device, the control unit notifies the user of information with which the sensed information has been previously labeled, together with the operation state of the device.

14. The information processing apparatus according to claim 1, wherein, when the control unit detects, as the sensed information, an abnormal sound indicating that the operation state of the device is abnormal, the control unit notifies the user of information indicating that the abnormal sound is detected together with the operation state of the device.

15. The information processing apparatus according to claim 1, wherein the control unit detects, as the sensed information, at least one of information regarding light, temperature, humidity, smell, vibration, and carbon dioxide concentration observed around the device.

16. The information processing apparatus according to claim 1, wherein the control unit controls display of a notification on a display unit.

17. The information processing apparatus according to claim 1, wherein

the control unit:

acquiring a usage state of the information processing apparatus; and

controlling a notification output based on the usage state.

18. The information processing apparatus according to claim 17, wherein the usage state includes information relating to content being output by the information processing apparatus.

19. An information processing method performed by an information processing apparatus, the information processing method comprising:

detecting information indicating an operation state of a device as sensed information; and

determining, when the sensed information is detected, whether to notify a user of the operation state of the device by referring to a storage unit in which response content associated with the sensed information is stored.

20. An information processing program that causes an information processing apparatus to execute a process comprising the steps of:

detecting information indicating an operation state of a device as sensed information; and

determining, when the sensed information is detected, whether to notify a user of the operation state of the device by referring to a storage unit in which response content associated with the sensed information is stored.

Technical Field

The present disclosure relates to an information processing apparatus, an information processing method, and an information processing program. In particular, the present disclosure relates to a process of notifying a user of device behavior.

Background

As daily life becomes increasingly electrified, the likelihood that a plurality of devices (such as household appliances) operate simultaneously is growing. In view of such circumstances, techniques for operating a plurality of devices smoothly and proactively have been proposed.

For example, there is a known technique in which a device connected to a network notifies a user of an error via the network, and the user can learn the result via an email message (for example, patent document 1). Further, in another known technique, the state of a home appliance is diagnosed by capturing an image of a signal output by the appliance, converting it into product information for diagnosing the state of the appliance, and outputting the result, whereby a failure of the home appliance is detected and handled (for example, patent document 2).

Reference list

Patent document

Patent document 1: Japanese Laid-open Patent Publication No. 5-274317

Patent document 2: Japanese Laid-open Patent Publication No. 2013-

Disclosure of Invention

Technical problem

According to the above-described conventional technique, a user can smoothly operate a plurality of devices (e.g., home appliances).

However, the conventional techniques still leave room for improvement. For example, in the conventional art, if a home appliance is not compatible with network communication, or if it cannot display information that a diagnostic device can recognize, it is difficult to notify the user of the status of the appliance. That is, in some cases, the conventional techniques cannot be implemented unless the device that issues a notification is paired with a device that can communicate with it in some way.

Accordingly, the present disclosure proposes an information processing apparatus, an information processing method, and an information processing program capable of smoothly operating a variety of devices without depending on the performance of each device.

Solution to the problem

According to the present disclosure, an information processing apparatus includes a control unit that performs processing of detecting information indicating an operation state of a device as sensed information, and processing of determining, when the sensed information is detected, whether to notify a user of the operation state of the device by referring to a storage unit in which response content associated with the sensed information is stored.

Drawings

Fig. 1 is a schematic diagram of an example of information processing according to the first embodiment.

Fig. 2 is a schematic diagram of a configuration example of an information processing apparatus according to the first embodiment.

Fig. 3 is a schematic diagram of an example of a response content table according to the first embodiment.

Fig. 4 is a schematic diagram of a process flow according to the first embodiment.

Fig. 5 is a schematic diagram of an example of information processing according to the second embodiment.

Fig. 6 is a schematic diagram of a configuration example of an information processing apparatus according to the second embodiment.

Fig. 7 is a schematic diagram of an example of a device information table according to the second embodiment.

Fig. 8 is a schematic diagram of a process flow according to the second embodiment.

Fig. 9 is a schematic diagram of an example of a response content table according to a modification of the second embodiment.

Fig. 10 is a schematic diagram of an example of information processing according to the third embodiment.

Fig. 11 is a schematic diagram of an example of a response content table according to another embodiment.

Fig. 12 is a schematic diagram of a user information table according to another embodiment.

Fig. 13 is a block diagram of a first example of a system configuration according to the present disclosure.

Fig. 14 is a block diagram of a second example of a system configuration according to the present disclosure.

Fig. 15 is a block diagram of a third example of a system configuration according to the present disclosure.

Fig. 16 is a block diagram of a fourth example of a system configuration according to the present disclosure.

Fig. 17 is a block diagram of a fifth example of a system configuration according to the present disclosure.

Fig. 18 is a schematic diagram of a client-server system according to one of the specific examples of system configurations of the present disclosure.

Fig. 19 is a schematic diagram of a distributed system according to another specific example of the system configuration of the present disclosure.

Fig. 20 is a block diagram of a sixth example of a system configuration according to the present disclosure.

Fig. 21 is a block diagram of a seventh example of a system configuration according to the present disclosure.

Fig. 22 is a block diagram of an eighth example of a system configuration according to the present disclosure.

Fig. 23 is a block diagram of a ninth example of a system configuration according to the present disclosure.

Fig. 24 is a schematic diagram of an example of a system including an intermediate server in accordance with one of more specific examples of system configurations of the present disclosure.

Fig. 25 is a schematic diagram of an example of a system including a terminal device as a host according to one of more specific examples of system configurations of the present disclosure.

Fig. 26 is a schematic diagram of an example of a system including an edge server in accordance with one of more specific examples of system configurations of the present disclosure.

Fig. 27 is a schematic diagram of an example of a system including fog computing according to one of more specific examples of system configurations of the present disclosure.

Fig. 28 is a block diagram of a tenth example of a system configuration according to the present disclosure.

Fig. 29 is a block diagram of an eleventh example of a system configuration according to the present disclosure.

Fig. 30 is a diagram showing a hardware configuration of an example of a computer that realizes the functions of the apparatus.

Detailed Description

Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. In the embodiments, the same reference numerals are used for members having the same functions, and the description of overlapping portions is omitted.

The present disclosure is explained in the following order.

1. First embodiment

1-1. example of information processing according to the first embodiment

1-2. configuration of information processing apparatus according to first embodiment

1-3. flow of information processing according to the first embodiment

1-4. variants according to the first embodiment

2. Second embodiment

2-1. example of information processing according to the second embodiment

2-2. configuration of information processing apparatus according to second embodiment

2-3. flow of information processing according to the second embodiment

2-4. variants according to the second embodiment

3. Third embodiment

4. Other embodiments

4-1. abnormal sound detection

4-2. notifying according to user attributes

4-3. notifying according to the usage state of the device

4-4. arrangement of the devices

4-5. information handling System mode

4-6. others

5. Effect of information processing apparatus according to the present disclosure

6. Hardware configuration

(1. first embodiment)

[1-1. example of information processing according to the first embodiment ]

An example of information processing according to a first embodiment of the present disclosure is described with reference to fig. 1. Fig. 1 is a schematic diagram of an example of information processing according to the first embodiment. Fig. 1 shows an example of information processing according to the first embodiment performed by an information processing system 1, the information processing system 1 including an information processing apparatus 100 according to the present disclosure and a home appliance 10 as an example of a device according to the present disclosure.

The information processing apparatus 100 is an example of an information processing apparatus according to the present disclosure. For example, the information processing apparatus 100 has a function (also referred to as an agent function or the like) of holding a conversation with a user by voice or text, and performs various information processing such as voice recognition and response generation for the user. Further, the information processing apparatus 100 can also perform various controls on so-called Internet of things (IoT) devices and the like in accordance with a request made by the user through the agent function. The information processing apparatus 100 is, for example, a smart speaker, a smartphone, a television, a tablet terminal, or the like. Besides a smart speaker or the like, the information processing apparatus 100 may also be a wearable device such as a watch-type terminal or a glasses-type terminal, or a home appliance product having an agent function, such as a refrigerator or a washing machine.

The household appliance 10 is an example of a device according to the present disclosure. For example, the home appliance 10 is a home appliance product that is installed and used in a user's home or other places. In the example shown in fig. 1, it is assumed that the home appliance 10 is a washing machine. Further, in the first embodiment, it is assumed that the home appliance 10 does not have a function of communicating with the information processing apparatus 100 via a network. Further, in the example shown in fig. 1, only one home appliance 10 is shown; however, the number of the home appliances 10 is not limited to the number shown in fig. 1.

In the example shown in fig. 1, the information processing apparatus 100 detects (senses) the operation state of the home appliance 10 using various sensors such as a microphone or a camera, and acquires the detected information (hereinafter referred to as "sensed information"). For example, the information processing apparatus 100 can acquire a sound output by the home appliance 10 (e.g., an electronic sound notifying the start or end of washing) and can further reproduce the acquired sound. According to this processing, even if the user is not near the home appliance 10, the information processing apparatus 100 can notify the user of the electronic sound emitted by the home appliance 10.

Incidentally, in an environment where a user uses a plurality of home appliances, notifying the user of every sound acquired by the information processing apparatus 100 may inconvenience the user. For example, when washing ends, it makes sense to notify a user in a room far from the washing machine that washing has finished; however, for a user standing in front of the washing machine, a notification that the washing machine has stopped is redundant and bothersome. That is, if the information processing apparatus 100 relays one by one even information that the user already knows with certainty, the convenience of the user may actually be reduced. Thus, in notification technology relating to the states of home appliances, a problem to be solved is how to operate a plurality of home appliances smoothly without lowering the convenience of the user.

Therefore, the information processing apparatus 100 according to the present disclosure solves the above-described problems by information processing to be described below.

Specifically, the information processing apparatus 100 according to the present disclosure detects information indicating the operation state of a home appliance as sensed information, and, when the sensed information is detected, determines whether to notify the user of the operation state of the home appliance by referring to the storage unit in which the response content associated with the sensed information is stored. Therefore, the information processing apparatus 100 can perform notification that meets the user's requests without relaying every state notification of the home appliances to the user one by one; thus, a plurality of home appliances can be operated appropriately without bothering the user.

An example of information processing according to the first embodiment of the present disclosure will be described below with reference to fig. 1 in conjunction with a flow of information processing. Further, in the example shown in fig. 1, the processing units included in the information processing apparatus 100 are conceptually described as a detection unit 141, a notification unit 142, and a UI unit 143; however, these units are merely for convenience of description, and the information processing apparatus 100 does not necessarily have the functional configuration shown in fig. 1.

First, the detection unit 141 of the information processing apparatus 100 detects an electronic sound emitted by the home appliance 10 as an example of sensed information (hereinafter, a sound emitted for this type of notification is sometimes referred to as a "notification sound") (step S1). For example, the information processing apparatus 100 detects the notification sound emitted by the home appliance 10 based on an algorithm for pattern matching with notification sounds stored in advance in the storage unit 130. Besides the above example, the information processing apparatus 100 may detect the notification sound using various known methods. For example, the information processing apparatus 100 may detect the notification sound emitted by the home appliance 10 using a trained sound recognition model, thereby distinguishing the notification sound from an operation sound (a vibration sound emitted during washing, etc.) output by the home appliance 10.
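The text does not specify the matching algorithm beyond "pattern matching with the notification sound stored in advance." As a sketch only, one conventional realization is normalized cross-correlation of the captured audio against waveform templates held in the storage unit 130; the function names, the synthetic signal, and the 0.8 threshold below are all illustrative assumptions, not from the patent.

```python
import numpy as np

def match_score(signal: np.ndarray, template: np.ndarray) -> float:
    """Peak normalized cross-correlation between captured audio and a template."""
    # Zero-mean both so amplitude offsets do not inflate the score.
    signal = signal - signal.mean()
    template = template - template.mean()
    corr = np.correlate(signal, template, mode="valid")
    # Sliding L2 energy of the signal under each template position.
    window = np.ones(len(template))
    energy = np.convolve(signal ** 2, window, mode="valid")
    denom = np.sqrt(energy) * np.linalg.norm(template)
    scores = corr / np.maximum(denom, 1e-12)
    return float(scores.max())

def detect_notification_sound(signal, templates, threshold=0.8):
    """Return the ID of the best-matching stored template, or None (step S1)."""
    best_id, best = None, threshold
    for sound_id, tpl in templates.items():
        score = match_score(signal, tpl)
        if score > best:
            best_id, best = sound_id, score
    return best_id

# Demo: a synthetic "notification sound" embedded in silence.
t = np.arange(200) / 1000.0
tpl = np.sin(2 * np.pi * 40.0 * t)   # hypothetical stored template
sig = np.zeros(1000)
sig[300:500] = tpl                   # the sound as captured by the microphone
```

In practice the templates would come from the data table in the storage unit 130, and a trained sound-recognition model could replace the correlation step, as the text notes.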

Further, the information processing apparatus 100 can detect, as the sensed information, not only the notification sound but also an electronic display presented by the home appliance 10. For example, the information processing apparatus 100 may detect a blinking display at the end of washing by using a camera or the like.

The information processing apparatus 100 transmits the detected notification sound to the notification unit 142 (step S2). The notification unit 142 of the information processing apparatus 100 determines whether the notification sound detected by the detection unit 141 is a notification sound that needs to be notified to the user.

Specifically, the information processing apparatus 100 refers to the storage unit 130 (step S3). Although details will be described later, the storage unit 130 stores, as a data table, information relating to notification availability indicating whether each notification sound needs to be notified to the user, together with information for distinguishing detected notification sounds (e.g., templates of notification sounds). That is, the information processing apparatus 100 refers to the storage unit 130 and then determines whether the notification sound detected in step S1 is a "notification sound that needs to be notified to the user".
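The lookup of step S3 can be pictured as a small table keyed by notification-sound ID. In the sketch below, the field names `notify` and `message` and the IDs are hypothetical stand-ins for the "notification availability" and "notification message" entries of the response content table 131:

```python
from typing import Optional

# Hypothetical in-memory stand-in for the storage unit 130 / response
# content table 131: each notification-sound ID maps to response content.
RESPONSE_TABLE = {
    "ns01": {"notify": True,  "message": "The washing machine has finished."},
    "ns02": {"notify": False, "message": "The washing machine has started."},
}

def should_notify(sound_id: str) -> bool:
    """Step S3: refer to the table and decide whether the detected
    notification sound needs to be relayed to the user."""
    entry = RESPONSE_TABLE.get(sound_id)
    return bool(entry and entry["notify"])

def notification_message(sound_id: str) -> Optional[str]:
    """Message to output together with the reproduced sound, if any."""
    entry = RESPONSE_TABLE.get(sound_id)
    return entry["message"] if should_notify(sound_id) else None
```

An unknown sound ID simply yields no notification, which mirrors the default behavior of asking the user how to respond the first time a sound is heard.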

If the information processing apparatus 100 determines that the notification sound detected in step S1 is a notification sound that needs to be notified to the user, the information processing apparatus 100 transmits data about the notification sound (waveform data, signal data, etc. for reproducing the notification sound) to the User Interface (UI) unit 143 (step S4).

The UI unit 143 of the information processing apparatus 100 is a processing unit that transmits information to and receives information from the user. For example, the UI unit 143 controls processing of displaying information on a display included in the information processing apparatus 100, or processing of outputting voice from a voice output device (a speaker or the like) included in the information processing apparatus 100.

The information processing apparatus 100 notifies the user of the notification sound sent by the notification unit 142 (step S5). For example, the information processing apparatus 100 outputs the same sound as the notification sound detected from the home appliance 10 in step S1.

At this time, the information processing apparatus 100 can also make a predetermined inquiry to the user by outputting preset response content together with the notification sound. For example, the information processing apparatus 100 makes an inquiry such as "A sound like this was detected. Shall I notify you of this sound from now on?".

After that, the information processing apparatus 100 receives a reaction from the user (step S6). For example, the information processing apparatus 100 receives a reaction indicating that the user, having recognized the notification sound, does not reject its notification (for example, a voice such as "understood" or "thank you" that contains no negative expression). Alternatively, the information processing apparatus 100 receives a reaction indicating that the user rejects the notification of the notification sound (for example, a voice containing a negative expression such as "no notification required" or "quiet"). Alternatively, the information processing apparatus 100 receives the user's reaction to the inquiry made to the user (for example, a voice indicating the user's decision for judging the response when the same sound is detected later, such as "please let me know about that sound from now on").

The information processing apparatus 100 transmits the received reaction to the notification unit 142 (step S7). Subsequently, the information processing apparatus 100 reflects the received reaction in the database in the storage unit 130 (step S8). In other words, the information processing apparatus 100 learns, based on the user's reaction, whether to give the user a notification related to the notification sound.
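Steps S6 to S8 amount to updating the stored notification availability from the recognized reaction. A minimal sketch, in which the negative/positive phrase lists and the dictionary layout are both assumptions made for illustration:

```python
# Hypothetical reaction-learning step (S6-S8): after relaying a notification
# sound, update the stored "notify" flag from the user's spoken reaction.
NEGATIVE_PHRASES = ("no notification required", "quiet", "stop telling me")
POSITIVE_PHRASES = ("let me know", "notify me", "tell me")

def update_response_table(table: dict, sound_id: str, recognized_text: str) -> None:
    """Reflect the user's reaction in the table (step S8).

    A reaction containing a negative expression disables future
    notifications for this sound; an explicitly positive reaction
    enables them; any other reaction leaves the setting unchanged.
    """
    text = recognized_text.lower()
    entry = table.setdefault(sound_id, {"notify": True, "message": ""})
    if any(p in text for p in NEGATIVE_PHRASES):
        entry["notify"] = False
    elif any(p in text for p in POSITIVE_PHRASES):
        entry["notify"] = True
```

A real implementation would sit behind the voice recognition described in claim 5, with the phrase matching replaced by proper intent classification.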

As described above, the information processing apparatus 100 detects information indicating the operation state of the home appliance 10 as sensed information. Then, if the sensed information is detected, the information processing apparatus 100 refers to the storage unit 130 in which the response content associated with the sensed information is stored, and then determines whether to notify the user of the operation state of the home appliance 10.

In this way, when the home appliance 10 emits a certain notification sound, the information processing apparatus 100 determines whether to notify the user of that information before giving the notification. The information processing apparatus 100 can thus suppress notification sounds the user does not want while relaying those the user does want, performing notifications that satisfy the user's requests. Further, if the user is not in the vicinity of the home appliance 10, the information processing apparatus 100 can deliver the notification on behalf of the home appliance 10, improving the user's convenience. Moreover, even when the home appliance 10 is not connected to a network (for example, when the home appliance 10 is not an Internet of things device), the information processing apparatus 100 detects the sound emitted from the home appliance 10 using a microphone or the like, and can therefore reliably detect the notification sound regardless of the functions of the home appliance 10. Accordingly, the information processing apparatus 100 can smoothly operate various home appliances 10 regardless of the performance of each home appliance 10.

Further, fig. 1 shows an example in which a single information processing apparatus 100 performs the information processing according to the present disclosure; however, a plurality of information processing apparatuses 100 may be installed. For example, the information processing according to the present disclosure may also be performed cooperatively by a first smart speaker installed near the user and a second smart speaker installed near the home appliance 10. In this case, the second smart speaker transmits information related to the detected notification sound to the first smart speaker via the network. The first smart speaker outputs the notification sound emitted by the home appliance 10 to the user together with information about the installation location (e.g., kitchen) of the second smart speaker. Specifically, when outputting the notification sound, the first smart speaker delivers a notification such as "This sound was emitted from the kitchen". Therefore, the information processing apparatus 100 can proactively deliver to the user information related to the home appliance 10 that the user could not otherwise know.
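The cooperation between the two smart speakers can be sketched as simple message passing. The message fields and wording below are assumptions, and the network transport is replaced by a direct function call:

```python
# Hypothetical cooperation between two smart speakers: the speaker near the
# home appliance 10 (sender) forwards what it detected plus its own
# installation location; the speaker near the user (receiver) composes the
# announcement. A real deployment would carry this message over a network.

def make_detection_message(sound_id: str, location: str) -> dict:
    """Built by the second smart speaker installed near the home appliance 10."""
    return {"type": "notification_sound", "sound_id": sound_id, "location": location}

def compose_announcement(message: dict) -> str:
    """Built by the first smart speaker installed near the user
    (cf. the 'This sound was emitted from the kitchen' example)."""
    return (f"A notification sound ({message['sound_id']}) "
            f"was emitted from the {message['location']}.")

msg = make_detection_message("wash_end", "kitchen")
announcement = compose_announcement(msg)
```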

The configuration of the information processing apparatus 100 that performs the above-described information processing and the information processing system 1 including the information processing apparatus 100 will be described in detail below with reference to fig. 2 and subsequent drawings.

[1-2. configuration of information processing apparatus according to first embodiment ]

The configuration of the information processing apparatus 100 according to the first embodiment is described with reference to fig. 2. Fig. 2 is a schematic diagram of a configuration example of the information processing apparatus 100 according to the first embodiment.

As shown in fig. 2, the information processing apparatus 100 includes a sensor 120, an input unit 121, a communication unit 122, a storage unit 130, and a control unit 140.

The sensor 120 is a device for detecting various types of information. The sensor 120 includes a voice input sensor 120A that collects, for example, the notification sound emitted from the home appliance 10 and the voice of a speaking user. The voice input sensor 120A is, for example, a microphone. Further, the sensor 120 includes, for example, an image input sensor 120B. The image input sensor 120B is, for example, a camera for capturing an image of the home appliance 10, the user, or the situation in the user's home, such as a stereo camera capable of acquiring the distance and direction (depth data or the like) of an observation target.

Further, the sensor 120 may also include an acceleration sensor, a gyro sensor, and the like. Further, the sensor 120 may also include a sensor that detects the current position of the information processing apparatus 100. For example, the sensor 120 may also receive radio waves emitted by Global Positioning System (GPS) satellites, and detect position information (e.g., latitude and longitude) indicating the current position of the information processing apparatus 100 based on the received radio waves.

Further, the sensor 120 may also include a radio wave sensor that detects radio waves emitted from an external device or an electromagnetic wave sensor that detects electromagnetic waves. Further, the sensor 120 can also detect the environment in which the information processing apparatus 100 is located. Specifically, the sensor 120 may further include an illuminance sensor that detects illuminance around the information processing apparatus 100, a temperature sensor that detects the ambient temperature, a humidity sensor that detects the ambient humidity, and a geomagnetic sensor that detects the magnetic field at the position where the information processing apparatus 100 is located.

Further, the sensor 120 does not necessarily have to be provided inside the information processing apparatus 100. For example, the sensor 120 may also be installed outside the information processing apparatus 100 as long as information sensed using communication or the like can be transmitted to the information processing apparatus 100.

The input unit 121 is a device for receiving various operations from a user. The input unit 121 is realized by, for example, a keyboard, a mouse, a touch panel, or the like. If the information processing apparatus 100 is a smart speaker, the input unit 121 receives a voice input of the user; therefore, the voice input sensor 120A may also serve as the input unit 121.

The communication unit 122 is realized by, for example, a Network Interface Card (NIC) or the like. The communication unit 122 is connected to the network N in a wired or wireless manner, and transmits and receives information to and from the other information processing apparatus 100, an external server that performs voice recognition processing, and the like via the network N.

The storage unit 130 is implemented, for example, by a semiconductor memory device such as a Random Access Memory (RAM) or a flash memory, or a memory device such as a hard disk or an optical disk. In the first embodiment, the storage unit 130 includes a response content table 131.

The response content table 131 stores therein response content used to output a response to the user when a notification sound is detected. Fig. 3 shows an example of the response content table 131 according to the first embodiment. In the example shown in fig. 3, the response content table 131 has entries of "notification sound ID", "response content", and the like. In addition, the "response content" entry includes sub-entries of "notification availability" and "notification message".

The "notification sound ID" indicates identification information for identifying a notification sound. Further, although not shown in fig. 3, the notification sound ID may also include information on waveform data, signal data, and the like that identify the detected notification sound.

The "response content" indicates the content of a response output to the user when the notification sound is detected. The "notification availability" indicates whether or not the user is notified of the notification sound. The "notification message" indicates the content of a message output together with the notification sound. In the example shown in FIG. 3, the entry for the notification message is conceptually represented as "B01"; however, in practice, the contents of a specific voice output to the user are stored in an entry of the notification message.

That is, in fig. 3, as an example of the information registered in the response content table 131, the notification sound identified by the notification sound ID "A01" is a notification sound of which the user is to be notified when the notification sound is detected (the notification availability is "yes"), and its notification message is "B01".

The description is continued with reference again to fig. 2. The control unit 140 is a processing unit that controls the information processing performed by the information processing apparatus 100. As shown in fig. 2, the control unit 140 includes a detection unit 141, a notification unit 142, and a UI unit 143. The control unit 140 is realized by, for example, a Central Processing Unit (CPU), a Micro Processing Unit (MPU), a Graphics Processing Unit (GPU), or the like that executes a program (e.g., an information processing program according to the present disclosure) stored in the information processing apparatus 100 by using a Random Access Memory (RAM) or the like as a work area. Further, the control unit 140 is a controller, and may also be implemented by, for example, an integrated circuit such as an Application Specific Integrated Circuit (ASIC) or a Field Programmable Gate Array (FPGA).

The detection unit 141 detects information indicating an operation state of the apparatus (the home appliance 10) as sensing information. For example, the detection unit 141 uses various information detected by the sensor 120 as sensing information.

For example, the detection unit 141 detects a notification sound issued by the home appliance 10 to notify the user of the operation state as the sensing information. Specifically, the detection unit 141 detects an electronic sound when the home appliance 10 starts to operate or an electronic sound when the operation ends.

For example, the detection unit 141 refers to a template of the notification sound stored in advance in the storage unit 130, and then detects the notification sound by collating (pattern matching) the template with the notification sound emitted from the home appliance 10. Alternatively, the detection unit 141 may detect the notification sound by using a learning model or the like that identifies or classifies the type of the electronic sound emitted by the home appliance 10.
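As an illustration of the collation (pattern matching) described above, the following is a minimal Python sketch. The class name, the use of normalized cross-correlation as the similarity measure, and the 0.8 threshold are assumptions of this sketch, not part of the disclosure.

```python
import math

class NotificationSoundDetector:
    """Hypothetical sketch of the template-matching step: captured audio is
    compared against notification-sound templates stored in advance."""

    def __init__(self, templates, threshold=0.8):
        # templates: dict mapping a notification sound ID (e.g., "A01")
        # to a waveform represented as a list of samples
        self.templates = templates
        self.threshold = threshold

    def match(self, captured):
        """Return the ID of the best-matching template, or None."""
        best_id, best_score = None, 0.0
        for sound_id, template in self.templates.items():
            n = min(len(captured), len(template))
            a, b = captured[:n], template[:n]
            dot = sum(x * y for x, y in zip(a, b))
            denom = (math.sqrt(sum(x * x for x in a))
                     * math.sqrt(sum(y * y for y in b)))
            # normalized cross-correlation as a simple similarity measure
            score = dot / denom if denom else 0.0
            if score > best_score:
                best_id, best_score = sound_id, score
        return best_id if best_score >= self.threshold else None
```

A real detector would operate on spectral features rather than raw samples; the sketch only shows the collation structure.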

Further, the detection unit 141 may also detect a voice spoken by the user from the sound detected by the sensor 120. For example, the detection unit 141 analyzes the meaning of the user's utterance included in the detected speech through an Automatic Speech Recognition (ASR) process or a Natural Language Understanding (NLU) process, and then detects the analyzed information.

Further, as a result of the voice analysis, if the user's intention cannot be found, the detection unit 141 may also pass that state to the UI unit 143. For example, if the analysis result includes an intention that cannot be estimated from the user's voice, the detection unit 141 transfers its content to the UI unit 143. In this case, the UI unit 143 outputs a response requesting the user to restate the unclear information (a speech such as "please say that again").

Further, the detection unit 141 may also detect various information related to face information about the user or movement of the user, such as orientation, inclination, movement speed, and the like of the user's body, via the image input sensor 120B, an acceleration sensor, an infrared sensor, and the like. That is, the detection unit 141 may also detect various physical quantities such as position information, acceleration, temperature, gravity, rotation (angular velocity), illuminance, geomagnetism, pressure, proximity, humidity, or a rotation vector as a context via the sensor 120.

Further, the detection unit 141 may also detect information related to communication. For example, if there are a plurality of information processing apparatuses 100, the detection unit 141 may also periodically detect the connection state between the information processing apparatuses 100. The connection state mentioned here refers to information such as whether or not bidirectional communication is established, for example.

The notification unit 142, when detecting the sensed information, refers to the storage unit 130 in which the response content associated with the sensed information is stored, and then determines whether to notify the user of the operation state of the apparatus. Further, the operation state of the device may be the notification sound itself detected by the detection unit 141, or may be a message or the like indicating the operation state of the device (a message or the like indicating the end of the operation of the home appliance 10).

For example, if the detection unit 141 detects a notification sound, the notification unit 142 refers to the storage unit 130 in which the response content associated with the notification sound is stored, and then determines whether to notify the user of the operation state of the apparatus. Specifically, the notification unit 142 refers to the response content of the detected notification sound, and performs control to notify the user of the subject notification sound if the subject notification sound is a notification sound set to notify the user. In contrast, if the detected notification sound is not set as the notification sound to be notified to the user, the notification unit 142 performs control so that the user is not notified of the subject notification sound. Further, if the detected notification sound is not stored in the storage unit 130 and whether to notify the user of the status is not set, the notification unit 142 may also notify the user of the subject notification sound and a message indicating that the notification sound is detected for the first time. In this case, the notification unit 142 may also send a query to the user, for example, "from now on, do you need to be notified of this notification sound".
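The decision performed by the notification unit 142 can be sketched as a lookup in a table shaped like the response content table 131 of fig. 3. The dictionary layout, the entry values, and the wording of the first-time query are illustrative assumptions of this sketch.

```python
# Hypothetical in-memory stand-in for the response content table 131
response_content_table = {
    "A01": {"notification_availability": True,  "notification_message": "B01"},
    "A02": {"notification_availability": False, "notification_message": None},
}

def decide_response(sound_id):
    """Return ("notify", message) or ("suppress", None) for a detected sound."""
    entry = response_content_table.get(sound_id)
    if entry is None:
        # unknown sound: notify it and ask how to handle it next time
        return ("notify",
                "From now on, do you need to be notified of this sound?")
    if entry["notification_availability"]:
        return ("notify", entry["notification_message"])
    return ("suppress", None)
```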

Further, after notifying the user of the operation state of the apparatus, the notification unit 142 updates the response content associated with the sensing information stored in the storage unit 130 based on the received reaction of the user.

Specifically, the notification unit 142 updates the setting indicating whether to notify the user of the operation state of the device associated with the detected sensed information (for example, information stored in the "notification availability" entry shown in fig. 3) based on the received reaction of the user.

More specifically, the notification unit 142 recognizes the received voice of the user, and updates the setting indicating whether to notify the user of the operation state of the device associated with the detected sensing information, based on the reaction of the user according to the voice recognition result. For example, if the notification unit 142 receives an affirmative reaction, such as "thank you", from the user who is notified of the operation state of the apparatus, the notification unit 142 updates (or holds) the setting so that the user is notified of the operation state associated with the subject notification sound as in the past. Alternatively, if the notification unit 142 receives a negative reaction, such as "notification unnecessary", from the user who is notified of the operation state of the apparatus, the notification unit 142 updates the setting so that the user is not notified of the operation state associated with the subject notification sound from now on.
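The update based on the user's reaction can be sketched as follows. The keyword lists stand in for a real speech recognition and NLU result and are assumptions of this sketch.

```python
def update_response_content(table, sound_id, user_reaction):
    """Update the "notification availability" entry for sound_id from the
    user's recognized reply; unrecognized replies leave it unchanged."""
    positive = {"thank you", "thanks"}                       # affirmative replies
    negative = {"notification unnecessary", "do not notify"} # negative replies
    reaction = user_reaction.strip().lower()
    if reaction in positive:
        table[sound_id]["notification_availability"] = True   # keep notifying
    elif reaction in negative:
        table[sound_id]["notification_availability"] = False  # stop notifying
    return table
```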

Further, when the notification unit 142 notifies the user of the operation state of the apparatus, the notification unit 142 may also notify the user of information related to the installation position of the apparatus together with the operation state. For example, if a plurality of information processing apparatuses 100 are installed in the user's home, each information processing apparatus 100 can store the installation location of each device (information indicating a category such as the user's home, kitchen, or toilet). Then, when the notification unit 142 notifies the user of the notification sound, the notification unit 142 also notifies the installation position of the information processing apparatus 100 that the operation state of the device has been detected. Specifically, the notification unit 142 notifies the user of the notification sound together with a message indicating, for example, "output such a sound from the kitchen". Therefore, the user can roughly predict which home appliance 10 has emitted the notification sound. Further, as described above, the information processing apparatus 100 according to the present disclosure sometimes performs information processing according to the present disclosure in cooperation with a plurality of devices. In this case, the device that has judged whether or not to notify the user of the operation state of the device and the device that notifies the user of the operation state may also be different devices. That is, the notification processing executed by the notification unit 142 includes not only processing in which the own device transmits a notification to the user, but also processing in which the own device controls another device and causes the other device to transmit the notification to the user.

The UI unit 143 is a processing unit that transmits and receives information to and from the user. For example, the UI unit 143 serves as an interface that outputs information (sound information about a notification sound or the like) notified by the notification unit 142 and receives a voice input of the user.

Further, the UI unit 143 includes a mechanism for outputting various information. For example, the UI unit 143 may further include a speaker for outputting sound or a display for outputting video images. For example, the UI unit 143 outputs the notification generated by the notification unit 142 to the user by voice. Further, the UI unit 143 may also convert the notification to the user generated by the notification unit 142 into screen display (image data), and output the converted image to the display. For example, the UI unit 143 may also display video image data together with voice, in which a message generated by the notification unit 142 is displayed in a text mode. In addition, the UI unit 143 may also give a notification to the user by voice, for example, and output the image acquired by the detection unit 141 to a display.

[1-3. flow of information processing according to the first embodiment ]

The flow of information processing according to the first embodiment will be described below with reference to fig. 4. Fig. 4 is a schematic diagram of a process flow according to the first embodiment.

As shown in fig. 4, the information processing apparatus 100 determines whether or not a notification sound emitted from the home appliance 10 is detected (step S101). In a case where the notification sound is not detected (no in step S101), the information processing apparatus 100 waits until the notification sound is detected.

In contrast, in the case where a notification sound is detected (yes in step S101), the information processing apparatus 100 checks the detected notification sound with the notification sound stored in the storage unit 130 (step S102).

Then, the information processing apparatus 100 determines whether the detected notification sound matches the notification sound stored in the storage unit 130 (step S103). If the two notification sounds match (yes in step S103), the information processing apparatus 100 determines whether the subject notification sound is set to be able to be notified to the user (step S104).

If the notification sound is set to be able to be notified to the user (for example, in the case of "yes" in the item of "notification availability" shown in fig. 3) (yes in step S104), the information processing apparatus 100 notifies the user of the operation state of the home appliance 10 based on the response content stored in the storage unit 130 (step S105). In contrast, if the notification sound is not set so as to be able to be notified to the user (no in step S104), the information processing apparatus 100 ends the processing without notifying the user.

Further, if the detected notification sound does not match the notification sound stored in the storage unit 130 (no in step S103), the information processing apparatus 100 asks the user what response is desired when the subject notification sound is detected in the future (step S106).

Then, the information processing apparatus 100 associates the response of the user with the detected sound (the notification sound detected in step S101) and newly stores the associated information in the storage unit 130 (step S107).
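Steps S103 to S107 described above can be sketched as one function, assuming that detection (S101) and collation (S102) have already produced a sound ID. The function and callback names are illustrative assumptions.

```python
def process_notification_sound(detected_id, table, ask_user):
    """One pass of the Fig. 4 flow; ask_user is a hypothetical callback that
    asks the user which response is desired for an unknown sound."""
    entry = table.get(detected_id)
    if entry is not None:                        # S103: sounds match
        if entry["notification_availability"]:   # S104: notification enabled?
            return "notified"                    # S105: notify the user
        return "not_notified"                    # S104 "no": end silently
    desired = ask_user(detected_id)              # S106: query the user
    table[detected_id] = desired                 # S107: store the association
    return "registered"
```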

[1-4. variants according to the first embodiment ]

Various modifications may be made to the information processing according to the first embodiment described above. A modification of the first embodiment will be described below.

For example, the information processing apparatus 100 need not have all of the components shown in fig. 2. For example, the information processing apparatus 100 need not have the response content table 131 shown in fig. 3. In this case, the information processing apparatus 100 can also access an external server or the like that holds information associated with the response content table 131 via a network, and can also acquire information associated with the response content table 131.

Further, the information processing apparatus 100 can also access an external server or the like and appropriately update the contents held by the response content table 131. For example, if the information processing apparatus 100 receives registration of the home appliance 10 used by the user, the information processing apparatus 100 may also acquire data on the notification sound associated with the home appliance 10 from an external server or the like.

(2. second embodiment)

[2-1. example of information processing according to the second embodiment ]

The second embodiment will be described below. Fig. 5 is a schematic diagram of an example of information processing according to the second embodiment. The information processing according to the second embodiment is performed by the information processing apparatus 100A shown in fig. 5. In the second embodiment, the information processing apparatus 100A detects a notification sound emitted from each of the home appliance 10A and the home appliance 10B. In addition, in the example shown in fig. 5, the home appliance 10A is a washing machine, and the home appliance 10B is an electric rice cooker. Further, in the following description, the information processing apparatus 100 according to the first embodiment and the information processing apparatus 100A according to the second embodiment are simply referred to as the information processing apparatus 100 when they are not required to be distinguished from each other. Further, in the following description, the home appliance 10 according to the first embodiment and the home appliance 10A or 10B according to the second embodiment are simply referred to as the home appliance 10 when they are not required to be distinguished from each other.

In the example shown in fig. 5, the detection unit 141 of the information processing apparatus 100A detects an electronic sound emitted from the home appliance 10A or the home appliance 10B (step S11 and step S12). At this time, the information processing apparatus 100A detects the installation direction or position of the home appliance 10A or the home appliance 10B using, for example, an array microphone or the like. Further, if the detected direction is within the field of view of the camera, the information processing apparatus 100A performs object recognition on the camera image. Thus, the information processing apparatus 100A identifies which of the home appliance 10A and the home appliance 10B emitted the detected notification sound.

After that, the information processing apparatus 100A refers to the information stored in the storage unit 130 (step S13). Specifically, the information processing apparatus 100A refers to the object tag information (e.g., information indicating which home appliance is associated with the result of image recognition) stored in the storage unit 130. Then, the information processing apparatus 100A transmits, to the notification unit 142, information that the notification sounds detected in step S11 and step S12 are associated with the home appliance 10A and the home appliance 10B that emitted the subject notification sound, respectively (step S14). The processing of step S15 and subsequent processing are the same as those described in the first embodiment; therefore, a description thereof will be omitted.

That is, in the second embodiment, the information processing apparatus 100A identifies the home appliance 10A or the home appliance 10B associated with the sensed information by image recognition, and then notifies the user of the operation state of the home appliance 10A or the home appliance 10B together with information about the identified home appliance 10A or the home appliance 10B.

Therefore, the information processing apparatus 100A can notify the user of the operation state of the home appliance 10A or the home appliance 10B in more detail. Specifically, the information processing apparatus 100A can notify the user of information indicating a target to emit a notification sound such as "output such a sound from an electric rice cooker" together with the notification sound. That is, the information processing apparatus 100A can further improve the convenience of the user using the plurality of home appliances 10.

[2-2. configuration of information processing apparatus according to second embodiment ]

Fig. 6 is a schematic diagram of a configuration example of an information processing apparatus 100A according to the second embodiment. Compared to the first embodiment, the information processing apparatus 100A further includes a device information table 132.

The device information table 132 stores therein information relating to devices (home appliances). Fig. 7 is a schematic diagram of an example of a device information table according to the second embodiment. In the example shown in fig. 7, the device information table 132 has entries of "device ID", "device type", "image identification data", and the like.

The "device ID" indicates identification information for identifying a device. In addition, in the present specification, it is assumed that the same reference numerals are used for the device ID and the home appliance 10. For example, the device identified by the device ID "10A" represents "home appliance 10A".

The "device type" indicates the type of the device. The type of the device indicates information classified by, for example, the attribute or characteristic of the home appliance 10. Specifically, the types of devices are categories of the home appliances 10, such as "washing machine", "rice cooker", and "refrigerator".

The "image recognition data" indicates data obtained as a result of image recognition. For example, in image recognition, an object is attached with information indicating that the object included in the image is recognized as a "washing machine" or "electric rice cooker". The image recognition data is data indicating such an image recognition result. In the example shown in fig. 7, the image recognition data item is conceptually represented as "C01"; however, actually, specific data or the like indicating the object extracted as the image recognition result or the type of the recognized object is stored in the entry of the image recognition data. For example, if data indicated by "C01" is obtained by image recognition performed by the information processing apparatus 100A, the information processing apparatus 100A can specify that the object associated with the data is a device identified by the device ID "10A" (the home appliance 10A in this example) by referring to the device information table 132.
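Resolving an image recognition result back to a device via a table shaped like the device information table 132 of fig. 7 can be sketched as follows; the dictionary layout is an assumption of this sketch, while the IDs follow the figure.

```python
# Hypothetical in-memory stand-in for the device information table 132
device_info_table = {
    "10A": {"device_type": "washing machine", "image_recognition_data": "C01"},
    "10B": {"device_type": "rice cooker",     "image_recognition_data": "C02"},
}

def device_from_recognition(data):
    """Return (device ID, device type) for recognition data, or None."""
    for device_id, info in device_info_table.items():
        if info["image_recognition_data"] == data:
            return device_id, info["device_type"]
    return None
```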

That is, fig. 7 shows, as an example of the information registered in the device information table 132, the home appliance 10A identified by the device ID "10A", whose device type is "washing machine" and whose image recognition data is "C01".

The description is continued with reference again to fig. 6. As described above, the information processing apparatus 100A according to the second embodiment performs the direction recognition or the image recognition on the home appliance 10A or the home appliance 10B, and also performs the image recognition on the user.

For example, the notification unit 142 according to the second embodiment identifies a device associated with the sensed information by performing image recognition, and notifies the user of the operation state of the device together with information about the identified device.

Specifically, the notification unit 142 notifies the user of at least one of the type of the device, the name of the device, and the installation position of the device, along with the operation state of the device. For example, the control unit 140 notifies the user of the type or name of the home appliance 10 (e.g., "refrigerator", "rice cooker", etc.) that has emitted the notification sound, or notifies the user of the location (e.g., "kitchen", "toilet", etc.) where the home appliance 10 is placed.

Further, the detection unit 141 according to the second embodiment may also detect not only information about the device but also information about the user using the sensor 120. Specifically, the detection unit 141 detects the location of the user in the user's home. Then, the detection unit 141 verifies whether the user is in the vicinity of the home appliance 10 that emits the notification sound.

Then, based on the detected positional relationship between the position where the user is located and the device associated with the sensed information, the notification unit 142 according to the second embodiment can also determine whether to notify the user of the operation state of the device.

Specifically, based on the distance between the detected user position and the position at which the device associated with the sensed information is installed, the notification unit 142 determines whether to notify the user of the operation state of the device. For example, the detection unit 141 detects a distance between the user and the home appliance 10 using a sensor 120 (e.g., a depth sensor) capable of measuring the distance. Alternatively, the detection unit 141 estimates the distance between the user included in the same image and the home appliance 10 by performing the image recognition process.

Then, if the distance between the user and the home appliance 10 exceeds a predetermined threshold (e.g., 10 meters, etc.), the notification unit 142 notifies the user of a notification sound emitted by the home appliance 10 (i.e., the operation state of the home appliance 10). In contrast, if the distance between the user and the home appliance 10 does not exceed the predetermined threshold, the notification unit 142 does not need to notify the user of the notification sound emitted by the home appliance 10.
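The distance-based decision above can be sketched as follows. The 10 meter threshold follows the text; representing positions as (x, y) coordinates in meters is an assumption of this sketch.

```python
import math

def should_notify_by_distance(user_pos, appliance_pos, threshold_m=10.0):
    """Notify only when the user is farther than the threshold from the
    home appliance; positions are hypothetical (x, y) tuples in meters."""
    dx = user_pos[0] - appliance_pos[0]
    dy = user_pos[1] - appliance_pos[1]
    return math.hypot(dx, dy) > threshold_m
```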

That is, the notification unit 142 detects the positional relationship between the home appliance 10 and the user, and then determines whether or not to give a notification to the user. Therefore, the user can avoid a troublesome situation such as a situation in which a notification of the operation state of the home appliance 10 located very close to the user is received from the information processing apparatus 100A. In contrast, for the home appliance 10 whose operation state is difficult for the user to visually recognize, the user can know the operation state via the information processing apparatus 100A. In this way, the information processing apparatus 100A can realize the notification processing with a high degree of satisfaction of the user.

Further, the detection unit 141 may detect not only the distance between the user and the home appliance 10 but also further detailed information. For example, the detection unit 141 may also detect the orientation of the face or the orientation of the body of the user through known image recognition processing. Then, the notification unit 142 may also determine whether to notify the user of the operation state of the home appliance 10 according to the orientation of the face or body of the user at the time when the home appliance 10 issues information indicating the operation state, or at the time when that information is detected as the sensed information.

Specifically, when the home appliance 10 emits the notification sound, if the face or body of the user faces the direction of the home appliance 10, the notification unit 142 determines that the user recognizes the notification sound emitted by the home appliance 10. In this case, the notification unit 142 determines that the user does not need to be notified of the operation state of the home appliance 10 again, and does not give the user notification. In contrast, when the home appliance 10 emits the notification sound, if the face or body of the user does not face the direction of the home appliance 10, the notification unit 142 determines that the user does not recognize the notification sound emitted by the home appliance 10. In this case, the notification unit 142 determines that it is necessary to notify the user of the operation state of the home appliance 10 and gives the user a notification. In this way, the information processing apparatus 100A can execute notification processing according to the situation of the user at that time.

Further, the notification unit 142 may also determine whether to give a notification based not only on the orientation of the face or body of the user but also on the position area of the user. For example, the notification unit 142 may also determine that notification to the user is not necessary during a period of time in which both the home appliance 10 and the user are within the angle of view of the camera (i.e., a case in which the home appliance 10 and the user are included in the same image). At this time, the notification unit 142 may also set a predetermined buffer time for which notification to the user is not necessary, for example, within a predetermined period of time (e.g., several seconds) after the user leaves the shooting range. Further, the notification unit 142 may also determine that notification to the user is not necessary if a predetermined time has elapsed after the user leaves the shooting range (outside the shooting frame).

Further, even if the home appliance 10 and the user are within the same shooting frame, for example because the camera is a wide-angle camera, the notification unit 142 may also determine to give the user a notification if the distance between the user and the home appliance 10 exceeds a predetermined distance.

Further, if the notification unit 142 determines that the state in which the user closes his or her eyes has continued longer than a predetermined period of time (i.e., determines that the user is in a sleep state) based on face recognition processing performed on the user, the notification unit 142 may also determine that notification is not necessary. Further, even if the information processing apparatus 100A does not have a camera, the notification unit 142 can implement processing similar to the above by performing speaker recognition on the voice of the speaker, state determination processing, or the like.

[2-3. flow of information processing according to the second embodiment ]

The flow of information processing according to the second embodiment will be described below with reference to fig. 8. Fig. 8 is a schematic diagram of a process flow according to the second embodiment.

As shown in fig. 8, the information processing apparatus 100A determines whether or not the notification sound emitted from the home appliance 10 is detected (step S201). In a case where the notification sound is not detected (no in step S201), the information processing apparatus 100A waits until the notification sound is detected.

In contrast, in the case where the notification sound is detected (yes in step S201), the information processing apparatus 100A checks the notification sound with the notification sound stored in the storage unit 130, identifies the home appliance 10 that has emitted the notification sound, and acquires information relating to the home appliance 10 (step S202).

Then, the information processing apparatus 100A determines whether the detected notification sound matches the notification sound stored in the storage unit 130 (step S203). If the two notification sounds match (yes in step S203), the information processing apparatus 100A determines whether the subject notification sound is set to be able to be notified to the user (step S204).

If the notification sound is set so as to be able to be notified to the user (yes in step S204), the information processing apparatus 100A further determines whether the user is in a position suitable for notification (step S205). For example, the information processing apparatus 100A determines whether the user is away from the home appliance 10 by a distance greater than or equal to a predetermined distance.

If the user is in a position suitable for notification (yes in step S205), the information processing apparatus 100A notifies the user of the operation state of the home appliance 10 based on the response content stored in the storage unit 130 (step S206). In contrast, if the notification sound is not set as notifiable to the user (no in step S204), or if the user is not in a position suitable for notification (no in step S205), the information processing apparatus 100A ends the processing without notifying the user.

Further, if the detected notification sound does not match the notification sound stored in the storage unit 130 (no in step S203), the information processing apparatus 100A inquires of the user what reaction is required when the subject notification sound is detected later (step S207).

Then, the information processing apparatus 100A associates the user's reply with the detected sound (the notification sound detected in step S201) and registers the associated information in the storage unit 130 (step S208).
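Under the stated assumptions, the flow of steps S201 through S208 can be sketched as follows. The class and field names (`ResponseEntry`, `notify_user`, `ask_user`) are illustrative, not taken from the source, and the spoken inquiry of step S207 is reduced to a placeholder method.

```python
from dataclasses import dataclass

@dataclass
class ResponseEntry:
    device_id: str      # identified home appliance 10
    notify_user: bool   # "notification availability" setting
    message: str        # response content to present to the user

class NotificationFlow:
    """Sketch of steps S201-S208: match a detected sound against stored entries."""

    def __init__(self):
        self.table: dict[str, ResponseEntry] = {}   # storage unit 130, keyed by sound ID

    def on_sound_detected(self, sound_id: str, user_nearby: bool) -> str:
        entry = self.table.get(sound_id)            # S202/S203: look up the stored sound
        if entry is None:
            # S207/S208: unknown sound -> ask the user and register the reply
            reply = self.ask_user(sound_id)
            self.table[sound_id] = ResponseEntry("unknown", reply != "", reply)
            return "registered"
        if not entry.notify_user:                   # S204: notification disabled
            return "suppressed"
        if not user_nearby:                         # S205: user not in a suitable position
            return "suppressed"
        return entry.message                        # S206: notify with stored content

    def ask_user(self, sound_id: str) -> str:
        # Placeholder for the spoken inquiry; a real system would use the UI unit.
        return ""
```

For a registered sound the flow returns the stored message; an unknown sound triggers the inquiry-and-register branch instead.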

[2-4. variation according to the second embodiment ]

Various modifications may be made to the information processing according to the second embodiment described above. Modifications of the second embodiment will be explained below.

For example, the information processing apparatus 100A may also mark the detected home appliance 10 or the detected notification sound. This will be described with reference to fig. 9. Fig. 9 is a schematic diagram of an example of a response content table 131B according to a modification of the second embodiment. The response content table 131B has a "tag" entry in addition to the information indicated in the response content table 131 and the device information table 132.

The "tag" entry stores, for example, information designated by the user after the notification sound has been notified to the user or after the user has been asked how the notification sound should be handled. That is, in the example of information registered in the response content table 131B in fig. 9, the notification sound with the notification sound ID "A11" is the notification sound emitted by the home appliance 10A identified by the device ID "10A", and the device type of the home appliance 10A is "washing machine". Further, for the notification sound with the notification sound ID "A11", the notification availability is "yes", the notification message is "B11", and the tag is "washing end".

For example, if the information processing apparatus 100A detects that the home appliance 10A has emitted the notification sound, the information processing apparatus 100A makes an inquiry about the notification sound to the user together with the recognition result for the home appliance 10A. Specifically, the information processing apparatus 100A makes an inquiry such as "The following sound was output by the home appliance 10A. Do you want to be notified of this sound from now on?"; if the user gives a reply such as "Let me know about 'washing end'", the information processing apparatus 100A associates the notification sound with a tag conforming to the reply.

Thereafter, if the information processing apparatus 100A detects the same notification sound, it refers to the response content table 131B and recognizes that the notification sound is tagged "washing end". Then, when the notification sound is detected, the information processing apparatus 100A outputs to the user a notification message corresponding to the tag, such as "Washing has been completed". At this time, the information processing apparatus 100A may output the notification sound together with the message, or may omit outputting the notification sound itself.

In this way, when the information processing apparatus 100A notifies the user of the operation state of the device, it also notifies the user of the information with which the sensed information was previously tagged. That is, the information processing apparatus 100A can not only identify the home appliance 10A or the home appliance 10B that emitted the notification sound, but can also tag the notification sound emitted by that appliance. Therefore, compared with merely being notified of the sound, the user receives a notification converted by the tag into easily recognizable information.

(3. third embodiment)

The third embodiment will be described below. Fig. 10 is a schematic diagram of an example of information processing according to the third embodiment. The information processing according to the third embodiment is performed by the information processing apparatus 100C shown in fig. 10. As shown in fig. 10, in the third embodiment, the information processing apparatus 100C includes a temporary storage area 133 in the storage unit 130. The flow of information processing according to the third embodiment will be described below with reference to fig. 10. Further, description of the processing described in the first embodiment or the second embodiment will be omitted.

The information processing apparatus 100C detects the notification sound emitted by the home appliance 10 (step S21). The information processing apparatus 100C transmits the detected notification sound to the notification unit 142 (step S22). The information processing apparatus 100C refers to the storage unit 130 (step S23), and then transmits the content to be notified according to the content stored in the storage unit 130 to the UI unit 143 (step S24). At this time, it is assumed that the information processing apparatus 100C stores the notification sound detected in step S21 in the temporary storage area 133 of the storage unit 130.

In the example shown in fig. 10, it is assumed that the notification sound detected in step S21 is one that is not to be notified to the user (its "notification availability" is "no"). In this case, the information processing apparatus 100C performs no display and does not notify the user of the content of the notification sound (step S25).

Here, it is assumed that the user hears the notification sound emitted by the home appliance 10 and wants the information processing apparatus 100C to notify him or her of it. In this case, the user expresses a request to the information processing apparatus 100C such as, for example, "Let me know about the sound just made from now on" (step S26).

The information processing apparatus 100C transmits the request to the notification unit 142 (step S27). The information processing apparatus 100C then accesses the storage unit 130, refers to the notification sound stored in the temporary storage area 133, and updates the response content associated with the subject notification sound. Specifically, the information processing apparatus 100C updates the notification availability setting from "no" to "yes".

In this way, the information processing apparatus 100C according to the third embodiment stores the notification sound in the temporary storage area 133 and then waits for an instruction of the user for a certain period of time (for example, for one minute). Then, if an instruction is received from the user, the information processing apparatus 100C updates the response content of the notification sound stored in the temporary storage area 133 in accordance with the instruction received from the user. Therefore, the information processing apparatus 100C can perform flexible learning on various requests received from the user.
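The temporary storage area and the instruction window can be sketched as follows. The one-minute window follows the example in the text; the class and method names are assumptions, and time is passed in explicitly rather than read from a clock.

```python
class TemporaryStore:
    """Sketch of the temporary storage area 133: hold the last detected sound
    for a short window so a later user instruction can update its settings."""

    WINDOW_SEC = 60.0   # "for example, for one minute", as in the text

    def __init__(self):
        self._sound_id = None
        self._stored_at = 0.0
        self.settings: dict[str, bool] = {}   # sound ID -> notification availability

    def remember(self, sound_id: str, now: float) -> None:
        """Store the detected sound in the temporary area."""
        self._sound_id, self._stored_at = sound_id, now

    def apply_user_instruction(self, enable: bool, now: float) -> bool:
        """Update the remembered sound's setting if the instruction arrives in time."""
        if self._sound_id is None or now - self._stored_at > self.WINDOW_SEC:
            return False                      # nothing remembered, or window expired
        self.settings[self._sound_id] = enable   # e.g. availability "no" -> "yes"
        return True
```

A request arriving within the window flips the setting; a late request is ignored, matching the bounded waiting described above.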

(4. other embodiments)

The processing according to the above embodiments may also be performed in various embodiments different from the above embodiments.

[4-1. abnormal Sound detection ]

For example, the information processing apparatus 100 can detect not only the notification sound emitted by the home appliance 10 but also information related to various other notifications. For example, if the information processing apparatus 100 detects, as the sensed information, an abnormal sound indicating that the operation state of the home appliance 10 is abnormal, the information processing apparatus 100 may notify the user that the abnormal sound was detected together with the operation state of the home appliance 10. The abnormal sound mentioned here is, for example, a sound whose sound pressure level exceeds a predetermined threshold relative to the normal operating sound. If an abnormal sound is detected, the information processing apparatus 100 may notify the user with an alarm such as "The washing machine is making a sound that is not usually heard".
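A minimal sketch of the abnormal-sound criterion, assuming sound levels in decibels; the 20 dB default margin is an illustrative threshold, not a value from the source.

```python
def is_abnormal(sound_pressure_db: float, normal_level_db: float,
                margin_db: float = 20.0) -> bool:
    """Flag a sound whose level exceeds the normal operating sound by a margin.

    The margin plays the role of the "predetermined threshold relative to
    the normal operating sound" in the text; 20 dB is an assumed default.
    """
    return sound_pressure_db - normal_level_db > margin_db
```

A washing machine normally running at 50 dB that suddenly produces 75 dB would be flagged; ordinary fluctuations would not.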

Further, the information processing apparatus 100 can also detect information other than sound as the sensed information. For example, the information processing apparatus 100 may detect, as the sensed information, at least one of light, temperature, humidity, odor, vibration, and carbon dioxide concentration observed around the home appliance 10. For example, the information processing apparatus 100 detects light, temperature, and the like emitted by the home appliance 10 using the various sensors 120, and notifies the user based on the detected information. For example, the information processing apparatus 100 gives a notification to the user based on information detected by an odor sensor, an image sensor, an optical sensor, a tactile sensor, a vibration sensor, a temperature sensor, a humidity sensor, a carbon dioxide concentration sensor, or the like.

Further, the information processing apparatus 100 may also refer to a data table obtained by defining whether the operation state of the home appliance 10 indicates an abnormal state, and then notify the user that the abnormal state has been detected. This will be described with reference to fig. 11. Fig. 11 is a diagram of an example of a response content table 131C according to another embodiment.

The response content table 131C has entries such as "detection condition" compared with the first to third embodiments. The "detection condition" indicates a condition in which information detected by the sensor 120 is detected as sensed information.

For example, the example shown in fig. 11 indicates that the subject information is detected as the sensing information under the detection conditions of "in the case where the temperature (of a certain household appliance 10) exceeds 40 °," in the case where the odor index (of a certain household appliance 10) exceeds 300 ", and the like.

The information processing apparatus 100 refers to the response content table 131C, and when sensing information is detected, notifies the user of the content of the sensing information together with the tag. For example, the information processing apparatus 100 notifies the user of a notification message such as "please check because the temperature of the home appliance 10 is abnormally high" together with the temperature detected around the home appliance 10. Therefore, the information processing apparatus 100 can appropriately notify the user of the abnormal operation state of the home appliance 10.
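The evaluation of detection conditions against sensor readings might be sketched as follows. The thresholds follow the examples in fig. 11; the reading keys and notification messages are illustrative assumptions.

```python
# Illustrative detection conditions modeled on the response content table 131C
# of fig. 11: (sensor reading key, condition, notification message).
CONDITIONS = [
    ("temperature",
     lambda v: v > 40,
     "Please check: the temperature of the home appliance is abnormally high."),
    ("odor_index",
     lambda v: v > 300,
     "Please check: an unusual odor level was detected near the home appliance."),
]

def evaluate(readings: dict[str, float]) -> list[str]:
    """Return the notification messages whose detection conditions are met."""
    messages = []
    for key, predicate, message in CONDITIONS:
        value = readings.get(key)
        if value is not None and predicate(value):   # condition met -> sensed information
            messages.append(message)
    return messages
```

Readings within normal ranges produce no messages, so the user is only disturbed when a condition in the table is actually met.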

Further, the detection conditions for determining the abnormal state may be installed in the information processing apparatus 100 at the time of initial shipment, may be updated by receiving input from the user, or may be updated by an external server or the like provided by the manufacturer of the home appliance 10.

[4-2. Notification according to user Attribute ]

Further, the information processing apparatus 100 can also recognize the user and give a notification according to the user. That is, the information processing apparatus 100 may also detect attributes of a user located near the information processing apparatus 100, and determine whether to notify the user of the operation state of the device according to the detected attributes of the user.

In this case, the information processing apparatus 100 includes, for example, the user information table 134 shown in fig. 12. The user information table 134 stores information related to the users who use the information processing apparatus 100. Fig. 12 is a diagram illustrating an example of the user information table 134 according to another embodiment.

In the example shown in fig. 12, the user information table 134 has entries of "user ID", "attribute", "notification setting", and the like.

The "user ID" indicates identification information for identifying a user. The "attribute" indicates various information about the user registered by the user when starting to use the information processing apparatus 100. For example, the attribute includes attribute information (a user profile) such as the age, sex, place of residence, and family structure of the user. Further, the attribute is not limited to information registered by the user, and may include information automatically recognized by the information processing apparatus 100. For example, the attribute may include information estimated through image recognition performed by the information processing apparatus 100, such as whether the user is a child, or the user's estimated sex.

The "notification setting" indicates setting information indicating whether reception of a notification from the information processing apparatus 100 is desired. In the example shown in fig. 12, the entry of the notification setting is conceptually represented as "F01"; however, actually, setting information indicating whether each user desires to receive the notification is stored in the notification setting entry for each notification sound or for each type of the home appliance 10.

That is, in the example shown in fig. 12, for the user identified by the user ID "U01", the attribute is indicated as "male, adult", and the notification is set to "F01".

When the information processing apparatus 100 detects the notification sound, the information processing apparatus 100 refers to the user information table 134 and checks the notification setting of the user who is in the vicinity of the information processing apparatus 100. Then, the information processing apparatus 100 determines whether or not to give a notification to the subject user according to the notification setting generated for each user. Therefore, the information processing apparatus 100 can give a notification according to each user.

Further, the information processing apparatus 100 may use various known techniques as a method for detecting a user in the vicinity of the information processing apparatus 100. For example, the information processing apparatus 100 detects a nearby user using a biosensor that detects the position of a living body based on information emitted by the living body. Specifically, the biosensor is, for example, an infrared sensor (thermography) that detects the temperature (body temperature) of a living body, or an image sensor (camera) for performing image recognition on a living body. Further, the information processing apparatus 100 may also use a distance measurement sensor or the like that measures the distance to the user. The distance measurement sensor is, for example, an optical distance sensor or an ultrasonic sensor that measures the distance to a living body by emitting light or ultrasonic waves. Further, for the distance measurement sensor, a technique such as light detection and ranging, or laser imaging, detection, and ranging (LiDAR) may also be used. Further, in order to measure the distance between the information processing apparatus 100 and the user, a technique such as simultaneous localization and mapping (SLAM) provided in the information processing apparatus 100 may also be used.

[4-3. Notification according to the use status of the device ]

Further, the information processing apparatus 100 can also acquire the use state of the information processing apparatus 100 that outputs the notification, and then can also output the notification according to the acquired use state. For example, the information processing apparatus 100 may also control display of a notification on a display unit such as a display. Specifically, the information processing apparatus 100 can also control the notification according to the voice reproduced by the information processing apparatus 100 that gives the notification or according to the displayed image.

For example, it is conceivable that a smart speaker is placed near the home appliance 10 and that a television set on which a user watches a broadcast program serves as the information processing apparatus 100 that outputs the notification. If the smart speaker placed near the home appliance 10 detects the notification sound output by the home appliance 10, the information processing apparatus 100 refrains from displaying the notification while the broadcast program is displayed on the television screen, and outputs the notification when the broadcast program switches to a commercial. Further, control may also be performed so that the notification is displayed at a position that does not block the view of the displayed content. Further, for example, if the information processing apparatus 100 that outputs the notification is a smartphone, and if it is determined that displaying a large notification image on the screen would be obtrusive, processing of displaying the notification as an icon may be performed. The process of acquiring these usage states may be performed based on information about an application running on the information processing apparatus 100, or based on image analysis of the content displayed on the screen.
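The usage-state-dependent output control might be sketched as follows. The device and state names are illustrative assumptions; the source does not enumerate them.

```python
def choose_output(device: str, state: str) -> str:
    """Pick how to present a notification given the output device's usage state.

    Assumed states: for a TV, "program" or "commercial"; for a smartphone,
    "in_use" or "idle". Returned actions are illustrative labels.
    """
    if device == "tv":
        # Defer while a broadcast program is on screen; show during commercials.
        return "show_now" if state == "commercial" else "defer"
    if device == "smartphone":
        # A full-screen banner would obstruct the user; fall back to an icon.
        return "icon" if state == "in_use" else "show_now"
    return "show_now"
```

The notification is thus shaped by, rather than interrupting, whatever the output device is currently presenting.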

The above examples are merely illustrative and do not exclude other embodiments conceivable based on the present invention, such as a case where the information processing apparatus 100 that outputs the notification by voice is a different type of device (for example, a smart speaker reproducing voice content), or a case where the information processing apparatus 100 that outputs the notification is a smartphone reproducing the broadcast program. Further, the embodiment may also be implemented in the case where the information processing apparatus 100 that detects the notification sound of the home appliance 10 and the information processing apparatus 100 that outputs the notification are the same device.

[4-4. arrangement of the respective apparatuses ]

In the above-described embodiments, an example has been described in which the information processing apparatus 100 is a so-called smart speaker, smartphone, television set, or tablet terminal and performs the processing in a stand-alone manner. However, the information processing apparatus 100 may also perform the information processing according to the present disclosure in cooperation with a server device (a so-called cloud server or the like) connected through a network. Further, for example, the information processing according to the present disclosure may be implemented by a smart speaker and a smartphone cooperating with each other. In this case, for example, the information processing may be performed such that a smartphone held by the user gives a notification based on a notification sound detected by the smart speaker. In addition, the information processing may be performed such that a television set viewed by the user gives a notification, by voice or by display on a screen, based on a notification sound emitted by a microwave oven and detected by a refrigerator having an agent function.

Further, the information processing apparatus 100 according to the present disclosure can also be realized by means of, for example, an IC chip mounted on a smartphone or the like.

[4-5. mode of information processing System ]

Further, the information processing system 1 according to the present disclosure may include various modifications. For example, if the information processing apparatus 100 is an internet of things (IoT) device or the like, the information processing according to the present disclosure may also be implemented by a client (IoT device) and an external server (cloud server) or the like that cooperate with each other. Conceivable examples of modes of the information processing system 1 are listed below. Further, in the examples described below, each device includes an input unit, a processing unit, and an output unit. The input unit and the output unit correspond to, for example, the communication unit 122 shown in fig. 2. Further, the processing unit corresponds to, for example, the control unit 140 shown in fig. 2. In the modifications described below, a modification of the information processing system is referred to as "system 2". Further, modifications of the information processing apparatus 100 are referred to as "information processing apparatus 11", "information processing apparatus 12", or "information processing apparatus 13". In addition, variations of the information processing apparatus 11 and the like are referred to as "information processing apparatus 11a", "information processing apparatus 11b", "information processing apparatus 11c", and the like.

(first example)

Fig. 13 is a block diagram of a first example of a system configuration according to an embodiment of the present disclosure. Referring to fig. 13, the system 2 includes an information processing apparatus 11. The input unit 200, the processing unit 300, and the output unit 400 are all implemented in the information processing apparatus 11. The information processing apparatus 11 may be a terminal device or a server, as described below. In the first example, in order to realize the functions according to the embodiment of the present disclosure, the information processing apparatus 11 may be a stand-alone device that does not communicate with external devices via a network. Further, the information processing apparatus 11 may communicate with external devices for other functions, and therefore does not necessarily have to be a stand-alone device. Each of the interface 250a between the input unit 200 and the processing unit 300 and the interface 450a between the processing unit 300 and the output unit 400 may be an interface inside the device.

In the first example, the information processing apparatus 11 may be, for example, a terminal device. In this case, the input unit 200 may include an input device, a sensor, and software acquiring information from an external service. The software that acquires information from the external service acquires data from, for example, application software of a service executed by the terminal apparatus. The processing unit 300 is realized by a processor or a processing circuit provided in a terminal apparatus that operates according to a program stored in a memory or a storage device. The output unit 400 may include an output device, a control device, and software providing information to an external service. The software that provides information to the external service may provide information to application software of a service executed in, for example, a terminal device.

Alternatively, in the first example, the information processing apparatus 11 may be a server. In this case, the input unit 200 may include software that acquires information from an external service. The software that acquires information from the external service acquires data from, for example, a server of the external service (which may also be the information processing apparatus 11 itself). The processing unit 300 is implemented by a processor or a processing circuit included in the server, operating according to a program stored in a memory or a storage device. The output unit 400 may include software that provides information to an external service. The software that provides information to the external service provides the information to, for example, a server of the external service (which may also be the information processing apparatus 11 itself).

(second example)

Fig. 14 is a block diagram of a second example of a system configuration according to an embodiment of the present disclosure. Referring to fig. 14, the system 2 includes information processing apparatuses 11 and 13. The input unit 200 and the output unit 400 are implemented in the information processing apparatus 11. In contrast, the processing unit 300 is implemented in the information processing apparatus 13. The information processing apparatus 11 and the information processing apparatus 13 communicate via a network to realize functions according to the embodiments of the present disclosure. The interface 250b between the input unit 200 and the processing unit 300 and the interface 450b between the processing unit 300 and the output unit 400 may be communication interfaces between devices.

In the second example, the information processing apparatus 11 may be, for example, a terminal device. In this case, similarly to the first example described above, the input unit 200 may include an input device, a sensor, and software that acquires information from an external service. Similarly, the output unit 400 may include an output device, a control device, and software that provides information to an external service. Alternatively, the information processing apparatus 11 may be a server that exchanges information with an external service. In this case, the input unit 200 may include software that acquires information from the external service, and the output unit 400 may include software that provides information to the external service.

Further, in the second example, the information processing apparatus 13 may be a server or a terminal device. The processing unit 300 is realized by a processor or a processing circuit included in the information processing apparatus 13, operating according to a program stored in a memory or a storage device. The information processing apparatus 13 may be a dedicated device, for example, a server. In this case, the information processing apparatus 13 may be installed in a data center or in a home. Alternatively, the information processing apparatus 13 may function as a terminal device with regard to other functions; however, with regard to the functions according to the embodiment of the present disclosure, the information processing apparatus 13 may be a device that implements neither the input unit 200 nor the output unit 400. In the examples described below, the information processing apparatus 13 may be a server in the above sense or may be a terminal device.

For example, consider a case where the information processing apparatus 11 is a wearable device and the information processing apparatus 13 is a mobile device connected to the wearable device by Bluetooth (registered trademark) or the like. The wearable device receives an input of a user operation (input unit 200), the mobile device performs processing based on a request transmitted according to the operation input (processing unit 300), and the wearable device outputs the result of the processing (output unit 400); in this case, the wearable device corresponds to the information processing apparatus 11 of the second example described above, and the mobile device corresponds to the information processing apparatus 13.

(third example)

Fig. 15 is a block diagram of a third example of a system configuration according to an embodiment of the present disclosure. Referring to fig. 15, the system 2 includes information processing apparatuses 11a, 11b, and 13. The input unit 200 is implemented in the information processing apparatus 11 a. The output unit 400 is implemented in the information processing apparatus 11 b. Further, the processing unit 300 is implemented in the information processing apparatus 13. The information processing apparatuses 11a and 11b communicate with the information processing apparatus 13 via a network to realize the functions according to the embodiments of the present disclosure. The interface 250b between the input unit 200 and the processing unit 300 and the interface 450b between the processing unit 300 and the output unit 400 may be communication interfaces between devices. However, in the third example, since the information processing apparatus 11a and the information processing apparatus 11b are separate devices, each of the interfaces 250b and 450b may include a different type of interface.

In the third example, the information processing apparatuses 11a and 11b may be, for example, terminal devices. In this case, similarly to the first example described above, the input unit 200 may include an input device, a sensor, and software that acquires information from an external service. Similarly, the output unit 400 may include an output device, a control device, and software that provides information to an external service. Alternatively, one or both of the information processing apparatuses 11a and 11b may be servers that acquire information from and provide information to external services. In this case, the input unit 200 may include software that acquires information from the external service, and the output unit 400 may include software that provides information to the external service.

Further, in the third example, similarly to the above-described second example, the information processing apparatus 13 may be a server or a terminal device. The processing unit 300 is realized by a processor or a processing circuit included in the information processing apparatus 13 that operates according to a program stored in a memory or a storage device.

In the third example described above, the information processing apparatus 11a implementing the input unit 200 and the information processing apparatus 11b implementing the output unit 400 are separate devices. Thus, for example, it is possible to realize a function in which the result of processing based on an input obtained by the information processing apparatus 11a (a terminal device held or used by a first user) is output from the information processing apparatus 11b (a terminal device held or used by a second user different from the first user). It is also possible to realize, for example, a function in which the result of processing based on an input obtained by the information processing apparatus 11a (a terminal device held or used by the first user) is output from the information processing apparatus 11b (a terminal device that the first user does not hold at that time, for example, a device installed in the first user's home while the first user is away). Alternatively, the information processing apparatuses 11a and 11b may both be terminal devices held or used by the same user. For example, if the information processing apparatuses 11a and 11b are wearable devices worn on different parts of the user's body, or a combination of a wearable device and a mobile device, functions in which these devices cooperate with each other can be provided to the user.

(fourth example)

Fig. 16 is a block diagram of a fourth example of a system configuration according to an embodiment of the present disclosure. Referring to fig. 16, the system 2 includes information processing apparatuses 11 and 13. In the fourth example, the input unit 200 and the output unit 400 are implemented in the information processing apparatus 11. In contrast, the processing unit 300 is implemented in a separate manner by the information processing apparatus 11 and the information processing apparatus 13. The information processing apparatus 11 communicates with the information processing apparatus 13 via a network to realize the functions according to the embodiments of the present disclosure.

As described above, in the fourth example, the processing unit 300 is implemented separately between the information processing apparatus 11 and the information processing apparatus 13. More specifically, the processing unit 300 includes the processing units 300a and 300c implemented in the information processing apparatus 11, and the processing unit 300b implemented in the information processing apparatus 13. The processing unit 300a performs processing based on information supplied from the input unit 200 via the interface 250a, and then supplies the result of the processing to the processing unit 300b. In this sense, the processing unit 300a can be said to perform pre-processing. In contrast, the processing unit 300c performs processing based on the information supplied from the processing unit 300b, and then supplies the result of the processing to the output unit 400 via the interface 450a. In this sense, the processing unit 300c can be said to perform post-processing.
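The division of the processing unit 300 into pre-processing (300a), main processing (300b), and post-processing (300c) can be sketched as follows. This is a purely illustrative sketch, not part of the disclosed embodiment; all function names and the concrete processing steps are hypothetical, and the network hop of the interface 350b is represented by a plain function call:

```python
# Hypothetical sketch of the split processing unit 300: the terminal
# (information processing apparatus 11) performs pre-processing (300a)
# and post-processing (300c), while the main processing (300b) runs on
# the separate apparatus 13. The function boundaries stand in for the
# interfaces 250a, 350b, and 450a.

def preprocess_300a(raw_input: str) -> str:
    """Pre-processing on the terminal: normalize the captured input."""
    return raw_input.strip().lower()

def process_300b(prepared: str) -> dict:
    """Main processing on apparatus 13 (e.g. a server)."""
    return {"tokens": prepared.split(), "length": len(prepared)}

def postprocess_300c(result: dict) -> str:
    """Post-processing on the terminal: format the result for output."""
    return f"{result['length']} chars, {len(result['tokens'])} words"

def run_pipeline(raw_input: str) -> str:
    # interface 250a -> 300a -> interface 350b -> 300b -> 300c -> 450a
    return postprocess_300c(process_300b(preprocess_300a(raw_input)))

print(run_pipeline("  Sensing Information  "))  # -> "19 chars, 2 words"
```

In an actual deployment, process_300b would run on the information processing apparatus 13, with the interface 350b realized by a network protocol between the devices.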

Further, in the example shown in the drawing, both the processing unit 300a that performs pre-processing and the processing unit 300c that performs post-processing are shown; in practice, however, only one of them may be present. In other words, the information processing apparatus 11 may implement the processing unit 300a that performs pre-processing without implementing the processing unit 300c that performs post-processing, in which case the information supplied by the processing unit 300b is provided to the output unit 400 without further processing. Similarly, the information processing apparatus 11 may implement the processing unit 300c that performs post-processing without implementing the processing unit 300a that performs pre-processing.

Interfaces 350b exist between the processing unit 300a and the processing unit 300b, and between the processing unit 300b and the processing unit 300c. The interface 350b is a communication interface between devices. In contrast, if the information processing apparatus 11 implements the processing unit 300a, the interface 250a is an interface inside a single device. Similarly, if the information processing apparatus 11 implements the processing unit 300c, the interface 450a is an interface inside a single device.

Further, the above-described fourth example is the same as the above-described second example except that one or both of the processing unit 300a and the processing unit 300c are realized by a processor or a processing circuit included in the information processing apparatus 11. In other words, the information processing apparatus 11 may be a server that transmits information to or receives information from a terminal device or an external service. Further, the information processing apparatus 13 may be a server or a terminal device.

(fifth example)

Fig. 17 is a block diagram of a fifth example of a system configuration according to an embodiment of the present disclosure. Referring to fig. 17, the system 2 includes information processing apparatuses 11a, 11b, and 13. The input unit 200 is implemented in the information processing apparatus 11a. The output unit 400 is implemented in the information processing apparatus 11b. Further, the processing unit 300 is implemented separately across the information processing apparatuses 11a and 11b and the information processing apparatus 13. The information processing apparatuses 11a and 11b communicate with the information processing apparatus 13 via a network to realize the functions according to the embodiments of the present disclosure.

As shown in the drawing, in the fifth example, the processing unit 300 is implemented separately between the information processing apparatuses 11a and 11b and the information processing apparatus 13. More specifically, the processing unit 300 includes the processing unit 300a implemented in the information processing apparatus 11a, the processing unit 300b implemented in the information processing apparatus 13, and the processing unit 300c implemented in the information processing apparatus 11b. The manner in which the processing unit 300 is divided is the same as in the fourth example described above. However, in the fifth example, since the information processing apparatus 11a and the information processing apparatus 11b are separate devices, the interfaces 350b1 and 350b2 may be of different types.

Further, the fifth example is the same as the above-described third example except that one or both of the processing unit 300a and the processing unit 300c are realized by a processor or a processing circuit included in the information processing apparatus 11a or the information processing apparatus 11b. In other words, the information processing apparatuses 11a and 11b may be servers that transmit information to and receive information from a terminal device or an external service. Further, the information processing apparatus 13 may be a server or a terminal device. Further, in the following description, processing units included in the terminals or servers that implement the input unit and the output unit will be omitted from the description; however, in any of the examples, any or all of the devices may include a processing unit.

(example of a client-server system)

Fig. 18 is a schematic diagram of a client-server system as a more specific example of a system configuration according to an embodiment of the present disclosure. In the example shown in the drawing, the information processing apparatus 11 (or the information processing apparatus 11a or 11b) is a terminal device, and the information processing apparatus 13 is a server.

As shown, the terminal devices include, for example, a mobile device 11-1 such as a smartphone, a tablet, or a notebook personal computer (PC); a wearable device 11-2 such as an eyewear-type or contact-lens-type terminal, a watch-type terminal, a bracelet-type terminal, a ring-type terminal, a headset, a garment-mounted or garment-integrated terminal, a shoe-mounted or shoe-integrated terminal, or a necklace-type terminal; an in-vehicle device 11-3 such as a car navigation system or a rear-seat entertainment system; a television set 11-4; a digital camera 11-5; a consumer electronics (CE) device 11-6 such as a recorder, a game device, an air conditioner, a refrigerator, a washing machine, or a desktop PC; a robot device; a device with a sensor installed in a facility; and a digital signage 11-7 installed on the street. These information processing apparatuses 11 (terminal devices) communicate with the information processing apparatus 13 (server) via a network. The network between the terminal devices and the server corresponds to the interface 150b, 250b, or 350b in the above examples. Furthermore, these devices may operate individually while communicating with one another, or a system may be constructed in which all of the devices can operate in cooperation.

Further, the example shown in fig. 18 is presented to facilitate understanding of an implementation of the system 2 as a client-server system; as explained in each of the examples above, the system 2 is not limited to such a client-server system. In other words, for example, both of the information processing apparatuses 11 and 13 may be terminal devices, or both may be servers. If the information processing apparatus 11 includes the information processing apparatuses 11a and 11b, one of the information processing apparatuses 11a and 11b may be a terminal device and the other may be a server. Further, if the information processing apparatus 11 is a terminal device, the examples of the terminal device are not limited to the terminal devices 11-1 to 11-7 described above, and other types of terminal devices may also be included.

(example of a distributed system)

Another configuration example of the system 2 will be described with reference to fig. 19. Fig. 19 is a schematic diagram of a distributed system as another specific example of a system configuration according to an embodiment of the present disclosure. In the illustrated example, the information processing apparatuses 11 (or the information processing apparatuses 11a and 11b) are nodes, and these information processing apparatuses 11 are connected to one another via a network.

In the distributed system shown in fig. 19, the devices can cooperate with one another, manage data in a distributed manner, and distribute processing among themselves. This makes it possible to reduce the processing load, improve real-time performance (shorten response times and increase processing speed), and ensure security.

In addition, a distributed system can also perform machine learning in a distributed, cooperative manner, and can process large amounts of data.

Further, in the distributed system shown in fig. 19, the server used in a centralized system is not required, and the data can be mutually monitored to ensure its trustworthiness. Specifically, for example, transaction information (a ledger) can be shared by all participants (all of the information processing apparatuses 11), and its validity can be strictly maintained (a so-called blockchain). In a blockchain, it is practically difficult to tamper with the ledgers of all participants, so trustworthiness is ensured more reliably. Further, in a blockchain, if data included in a past block were tampered with, all of the hash values included in that block and the subsequent blocks would have to be recalculated; the resulting processing load makes such tampering practically impossible, so trustworthiness can be ensured still more reliably.

Further, in a blockchain, all participants share the transaction information (a distributed database), and writing to this distributed database is performed based on a specific protocol, so that fraud by any particular participant can be prevented and fairness can be maintained.
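The reason tampering with a past block forces recalculation of all subsequent hash values can be sketched with a minimal hash chain. This is an illustrative sketch only, not part of the disclosed embodiment (function names are hypothetical, and a real blockchain additionally involves consensus among the participants):

```python
import hashlib

def block_hash(prev_hash: str, data: str) -> str:
    """Each block's hash covers the previous block's hash and its own data."""
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

def build_chain(records):
    """Build a chain of (data, hash) pairs starting from a genesis hash."""
    chain, prev = [], "0" * 64
    for data in records:
        prev = block_hash(prev, data)
        chain.append((data, prev))
    return chain

def is_valid(chain) -> bool:
    """Recompute every hash; a tampered block breaks all links after it."""
    prev = "0" * 64
    for data, stored in chain:
        prev = block_hash(prev, data)
        if prev != stored:
            return False
    return True

chain = build_chain(["tx1", "tx2", "tx3"])
assert is_valid(chain)
chain[0] = ("tx1-tampered", chain[0][1])  # alter data in an earlier block
assert not is_valid(chain)  # every later hash would need recalculation
```

Because each stored hash depends on the previous one, a participant who alters an early block would have to recompute and overwrite the hashes of every subsequent block on every participant's copy of the ledger, which is what makes tampering impractical.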

(sixth example)

Fig. 20 is a block diagram showing a sixth example of the system configuration according to the embodiment of the present disclosure. Referring to fig. 20, the system 2 includes information processing apparatuses 11, 12, and 13. The input unit 200 and the output unit 400 are implemented in the information processing apparatus 11. In contrast, the processing unit 300 is implemented in a distributed manner in the information processing apparatus 12 and the information processing apparatus 13. The information processing apparatus 11 and the information processing apparatus 12, and the information processing apparatus 12 and the information processing apparatus 13 communicate with each other via a network to realize functions according to the embodiments of the present disclosure.

As described above, in the sixth example, the processing unit 300 is implemented in a distributed manner between the information processing apparatus 12 and the information processing apparatus 13. More specifically, the processing unit 300 includes the processing units 300a and 300c implemented in the information processing apparatus 12, and the processing unit 300b implemented in the information processing apparatus 13. The processing unit 300a performs processing based on information provided by the input unit 200 via the interface 250b, and then provides the result of the processing to the processing unit 300b via the interface 350b. In contrast, the processing unit 300c performs processing based on the information provided by the processing unit 300b via the interface 350b, and then provides the result of the processing to the output unit 400 via the interface 450b. Further, in the example shown in the drawing, both the processing unit 300a that performs pre-processing and the processing unit 300c that performs post-processing are shown; in practice, however, only one of them may be present.

In the sixth example, the information processing apparatus 12 is interposed between the information processing apparatus 11 and the information processing apparatus 13. More specifically, the information processing apparatus 12 may be, for example, a terminal device or a server interposed between the information processing apparatus 11 serving as a terminal device and the information processing apparatus 13 serving as a server. An example in which the information processing apparatus 12 is a terminal device is a case where the information processing apparatus 11 is a wearable device, the information processing apparatus 12 is a mobile device connected to the wearable device via Bluetooth (registered trademark) or the like, and the information processing apparatus 13 is a server connected to the mobile device via the Internet. An example in which the information processing apparatus 12 is a server is a case where the information processing apparatus 11 is one of various terminal devices, the information processing apparatus 12 is an intermediate server connected to the terminal devices via a network, and the information processing apparatus 13 is a server connected to the intermediate server via a network.

(seventh example)

Fig. 21 is a block diagram showing a seventh example of the system configuration according to the embodiment of the present disclosure. Referring to fig. 21, the system 2 includes information processing apparatuses 11a, 11b, 12, and 13. In the example shown in the figure, the input unit 200 is implemented in the information processing apparatus 11a. The output unit 400 is implemented in the information processing apparatus 11b. In contrast, the processing unit 300 is implemented in a distributed manner in the information processing apparatus 12 and the information processing apparatus 13. The information processing apparatuses 11a and 11b communicate with the information processing apparatus 12, and the information processing apparatus 12 communicates with the information processing apparatus 13, via a network to realize the functions according to the embodiments of the present disclosure.

The seventh example is an example of a combination of the third example and the sixth example described above. In other words, in the seventh example, the information processing apparatus 11a implementing the input unit 200 and the information processing apparatus 11b implementing the output unit 400 are separate devices. More specifically, the seventh example includes a case where the information processing apparatuses 11a and 11b are wearable devices worn by users at different parts of the body, the information processing apparatus 12 is a mobile device connected to these wearable devices via Bluetooth (registered trademark) or the like, and the information processing apparatus 13 is a server connected to the mobile device via the internet. Further, the seventh example also includes a case where the information processing apparatuses 11a and 11b are a plurality of terminal devices (which may be held or used by the same user or may also be held or used by different users), the information processing apparatus 12 is an intermediate server connected to each terminal device via a network, and the information processing apparatus 13 is a server connected to the intermediate server via a network.

(eighth example)

Fig. 22 is a block diagram showing an eighth example of the system configuration according to the embodiment of the present disclosure. Referring to fig. 22, the system 2 includes information processing apparatuses 11, 12a, 12b, and 13. The input unit 200 and the output unit 400 are implemented in the information processing apparatus 11. In contrast, the processing unit 300 is implemented in the information processing apparatuses 12a and 12b and the information processing apparatus 13 in a distributed manner. The information processing apparatus 11 and the information processing apparatuses 12a and 12b, and the information processing apparatuses 12a and 12b and the information processing apparatus 13 communicate with each other via a network to realize functions according to the embodiments of the present disclosure.

The eighth example has a configuration in which, in the sixth example described above, the processing unit 300a that performs pre-processing and the processing unit 300c that performs post-processing are implemented by the separate information processing apparatuses 12a and 12b, respectively. The information processing apparatus 11 and the information processing apparatus 13 are therefore the same as in the sixth example. The information processing apparatuses 12a and 12b may each be a server or a terminal device. For example, if the information processing apparatuses 12a and 12b are both servers, the processing unit 300 of the system 2 can be said to be implemented in a distributed manner across three servers (the information processing apparatuses 12a, 12b, and 13). The number of servers implementing the processing unit 300 in a distributed manner is not limited to three, and may be two, or four or more. Such variations can be understood from, for example, the eighth example described here or the ninth example described below; therefore, their description is omitted.

(ninth example)

Fig. 23 is a block diagram showing a ninth example of the system configuration according to the embodiment of the present disclosure. Referring to fig. 23, the system 2 includes information processing apparatuses 11a, 11b, 12a, 12b, and 13. In the ninth example, the input unit 200 is implemented in the information processing apparatus 11a. The output unit 400 is implemented in the information processing apparatus 11b. In contrast, the processing unit 300 is implemented in a distributed manner across the information processing apparatuses 12a and 12b and the information processing apparatus 13. The information processing apparatus 11a communicates with the information processing apparatus 12a, the information processing apparatus 11b communicates with the information processing apparatus 12b, and the information processing apparatuses 12a and 12b communicate with the information processing apparatus 13, via a network to realize the functions according to the embodiments of the present disclosure.

The ninth example is an example of a combination of the seventh example and the eighth example described above. In other words, in the ninth example, the information processing apparatus 11a implementing the input unit 200 and the information processing apparatus 11b implementing the output unit 400 are separate devices. The information processing apparatuses 11a and 11b communicate with different intermediate nodes (information processing apparatuses 12a and 12b), respectively. Therefore, in the ninth example, similar to the above-described eighth example, the processing unit 300 is implemented in three servers (information processing apparatuses 12a, 12b, and 13) in a distributed manner, and the functions according to the embodiments of the present disclosure can be implemented using the information processing apparatuses 11a and 11b (which may be terminal devices held or used by the same user or held or used by different users).

(example of a system including an intermediate server)

Fig. 24 is a schematic diagram of a system including an intermediate server, as one of the more specific examples of system configurations according to an embodiment of the present disclosure. In the example shown in the drawing, the information processing apparatus 11 (or the information processing apparatus 11a or 11b) is a terminal device, the information processing apparatus 12 is an intermediate server, and the information processing apparatus 13 is a server.

Similar to the example described above with reference to fig. 18, examples of the terminal devices may include the mobile device 11-1, the wearable device 11-2, the in-vehicle device 11-3, the television set 11-4, the digital camera 11-5, the CE device 11-6, the robot device, and the digital signage 11-7. These information processing apparatuses 11 (terminal devices) communicate with the information processing apparatus 12 (intermediate server) via a network. The network between the terminal devices and the intermediate server corresponds to the interfaces 250b and 450b in the above examples. Further, the information processing apparatus 12 (intermediate server) communicates with the information processing apparatus 13 (server) via a network. The network between the intermediate server and the server corresponds to the interface 350b in the above examples.

Further, the example shown in fig. 24 is presented to facilitate understanding of an implementation of the system 2 in a system including an intermediate server; the system 2 is not limited to this type of system, for the reasons explained in each of the examples described above.

(example of a system including a terminal device serving as a host)

Fig. 25 is a schematic diagram of a system including a terminal device serving as a host, as one of the more specific examples of system configurations according to an embodiment of the present disclosure. In the example shown in the drawing, the information processing apparatus 11 (or the information processing apparatus 11a or 11b) is a terminal device, the information processing apparatus 12 is a terminal device serving as a host, and the information processing apparatus 13 is a server.

In the example shown in the drawing, the terminal devices may include, for example, the wearable device 11-2, the in-vehicle device 11-3, the digital camera 11-5, the robot device, a device with a sensor installed in a facility, and the CE device 11-6. These information processing apparatuses 11 (terminal devices) communicate with the information processing apparatus 12 via a network such as Bluetooth (registered trademark) or Wi-Fi. In the drawing, the mobile device 12-1 is shown as an example of the terminal device serving as a host. The network between the terminal devices and the mobile device corresponds to the interface 250b or 450b in the above examples. The information processing apparatus 12 (mobile device) communicates with the information processing apparatus 13 (server) via a network such as the Internet. The network between the mobile device and the server corresponds to the interface 350b in the above examples.

Further, the example shown in fig. 25 is presented to facilitate understanding of an implementation of the system 2 in a system including a terminal device serving as a host; the system 2 is not limited to this type of system, for the reasons explained in each of the examples described above. Further, the terminal device serving as the host is not limited to the mobile device 12-1; various terminal devices having appropriate communication and processing functions may serve as the host. Further, the wearable device 11-2, the in-vehicle device 11-3, the digital camera 11-5, and the CE device 11-6 shown in the drawing as examples of the terminal devices do not exclude other terminal devices; they are merely examples of typical terminal devices that may serve as the information processing apparatus 11 in the case where the information processing apparatus 12 is the mobile device 12-1.

(example of a system including an edge server)

Fig. 26 is a schematic diagram of a system including an edge server, as one of the more specific examples of system configurations according to an embodiment of the present disclosure. In the example shown in the drawing, the information processing apparatus 11 (or the information processing apparatus 11a or 11b) is a terminal device, the information processing apparatus 12 is an edge server, and the information processing apparatus 13 is a server.

Similar to the example described above with reference to fig. 18, examples of the terminal devices may include the mobile device 11-1, the wearable device 11-2, the in-vehicle device 11-3, the television set 11-4, the digital camera 11-5, the CE device 11-6, the robot device, and the digital signage 11-7. These information processing apparatuses 11 (terminal devices) communicate with the information processing apparatus 12 (edge server 12-2) via a network. The network between the terminal devices and the edge server corresponds to the interface 250b or 450b in the above examples. The information processing apparatus 12 (edge server) communicates with the information processing apparatus 13 (server) via a network such as the Internet. The network between the edge server and the server corresponds to the interface 350b in the above examples.

In the example shown in fig. 26, the edge servers 12-2 (e.g., the edge servers 12-2a to 12-2d) are distributed at positions closer to the terminal devices (the information processing apparatuses 11) than the server 13 is, making it possible to reduce communication delay, increase processing speed, and improve real-time performance.

Further, the example shown in fig. 26 is presented to facilitate understanding of an implementation of the system 2 in a system including an edge server; the system 2 is not limited to this type of system, for the reasons explained in each of the examples described above.

(example of a system including fog computing)

Fig. 27 is a schematic diagram of a system including fog computing, as one of the more specific examples of system configurations according to an embodiment of the present disclosure. In the example shown in the drawing, the information processing apparatus 11 (or the information processing apparatus 11a or 11b) is a terminal device, the information processing apparatus 12 is a fog computing node, and the information processing apparatus 13 is a server.

Similar to the example described above with reference to fig. 18, examples of the terminal devices may include the mobile device 11-1, the wearable device 11-2, the in-vehicle device 11-3, the television set 11-4, the digital camera 11-5, the CE device 11-6, the robot device, and the digital signage 11-7. These information processing apparatuses 11 (terminal devices) communicate with the information processing apparatus 12 (fog computing 12-3) via a network. The network between the terminal devices and the fog computing 12-3 corresponds to the interface 250b or 450b in the above examples. The information processing apparatus 12 (fog computing 12-3) communicates with the information processing apparatus 13 (server) via a network such as the Internet. The network between the fog computing 12-3 and the server corresponds to the interface 350b in the above examples.

In the distributed processing environment between the cloud and the devices, the fog computing 12-3 is located in a region closer to the devices (the information processing apparatuses 11) than the cloud (the server 13). Specifically, the fog computing 12-3 is configured as a system that includes edge computing and is built on a mechanism for optimally arranging, in a distributed manner, computing resources classified by domain or region.

In the example shown in fig. 27, for example, the following fogs are conceivable as the fog computing 12-3: a mobile fog 12-3a that performs data management and processing for the mobile device 11-1; a wearable fog 12-3b that performs data management and processing for the wearable device 11-2; an in-vehicle fog 12-3c that performs data management and processing for the in-vehicle device 11-3; a television terminal fog 12-3d that performs data management and processing for the television set 11-4; a camera terminal fog 12-3e that performs data management and processing for the digital camera 11-5; a CE fog 12-3f that performs data management and processing for the CE device 11-6; and a signage fog 12-3g that performs data management and processing for the digital signage 11-7. Data may also circulate between these fogs.

In fog computing, computing resources can be allocated at positions close to the devices, and various kinds of processing such as management, accumulation, and conversion of data can be performed there, making it possible to reduce communication delay, increase processing speed, and improve real-time performance.
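The idea of managing and aggregating data in a fog node close to each class of device, so that only compact results travel on to the cloud, can be sketched as follows. This is a purely illustrative sketch, not part of the disclosed embodiment; the class name, attribute names, and sample readings are all hypothetical:

```python
# Hypothetical sketch of fog computing 12-3: each device class is served
# by a fog node close to it (mobile fog, wearable fog, ...), which manages
# and accumulates raw data locally, so that only a compact summary needs
# to be forwarded to the cloud server (over the interface 350b).

class FogNode:
    def __init__(self, domain: str):
        self.domain = domain
        self.buffer = []

    def ingest(self, reading: float):
        """Accumulate raw readings near the device instead of in the cloud."""
        self.buffer.append(reading)

    def summarize(self) -> dict:
        """Only this compact summary is forwarded to the cloud."""
        return {"domain": self.domain,
                "count": len(self.buffer),
                "mean": sum(self.buffer) / len(self.buffer)}

# Route each device class to its own fog node.
fogs = {d: FogNode(d) for d in ("mobile", "wearable", "in-vehicle")}
fogs["wearable"].ingest(72.0)   # e.g. a sensor sample from a wearable
fogs["wearable"].ingest(76.0)
print(fogs["wearable"].summarize())
```

Keeping the raw buffer in the fog node and sending only the summary upstream is what reduces the communication delay and cloud-side load described above.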

Further, the example shown in fig. 27 is presented to facilitate understanding of an implementation of the system 2 in a system including fog computing; the system 2 is not limited to this type of system, for the reasons explained in each of the examples described above.

(tenth example)

Fig. 28 is a block diagram showing a tenth example of the system configuration according to the embodiment of the present disclosure. Referring to fig. 28, the system 2 includes information processing apparatuses 11a, 12a, and 13. In the tenth example, the input unit 200 is implemented in the information processing apparatus 11a. Further, the processing unit 300 is implemented in a distributed manner in the information processing apparatus 12a and the information processing apparatus 13. The output unit 400 is implemented in the information processing apparatus 13. The information processing apparatus 11a communicates with the information processing apparatus 12a, and the information processing apparatus 12a communicates with the information processing apparatus 13, via a network to realize the functions according to the embodiments of the present disclosure.

The tenth example is an example in which, in the ninth example described above, the information processing apparatuses 11b and 12b are incorporated into the information processing apparatus 13. In other words, in the tenth example, the information processing apparatus 11a implementing the input unit 200 and the information processing apparatus 12a implementing the processing unit 300a are separate devices, whereas the processing unit 300b and the output unit 400 are both implemented in the same information processing apparatus 13.

The tenth example realizes a configuration in which, for example, information acquired by the input unit 200 of the information processing apparatus 11a serving as a terminal device is processed by the processing unit 300a in the information processing apparatus 12a serving as an intermediate terminal device or server, supplied to the information processing apparatus 13 serving as a server or terminal, processed there by the processing unit 300b, and then output by the output unit 400. The intermediate processing performed by the information processing apparatus 12a may also be omitted. Such a configuration may be used, for example, in a service in which the information processing apparatus 13 executes predetermined processing based on information supplied from the information processing apparatus 11a, and then accumulates or outputs the result of the processing. The accumulated result of the processing may be used, for example, by another service.

(eleventh example)

Fig. 29 is a block diagram showing an eleventh example of the system configuration according to the embodiment of the present disclosure. Referring to fig. 29, the system 2 includes information processing apparatuses 11b, 12b, and 13. In the eleventh example, the input unit 200 is implemented in the information processing apparatus 13. Further, the processing unit 300 is implemented in a distributed manner in the information processing apparatus 13 and the information processing apparatus 12b. The output unit 400 is implemented in the information processing apparatus 11b. The information processing apparatus 13 communicates with the information processing apparatus 12b, and the information processing apparatus 12b communicates with the information processing apparatus 11b, via a network to realize the functions according to the embodiments of the present disclosure.

The eleventh example is an example in which, in the ninth example described above, the information processing apparatuses 11a and 12a are incorporated into the information processing apparatus 13. In other words, in the eleventh example, the information processing apparatus 11b implementing the output unit 400 and the information processing apparatus 12b implementing the processing unit 300c are separate devices, whereas the input unit 200 and the processing unit 300b are realized by the same information processing apparatus 13.

The eleventh example realizes a configuration in which, for example, information acquired by the input unit 200 included in the information processing apparatus 13 serving as a server or a terminal device is supplied to the information processing apparatus 12b serving as an intermediate terminal device through processing performed by the processing unit 300b, and is then output by the output unit 400 in the information processing apparatus 11b serving as a terminal device through processing performed by the processing unit 300c. The intermediate processing performed by the information processing apparatus 12b may also be omitted. Such a configuration may be used, for example, in a service in which predetermined processing is performed in the information processing apparatus 13 based on information acquired there, and the result of the processing performed in the service is then provided to the information processing apparatus 11b. The information to be acquired may be provided by different services, for example.

[4-6. Other]

Further, the components of each unit illustrated in the drawings are conceptual representations of their functions and are not necessarily physically configured as illustrated. In other words, the specific form of distribution or integration of the devices is not limited to that shown in the drawings. Specifically, all or part of the apparatus may be configured by functionally or physically separating or integrating any unit according to various loads or usage conditions. For example, the detection unit 141 and the notification unit 142 may be integrated.

Further, each of the above-described embodiments and modifications may be used in any appropriate combination as long as the processes do not conflict with each other.

Further, the effects described in the present specification are merely exemplary and not restrictive, and other effects are also possible.

(5. Effects of the information processing apparatus according to the present disclosure)

As described above, the information processing apparatus (the information processing apparatus 100 in the embodiment, etc.) according to the present disclosure includes the control unit (the control unit 140 in the embodiment). The control unit detects information indicating the operation state of the apparatus (in the present embodiment, the home appliance 10 or the like) as the sensed information. Further, when the sensed information is detected, the control unit refers to the storage unit (the storage unit 130 in the embodiment, or the like) in which the response content associated with the sensed information is stored, and then determines whether or not to notify the user of the operation state of the apparatus.

In this way, the information processing apparatus determines whether to notify the user of the operation state of the device associated with the detected sensed information, so that, even when the user uses a plurality of devices, the information processing apparatus can notify the user of only necessary information without causing annoyance. Therefore, the information processing apparatus can avoid impairing the user's convenience and enables the user to operate a plurality of home appliances smoothly.

Further, the control unit detects, as the sensed information, a notification sound emitted by the device to notify the user of the operation state; when the notification sound is detected, the control unit refers to the storage unit in which the response content associated with the notification sound is stored, and then determines whether to notify the user of the operation state of the device. Therefore, the information processing apparatus can determine whether to notify the user of the operation state related to the notification sound emitted by the device.
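
As a minimal sketch in Python of the flow described above, the storage unit can be modeled as a table that maps a key identifying the sensed information (here an invented sound fingerprint such as "beep_A") to its response content. All names and table entries below are illustrative assumptions, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ResponseContent:
    device: str   # e.g. "washing machine"
    state: str    # operation state, e.g. "washing finished"
    notify: bool  # setting: should the user be notified of this state?


class ControlUnit:
    def __init__(self, response_table: dict):
        # storage unit: sensed-information key -> response content
        self.response_table = response_table

    def on_sensed(self, key: str) -> Optional[str]:
        """Look up the detected sensed information and decide whether to notify."""
        content = self.response_table.get(key)
        if content is None or not content.notify:
            return None  # unknown event, or notification suppressed by setting
        return f"{content.device}: {content.state}"


table = {
    "beep_A": ResponseContent("washing machine", "washing finished", True),
    "beep_B": ResponseContent("microwave oven", "heating finished", False),
}
unit = ControlUnit(table)
print(unit.on_sensed("beep_A"))  # -> washing machine: washing finished
print(unit.on_sensed("beep_B"))  # -> None (notification suppressed)
```

Keying the table on the detected sound rather than on the device keeps the decision a single lookup, which matches the two-step structure (detect, then refer to the storage unit) described above.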

Further, the control unit updates the response content associated with the sensing information stored in the storage unit based on a reaction of the user received after notifying the user of the operation state of the apparatus. Therefore, the information processing apparatus can update the content to be notified to the user according to the request of the user.

Further, the control unit updates a setting indicating whether to notify the user of the operation state of the device associated with the detected sensing information based on the received reaction of the user. Therefore, the information processing apparatus can appropriately update the judgment criterion relating to the notification in accordance with the request of the user.

Further, the control unit recognizes a voice received from the user and, based on the user's reaction obtained from the result of the voice recognition, updates the setting indicating whether to notify the user of the operation state of the device associated with the detected sensed information. Therefore, the information processing apparatus can update the response content based on the user's voice, without requiring an input operation such as a key operation from the user.
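
The update process described above can be sketched as follows. The keyword rules stand in for an actual intent interpretation of the speech-recognition result, and all names, keys, and phrases are invented for illustration.

```python
# Storage unit: sensed-information key -> response content (invented entries).
response_table = {
    "beep_A": {"device": "washing machine", "state": "finished", "notify": True},
}


def apply_reaction(table: dict, key: str, recognized_text: str) -> None:
    """Update the notify setting for one entry based on the user's reaction."""
    entry = table.get(key)
    if entry is None:
        return
    text = recognized_text.lower()
    if "stop" in text or "don't" in text:
        entry["notify"] = False  # suppress future notifications for this event
    elif "tell me" in text or "notify" in text:
        entry["notify"] = True   # re-enable notifications for this event


apply_reaction(response_table, "beep_A", "Please stop telling me about that")
print(response_table["beep_A"]["notify"])  # -> False
```

Because the setting lives in the same table the detection step consults, the very next occurrence of the same sensed information is judged against the updated preference.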

Further, when notifying the user of the operation state of the device, the control unit notifies the user of information related to the installation position of the device together with the operation state. Therefore, the information processing apparatus can notify the user of detailed information, such as the installation position of the device that emitted the notification sound.

Further, the control unit identifies a device associated with the sensed information based on image recognition, and notifies the user of the operation state of the device together with information about the identified device. Therefore, the information processing apparatus can notify the user of detailed information, such as which device emitted the notification sound.

Further, the control unit notifies the user of at least one of the type of the device, the name of the device, and the installation position of the device, along with the operation state of the device. Therefore, the information processing apparatus can notify the user of detailed information such as information indicating what kind of device has emitted the notification sound.

Further, the control unit detects a position where the user is located, and determines whether to notify the user of the operation state of the device based on a positional relationship between the user's position and the device associated with the sensed information. Therefore, the information processing apparatus can notify only a user who is nearby, and can thus issue notifications without causing annoyance.

Further, the control unit detects the position where the user is located, and determines whether to notify the user of the operation state of the device based on the distance between the user's position and the installation position of the device associated with the sensed information. Therefore, the information processing apparatus can issue an appropriate notification according to the situation, for example, by giving no notification to a user who is already in the vicinity of the home appliance.
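
A minimal sketch of the distance-based judgment, assuming 2-D coordinates and an invented 3 m "hearing radius". The disclosure leaves the exact policy open, so the comparison can be reversed to notify only nearby users instead.

```python
import math


def should_notify(user_pos, device_pos, hear_radius_m: float = 3.0) -> bool:
    """Notify only a user too far away to have heard the appliance directly.

    The 3 m radius and the coordinate convention are illustrative
    assumptions, not values from the disclosure.
    """
    return math.dist(user_pos, device_pos) > hear_radius_m


print(should_notify((0.0, 0.0), (1.0, 1.0)))  # about 1.41 m away -> False
print(should_notify((0.0, 0.0), (6.0, 8.0)))  # 10 m away -> True
```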

Further, the control unit determines whether to notify the user of the operation state of the device in accordance with the orientation of the user's face or body at the time the device emitted the information indicating its operation state, or at the time that information was detected as the sensed information. Therefore, the information processing apparatus can decide whether to issue the notification according to the actual situation when the sound was emitted, such as whether the user appears to have noticed the sound emitted by the home appliance.
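
The orientation-based judgment can be sketched as a field-of-view test. The coordinate convention and the 90-degree field of view below are invented assumptions.

```python
import math


def facing_device(user_pos, facing_deg, device_pos, fov_deg: float = 90.0) -> bool:
    """True if the device lies within the user's assumed field of view.

    facing_deg is the direction the user's face or body points, measured
    counter-clockwise from the +x axis; fov_deg is a hypothetical field
    of view, not a value from the disclosure.
    """
    bearing = math.degrees(math.atan2(device_pos[1] - user_pos[1],
                                      device_pos[0] - user_pos[0])) % 360.0
    # Smallest signed angle between bearing and facing direction.
    diff = abs((bearing - facing_deg + 180.0) % 360.0 - 180.0)
    return diff <= fov_deg / 2.0


# A user facing along +x sees a device straight ahead, not one to the side:
print(facing_device((0, 0), 0.0, (5, 0)))  # -> True
print(facing_device((0, 0), 0.0, (0, 5)))  # -> False (90 degrees off-axis)
```

If the user was facing the appliance when it beeped, the notification could be skipped as redundant; if not, it could be issued.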

Further, the control unit detects an attribute of a user located in the vicinity of the information processing apparatus, and determines whether to notify the user of the operation state of the device according to the detected attribute. Therefore, the information processing apparatus can issue notifications according to a detailed request of the user, such as a request to notify only adult users and not child users.

Further, when notifying the user of the operation state of the device, the control unit notifies the user of information with which the sensed information was previously labeled, together with the operation state of the device. Therefore, the information processing apparatus can notify the user of the operation state of the device in more detail.

Further, when the control unit detects, as the sensed information, an abnormal sound indicating that the operation state of the device is abnormal, the control unit notifies the user of information indicating that the abnormal sound has been detected, together with the operation state of the device. Therefore, the information processing apparatus can reliably notify the user that an abnormality has occurred in the device.

Further, the control unit detects, as the sensed information, at least one of light, temperature, humidity, odor, vibration, and carbon dioxide concentration observed around the home appliance. Therefore, even when the sensed information is not a notification sound, the information processing apparatus can reliably notify the user of information related to the operation of the device.
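
For sensed information other than sound, a simple threshold check can turn raw readings into sensing keys for the same decision logic. All modality names and thresholds below are invented for illustration.

```python
# Hypothetical per-modality thresholds; none of these values come from
# the disclosure.
THRESHOLDS = {
    "temperature_c": 60.0,  # e.g. an overheating appliance
    "humidity_pct": 90.0,   # e.g. a leaking washing machine
    "co2_ppm": 1500.0,      # e.g. poor ventilation near a stove
    "vibration_g": 2.0,     # e.g. an unbalanced washing drum
}


def detect_events(readings: dict) -> list:
    """Return the sensing keys whose readings exceed their thresholds."""
    return [key for key, value in readings.items()
            if key in THRESHOLDS and value > THRESHOLDS[key]]


print(detect_events({"temperature_c": 72.0, "co2_ppm": 800.0}))
# -> ['temperature_c']
```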

(6. Hardware configuration)

The information processing apparatus, such as the information processing apparatus 100 according to each of the embodiments described above, is realized by, for example, a computer 1000 having the configuration shown in fig. 30. The following description takes the information processing apparatus 100 according to the first embodiment as an example. Fig. 30 is a hardware configuration diagram showing an example of the computer 1000 that realizes the functions of the information processing apparatus 100. The computer 1000 includes a CPU 1100, a RAM 1200, a read-only memory (ROM) 1300, a hard disk drive (HDD) 1400, a communication interface 1500, and an input/output interface 1600. The units in the computer 1000 are connected to one another by a bus 1050.

The CPU 1100 operates based on programs stored in the ROM 1300 or the HDD 1400, and controls each unit. For example, the CPU 1100 loads a program stored in the ROM 1300 or the HDD 1400 into the RAM 1200, and then executes processing corresponding to the program.

The ROM 1300 stores a boot program, such as a basic input/output system (BIOS), executed by the CPU 1100 when the computer 1000 is started, programs dependent on the hardware of the computer 1000, and the like.

The HDD 1400 is a computer-readable recording medium that non-transitorily records programs executed by the CPU 1100, data used by those programs, and the like. Specifically, the HDD 1400 is a medium that records an information processing program according to the present disclosure as an example of program data 1450.

The communication interface 1500 is an interface for connecting the computer 1000 to an external network 1550 (e.g., the Internet). For example, the CPU 1100 receives data from other apparatuses and transmits data generated by the CPU 1100 to other apparatuses via the communication interface 1500.

The input/output interface 1600 is an interface for connecting an input/output device 1650 to the computer 1000. For example, the CPU 1100 receives data from an input device, such as a keyboard, a mouse, or a remote controller, via the input/output interface 1600. Further, the CPU 1100 transmits data to an output device, such as a display, a speaker, or a printer, via the input/output interface 1600. The input/output interface 1600 may also function as a media interface that reads a program or the like recorded on a predetermined recording medium (media). Examples of the media include optical recording media such as digital versatile discs (DVDs) and phase-change rewritable disks (PDs), magneto-optical recording media such as magneto-optical (MO) disks, tape media, magnetic recording media, semiconductor memories, and the like.

For example, when the computer 1000 functions as the information processing apparatus 100 according to the first embodiment, the CPU 1100 in the computer 1000 realizes the functions of the control unit 140 and the like by executing the information processing program loaded onto the RAM 1200. Further, the HDD 1400 stores the information processing program according to the present disclosure and the data included in the storage unit 130. The CPU 1100 reads the program data 1450 from the HDD 1400 and executes it; as another example, however, the CPU 1100 may acquire these programs from other devices via the external network 1550.

Further, the present technology can also be configured as follows.

(1) An information processing apparatus comprising:

a control unit that performs:

a process of detecting information indicating an operation state of a device as sensed information; and

a process of determining, when the sensed information is detected, whether to notify a user of the operation state of the device by referring to a storage unit in which response content associated with the sensed information is stored.

(2) The information processing apparatus according to (1), wherein,

the control unit:

detects, as the sensed information, a notification sound emitted by the device to notify the user of the operation state; and

when the notification sound is detected, determines whether to notify the user of the operation state of the device by referring to the storage unit in which the response content associated with the notification sound is stored.

(3) The information processing apparatus according to (1) or (2), wherein the control unit updates the response content associated with the sensing information stored in the storage unit based on a reaction received from the user after notifying the user of the operation state of the device.

(4) The information processing apparatus according to (3), wherein the control unit updates a setting indicating whether to notify the user of an operation state of the device associated with the detected sensing information, based on a reaction received from the user.

(5) The information processing apparatus according to (3) or (4), wherein,

the control unit:

recognizes a voice received from the user; and

updates, based on the user's reaction obtained from a result of the voice recognition, a setting indicating whether to notify the user of the operation state of the device associated with the detected sensed information.

(6) The information processing apparatus according to any one of (1) to (5), wherein when the control unit notifies the user of the operation state of the device, the control unit notifies the user of information relating to the installation position of the device together with the operation state.

(7) The information processing apparatus according to (1) or (6), wherein,

the control unit:

identifies a device associated with the sensed information based on image recognition; and

notifies the user of the operation state of the identified device together with information about the identified device.

(8) The information processing apparatus according to (7), wherein the control unit notifies the user of at least one of a type of the device, a name of the device, and an installation position of the device together with an operation state of the device.

(9) The information processing apparatus according to any one of (1) to (8), wherein,

the control unit:

detects a position where the user is located; and

determines whether to notify the user of the operation state of the device based on a positional relationship between the position where the user is located and the device associated with the sensed information.

(10) The information processing apparatus according to (9), wherein,

the control unit:

detects a position where the user is located; and

determines whether to notify the user of the operation state of the device based on a distance between the position where the user is located and a position where the device associated with the sensed information is installed.

(11) The information processing apparatus according to (9) or (10), wherein the control unit determines whether to notify the user of the operation state of the device in accordance with an orientation of the face or body of the user when the device issues information indicating the operation state or when information indicating the operation state of the device is detected as the sensed information.

(12) The information processing apparatus according to any one of (9) to (11), wherein,

the control unit:

detects an attribute of a user located in the vicinity of the information processing apparatus; and

determines whether to notify the user of the operation state of the device according to the detected attribute of the user.

(13) The information processing apparatus according to any one of (1) to (12), wherein, when notifying the user of the operation state of the device, the control unit notifies the user of information with which the sensed information was previously labeled, together with the operation state of the device.

(14) The information processing apparatus according to any one of (1) to (13), wherein, when the control unit detects, as the sensed information, an abnormal sound indicating that the operation state of the device is abnormal, the control unit notifies the user of information indicating that the abnormal sound has been detected, together with the operation state of the device.

(15) The information processing apparatus according to any one of (1) to (14), wherein the control unit detects, as the sensed information, at least one of light, temperature, humidity, odor, vibration, and carbon dioxide concentration observed around the device.

(16) The information processing apparatus according to any one of (1) to (15), wherein the control unit controls display of the notification on the display unit.

(17) The information processing apparatus according to any one of (1) to (16), wherein,

the control unit:

acquires a usage state of the information processing apparatus; and

controls output of the notification based on the usage state.

(18) The information processing apparatus according to (17), wherein the usage state includes information relating to content being output by the information processing apparatus.

(19) An information processing method performed by an information processing apparatus, the information processing method comprising:

detecting information indicating an operation state of a device as sensed information; and

determining, when the sensed information is detected, whether to notify a user of the operation state of the device by referring to a storage unit in which response content associated with the sensed information is stored.

(20) An information processing program that causes an information processing apparatus to execute a process comprising the steps of:

detecting information indicating an operation state of a device as sensed information; and

determining, when the sensed information is detected, whether to notify a user of the operation state of the device by referring to a storage unit in which response content associated with the sensed information is stored.

Reference numerals:

1 information processing system

10 home appliance

100 information processing apparatus

120 sensor

120A voice input sensor

120B image input sensor

121 input unit

122 communication unit

130 storage unit

131 response content table

132 device information table

133 temporary storage area

134 user information table

140 control unit

141 detection unit

142 notification unit

143 User Interface (UI) element.
