Mobile assistance device and mobile assistance method

Document No.: 1328049; Publication date: 2020-07-14

Note: This technology, "Mobile assistance device and mobile assistance method" (移动式辅助装置及移动式辅助方法), was created by 野本弘之, 平尾骏, 井泽秀人, 嘉和知玲子, and 本泽邦朗 on 2018-04-11. Abstract: Embodiments provide a mobile assistance device and method capable of comprehensively evaluating data from various sensors, determining what should be controlled next, and improving the monitoring effect. According to one embodiment, the device includes: a mobile device; a sensor mounted on the mobile device; and a control unit that determines the brightness of the surrounding space based on the output of the sensor and outputs a control signal for controlling the operation of another appliance based on the determination result.

1. A mobile assistance device comprising:

a mobile device;

a sensor mounted on the mobile device; and

a control unit that determines the brightness of the surrounding space from the output of the sensor and outputs a control signal for controlling the operation of another appliance based on the determination result.

2. The mobile assistance device of claim 1, wherein,

the mobile assistance device further comprises a mapping unit that stores a plurality of areas,

the control unit determines a destination area from among the plurality of areas and measures the illuminance in the determined destination area.

3. The mobile assistance device of claim 1, wherein,

the mobile assistance device further comprises a mapping unit that stores a plurality of areas,

the control unit determines a destination area from among the plurality of areas, measures the illuminance in the determined destination area, and adjusts the illuminance of the lighting fixture in the determined destination area when the measured illuminance does not satisfy a predetermined value.

4. The mobile assistance device of claim 3, wherein,

the mobile assistance device further includes a camera that captures an image when the illuminance of the lighting fixture in the determined destination area is adjusted to be higher.

5. The mobile assistance device of claim 3, wherein,

the control unit includes a human detection sensor, and stops the adjustment of the lighting fixture when the human detection sensor detects a human.

6. The mobile assistance device of claim 3, wherein,

the control unit includes a human detection sensor, and when the human detection sensor detects a human, the control unit stops the adjustment of the illuminance of the lighting fixture and activates the microphone and/or the speaker.

7. The mobile assistance device of claim 3, wherein,

the mobile assistance device further includes a camera and the mapping unit that stores the plurality of areas,

the control unit temporarily stores the measured value of the illuminance when the measured value does not satisfy a predetermined value,

performs imaging with the camera when the illuminance of the lighting fixture in the determined destination area is adjusted to be higher, and

after the imaging by the camera ends, adjusts the illuminance of the lighting fixture back to the temporarily stored measured value.

8. The mobile assistance device of claim 7, wherein,

the control unit further stores the color tone of the illumination before the illuminance of the lighting fixture is adjusted, and returns the color tone of the lighting fixture to its pre-adjustment state after the imaging by the camera ends.

9. A mobile assistance device, wherein,

the mobile assistance device comprises a mobile device equipped with a camera, a microphone, a communication device, and a control device,

the control device comprises:

a means for acquiring sound data of a sound detected by the microphone;

a means for determining the direction of the sound and the area where the sound occurred; and

a means for controlling the mobile device to move the camera to the area where the sound occurred and causing the camera to capture an image in the direction of the sound.

10. The mobile assistance device of claim 9, wherein,

the mobile assistance device includes a means for limiting the determination, the control of the mobile device, and the imaging according to the type of the sound.

11. The mobile assistance device of claim 9, wherein,

the mobile assistance device includes a learning means relating to the sound, and a means for limiting the determination, the control of the mobile device, and the imaging when the sound has been stored in the learning means.

12. The mobile assistance device of claim 9, wherein,

the mobile assistance device includes a means for controlling the mobile device and turning on the illumination along the movement path when the brightness required for imaging is insufficient during movement.

13. The mobile assistance device of claim 9, wherein,

the mobile assistance device includes a means for controlling the mobile device to image the movement path during movement, determining whether an obstacle is present from the difference between past imaging data and the imaging data captured during the current movement, and controlling the mobile device so as to avoid the obstacle.

14. The mobile assistance device of claim 13, wherein,

the mobile assistance device measures the illuminance of the illumination and can issue a warning when the brightness of the illumination is insufficient.

15. The mobile assistance device of claim 9, wherein,

the mobile assistance device comprises: a means for controlling a first controlled device; a means for acquiring a detection output of a first sensor that reacts to a phenomenon caused by the first controlled device; and a means for checking the logical consistency between the control state of the first controlled device and the detection output of the first sensor.

16. The mobile assistance device of claim 15, wherein,

the first controlled device is a lighting fixture, and the first sensor is an illuminance sensor.

17. The mobile assistance device of claim 15, wherein,

there is one first controlled device and there are a plurality of first sensors.

18. The mobile assistance device of claim 15, wherein,

there is one first controlled device and there are a plurality of first sensors, the plurality of first sensors are turned on at different timings, and the mobile assistance device includes a means for determining which of the plurality of first sensors is malfunctioning.

19. A mobile assistance method for a mobile assistance device comprising a mobile device, a sensor mounted on the mobile device, and a control unit, wherein,

at the destination of movement by the mobile device, the brightness of the surrounding space is determined from the output of the sensor, and a control signal for controlling the operation of another appliance is output based on the determination result.

Technical Field

The present embodiment relates to a mobile assistance device and a mobile assistance method.

Background

Recently, various monitoring systems have been developed. A monitoring system acquires and analyzes image data and/or audio data using a camera, a microphone, and the like. Based on the analysis results, various determinations are made, such as whether a suspicious person has intruded.

Prior art documents

Patent document

Patent document 1: Japanese Patent No. 4819221

Patent document 2: Japanese Patent No. 4245367

Patent document 3: Japanese Patent Laid-Open Publication No. 2004-295408

Patent document 4: Japanese Patent Laid-Open Publication No. 2007-133625

Disclosure of Invention

Drawings

Fig. 1 is a diagram showing an outline of the mobile assistance device.

Fig. 2 is a diagram showing a relationship between the internal configuration of the HGW600 and the network.

Fig. 3A is a diagram showing a case where the mobile HGW600 enters a room 800a in a bright state.

Fig. 3B is a diagram showing a case where the mobile HGW600 enters a dark room 800b.

Fig. 4 is a flowchart showing an example of the control operation of the HGW600.

Fig. 5 is a diagram showing an example of the relationship between the HGW600 and the smartphone.

Fig. 6 is an explanatory diagram showing the schematic relationship between the home sensor group, the controlled devices, and the HGW600.

Fig. 7 is a flowchart showing an example of system operation when the all-standby mode is set in the system to which the present embodiment is applied.

Fig. 8 is a flowchart showing another example of the system operation when the all-standby mode is set in the system to which the present embodiment is applied.

Fig. 9 is a flowchart showing an example of system operation when monitoring the operation of the air conditioner in the system to which the present embodiment is applied.

Fig. 10 is a reference diagram showing operations performed when the sensors and controlled devices are inspected in the system to which the present embodiment is applied.

Fig. 11 is a diagram showing an example of the overall configuration of another network system to which the present embodiment is applied.

Fig. 12 is an explanatory diagram showing an example of a flow of recording event-related data and monitoring data as a timeline in the embodiment shown in fig. 11.

Fig. 13 is a block diagram showing a configuration with main parts of the embodiment of fig. 11 taken out.

Fig. 14 is a diagram showing an example of a menu of a smartphone as a user interface that can access event-related data and/or monitoring data.

Fig. 15A is an explanatory diagram showing a procedure for accessing monitoring data by the smartphone.

Fig. 15B is an explanatory diagram showing another procedure for accessing monitoring data by the smartphone.

Fig. 16A is a diagram showing an example of an operation screen displayed on the smartphone.

Fig. 16B is a diagram showing another example of an operation screen displayed on the smartphone.

Fig. 17 is a diagram showing an example of an image when monitoring data (video data) relating to a certain event is played back.

Fig. 18A is a diagram illustrating an example of a relationship between a smartphone and event-related data displayed on the smartphone and an operation method.

Fig. 18B is a diagram illustrating another relationship between a smartphone and event-related data displayed on the smartphone and another example of an operation method.

Fig. 19 is a diagram illustrating still another relationship between a smartphone and event-related data displayed on the smartphone and yet another operation method.

Fig. 20 is a hierarchical diagram illustrating an example of the relationship between event-related data and the recording position of monitoring data.

Detailed Description

Hereinafter, embodiments will be described with reference to the drawings. Fig. 1 illustrates an outline of the mobile assistance device. Reference numeral 600 denotes a home gateway (HGW) that can be connected to a network as described later. The HGW600 is a mobile apparatus integrated with a moving device 618 (which may also be referred to as a transfer device). The HGW600 may also be stationary (referred to as a stationary type); together with the moving device, it may be collectively referred to as an assistance device 650.

The HGW600 (assistance device 650) includes at least a camera 611, a microphone 613, and a speaker 615. The camera 611, the microphone 613, and the like may also be referred to as sensors in a broad sense. A plurality of cameras may be provided. The HGW600 can control the moving device 618 based on data picked up by the sensors and an internal control application. Through this self-control, it can change its location, track a target, and so on.

Fig. 2 shows the relationship between the internal structure of the HGW600 and the network.

The server 1000 can be connected to the HGW600 via the internet 300. The HGW600 includes a memory 601, a control unit (may also be referred to as a system controller) 602, a device manager 603, a network interface (hereinafter, network I/F)605, a sensor control table 609, a camera 611, a microphone 613, a speaker 615, and the like.

The HGW600 can accommodate various communication methods via the network I/F605, which serves as its communication function (and may also be referred to as a communication device). The communication method of a sensor may vary depending on the manufacturer. For example, there are sensors using IEEE802.15.4 as a communication method, sensors using IEEE802.15.1, and sensors using IEEE802.15.3a. There are also sensors using IEEE802.11b, IEEE802.11a, and IEEE802.11g.

Therefore, the HGW600 of the present embodiment can be provided with an interface adaptable to each mode as a network interface.

The HGW600 includes a drive unit 618a that controls the driving of the aforementioned moving device 618. The HGW600 further includes a mapping unit 619 that can store movement positions and create a map.

The HGW600 includes an illuminance sensor 622 that detects the brightness of the surroundings, and a human detection sensor 623 that can detect whether or not a human is present nearby. When the ambient lighting is sufficient, the camera 611 may also serve as the human detection sensor 623.

The memory 601 (which may also be referred to as a control data manager) includes an application manager (hereinafter APP-Mg), an event manager (hereinafter EVT-Mg), and a configuration manager (hereinafter CONFIG-Mg). The APP-Mg manages a plurality of applications for controlling the various operations of the HGW600. The EVT-Mg manages event applications that control the various actions triggered by the occurrence of various events. The CONFIG-Mg manages a configuration application that recognizes the functions in the HGW600 and the various functions related to the HGW600 and determines, for example, the operation sequence and operation restrictions.

The system controller 602 collectively performs the sequential control of the modules in the HGW600. The operations of the HGW described below with reference to the flowcharts (determination, data processing, analysis, and communication) are executed based on the applications stored in the system controller 602 and the memory 601.

The EVT-Mg can control the camera 611, the microphone 613, the speaker 615, a recording manager (not shown), and the like. The EVT-Mg evaluates detection data from external sensors taken in via the network I/F605 and/or data from the camera 611 and the microphone 613, and can control the subsequent actions and behaviors. The CONFIG-Mg can perform the initial setting of each functional block inside the HGW600 and manage function restriction, function expansion, priority, operation time, and the like.

The device manager 603 can authenticate other devices operating in association with the HGW600 and register them in the memory 601. The device manager 603 can thus manage a plurality of other sensors, the lighting fixture 120, and the like connected via the network I/F605. The device manager 603 also registers identification data of the server 1000 connected via the internet 300 and can identify the server 1000. Likewise, it registers identification data of a smartphone or the like connected via the internet 300 and can identify the smartphone.

The sensor/appliance control table 609 stores the names of the various sensors and appliances, their position information, and data for controlling, and/or restricting the control of, those sensors and appliances. Since the name and position information of each sensor can be displayed on a smartphone or the like, the user can confirm the type and installation position of each sensor.

The network I/F605 is connected to the other sensors and controlled objects in the home (the lighting fixture 120, …), for example via short-range wireless. In the figure, the lighting fixture 120 is representatively shown. The lighting fixture 120 includes an I/F unit 122 connected to the network, a light control unit 123, and a light emitting unit 124. The light control unit 123 can control the current of the light emitting unit 124 based on instructions given from the outside via the I/F unit 122, and through this control the lighting can be made brighter or darker. Sensors of various kinds exist: sensors for acquiring information, sensors for control, and sensors to be controlled.

The mapping unit 619 is described further. The mapping unit 619 stores movement positions and can create a map; for example, it can use images from the mounted camera 611. As its mapping function, the mapping unit 619 provides a SLAM (Simultaneous Localization And Mapping) function, which simultaneously performs self-position determination and map creation. Whereas a conventional moving body traveled randomly over the floor, the SLAM function creates a map of the target area and operates while constructing a movement path according to that map. The SLAM function operates by referring to imaging data from the camera, internally generates a map of the surrounding environment, and can output current position information.
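To make the mapping function concrete, the following is a minimal, self-contained Python sketch of an occupancy-grid update of the kind a SLAM function maintains while moving. The grid resolution, the probability values, and the names OccupancyGrid and integrate_scan are illustrative assumptions, not part of the embodiment.

    import math

    class OccupancyGrid:
        """Toy occupancy grid for a SLAM-like mapping function (sketch)."""
        def __init__(self, width, height, cell_size=0.1):
            self.cell_size = cell_size
            self.cells = [[0.5] * width for _ in range(height)]  # 0.5 = unknown

        def mark(self, x, y, occupied):
            # Convert world coordinates (meters) to grid indices.
            i, j = int(y / self.cell_size), int(x / self.cell_size)
            if 0 <= i < len(self.cells) and 0 <= j < len(self.cells[0]):
                self.cells[i][j] = 0.9 if occupied else 0.1

    def integrate_scan(grid, pose, ranges, max_range=3.0):
        """Update the map from range readings taken at the current pose.

        pose: (x, y, heading) estimated from odometry/camera data.
        ranges: list of (bearing, distance) measurements.
        """
        x, y, th = pose
        for bearing, dist in ranges:
            a = th + bearing
            # Cells along the ray up to the hit point are free space.
            for s in range(int(dist / grid.cell_size)):
                d = s * grid.cell_size
                grid.mark(x + d * math.cos(a), y + d * math.sin(a), occupied=False)
            if dist < max_range:  # a hit: mark the end cell occupied
                grid.mark(x + dist * math.cos(a), y + dist * math.sin(a), occupied=True)

    grid = OccupancyGrid(50, 50)
    integrate_scan(grid, pose=(2.0, 2.0, 0.0), ranges=[(0.0, 1.2), (math.pi / 2, 2.5)])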

< illumination control and photographing >

Figs. 3A and 3B show a case where the mobile HGW600 enters a room 800a in a bright state and a case where it enters a room 800b in a dark state. For example, when the HGW600 enters room 800a on a regular inspection in order to photograph the room, it checks the measurement value of the illuminance sensor 622. Room 800a is bright, so imaging is performed. Room 800a may be bright because the lighting fixture 120 is lit at sufficient brightness, or because the room receives sufficient external light through the window 802.

In contrast, when the HGW600 enters room 800b in order to photograph it during the periodic inspection, the measurement value of the illuminance sensor 622 indicates that the illuminance is low. In this case, the HGW600 controls the lighting fixture 120 by lighting or dimming control to illuminate the room, and then performs imaging.

The above is a simple example of illumination control by the HGW600. The HGW600 can perform more complicated control depending on the circumstances.
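As a rough illustration of this behavior, the sketch below measures the illuminance, raises the lighting before imaging when the room is too dark, and restores the previous state afterwards. The threshold MIN_LUX and the device interfaces (read_illuminance, set_level, capture) are assumed for illustration; the stub classes merely simulate the dark room 800b.

    class StubIlluminanceSensor:
        def read_illuminance(self):
            return 40  # lux; simulates the dark room 800b

    class StubLightingFixture:
        def set_level(self, percent):
            print("lighting level set to", percent)

    class StubCamera:
        def capture(self):
            return b"image-bytes"

    MIN_LUX = 150  # assumed threshold for acceptable imaging brightness

    def inspect_room(sensor, lighting, camera):
        """Measure illuminance; if too dark, raise the lighting before imaging."""
        adjusted = sensor.read_illuminance() < MIN_LUX
        if adjusted:
            lighting.set_level(100)   # lighting/dimming control before imaging
        image = camera.capture()
        if adjusted:
            lighting.set_level(0)     # restore the original (dark) state
        return image

    inspect_room(StubIlluminanceSensor(), StubLightingFixture(), StubCamera())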

Referring to fig. 4, still another example of the control operation of the HGW600 will be described. This control operation is performed centrally by the system controller 602 in the HGW 600.

The HGW600 starts its operation based on, for example, a timer set by the user, a direct operation by the user, or detection information from another sensor. A direct operation by the user may be an operation instruction received via the internet from a smartphone, for example, or an action may be initiated by a specific voice command from the user. The mobile body of the HGW600 can also start moving based on a command from a stationary assistance device described later (cooperative operation of assistance devices).

In the operation flow shown here, the mapping of the rooms in the house has been completed in advance by the SLAM function executed in the past, and the HGW600 grasps its own position based on a combination of the house mapping data and other sensors (such as the camera and GPS).

When the operation is started, a list of areas to be visited is acquired (step SB1). Next, it is determined whether any area has not yet been reached by the HGW600. This determination is performed as follows: the system controller 602 refers to the list of target areas and determines whether any unreached area remains (step SB2). If no unreached area remains, the operation is terminated; if an unreached area remains, an unreached area with a high priority, for example, is selected, and it is determined whether the selected area is reachable by the HGW600 (step SB3).

In step SB3, if the area is determined to be reachable, the process proceeds to step SB4; if it is determined to be unreachable, the process proceeds to step SB21. An area may be determined to be unreachable because, for example, a sensor-equipped lock on the room is locked, or because an obstacle that cannot be avoided is found on the way by camera inspection.

After reaching the reachable area (step SB4), the HGW600 measures the ambient illuminance (step SB5). After the illuminance is measured, it is determined whether the light in the surrounding space is bright enough for the SLAM function to operate normally (step SB6).

If the brightness allows the SLAM function to operate normally, the predetermined operation is executed (step SB7). The predetermined operation is the operation that was the purpose of moving the HGW600 to that position.

For example, when the HGW600 has moved in order to measure the temperature of a room, it acquires temperature information from the temperature sensor of that room. When it has moved in order to photograph a flower decorating the room, the HGW600 photographs the flower using the camera 611. When it has moved in order to monitor the open/closed state of a window of the room, it acquires open/close information from the window sensor of that room, or images the window with the camera. If such an operation has not been completed, the process returns to step SB5. When the HGW600 determines that the flower's water is insufficient, it turns on the automatic water feeder to supply a predetermined amount of water to the flower. The HGW600 can also control the state of the illumination when photographing the flower.

When the above operation is completed (step SB8), the HGW600 determines whether the illumination was controlled before the operation (step SB9). If the HGW600 did control the illumination before the operation, it returns the illumination to its pre-control state (step SB10), and the process returns to step SB2.

If it is determined in step SB3 that the unreached area cannot be reached, the area is excluded from the areas to be operated on (monitored) (step SB21), and the process returns to step SB2.

In the earlier step SB6, suppose the HGW600 determines that the light in the surrounding space is not bright enough for the SLAM function to operate normally. In this case, the process proceeds to step SB22, where it is determined whether the lighting fixture is a network control target. Next, if the lighting fixture is a network control target, it is determined whether control of the lighting fixture is prohibited (step SB23); at this time, data in the device manager 603 and/or the sensor control table 609 (see fig. 2) is used as reference data.

If the lighting fixture is not a network control target, or if its control is prohibited, the area is excluded from the target areas in step SB21.

When control of the lighting fixture is not prohibited, it is determined whether the human detection sensor 623 has detected a human (step SB24). When the human detection sensor 623 detects a human, the area is basically excluded from the target areas (step SB21), except for specific operation purposes. A specific operation purpose is a case in which the area is set as a monitoring target even when a person is present in the room. For example, the open/closed state of a window or curtain may need to be monitored even when a person is present on a hospital bed. Such a room can therefore be registered in advance, for example in the mapping unit 619, as an exceptional room (area), that is, as a mandatory inspection area.

In step SB24, when the human detection sensor does not detect a human, it is next determined whether the upper limit of the number of dimming attempts has been reached. This check refers to the number of lighting control operations performed so far in the current area and confirms that the number of dimming attempts has not exceeded a predetermined count; a limit is thus placed on the number of lighting control operations.

If the upper limit has not been reached, the initial state of the illumination is recorded (stored) (step SB26), and control of the illumination is started (step SB27). If the upper limit has been exceeded, the process proceeds to step SB10.

The illumination can be controlled by adjusting the overall brightness and the color tone. When control of the illumination is started, the process proceeds to step SB5. In this control loop, when no unreached areas remain other than the excluded areas, the operation is terminated. The HGW600 then, for example, automatically returns to the charging station and starts charging.
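The flow of fig. 4 can be condensed into the following Python sketch of steps SB1 to SB27. The helper methods on the hgw and area objects (reachable, measure_illuminance, slam_ok, human_detected, and so on) and the attempt limit are assumptions; the sketch only mirrors the branching described above, not the actual implementation of the embodiment.

    MAX_DIM_ATTEMPTS = 3  # assumed upper limit on dimming attempts

    def patrol(hgw, areas):
        pending = list(areas)                          # SB1: list of target areas
        while pending:                                 # SB2: unreached areas remain?
            area = max(pending, key=lambda a: a.priority)
            if not hgw.reachable(area):                # SB3
                pending.remove(area)                   # SB21: exclude the area
                continue
            hgw.move_to(area)                          # SB4
            attempts, lit = 0, False
            while True:
                lux = hgw.measure_illuminance()        # SB5
                if hgw.slam_ok(lux):                   # SB6: bright enough for SLAM
                    hgw.do_task(area)                  # SB7: purpose of the visit
                    break                              # SB8: task finished
                if (not area.light_networked           # SB22: network control target?
                        or area.light_prohibited       # SB23: control prohibited?
                        or (hgw.human_detected() and not area.mandatory)  # SB24
                        or attempts >= MAX_DIM_ATTEMPTS):
                    break
                if not lit:
                    area.saved_light = hgw.light_state(area)  # SB26: record state
                    lit = True
                hgw.adjust_light(area)                 # SB27: start illumination control
                attempts += 1
            if lit:
                hgw.restore_light(area)                # SB9, SB10: restore illumination
            pending.remove(area)
        hgw.return_to_charger()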

The above example describes a case in which the HGW600 dims the lighting fixture according to the brightness of the surrounding environment. Needless to say, imaging by the camera 611 may also be linked to the dimming of the lighting fixture. In addition, although in step SB24 the light is not adjusted when the human detection sensor detects a human, the microphone 613 and/or the speaker 615 may be turned on at that point.

Fig. 5 shows an example of setting the lighting control operation of the HGW600 from an external smartphone GUI-1. The HGW application on the smartphone GUI-1 can be started, the HGW600 accessed, and a list of lighting fixtures displayed. A control-prohibited time period can then be set for each lighting fixture.

In the example of fig. 5, the lighting G (for example, the entrance lighting fixture) is prohibited from being controlled from 19:00 to 21:00. Numeric values (here, times) can be input by various methods, such as selecting from a list of numbers or entering them via numeric keys displayed on the screen.

For the lighting F1-1 (for example, the first-floor living room), control is prohibited from 17:00 to 23:00.

The above shows an example in which prohibition of illumination control by the HGW600 is set from the smartphone GUI-1. Other commands may also be sent from the smartphone GUI-1 to the HGW600. For example, the HGW600 may transmit video of the destination area to the smartphone GUI-1 in response to a command from the smartphone GUI-1, or a list of what control was performed during one day may be sent to the smartphone GUI-1.
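A prohibition schedule of the kind set in fig. 5 could be checked as in the sketch below; a check like this would sit in front of the prohibition test of step SB23 in the fig. 4 flow. The fixture names, the schedule format, and the function control_allowed are illustrative assumptions.

    from datetime import time

    # Assumed per-fixture prohibition windows, mirroring the fig. 5 example.
    PROHIBITED = {
        "lighting_G":    (time(19, 0), time(21, 0)),   # entrance lighting
        "lighting_F1-1": (time(17, 0), time(23, 0)),   # first-floor living room
    }

    def control_allowed(fixture, now):
        """Return False while the fixture is inside its control-prohibited window."""
        window = PROHIBITED.get(fixture)
        if window is None:
            return True
        start, end = window
        return not (start <= now <= end)

    print(control_allowed("lighting_G", time(20, 30)))   # False: prohibited
    print(control_allowed("lighting_G", time(22, 0)))    # True: allowed again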

In addition, the assistance system can be set to a full standby mode. For example, the householder goes out for a period of time (e.g., 1 to 2 weeks) and no one is scheduled to be at home. In this case, the full standby mode can be selected from a menu screen of the smartphone GUI-1, and full standby can be set to on by operating the button AraS101.

As described above, the assistance device includes the following configurations:

(A1) … a moving body; a sensor mounted on the moving body; and a control unit that determines the brightness of the surrounding space from the output of the sensor and outputs a control signal for controlling the operation of another appliance based on the determination result.

(A2) … (A1) further includes a mapping unit that stores a plurality of areas, and the control unit determines a destination area from among the plurality of areas and measures the illuminance in the determined destination area.

(A3) … (A1) further includes a mapping unit that stores a plurality of areas, and the control unit determines a destination area from among the plurality of areas, measures the illuminance in the determined destination area, and adjusts the illuminance of the lighting fixture in the determined destination area when the measured illuminance does not satisfy a predetermined value.

(A4) … in (A3), the device further includes a camera, and the camera performs imaging when the illuminance of the lighting fixture in the determined destination area is adjusted to be higher.

(A5) … in (A3), the control unit includes a human detection sensor, and when the human detection sensor detects a human, the control unit stops the adjustment of the illuminance of the lighting fixture.

(A6) … in (A3), the control unit includes a human detection sensor, and when the human detection sensor detects a human, the control unit stops the adjustment of the illuminance of the lighting fixture and activates the microphone and/or the speaker.

(A7) … (A3) further includes a camera and the mapping unit that stores the plurality of areas; the control unit temporarily stores the measured value of the illuminance when the measured value does not satisfy a predetermined value, performs imaging with the camera when the illuminance of the lighting fixture in the determined destination area is adjusted to be higher, and, after the imaging by the camera ends, adjusts the illuminance of the lighting fixture back to the temporarily stored measured value.

(A8) … in (A7), the control unit also stores the color tone of the illumination before the illuminance of the lighting fixture is adjusted, and returns the color tone of the lighting fixture to its pre-adjustment state after the imaging by the camera ends.

(A9) … in (A3), the control unit does not control the lighting fixture during a time period in which control of the illuminance of the lighting fixture is prohibited.

(A10) … in (A3), the control unit may control the lighting fixture to perform a flashing operation.

(A11) … in (A3), the control unit outputs a control signal for controlling the lighting when the moving body moves to a destination based on an instruction from a stationary assistance device.

< collaboration of sensor information >

The mobile assistance device 650 can start operation based on some detected information from a first sensor. In this case, second and third sensors may cooperate.

Fig. 6 is an explanatory diagram showing the schematic relationship between the HGW600 and the group of sensors and controlled devices in the home (including lighting fixtures, an air conditioner, a refrigerator, a television, an iron, an automatic door, a fire extinguishing apparatus, and the like).

For example, even when the air conditioner is controlled to lower the temperature of a room, the temperature may not fall. In such a case, the cause may not be the air conditioner: a window may be open and the cold air may be leaking outside. Alternatively, although the air conditioner is controlled to lower the room temperature, the temperature may not be adjusted accurately because the indoor temperature sensor is broken.

The embodiment described here can provide a system that, when such a problem occurs, cooperates with other sensors and adjusts the indoor environment to a favorable state.

In fig. 6, the HGW600 can communicate with a sensor group and a controlled device in a home by a wired or wireless method. As a communication method, Bluetooth (registered trademark), ZigBee (registered trademark), Z-Wave (registered trademark), Wi-Fi (registered trademark), or the like can be used.

Reference numeral 2000 denotes various sensor groups and controlled device groups. They may also be referred to as a group of home network terminals or Internet-of-Things elements (a so-called IoT element group). Hereinafter, each sensor and each controlled device included in the IoT element group 2000 will be described.

The sensor 2100 is an example of a sensor that detects an event. For example, a switch 2102 is provided on a substrate 2101, and one end of a flap 2103 is attached to one end of the substrate 2101 via a hinge. When a door or window opens, the rotating part of the flap 2103 separates from the substrate 2101 and the switch 2102 turns on. Power is then supplied to the power supply circuit formed on the substrate 2101, and the radio wave transmitter on the substrate 2101 is activated and outputs a radio wave containing a predetermined sensor ID. While the switch 2102 is on (i.e., the door or window is open), this radio wave is captured by the HGW600, so the HGW600 can recognize that the door or window is open. When no radio wave containing the predetermined sensor ID is received, the HGW600 can recognize that the door or window is closed.

The sensor 2110 is an example of a sensor that detects another kind of event. For example, a photoelectric converter (photoelectric conversion panel) 2112 is mounted on a substrate 2111, and its output drives a radio wave transmitter 2113. When not irradiated with light, the photoelectric converter 2112 discharges immediately and loses power. Therefore, when the curtain is opened or the lighting fixture is turned on, a radio wave containing the sensor ID is output from the radio wave transmitter 2113; when the curtain is closed or the illumination is off, the radio wave transmitter 2113 stops and the radio wave output ceases. The sensor 2110 can therefore be used to detect the opening/closing of a curtain, the on/off of illumination, and the like.

Further, a color filter may be provided on the light receiving surface of the photoelectric converter 2112 to avoid a reaction to unnecessary light.

A second sensor similar to the sensor 2110 may be added for detecting the opening and closing of the curtain. The second sensor may be configured so that when the curtain is open, its light is blocked by the curtain and its switch turns off, and when the curtain is closed, light reaches it, its switch turns on, and it outputs a radio wave containing the ID of the second sensor for a certain time. In this way, when one sensor fails, the HGW600 can easily determine the abnormality. The HGW600 system can thus improve the detection capability of the curtain open/close detection function.
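A minimal sketch of how the HGW600 might cross-check the two complementary curtain sensors: because the transmitters are wired oppositely, exactly one beacon should be heard at any time, and hearing both or neither suggests a failed sensor. The sensor IDs and the function curtain_state are invented for illustration.

    def curtain_state(beacon_ids, sensor_a="curtain-A", sensor_b="curtain-B"):
        """Infer the curtain state from which sensor beacons are currently heard.

        sensor_a emits while the curtain is open; sensor_b emits while it is
        closed (complementary wiring). beacon_ids is the set of sensor IDs
        whose radio waves the HGW currently receives.
        """
        a, b = sensor_a in beacon_ids, sensor_b in beacon_ids
        if a and not b:
            return "open"
        if b and not a:
            return "closed"
        return "sensor fault suspected"  # both or neither: inconsistent

    print(curtain_state({"curtain-A"}))                # open
    print(curtain_state({"curtain-A", "curtain-B"}))   # sensor fault suspected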

The sensor 101 is a more advanced sensor composed of an integrated circuit. It includes a memory 112 and a network I/F115, as well as functions 116 and 117 serving as detection elements. However, the sensor type is not limited to this, and various types can be used.

The memory 112 includes an application manager (APP-Mg), an event manager (EVT-Mg), and a configuration manager (CONFIG-Mg). CONFIG-Mg manages various applications for controlling the overall operation of the sensor system. EVT-Mg manages event applications that determine the next actions of the sensor 101 based on detection data from the functions 116 and 117. The functions 116 and 117 include various elements according to the purpose of detection: for example, a camera and a microphone, as well as a heat sensor, a temperature sensor, a humidity sensor, an illuminance sensor, a pressure sensor, a switch, and the like. The sensor 101 may include one or more detection elements according to its purpose.

The sensors 101, 102, 103, … are disposed at various positions in the home and can be used, for example, as sensors for detecting the opening and closing of doors, the emission of certain sounds, the movement of people, and the opening and closing of windows, and for imaging.

Although the above description concerns the mobile HGW600, a stationary HGW600a may be added. In this case, the stationary HGW600a is set, for example, as a slave of the HGW600. Since this HGW600a has the same structure and functions as the HGW600 described with reference to fig. 2, except for the moving body, a detailed description thereof is omitted.

Reference numeral 2121 denotes a fixed camera provided, for example, in a parking lot, at the entrance, or at a door, and it functions as a sensor. The lights 2131, 2132, and 2133, the fire extinguishing device 2126, and the like are controlled devices in the rooms of the home. A temperature sensor 2122 provided in the kitchen or at an indoor temperature measurement site, a pressure sensor 2133 attached to a window glass edge, a door, or the like, a fire alarm sensor 2125, and a microphone (not shown) also belong to the sensor group.

By combining and making effective use of the characteristics of the sensor group and the controlled device group, the mobile HGW600 and the stationary HGW600a can draw out capabilities beyond those of any single sensor or single controlled device.

Fig. 7 is a flowchart showing the system operation in the full standby mode. The householder may, for example, go out for 1 to 2 weeks. The system of the embodiment can be set to the full standby mode at such a time.

Assume that the system has been started in the full standby mode (step SC1), and that a suspicious moving body is detected by the camera of the stationary HGW600a (step SC2). The HGW600a activates its camera periodically or based on a detection signal from a human sensor or the like, for example, and starts shooting. The HGW600a can detect a suspicious moving body by processing the captured image data with a movement detection circuit. When a suspicious moving body is detected, the HGW600a photographs the object as an event. The captured data is recorded on a recording medium connected to the home network (for example, a USB-connected recording/playback device can be used), or on a recording medium in a server via the network.

When the HGW600a detects a suspicious moving body, it notifies the mobile HGW600 of the detection (step SC3). The HGW600a continues shooting the suspicious moving body, but the suspicious moving body sometimes moves out of its field of view (step SC4); for example, it may move to another room or to the entrance. In this case, the HGW600a notifies the mobile HGW600 that the suspicious moving body has moved to another room or to the entrance (step SC5).

In this case, it is preferable that the HGW600a notify the mobile HGW600 of which room the suspicious moving body has moved to, based on mapping information in which such rooms are registered in advance.

The mobile HGW600 can autonomously move to the room or entrance to which the suspicious moving body has moved, photograph the suspicious moving body, and transmit the video data to be recorded on the recording medium.
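The stationary-to-mobile handoff of fig. 7 might look roughly like the following sketch; the message format and the method names (notify, dispatch, guess_destination) are invented for illustration and assume duck-typed HGW objects rather than any interface defined by the embodiment.

    def stationary_monitor(hgw_a, mobile_hgw, map_info):
        """Fig. 7 sketch: stationary HGW600a detects and hands off to mobile HGW600."""
        frame = hgw_a.capture()
        if hgw_a.detect_motion(frame):                      # SC2: suspicious moving body
            mobile_hgw.notify({"event": "suspicious_body"})  # SC3: tell the mobile unit
            while hgw_a.target_in_view():
                hgw_a.record(hgw_a.capture())               # keep shooting while visible
            # SC4/SC5: target left the field of view; guess the destination room
            # from pre-registered mapping information and dispatch the mobile unit.
            room = map_info.guess_destination(hgw_a.last_heading())
            mobile_hgw.dispatch(room)                       # mobile unit moves and shoots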

Fig. 8 is a flowchart showing another system operation in the full standby mode. The householder may, for example, go out for 1 to 2 weeks; at such a time the system of the embodiment may be set to the full standby mode.

For example, the mobile HGW600 detects a suspicious sound with its microphone (steps SD1, SD2). Even if it photographs its surroundings with the camera, the mobile HGW600 may not be able to capture the suspicious object causing the suspicious sound.

The mobile HGW600 then acquires data from the sensors installed in each room of the home (window sensors, door sensors, pressure sensors, temperature sensors, and the like) and analyzes the data (step SD3).

Through this data analysis, the mobile HGW600 determines the reliability of the suspected sound, that is, whether it is a sound that has not been customary so far (an abnormal sound). This determination also uses the learning results for sounds detected in the past. For example, in an area where a train or car often passes nearby and generates vibration sounds, the mobile HGW600 uses the learning results and does not determine such a sound to be suspicious even when it detects the same sound again. Examples of abnormal sounds include the sound of a window glass being struck or broken, the sound of a collision, and creaking sounds.

When pets (dogs, cats, birds, etc.) are present in the house, their sounds can be analyzed by the sound analysis function and excluded from the abnormal sound determination.

When the reliability of the suspected sound is high, the direction and location of its occurrence are determined (step SD5). The mobile HGW600 then moves to the area where the suspected sound occurred based on the mapping information, points the camera in the direction of the sound, and captures the event (step SD6).

The mobile HGW600 can image its movement path with the camera even while moving, and can move so as to avoid an obstacle when one is present. Captured images of obstacles and the like are registered in advance in the mapping unit 619 of the mobile HGW600. Therefore, when an obstacle exists on the movement path, its presence can be determined immediately by image comparison.

When shooting with the camera as described above, the HGW600 can turn on nearby lighting equipment if the surrounding space is dark, as described earlier. In this case, the illuminance data of the illumination is also acquired, and if the illuminance is insufficient, the administrator can be notified of the insufficiency as a warning.
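A condensed sketch of the fig. 8 flow (steps SD1 to SD6) follows. The learned-sound set and the helper methods are assumptions; a real implementation would use proper audio classification and sound-source localization rather than the placeholders shown here.

    LEARNED_NORMAL = {"train_vibration", "pet_bark"}   # assumed learned categories

    def on_sound(hgw, audio):
        label = hgw.classify(audio)             # SD2: suspicious sound candidate
        if label in LEARNED_NORMAL:             # learning result: a customary sound
            return
        readings = hgw.poll_home_sensors()      # SD3: window/door/pressure/temp data
        if not hgw.confirms_suspicion(label, readings):  # SD4: reliability check
            return
        room, direction = hgw.localize(audio)   # SD5: direction and location of sound
        hgw.move_to(room)                       # SD6: move via mapping information,
        hgw.aim_camera(direction)               #      avoiding known obstacles
        hgw.record_event()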

Fig. 9 shows an embodiment in which the mobile HGW600 determines whether a controlled device (for example, an air conditioner) is functioning normally after starting it. The mobile HGW600 turns on the cooling air conditioner of room A1, for example, in response to a remote operation or a user's voice command (steps SE1, SE2). When a predetermined time has elapsed after the air conditioner starts the cooling operation (steps SE3, SE4), the HGW600 checks whether the temperature in room A1 has dropped to near the set value (step SE5). The mobile HGW600 stores the temperature of room A1 at the time the cooling air conditioner was started and can compare it with the temperature of room A1 after a certain time has elapsed.

If the temperature of room A1 is within the desired temperature range, the process ends (step SE6). However, when the temperature of room A1 does not fall within the predetermined range, data is collected from the various sensors provided in room A1.

Data is collected, for example, from a window sensor, a door sensor, and sensors of heat sources (a gas furnace, a heater, or the like). If a window is open, a door is open, or a heat source is on, there is a high probability that this is why the temperature of room A1 has not decreased. The mobile HGW600 therefore moves to the scene, performs imaging, and notifies the administrator (step SE15).

If the cause remains unknown after analyzing the sensor data, the system moves to the scene, performs imaging, and notifies the administrator (step SE14).

In the above-described embodiment, even if the cooling air conditioner is replaced with a heating air conditioner, the normal operation or the abnormal state of the heating air conditioner can be determined by the same processing.
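The fig. 9 consistency check could be sketched as follows. The settling time, the temperature tolerance, and the hgw interfaces (read_temperature, aircon_on, poll_room_sensors, go_photograph_and_notify) are illustrative assumptions; the step comments map the branches back to the flowchart.

    import time

    SETTLE_SECONDS = 15 * 60    # assumed wait before judging (SE3, SE4)
    TOLERANCE_C = 2.0           # assumed acceptable gap from the set point

    def check_aircon(hgw, room="A1", target_c=24.0):
        """Fig. 9 sketch: verify that cooling actually lowers the room temperature."""
        t_start = hgw.read_temperature(room)       # stored for later comparison
        hgw.aircon_on(room, target_c)              # SE1, SE2: start cooling
        time.sleep(SETTLE_SECONDS)                 # SE3, SE4: wait, then check
        t_now = hgw.read_temperature(room)         # SE5: near the set value?
        if t_now <= target_c + TOLERANCE_C and t_now < t_start:
            return "ok"                            # SE6: desired range reached
        # Temperature did not fall: look for an explanation in other sensors.
        readings = hgw.poll_room_sensors(room)
        if readings.get("window_open") or readings.get("door_open") \
                or readings.get("heat_source_on"):
            hgw.go_photograph_and_notify(room)     # SE15: external cause found
            return "external cause"
        hgw.go_photograph_and_notify(room)         # SE14: cause unknown
        return "cause unknown"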

Fig. 10 is an explanatory view of still another embodiment of the mobile HGW600. The mobile HGW600 has a function of checking, periodically or in response to a user's instruction, whether the various sensors and controlled devices installed in the home are normal or abnormal.

For example, the lighting fixture 120 is turned on and off, and the output of the camera 611 and/or the illuminance sensor is checked. This makes it possible to determine whether the lighting fixture 120 is operating normally. On/off control may also be performed on each of a plurality of lighting fixtures to check whether the camera 611 and/or the illuminance sensor are operating normally. When on/off control is performed on each of the plurality of lighting fixtures and neither the camera 611 nor the illuminance sensor responds to any of them, the camera 611 and/or the illuminance sensor may be malfunctioning.

Further, the lighting fixture may be controlled and its illuminance measured and compared with past measurement data to determine the replacement timing of the lighting fixture. Since some types of lighting fixtures have a built-in illuminance sensor, the HGW600 can use the output data from that illuminance sensor.

The HGW600 may also switch the air conditioner 2127 on and off and determine whether the air conditioner 2127 is operating normally based on detection data from the temperature sensor 2122. Conversely, when the air conditioner is known to be normal, acquiring and analyzing the detection outputs of a plurality of temperature sensors makes it possible to determine which temperature sensor has failed.

Further, the camera 611 can capture the open/closed state of the window 2128, and whether the window 2128 is open can be determined from the image data. At the same time, by comparing this with the open/closed determination of the window sensor 2100, it can also be determined whether the window sensor 2100 is normal.

In this way, the logical consistency between the control state of a controlled device and the detection output of a sensor can be checked.
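Such a logical-consistency check between a controlled device and a sensor, here for the lighting fixture/illuminance sensor pair, could look like the sketch below; the threshold and the device interfaces (read_lux, set_light) are assumed for illustration.

    LUX_DELTA = 30.0   # assumed minimum illuminance change when a light toggles

    def check_light_and_sensor(hgw, fixture_id, sensor_id):
        """Toggle a lighting fixture and confirm the illuminance sensor reacts.

        Reports when the control state and the detection output do not
        logically match, in which case either side may be faulty.
        """
        before = hgw.read_lux(sensor_id)
        hgw.set_light(fixture_id, on=True)
        after_on = hgw.read_lux(sensor_id)
        hgw.set_light(fixture_id, on=False)
        after_off = hgw.read_lux(sensor_id)
        rose = after_on - before >= LUX_DELTA
        fell = after_on - after_off >= LUX_DELTA
        if rose and fell:
            return "consistent: fixture and sensor look normal"
        # Repeating this over several fixtures narrows the suspect: if the
        # sensor responds to none of them, the sensor itself is suspected.
        return "inconsistent: fixture or sensor suspected"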

As described above, the HGW600 of the present embodiment can perform maintenance of various sensors and various controlled devices. Its main functions are summarized below.

(1B) The device comprises a mobile device equipped with a camera, a microphone, a communication device, and a control device,

the control device comprising: a means for acquiring sound data of a sound detected by the microphone; a means for determining the direction of the sound and the area where the sound occurred; and a means for controlling the mobile device to move the camera to the area where the sound occurred and causing the camera to capture an image in the direction of the sound.

(2B) … in (1B), the device includes a means for limiting the determination, the control of the mobile device, and the imaging according to the type of the sound.

(3B) … in (1B), the device includes a learning means relating to the sound, and a means for limiting the determination, the control of the mobile device, and the imaging when the sound has been stored by the learning means.

(4B) … in (1B), the device includes a means for controlling the mobile device and turning on the illumination along the movement path when the brightness required for imaging is insufficient during movement.

(5B) … in (1B), the device includes a means for controlling the mobile device to image the movement path during movement, determining the presence or absence of an obstacle from the difference between past imaging data and the imaging data captured during the current movement, and controlling the mobile device so as to avoid the obstacle.

(6B) … in (4B), the illuminance of the illumination may be measured, and a warning may be given when the brightness of the illumination is insufficient.

(7B) … in (1B), the device includes: a means for controlling a first controlled device; a means for acquiring a detection output of a first sensor that reacts to a phenomenon caused by the first controlled device; and a means for checking the logical consistency between the control state of the first controlled device and the detection output of the first sensor.

(8B) … in (7B), the first controlled device is a lighting fixture, and the first sensor is an illuminance sensor.

(9B) … in (7B), there is one first controlled device and there are a plurality of first sensors.

(10B) … in (7B), there is one first controlled device and there are a plurality of first sensors; the plurality of first sensors are turned on at different timings, and a means for determining which of the plurality of first sensors is malfunctioning is provided.

< recording of events and event checking function >

The present embodiment is not limited to the above.

Fig. 11 is a diagram showing an example of the overall configuration of a network system using a mobile assistance device according to an embodiment.

In fig. 11, a server 1000 can be connected to a home gateway (hereinafter, HGW)600 via the internet 300. The HGW600 includes a system controller 602, a device manager 603, a network interface (hereinafter referred to as a network I/F)605, a recording manager 607, a camera 611, a microphone 613, a speaker 615, and the like. Also, the HGW600 includes a sensor control table 609.

The memory (control data management unit) 601 is as described above. The system controller 602 can collectively perform sequential control of the respective modules in the HGW 600.

The EVT-Mg can also control the recording manager 607.

In addition, the sensor control table 609 stores the names of the registered sensors 101, 102, 103, and 104, the position information of each sensor, and data for controlling each sensor. Since the name and position information of each sensor can be displayed on the smartphone GUI-1, the user can confirm the type and installation position of each sensor.

The network I/F605 is connected to the other sensors 101, 102, 103, … in the home, for example via short-range wireless. The structure of the sensor 101 is representatively shown. The sensor 101 also includes a control data management unit 112 and a network I/F115, as well as functions 116 and 117 serving as detection elements. However, the sensor type is not limited to this, and various types can be used.

The memory (control data management unit) 112 includes an application manager (APP-Mg), an event manager (EVT-Mg), and a configuration manager (CONFIG-Mg). CONFIG-Mg manages various applications for controlling the overall operation of the sensor system. EVT-Mg manages event applications that cause the sensor 101 to perform its next actions based on detection data from the functions 116 and 117. The functions 116 and 117 include various elements according to the purpose of detection: for example, a camera or a microphone, as in the HGW600, as well as a heat sensor, a temperature sensor, a humidity sensor, an illuminance sensor, a pressure sensor, a switch, and the like. The sensor 101 may include one or more detection elements according to its purpose.

The sensors 101, 102, 103, … are disposed at various positions in the home, for example as a sensor for detecting the opening and closing of a door, a sensor for detecting the emission of a certain sound, a sensor for detecting the movement of a person, a sensor for detecting the opening and closing of a window, and a sensor for capturing images.

In the above system, when a detection signal is output from any one or more of the sensors 611 (camera) and 613 (microphone) and the other sensors 101, 102, …, the control data management unit 601 recognizes the occurrence of an event. The control data management unit 601 then controls the camera 611 via the recording manager 607. The camera 611 thereby transmits the monitoring data buffered from before the event occurrence time point (for example, from 10 minutes before) to the storage medium via the recording manager 607 and the control data management unit 601, and continues to transmit the monitoring data captured for a certain duration thereafter (for example, 3, 5, 10, 20, or 30 minutes). Along with the monitoring data, the event-related data (which may also be referred to as event attribute data) at the time the event was detected is also transmitted to the storage medium 1010 in the present system.

The event-related data can include, for example, any one or more of the occurrence time of the event, the type of sensor that detected the event, the position data of the sensor, the recording start time, the recording end time, and the like.
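The event-related data could be represented roughly as the record below, together with a pre-event ring buffer that realizes recording "from 10 minutes before" the event. The field names, the one-frame-per-second sizing, and the function names are illustrative assumptions, not structures defined by the embodiment.

    from collections import deque
    from dataclasses import dataclass

    @dataclass
    class EventRecord:
        """Event-related (attribute) data stored in the management area (sketch)."""
        occurred_at: float        # event occurrence time (epoch seconds)
        sensor_type: str          # type of the sensor that detected the event
        sensor_position: str      # position data of the sensor
        rec_start: float          # recording start time
        rec_end: float            # recording end time
        data_address: int         # record address of the corresponding monitoring data

    # Pre-event buffer: keep the most recent frames so that recording can start
    # from, e.g., 10 minutes before the event (assumed rate: 1 frame per second).
    PRE_EVENT_FRAMES = 10 * 60
    pre_buffer = deque(maxlen=PRE_EVENT_FRAMES)

    def on_frame(frame):
        pre_buffer.append(frame)   # continuously buffer the latest frames

    def on_event(store):
        store.extend(pre_buffer)   # flush the buffered pre-event monitoring data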

In fig. 11, the storage medium is, for example, a memory in the server 1000, but may not necessarily be a storage medium in the server. The storage location of the monitoring data may be a storage medium in the HGW600 or a storage medium connected via the network I/F605. The storage medium 1010 includes a data area 1011 and a management area 1021. The monitoring data 1012 is stored in the data area 1011, and the event-related data 1022 is stored in the management area 1021.

The monitoring data 1012 may include not only video data but also measurement data from sensors, for example the temperature change, humidity change, or pressure change of a specific location. Management data for playing back the monitoring data is described in the management area 1021. The management data includes the aforementioned event-related data and the recording address of the monitoring data corresponding to that event-related data. When a plurality of events occur, there are a plurality of pieces of event-related data and a plurality of pieces of monitoring data corresponding to them.

The event-related data includes the category of the event (which may also be referred to as the sensor output). Further, since monitoring data (for example, a monitoring video) is recorded in response to an event, the event-related data includes the corresponding recording start time and recording end time.

Fig. 12 shows the passage of time when monitoring data is recorded on the storage medium as events occur. Here, various sensors in the living room of a home are assumed: an open/close detection sensor for door 1, an open/close detection sensor for door 2, an open/close detection sensor for window 1, an open/close detection sensor for window 2, a microphone, and a movement detection sensor (using captured images, an infrared sensor, or the like). The HGW600 is disposed at a corner of the ceiling of the living room, and its camera can photograph the living room.

Suppose a first child enters the room through door 1, opening door 1 at time t1 and closing it at time t2. At time t1, the movement of the person is detected by the camera. When the door is opened or closed, for example, about 3 minutes of video are recorded; when movement detection continues, video recording continues during the detection. During the recording, sound is also picked up by the microphone 613. As a result, the monitoring data resulting from the first event group (2 events) is recorded as recording data Rec1 on the storage medium 1010 (if a storage medium exists in the HGW600 or is directly connected to it, that storage medium may be used instead). The event-related data at this time includes the ID of the sensor attached to door 1, the ID of the camera 611, and the start and end times of recording Rec1. The management data (event-related data) also includes the address on the storage medium where recording Rec1 is stored.

After a while, a second child enters the room through door 2, opening door 2 at time t3 and closing it at time t4. At time t3, the movement of the person is likewise detected by the camera. The monitoring data resulting from the second event group (2 events) is recorded as recording data Rec2 on the storage medium.

Next, assume the following: the microphone 613 picks up a loud sound at time t5, the movement of a person is detected at time t6, and the opening and closing of door 1 are detected at times t7 and t8. For example, the second child sings loudly, the movement of the second child is detected, and the first child leaves through door 1. The monitoring data resulting from the third event group (3 events) is thereby recorded as recording data Rec3 on the storage medium.

After some time, a loud sound is picked up by the microphone 613 at time t9, the movement of a person is detected by the camera 611 at time t10, window 1 is opened at time t11, and a loud sound is again picked up by the microphone 613 at time t12. For example, the second child sings loudly, then moves to window 1, opens it, and sings loudly again. The monitoring data resulting from the fourth event group (4 events) is thereby recorded as recording data Rec4 on the storage medium.

Next, at time t13, the first child enters or leaves through door 1, and at time t14, window 1 is closed. The monitoring data arising from this fifth event group (3 events) is recorded as recording data Rec5 on the storage medium.

When event-related data and monitoring data are recorded on the storage medium as described above, the HGW 600 can present the monitoring data to the smartphone GUI-1 in various forms at the time of checking.

Fig. 13 shows an example of the internal configuration of the system controller 602 shown in fig. 11.

The event determiner 625 evaluates the detection signals from the sensors as described for fig. 12. When an event is detected, the recording commander 621 transfers the monitoring data 1012 to the storage medium 1010 and instructs recording. At the same time, the corresponding event-related data is transferred to the storage medium 1010 and recorded.
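A minimal sketch of this flow follows, with all class and method names assumed: the determiner turns a raw detection signal into an event, and the commander writes both the monitoring data and its event-related data to the medium.

```python
import time

class RecordingCommander:
    """Stand-in for recording commander 621 (structures are illustrative)."""
    def __init__(self):
        self.data_area = []        # stands in for data area 1011
        self.management_area = []  # stands in for management area 1021

    def record(self, sensor_id: str, category: str, monitoring_data: bytes):
        address = len(self.data_area)
        self.data_area.append(monitoring_data)
        # Event-related data is recorded alongside the monitoring data.
        self.management_area.append({
            "sensor_id": sensor_id,
            "category": category,
            "start": time.time(),
            "address": address,
        })

def event_determiner(signal: dict, commander: RecordingCommander):
    """Decide whether a detection signal constitutes an event (assumed rule)."""
    if signal.get("detected"):
        commander.record(signal["sensor_id"], signal["category"], b"...video...")

commander = RecordingCommander()
event_determiner({"sensor_id": "door1", "category": "door_open_close",
                  "detected": True}, commander)
print(commander.management_area)
```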

The event determiner 625 may also determine an event when a specific command signal is transmitted from the smartphone GUI-1. For example, when a first user carrying the smartphone GUI-1 has a phone conversation with a second user at home, the first user can operate a specific key of the smartphone GUI-1 to transmit an event start signal to the HGW 600. The first user can likewise transmit an event start signal even when not on a call. Further, a second user in the home can deliberately operate a sensor to transmit an event start signal to the HGW 600; for example, for maintenance, the second user deliberately operates a sensor that senses the on/off of the lighting (for example, by covering and uncovering its light-receiving part).

When a user wants to check the monitoring data, the user can request playback of the monitoring data related to a desired event from the HGW 600 (system controller 602) via the smartphone GUI-1 or the networked television receiving apparatus GUI-2.

To this end, the system controller 602 includes a playback controller 623 configured to play back arbitrary event-related data and monitoring data from the storage medium 1010. The playback controller 623 provides a fast-forward function, a rewind function, a frame-by-frame playback function, and an event handler that performs cut-off processing on events. Since a large amount of event-related data and monitoring data is stored on the storage medium 1010, the system controller 602 operates so that a user can efficiently check the desired monitoring data. For this purpose, the system controller 602 includes a filter unit 631 and a display style processing unit 629, which can classify and select various events and generate a display list or display arrangement. The generated display arrangement or the played-back monitoring data is transmitted to a monitor such as the smartphone GUI-1 or the television receiver GUI-2 via the display data output unit 627. The system controller 602 also includes a memory 624 for temporarily storing data and lists.

The system controller 602 communicates with the smartphone GUI-1 or the television receiver GUI-2 and transmits the generated display arrangement and the played-back monitoring data to the monitor. In accordance with instructions from the smartphone GUI-1 or the television receiver GUI-2, the playback controller 623 can apply the fast-forward, rewind, and frame-by-frame playback functions to the video captured for an event. The playback controller 623 also includes an event handler that processes event-related data and can, for example, order and select events.
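One way to picture this playback path is the sketch below; every class and method name is assumed, and the fast-forward and rewind functions are omitted.

```python
class FilterUnit:
    """Stands in for filter unit 631: selects event-related data by category."""
    def select(self, events: list[dict], categories: set[str]) -> list[dict]:
        return [e for e in events if e["category"] in categories]

class DisplayStyleProcessor:
    """Stands in for unit 629: turns selected events into a display list."""
    def make_list(self, events: list[dict]) -> list[str]:
        return [f'{e["start"]}  {e["category"]}  thumb@{e["address"]}'
                for e in events]

class PlaybackController:
    """Stands in for playback controller 623."""
    def __init__(self, management_area: list[dict]):
        self.management_area = management_area
        self.filter = FilterUnit()
        self.display = DisplayStyleProcessor()

    def display_list(self, categories: set[str]) -> list[str]:
        chosen = self.filter.select(self.management_area, categories)
        return self.display.make_list(chosen)

ctrl = PlaybackController([{"category": "door_open_close",
                            "start": 1000.0, "address": 0}])
print(ctrl.display_list({"door_open_close"}))
```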

Fig. 14 shows a state in which a menu is displayed on the screen of the smartphone GUI-1. Examples of the selection buttons in the menu include a monitoring data request button 501, an internet (1) connection button 502, an internet (2) connection button 503, a mobile phone activation button 504, a game (1) activation button 505, and a game (2) activation button 506. A sensor list button 507 is also provided; when it is operated, a list of the various sensors used to detect events can be displayed.

Suppose the monitoring data request button 501 is touched. The smartphone GUI-1 then displays a message such as "Which event's images do you want to check?" together with buttons 512, 513, and 514, labeled "All", "Specify", and "Normal", for the user.

When the user selects the "All" button 512 by a touch operation, the control data management unit 601 transmits to the smartphone GUI-1 the date and time of occurrence of the events of all sensors, together with a part (a thumbnail) of the monitoring data (image data captured by the camera) at the time each event occurred. Since a large amount of event-related data and monitoring data is stored on the storage medium 1010, the initial display is built around an event that occurred, for example, 5 hours before the current time: event-related data for several (3 to 5) events before and after it, and representative thumbnails of the corresponding monitoring data, are selected and displayed. The representative thumbnail is, for example, the monitoring data (image data) corresponding to the event occurrence time.
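The "a few events around an anchor five hours back" behavior might look like the following sketch; the centering rule and function name are assumptions, while the 5-hour anchor and 3-to-5 event window follow the text.

```python
import bisect

def initial_display(events: list[dict], now: float,
                    hours_back: float = 5.0, count: int = 5) -> list[dict]:
    """Pick about `count` events centered near the time (now - hours_back).

    `events` must be sorted by occurrence time ("start").
    """
    times = [e["start"] for e in events]
    anchor = now - hours_back * 3600
    i = bisect.bisect_left(times, anchor)   # event nearest the anchor time
    lo = max(0, i - count // 2)             # take a window around it
    return events[lo:lo + count]

evts = [{"start": t} for t in (0, 3600, 7200, 18000, 36000)]
print(initial_display(evts, now=36000.0))
```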

While the message "Which event's images do you want to check?" is displayed, the user can instead select the "Specify" button 513 by a touch operation. A list 517 of the locations to which the active sensors are attached (with names such as door 1 open/close, door 2 open/close, window 1 open/close, window 2 open/close, and so on) is then displayed. The user can select, by touch operation, one or more event sources whose images are to be checked. Fig. 15A shows an example in which items such as door 1 open/close, window 1 open/close, and movement detection are selected and confirmed. This example shows only a few simple events; in practice, more events and event names are set.

When the user selects the events whose images are to be checked and performs the decision operation 518, representative thumbnails of the monitoring data recorded when the selected events occurred and the corresponding event-related data are displayed, as described later. Here too, because a large amount of event-related data and monitoring data is stored on the storage medium 1010, the initial display is centered on an event that occurred, for example, 5 hours before the current time, with event-related data and representative thumbnails of the corresponding monitoring data for several (3 to 5) events before and after it selected and displayed.

While the smartphone GUI-1 displays the message "Which event's images do you want to check?", the user can also select the "Normal" button 514 by a touch operation. This button 514 becomes active after the "Specify" button 513 has been operated and a decision operation 518 has been performed. In this case, based on the previously designated events, event-related data and representative thumbnails of the corresponding monitoring data for several (3 to 5) events before and after an event that occurred, for example, 5 hours before the current time are selected and displayed.

Fig. 15A illustrates an example in which the selected events are managed independently for each kind and the event-related data is arranged in time series; this arrangement is explained with reference to fig. 18A, described later. However, the display of event-related data is not limited to this, and event-related data of different kinds of events may be displayed in combination through the setting shown in fig. 15B.

That is, as shown in fig. 15B, a combination button 518a may be displayed before the decision button 518b is operated. When the combination button 518a is operated, the event-related data of the selected items in the event list (here the two currently selected items: door 2 open/close and movement detection) can be combined and displayed in chronological order. That is, when the combination button 518a and the decision button 518b are operated in succession, the event-related data is arranged and displayed as described for fig. 18B, for example.
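Combining two event streams into one chronological display, as the combination button does, reduces to a merge by timestamp. The following sketch uses assumed data structures; since each per-category list is already in time order, a k-way merge keeps the combined display chronological without re-sorting everything.

```python
from heapq import merge

door2_events = [{"start": 100, "category": "door2"},
                {"start": 400, "category": "door2"}]
movement_events = [{"start": 250, "category": "movement"},
                   {"start": 500, "category": "movement"}]

combined = list(merge(door2_events, movement_events, key=lambda e: e["start"]))
print([(e["start"], e["category"]) for e in combined])
# [(100, 'door2'), (250, 'movement'), (400, 'door2'), (500, 'movement')]
```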

As described above, the user can notify the control data management unit 601 in advance of the events whose images are to be played, before requesting playback of the monitoring data related to the desired events.

Fig. 16A shows the operation screen displayed after the monitoring data request button 501 is operated from the menu of the smartphone GUI-1 shown in fig. 14. As described above, the message "Which event's images do you want to check?" is displayed together with the "All", "Specify", and "Normal" buttons 512, 513, and 514. Suppose the "Normal" button 514 is selected. An event list such as that shown in fig. 16B is then displayed. This list is generated by the playback controller 623 shown in fig. 13 reading the event-related data and monitoring data from the storage medium 1010, the filter unit 631 performing filtering, and the display style processing unit 629 generating the event list. The filtering may be performed in the following order: first, the playback controller 623 reads the event-related data from the storage medium 1010 and filters it, and only then plays back from the storage medium 1010 the monitoring data corresponding to the extracted event-related data.
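This ordering, filtering the lightweight event-related records first and only then touching the bulky monitoring data, is the usual index-then-fetch pattern. A minimal sketch, with assumed structures:

```python
def play_filtered(management_area: list[dict], data_area: list[bytes],
                  wanted_categories: set[str]) -> list[bytes]:
    # 1) Read only the lightweight event-related data and filter it.
    hits = [e for e in management_area if e["category"] in wanted_categories]
    # 2) Fetch monitoring data only for the hits, via the stored addresses.
    return [data_area[e["address"]] for e in hits]
```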

Although the above description requests and displays the event list through the smartphone GUI-1, the same operation can be performed through the television receiver GUI-2. When the television receiving apparatus GUI-2 is used, the operation can be performed by moving a cursor on the screen with a remote controller.

Although the thumbnails representing the monitoring data are drawn in simplified form in fig. 16B, in practice each shows a video image covering the viewing angle of the camera 611.

Now suppose the thumbnail 522 of the event 521 is selected from the list of fig. 16B by a touch operation. The playback controller 623 (shown in fig. 13) then starts continuous playback of the monitoring data from, for example, about 5 minutes before the occurrence of event 521, for a period of, for example, about 10 minutes, and transmits it to the monitor. The image at this time is shown in fig. 17.

The image of fig. 17 captures a person 525 opening a door 526, entering the room, walking to a bed 527, and lying down on it. The playback controller 623 displays a list of a plurality of pieces of monitoring data corresponding to a plurality of pieces of event-related data in accordance with instructions from the smartphone GUI-1 or the television receiver GUI-2, and when any item is selected from the listed monitoring data, plays the designated monitoring data continuously over its recording period.

As described below, the playback controller 623 (shown in fig. 13) can apply the fast-forward, rewind, and frame-by-frame playback functions to the video captured for an event, in accordance with instructions from the smartphone GUI-1 or the television receiver GUI-2.

Further, because the playback controller 623 can refer to the event-related data, it can sequentially and continuously fast-forward or normally play back a plurality of pieces of monitoring data associated with a plurality of events.

Likewise, the playback controller 623 can sequentially and continuously fast-forward or normally play back the plurality of pieces of monitoring data associated with a designated specific event.

The playback controller 623 shown in fig. 13 includes an event handler that handles a plurality of pieces of event-related data, and the event handler can accept or reject, in one operation, the pieces of event-related data corresponding to a specific event. For example, an event sometimes occurs as an isolated pulse: a loud sound, or a movement detection (for example, a blind swinging in the wind), may occur just once. In such cases it may be preferable to treat only detections that form a continuous series as significant, delete the isolated ones, and check the monitoring data based on the event-related information after this deletion processing.
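One plausible reading of this cut-off processing is to drop events that have no neighboring event within some time window, so that isolated pulses do not clutter the list. The window length and function name below are assumptions.

```python
def prune_isolated(events: list[dict], window: float = 60.0) -> list[dict]:
    """Keep only events with another event within `window` seconds.

    `events` must be sorted by start time; one-off pulses are removed.
    """
    kept = []
    for i, e in enumerate(events):
        near_prev = i > 0 and e["start"] - events[i - 1]["start"] <= window
        near_next = (i + 1 < len(events)
                     and events[i + 1]["start"] - e["start"] <= window)
        if near_prev or near_next:
            kept.append(e)
    return kept

events = [{"start": 0.0}, {"start": 30.0}, {"start": 500.0}]  # last is isolated
print(prune_isolated(events))  # the event at t=500 is dropped
```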

In the above-described embodiment, image data of a certain length (5, 10, 15, or 20 minutes) captured by the camera 611 upon event detection is stored as monitoring data. The length of monitoring data stored for each event can be changed arbitrarily, may differ according to the kind of event, and may further differ according to the time of day.
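Per-kind and per-time-of-day recording lengths could be held in a small table, as in the sketch below. Every value and the night-time policy are illustrative assumptions, not taken from the patent.

```python
from datetime import datetime

# Recording length (seconds) by event kind; values are illustrative only.
DURATION_BY_KIND = {"door_open_close": 300, "window_open_close": 300,
                    "movement": 600, "loud_sound": 900}

def record_seconds(kind: str, when: datetime) -> int:
    base = DURATION_BY_KIND.get(kind, 300)
    # Record longer at night (assumed policy for illustration).
    return base * 2 if when.hour < 6 or when.hour >= 22 else base

print(record_seconds("movement", datetime(2018, 4, 11, 23, 0)))  # -> 1200
```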

The arrangement of events (that is, of the thumbnails corresponding to the event-related data) can be set arbitrarily by an arrangement application, and the thumbnails corresponding to the event-related data are displayed according to that arrangement.

Fig. 18A shows a display example in which the event-related data and the thumbnails of the associated monitoring data are classified by event kind. The classification is illustrated with movement detection events, door 1 open/close events, window 1 open/close events, and lighting 1 on/off events.

Here, event-related data 526a to 526d relating to the opening and closing of door 1, and the corresponding thumbnails 525a to 525d, are shown on the smartphone GUI-1, arranged in chronological order of occurrence. If the user slides the touch surface of the smartphone GUI-1 in the direction of arrow 531a, event-related data and thumbnails of later times are displayed; sliding in the direction of arrow 531b displays those of earlier times.

Further, if the user slides the touch surface of the smartphone GUI-1 in the direction of arrow 532a, the event-related data and thumbnails relating to the opening and closing of window 1 are displayed, and sliding in the direction of arrow 532b displays those relating to movement detection.

As described with reference to figs. 13 and 15B, when event-related data and corresponding thumbnails are displayed, a plurality of event kinds may be displayed in combination. As shown in fig. 13, the control data management section 601 includes the filter section 631, which can filter and sort the event-related data by kind, or combine event-related data of different kinds for display. Fig. 18B shows an example in which event-related data and thumbnails of the corresponding monitoring data for door 2 open/close events and sound detection events are displayed together.

In this case too, sliding the touch surface of the smartphone GUI-1 in the direction of arrow 531a displays event-related data and thumbnails of later times, and sliding in the direction of arrow 531b displays those of earlier times.

Fig. 19 illustrates yet another relationship between a smartphone and the event-related data displayed on it, and yet another operation method. The previous display examples (figs. 18A and 18B) present the event-related data as a list. It is also possible, once display of monitoring data has been requested, to show the name of each event source in a tiled-window style as shown in fig. 19. When the user presses a desired window among the tiled windows (door 1, door 2, windows 1 to 4, lights 1 to 5, sound 561, sound-woman 562, sound-man 563, and so on), the display may transition, for example, to the state shown in fig. 16B. When sound 561 is selected, the event-related data for all sounds is displayed; when sound-woman 562 is selected, only the event-related data for women's voices is displayed, and when sound-man 563 is selected, only that for men's voices.

Fig. 20 shows the structure of the event-related data recorded in the management area 1021 and the monitoring data recorded in the data area 1011. The event-related data is classified by event kind, for example door open/close, window open/close, lighting on/off, air conditioner on/off, television receiver on/off, and movement detection. Under each kind there are sensor items (sensor 1, sensor 2, …), each carrying sensor identification data, and event data is described for each sensor item. The event data includes, for example, the occurrence time of the event, the recording start and end times of the monitoring data, the recording start and end addresses of the monitoring data, and a thumbnail address. The recording start address, recording end address, and thumbnail address point into the data area 1011, and by referring to these addresses the playback controller 623 can read the necessary data from the storage medium 1010 and play it back.
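Putting the fig. 20 layout into code: kinds hold sensor items, and each sensor item holds event entries whose addresses point into the data area. All field names in this sketch are assumed.

```python
event_management = {
    "door_open_close": {                        # event kind
        "sensor1": [                            # sensor item with its ID
            {
                "occurred": "2018-04-11T10:00:00",
                "record_start": "2018-04-11T10:00:00",
                "record_end": "2018-04-11T10:05:00",
                "start_address": 0x1000,        # into data area 1011
                "end_address": 0x5000,
                "thumbnail_address": 0x0F00,
            },
        ],
    },
    "movement": {"sensor2": []},
}

def read_monitoring(data_area: bytes, entry: dict) -> bytes:
    """How the playback controller might dereference an entry's addresses."""
    return data_area[entry["start_address"]:entry["end_address"]]
```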

In this embodiment, the television receiving apparatus GUI-2, for example, can easily display the monitoring data at high quality. Further, since the monitoring data is not transmitted to the outside via the internet 300, the embodiment is particularly effective for managing personal monitoring data. Data transmitted to the server 1000 via the internet 300, and data transmitted from the server 1000 to the HGW, are subjected to concealment processing.

Although embodiments of the present invention have been described, they are presented as examples and do not limit the scope of the invention. These novel embodiments can be implemented in various other forms, and various omissions, substitutions, and changes can be made without departing from the spirit of the invention. Such embodiments and their modifications fall within the scope and gist of the invention, and within the invention described in the claims and its equivalents.

Description of the reference numerals

120 … lighting equipment, 300 … internet, 600 … HGW, 601 … memory, 603 … equipment manager, 605 … network I/F unit, 609 … sensor and appliance control table, 618 … mobile device, 618a … driving unit, 619 … mapping unit, 622 … illuminance sensor, 623 … human motion sensor, 1000 … server, 800a, 800b … room, 2000 … various sensor groups and controlled device group.
