Control device, control method, and program

Document No.: 1160201 | Publication date: 2020-09-15

Abstract: This technology, Control device, control method, and program, was created by 入江淳, 半田正树, 赤间健人, 何信莹, 大桥武史, 河野碧 on 2018-11-14. Main content: A control device has a control unit for outputting information for controlling the output of fragrance based on the result of recognizing a hand.

1. A control device comprising

a control unit configured to output information for controlling output of a fragrance based on a result of recognizing a hand.

2. The control device according to claim 1, wherein,

wherein the control unit identifies a position of the hand based on a result of identifying the hand, and outputs information for controlling the fragrance to be output from the position.

3. The control device according to claim 2, wherein,

wherein the information includes information for causing, among a plurality of devices that emit fragrance, the device closest to the recognized position of the hand to output the fragrance.

4. The control device according to claim 1, wherein,

wherein the control unit outputs information for controlling the output of the fragrance based on a result of recognizing the motion of the hand.

5. The control device according to claim 1, wherein,

wherein the information controlling the output of the fragrance includes: at least one of information to cause emission of a fragrance, information to cause a change in intensity of a fragrance, information to cause a change in direction in which a fragrance appears, information to stop emission of a fragrance, and information indicating a period of time during which emission of a fragrance continues.

6. The control device according to claim 1, wherein,

wherein the control unit outputs information for performing control for causing a user to perceive a change in a surrounding environment and information for controlling output of the fragrance.

7. The control device according to claim 6, wherein,

wherein the control for making the user perceive the change in the surrounding environment includes at least one of control for reproducing sound, control for emitting wind, and control for changing surrounding brightness.

8. The control device according to claim 1, wherein,

wherein the control unit outputs information controlling output of the fragrance based on a result of recognizing the hand and a result of the recognition with respect to a predetermined input.

9. The control device according to claim 8, wherein,

wherein the predetermined input comprises an input detectable by at least one of a sound sensor, a temperature sensor, and a physiological sensor.

10. The control device according to claim 1, wherein,

wherein the control unit identifies a hand based on an image corresponding to a viewpoint of a user.

11. The control device according to claim 10, wherein,

wherein the image comprises a real image or a virtual image.

12. The control device according to claim 1, further comprising

a fragrance emission unit that performs output of the fragrance based on the information.

13. The control device according to claim 1, wherein,

the control apparatus is configured as a wearable device.

14. A control method comprising

outputting, by a control unit, information for controlling output of a fragrance based on a result of recognizing a hand.

15. A program for causing a computer to execute a control method, the method comprising

outputting, by a control unit, information for controlling output of a fragrance based on a result of recognizing a hand.

Technical Field

The present disclosure relates to a control device, a control method, and a program.

Background

There is known an apparatus that converts information based on a user's perception (e.g., smell) into information based on another animal's perception and provides the converted information to the user (for example, refer to patent document 1 below).

Reference list

Patent document

Patent document 1: japanese patent application laid-open No. 2014-165706

Disclosure of Invention

Problems to be solved by the invention

In such fields, it is desirable to provide information about perception to a user based on suitable information.

An object of the present disclosure is, for example, to provide a control device, a control method, and a program that output information that controls output of fragrance (smell) based on appropriate information.

Solution to the problem

For example, the present disclosure is a control device including a control unit that outputs information that controls output of a fragrance based on a result of recognizing a hand.

For example, the present disclosure is a control method including outputting, by a control unit, information that controls output of a fragrance based on a result of recognizing a hand.

For example, the present disclosure is a program that causes a computer to execute a control method including outputting, by a control unit, information that controls output of a fragrance based on a result of recognizing a hand.

Effects of the invention

According to at least one embodiment of the present disclosure, information for controlling the output of fragrance (smell) can be output based on appropriate information. The effects described herein are not necessarily limited, and may be any of the effects described in the present disclosure. Furthermore, the content of the present disclosure is not to be interpreted as being limited by the exemplified effects.

Drawings

Fig. 1 is a block diagram showing a configuration example of a control system according to an embodiment.

Fig. 2 is a diagram showing an exemplary appearance of a control device according to an embodiment.

Fig. 3 is a block diagram showing a configuration example of a control device according to an embodiment.

Fig. 4 is a block diagram showing a configuration example of a fragrance output device according to an embodiment.

Fig. 5A to 5D are diagrams showing specific examples of a fragrance output device according to an embodiment.

Fig. 6 is a diagram for schematically describing processing for recognizing a hand motion or the like.

Fig. 7 is a diagram for describing an example of processing for estimating the position of the hand.

Fig. 8 is a diagram for describing an example of processing for estimating the position of the hand.

Fig. 9 is a diagram for describing an example of processing for estimating the position of the hand.

Fig. 10A to 10D are diagrams for describing an outline of processing according to the first embodiment.

Fig. 11 is a flowchart for describing the flow of processing according to the first embodiment.

Fig. 12 is a flowchart for describing a flow of processing (modification) according to the first embodiment.

Fig. 13A to 13D are diagrams for describing an outline of processing according to the second embodiment.

Fig. 14 is a flowchart for describing the flow of processing according to the second embodiment.

Fig. 15 is a flowchart for describing a flow of processing (modification) according to the second embodiment.

Fig. 16 is a flowchart for describing a flow of processing (modification) according to the second embodiment.

Fig. 17 is a flowchart for describing a flow of processing (modification) according to the second embodiment.

Fig. 18A to 18D are diagrams for describing an outline of processing according to the third embodiment.

Fig. 19 is a flowchart for describing the flow of processing according to the third embodiment.

Detailed Description

Embodiments and the like of the present disclosure will be described below with reference to the accompanying drawings. Note that description will be made in the following order.

< techniques common to embodiments >

< first embodiment >

< second embodiment >

< third embodiment >

< modification >

< techniques common to embodiments >

[ control System ]

First, a technique common to each embodiment of the present disclosure will be described. Fig. 1 is a diagram showing a configuration example of a control system (control system 1) according to an embodiment of the present disclosure. The control system 1 has a configuration including, for example, a control device 2 and a fragrance output device 3.

The operation of the control system 1 will be schematically described. The control device 2 outputs information for controlling the output of fragrance (smell) based on the result of recognizing the hand. The control device 2 outputs the control signal S1 as information for controlling the output of the fragrance based on, for example, the recognition result of the shape of the hand of the user or the recognition result of the shape of the hand that changes continuously, in other words, the position of the hand or the motion of the hand. The control signal S1 output from the control device 2 is supplied to the fragrance output device 3. The fragrance output device 3 emits fragrance or the like in accordance with the control signal S1. Note that the fragrance output device 3 is capable of emitting various types of fragrances and the like according to the object.

It should be noted that controlling the output of the fragrance specifically includes at least one of causing the fragrance to be emitted (released), causing the intensity of the fragrance to be changed, causing the direction in which the fragrance appears to be changed, stopping the emission of the fragrance, and controlling the period of time during which the emission of the fragrance continues. Therefore, there may be a case where the fragrance output device 3 emits fragrance, stops emitting fragrance, changes the intensity of fragrance, or the like according to the control signal S1.
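
The control signal S1 can be thought of as a small structured message that selects one of the control contents listed above. The following is a minimal sketch in Python of such a message; the class name, field names, and command values are illustrative assumptions, not the actual signal format of the control device 2.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical representation of the control signal S1 described above.
# Field names and command values are assumptions made for illustration only.
@dataclass
class FragranceControlSignal:
    device_id: str                         # identifier of the target fragrance output device 3
    command: str                           # "emit", "stop", "set_intensity", or "set_direction"
    fragrance_type: Optional[str] = None   # e.g. "coffee", "flower"
    intensity: Optional[float] = None      # 0.0 (off) .. 1.0 (strongest)
    direction_deg: Optional[float] = None  # direction in which the fragrance should appear
    duration_s: Optional[float] = None     # how long the emission should continue

# Example: ask a device "fo-3" to emit a coffee fragrance at half intensity for 10 seconds.
s1 = FragranceControlSignal(device_id="fo-3", command="emit",
                            fragrance_type="coffee", intensity=0.5, duration_s=10.0)
```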

In the control system 1, one or more fragrance output devices 3 may be present. Further, although the fragrance output device 3 is described as a device physically separated from the control device 2 in the embodiment, the fragrance output device 3 may be a device physically integrated with the control device 2; in other words, the control device 2 may include the fragrance output device 3. Further, the fragrance output device 3 may be a device detachable from the control device 2.

The control system 1 may include devices in addition to the control device 2 and the fragrance output device 3. For example, the control system 1 may have a device that causes the user to perceive a change in the surrounding environment (hereinafter referred to as an environmental change emitting device as appropriate). Examples of such an environmental change emitting device include a device that emits sound (e.g., a portable speaker device), a device that emits wind such as a fan, a lighting device, and the like. There may be one or more environmental change emitting devices. Furthermore, different types of environmental change emitting devices may be used. The operation of the environmental change emitting device is controlled based on a control signal S2, which is information for causing the user to perceive a change in the surrounding environment. For example, the control signal S2 is supplied from the control device 2 to the environmental change emitting device. Such an environmental change emitting device may be incorporated into the control device 2 or the fragrance output device 3.

[ control device ]

(exemplary appearance)

Next, the control device 2 will be described. Fig. 2 is a diagram showing an exemplary appearance of the control device 2. For example, the control apparatus 2 is configured as a wearable device (a device that can be worn on a human body and has a wearable size). More specifically, as shown in fig. 2, the control device 2 according to the embodiment is configured as a glasses-type terminal device.

The control device 2 has a frame 5 for holding a right image display unit 6a and a left image display unit 6b, similar to ordinary eyeglasses. The right image display unit 6a and the left image display unit 6b are arranged to be placed in front of the right eye and the left eye of the user, respectively.

The frame 5 is provided with various sensors, an imaging device, a battery, and the like (illustration of these is omitted). For example, an image (actual image) obtained via the imaging device is displayed on the right image display unit 6a and the left image display unit 6b. By providing the imaging device at an appropriate position of the frame 5, it is possible to display an image of the user's viewpoint on the right image display unit 6a and the left image display unit 6b.

Instead of the actual image, an image generated by the control device 2 or an image supplied from an external device may be displayed on the right image display unit 6a and the left image display unit 6b. In this case, a so-called Virtual Reality (VR) space may be provided to the user. Further, an actual image to which an image of a predetermined object is added may be displayed on the right image display unit 6a and the left image display unit 6b. In this case, a so-called Augmented Reality (AR) space may be provided to the user.

(configuration example)

Fig. 3 is a block diagram showing a configuration example of the control device 2. For example, the control device 2 has a control unit 21, an image sensor unit 22, a sensor unit 23, a communication unit 24, a speaker 25, and a display 26, and each of these units is connected via a predetermined interface 27.

The control unit 21 includes a Central Processing Unit (CPU) and the like, and controls each unit of the control apparatus 2. The control unit 21 has a Read Only Memory (ROM) 21a in which programs are stored, and a Random Access Memory (RAM) 21b serving as a work memory when the programs are executed.

The control unit 21 recognizes the position of the hand of the user or the motion of the hand by performing predetermined image processing on the image data obtained by the image sensor unit 22. The control unit 21 outputs a control signal S1 as information for controlling the output of the fragrance in accordance with the position of the hand, the movement of the hand, and the like. The control unit 21 has a processing circuit (not shown) for performing image processing.

The image sensor unit 22 is configured by a Charge Coupled Device (CCD), a Complementary Metal Oxide Semiconductor (CMOS), or the like. The image sensor unit 22 photoelectrically converts object light entering via a lens unit (not shown) into a charge amount (image data), and outputs the image data. The image data is supplied to the control unit 21 via the interface 27.

The sensor unit 23 is a general term for sensors other than the image sensor unit 22. Examples of the sensor unit 23 include motion sensors, specifically an acceleration sensor, a gyro sensor, an electronic compass, an atmospheric pressure sensor, and the like. The sensor unit 23 may have a physiological sensor that measures physiological information (e.g., blood pressure, pulse, body temperature, etc.) of the user of the control apparatus 2. Further, the sensor unit 23 may have a pressure sensor for detecting whether the user is wearing the control device 2, a microphone for detecting sound, or the like. The configuration of the sensor unit 23 may be changed as appropriate according to the content of the control performed by the control device 2.

For example, the communication unit 24 is used to perform wireless communication with the fragrance output device 3. The communication unit 24 has a modem circuit according to the communication method, an antenna, and the like. Examples of the wireless communication include a wireless Local Area Network (LAN), Bluetooth (registered trademark), Wi-Fi (registered trademark), Wireless USB (WUSB), and the like.

The speaker 25 outputs sound. The speaker 25 emits a predetermined sound according to the control of the control unit 21. The sound may be any sound such as a human voice or a natural sound. The sound source may be data stored in the control unit 21 or may be data acquired via the communication unit 24.

The configuration of the display 26 corresponds to the configuration of the right image display unit 6a and the left image display unit 6b described above. The display 26 includes a Liquid Crystal Display (LCD), an Organic Light Emitting Diode (OLED) display, and the like.

[ fragrance output device ]

(configuration example)

Fig. 4 is a block diagram showing a main configuration example of the fragrance output apparatus 3. For example, the fragrance output device 3 has a control unit 31, a communication unit 32, and a fragrance output mechanism 33.

The control unit 31 includes a CPU and the like, and controls each unit of the fragrance output device 3. The control unit 31 has a ROM that stores programs, and a RAM that functions as a work memory when the programs are executed (illustration of these is omitted). The control unit 31 performs control regarding emission of fragrance and the like based on the control signal S1 transmitted from the control device 2.

For example, the communication unit 32 is used to perform wireless communication with the control device 2. The communication unit 32 has a modem circuit according to a communication method, an antenna, and the like.

The fragrance output mechanism 33 is a mechanism that actually emits fragrance or the like. The fragrance output mechanism 33 emits fragrance by, for example, volatilizing a liquid fragrance source. It should be noted that known mechanisms for emitting fragrance may be applied to the fragrance output mechanism 33.

(concrete examples)

Specific examples of the fragrance output device 3 will be described with reference to fig. 5A to 5D. For example, the fragrance output device 3 illustrated in fig. 5A has a housing 310, and the housing 310 is substantially rectangular and box-shaped in top view. A fragrance source is contained in the housing 310. A plurality of holes are provided on a side surface 311 in the longitudinal direction of the housing 310. In the example shown in fig. 5A, three holes (holes 312a, 312b, and 312c) are provided. It should be noted that these holes are referred to collectively as the holes 312 when the individual holes are not distinguished. For example, each hole 312 is configured to be openable and closable.

The fragrance is released from the inside of the housing 310 through the holes 312. For example, the fragrance is released from the inside of the housing 310 through an opened hole 312. In addition, the release of the fragrance is stopped by closing the hole 312.

The direction from which the fragrance appears can be controlled by selectively opening one of the plurality of holes 312. For example, in a case where the user is at a position facing substantially the center of the side surface 311, it is possible to perform control for making the user feel that the fragrance comes from a particular direction by opening only the hole 312c.

The degree of opening and closing of the holes 312 may be varied. By opening the holes 312 more widely, the intensity of the fragrance can be increased. Further, by narrowing the holes 312, the intensity of the fragrance can be reduced. Further, a ventilation mechanism such as a fan may be provided in the housing 310. The intensity of the fragrance can also be varied according to the intensity of the ventilation.

The example shown in fig. 5B is an example of the fragrance output device 3 that can be attached to a portable device (e.g., the control device 2 according to the embodiment or a smartphone). For example, the fragrance output device 3 shown in fig. 5B has a small cylindrical housing 313. The housing 313 is attached to the housing of a smartphone or the like by an attachment mechanism (e.g., a clip-type attachment mechanism) not shown. Then, the fragrance is released from a predetermined hole of the housing 313. Of course, in this example as well, the intensity of the fragrance and the like may be varied. Note that in fig. 5B, the fragrance is schematically shown by dot hatching.

The example shown in fig. 5C is an example of the fragrance output device 3 having a tubular housing. In fig. 5C, four fragrance output devices 3 are shown, and each fragrance output device 3 has a tubular housing 315a, 315b, 315c, or 315d. For example, fragrance is emitted from the upper portion of the tubular housing. These fragrance output devices 3 are arranged at appropriate positions, and the fragrance output device 3 that emits fragrance is selected as appropriate. Therefore, it becomes possible to control the emission of fragrance from any place. Of course, the number of fragrance output devices 3 is not limited to four, and any number may be provided.

The example shown in fig. 5D is an example of a movable fragrance output device 3. The fragrance output device 3 shown in fig. 5D has a small cylindrical housing 316. The fragrance is released from the housing 316. The housing 316 is supported by, for example, a linear motor (not shown) or the like. As schematically shown in fig. 5D, the housing 316 can be moved as desired by driving the linear motor. By positioning the housing 316 at any location and releasing the fragrance from the housing 316 at that location, the fragrance can be emitted from an appropriate location. That is, in the embodiment, in addition to the position from which the fragrance is emitted, the intensity of the fragrance, the timing of stopping the emission of the fragrance, and the like may be set arbitrarily.

[ recognition concerning the shape of the hand ]

Next, a method of recognition regarding the hand performed by the control unit 21 of the control device 2 will be described. Fig. 6 summarizes the recognition method. For example, an image of a hand grasping a coffee cup is input (image IM1). Feature values are calculated based on the image IM1. Examples of the feature values include feature values based on a Convolutional Neural Network (CNN), Histogram of Oriented Gradients (HOG) feature values, and feature values based on the Scale-Invariant Feature Transform (SIFT).

Then, the position of the hand and the like are determined by analyzing the obtained feature values with predetermined algorithms for detection, recognition, and segmentation. As such algorithms, the above-described CNN, boosting, Support Vector Machine (SVM), graph cut, and the like can be applied.
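
As one hedged illustration of the kind of feature extraction and classification named above, the following sketch computes HOG feature values for an image window with scikit-image and classifies the window as hand or background with a scikit-learn linear SVM. It is only a minimal example of such a detector, not the implementation actually used by the control unit 21; the availability of labeled training windows is an assumption.

```python
import numpy as np
from skimage.feature import hog
from skimage.transform import resize
from sklearn.svm import LinearSVC

WINDOW = (64, 64)  # fixed window size assumed for feature extraction

def hog_features(gray_window):
    """HOG feature values for one grayscale image window."""
    win = resize(gray_window, WINDOW, anti_aliasing=True)
    return hog(win, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), feature_vector=True)

def train_hand_classifier(train_windows, train_labels):
    """train_windows: grayscale crops; train_labels: 1 = hand, 0 = background."""
    X = np.array([hog_features(w) for w in train_windows])
    clf = LinearSVC(C=1.0)
    clf.fit(X, np.asarray(train_labels))
    return clf

def is_hand(clf, gray_window):
    """Classify a single candidate window as hand (True) or background (False)."""
    return bool(clf.predict(hog_features(gray_window)[None, :])[0] == 1)
```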

Then, according to the analysis result, the position of the hand may be detected as indicated by reference symbol AA in fig. 6. It should be noted that "hand" in the embodiments means the portion beyond the wrist, but may also include an upper arm region. The "hand" may be only a part (e.g., only the fingers) rather than the entire portion beyond the wrist. Further, the position or posture of the hand may be detected as indicated by reference symbol BB in fig. 6. Note that, in the drawing indicated by reference symbol BB, the circles provided on the hand represent the feature points CP.

Further, by performing the above-described processing on images of a plurality of frames, the control device 2 can also recognize the motion of the hand, as indicated by reference symbols CC (CC1 to CC4) in fig. 6. The motion of the hand indicated by reference symbol CC1 is an example of moving the hand in a horizontal direction (Horizontal). The motion of the hand indicated by reference symbol CC2 is an example of moving the hand in a vertical direction (Vertical). The motion of the hand indicated by reference symbol CC3 is an example of moving the hand in a clockwise direction (Clockwise). The motion of the hand indicated by reference symbol CC4 is an example of moving the hand in a counterclockwise direction (Counterclockwise). Of course, the motion of the hand to be recognized is not limited to these illustrated examples, and various motions of the hand may be recognized. Further, the above processing is not limited to the illustrated algorithms, and known methods may be applied.
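
The recognition of motions such as CC1 to CC4 can be illustrated by classifying the trajectory of the hand position across frames. The sketch below assumes that a 2D hand position per frame is already available from the detection above; the thresholds and the circularity heuristic are illustrative assumptions, not the method of the embodiment.

```python
import numpy as np

def classify_hand_motion(positions, min_move=0.05):
    """Classify a per-frame sequence of 2D hand positions into a coarse motion label.

    positions: array-like of shape (frames, 2) holding (x, y) per frame, with y pointing up
               (for image coordinates, where y points down, the circular labels swap).
    Returns one of "horizontal", "vertical", "clockwise", "counterclockwise", or "none".
    """
    p = np.asarray(positions, dtype=float)
    path_len = np.sum(np.linalg.norm(np.diff(p, axis=0), axis=1))
    if path_len < min_move:
        return "none"
    # Signed (shoelace) area enclosed by the trajectory: positive means counterclockwise.
    x, y = p[:, 0], p[:, 1]
    area = 0.5 * np.sum(x * np.roll(y, -1) - np.roll(x, -1) * y)
    extent = p.max(axis=0) - p.min(axis=0)
    # A trajectory that encloses a large fraction of its bounding box is treated as circular.
    if extent.min() > min_move and abs(area) > 0.3 * extent[0] * extent[1]:
        return "counterclockwise" if area > 0 else "clockwise"
    disp = p[-1] - p[0]
    return "horizontal" if abs(disp[0]) >= abs(disp[1]) else "vertical"
```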

(example of hand position estimating method)

Next, an example of a method for estimating the position of the hand will be described with reference to fig. 7 to 9. The horizontal axis of the graph shown in fig. 7 indicates examples of input images. In the example shown in fig. 7, three input images IM5, IM6, and IM7 are shown. The vertical axis of the graph shown in fig. 7 is the coordinates (positions) of the feature points of the hand corresponding to each input image. The coordinates are represented by (x_t, y_t, z_t) (where t is the frame number). Further, for example, x_t is represented by x_t = (x_t0, x_t1, ..., x_tn) (where n is the number of feature points). That is, x_t indicates the set of x-axis coordinates of the feature points. The same applies to y_t and z_t of the feature points.

By learning the coordinates of the input image, as shown in fig. 8, the correspondence between the coordinates of the input image and the position of the hand is obtained. Although a linear correspondence is shown in the example shown in fig. 8, another correspondence, such as a non-linear correspondence, may be used.

After the correspondence relationship is obtained, for example, in a case where an image IM8 similar to the image IM7 is input as shown in fig. 9, the coordinate position of the hand of the image IM8 can be estimated based on the correspondence relationship. The position of the hand can be estimated by using the above-described method.
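
A minimal sketch of the correspondence learning described above is given below, assuming the feature point coordinates of each training image and the corresponding hand positions are already available; a simple linear regression stands in for whatever model is actually used.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# X_train: (num_images, 3 * n) flattened feature point coordinates (x_t, y_t, z_t) per image
# y_train: (num_images, 3) hand position corresponding to each input image
def learn_hand_position_model(X_train, y_train):
    """Learn the correspondence between feature point coordinates and the hand position."""
    model = LinearRegression()
    model.fit(X_train, y_train)
    return model

def estimate_hand_position(model, feature_points):
    """Estimate the hand position for a new image (such as IM8) from its feature points."""
    return model.predict(np.asarray(feature_points).reshape(1, -1))[0]
```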

< first embodiment >

The first embodiment will be described. The first embodiment is an example in which the control unit 21 of the control device 2 recognizes the position of the hand based on the result of recognizing the hand and outputs information for controlling the fragrance to be output from that position. It should be noted that the "position" in the present specification may be a position that exactly matches the identified position, or may be a position near it. A specific example of a nearby position is, among a plurality of positions at which the output of the fragrance can be controlled, the position closest to a target position (for example, the position of the hand or the position of an object).

(outline of processing)

Fig. 10A to 10D are diagrams conceptually showing contents corresponding to the first embodiment. For example, the image IM11 of the viewpoint of the user is input via the image sensor unit 22 of the control apparatus 2. The image IM11 is an image of the user of the control device 2 grasping the coffee cup 41 with the hand HA. It should be noted that the coffee cup 41 may be a real object or may be a virtual object superimposed on a real space. The image IM11 is supplied from the image sensor unit 22 to the control unit 21.

Based on the image IM11, the control unit 21 separates and detects the hand HA by using the feature points CP in the image IM11 as shown in fig. 10B, pattern matching, or another method. Then, it is determined whether the detected hand HA has an object by using the positions of the feature points CP, pattern matching, or another method. It should be noted that the hand HA may be detected, and whether the hand HA has the coffee cup 41 may be determined, based on sensing data of another sensor device (for example, a pressure sensor provided on the coffee cup 41).

Then, as schematically shown in fig. 10C, in a case where it is determined that the hand HA has an object, the three-dimensional position of the detected hand HA is calculated. The three-dimensional position may be a three-dimensional coordinate (x, y, z) in a world coordinate system, or may be a coordinate (u, v, d) consisting of the two-dimensional coordinate of a point projected on the image plane and the depth information d of that point. For example, the three-dimensional position of the hand HA is obtained by a hand posture detection technique. Alternatively, the three-dimensional position can be obtained by using the center of gravity of the region of the hand HA obtained by a hand segmentation technique and the depth information of that portion.
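
The second option mentioned above (center of gravity of the hand region plus depth information) can be sketched as follows. The pinhole camera intrinsics and the availability of a binary hand mask aligned with a depth image are assumptions for illustration.

```python
import numpy as np

def hand_position_from_mask(mask, depth, fx, fy, cx, cy):
    """Back-project the center of gravity of the hand region into camera coordinates.

    mask:  (H, W) boolean hand segmentation
    depth: (H, W) depth image in meters, aligned with the mask
    fx, fy, cx, cy: assumed pinhole camera intrinsics
    Returns the (x, y, z) coordinate of the hand in the camera coordinate system.
    """
    vs, us = np.nonzero(mask)            # pixel rows (v) and columns (u) of the hand region
    u, v = us.mean(), vs.mean()          # center of gravity of the region
    d = float(np.median(depth[vs, us]))  # robust depth of the hand region
    x = (u - cx) * d / fx
    y = (v - cy) * d / fy
    return np.array([x, y, d])
```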

Then, as schematically shown in fig. 10D, control is performed so that the fragrance is emitted from a place corresponding to the obtained three-dimensional position of the hand HA. In this embodiment, the fragrance is that of coffee. An example of controlling the emission of the fragrance will be described. For example, it is assumed that a plurality of fragrance output devices 3 are arranged in the space in which the hand HA of the user exists. The control unit 21 generates a control signal S1 addressed to the fragrance output device 3 arranged at the three-dimensional position of the hand HA (or at the position closest to it) among the plurality of fragrance output devices 3. That is, the control signal S1 includes an identifier that identifies the fragrance output device 3 and data that gives an instruction for fragrance emission. The control signal S1 is transmitted to each of the fragrance output devices 3 simultaneously via the communication unit 24.
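
The selection of the nearest fragrance output device 3 and the construction of the control signal S1 can be sketched as below; the device identifiers, their positions, and the signal format are assumptions made for illustration.

```python
import numpy as np

# Assumed layout: known 3D positions of the available fragrance output devices 3.
DEVICE_POSITIONS = {
    "fo-1": np.array([0.0, 0.0, 1.0]),
    "fo-2": np.array([0.5, 0.0, 1.5]),
    "fo-3": np.array([1.0, 0.0, 2.0]),
}

def nearest_device(hand_position):
    """Return the identifier of the device closest to the 3D position of the hand HA."""
    return min(DEVICE_POSITIONS,
               key=lambda dev: np.linalg.norm(DEVICE_POSITIONS[dev] - hand_position))

def build_emit_signal(hand_position, fragrance_type):
    """Build a control signal S1 addressed to the nearest device (illustrative format)."""
    return {"device_id": nearest_device(hand_position),
            "command": "emit",
            "fragrance_type": fragrance_type}

# The signal would then be broadcast to all devices; each device checks device_id itself.
```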

Each fragrance output device 3 receives the control signal S1 via the communication unit 32. Then, the control unit 31 of each fragrance output device 3 interprets the control signal S1 and determines whether the control signal S1 is addressed to its own device. In a case where the control signal S1 is addressed to its own device, the control unit 31 drives the fragrance output mechanism 33 and causes the fragrance output mechanism 33 to emit the fragrance. With this arrangement, the fragrance is emitted from the three-dimensional position of the hand HA, and the user can perceive that the fragrance is emitted from the visually recognized predetermined position.

It should be noted that data for controlling the intensity of the fragrance depending on the position in the depth direction of the object emitting the fragrance (in this embodiment, the coffee cup 41) may be included in the control signal S1. For example, in a case where the distance from the user in the depth direction is long, the fragrance is made weak, and in a case where the distance from the user in the depth direction is short, the fragrance is made strong.
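
One way to realize the depth-dependent intensity described here is a simple clamped mapping from the distance in the depth direction to an intensity value; the distance limits below are arbitrary assumptions.

```python
def intensity_from_depth(depth_m, near=0.3, far=3.0):
    """Map the distance from the user in the depth direction to an intensity in [0, 1].

    Close objects (depth <= near) give a strong fragrance, distant objects (depth >= far)
    give a faint one; values in between fall off linearly. The limits are arbitrary.
    """
    if depth_m <= near:
        return 1.0
    if depth_m >= far:
        return 0.1
    return 1.0 - 0.9 * (depth_m - near) / (far - near)
```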

Furthermore, a plurality of fragrance output devices 3 is not required. For example, in the case where the fragrance output device 3 is movable, the fragrance output device 3 may be moved to a position corresponding to the three-dimensional position of the hand HA, and fragrance may be caused to be emitted from the fragrance output device 3 after the movement.

(flow of processing)

Fig. 11 is a flowchart showing the flow of processing according to the first embodiment. For example, the processing described below is executed by the control unit 21. In step ST11, it is determined whether the user has an object (which may be a virtual object or a real object). The processing in step ST11 is executed as follows, for example. The output of a pressure sensor provided on the object changes according to the user's gripping operation, and the detection result is transmitted to the control device 2 by communication. The control unit 21 of the control device 2 can determine whether the user has an object based on the information provided by the communication.

In a case where it is determined that the user has no object, the process returns to step ST11, and the determination processing in step ST11 is repeated. In a case where it is determined that the user has an object, the process proceeds to step ST12.

In step ST12, image data is supplied from the image sensor unit 22 to the control unit 21 via the interface 27. Then, the process proceeds to step ST13.

In step ST13, the control unit 21 estimates the three-dimensional position of the hand HA holding the object from the image data. As the estimation method, a known method other than the above-described method may be applied. Then, the process proceeds to step ST14.

In step ST14, according to the three-dimensional position of the hand HA obtained in step ST13, the control unit 21 performs control so that the fragrance corresponding to the object appears to be released from the position. That is, the control unit 21 generates a control signal S1 for performing such control, and outputs the generated control signal S1. The control signal S1 is supplied to the fragrance output device 3, and the predetermined fragrance output device 3 performs an operation of emitting fragrance.

It should be noted that the processing flow shown in fig. 11 may be modified to the flowchart shown in fig. 12. That is, the processing in step ST12 may be performed before step ST11. In step ST12, image data is supplied from the image sensor unit 22 to the control unit 21 via the interface 27. The control unit 21 determines whether the hand HA has an object based on the image data. For example, the control unit 21 determines the shape of the hand HA gripping the object, the distance between the hand HA and the object, or the like based on the image data, and determines whether the hand HA has the object. Since the other processing is the same as the processing described above, repeated description will be omitted.
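
Putting steps ST11 to ST14 together (in the modified order of fig. 12, where the image is obtained first), the overall flow can be sketched as the following loop. All helper callables are placeholders for the recognition and output control described above, not actual APIs of the control device 2.

```python
import time

def control_loop(get_image, detect_held_object, estimate_hand_position_3d,
                 fragrance_for_object, send_control_signal, period_s=0.1):
    """Minimal sketch of the flow in fig. 11/12; every callable is an assumed helper."""
    while True:
        image = get_image()                          # ST12: image data from the image sensor unit 22
        obj = detect_held_object(image)              # ST11: does the hand HA have an object?
        if obj is None:
            time.sleep(period_s)
            continue
        hand_pos = estimate_hand_position_3d(image)  # ST13: three-dimensional position of the hand HA
        signal = {                                   # ST14: control fragrance output from that position
            "position": list(hand_pos),
            "command": "emit",
            "fragrance_type": fragrance_for_object(obj),
        }
        send_control_signal(signal)
        time.sleep(period_s)
```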

As described above, in the first embodiment, processing may be performed to identify the position of the hand HA based on the result of identifying the hand HA and control the predetermined fragrance to be output from the position.

< second embodiment >

Next, a second embodiment will be described. In the second embodiment, the control unit 21 outputs information for controlling the output of the fragrance based on the result of recognizing the motion (gesture) of the hand. It should be noted that the items described in the first embodiment may be applied to the second embodiment unless otherwise specified.

[ outline of processing ]

Fig. 13A to 13D are diagrams conceptually showing contents corresponding to the second embodiment. Fig. 13A to 13C show items similar to those described in the first embodiment, and therefore will be described only schematically. That is, the three-dimensional position of the hand HA is detected based on the input image IM12. In addition, in this embodiment, the motion of the hand HA is further recognized. For example, a change in the feature points, a change in the shape of the hand HA, or the like is detected, and the motion of the hand associated with the change is recognized as a predetermined motion of the hand HA. The motion of the hand HA may be recognized by another known method (such as a method based on a technique called gesture recognition). Then, the output of the fragrance is controlled based on the three-dimensional position of the hand HA. At that time, control according to the recognized motion of the hand is performed. It should be noted that, when the types of fragrance are distinguished, the type of fragrance may be selected according to the motion of the hand HA and the recognized object.

[ flow of processing ]

Fig. 14 is a flowchart showing the flow of processing according to the second embodiment. For example, the processing described below is executed by the control unit 21. Since the processing in steps ST21 to ST23 is similar to the processing in steps ST11 to ST13 in the first embodiment described above, repeated description will be omitted. After step ST23, the process proceeds to step ST24.

In step ST24, when there is a motion of the hand HA, the control unit 21 recognizes the motion. Then, the process proceeds to step ST25.

In step ST25, in accordance with the three-dimensional position of the hand HA, the control unit 21 performs control so that the fragrance corresponding to the object and the motion of the hand appears to be emitted from that position.

It should be noted that the processing flow shown in fig. 14 may be modified to the flowchart shown in fig. 15. That is, the processing in step ST22 may be performed before step ST21. In step ST22, image data is supplied from the image sensor unit 22 to the control unit 21 via the interface 27. The control unit 21 determines whether the hand HA has an object based on the image data. Since the other processing is the same as the processing described above, repeated description will be omitted.

Further, as in the flowchart shown in fig. 16, an object does not necessarily have to exist, and the determination of whether or not the user has an object does not necessarily have to be performed. That is, the processing in step ST22 is performed first, and image data is input from the image sensor unit 22 to the control unit 21. Then, the process proceeds to step ST26.

In step ST26, the control unit 21 recognizes the three-dimensional position of the hand HA and the motion of the hand HA based on the image data input in step ST22. Then, the process proceeds to step ST27.

In step ST27, the control unit 21 performs control regarding the fragrance output according to the three-dimensional position of the hand HA and the motion of the hand HA identified in step ST26.

It should be noted that not only the control regarding the fragrance output but also the control that causes the user to perceive the change in the surrounding environment may be performed. For example, controlling sound emission, emission of wind, or the like may be performed.

[ modification of processing ]

After the processing shown in fig. 14 to 16 is performed, the processing shown in the flowchart in fig. 17 (the processing in step ST28 and step ST29) may be performed. Note that the letters "A", "B", and "C" in the circles shown in fig. 14 to 17 are used to indicate the continuity of processing, and do not indicate specific processing.

In step ST28, the control unit 21 determines whether the situation has changed. For example, a change in the situation means a case where a new motion of the hand HA is detected or a case where the three-dimensional position of the hand HA has changed. In a case where the situation has not changed, the process returns to step ST28, and the determination processing in step ST28 is repeated. In a case where the situation has changed, the process proceeds to step ST29.

In step ST29, control regarding the output of the fragrance is performed according to the change in the situation. For example, control is performed to change the intensity of the fragrance or the direction in which the fragrance appears. In addition to the control regarding the fragrance output, control content other than the fragrance may also be changed. For example, the amount of air output from a ventilation device, which is one of the environmental change emitting devices, may be changed according to the change in the situation.

[ specific examples ]

A specific example corresponding to the second embodiment will be described. Table 1 below is a table showing specific examples corresponding to the second embodiment.

[ Table 1]

Pattern | Recognized motion of the hand HA (and object) | Input from another sensor (example) | Example of fragrance output
P1 | Touching a flower | - | Fragrance of flowers emitted from the position of the hand HA
P2 | Crushing a fruit | - | Fragrance of the fruit emitted from the position of the hand HA
P3 | Swirling a wineglass | - | Fragrance of wine emitted from the position of the hand HA
P4 | Gesture as if using magic (no object) | Voice (chanting a spell) | Fragrance suited to the magic (e.g., a burning smell) emitted from the position of the hand HA
P5 | Gripping a game controller with increasing force | Pressure sensor | Intensity of the fragrance changed according to the grip force
P6 | Clapping the hands (no object) | Sound of the clapping | Intensity of the fragrance changed according to the degree of the clapping
P7 | Picking a flower and moving the hand HA close to the nose | - | Fragrance of flowers emitted from the position of the hand HA near the nose

Table 1 shows, for each pattern number, the recognized motion of the hand HA (an object may be included), an example of an input from a sensor device other than the image sensor unit 22, and an example of the fragrance output.

In the case of pattern P1, "flower" is recognized as the object, "touch" action is recognized as the action of hand HA, and from these results, the action of hand HA "touching flower" is recognized. It should be noted that, in this case, the flower may be a real object that has been recognized as a flower by image recognition or the like, or may be a virtual object that has been recognized as a flower. In this case, control is performed to emit the fragrance of flowers from a position corresponding to the three-dimensional position of the hand HA. Then, the fragrance output device 3 is operated to emit the fragrance of flowers.

In the case of pattern P2, a "fruit" is identified as the object, a "crushing" motion is identified as the motion of the hand HA, and from these results, the motion of the hand HA "crushing a fruit" is identified. The hand HA may be one hand or may be both hands. It should be noted that in this case, the fruit may be a real object that has been recognized as a fruit by image recognition or the like, or may be a virtual object that has been recognized as a fruit. In this case, control is performed to emit the fragrance of the fruit from a position corresponding to the three-dimensional position of the hand HA. Then, the fragrance output device 3 operates to emit the fragrance of the fruit.

In the case of pattern P3, a "wineglass" is recognized as the object, and a swirling motion is recognized as the motion of the hand HA. From these results, the motion of the hand HA "swirling a wineglass" is recognized. It should be noted that in this case, the wineglass may be a real object that has been recognized as a wineglass by image recognition or the like, or may be a virtual object that has been recognized as a wineglass. In this case, control is performed to emit the fragrance of wine from the three-dimensional position of the hand HA. Then, the fragrance output device 3 operates to emit the fragrance of the wine.

Pattern P4 is a pattern in which no object exists. As the motion of the hand HA, for example, a motion of the hand HA that looks as if magic is being used is recognized. Such motions of the hand HA include a pop-up gesture (a motion of opening the hand HA from a clenched state near the chest) and a motion of raising the index finger. In a case where such a motion of the hand HA is recognized, a fragrance suited to the magic is released from the three-dimensional position of the hand HA (in the case of a pop-up gesture, from the three-dimensional position at which the hand HA is opened). For example, in the case of magic that manipulates flames, the fragrance output device 3 operates to emit a burning smell from the three-dimensional position of the hand HA. Therefore, when the output of the fragrance is controlled in accordance with the motion of the hand HA, the presence of an object is not necessarily required.

It should be noted that when recognizing the motion of the hand HA, another input may be referred to in order to recognize the meaning of the motion of the hand HA. In the case of pattern P4, the other input is sound. For example, in a case where a voice chanting a magic spell and the motion of the hand HA are detected, the fragrance output device 3 may emit a burning smell. For example, the voice may be detected by the sensor unit 23. By referring to another input, more appropriate control can be performed, and inappropriate control due to incorrect recognition can be prevented.

Further, in order to give a more realistic sensation, control may be performed regarding not only the output of the fragrance but also the output of wind or sound from the environmental change emitting device (for example, the sound of a flame flaring up instantaneously).

Pattern P5 is an example of fragrance control associated with a change in the situation. In pattern P5, for example, a game controller is recognized as the object, and a motion of gripping the object is detected as the motion of the hand HA. Here, it is assumed that the grip force increases. The change in the grip force can be detected by the sensor unit 23, a specific example of which is a pressure sensor. In a case where the grip force increases, the intensity of the fragrance changes. For example, in a case where the grip force increases, control of increasing the intensity of the fragrance is performed.

In the case of pattern P6, a "clapping" motion is recognized as the motion of the hand HA. The magnitude of the clapping can be recognized from the degree of opening of the hands (the distance between both hands) at each clap. Control of changing the intensity of the fragrance may be performed according to the magnitude of the clapping. Note that in the case of pattern P6, the sound of the clapping may be recognized by sound recognition in addition to the motion of the hand HA. Not only the output of the fragrance but also control regarding the output of wind (for example, emission of wind from the portion where both hands touch) or the output of sound from the environmental change emitting device may be performed.

Pattern P7 is an example of fragrance control based on a continuous motion of the hand HA. In pattern P7, for example, a "flower" is identified as the object. It should be noted that, in this case, the flower may be a real object that has been recognized as a flower by image recognition or the like, or may be a virtual object that has been recognized as a flower. Then, as the motion of the hand HA, for example, a motion of picking the flower and then moving the hand HA close to the nose is recognized. In this case, control of emitting the fragrance of flowers from the three-dimensional position of the hand HA that has moved close to the nose is performed.

The detection result of a temperature sensor may be used as another input. For example, when "meat" is recognized as the object and "grilling meat" is recognized as the motion of the hand HA based on the temperature change, control of outputting the sound of grilling meat and control of emitting the fragrance of grilled meat can be performed.
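
A compact way to express patterns such as P1 to P7 is a lookup from the recognized (object, hand motion) pair to the fragrance control. The code below is only an illustrative reformulation of the table, with hypothetical keys and values.

```python
# (recognized object, recognized motion of the hand HA) -> fragrance control (illustrative keys).
PATTERN_TABLE = {
    ("flower", "touch"):           {"fragrance": "flower"},
    ("fruit", "crush"):            {"fragrance": "fruit"},
    ("wineglass", "swirl"):        {"fragrance": "wine"},
    (None, "magic_gesture"):       {"fragrance": "burning"},
    ("game_controller", "grip"):   {"fragrance": "current", "adjust": "intensity_by_grip_force"},
    (None, "clap"):                {"fragrance": "current", "adjust": "intensity_by_clap"},
    ("flower", "pick_and_sniff"):  {"fragrance": "flower"},
}

def fragrance_control_for(recognized_object, recognized_motion):
    """Look up the fragrance control for a recognized (object, motion) pair, if any."""
    return PATTERN_TABLE.get((recognized_object, recognized_motion))
```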

< third embodiment >

Next, a third embodiment will be described. The third embodiment is an example in which the control unit 21 outputs information for removing a fragrance based on the result of recognizing the motion (gesture) of the hand HA. It should be noted that the items described in the first and second embodiments may be applied to the third embodiment unless otherwise specified. It should also be noted that the information for removing the fragrance may be output based not on the motion of the hand HA but on the shape of the hand HA at a predetermined position (for example, a shape of stretching the hand near the nose as if blocking the nose).

[ outline of processing ]

Fig. 18A to 18D are diagrams conceptually showing contents corresponding to the third embodiment. Fig. 18A to 18C show items similar to those described in the first embodiment, and therefore will be described only schematically. That is, the three-dimensional position of the hand HA is detected based on the input image IM13. In this embodiment, for example, at least one of the shape of the hand HA and the motion of the hand HA (hereinafter referred to as the motion of the hand HA or the like as appropriate) is detected from the positions or changes of the feature points, a change in the shape of the hand HA, or the like. In a case where the detected motion of the hand HA or the like is a preset motion or the like, the control unit 21 determines that the user dislikes the fragrance, and outputs information for removing the fragrance as the control signal S1. It should be noted that, in this embodiment, removing a fragrance may mean weakening the intensity of the fragrance, may mean stopping the emission of the fragrance, or may mean controlling ventilation or the like so as to directly eliminate the fragrance (hereinafter referred to as deodorization as appropriate) together with stopping the emission of the fragrance.

[ flow of processing ]

Fig. 19 is a flowchart showing the flow of processing according to the third embodiment. For example, the processing described below is executed by the control unit 21. In step ST31, image data is input from the image sensor unit 22 to the control unit 21 via the interface 27. Then, the process proceeds to step ST32.

In step ST32, the control unit 21 recognizes the motion of the hand HA or the like. Then, the process proceeds to step ST33.

In step ST33, if the motion of the hand or the like is the preset motion or the like, the control unit 21 performs control to remove the fragrance. That is, the control unit 21 generates the control signal S1 for removing the fragrance, and supplies the generated control signal S1 to the fragrance output device 3. The fragrance output device 3 operates to remove the fragrance based on the control signal S1. It should be noted that, when performing the control to remove the fragrance, the control unit 21 generates the control signal S2 and supplies the generated control signal S2 to the environmental change emitting device. The environmental change emitting device performs processing for removing the fragrance by performing ventilation or the like.
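
The removal control of step ST33 can be sketched as follows: if the recognized motion of the hand HA matches one of the preset motions (optionally confirmed by another input such as a specific voice), a removal command is issued. The motion labels and the signal format are illustrative assumptions.

```python
# Motions of the hand HA treated as "the user dislikes the fragrance" (assumed labels).
DISLIKE_MOTIONS = {"hand_in_front_of_nose", "hand_over_object", "hide_object_with_hand"}
CONFIRMING_WORDS = {"stop", "smelly", "enough"}  # assumed vocabulary for the optional voice input

def removal_signal(motion, confirming_word=None):
    """Return a fragrance-removal control signal, or None if no removal is needed.

    confirming_word: optional extra cue used to reduce incorrect recognition,
    as described for patterns P8 to P10.
    """
    if motion not in DISLIKE_MOTIONS:
        return None
    if confirming_word is not None and confirming_word not in CONFIRMING_WORDS:
        return None
    # Removal can mean stopping emission, weakening the fragrance, or deodorizing.
    return {"command": "stop_or_deodorize", "intensity": 0.0}
```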

[ specific examples ]

A specific example corresponding to the third embodiment will be described. Table 2 below is a table showing specific examples corresponding to the third embodiment.

[ Table 2]

Pattern | Recognized motion of the hand HA or the like | Input from another sensor (example) | Example of control
P8 | Extending the hand HA in front of the eyes (around the front of the nose) | Specific voice or sound, physiological sensor, etc. | Stop the output of the fragrance, weaken the fragrance, or deodorize
P9 | Holding the hand HA over the object | Specific voice or sound, physiological sensor, etc. | Remove the fragrance (stop, weaken, or deodorize)
P10 | Hiding the object with the hand HA | Specific voice or sound, physiological sensor, etc. | Remove the fragrance (stop, weaken, or deodorize)

In pattern P8 in table 2, as the motion of the hand HA or the like, a motion of extending the hand HA in front of the eyes (more specifically, around the front of the nose) is recognized. This motion is recognized as a motion indicating that the user dislikes the smell. Accordingly, the control unit 21 generates the control signal S1 for removing the fragrance, and supplies the generated control signal S1 to the fragrance output device 3. The fragrance output device 3 operates according to the control signal S1 and, for example, stops the output of the fragrance, weakens the fragrance (reduces its intensity), or performs deodorization.

It should be noted that the above control may be performed in consideration of another input. For example, in a case where the motion of the hand HA and a specific voice (for example, "foul-smelling", "stop", "end", or the like) or a specific sound (for example, a sound of sniffling or coughing) are detected by the sensor unit 23, the control of removing the fragrance may be performed. Further, the control of removing the fragrance may be performed with reference to information such as rough breathing or a high body temperature based on a physiological sensor, information indicating a rise in ambient temperature due to combustion based on a temperature sensor, information based on an odor sensor, information such as an unstable line of sight based on a line-of-sight detection sensor, or other information. In this way, by referring to another input, more appropriate control can be performed, and inappropriate control due to incorrect recognition can be prevented.

Pattern P9 is an example in which a motion of holding the hand HA over the object is detected as the motion of the hand HA. Pattern P10 is an example in which a motion of hiding the object with the hand HA is detected as the motion of the hand HA. It should be noted that in patterns P9 and P10, the object may be a real object or may be a virtual object. Further, in patterns P9 and P10, the hand HA may be one hand or may be both hands. The control of removing the fragrance is performed similarly in the cases of patterns P9 and P10. In addition to the illustrated patterns, the control of removing the fragrance may also be performed, for example, in a case where a motion of waving the hand HA close to the nose or the like is detected. As described above, the control of removing the fragrance may be performed in accordance with the motion of the hand HA or the like.

It should be noted that also in the cases of patterns P9 and P10, the above-described control may be performed in consideration of another input (a specific voice or sound, a physiological sensor, a temperature sensor, an odor sensor, a line-of-sight detection sensor, or the like), in the same manner as described for pattern P8. By referring to another input, more appropriate control can be performed, and inappropriate control due to incorrect recognition can be prevented.

< modification >

Although the embodiments of the present disclosure have been specifically described above, the contents of the present disclosure are not limited to the above-described embodiments, and various modifications may be made based on the technical idea of the present disclosure.

Although the hand is described as an example in the above embodiments, the present disclosure may be applied to another part of the body, such as the foot or the elbow. Further, the control device is not limited to the glasses-type wearable device, and may be a wearable device worn on a shoulder, a wristband-type wearable device, a head-up display, or the like. Further, the control device is not limited to a wearable device.

In the above-described embodiments, even in a case where the object is a real object, the real object does not necessarily have to be an object that emits a fragrance. Assuming that the real object emits a fragrance, the processing of controlling the fragrance to be output from the position of the real object may be performed as described in the above embodiments.

A part of the processing by the control device in the above-described embodiment may be executed on the cloud. For example, the processing performed by the control unit 21 may be performed on the cloud by a server apparatus or the like.

In the above-described embodiment, control regarding the fragrance output in consideration of the passage of time may be performed. For example, after the control of emitting the fragrance is performed, the control may be performed such that the intensity of the fragrance is gradually reduced over time.

The present disclosure may be applied to various apparatuses. For example, the present disclosure may be applied not only to entertainment devices such as game devices but also to simulation devices for health care, cooking, disaster relief, and the like.

The present disclosure may also be embodied by apparatuses, methods, programs, systems, and the like. For example, by making a program that performs the functions described in the above-described embodiments downloadable, an apparatus that does not have the functions described in the embodiments can download and install the program and thereby perform the control described in the embodiments. The present disclosure may also be implemented by a server that distributes such a program. Further, the items and modifications described in each embodiment may be combined as appropriate.

The present disclosure may have the following configuration.

(1) A control device comprising

a control unit configured to output information for controlling output of a fragrance based on a result of recognizing a hand.

(2) The control device according to the above (1),

wherein the control unit recognizes a position of the hand based on a result of recognizing the hand, and outputs information for controlling the fragrance to be output from the position.

(3) The control device according to the above (2),

wherein the information includes information for causing, among a plurality of devices that emit fragrance, the device closest to the recognized position of the hand to output the fragrance.

(4) The control device according to the above (1),

wherein the control unit outputs information for controlling the output of the fragrance based on the result of recognizing the motion of the hand.

(5) The control device according to any one of (1) to (4),

wherein the information controlling the output of the fragrance includes: at least one of information to cause emission of a fragrance, information to cause a change in intensity of a fragrance, information to cause a change in direction in which a fragrance appears, information to stop emission of a fragrance, and information indicating a period of time during which emission of a fragrance continues.

(6) The control device according to any one of (1) to (5),

wherein the control unit outputs information for performing control for causing the user to perceive a change in the surrounding environment and information for controlling output of the fragrance.

(7) The control device according to (6),

wherein the control for causing the user to perceive the change in the surrounding environment includes at least one of control for reproducing sound, control for emitting wind, and control for changing surrounding brightness.

(8) The control device according to any one of (1) to (5),

wherein the control unit outputs information controlling the output of the fragrance based on a result of recognizing the hand and a result of the recognition with respect to a predetermined input.

(9) The control device according to (8),

wherein the predetermined input comprises an input detectable by at least one of a sound sensor, a temperature sensor, and a physiological sensor.

(10) The control device according to any one of (1) to (9),

wherein the control unit identifies the hand based on the image corresponding to the viewpoint of the user.

(11) According to the control device of (10),

wherein the image comprises a real image or a virtual image.

(12) The control device according to any one of (1) to (11), further comprising a fragrance emission unit that performs output of fragrance based on the information.

(13) The control apparatus according to any one of (1) to (12), the control apparatus being configured as a wearable device.

(14) A control method comprising

outputting, by a control unit, information for controlling output of a fragrance based on a result of recognizing a hand.

(15) A program for causing a computer to execute a control method, the method comprising

outputting, by a control unit, information for controlling output of a fragrance based on a result of recognizing a hand.

REFERENCE SIGNS LIST

1 control system

2 control device

3 fragrance output device

21 control unit

22 image sensor unit

23 sensor unit.
