Cooking robot, cooking robot control device and control method

Document No.: 73895 | Publication date: 2021-10-01

Reading note: This technology, "Cooking robot, cooking robot control device and control method", was designed and created by 藤田雅博, 吉田芳, and 米夏埃尔·西格弗里德·施普兰格尔 on 2020-02-14. Summary: The technology relates to a cooking robot, a cooking robot control apparatus, and a control method that allow for improved reproducibility when using the cooking robot to reproduce a dish made by a cook. A cooking robot of one aspect of the technology controls a cooking action performed by a cooking arm by using recipe data including a data set linking: cooking operation data describing information about the food materials of a dish and information about the actions of the cook in a cooking process using those materials; and sensation data indicating the cook's sensations measured in conjunction with the progress of the cooking process. The technology can be applied to a cooking robot that performs cooking by operating an arm.

1. A cooking robot comprising:

a cooking arm configured to perform a cooking operation for making a dish; and

a control unit configured to control a cooking operation performed by the cooking arm using recipe data including a data set in which cooking operation data describing information about a food material of the dish and information about an operation of a cook in a cooking process using the food material is linked with sensation data indicating a sensation of the cook measured in conjunction with a progress of the cooking process.

2. The cooking robot according to claim 1,

wherein the sensation data is data indicating at least one of a flavor of the food material before cooking, a flavor of a cooked food material cooked in the cooking process, and a flavor of the dish completed through all cooking processes.

3. The cooking robot according to claim 2,

wherein the sensation data includes taste information indicating at least one of sweetness, sourness, saltiness, bitterness, umami, pungency, and astringency.

4. The cooking robot according to claim 3,

wherein the sensation data includes other information about taste obtained by using the taste information as an input to a model generated by deep learning.

5. The cooking robot of claim 2, further comprising:

a flavor measurement unit configured to measure at least one of: a flavor of the food material cooked by the cooking operation performed by the cooking arm, and a flavor of the dish completed by the cooking operation performed by the cooking arm.

6. The cooking robot according to claim 5,

wherein the flavor measurement unit measures at least one of a taste forming the flavor of the food material and a taste forming the flavor of the dish, and

the control unit causes the cooking arm to perform a cooking operation for taste adjustment such that the taste measured by the flavor measurement unit matches the taste represented by the sensation data.

7. The cooking robot according to claim 5,

wherein the sensation data includes texture information indicating at least one of a texture of the food material and a texture of the dish,

the flavor measurement unit measures at least one of a texture forming the flavor of the food material and a texture forming the flavor of the dish, and

the control unit causes the cooking arm to perform a cooking operation for texture adjustment such that the texture measured by the flavor measurement unit matches the texture represented by the sensation data.

8. The cooking robot according to claim 7,

wherein the texture information is information indicating at least one of stress, hardness, and water content measured by a sensor.

9. The cooking robot according to claim 1,

wherein the recipe data includes cooking environment data indicating an environment of a cooking space measured in conjunction with a progress of the cooking process,

the cooking robot further comprising:

an environment control unit configured to control an environment of a meal space in which a dish completed by the cooking operation performed by the cooking arm is eaten, such that the environment of the meal space matches the environment of the cooking space represented by the cooking environment data.

10. The cooking robot according to claim 9,

wherein the cooking environment data is data indicating at least one of temperature, humidity, atmospheric pressure, brightness, hue, and saturation of the cooking space.

11. The cooking robot according to claim 1,

wherein the recipe data includes food attribute information indicating at least one of an attribute of the food material before cooking, an attribute of a cooked food material cooked in the cooking process, and an attribute of the dish completed through all cooking processes.

12. The cooking robot according to claim 1,

wherein a type and an amount of seasoning used in the cooking process are described in the cooking operation data, and

the recipe data includes seasoning attribute information indicating an attribute of the seasoning.

13. The cooking robot according to claim 1,

wherein the recipe data includes cook attribute information indicating an attribute of the cook.

14. The cooking robot according to claim 13,

wherein the control unit updates the cooking process according to a difference between the attribute of the cook indicated by the cook attribute information and an attribute of a user who eats the dish completed by the cooking operation performed by the cooking arm.

15. The cooking robot of claim 1, further comprising:

a condition identifying unit configured to identify a condition when the cooking arm performs a cooking operation, wherein,

the control unit updates the cooking process according to the identified condition.

16. The cooking robot of claim 15,

wherein the control unit updates the cooking process according to a condition of the food material.

17. The cooking robot according to claim 1,

wherein the control unit controls the cooking arm according to an instruction command that is generated based on the recipe data and gives an instruction on the cooking operation.

18. The cooking robot of claim 17,

wherein the control unit causes a plurality of cooking arms to cooperatively perform the cooking operation according to the instruction command.

19. A control method, comprising:

controlling, by a cooking robot including a cooking arm that performs a cooking operation for making a dish, the cooking operation performed by the cooking arm using recipe data including a data set in which cooking operation data is linked with sensation data, the cooking operation data describing information about a food material of the dish and information about an operation of a cook in a cooking process using the food material, and the sensation data indicating a sensation of the cook measured in conjunction with a progress of the cooking process.

20. A cooking robot control device comprising:

a control unit configured to control a cooking operation performed by a cooking arm included in a cooking robot using recipe data including a data set in which cooking operation data describing information on a food material of a dish and information on an operation of a cook in a cooking process using the food material is linked with sensation data indicating a sensation of the cook measured in conjunction with a progress of the cooking process.

21. A control method, comprising:

controlling, by a cooking robot control apparatus, a cooking operation performed by a cooking arm included in a cooking robot using recipe data including a data set in which cooking operation data describing information on a food material of a dish and information on an operation of a cook in a cooking process using the food material is linked with sensation data indicating a sensation of the cook measured in conjunction with a progress of the cooking process.

Technical Field

The present technology relates to a cooking robot, a cooking robot control apparatus, and a control method, and in particular, to a cooking robot, a cooking robot control apparatus, and a control method capable of improving reproducibility in a case where the cooking robot reproduces the same dish as a dish made by a cook.

Background

A technology is being researched for reproducing, on the cooking robot side, dishes made by a cook, by sensing the movement of the cook during cooking and saving/transmitting the sensing result data. For example, a cooking operation by the cooking robot reproduces the same movements as those of the cook's hands based on the sensing results.

CITATION LIST

Patent document

Patent document 1: PCT Japanese translation patent publication No. 2017-506169

Patent document 2: PCT Japanese translation patent publication No. 2017-536247

Disclosure of Invention

Problems to be solved by the invention

With cooking methods using conventional cooking robots, even if the cooking process is performed according to the recipe, it is practically difficult to reproduce the dish as the cook intended.

This is because the senses of taste, smell, and the like differ between the cook and the individual who eats the food; the type, size, texture, origin, and the like of the food materials differ between the cook side and the reproduction side; the type and capability of the cooking devices differ; and cooking environments such as temperature and humidity also differ.

The present technology has been made in view of the above circumstances, and improves reproducibility in the case where the same dish as that made by a cook is reproduced by a cooking robot.

Solution to the problem

A cooking robot according to an aspect of the present technology includes: a cooking arm configured to perform a cooking operation for making a dish; and a control unit configured to control the cooking operation performed by the cooking arm using recipe data including a data set in which cooking operation data, describing information about a food material of the dish and information about an operation of the cook in a cooking process using the food material, is linked with sensation data indicating a sensation of the cook measured in conjunction with a progress of the cooking process.

In one aspect of the present technology, a cooking operation performed by a cooking arm is controlled using recipe data including a data set in which cooking operation data describing information on a food material of a dish and information on an operation of a cook in a cooking process using the food material and sensation data indicating a sensation of the cook measured in conjunction with a progress of the cooking process are linked.

Drawings

Fig. 1 is a diagram showing an example of an overall process in a cooking system according to an embodiment of the present technology.

Fig. 2 is a diagram for describing a difference in food materials used on the chef side and the reproduction side.

Fig. 3 is a diagram showing an example of the description contents of recipe data.

Fig. 4 is a diagram showing an example of information contained in a cooking process data set.

Fig. 5 is a diagram showing an example of flavor components.

Fig. 6 is a diagram showing an example of calculation of taste subjective information.

Fig. 7 is a diagram showing an example of a graph of taste subjective information.

Fig. 8 is a diagram showing an example of recipe data.

Fig. 9 is a diagram showing an example of a flow of generating recipe data.

Fig. 10 is a diagram showing an example of a flow of reproducing a dish based on recipe data.

Fig. 11 is a diagram showing the flow of the chef side and the flow of the reproduction side together.

Fig. 12 is a diagram showing an example of another descriptive content of recipe data.

Fig. 13 is a diagram showing a configuration example of a cooking system according to an embodiment of the present technology.

Fig. 14 is a diagram showing another configuration example of the cooking system.

Fig. 15 is a diagram showing a configuration example of the control device.

Fig. 16 is a diagram showing a configuration example around a kitchen where a cook performs cooking.

Fig. 17 is a diagram showing an example of the use state of the taste sensor.

Fig. 18 is a block diagram showing a configuration example on the chef side.

Fig. 19 is a block diagram showing a configuration example of hardware of the data processing apparatus.

Fig. 20 is a block diagram showing a functional configuration example of the data processing apparatus.

Fig. 21 is a perspective view showing an appearance of the cooking robot.

Fig. 22 is an enlarged view showing the condition of the cooking arm.

Fig. 23 is a view showing an appearance of the cooking arm.

Fig. 24 is a view showing an example of movable ranges of respective parts of the cooking arm.

Fig. 25 is a view showing an example of connection between the cooking arm and the controller.

Fig. 26 is a block diagram showing an example of the configuration of the cooking robot and its surroundings.

Fig. 27 is a block diagram showing a functional configuration example of the control device.

Fig. 28 is a block diagram showing a configuration example of the flavor information processing unit.

Fig. 29 is a flowchart for describing recipe data generation processing by the data processing apparatus.

Fig. 30 is a flowchart for describing the flavor information generation process executed in step S5 of fig. 29.

Fig. 31 is a flowchart for describing the dish reproduction process of the control apparatus.

Fig. 32 is a flowchart for describing the flavor measurement process executed in step S36 of fig. 31.

Fig. 33 is a flowchart for describing the flavor adjustment process executed in step S38 of fig. 31.

Fig. 34 is a flowchart for describing the taste adjustment process executed in step S61 of fig. 33.

Fig. 35 is a diagram showing an example of a plan.

Fig. 36 is a flowchart for describing the flavor adjustment process of the control device.

Fig. 37 is a diagram showing an example of determining a flavor.

Fig. 38 is a diagram showing an example of determining flavor using flavor subjective information.

Fig. 39 is a diagram showing an example of a model for generating sensor data.

Fig. 40 is a flowchart for describing the flavor sensor information correction processing of the control device.

Fig. 41 is a diagram showing another configuration example of the cooking system.

Detailed Description

< brief summary of the present technology >

The present technology focuses on the difference between the sensations a cook has when making a dish and the sensations at the time of cooking performed based on a recipe created by the cook. It converts the cook's sensations when making the dish into data, links this sensation data with data describing the food materials and the cooking process, and manages the linked data as recipe data.

Further, the present technology enables the cooking robot side to reproduce a dish having the flavor intended by the cook, by adjusting the cooking operation of the cooking robot based on the cook's sensations represented by the sensation data.

Further, in addition to using the sensation data, the present technology enables flexible cooking that suits the characteristics (attributes, state, etc.) of the person eating the dish, by adjusting the food materials and the cooking operation using data sensed during the cooking operation at the time of reproduction.

Hereinafter, a manner for implementing the present technology will be described. The description will be given in the following order.

1. Generation of recipe data and reproduction of dishes in a cooking system

2. Recipe data

3. Example of flow of generation of recipe data and reproduction of dishes

4. Configuration example of cooking System

5. Operation of a cooking system

6. Modifications of the invention

< Generation of recipe data and reproduction of dishes in cooking System >

Fig. 1 is a diagram showing an example of an overall process in a cooking system according to an embodiment of the present technology.

As shown in fig. 1, the cooking system includes a cook-side configuration for cooking and a reproduction-side configuration for reproducing a dish made by the cook.

The chef-side configuration is, for example, provided in a restaurant, and the reproduction-side configuration is, for example, provided in an ordinary home. The cooking robot 1 is prepared as the reproduction-side configuration.

The cooking system of fig. 1 is a system in which the same dish as a dish made by a chef is reproduced by the cooking robot 1 provided as the reproduction-side configuration. The cooking robot 1 is a robot that includes drive-system devices such as cooking arms and various sensors, and has a cooking function.

As indicated by the arrow, recipe data is provided from the chef-side configuration to the reproduction-side configuration including the cooking robot 1. As will be described in detail below, information about a dish made by the chef, including its food materials, is described in the recipe data.

In the reproduction-side configuration, dishes are reproduced by controlling the cooking operation of the cooking robot 1 based on recipe data. For example, a dish is reproduced by causing the cooking robot 1 to perform a cooking operation for realizing the same process as that of a chef.

Although a chef is shown in fig. 1 as the person performing cooking, the cooking system is applicable to cooking performed by anyone, regardless of the person's title or role in the kitchen.

Further, although one chef-side configuration is shown in fig. 1, the cooking system includes a plurality of chef-side configurations provided in a plurality of restaurants, respectively. For example, recipe data of a predetermined dish made by a predetermined chef selected by a person eating a dish reproduced by the cooking robot 1 is supplied to the reproduction-side configuration.

Note that a dish refers to a work product that is completed after cooking. Cooking refers to a process of making a dish or an action (work) of making a dish.

Fig. 2 is a diagram for describing a difference in food materials used on the chef side and the reproduction side.

In the case where, for example, a chef uses carrots in cooking, information indicating that carrots are used as a food material is described in the recipe data, together with information about the cooking process using the carrots.

Similarly, a cooking operation using carrots is performed on the reproduction side based on the recipe data.

Here, even if both are classified as the same "carrot", the carrot prepared on the chef side and the carrot prepared on the reproduction side differ in taste, aroma, and texture due to differences in variety, place of production, harvest time, growing conditions, and post-harvest environment. As natural products, no two food materials are identical.

Therefore, even if the cooking robot 1 performs exactly the same cooking operation as that of the chef, the flavor of the dish prepared using the carrots will be different. Details of flavor will be described below.

One dish is completed through a plurality of cooking processes; even for the semi-finished product of a single cooking process using carrots, the flavor differs between the chef side and the reproduction side.

Similarly, the flavor of a finished or semi-finished dish differs between the chef side and the reproduction side depending on differences in the seasonings used in a given cooking process, in the cooking tools used (e.g., kitchen knives and pots), and in the heating power of the equipment.

Thus, in the cooking system of fig. 1, for example, every time a cooking process is performed, the flavor obtained as a sensation by the chef while making the dish is measured. In the recipe data provided to the reproduction side, the sensation data obtained by converting the flavor sensed by the chef into data is described as being linked with, for example, the information on the food materials and operations of that cooking process.

< recipe data >

Fig. 3 is a diagram showing an example of the description contents of recipe data.

As shown in fig. 3, one recipe data includes a plurality of cooking process data sets. In the example of fig. 3, a cooking process data set related to cooking process #1, a cooking process data set related to cooking process #2, ..., and a cooking process data set related to cooking process #N are included.

Therefore, in the recipe data, information on one cooking process is described as one cooking process data set.

Fig. 4 is a diagram showing an example of information included in the cooking process data set.

As shown in the balloon of fig. 4, the cooking process data set includes cooking operation information as information on a cooking operation for realizing a cooking process and flavor information as information on a flavor of an ingredient having undergone the cooking process.

1. Cooking operation information

The cooking operation information includes food material information and operation information.

1-1. food material information

The food material information is information about food materials used by a chef in a cooking process. The information about the food material includes information representing the type, number and size of the food material.

For example, in the case where the chef uses carrots in a certain cooking process, information indicating that carrots are used is included in the food material information. Information indicating the various foods the chef uses as ingredients of the dish, such as water and seasonings, is also included in the food material information. Here, food refers to any item that a person can eat.

Food materials include not only completely uncooked food materials but also cooked (prepared) food materials obtained through some cooking. The food material information included in the cooking operation information of a certain cooking process includes information on the food materials that have undergone the previous cooking processes.

The food materials used by the chef can be recognized, for example, by analyzing images of the chef's cooking captured by the camera. The food material information is generated based on the recognition results. The images captured by the camera may be still images or moving images.

The food material information may be registered by the chef or another person (e.g., a staff person) who supports the chef when generating the recipe data.

1-2. operational information

The operation information is information about the movement of the chef during the cooking process. The information about the movement of the chef includes information representing the chef's body movement at each moment, such as the type of cooking tool used, the movements of the hands, and the chef's standing position at each moment.

For example, in the case where a cook cuts a certain food material with a kitchen knife, the operation information includes information indicating that the kitchen knife has been used as a cooking tool, information indicating a cutting position, a number of times of cutting, a cutting intensity, an angle, a speed, and the like.

Further, in the case where the chef uses a ladle to stir a pot containing a liquid food material, the operation information includes information indicating that the ladle has been used as a cooking tool, and information indicating the stirring strength, angle, speed, duration, and the like.

In the case where a cook roasts a certain food material in an oven, the operation information includes information indicating that the oven has been used as a cooking tool, a heating power of the oven, a roasting time, and the like.

In the case where the chef plates the food material, the operation information includes information on the tableware used, how the food material is arranged, the coloring of the arrangement, and the like.

For example, the movement of the chef is recognized by analyzing images of the chef's cooking captured by the camera, or by analyzing sensor data measured by sensors attached to the chef's body. The operation information is generated based on the results of recognizing the chef's movement.

2. Flavour information

As shown in fig. 4, the flavor information includes flavor sensor information and flavor subjective information. The flavor is obtained as a sensation. The flavor information included in the cooking process data set corresponds to sensory data obtained by converting the cook's senses into data.

Fig. 5 is a diagram showing an example of flavor components.

As shown in fig. 5, the deliciousness, i.e., "flavor", that an individual feels in the brain is mainly a combination of taste obtained by the sense of taste of the individual, aroma obtained by the sense of smell of the individual, and texture obtained by the sense of touch of the individual.

The flavor also includes the body-sensory temperature and the color, because the deliciousness an individual feels varies depending on the body-sensory temperature and the color of the food.

The configuration elements of the flavor will be described.

(1) Taste

Taste includes the five tastes (salty, sour, bitter, sweet, and umami) that can be perceived by taste receptor cells in the tongue and oral cavity. Salty, sour, bitter, sweet, and umami are referred to as the basic five tastes.

Further, in addition to the basic five tastes, taste includes pungency, which is felt not only in the oral cavity but also as a general pain sensation via the vanilloid receptors belonging to the transient receptor potential (TRP) channel family. Astringency is also one of the tastes, although astringency and bitterness overlap depending on the concentration.

Each taste will be described.

-salty taste

Substances that give a salty taste include minerals (Na, K, Fe, Mg, Ca, Cu, Mn, Al, Zn, etc.) that form salts through ionic bonds.

-sourness

Substances that produce a sour taste include acids such as citric acid and acetic acid. Typically, a sour taste is perceived depending on the decrease in pH (e.g., about pH 3).

-sweet taste

Substances that produce sweetness include sugars (e.g., sucrose and glucose), lipids, amino acids (e.g., glycine), and artificial sweeteners.

-umami taste

Umami taste-producing substances include amino acids (e.g., glutamic acid and aspartic acid), nucleic acid derivatives (e.g., inosinic acid, guanylic acid, and xanthylic acid), organic acids (e.g., succinic acid), and salts.

-bitter taste

Substances that produce a bitter taste include alkaloids (e.g., caffeine, theobromine, and nicotine), humulones, catechins, terpenoids, limonin, cucurbitacin, the flavanone glycoside naringin, bitter amino acids, bitter peptides, bile acids, and inorganic salts (e.g., calcium and magnesium salts).

-astringent taste

Substances that produce astringency include polyphenols, tannins, catechins, multivalent ions (Al, Zn and Cr), ethanol and acetone. Astringency is recognized or measured as part of the bitter taste.

-spicy taste

Substances that produce a pungent taste include capsaicin. Biologically, capsaicin and menthol are recognized by the temperature receptors of the TRP channel family as a pain sensation rather than a taste. Capsaicin, a component of capsicum and various spices, feels hot, whereas menthol, a component of mint, feels cold.

(2) Aroma

Aroma is produced by volatile low-molecular-weight organic compounds (molecular weight of 300 or less) that are recognized (bound) by olfactory receptors expressed in the nasal cavity and nasopharynx.

(3) Texture

Texture is an index called food texture, and is expressed by hardness, viscosity, cohesiveness, polymer content, water content (moisture), oil content (greasiness), and the like.

(4) Body-sensory temperature

The body-sensory temperature is the temperature felt by human skin. It includes not only the temperature of the food itself but also temperature sensations produced by food components on the skin's surface layer, such as the cooling sensation of foods containing volatile substances such as mint, and the warming sensation of foods containing pungent ingredients such as capsicum.

(5) Color

The color of a food reflects the pigments it contains as well as its bitter and astringent components. For example, plant-derived foods include pigments produced by photosynthesis and polyphenol components associated with bitter and astringent tastes. The components included in a food can be estimated from its color by optical measurement.

2-1. flavor sensor information

The flavor sensor information configuring the flavor information is sensor data obtained by measuring the flavor of the food material with a sensor. Sensor data obtained by a sensor by measuring the flavor of uncooked food material may be included in the flavor information as flavor sensor information.

Since flavor is configured by taste, aroma, texture, body-sensory temperature, and color, the flavor sensor information includes sensor data relating to taste, sensor data relating to aroma, sensor data relating to texture, sensor data relating to body-sensory temperature, and sensor data relating to color. All of these sensor data may be included in the flavor sensor information, or some of them may be omitted.

The respective sensor data configuring the flavor sensor information are referred to as taste sensor data, olfactory sensor data, texture sensor data, body-sensory temperature sensor data, and color sensor data.

Taste sensor data is sensor data measured by a taste sensor. The taste sensor data is configured by at least one parameter of a salty taste sensor value, a sour taste sensor value, a bitter taste sensor value, a sweet taste sensor value, an umami taste sensor value, a spicy taste sensor value, and an astringent taste sensor value.

Taste sensors include, for example, artificial lipid membrane type taste sensors that use artificial lipid membranes in the sensor cell. The artificial lipid membrane type taste sensor is a sensor for detecting a change in membrane potential caused by electrostatic and hydrophobic interactions of a lipid membrane with respect to a taste substance, which is a substance causing a sense of taste, and outputting the change as a sensor value.

As the taste sensor, instead of an artificial lipid membrane type taste sensor, various other devices capable of converting each taste element (saltiness, sourness, bitterness, sweetness, umami, pungency, and astringency) into data and outputting the data may be used, such as a taste sensor using a polymer membrane.

The olfactory sensor data is sensor data measured by an olfactory sensor. The olfactory sensor data is configured by values of each element representing fragrance, for example, spicy fragrance, fruit fragrance, grass fragrance, musty (cheese fragrance), citrus fragrance, and rose fragrance.

The olfactory sensor includes, for example, a sensor provided with numerous crystal oscillators, which are used in place of the receptors of the human nose. An olfactory sensor using crystal oscillators detects the change in the oscillation frequency of a crystal oscillator when an aroma component strikes it, and outputs the values expressing the aroma described above based on the pattern of the change in oscillation frequency.

As the olfactory sensor, instead of a sensor using crystal oscillators, various other devices capable of outputting values expressing aroma may be used, such as sensors formed of materials such as carbon that substitute for the receptors of the human nose.

The texture sensor data is sensor data specified by analyzing an image captured by the image pickup device and sensor data measured by various sensors. The texture sensor data is configured by at least one parameter representing information of hardness, viscosity (stress), cohesiveness, polymer content, water content, oil content, and the like.

Hardness, stickiness, viscosity, and cohesiveness are recognized, for example, by analyzing images, captured by the camera, of the food material being cooked by the chef. For example, by analyzing an image of soup being stirred by the chef, values of hardness, viscosity, and cohesiveness can be recognized. These values can also be recognized by measuring the pressure with which the chef cuts the food material with a kitchen knife.

The polymer content, water content, and oil content are measured by, for example, a sensor that irradiates the food material with light of a predetermined wavelength and analyzes the reflected light.

A database in which each food material is associated with its texture parameters may be prepared, and the texture sensor data of each food material can be identified by referring to the database.
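As an illustration, such a food-material/texture database can be a simple keyed lookup. The sketch below is a minimal Python example; the food-material names and parameter values are purely illustrative, not values from the publication.

```python
# Minimal sketch of a food-material/texture database.
# All entries and values are illustrative assumptions.
TEXTURE_DB = {
    "carrot": {"hardness": 0.8, "viscosity": 0.1, "water_content": 0.88, "oil_content": 0.01},
    "potato": {"hardness": 0.7, "viscosity": 0.1, "water_content": 0.79, "oil_content": 0.00},
}

def texture_parameters(food_material: str) -> dict:
    """Look up the texture parameters registered for a food material."""
    return TEXTURE_DB[food_material]

print(texture_parameters("carrot"))
```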

The body-sensory temperature sensor data is sensor data obtained by measuring the temperature of the food material using a temperature sensor.

The color sensor data is data specified by analyzing the color of the food material from an image captured by the image pickup device.

2-2. subjective information of flavor

The flavor subjective information is information indicating how an individual, such as the chef who is cooking, subjectively perceives the flavor. The flavor subjective information is calculated based on the flavor sensor information.

Since the flavor is configured by taste, aroma, texture, body-sensory temperature, and color, the flavor subjective information includes subjective information on taste, subjective information on aroma, subjective information on texture, subjective information on body-sensory temperature, and subjective information on color. Subjective information about taste, subjective information about aroma, subjective information about texture, subjective information about body-sensory temperature, and subjective information about color may all be included in the flavor subjective information, or some subjective information may not be included in the flavor subjective information.

Each piece of subjective information configuring the flavor subjective information is called taste subjective information, smell subjective information, texture subjective information, body-sensory temperature subjective information, and color subjective information.

Fig. 6 is a diagram showing an example of calculation of taste subjective information.

As shown in fig. 6, taste subjective information is calculated using a taste subjective information generation model, which is a neural network model generated by deep learning or the like. The taste subjective information generation model is generated in advance by learning from, for example, the taste sensor data of a certain food material and information (numerical values) indicating how a chef who ate that food material perceived its taste.

For example, as shown in fig. 6, when a salty taste sensor value, a sour taste sensor value, a bitter taste sensor value, a sweet taste sensor value, a umami taste sensor value, a spicy taste sensor value, and an astringent taste sensor value are input as taste sensor data of a certain food material, the salty taste subjective value, the sour taste subjective value, the bitter taste subjective value, the sweet taste subjective value, the umami taste subjective value, the spicy taste subjective value, and the astringent taste subjective value are output from the taste subjective information generation model.

The salty taste subjective value is a value indicating how a chef feels salty taste. The sourness subjective value is a value indicating how a chef feels sourness. Similarly, the bitterness subjective value, the sweetness subjective value, the umami subjective value, the peppery subjective value, and the astringency subjective value are values indicating how the chef feels bitterness, sweetness, umami, peppery, and astringency, respectively.
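For illustration, the taste subjective information generation model can be sketched as a small regression network that maps the seven taste sensor values to the seven subjective values. The PyTorch sketch below is a minimal example; the layer sizes, and of course the untrained weights, are assumptions and not part of the publication.

```python
import torch
import torch.nn as nn

TASTES = ["saltiness", "sourness", "bitterness", "sweetness",
          "umami", "pungency", "astringency"]

# Hypothetical taste subjective information generation model:
# 7 taste sensor values in, 7 subjective values out.
model = nn.Sequential(
    nn.Linear(7, 32),
    nn.ReLU(),
    nn.Linear(32, 7),
)

# Inference: taste sensor data for one food material -> subjective values.
sensor_values = torch.tensor([[0.6, 0.2, 0.1, 0.4, 0.7, 0.0, 0.1]])
with torch.no_grad():
    subjective_values = model(sensor_values).squeeze(0)
print(dict(zip(TASTES, subjective_values.tolist())))
```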

As shown in fig. 7, the taste subjective information of a certain food material can be plotted as a graph of the salty, sour, bitter, sweet, umami, pungent, and astringent subjective values. Food materials whose taste subjective information graphs have similar shapes taste similar to the chef, when only the taste element of the flavor is considered.

Similarly, other subjective information configuring the flavor subjective information is calculated using the respective models used to generate the subjective information.

That is, olfactory subjective information is calculated by inputting olfactory sensor data into an olfactory subjective information generation model, and texture subjective information is calculated by inputting texture sensor data into a texture subjective information generation model. Body-sensory temperature subjective information is calculated by inputting body-sensory temperature sensor data into a body-sensory temperature subjective information generation model, and color subjective information is calculated by inputting color sensor data into a color subjective information generation model.

Instead of using a neural network model, taste subjective information may be calculated based on table information in which taste sensor data of a certain food material is associated with information representing how a chef who ate the food material perceived its taste. Various methods of calculating flavor subjective information from flavor sensor information can be employed.

As described above, the recipe data is configured by linking (associating) cooking operation information, which is information on a cooking operation that implements a cooking process, to flavor information, which is information on a flavor of a food material or a dish measured in conjunction with the cooking process.
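To summarize the structure described so far, the sketch below models recipe data as an ordered list of cooking process data sets, each linking cooking operation information with flavor information. The field names and types are illustrative assumptions, not taken from the publication.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class FlavorInformation:
    """Flavor sensor information plus flavor subjective information."""
    taste_sensor: List[float]         # salty, sour, bitter, sweet, umami, pungent, astringent
    olfactory_sensor: List[float]     # per-aroma-element values
    texture_sensor: Dict[str, float]  # e.g., hardness, viscosity, water content
    taste_subjective: List[float]     # output of the subjective information model

@dataclass
class CookingOperationInformation:
    """Food material information and operation information."""
    food_materials: List[str]         # types (plus quantity/size in a fuller model)
    operations: List[str]             # tool use, hand movements, timings, etc.

@dataclass
class CookingProcessDataSet:
    operation: CookingOperationInformation
    flavor: FlavorInformation

@dataclass
class RecipeData:
    dish_name: str
    processes: List[CookingProcessDataSet] = field(default_factory=list)
```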

As shown in fig. 8, recipe data including the above information is prepared for each dish. Which recipe data is used to reproduce a dish is selected, for example, by a person at the place where the cooking robot 1 is installed.

< example of flow of generation of recipe data and reproduction of dish >

Fig. 9 is a diagram showing an example of a flow of generating recipe data.

As shown in fig. 9, cooking by a chef is generally performed by repeating cooking using food materials for each cooking process, tasting the cooked food materials, and adjusting the flavor.

For example, the flavor is adjusted by adding work such as adding salt when salty taste is insufficient or squeezing lemon juice when sour taste is insufficient. For example, the aroma is adjusted by adding work such as chopping and adding herbs, or heating food materials. For example, the texture is adjusted by adding work such as beating the food material to soften it when it is hard or increasing the time for cooking the food material.

Cooking operation information configuring a cooking process data set is generated based on the results of sensing the chef's operation of cooking with the food materials and the chef's operation of adjusting the flavor.

Further, by sensing the flavor of the cooked food material, flavor information is generated based on the sensing result.

In the example of fig. 9, cooking operation information configuring the cooking process data set of the cooking process #1 is generated based on the sensing results of the cooking operation performed by the cook as the cooking process #1 and the operation of the cook for adjusting the flavor, as indicated by arrows a1 and a 2.

Further, as indicated by arrow a3, flavor information configuring the cooking process data set of cooking process #1 is generated based on the sensing result of the flavor of the food material cooked by cooking process # 1.

After the cooking process #1 is completed, a cooking process #2, which is the next cooking process, is performed.

Similarly, as indicated by arrows a11 and a12, cooking operation information configuring the cooking process data set of cooking process #2 is generated based on the results of sensing the cooking operation performed by the chef as cooking process #2 and the chef's operation for adjusting the flavor.

Further, as indicated by arrow a13, based on the result of sensing the flavor of the cooked food material by the cooking process #2, flavor information configuring the cooking process data set of the cooking process #2 is generated.

One dish is completed through such a plurality of cooking processes. Further, recipe data describing a cooking process data set for each cooking process is generated when a dish is completed.

Hereinafter, the description mainly assumes that one cooking process is configured by three cooking operations: cooking, tasting, and adjusting. However, the unit of cooking operations included in one cooking process may be set arbitrarily. A cooking process may be configured by a cooking operation that involves no tasting or post-tasting flavor adjustment, or may be configured by a flavor adjustment alone. In either case, flavor information generated based on the result of sensing the flavor in the cooking process is included in the cooking process data set.

The timing for sensing the flavor may also be set arbitrarily, instead of sensing the flavor every time one cooking process is completed. For example, flavor sensing can be repeated during one cooking process. In this case, the cooking process data set includes time-series data of flavor information.

Instead of including flavor information in every cooking process data set, the flavor may be measured at arbitrary timings, and the flavor information may be included in a cooking process data set together with information on the cooking operation performed at the timing of the measurement.

Fig. 10 is a diagram showing an example of a flow of reproducing a dish based on recipe data.

As shown in fig. 10, the reproduction of a dish by the cooking robot 1 is performed by repeating cooking for each cooking process based on the cooking operation information included in the cooking process data set described in the recipe data, measuring the flavor of the cooked food material and adjusting the flavor.

The flavor is adjusted by adding work so that, for example, the flavor measured by a sensor prepared on the cooking robot 1 side approaches the flavor indicated by the flavor information. The details of the flavor adjustment by the cooking robot 1 will be described below.

The flavour measurement and adjustment may be repeated a number of times during the cooking process, for example. That is, each time the adjustment is performed, the flavor is measured for the adjusted food material, and the flavor is adjusted based on the measurement result.

In the example of fig. 10, as indicated by an arrow a21, the cooking operation of the cooking robot 1 is controlled based on the cooking operation information configuring the cooking process data set of the cooking process #1, and the same operation as that of the cook's cooking process #1 is performed by the cooking robot 1.

After the same operation as that of the cook's cooking process #1 is performed by the cooking robot 1, the flavor of the cooked food material is measured, and the adjustment of the flavor of the cooking robot 1 is controlled based on the flavor information configuring the cooking process data set of the cooking process #1, as indicated by arrow a 22.

In the case where the flavor measured by the sensors prepared on the cooking robot 1 side matches the flavor indicated by the flavor information, the flavor adjustment is completed, and cooking process #1 is also completed. The flavors are determined to match not only when they are identical but also when they are similar to a degree equal to or greater than a threshold.
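The measure-and-adjust cycle on the reproduction side can be summarized as a simple control loop. The sketch below is a schematic Python rendering under stated assumptions: the similarity measure, the threshold value, and the callables standing in for the robot's cook/measure/adjust operations are all illustrative.

```python
SIMILARITY_THRESHOLD = 0.95  # illustrative; the text only requires "similar by a threshold or more"

def similarity(measured: dict, target: dict) -> float:
    """Illustrative flavor similarity: 1 minus the mean absolute difference."""
    keys = measured.keys() & target.keys()
    return 1.0 - sum(abs(measured[k] - target[k]) for k in keys) / len(keys)

def run_cooking_process(execute_operation, measure_flavor, adjust_flavor, target_flavor):
    """One cooking process on the reproduction side: cook, then measure and
    adjust the flavor until it matches the flavor information in the recipe data."""
    execute_operation()                          # same operation as the chef's process
    measured = measure_flavor()
    while similarity(measured, target_flavor) < SIMILARITY_THRESHOLD:
        adjust_flavor(measured, target_flavor)   # e.g., add salt, heat further
        measured = measure_flavor()
```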

After the cooking process #1 is completed, a cooking process #2, which is the next cooking process, is performed.

Similarly, as indicated by an arrow a31, the cooking operation of the cooking robot 1 is controlled based on the cooking operation information configuring the cooking process data set of the cooking process #2, and the same operation as that of the cook's cooking process #2 is performed by the cooking robot 1.

After the same operation as that of the cook's cooking process #2 is performed by the cooking robot 1, the flavor of the cooked food material is measured, and the adjustment of the flavor of the cooking robot 1 is controlled based on the flavor information configuring the cooking process data set of the cooking process #2, as indicated by arrow a 32.

In the case where the flavor measured by the sensor prepared on the cooking robot 1 side matches the flavor indicated by the flavor information, the flavor adjustment is completed, and the cooking process #2 is also completed.

Through such a plurality of cooking processes, the dish made by the chef is reproduced by the cooking robot 1.

Fig. 11 is a diagram showing the flow of the chef side and the flow of the reproduction side together.

As shown in the left side of fig. 11, one dish is completed through a plurality of cooking processes #1 to # N, and recipe data describing a cooking process data set of each cooking process is generated.

Meanwhile, on the reproduction side, one dish is reproduced through a plurality of cooking processes #1 to # N that are the same as the cooking process performed on the cook side, based on recipe data generated by the cook's cooking.

Since the cooking robot 1 cooks while adjusting the flavor in each cooking process, the final dish has the same flavor as, or a flavor similar to, the dish made by the chef. In this way, a dish with the same flavor as the chef's dish is reproduced with high reproducibility based on the recipe data.

For example, a chef can offer a dish with the same flavor as his or her own to people who cannot visit the chef's restaurant. Furthermore, the chef can preserve his or her dishes in a reproducible form as recipe data.

Meanwhile, a person eating a dish reproduced by the cooking robot 1 may eat a dish having the same flavor as that prepared by a chef.

Fig. 12 is a diagram showing an example of another descriptive content of recipe data.

As shown in fig. 12, the recipe data may include flavor information about the flavor of the finished dish. In this case, the flavor information on the flavor of the completed dish is linked to the entire cooking operation information.

In this way, the relationship of association between the cooking operation information and the flavor information is not necessarily one-to-one.

< example of configuration of cooking System >

(1) Overall configuration

Fig. 13 is a diagram showing a configuration example of a cooking system according to an embodiment of the present technology.

As shown in fig. 13, the cooking system is configured by connecting the data processing device 11, provided as the chef-side configuration, and the control device 12, provided as the reproduction-side configuration, via a network 13 such as the Internet. As described above, the cooking system includes a plurality of such chef-side configurations and reproduction-side configurations.

The data processing device 11 is a device that generates the recipe data. The data processing device 11 is configured by a computer or the like. The data processing device 11 transmits recipe data of a dish selected by a person eating the reproduced dish, for example, to the control device 12 via the network 13.

The control device 12 is a device that controls the cooking robot 1. The control device 12 is also configured by a computer or the like. The control device 12 receives the recipe data provided by the data processing device 11 and outputs an instruction command to control the cooking operation of the cooking robot 1 based on the description of the recipe data.

The cooking robot 1 drives each part, such as the cooking arms, according to the instruction commands provided from the control device 12 to perform the cooking operation of each cooking process. Each instruction command includes information for controlling, for example, the torque, driving direction, and driving amount of the motors provided in the cooking arms.

The control device 12 sequentially outputs instruction commands to the cooking robot 1 until cooking is completed. When the cooking robot 1 performs an operation according to the instruction command, the dish is finally completed.
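As a concrete picture of what such an instruction command might carry, here is a minimal sketch; the field names are hypothetical, chosen to match the torque/driving-direction/driving-amount parameters mentioned above.

```python
from dataclasses import dataclass

@dataclass
class InstructionCommand:
    """Hypothetical instruction command from the control device to the robot."""
    arm_id: int            # which cooking arm to drive
    motor_id: int          # which motor within that arm
    torque: float          # motor torque [N*m]
    drive_direction: int   # +1 or -1
    drive_amount: float    # e.g., joint angle increment [rad]

# The control device sequentially outputs such commands until cooking completes.
command = InstructionCommand(arm_id=0, motor_id=2, torque=0.5,
                             drive_direction=1, drive_amount=0.1)
```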

Fig. 14 is a diagram showing another configuration example of the cooking system.

As shown in fig. 14, the recipe data may be provided from the chef side to the reproduction side via a server on the network.

The recipe data management server 21 shown in fig. 14 receives the recipe data transmitted from each data processing device 11 and manages it by storing it in a database. The recipe data management server 21 transmits predetermined recipe data to the control device 12 in response to a request transmitted from the control device 12 via the network 13.

The recipe data management server 21 has a function of collectively managing the recipe data of dishes made by chefs of various restaurants and distributing the recipe data in response to a request from the reproduction side.
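A reproduction-side request to such a server could look like the following sketch; the server URL and the REST endpoint are entirely hypothetical, as the publication does not specify the protocol.

```python
import requests

# Hypothetical endpoint of the recipe data management server 21.
RECIPE_SERVER = "https://recipe-server.example.com"

def fetch_recipe(dish_id: str) -> dict:
    """Request the recipe data of one dish from the management server."""
    response = requests.get(f"{RECIPE_SERVER}/recipes/{dish_id}", timeout=10)
    response.raise_for_status()
    return response.json()
```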

Fig. 15 is a diagram showing an example of the arrangement of the control device 12.

As shown in A of fig. 15, the control device 12 is provided, for example, as a device external to the cooking robot 1. In the example of A of fig. 15, the control device 12 and the cooking robot 1 are connected via the network 13.

The cooking robot 1 receives instruction commands transmitted from the control device 12 via the network 13. Images captured by the cameras of the cooking robot 1 and various data, such as sensor data measured by sensors provided in the cooking robot 1, are transmitted from the cooking robot 1 to the control device 12 via the network 13.

Instead of connecting one cooking robot 1 to one control device 12, a plurality of cooking robots 1 may be connected to one control device 12.

As shown in B of fig. 15, the control device 12 may be provided inside the housing of the cooking robot 1. In this case, the operation of each part of the cooking robot 1 is controlled according to instruction commands generated by the control device 12.

Hereinafter, the description assumes that the control device 12 is provided as a device external to the cooking robot 1.

(2) Arrangement at the cook side

(2-1) Configuration around the kitchen

Fig. 16 is a diagram showing a configuration example around a kitchen where a cook performs cooking.

In the vicinity of the kitchen 31 where the cook performs cooking, various devices are provided to measure information used in analyzing the operation of the cook and the flavor of the food material. Some of these devices are attached to the body of the cook.

Devices disposed around the kitchen 31 are connected to the data processing device 11 via wired or wireless communication. Each device disposed around the kitchen 31 may be connected to the data processing device 11 via a network.

As shown in fig. 16, the cameras 41-1 and 41-2 are provided above the kitchen 31. The cameras 41-1 and 41-2 capture the state of the chef's cooking and the state on the top board of the kitchen 31, and transmit the captured images to the data processing device 11.

The small camera 41-3 is attached to the chef's head. The capture range of the camera 41-3 switches according to the direction of the chef's line of sight. The camera 41-3 captures the state of the hands of the chef who is cooking, the state of the food being cooked, and the state on the top board of the kitchen 31, and transmits the captured images to the data processing device 11.

In this way, a plurality of cameras are provided around the kitchen 31. Where there is no need to distinguish the cameras 41-1 to 41-3, they are collectively referred to as the camera 41 as appropriate.

The olfactory sensor 42 is attached to the body of the chef. The olfactory sensor 42 measures the aroma of the food material, and transmits the olfactory sensor data to the data processing device 11.

The taste sensor 43 is provided on the top board of the kitchen 31. The taste sensor 43 measures the taste of the food material and transmits the taste sensor data to the data processing device 11.

As shown in fig. 17, the taste sensor 43 is used by bringing the sensor unit 43A, provided at the end of a cable, into contact with the food material being cooked. In the case where the taste sensor 43 is the above-described artificial lipid membrane type taste sensor, a lipid membrane is provided on the sensor unit 43A.

Among the sensor data configuring the flavor sensor information, not only the taste sensor data but also the texture sensor data and body-sensory temperature sensor data may be measured by the taste sensor 43 and transmitted to the data processing device 11. In this case, the taste sensor 43 also functions as a texture sensor and a body-sensory temperature sensor. For example, texture sensor data such as the polymer content, water content, and oil content may be measured by the taste sensor 43.

Various devices other than the device shown in fig. 16 are provided around the kitchen 31.

Fig. 18 is a block diagram showing a configuration example on the chef side.

In the configuration shown in fig. 18, the same configurations as those described above are denoted by the same reference numerals. Duplicate description will be appropriately omitted.

As shown in fig. 18, the camera 41, the olfactory sensor 42, the taste sensor 43, the infrared sensor 51, the texture sensor 52, and the environment sensor 53 are connected to the data processing device 11.

The infrared sensor 51 emits IR light and generates an IR image, which is output to the data processing device 11. Various analyses, such as analysis of the chef's operations and of the food materials, may be performed based on the IR image captured by the infrared sensor 51 instead of the image (RGB image) captured by the camera 41.

The texture sensor 52 is configured by sensors that output various sensor data for texture analysis, such as a hardness sensor, a stress sensor, a water content sensor, and a temperature sensor. Hardness sensors, stress sensors, water content sensors and temperature sensors may be provided in cooking tools such as kitchen knives, frying pans or ovens. The sensor data measured by the texture sensor 52 is output to the data processing device 11.

The environment sensor 53 is a sensor that measures a cooking environment, which is an environment such as a space of a kitchen where a cook performs cooking. In the example of fig. 18, the environment sensor 53 includes an image pickup device 61, a temperature/humidity sensor 62, and an illuminance sensor 63.

The camera 61 outputs the captured image of the cooking space to the data processing device 11. By analyzing the captured image of the cooking space, for example, the color (brightness, hue, and saturation) of the cooking space is measured.
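For example, brightness, hue, and saturation can be obtained by converting the captured image to HSV and averaging the channels; the OpenCV sketch below is one simple way to do this (averaging whole-image channels is an assumption, not a method the publication prescribes).

```python
import cv2

def cooking_space_color(image_path: str):
    """Average hue, saturation, and brightness (value) of a captured image."""
    bgr = cv2.imread(image_path)                # OpenCV loads images as BGR
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    hue = hsv[..., 0].mean()
    saturation = hsv[..., 1].mean()
    value = hsv[..., 2].mean()                  # brightness
    return hue, saturation, value
```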

The temperature/humidity sensor 62 measures the temperature and humidity of the chef-side space, and outputs information indicating the measurement result to the data processing device 11.

The illuminance sensor 63 measures the luminance of the cook-side space, and outputs information indicating the measurement result to the data processing device 11.

The color, temperature, and brightness of the space in which a dish is eaten affect how an individual perceives its flavor. For example, for the seasoning of the same dish, a lighter taste tends to be preferable at higher temperatures and a stronger taste at lower temperatures.

A cooking environment that may affect how an individual perceives a flavor may be measured while cooking and included in the recipe data as environmental information.

On the reproduction side, the environment of the room in which the person eats the dish, such as its color, temperature, and brightness, is adjusted to match the cooking environment represented by the environment information included in the recipe data.

Thus, how the person perceives the flavor when eating the reproduced dish can be brought close to how the chef perceived it when cooking.

Various other types of information that may affect how the flavor is perceived, such as the air pressure and noise in the cook-side space and the season and time of day during cooking, may also be measured by the environment sensor 53 and included in the recipe data as environment information.
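As a concrete illustration, the environment information attached to the recipe data might be organized as follows. This is a minimal sketch; the class and field names are hypothetical and not taken from the described system.

```python
# Hypothetical layout of the environment information carried in recipe data.
from dataclasses import dataclass

@dataclass
class EnvironmentInfo:
    color_hsv: tuple        # brightness, hue, saturation of the space
    temperature_c: float    # room temperature in degrees Celsius
    humidity_pct: float     # relative humidity
    illuminance_lx: float   # brightness measured by the illuminance sensor
    air_pressure_hpa: float # optional extras that may affect perception
    noise_db: float
    season: str
    time_of_day: str

env = EnvironmentInfo(
    color_hsv=(0.7, 0.1, 0.4), temperature_c=22.5, humidity_pct=45.0,
    illuminance_lx=320.0, air_pressure_hpa=1013.0, noise_db=38.0,
    season="spring", time_of_day="evening",
)
print(env)
```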

(2-2) configuration of data processing apparatus 11

Fig. 19 is a block diagram showing a configuration example of hardware of the data processing apparatus 11.

As shown in fig. 19, the data processing apparatus 11 is configured by a computer. A Central Processing Unit (CPU) 201, a Read Only Memory (ROM) 202, and a Random Access Memory (RAM) 203 are connected to one another by a bus 204.

Further, an input/output interface 205 is connected to the bus 204. An input unit 206 including a keyboard, a mouse, and the like, and an output unit 207 including a display, a speaker, and the like are connected to the input/output interface 205.

Further, a storage unit 208 including a hard disk, a nonvolatile memory, and the like, a communication unit 209 including a network interface, and the like, and a drive 210 for driving a removable medium 211 are connected to the input/output interface 205.

In the computer configured as described above, the CPU 201 loads a program stored in the storage unit 208 into the RAM 203 via the input/output interface 205 and the bus 204, for example, and executes the program, thereby performing a series of processes.

Fig. 20 is a block diagram showing a functional configuration example of the data processing apparatus 11.

At least a part of the functional units shown in fig. 20 is realized by the CPU 201 in fig. 19 executing a predetermined program.

As shown in fig. 20, the data processing unit 221 is implemented in the data processing apparatus 11. The data processing unit 221 includes a cooking operation information generating unit 231, a flavor information generating unit 232, a recipe data generating unit 233, an environment information generating unit 234, an attribute information generating unit 235, and a recipe data output unit 236.

The cooking operation information generating unit 231 includes a food material recognizing unit 251, a tool recognizing unit 252, and an operation recognizing unit 253.

The food material recognition unit 251 analyzes the image captured by the image pickup device 41, and recognizes the type of food material used by the cook for cooking. The food material recognition unit 251 is provided with identification information, for example, characteristic information used in recognizing various types of food materials.

The tool recognition unit 252 analyzes the image captured by the image pickup device 41 and recognizes the type of cooking tool used by the cook for cooking. The tool recognition unit 252 is provided with identification information used in recognizing various types of cooking tools.

The operation recognition unit 253 analyzes the image captured by the image pickup device 41, sensor data representing the measurement results of the sensors attached to the body of the cook, and the like, and recognizes the operation of the cook who is cooking.

Information indicating the recognition result by each unit of the cooking operation information generating unit 231 is supplied to the recipe data generating unit 233.

The flavor information generation unit 232 includes a taste measurement unit 261, an aroma measurement unit 262, a texture measurement unit 263, a body-sensory temperature measurement unit 264, a color measurement unit 265, and a subjective information generation unit 266.

The taste measurement unit 261 measures the taste of the food material by controlling the taste sensor 43, and acquires taste sensor data. The food materials to be measured include all the food materials handled by the chef, such as food materials before cooking, cooked food materials, and completed dishes.

The aroma measurement unit 262 measures the aroma of the food material by controlling the olfactory sensor 42, and acquires olfactory sensor data of the food material.

The texture measuring unit 263 measures the texture of the food material by analyzing the image captured by the image pickup device 41 and the measurement result of the texture sensor 52, and acquires texture sensor data of the food material.

The body-sensory temperature measurement unit 264 acquires body-sensory temperature sensor data indicating the body-sensory temperature of the food material measured by the temperature sensor.

The color measurement unit 265 recognizes the color of the food material by analyzing the image captured by the image pickup device 41, and acquires color sensor data representing the recognition result. In the case where the object to be color-recognized is a dish completed by laying out food materials, the color of each part in the entire dish is recognized.

The subjective information generation unit 266 generates subjective information based on the sensor data acquired by each of the taste measurement unit 261 to the color measurement unit 265. The subjective information generation unit 266 performs processing of converting objective data on the flavor represented by the sensor data into subjective data representing how the chef feels the flavor.

The subjective information generation unit 266 is provided with information used in generating the subjective information, such as the neural networks described with reference to fig. 6.

For example, the subjective information generating unit 266 inputs the taste sensor data acquired by the taste measuring unit 261 into the taste subjective information generation model to generate the taste subjective information of the food material.

Similarly, the subjective information generation unit 266 inputs the olfactory sensor data acquired by the aroma measurement unit 262 into the olfactory subjective information generation model to generate olfactory subjective information of the food material. The subjective information generation unit 266 inputs the texture sensor data acquired by the texture measurement unit 263 into the texture subjective information generation model to generate texture subjective information of the food material.

The subjective information generation unit 266 inputs the body-sensory temperature sensor data acquired by the body-sensory temperature measurement unit 264 into the body-sensory temperature subjective information generation model to generate body-sensory temperature subjective information of the food material. The subjective information generation unit 266 inputs the color sensor data acquired by the color measurement unit 265 into the color subjective information generation model to generate color subjective information of the food material.
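A minimal sketch of what one of these subjective information generation models could look like is given below, using the taste model as an example. The two-layer network, the layer sizes, and the random stand-in weights are assumptions; the document only states that models generated by deep learning convert objective sensor data into subjective values.

```python
# Sketch of a taste subjective information generation model in numpy.
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for trained weights (in practice, loaded from the trained model).
W1, b1 = rng.normal(size=(16, 7)), np.zeros(16)
W2, b2 = rng.normal(size=(7, 16)), np.zeros(7)

def taste_subjective(sensor_vec: np.ndarray) -> np.ndarray:
    """Map taste sensor data (sweetness, sourness, saltiness, bitterness,
    umami, pungency, astringency) to chef-style subjective taste values."""
    h = np.maximum(W1 @ sensor_vec + b1, 0.0)   # ReLU hidden layer
    return W2 @ h + b2

taste_sensor_data = np.array([0.3, 0.5, 0.2, 0.1, 0.6, 0.0, 0.1])
print(taste_subjective(taste_sensor_data))
```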

The sensor data acquired by each of the taste measuring unit 261 to the color measuring unit 265 and the subjective information generated by the subjective information generating unit 266 are supplied to the recipe data generating unit 233.

The recipe data generation unit 233 generates cooking operation information based on information supplied from each unit of the cooking operation information generation unit 231. That is, the recipe data generation unit 233 generates the food material information based on the recognition result of the food material recognition unit 251, and generates the operation information based on the recognition results of the tool recognition unit 252 and the operation recognition unit 253. The recipe data generation unit 233 generates cooking operation information including the food material information and the operation information.

Further, the recipe data generation unit 233 generates the flavor information based on the information supplied from each unit of the flavor information generation unit 232. That is, the recipe data generation unit 233 generates flavor sensor information based on the sensor data acquired by the taste measurement unit 261 to the color measurement unit 265, and generates flavor subjective information based on the subjective information generated by the subjective information generation unit 266. The recipe data generation unit 233 generates flavor information including flavor sensor information and flavor subjective information.

The recipe data generation unit 233 generates a cooking process data set by, for example, associating cooking operation information and flavor information for each cooking process of a chef. The recipe data generation unit 233 collects cooking process data sets related to cooking processes from a first cooking process to a last cooking process of a certain dish, thereby generating recipe data describing a plurality of cooking process data sets.
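The resulting recipe data can be pictured as a simple nested structure, sketched below; the class and field names are illustrative assumptions, since the actual data format is not specified here.

```python
# Hypothetical shape of recipe data: one data set per cooking process,
# each linking cooking operation information with flavor information.
from dataclasses import dataclass, field
from typing import Any, Dict, List

@dataclass
class CookingOperationInfo:
    food_materials: List[str]   # food material information
    tool: str                   # cooking tool used
    operations: List[str]       # recognized operations of the chef

@dataclass
class FlavorInfo:
    sensor: Dict[str, Any]      # taste/olfactory/texture/temperature/color data
    subjective: Dict[str, Any]  # outputs of the subjective information models

@dataclass
class CookingProcessDataSet:
    operation: CookingOperationInfo
    flavor: FlavorInfo

@dataclass
class RecipeData:
    processes: List[CookingProcessDataSet] = field(default_factory=list)
    environment: Dict[str, Any] = field(default_factory=dict)  # optional
    attributes: Dict[str, Any] = field(default_factory=dict)   # optional
```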

The recipe data generation unit 233 outputs the recipe data generated in this manner to the recipe data output unit 236. The recipe data output by the recipe data generation unit 233 includes the environment information generated by the environment information generation unit 234 and the attribute information generated by the attribute information generation unit 235 as appropriate.

The environmental information generating unit 234 generates environmental information representing a cooking environment based on the measurement result of the environment sensor 53. The environmental information generated by the environmental information generating unit 234 is output to the recipe data generating unit 233.

The attribute information generating unit 235 generates attribute information indicating the attributes of the chef. Attributes of the chef include, for example, the chef's age, gender, nationality, and living area. Information representing the physical condition of the chef may be included in the attribute information.

The age, sex, nationality and living area of the cook can affect how the cook feels the flavor. That is, the flavor subjective information included in the recipe data is considered to be influenced by the age, sex, nationality, living area, and the like of the cook.

On the reproduction side, in the case of performing processing using flavor subjective information included in the recipe data, the flavor subjective information is appropriately corrected according to a difference between the attribute of the chef represented by the attribute information and the attribute of the person eating the reproduced dish, and the processing is performed using the corrected flavor subjective information.

For example, assume that the chef is French, and that the person eating the reproduced dish is Japanese. In this case, how the chef feels the flavor represented by the flavor subjective information included in the recipe data is how a French person feels the flavor, which differs from how a Japanese person feels it.

Based on information indicating how a Japanese person feels what a French person feels, the flavor subjective information included in the recipe data is corrected so that the same feeling can be obtained when the Japanese person eats the dish. The information used when correcting the flavor subjective information is information that relates how a French person feels a flavor to how a Japanese person feels it, and is, for example, statistically generated and prepared in advance on the reproduction side.
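A toy sketch of such a correction step follows; the correction table relating the two nationalities' perceptions is assumed to have been statistically prepared in advance, and the factors are placeholders.

```python
# Illustrative attribute-based correction of flavor subjective information.
CORRECTION = {
    ("french", "japanese"): {"saltiness": 0.9, "sourness": 1.1},
}

def correct_subjective(subjective: dict, chef_nat: str, eater_nat: str) -> dict:
    # Scale each subjective value by a statistically prepared factor;
    # unknown attribute pairs fall back to no correction.
    factors = CORRECTION.get((chef_nat, eater_nat), {})
    return {k: v * factors.get(k, 1.0) for k, v in subjective.items()}

print(correct_subjective({"saltiness": 0.5, "sourness": 0.4, "umami": 0.7},
                         "french", "japanese"))
```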

Attributes such as the categories of dishes made by the chef (e.g., French, Japanese, Italian, or Spanish) may be included in the attribute information.

Further, attributes of the food materials and seasonings used for cooking may be included in the attribute information. The attributes of the food materials include the production area and variety. The attributes of the seasonings likewise include the production area and variety.

In this way, chef attribute information as attribute information representing attributes of a chef, food attribute information as attribute information representing attributes of dishes and food materials, and seasoning attribute information as attribute information representing attributes of seasonings among the food materials can be included in the recipe data.

The recipe data output unit 236 controls the communication unit 209 (fig. 19), and outputs the recipe data generated by the recipe data generation unit 233. The recipe data output from the recipe data output unit 236 is supplied to the control device 12 or the recipe data management server 21 via the network 13.

(3) Configuration on reproduction side

(3-1) configuration of cooking robot 1

Appearance of the cooking robot 1

Fig. 21 is a perspective view showing an appearance of the cooking robot 1.

As shown in fig. 21, the cooking robot 1 is a kitchen-type robot having a rectangular parallelepiped case 311. Various configurations are provided inside the housing 311 as a main body of the cooking robot 1.

The cooking assistance system 312 is provided at the rear side of the housing 311 so as to stand upright from the upper surface of the housing 311. Spaces divided by thin plate-like members in the cooking assistance system 312, such as a refrigerator, an oven, and a storage compartment, have functions for assisting the cooking performed by the cooking arms 321-1 to 321-4.

A rail is provided on the top plate 311A in the lengthwise direction, and the cooking arms 321-1 to 321-4 are provided on the rail. The cooking arms 321-1 to 321-4 can change their positions along the rail, which serves as a moving mechanism.

The cooking arms 321-1 to 321-4 are robot arms configured by connecting cylindrical members with joints. Various operations related to cooking are performed by the cooking arms 321-1 to 321-4.

The space above the top plate 311A is a cooking space where the cooking arms 321-1 to 321-4 perform cooking.

Although four cooking arms are shown in fig. 21, the number of cooking arms is not limited to four. Hereinafter, the cooking arms 321-1 to 321-4 are collectively referred to as the cooking arms 321 without distinguishing the cooking arms 321-1 to 321-4.

Fig. 22 is an enlarged view showing the state of the cooking arms 321.

As shown in fig. 22, attachments having various cooking functions are attached to the distal ends of the cooking arms 321. As the attachments for the cooking arms 321, various attachments are prepared, such as an attachment having a manipulator function (hand function) for gripping food materials, tableware, and the like, and an attachment having a knife function for cutting food materials.

In the example of fig. 22, a knife attachment 331-1 as an attachment having a knife function is attached to the cooking arm 321-1. A piece of meat placed on the top plate 311A is cut using the knife attachment 331-1.

A spindle attachment 331-2, which is an attachment for fixing a food material or rotating a food material, is attached to the cooking arm 321-2.

A peeler attachment 331-3, which is an attachment having a peeler function of peeling food materials, is attached to the cooking arm 321-3.

A potato held up by the cooking arm 321-2 using the spindle attachment 331-2 is peeled by the cooking arm 321-3 using the peeler attachment 331-3. As described above, the plurality of cooking arms 321 can perform work in cooperation with one another.

A manipulator attachment 331-4, which is an attachment having the manipulator function, is attached to the cooking arm 321-4. A frying pot containing chicken is transported to the space of the cooking assistance system 312 having the oven function by using the manipulator attachment 331-4.

Cooking by the cooking arm 321 is performed by appropriately replacing the accessories according to the contents of the work. The replacement of the accessory is performed automatically by the cooking robot 1, for example.

The same type of attachment may also be attached to a plurality of cooking arms 321; for example, a manipulator attachment 331-4 may be attached to each of the four cooking arms 321.

Cooking by the cooking robot 1 is performed not only using the above-described attachments prepared as tools for the cooking arms but also, as appropriate, using the same tools as those used by a person for cooking. For example, a knife used by a person is gripped by the manipulator attachment 331-4, and cooking such as cutting food materials is performed using the knife.

Configuration of the cooking arm

Fig. 23 is a view showing an appearance of the cooking arm 321.

As shown in fig. 23, the cooking arm 321 is generally configured by connecting thin cylindrical members with hinge portions serving as joint portions. Each hinge portion is provided with a motor or the like that generates a force for driving each member.

As the cylindrical members, an attaching/detaching member 351, a relay member 353, and a base member 355 are provided in order from the distal end. The attaching/detaching member 351 is a member having a length of about 1/5 of the length of the relay member 353. The combined length of the attaching/detaching member 351 and the relay member 353 is substantially the same as the length of the base member 355.

The attaching/detaching member 351 and the relay member 353 are connected with a hinge portion 352, and the relay member 353 and the base member 355 are connected with a hinge portion 354. The hinge portion 352 and the hinge portion 354 are provided at both ends of the relay member 353.

In this example, the cooking arm 321 is configured by three cylindrical members. However, the cooking arm 321 may be configured by four or more cylindrical members. In this case, a plurality of relay members 353 are provided.

An attaching/detaching portion 351A to attach or detach an accessory is provided at a distal end of the attaching/detaching member 351. The attaching/detaching member 351 includes an attaching/detaching portion 351A that attaches or detaches various accessories, and functions as a cooking function arm unit that performs cooking by operating the accessories.

At the rear end of the base member 355, an attaching/detaching portion 356 to be mounted to a rail is provided. The base member 355 serves as a movement function arm unit that realizes movement of the cooking arm 321.

Fig. 24 is a view showing an example of the movable range of each part of the cooking arm 321.

As shown by ellipse #1, the attaching/detaching member 351 is rotatable about the central axis of its circular cross section. The small flat circle shown at the center of ellipse #1 indicates the direction of the rotation axis indicated by the alternate long and short dash line.

As shown by circle #2, the attaching/detaching member 351 is rotatable about an axis passing through the fitting portion 351B with the hinge portion 352. Further, the relay member 353 is rotatable about an axis passing through the fitting portion 353A with the hinge portion 352.

The two small circles shown within circle #2 indicate the directions of the respective rotational axes (in the direction perpendicular to the paper). The movable range of the attaching/detaching member 351 centered on the axis passing through the fitting portion 351B and the movable range of the relay member 353 centered on the axis passing through the fitting portion 353A are, for example, ranges of 90 degrees.

The relay member 353 is configured to be separable into a member 353-1 on the distal end side and a member 353-2 on the rear end side. As shown by ellipse #3, the relay member 353 is rotatable about the central axis of the circular cross section at a connection portion 353B between the member 353-1 and the member 353-2.

The other movable portions have substantially similar movable ranges.

In other words, as indicated by circle #4, the relay member 353 is rotatable about an axis passing through the fitting portion 353C with the hinge portion 354. Further, the base member 355 is rotatable about an axis passing through the fitting portion 355A with the hinge portion 354.

The base member 355 is configured to be separable into a member 355-1 on the distal end side and a member 355-2 on the rear end side. As shown by ellipse #5, the base member 355 is rotatable about the central axis of the circular cross section at a connection portion 355B between the member 355-1 and the member 355-2.

As shown by circle #6, the base member 355 is rotatable about an axis passing through the fitting portion 355C with the attaching/detaching portion 356.

As shown by an ellipse #7, the attaching/detaching portion 356 is mounted to the rail so as to become rotatable about the center axis of the circular cross section.

Thus, the attaching/detaching member 351 having the attaching/detaching portion 351A at its distal end, the relay member 353 connecting the attaching/detaching member 351 and the base member 355, and the base member 355 having the attaching/detaching portion 356 connected at its rear end are rotatably connected to one another by the hinge portions. The movement of each movable portion is controlled by the controller in the cooking robot 1 according to instruction commands.

Fig. 25 is a view showing an example of connection between the cooking arm and the controller.

As shown in fig. 25, the cooking arm 321 and the controller 361 are connected via a wire in a space 311B formed inside the housing 311. In the example in fig. 25, the cooking arms 321-1 to 321-4 and the controller 361 are connected via wires 362-1 to 362-4, respectively. The wires 362-1 to 362-4 having flexibility are appropriately bent depending on the positions of the cooking arms 321-1 to 321-4.

As described above, the cooking robot 1 is a robot capable of performing various operations related to cooking by driving the cooking arm 321.

Configuration of the cooking robot 1 and its surroundings

Fig. 26 is a block diagram showing an example of the configuration of the cooking robot 1 and its surroundings.

The cooking robot 1 is configured by connecting each part to the controller 361. In the configuration shown in fig. 26, the same configurations as those described above are denoted by the same reference numerals. Duplicate description will be appropriately omitted.

In addition to the cooking arm 321, the image pickup device 401, the olfactory sensor 402, the taste sensor 403, the infrared sensor 404, the texture sensor 405, the environment sensor 406, and the communication unit 407 are also connected to the controller 361.

Although not shown in fig. 21 and the like, the same sensors as those provided on the cook side are provided at predetermined positions of the cooking robot 1 itself or around the cooking robot 1. The image pickup device 401, the olfactory sensor 402, the taste sensor 403, the infrared sensor 404, the texture sensor 405, and the environment sensor 406 have functions similar to those of the image pickup device 41, the olfactory sensor 42, the taste sensor 43, the infrared sensor 51, the texture sensor 52, and the environment sensor 53 on the cook side.

The controller 361 is configured by a computer having a CPU, ROM, RAM, flash memory, and the like. The controller 361 executes a predetermined program by the CPU to control the overall operation of the cooking robot 1.

In the controller 361, a predetermined program is executed to realize the instruction command acquisition unit 421 and the arm control unit 422.

The instruction command acquisition unit 421 acquires an instruction command transmitted from the control device 12 and received by the communication unit 407. The instruction command acquired by the instruction command acquisition unit 421 is supplied to the arm control unit 422.

The arm control unit 422 controls the operation of the cooking arm 321 according to the instruction command acquired by the instruction command acquisition unit 421.
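The flow from command reception to arm control might be sketched as follows; the JSON command format and the method names are illustrative assumptions, not the actual protocol.

```python
# Illustrative controller-side flow: the instruction command acquisition
# unit receives a command and hands it to the arm control unit.
import json

class ArmControlUnit:
    def execute(self, command: dict) -> None:
        # In the real robot this would drive the motors at each joint.
        print(f"arm {command['arm_id']}: {command['action']}")

class InstructionCommandAcquisitionUnit:
    def __init__(self, arm_control: ArmControlUnit):
        self.arm_control = arm_control

    def on_receive(self, payload: bytes) -> None:
        command = json.loads(payload)  # received via the communication unit
        self.arm_control.execute(command)

unit = InstructionCommandAcquisitionUnit(ArmControlUnit())
unit.on_receive(b'{"arm_id": 1, "action": "cut", "target": "carrot"}')
```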

The image pickup device 401 captures the state of the cooking arms 321 performing a cooking operation, the state of the ingredients to be cooked, and the state on the top plate 311A of the cooking robot 1, and outputs the images obtained by the capturing to the controller 361. The image pickup device 401 is provided at various positions, such as the front of the cooking assistance system 312 and the distal end of the cooking arm 321.

The olfactory sensor 402 measures the aroma of the food material and transmits olfactory sensor data to the controller 361. The olfactory sensor 402 is provided at various positions, such as the front of the cooking assistance system 312 and the distal end of the cooking arm 321.

The taste sensor 403 measures the taste of the food material and transmits taste sensor data to the controller 361. Also on the reproduction side, a taste sensor 403, for example of the artificial lipid membrane type, is provided.

Attachments having the functions of the olfactory sensor 402, the taste sensor 403, and the like may be prepared and attached to the cooking arm 321 for use at the time of measurement.

The infrared sensor 404 outputs IR light and generates an IR image. The IR image generated by the infrared sensor 404 is output to the controller 361. Various analyses, such as analysis of the operation of the cooking robot 1 and of the food materials, may be performed based on the IR image captured by the infrared sensor 404 instead of the image (RGB image) captured by the image pickup device 401.

The texture sensor 405 is configured by sensors (e.g., a hardness sensor, a stress sensor, a water content sensor, and a temperature sensor) that output various sensor data for texture analysis. Hardness sensors, stress sensors, water content sensors, and temperature sensors may be provided in accessories mounted on the cooking arm 321 or a cooking tool, such as a kitchen knife, a frying pan, or an oven. The sensor data measured by the texture sensor 405 is output to the controller 361.

The environment sensor 406 is a sensor that measures a dining environment, which is an environment of a space such as a restaurant in which a dish reproduced by the cooking robot 1 is consumed. In the example of fig. 26, the environment sensor 406 includes an image pickup device 441, a temperature/humidity sensor 442, and an illuminance sensor 443. The environment of the reproduction space where the cooking robot 1 performs cooking may be measured by the environment sensor 406.

The image pickup device 441 outputs a captured image of the dining space to the controller 361. For example, the color (brightness, hue, and saturation) of the dining space is measured by analyzing the captured image of the dining space.

The temperature/humidity sensor 442 measures the temperature and humidity of the dining space, and outputs information indicating the measurement result to the controller 361.

The illuminance sensor 443 measures the brightness of the dining space, and outputs information indicating the measurement result to the controller 361.

The communication unit 407 is a wireless communication module such as a wireless LAN module or a portable communication module compatible with Long Term Evolution (LTE). The communication unit 407 communicates with the control apparatus 12 or an external apparatus such as the recipe data management server 21 on the internet.

Further, the communication unit 407 communicates with a portable terminal, such as a smartphone or a tablet terminal, used by the user. The user is a person who eats the dish reproduced by the cooking robot 1. The user's operations on the cooking robot 1, such as selection of a dish, may be input through operations on the portable terminal.

As shown in fig. 26, the cooking arm 321 is provided with a motor 431 and a sensor 432.

A motor 431 is provided at each joint of the cooking arm 321. The motor 431 performs a rotating operation around the shaft under the control of the arm control unit 422. An encoder for measuring the amount of rotation of the motor 431, a driver for adaptively controlling the rotation of the motor 431 based on the measurement result of the encoder, and the like are also provided at each joint.

The sensor 432 is configured by, for example, a gyro sensor, an acceleration sensor, a touch sensor, or the like. The sensor 432 measures an angular velocity, an acceleration, and the like of each joint during the operation of the cooking arm 321, and outputs information indicating the measurement result to the controller 361. Sensor data indicating the measurement result of the sensor 432 is also suitably transmitted from the cooking robot 1 to the control device 12.

Information on the specifications of the cooking robot 1, such as the number of cooking arms 321, is provided from the cooking robot 1 to the control device 12 at a predetermined timing. In the control device 12, the operation is planned according to the specification of the cooking robot 1. The instruction command generated by the control device 12 corresponds to the specification of the cooking robot 1.

(3-2) configuration of control device 12

Similarly to the data processing device 11, the control device 12 that controls the operation of the cooking robot 1 is configured by a computer as shown in fig. 19. Hereinafter, the configuration of the data processing device 11 shown in fig. 19 will be referred to as appropriate in describing the configuration of the control device 12.

Fig. 27 is a block diagram showing a functional configuration example of the control device 12.

At least a part of the functional units shown in fig. 27 is realized by the CPU 201 (fig. 19) of the control device 12 executing a predetermined program.

As shown in fig. 27, in the control device 12, a command generation unit 501 is implemented. The command generating unit 501 includes a recipe data acquiring unit 511, a recipe data analyzing unit 512, a robot state estimating unit 513, a flavor information processing unit 514, a control unit 515, and a command output unit 516.

The recipe data acquisition unit 511 controls the communication unit 209, and acquires the recipe data by receiving the recipe data transmitted from the data processing apparatus 11 or by communicating with the recipe data management server 21. The recipe data acquired by the recipe data acquisition unit 511 is, for example, recipe data of a dish selected by the user.

A database of recipe data may be provided in the storage unit 208. In this case, the recipe data is acquired from the database provided in the storage unit 208. The recipe data acquired by the recipe data acquisition unit 511 is supplied to the recipe data analysis unit 512.

The recipe data analysis unit 512 analyzes the recipe data acquired by the recipe data acquisition unit 511. In the case where a certain cooking process is to be performed, the recipe data analysis unit 512 analyzes a cooking process data set related to the cooking process, and extracts cooking operation information and flavor information. The cooking operation information extracted from the cooking process data set is supplied to the control unit 515, and the flavor information is supplied to the flavor information processing unit 514.

In the case where the recipe data includes attribute information and environmental information, these pieces of information are also extracted by the recipe data analysis unit 512 and supplied to the flavor information processing unit 514.

The robot state estimation unit 513 controls the communication unit 209 to receive the images and the sensor data transmitted from the cooking robot 1. The images captured by the image pickup devices of the cooking robot 1 and the sensor data measured by the sensors provided at predetermined positions of the cooking robot 1 are transmitted from the cooking robot 1 at a predetermined cycle. The images captured by the image pickup devices of the cooking robot 1 show the situation around the cooking robot 1.

The robot state estimation unit 513 estimates states around the cooking robot 1, such as the state of the cooking arm 321 and the state of the food material, by analyzing the image and the sensor data transmitted from the cooking robot 1. Information indicating the state around the cooking robot 1 estimated by the robot state estimating unit 513 is supplied to the control unit 515.

The flavor information processing unit 514 controls the operation of the cooking robot 1 in cooperation with the control unit 515 based on the flavor information supplied from the recipe data analysis unit 512. The operation of the cooking robot 1 controlled by the flavor information processing unit 514 is, for example, an operation related to adjustment of the flavor of the food material.

For example, the flavor information processing unit 514 controls the operation of the cooking robot 1 so that the flavor of the food material cooked by the cooking robot 1 becomes the same as the flavor indicated by the flavor sensor information. Details of the control performed by the flavor information processing unit 514 will be described with reference to fig. 28.

The control unit 515 controls the operation of the cooking robot 1 by generating and transmitting an instruction command from the command output unit 516. The operation of the cooking robot 1 is controlled by the control unit 515 based on the cooking operation information supplied from the recipe data analysis unit 512 or a request made by the flavor information processing unit 514.

For example, the control unit 515 identifies the food material to be used in the cooking process to be performed based on the food material information included in the cooking operation information. Further, the control unit 515 identifies a cooking tool used in the cooking process and an operation to be performed by the cooking arm 321 based on operation information included in the cooking operation information.

The control unit 515 sets a state in which the food materials are prepared as a target state, and sets an operation sequence from the current state of the cooking robot 1 to the target state. The control unit 515 generates instruction commands for executing the respective operations constituting the operation sequence, and outputs the instruction commands to the command output unit 516.

In the cooking robot 1, the cooking arm 321 is controlled according to the instruction command generated by the control unit 515, and the food material is prepared. Information indicating the state of the cooking robot 1 at each timing including the state of the cooking arm 321 is transmitted from the cooking robot 1 to the control device 12.

Further, in the case where the food materials are ready, the control unit 515 sets a state in which cooking using the prepared food materials (cooking in the one cooking process to be performed) is completed as a target state, and sets an operation sequence from the current state to the target state. The control unit 515 generates instruction commands for executing the respective operations constituting the operation sequence, and outputs the instruction commands to the command output unit 516.

In the cooking robot 1, the cooking arm 321 is controlled according to the instruction command generated by the control unit 515, and cooking using the food material is performed.
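The pattern repeated in these steps (set a target state, plan an operation sequence from the current state, and emit one or more instruction commands per operation) could be sketched as follows. The planner here is a stub, since the document does not specify the planning algorithm, and the command format is an assumption.

```python
# Sketch of the control unit's plan-to-target-state pattern.
from typing import List

def plan_operations(current_state: dict, target_state: dict) -> List[str]:
    # Placeholder planner; the real control unit plans according to the
    # specifications of the cooking robot 1.
    return [f"step toward {k}={v}" for k, v in target_state.items()
            if current_state.get(k) != v]

def run_to_target(current_state: dict, target_state: dict, send) -> None:
    for op in plan_operations(current_state, target_state):
        send({"instruction": op})  # command output unit -> cooking robot

run_to_target({"carrot": "whole"}, {"carrot": "sliced"},
              send=lambda cmd: print("send:", cmd))
```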

In the case where cooking using food materials is ended, the control unit 515 generates an instruction command for measuring flavor, and outputs the instruction command to the command output unit 516.

In the cooking robot 1, the cooking arm 321 is controlled according to the instruction command generated by the control unit 515, and the flavor of the food material is measured using the image pickup device 401, the olfactory sensor 402, the taste sensor 403, the infrared sensor 404, and the texture sensor 405 as appropriate. Information indicating the measurement results of the flavor is transmitted from the cooking robot 1 to the control device 12.

The flavor information processing unit 514 plans a method of adjusting the flavor and the like, and requests the control unit 515 to perform an operation for adjusting the flavor.

In the case where an operation for adjusting the flavor is requested, the control unit 515 sets a state in which the operation has been completed as a target state, and sets an operation sequence from the current state to the target state. The control unit 515 outputs instruction commands for executing the respective operations constituting the operation sequence to the command output unit 516.

In the cooking robot 1, the cooking arm 321 is controlled in accordance with the instruction command generated by the control unit 515, and an operation for adjusting the flavor is performed.

The operation of the cooking robot 1 is controlled by the control unit 515 using, for example, the instruction commands described above. The control unit 515 thus functions as a generation unit that generates instruction commands.

Note that the instruction command generated by the control unit 515 may be a command for giving an instruction to execute an entire action for causing a certain state transition, or may be a command for giving an instruction to execute a part of an action. In other words, one action may be performed according to one instruction command, or may be performed according to a plurality of instruction commands.

The command output unit 516 controls the communication unit 209, and transmits an instruction command generated by the control unit 515 to the cooking robot 1.

Fig. 28 is a block diagram showing a configuration example of the flavor information processing unit 514.

As shown in fig. 28, the flavor information processing unit 514 includes a flavor measuring unit 521, a flavor adjusting unit 522, a subjective information analyzing unit 523, an attribute information analyzing unit 524, and an environmental information analyzing unit 525.

The flavor measurement unit 521 includes a taste measurement unit 541, an aroma measurement unit 542, a texture measurement unit 543, a body-sensory temperature measurement unit 544, and a color measurement unit 545.

The taste measurement unit 541 acquires taste sensor data transmitted from the cooking robot 1 in response to the measurement of the flavor. The taste sensor data acquired by taste measurement unit 541 is measured by taste sensor 403 (fig. 26). In the cooking robot 1, the flavor of the food material is measured at a predetermined timing (e.g., a timing when a cooking operation of a certain cooking process is completed).

The aroma measurement unit 542 acquires smell sensor data transmitted from the cooking robot 1 in response to the measurement of the flavor. The olfactory sensor data acquired by the aroma measurement unit 542 is measured by the olfactory sensor 402.

The texture measuring unit 543 acquires texture sensor data transmitted from the cooking robot 1 in response to the measurement of the flavor. The texture sensor data acquired by the texture measurement unit 543 is measured by the texture sensor 405.

The body-sensory-temperature measurement unit 544 acquires body-sensory-temperature sensor data transmitted from the cooking robot 1 in response to the measurement of the flavor. The body-sensory temperature sensor data acquired by the body-sensory temperature measurement unit 544 is measured by a temperature sensor provided at a predetermined position (for example, in the taste sensor 403) of the cooking robot 1.

The color measurement unit 545 acquires color sensor data transmitted from the cooking robot 1 in response to the measurement of the flavor. The color sensor data acquired by the color measurement unit 545 is recognized by analyzing the image captured by the camera 401 of the cooking robot 1.

The sensor data acquired by each section of the flavor measurement unit 521 is provided to the flavor adjustment unit 522.

The flavor adjustment unit 522 includes a taste adjustment unit 551, an aroma adjustment unit 552, a texture adjustment unit 553, a body-sensory temperature adjustment unit 554, and a color adjustment unit 555. The flavor information supplied from the recipe data analysis unit 512 is input to the flavor adjustment unit 522.

The taste adjustment unit 551 compares taste sensor data configuring flavor sensor information included in the recipe data with taste sensor data acquired by the taste measurement unit 541, and determines whether the two kinds of taste sensor data match. Here, in the case where the same operation as the cooking operation of the cook is performed by the cooking robot 1, it is determined whether the taste of the food material obtained by the cooking operation of the cooking robot 1 matches the taste of the food material obtained by the cooking operation of the cook.

In the case where it is determined that taste sensor data configuring flavor sensor information included in recipe data matches taste sensor data acquired by the taste measurement unit 541, the taste adjustment unit 551 determines that no adjustment of the taste is necessary.

On the other hand, in the case where it is determined that taste sensor data configuring flavor sensor information included in the recipe data does not match taste sensor data acquired by the taste measurement unit 541, the taste adjustment unit 551 plans a method of adjusting the taste, and requests the control unit 515 to perform an operation for adjusting the taste.

The control unit 515 is requested to perform operations such as adding salt when the saltiness is insufficient, or squeezing lemon juice when the sourness is insufficient.

Similarly, the other processing units of the flavor adjustment unit 522 each determine whether the corresponding flavor element of the food material obtained by the cooking operation of the cooking robot 1 matches that of the food material obtained by the cooking operation of the chef, and adjust the flavor as needed.
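The compare-and-request pattern shared by these adjustment units can be summarized in the following minimal sketch; the tolerance value and the operation names are assumptions for illustration, not taken from the recipe data format.

```python
# Shared per-element adjustment logic: compare recipe-side sensor data with
# measured data and, on a mismatch, request an adjustment operation.
def adjust_element(recipe_value: float, measured: float,
                   low_action: str, high_action: str,
                   request, tol: float = 0.05) -> bool:
    """Return True when the element already matches the recipe data."""
    diff = recipe_value - measured
    if abs(diff) <= tol:
        return True  # no adjustment necessary
    # Value too low -> e.g. add salt; too high -> e.g. dilute.
    request(low_action if diff > 0 else high_action)
    return False

matched = adjust_element(0.6, 0.4, "add salt", "dilute",
                         request=lambda op: print("request:", op))
print("matched:", matched)
```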

That is, the aroma adjustment unit 552 compares the olfactory sensor data configuring the flavor sensor information included in the recipe data with the olfactory sensor data acquired by the aroma measurement unit 542, and determines whether the two types of olfactory sensor data match. Here, it is determined whether the aroma of the food material obtained by the cooking operation of the cooking robot 1 matches the aroma of the food material obtained by the cooking operation of the chef.

In a case where it is determined that the olfactory sensor data configuring the flavor sensor information included in the recipe data matches the olfactory sensor data acquired by the aroma measuring unit 542, the aroma adjusting unit 552 determines that the aroma does not need to be adjusted.

On the other hand, in a case where it is determined that the olfactory sensor data configuring the flavor sensor information included in the recipe data does not match the olfactory sensor data acquired by the aroma measuring unit 542, the aroma adjusting unit 552 plans a method of adjusting the aroma, and requests the control unit 515 to perform an operation for adjusting the aroma.

The control unit 515 is requested to perform operations such as squeezing lemon juice when the acidic (green) lemon aroma is weak, or shredding and adding herbs when the citrus aroma is weak.

The texture adjusting unit 553 compares texture sensor data configuring the flavor sensor information included in the recipe data with the texture sensor data acquired by the texture measuring unit 543, and determines whether the two texture sensor data match. Here, it is determined whether the texture of the food material obtained by the cooking operation of the cooking robot 1 matches the texture of the food material obtained by the cooking operation of the cook.

The texture adjusting unit 553 determines that the texture does not need to be adjusted in the case where the texture sensor data configuring the flavor sensor information included in the recipe data is determined to match the texture sensor data acquired by the texture measuring unit 543.

On the other hand, in a case where it is determined that the texture sensor data configuring the flavor sensor information included in the recipe data does not match the texture sensor data acquired by the texture measurement unit 543, the texture adjustment unit 553 plans a method of adjusting texture, and requests the control unit 515 to perform an operation for adjusting texture.

The control unit 515 is requested to perform operations such as pounding the food material to soften it when the food material is hard, or increasing the cooking time of the food material.

The body-sensory temperature adjustment unit 554 compares the body-sensory temperature sensor data configuring the flavor sensor information included in the recipe data with the body-sensory temperature sensor data acquired by the body-sensory temperature measurement unit 544, and determines whether the two kinds of body-sensory temperature sensor data match. Here, it is determined whether the body-sensory temperature of the food material obtained by the cooking operation of the cooking robot 1 matches the body-sensory temperature of the food material obtained by the cooking operation of the cook.

In the case where it is determined that the body-sensory temperature sensor data configuring the flavor sensor information included in the recipe data matches the body-sensory temperature sensor data acquired by the body-sensory temperature measurement unit 544, the body-sensory temperature adjustment unit 554 determines that the body-sensory temperature does not need to be adjusted.

On the other hand, in a case where it is determined that the body-sensory temperature sensor data configuring the flavor sensor information included in the recipe data does not match the body-sensory temperature sensor data acquired by the body-sensory temperature measurement unit 544, the body-sensory temperature adjustment unit 554 plans a method of adjusting the body-sensory temperature, and requests the control unit 515 to perform an operation for adjusting the body-sensory temperature.

The control unit 515 is requested to perform operations such as heating the food material in an oven when the body-sensory temperature of the food material is low, or cooling the food material when the body-sensory temperature of the food material is high.

The color adjustment unit 555 compares color sensor data configuring the flavor sensor information included in the recipe data with the color sensor data acquired by the color measurement unit 545, and determines whether the two color sensor data match. Here, it is determined whether the color of the food material obtained by the cooking operation of the cooking robot 1 matches the color of the food material obtained by the cooking operation of the chef.

In the case where it is determined that the color sensor data configuring the flavor sensor information included in the recipe data matches the color sensor data acquired by the color measurement unit 545, the color adjustment unit 555 determines that the color does not need to be adjusted.

On the other hand, in the case where it is determined that the color sensor data configuring the flavor sensor information included in the recipe data does not match the color sensor data acquired by the color measurement unit 545, the color adjustment unit 555 plans a method of adjusting the color, and requests the control unit 515 to perform an operation for adjusting the color.

In the case where the cooked food materials are being laid out and the layout by the cooking robot 1 differs from the layout by the chef, the control unit 515 is requested to perform operations such as moving the food materials so that the layout approaches that of the chef.

The subjective-information analyzing unit 523 analyzes the flavor subjective information included in the flavor information, and reflects how the chef feels the flavor represented by the flavor subjective information in the flavor adjustment performed by the flavor adjusting unit 522.

The attribute information analysis unit 524 analyzes the attribute information included in the recipe data, and reflects the attributes of the chef in the flavor adjustment performed by the flavor adjustment unit 522.

The environmental information analysis unit 525 analyzes environmental information included in the recipe data, and reflects a difference between the cooking environment measured by the environment sensor 406 and the dining environment in the flavor adjustment performed by the flavor adjustment unit 522.

<Operation of cooking system>

Here, the operation of the cooking system having the above configuration will be described.

(1) Operation on the cook side

First, the recipe data generation processing of the data processing apparatus 11 will be described with reference to the flowchart of fig. 29.

When the food materials and the cooking tools are ready and the cook starts cooking, the process in fig. 29 is started. Capturing by the image pickup device 41, generation of IR images by the infrared sensor 51, sensing by the sensors attached to the cook's body, and the like are also started.

In step S1, the food material recognition unit 251 of fig. 20 analyzes the image captured by the image pickup device 41, and recognizes the food material to be used by the chef.

In step S2, the operation recognition unit 253 analyzes the image captured by the image pickup device 41, sensor data representing the measurement results of the sensors attached to the cook's body, and the like, and recognizes the cooking operation of the cook.

In step S3, the recipe data generation unit 233 generates cooking operation information based on the food material information generated according to the recognition result of the food material recognition unit 251 and the operation information generated according to the recognition result of the operation recognition unit 253.

In step S4, the recipe data generation unit 233 determines whether one cooking process has been completed, and in the case where it is determined that one cooking process has not been completed, the process returns to step S1, and the above-described processing is repeated.

In the case where it is determined in step S4 that one cooking process has been completed, the process proceeds to step S5.

The flavor information generation process is executed in step S5. The flavor information is generated by a flavor information generation process. Details of the flavor information generation process will be described below with reference to the flowchart of fig. 30.

In step S6, the recipe data generation unit 233 generates a cooking process data set by associating the cooking operation information with the flavor information.

In step S7, the recipe data generation unit 233 determines whether all the cooking processes have been completed, and in the case where it is determined that all the cooking processes have not been completed, the process returns to step S1, and the above-described process is repeated. Similar processing is repeated for the next cooking process.

In the case where it is determined in step S7 that all cooking processes have been completed, the process proceeds to step S8.

In step S8, the recipe data generation unit 233 generates recipe data including all the cooking process data sets.
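The overall flow of steps S1 to S8 might be summarized in code roughly as follows; the recognition and measurement helpers are stubs standing in for the units of fig. 20, and the data layout is a simplification.

```python
# Sketch of the recipe data generation flow of fig. 29 (steps S1 to S8).
def generate_recipe_data(cooking_processes):
    data_sets = []
    for process in cooking_processes:                  # until all processes end (S7)
        operations = []
        for frame in process:                          # repeat S1-S3 in a process
            food = recognize_food_material(frame)      # S1
            action = recognize_operation(frame)        # S2
            operations.append((food, action))          # S3: cooking operation info
        flavor = generate_flavor_information(process)  # S5 (fig. 30)
        data_sets.append({"operation": operations, "flavor": flavor})  # S6
    return {"processes": data_sets}                    # S8: recipe data

# Placeholders for the recognition and measurement units.
recognize_food_material = lambda f: f.get("food")
recognize_operation = lambda f: f.get("action")
generate_flavor_information = lambda p: {"taste": "measured after the process"}

print(generate_recipe_data([[{"food": "carrot", "action": "cut"}]]))
```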

Next, the flavor information generation process performed in step S5 of fig. 29 will be described with reference to the flowchart of fig. 30.

In step S11, the taste measuring unit 261 measures the taste of the food material by controlling the taste sensor 43.

In step S12, the aroma measuring unit 262 measures the aroma of the food material by controlling the olfactory sensor 42.

In step S13, the texture measuring unit 263 measures the texture of the food material based on the image captured by the image pickup device 41 and the measurement result of the texture sensor 52.

In step S14, the body-sensory temperature measurement unit 264 acquires the body-sensory temperature of the food material measured by the temperature sensor.

In step S15, the color measurement unit 265 measures the color of the food material based on the image captured by the image pickup device 41.

In step S16, the subjective information generating unit 266 generates flavor subjective information based on the sensor data acquired by each of the taste measuring unit 261 to the color measuring unit 265.

In step S17, the recipe data generation unit 233 generates flavor information based on the flavor sensor information including the sensor data measured by the taste measurement unit 261 to the color measurement unit 265 and the flavor subjective information generated by the subjective information generation unit 266.

After the flavor information is generated, the process returns to step S5 in fig. 29, and the processes of step S5 and subsequent steps are performed.

(2) Operation on the reproduction side

The dish reproduction process of the control device 12 will be described with reference to the flowchart of fig. 31.

In step S31, the recipe data acquisition unit 511 of fig. 27 acquires the recipe data transmitted from the data processing apparatus 11. The recipe data acquired by the recipe data acquisition unit 511 is analyzed by the recipe data analysis unit 512, and cooking operation information and flavor information are extracted. The cooking operation information is supplied to the control unit 515, and the flavor information is supplied to the flavor information processing unit 514.

In step S32, the control unit 515 selects one cooking process as the cooking process to be performed. The cooking process is selected in order from a cooking process data set relating to the first cooking process.

In step S33, the control unit 515 determines whether the cooking process to be performed is a cooking process of laying out the cooked food materials. In a case where it is determined in step S33 that the cooking process to be performed is not a cooking process of laying out the cooked food materials, the process proceeds to step S34.

In step S34, the control unit 515 prepares a food material for use in the cooking process to be performed, based on the description of the food material information included in the cooking operation information.

In step S35, the control unit 515 generates an instruction command based on the description of the operation information included in the cooking operation information, and transmits the instruction command to the cooking robot 1 to cause the cooking arm 321 to perform a cooking operation.

The flavor measurement process is performed in step S36. By the flavor measurement process, the flavor of the cooked food material cooked by the cooking robot 1 is measured. Details of the flavor measurement process will be described below with reference to the flowchart of fig. 32.

In step S37, the flavor adjustment unit 522 determines whether the flavor of the cooked food material matches the flavor indicated by the flavor sensor information included in the recipe data. Here, it is determined that the flavors match only when the flavor of the cooked food material matches the flavor indicated by the flavor sensor information for all of the taste, aroma, texture, body-sensory temperature, and color, which are the constituent elements of the flavor.

In the case where it is determined in step S37 that the flavors do not match because any one of the constituent elements does not match, a flavor adjustment process is performed in step S38. The flavor of the cooked food material is adjusted by the flavor adjustment process. Details of the flavor adjustment process will be described below with reference to the flowchart of fig. 33.

After the flavor adjustment process is performed in step S38, the process returns to step S36, and the above-described process is repeatedly performed until it is determined that the flavors match.

Meanwhile, in a case where it is determined in step S33 that the cooking process to be performed is a cooking process of laying out the cooked food materials, the process proceeds to step S39.

In step S39, the control unit 515 generates an instruction command based on the description of the cooking operation information, and transmits the instruction command to the cooking robot 1 to cause the cooking arm 321 to lay out the cooked food materials.

In the case where the laying out of the food materials is completed, or in the case where it is determined in step S37 that the flavor of the cooked food material matches the flavor indicated by the flavor sensor information included in the recipe data, the process proceeds to step S40.

In step S40, the control unit 515 determines whether all the cooking processes have been completed, and in the case where it is determined that all the cooking processes have not been completed, the process returns to step S32, and the above-described process is repeated. Similar processing is repeated for the next cooking process.

On the other hand, in a case where it is determined in step S40 that all the cooking processes have been completed, the dish is completed, and the dish reproduction process is terminated.
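The flow of fig. 31 can be summarized as a loop over the cooking process data sets. The following Python sketch illustrates this flow; the robot object and all names are hypothetical stand-ins for the units described above, not the actual implementation.

def reproduce_dish(recipe_data, robot):
    # Minimal sketch of steps S31 to S40, assuming `robot` wraps the
    # instruction commands sent to the cooking robot 1.
    for process in recipe_data.cooking_processes:        # S32: in order
        if process.is_arrangement:                       # S33
            robot.execute(process.operations)            # S39: arrange food
            continue
        robot.prepare_ingredients(process.ingredients)   # S34
        robot.execute(process.operations)                # S35
        while True:
            measured = robot.measure_flavor()            # S36
            target = process.flavor_sensor_info
            if measured == target:                       # S37: all elements match
                break
            robot.adjust_flavor(measured, target)        # S38
    # S40: all cooking processes completed -> the dish is done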

Next, the flavor measurement process performed in step S36 of fig. 31 will be described with reference to the flowchart of fig. 32.

In step S51, the taste measurement unit 541 of fig. 28 causes the cooking robot 1 to measure the taste of the cooked food material, and acquires taste sensor data.

In step S52, the aroma measurement unit 542 causes the cooking robot 1 to measure the aroma of the cooked food material, and acquires the olfactory sensor data.

In step S53, the texture measurement unit 543 causes the cooking robot 1 to measure the texture of the cooked food material, and acquires texture sensor data.

In step S54, the sensory temperature measurement unit 544 causes the cooking robot 1 to measure the sensory temperature of the cooked food material, and acquires sensory temperature sensor data.

In step S55, the color measurement unit 545 causes the cooking robot 1 to measure the color of the cooked food material and acquires color sensor data.

Through the above processing, the flavor of the cooked food material is measured, and the measurement results can be used in the flavor adjustment process described below. Thereafter, the process returns to step S36 of fig. 31, and the subsequent steps are performed.
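The five measurements of fig. 32 can be gathered into a single record. The following is a minimal sketch; the field and method names are illustrative assumptions, since the actual sensor data formats are not specified here.

from dataclasses import dataclass

@dataclass
class FlavorMeasurement:
    taste: dict          # S51: e.g. {"saltiness": 0.4, "sourness": 0.2, ...}
    aroma: list          # S52: olfactory sensor data
    texture: dict        # S53: e.g. hardness, moisture content
    temperature: float   # S54: sensory temperature
    color: tuple         # S55: e.g. mean RGB of the food material

def measure_flavor(robot) -> FlavorMeasurement:
    """Run steps S51 to S55 against a robot exposing one read method per sensor."""
    return FlavorMeasurement(
        taste=robot.read_taste_sensor(),
        aroma=robot.read_olfactory_sensor(),
        texture=robot.read_texture_sensor(),
        temperature=robot.read_temperature_sensor(),
        color=robot.read_color_sensor(),
    )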

Next, the flavor adjustment process performed in step S38 of fig. 31 will be described with reference to the flowchart of fig. 33.

In step S61, the taste adjustment unit 551 performs the taste adjustment process. When the taste of the cooked food material does not match the taste represented by the taste sensor data included in the flavor sensor information, the taste adjustment process is performed. Details of the taste adjustment process will be described below with reference to the flowchart of fig. 34.

In step S62, the aroma adjustment unit 552 performs the aroma adjustment process. When the aroma of the cooked food material does not match the aroma represented by the olfactory sensor data included in the flavor sensor information, the aroma adjustment process is performed.

In step S63, the texture adjustment unit 553 performs texture adjustment processing. When the texture of the cooked food material does not match the texture represented by the texture sensor data included in the flavor sensor information, a texture adjustment process is performed.

In step S64, the sensory temperature adjustment unit 554 performs the sensory temperature adjustment process. When the sensory temperature of the cooked food material does not match the sensory temperature represented by the sensory temperature sensor data included in the flavor sensor information, the sensory temperature adjustment process is performed.

In step S65, the color adjustment unit 555 performs color adjustment processing. When the color of the cooked food material does not match the color represented by the color sensor data included in the flavor sensor information, the color adjustment process is performed.

For example, in a case where an operation of spraying lemon juice on the food material to increase sourness is performed as the taste adjustment process, this operation may change the aroma of the food material, so that adjustment of the aroma may also become necessary. In this case, the aroma adjustment process is performed together with the taste adjustment process.

In this way, adjustment of one element of the flavor may affect another element; in practice, therefore, multiple elements are adjusted in concert.
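One pass of the flavor adjustment of fig. 33 can be sketched as follows. The element names and the callable interface are assumptions of this sketch; because adjusting one element can disturb another, the caller re-measures and repeats this pass until all elements match, as in fig. 31.

FLAVOR_ELEMENTS = ("taste", "aroma", "texture", "sensory_temperature", "color")

def adjust_flavor_once(measured: dict, target: dict, adjusters: dict) -> None:
    """adjusters maps each element name (S61 to S65) to a callable
    (current, target) that plans and executes one corrective cooking
    operation for that element."""
    for element in FLAVOR_ELEMENTS:
        if measured[element] != target[element]:
            adjusters[element](measured[element], target[element])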

Next, the taste adjustment process performed in step S61 of fig. 33 will be described with reference to the flowchart of fig. 34.

In step S71, the taste adjustment unit 551 identifies the current value of the taste of the cooked food material in the taste space based on the taste sensor data acquired by the taste measurement unit 541.

In step S72, the taste adjustment unit 551 sets a target value of the taste based on the description of the flavor sensor information included in the recipe data. The taste of the food material obtained by the cooking operation performed by the chef, represented by the taste sensor data included in the flavor sensor information, is set as the target value.

In step S73, the taste adjustment unit 551 plans the adjustment content for shifting the taste of the food material from the current value to the target value.

Fig. 35 is a diagram showing an example of a plan.

The vertical axis shown in fig. 35 represents one of the seven tastes, and the horizontal axis represents another of the tastes. For convenience of description, the taste space is represented as a two-dimensional space in fig. 35; however, since the taste includes the seven types of saltiness, sourness, bitterness, sweetness, umami, pungency, and astringency as described above, the taste space is actually a seven-dimensional space.

The taste of the cooked food material, represented by the taste sensor data measured by the cooking robot 1, is plotted in this space as the current value.

Further, the taste to be the target value is set by taste sensor data included in the flavor sensor information. The taste to be the target value is the taste of the food material cooked by the chef.

Since there is no seasoning or food material that changes only one of saltiness, sourness, bitterness, sweetness, umami, pungency, and astringency, there are cases where the taste of the food material cannot be changed directly from the current value to the target value. In such a case, as indicated by the white arrows, the cooking operation is planned so as to reach the target value by way of a plurality of intermediate tastes.
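This planning can be sketched as a greedy search in the seven-dimensional taste space. The per-unit seasoning effect vectors below are invented for illustration only; the point is that each seasoning moves several tastes at once, so the plan passes through intermediate tastes.

import numpy as np

# Order: (saltiness, sourness, bitterness, sweetness, umami, pungency, astringency)
SEASONING_EFFECTS = {
    "salt":        np.array([0.9, 0.0, 0.0, 0.0, 0.1, 0.0, 0.0]),
    "lemon_juice": np.array([0.0, 0.8, 0.1, 0.0, 0.0, 0.0, 0.1]),
    "sugar":       np.array([0.0, 0.0, -0.1, 0.9, 0.0, 0.0, 0.0]),
    "soy_sauce":   np.array([0.6, 0.0, 0.0, 0.0, 0.4, 0.0, 0.0]),
}

def plan_taste_adjustment(current, target, effects=SEASONING_EFFECTS,
                          step=0.1, max_steps=50, tol=1e-2):
    """Greedily pick, at each step, the seasoning whose scaled effect vector
    brings the taste closest to the target; stop when close enough or when
    no seasoning improves the distance."""
    current = np.asarray(current, dtype=float).copy()
    target = np.asarray(target, dtype=float)
    plan = []
    for _ in range(max_steps):
        residual = target - current
        if np.linalg.norm(residual) < tol:
            break
        name, effect = min(effects.items(),
                           key=lambda kv: np.linalg.norm(residual - step * kv[1]))
        if np.linalg.norm(residual - step * effect) >= np.linalg.norm(residual):
            break  # no available seasoning moves the taste closer
        plan.append((name, step))
        current += step * effect
    return plan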

Returning to the description of fig. 34, in step S74, the taste adjustment unit 551 causes the control unit 515 to perform an operation for adjusting the taste according to the plan.

Thereafter, the process returns to step S61 of fig. 33, and the subsequent steps are performed.

The aroma adjustment process (step S62), the texture adjustment process (step S63), the sensory temperature adjustment process (step S64), and the color adjustment process (step S65) are performed similarly to the taste adjustment process of fig. 34. That is, with the measured value for the cooked food material as the current value and the value indicated by the flavor sensor information of the recipe data as the target value, a cooking operation for changing the corresponding flavor element from the current value to the target value is performed.

Through the above-described series of processes, the cooking robot 1 reproduces a dish having the same flavor as that prepared by the chef. The user can eat a dish having the same flavor as a dish made by a chef.

Furthermore, the chef can serve dishes having the same flavor as the chef's own dishes to other people. Furthermore, the chef can preserve his or her dishes in reproducible form as recipe data.

< modification >

Example of updating the cooking process on the reproduction side

In some cases, the reproduction side cannot prepare the same food material as the food material described in the recipe data (food material information) as the food material for cooking. In these cases, the process of partially updating the recipe data may be performed by the control unit 515 (fig. 27).

For example, in the case where a certain food material is insufficient, the control unit 515 refers to the alternative food material database, and selects an alternative food material from among food materials that can be prepared on the reproduction side. The alternative food material is a food material that replaces the food material described in the recipe data as the food material for cooking. The food materials that can be prepared on the reproduction side are specified by, for example, recognizing the situation around the cooking robot 1.

In the alternative food material database referred to by the control unit 515, for example, information on alternative food materials predetermined by a food pairing method is described.

For example, in a case where the food material "sea urchin" described in the recipe data cannot be prepared, the control unit 515 refers to the alternative food material database, and selects a food material combining "pudding" and "soy sauce" as the alternative food material. It is well known that the flavor of "sea urchin" can be reproduced by combining "pudding" and "soy sauce".

The control unit 515 replaces the cooking operation information describing the cooking process that uses "sea urchin" with cooking operation information describing the operation of combining "pudding" and "soy sauce" and the cooking process that uses the alternative food material. The control unit 515 controls the cooking operation of the cooking robot 1 based on the updated cooking operation information.

The flavor of the alternative food material prepared in this manner can be measured, and the flavor can be appropriately adjusted.
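A lookup in the alternative food material database can be sketched as follows. The database contents here are illustrative; in the text they are predetermined by a food pairing method.

ALTERNATIVE_FOOD_MATERIALS = {
    # unavailable food material -> combination that reproduces its flavor
    "sea urchin": ["pudding", "soy sauce"],
}

def select_alternative(ingredient, available, db=ALTERNATIVE_FOOD_MATERIALS):
    """Return a substitute combination whose parts can all be prepared on
    the reproduction side, or None if no usable substitute exists."""
    combo = db.get(ingredient)
    if combo and all(item in available for item in combo):
        return combo
    return None

# e.g. select_alternative("sea urchin", {"pudding", "soy sauce", "rice"})
# -> ["pudding", "soy sauce"]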

Fig. 36 is a flowchart for describing the process of the control device 12 for adjusting the flavor of the alternative food material.

The process of fig. 36 is performed after preparing the alternative food material.

In step S111, the flavor measurement unit 521 of the flavor information processing unit 514 measures the flavor of the prepared alternative food material, and acquires sensor data representing the flavor of the alternative food material.

In step S112, the flavor adjustment unit 522 determines whether the flavor of the alternative food material matches the flavor of the food material before replacement. In the case of the above example, it is determined whether the flavor of the alternative food material combining "pudding" and "soy sauce" matches the flavor of "sea urchin". The flavor of "sea urchins" is specified by flavor sensor information included in the recipe data.

In a case where it is determined in step S112 that the flavor of the alternative food material does not match the flavor of the food material before replacement because the sensor data representing the flavor of the alternative food material does not match the flavor sensor information included in the recipe data, the process proceeds to step S113.

In step S113, the flavor adjusting unit 522 adjusts the flavor of the alternative food material. The adjustment of the flavor of the alternative food material is performed similarly to the process of adjusting the flavor of the cooked food material.

In the case where the adjustment of the flavor of the alternative food material has been performed or in the case where it is determined in step S112 that the flavor of the alternative food material matches the flavor of the food material before the replacement, the process of adjusting the flavor of the alternative food material is terminated. Thereafter, processing is performed according to the updated cooking process using the alternative food material.

Thus, even in a case where the reproduction side cannot prepare the same food material as the food material used on the chef side, cooking can be performed using an alternative food material. Since the flavor of the alternative food material is adjusted to be the same as the flavor of the food material before replacement, the finished dish will be the same as or similar to the dish prepared by the chef.

The alternative food material database may be prepared in the control device 12, or may be prepared in a predetermined server such as the recipe data management server 21. The cooking operation information may be updated in the control device 12 or in the data processing device 11.

Example of using flavor subjective information

There are some cases where the specifications of the sensors differ between the two sides, such that, for example, the sensor provided on the chef side has higher measurement accuracy than the sensor provided on the reproduction side. In a case where the specifications of the two sides differ, the measurement results will differ even when the flavor of the same food material is measured by the respective sensors.

Flavor subjective information is used so that the flavor of the food material cooked by the cooking robot 1 can be evaluated against the flavor of the food material cooked by the chef even when the specifications of the sensors differ between the chef side and the reproduction side.

Fig. 37 is a diagram showing an example of determining a flavor.

In the above example, as shown on the left side of fig. 37, when the cooked food material is obtained by cooking in a certain cooking process on the reproduction side, the flavor is measured and sensor data representing the flavor of the cooked food material is obtained.

Further, as shown on the right side of fig. 37, the flavor sensor information is extracted from the recipe data, and the sensor data representing the flavor of the cooked food material is compared with the flavor sensor information, as indicated by arrow a101, so that the determination of the flavor (determination of whether the flavors match) is performed.

Fig. 38 is a diagram showing an example of determining flavor using flavor subjective information.

In the case of determining the flavor using the flavor subjective information, as shown on the left side of fig. 38, the flavor subjective information is calculated on the reproduction side based on the sensor data representing the flavor of the cooked food material. For this calculation, a model generated based on how the chef perceives the flavor, as described with reference to fig. 6, is used.

The subjective information analysis unit 523 (fig. 28) of the flavor information processing unit 514 holds the same model as the model prepared on the chef side for generating the flavor subjective information.

As indicated by arrow a102, the subjective information analysis unit 523 determines the flavor by comparing the flavor subjective information calculated based on the sensor data representing the flavor of the cooked food material with the flavor subjective information extracted from the recipe data. In a case where these pieces of flavor subjective information match, it is determined that the flavors match, and the next cooking process is performed.

Thus, even when the specifications of the sensors provided on the chef side and the reproduction side are different, it is possible to reproduce a food or dish having the same flavor as that felt by the chef.
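The determination of fig. 38 can be sketched as follows: instead of comparing raw sensor data, whose scales depend on each sensor's specification, both sides are mapped into the chef's subjective space by the same model and compared there. The model interface and tolerance are assumptions of this sketch.

def flavors_match_subjective(measured_sensor_data, recipe_subjective, model,
                             tolerance=0.05):
    """model: any callable mapping sensor data to a dict of subjective
    values, shared with the chef side; recipe_subjective: the flavor
    subjective information extracted from the recipe data."""
    measured_subjective = model(measured_sensor_data)
    return all(
        abs(measured_subjective[k] - recipe_subjective[k]) <= tolerance
        for k in recipe_subjective
    )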

As described above, two modes are prepared for determining the flavor: a mode based on the sensor data and a mode based on the flavor subjective information.

Fig. 39 is a diagram showing an example of a model for generating sensor data.

As shown in fig. 39, a model capable of calculating, from the flavor subjective information included in the recipe data, sensor data under the specification of the sensor provided on the reproduction side may be prepared for the subjective information analysis unit 523.

The taste sensor information generation model shown in fig. 39 is, for example, a neural network model generated by deep learning based on sensor data on taste measured by the sensor prepared on the reproduction side and on subjective values representing how the chef perceives the taste. For example, an administrator who manages the recipe data prepares models according to the specifications of various sensors and provides them to the reproduction side.

In this case, the subjective information analysis unit 523 calculates the corresponding sensor data by inputting the flavor subjective information into the model. The subjective information analysis unit 523 then determines the flavor by comparing the sensor data obtained by measuring the flavor of the food material cooked by the cooking robot 1 with the sensor data calculated using the model.
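This mode can be sketched in the opposite direction from the previous one: the recipe's subjective information is converted into the sensor data that the reproduction-side sensor would have produced, and the comparison is done in sensor space. The generation model is assumed to be provided for the local sensor's specification.

def flavors_match_generated(measured_sensor_data, recipe_subjective,
                            generation_model, tolerance=0.05):
    """generation_model: callable mapping flavor subjective information to
    a dict of expected sensor readings under the local sensor's spec."""
    expected_sensor_data = generation_model(recipe_subjective)
    return all(
        abs(measured_sensor_data[k] - expected_sensor_data[k]) <= tolerance
        for k in expected_sensor_data
    )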

Example of using attribute information

The recipe data includes attribute information indicating the attributes of the chef and the like. Since age, gender, nationality, living area, and the like affect how flavor is perceived, the flavor of the reproduced food material may be adjusted according to the difference between the attributes of the chef and the attributes of the person who eats the dish reproduced by the cooking robot 1.

The cooker attribute information extracted from the recipe data, which represents the attributes of the chef, is supplied to the attribute information analysis unit 524 and is used to control the flavor adjustment performed by the flavor adjustment unit 522. Eater attribute information, which represents the attributes of the eater and is input by the person who eats the dish reproduced by the cooking robot 1, is also supplied to the attribute information analysis unit 524.

The attribute information analysis unit 524 identifies the attributes of the chef based on the cooker attribute information, and also identifies the attributes of the eater based on the eater attribute information.

For example, in a case where it is recognized that the eater is much older than the chef, such as when the eater is an elderly person, the attribute information analysis unit 524 adjusts the texture of the food material to be softer.

As described above, when the nationalities of the eater and the chef differ, the attribute information analysis unit 524 controls the flavor adjustment performed by the flavor adjustment unit 522 according to the difference in nationality, based on information prepared in advance. Similarly, in a case where other attributes of the eater and the chef (e.g., gender or living area) differ, the attribute information analysis unit 524 controls the flavor adjustment performed by the flavor adjustment unit 522 according to the difference in those attributes.

Thus, a dish is reproduced with a flavor that is substantially the same as the chef's but finely adjusted according to the preference of the user.

Further, the attribute information analyzing unit 524 identifies the attributes of the food material based on the food attribute information, and also specifies the attributes of the food material prepared on the reproduction side.

When the attributes of the food material used by the chef and those of the food material prepared on the reproduction side differ, the attribute information analysis unit 524 controls the flavor adjustment performed by the flavor adjustment unit 522 according to the difference in the attributes.

In this way, the flavor of the food material can be adjusted at the reproduction side based on the difference in various attributes between the chef side and the reproduction side.
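An attribute-driven correction can be sketched as below. The rules themselves are illustrative assumptions; the text only states that differences in age, nationality, and similar attributes are used to control the flavor adjustment.

def attribute_corrections(chef_attrs: dict, eater_attrs: dict) -> dict:
    """Return per-element correction hints derived from attribute
    differences between the chef and the eater (hypothetical rules)."""
    corrections = {}
    # e.g. soften the texture for an eater much older than the chef
    if eater_attrs.get("age", 0) - chef_attrs.get("age", 0) >= 30:
        corrections["texture"] = "softer"
    if eater_attrs.get("nationality") != chef_attrs.get("nationality"):
        corrections["taste"] = "apply nationality preference table"
    return corrections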

Example of using environment information

(1) Adjustment of dining environment

The recipe data includes environment information representing the cooking environment, that is, the environment of the space where the chef cooks. Since the color, temperature, brightness, and the like of a space affect how flavor is perceived, adjustment may be performed so as to bring the dining environment, such as a restaurant where a dish reproduced by the cooking robot 1 is eaten, close to the cooking environment. The environment information extracted from the recipe data is supplied to the environment information analysis unit 525 and is used to adjust the dining environment.

For example, the environment information analysis unit 525 controls the lighting devices in the restaurant so that the color in the dining environment measured by analyzing the image captured by the camera 441 (fig. 26) is close to the color in the cooking environment represented by the environment information. The environment information analysis unit 525 has a function as an environment control unit that adjusts the dining environment by controlling an external device.

Further, the environment information analysis unit 525 controls the air-conditioning apparatus in the restaurant so that the temperature and humidity in the dining environment measured by the temperature/humidity sensor 442 approach the temperature and humidity in the cooking environment indicated by the environment information.

The environment information analysis unit 525 controls the lighting devices in the restaurant so that the luminance in the dining environment measured by the illuminance sensor 443 approaches the luminance in the cooking environment represented by the environment information.

This makes it possible to bring the dining environment close to the cooking environment, so that the way an individual eating a dish reproduced by the cooking robot 1 perceives the flavor approaches the way the chef perceives it.
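The adjustment can be sketched as below. The device interfaces are hypothetical; the text assumes controllable lighting and air conditioning on the output side, with the camera 441, temperature/humidity sensor 442, and illuminance sensor 443 on the measurement side.

def adjust_dining_environment(cooking_env: dict, sensors, lighting, aircon):
    """Drive the measured color, temperature/humidity, and brightness of
    the dining environment toward the values recorded in the recipe's
    environment information (all interfaces are assumptions)."""
    if sensors.measure_color() != cooking_env["color"]:
        lighting.set_color(cooking_env["color"])
    temp, humidity = sensors.measure_temperature_humidity()
    if (temp, humidity) != (cooking_env["temperature"], cooking_env["humidity"]):
        aircon.set_target(cooking_env["temperature"], cooking_env["humidity"])
    if sensors.measure_illuminance() != cooking_env["brightness"]:
        lighting.set_brightness(cooking_env["brightness"])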

(2) Correction of flavor sensor information

Information on the specification of the sensor provided at the chef side may be included in the environmental information and provided to the reproduction side. On the reproduction side, the flavor sensor information included in the recipe data is corrected based on a difference between the sensor provided on the chef side and the sensor provided on the reproduction side.

Fig. 40 is a flowchart for describing the processing for correcting the flavor sensor information by the control device 12.

In step S121, the environment information analysis unit 525 acquires the specifications of the sensor provided on the chef side based on the environment information included in the recipe data.

In step S122, the environment information analysis unit 525 acquires the specifications of the sensors provided around the cooking robot 1 on the reproduction side.

In step S123, the environment information analysis unit 525 corrects the flavor sensor information included in the recipe data, which is the sensor data measured on the chef side, based on the difference between the specifications of the sensor provided on the chef side and the specifications of the sensors provided around the cooking robot 1. For the environment information analysis unit 525, information indicating the correspondence between the measurement results of the sensor provided on the chef side and those of the sensor provided on the reproduction side is prepared as information for the correction.

The flavor sensor information corrected in this manner is used to determine the flavor. In this way, the difference between the sensing environments is absorbed, and the flavor can be determined appropriately.
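The correction of fig. 40 can be sketched as a prepared correspondence applied per channel. A per-channel linear map is an assumption of this sketch; the text only states that correspondence information between the two sensors' measurement results is prepared in advance.

def correct_flavor_sensor_info(chef_readings: dict, correspondence: dict) -> dict:
    """correspondence: channel -> (gain, offset) calibrated in advance
    between the chef-side and reproduction-side sensor specifications."""
    return {
        channel: correspondence[channel][0] * value + correspondence[channel][1]
        for channel, value in chef_readings.items()
    }

# e.g. correct_flavor_sensor_info({"saltiness": 0.40},
#                                 {"saltiness": (1.1, -0.02)})
# -> {"saltiness": 0.42}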

< others >

Modifications of the configuration

Although the cooking robot 1 that reproduces dishes based on recipe data has been assumed to be installed in a home, dishes may be reproduced by cooking robots installed in various places. For example, the above-described technique can be applied even in a case where a dish is reproduced by a cooking robot installed in a factory or a cooking robot installed in a restaurant.

Further, the cooking robot that reproduces a dish based on recipe data has been described as the cooking robot 1, which performs cooking by operating the cooking arms, but a dish may be reproduced by various cooking robots capable of cooking with configurations other than cooking arms.

In the above description, the cooking robot 1 is controlled by the control device 12, but the cooking robot 1 may be directly controlled by the data processing device 11 that generates the recipe data. In this case, the data processing device 11 is provided with each configuration of the command generation unit 501 described with reference to fig. 27.

Further, each configuration of the command generation unit 501 may be provided in the recipe data management server 21.

A server function of the recipe data management server 21 for managing the recipe data and providing the recipe data to other devices may be provided in the data processing device 11 that generates the recipe data.

Fig. 41 is a diagram showing another configuration example of the cooking system.

The recipe data management unit 11A included in the data processing apparatus 11 has a server function for managing the recipe data and providing the recipe data to other apparatuses. The recipe data managed by the recipe data management unit 11A is supplied to the plurality of cooking robots and the control device for controlling the cooking robots.

Data management

Since the above recipe data, cooking process data sets (cooking operation information and flavor information), and the like can be said to be products that creatively express ideas and feelings about the cooking process, they can be regarded as works of authorship.

For example, a chef (e.g., a chef who runs a famous restaurant) completes a creative and delicious dish through repeated trial and error in selecting and tasting food materials during cooking. In this case, the recipe data and the cooking process data sets (cooking operation information and flavor information) have value as data, and a situation in which compensation is required when they are used by others can be assumed.

Therefore, an application of managing copyrights of recipe data, cooking process data sets (cooking operation information and flavor information), and the like in a manner similar to music and the like can be considered.

That is, in the present disclosure, it is also possible to protect individual recipe data and cooking process data sets by using copyright protection techniques such as copy protection and encryption, which provide a protection function for individual data.

In this case, for example, the recipe data management server 21 of fig. 14 (the data processing device 11 of fig. 41) manages the copyright in a state where the chef and the recipe data (or the cooking process data set) are associated with each other.

Then, in a case where the user wants the cooking robot 1 to cook using the recipe data, the user pays a usage fee for the recipe data and uses the recipe data downloaded to the control device 12 for, for example, cooking by the cooking robot 1. Note that the usage fee is returned to the chef who created the recipe data, the data manager who manages the recipe data, and the like.

Further, in the present disclosure, individual recipe data and cooking process data sets may also be protected by using blockchain technology, which manages the transaction history of data in a distributed manner as ledgers on servers.

In this case, for example, the recipe data management server 21 of fig. 14 (the data processing device 11 of fig. 41) manages the chefs and the recipe data (or cooking process data sets) in association with each other, using blockchain technology that manages the transaction history of the data in a distributed manner as ledgers on servers (cloud servers or edge servers).

Then, as in the case described above, a user who wants the cooking robot 1 to cook using the recipe data pays a usage fee, which is returned to the chef who created the recipe data, the data manager who manages the recipe data, and the like.

In this way, the recipe data (or the cooking process data sets) can be appropriately managed as creatively expressed works, in consideration of the relationships among the chef, the user, and royalties.

Characterization of food materials using temperature variations in the absorption spectrum

Although the flavor of the food material is represented by sensor data such as taste, aroma, and texture, the flavor may be represented by other indexes. The temperature change in the absorption spectrum can be used as an index for expressing the flavor of the food material.

Principle

The absorption spectrum of a sample (food material) is measured using a spectrophotometer. The absorption spectrum varies depending on the temperature of the sample. The following reactions can be considered as the background of the change in the absorption spectrum with increasing temperature.

(1) Dissociation of associations

The association state of the components contained in the sample (a state in which two or more molecules move as if they were one molecule due to weak bonds between the molecules) changes with temperature. When the temperature is low, the molecules tend to associate or aggregate; conversely, when the temperature is raised, the molecules vibrate strongly and tend to dissociate from the association. Therefore, the absorption peak derived from the associated state decreases, and the absorption peak derived from the dissociated single molecules increases.

(2) Decomposition of molecules by thermal energy

By absorbing thermal energy, weakly bonded portions of a molecule break off, and the molecule splits.

(3) Breakdown of molecules by enzymatic activity

The molecules are split by degrading enzymes.

(4) Oxidation-reduction

As the temperature increases, the pH of water decreases (the H+ concentration increases). In the case of fats and oils, the oxidation rate increases.

Here, regarding the components contained in a natural product such as a food material, from the viewpoint of taste and aroma, a taste substance is a component contained in the liquid phase, whereas an aroma substance is volatile and is a component contained in the gas phase.

Molecules in the associated state are less likely to enter the gas phase, and single molecules dissociated from the associated state may be converted to the gas phase.

In addition, for example, terpenes, which are deeply related to aroma, are present in plants in the form of glycosides bonded to sugars, but thermal or enzymatic decomposition converts them into sugar-free aglycones, which volatilize easily.

Thus, as the temperature increases, the number of molecules that volatilize easily increases, the absorption peak derived from aroma molecules on the verge of volatilizing increases, and the absorption peak derived from the molecular groups with which those aroma molecules had been associated decreases.

From this property, the temperature change of the absorption spectrum can be considered to reflect the phase transition from the liquid phase, associated with "taste", to the gas phase, associated with "aroma".

Thus, the target sample is held at two or more different temperatures, the absorption spectrum of the sample is measured in each temperature-held state, and the resulting data set can be used as information characterizing the taste and aroma of the sample. The sample can be identified from the property (pattern) of this absorption spectrum data set.

This exploits the fact that a phase transition from the liquid phase to the gas phase is highly likely to occur owing to dissociation of molecular associations or to splitting of molecules by thermal or enzymatic decomposition. The method characterizes a sample by, as it were, three-dimensional absorption spectrum data, obtained by adding a temperature dimension to the two-dimensional absorption spectrum data expressed by wavelength and absorbance.
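The acquisition and matching of such a temperature-resolved signature can be sketched as below. The spectrometer interface, the chosen temperatures, and the distance metric are illustrative assumptions.

import numpy as np

def acquire_signature(spectrometer, sample, temperatures=(20.0, 40.0, 60.0)):
    """Hold the sample at each temperature, record one absorption spectrum
    per temperature, and return an array of shape
    (len(temperatures), n_wavelengths)."""
    spectra = []
    for t in temperatures:
        spectrometer.hold_temperature(sample, t)   # incubate at temperature t
        spectra.append(spectrometer.measure_absorbance(sample))
    return np.stack(spectra)

def identify(signature, references: dict):
    """Match the 2-D (temperature x wavelength) signature against reference
    signatures by Frobenius distance and return the closest sample name."""
    return min(references,
               key=lambda name: np.linalg.norm(signature - references[name]))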

Program

The series of processes described above can be executed by hardware or software. In a case where the series of processes is executed by software, a program constituting the software is installed in a computer incorporated in dedicated hardware, a general-purpose personal computer, or the like.

The program to be installed is recorded on and provided via the removable medium 211 shown in fig. 19, which is an optical disk (a compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), or the like), a semiconductor memory, or the like. Further, the program may be provided via a wired or wireless transmission medium such as a local area network, the internet, or digital broadcasting. The program may also be installed in advance in the ROM 202 or the storage unit 208.

The program executed by the computer may be a program in which the processes are performed in time series in the order described in this specification, or a program in which the processes are performed in parallel or at necessary timing, such as when a call is made.

Note that in this specification, the term "system" means a group of a plurality of configuration elements (devices, modules (components), and the like), and it does not matter whether all the configuration elements are in the same housing. Therefore, both a plurality of devices accommodated in separate housings and connected via a network and one device accommodating a plurality of modules in one housing are systems.

The effects described in this specification are merely examples, are not limiting, and other effects may be exhibited.

The embodiments of the present technology are not limited to the above-described embodiments, and various modifications may be made without departing from the gist of the present technology.

For example, in the present technology, a configuration of cloud computing in which one function is shared and cooperatively processed by a plurality of apparatuses via a network may be employed.

Further, the steps described in the above flowcharts may be executed by one apparatus, or may be shared and executed by a plurality of apparatuses.

Further, in the case where a plurality of processes are included in one step, the plurality of processes included in one step may be executed by one apparatus, or may be shared and executed by a plurality of apparatuses.

List of reference numerals

1 cooking robot

11 data processing device

12 control device

21 recipe data management server

41 image pickup device

42 olfactory sensor

43 taste sensor

51 infrared sensor

52 texture sensor

53 environmental sensor

221 data processing unit

231 cooking operation information generating unit

232 flavor information generating unit

233 recipe data generation unit

234 environment information generating unit

235 attribute information generating unit

236 recipe data output unit

321 cooking arm

361 controller

401 image pickup device

402 olfactory sensor

403 taste sensor

404 infrared sensor

405 texture sensor

406 environmental sensor

407 communication unit

501 information processing unit

511 recipe data acquisition unit

512 recipe data analysis unit

513 robot state estimation unit

514 flavor information processing unit

515 control unit

516 command output unit
