Estimation method, estimation model generation method, program, and estimation device

Document No.: 174347 | Publication date: 2021-10-29

Reading note: This technique, "Estimation method, estimation model generation method, program, and estimation device", was devised on 2020-03-19 by 川崎洋, 川上英良, 古关惠太, 海老原全, 天谷雅行, 成英次, 水野诚, 伊藤美树, and 厚木彻. Its main content is as follows. The estimation method of the present invention is an estimation method for estimating a parameter related to skin function, including: an image acquisition step of acquiring a skin image showing unevenness of the skin surface; an extraction step of extracting, from the skin image acquired in the image acquisition step, a feature value vector based on topological information of the skin image; an estimation step of estimating the parameter related to skin function from the feature value vector extracted in the extraction step, using an estimation model constructed from past measured data in which feature value vectors are associated with the parameter related to skin function; and a presentation step of presenting the parameter related to skin function estimated in the estimation step.

1. An estimation method for estimating a parameter related to skin function, comprising:

an image acquisition step of acquiring a skin image showing unevenness of the skin surface;

an extraction step of extracting, from the skin image acquired in the image acquisition step, a feature value vector based on topological information of the skin image;

an estimation step of estimating the parameter relating to the skin function from the feature value vector extracted in the extraction step, using an estimation model constructed from past measured data in which the feature value vector is associated with the parameter relating to the skin function; and

a presentation step of presenting the parameter relating to the skin function estimated in the estimation step.

2. The estimation method according to claim 1, wherein

in the extraction step, a corrected image is generated by performing brightness correction processing and binarization processing on the acquired skin image.

3. The estimation method according to claim 2, wherein

the topological information includes information relating to a zero-dimensional feature value and a one-dimensional feature value extracted from the generated corrected image.

4. The estimation method according to claim 3, wherein, in the extraction step, a distribution diagram indicating the persistence of each feature value is generated for each of the zero-dimensional feature value and the one-dimensional feature value, and the feature value vector is extracted from the generated distribution diagrams.

5. The estimation method according to any one of claims 1 to 4, wherein, in the estimation step, the parameter relating to the skin function is estimated further based on attributes of the subject.

6. The estimation method according to any one of claims 1 to 5, wherein the parameter relating to the skin function includes transepidermal water loss (TEWL).

7. The estimation method according to any one of claims 1 to 5, wherein

the parameter relating to the skin function includes the moisture content of the skin.

8. A method for generating an estimation model used in the estimation method according to any one of claims 1 to 7, comprising:

an acquisition step of acquiring past measured data in which the feature value vector is associated with the parameter relating to the skin function; and

a construction step of constructing, from the past measured data acquired in the acquisition step, the estimation model that estimates the parameter relating to the skin function based on the feature value vector.

9. The estimation model generation method according to claim 8, wherein

the estimation model is a machine learning model including a random forest model trained on the past measured data acquired in the acquisition step.

10. A program for causing an information processing apparatus to execute the estimation method according to any one of claims 1 to 7 or the estimation model generation method according to claim 8 or 9.

11. An estimation device that estimates a parameter related to a skin function, comprising:

an image acquisition unit that acquires a skin image showing unevenness of the skin surface;

a control unit that extracts, from the skin image acquired by the image acquisition unit, a feature value vector based on topological information of the skin image, and estimates the parameter relating to the skin function from the extracted feature value vector using an estimation model constructed from past measured data in which feature value vectors are associated with the parameter relating to the skin function; and

a presentation unit that presents the parameter relating to the skin function estimated by the control unit.

Technical Field

The present invention relates to an estimation method, an estimation model generation method, a program, and an estimation device.

Background

Conventionally, a technique for analyzing the state of a biological tissue is known.

For example, patent document 1 discloses an operating method of an optical transmission diagnostic apparatus in which a plurality of light emitting diodes (LEDs) irradiating light of different wavelengths at different angles are arranged with respect to the skin, and the distinction between benign and malignant tissue is supported based on the measured reflectance spectrum. This operating method relates to an optical method for determining several morphological parameters and physiological characteristics of biological tissue, in particular of benign and malignant tissue lesions.

For example, patent document 2 discloses a skin condition analysis method for analyzing the condition of the skin surface based on the shape of the sulcus on the skin surface. In this skin condition analysis method, a plurality of optical cross-sectional images, which are three-dimensional shape data of the sulcus on the skin surface, are acquired using a confocal microscope, and the condition of the skin surface is evaluated.

For example, in recent years, skin barrier dysfunction caused by abnormalities in the filaggrin gene (a gene encoding an intermediate filament-associated protein) and the like has attracted attention in the pathogenesis of atopic dermatitis. Transepidermal water loss (TEWL) is mainly used as an index of skin barrier function: when the skin barrier function is high, TEWL is low; conversely, when the skin barrier function is low, TEWL is high.

Documents of the prior art

Patent document

Patent document 1: Japanese Patent No. 6035268

Patent document 2: Japanese Patent No. 6058902

Disclosure of Invention

Technical problem to be solved by the invention

The prior art described in patent documents 1 and 2 considers analyzing the state of a biological tissue; however, it does not consider the function of the biological tissue, such as the skin barrier function, which is distinct from the state of the tissue. These conventional techniques therefore do not address estimating the function of a biological (living) tissue. On the other hand, there is a demand for estimating parameters related to skin function, such as TEWL, with high accuracy, and thereby estimating the function (efficacy) of biological tissue, including the skin barrier function, with high accuracy.

The present invention has been made in view of the above problems, and an object of the present invention is to provide an estimation method, an estimation model generation method, a program, and an estimation device capable of estimating parameters related to skin functions with high accuracy.

Means for solving the problems

In order to solve the above-described problems, an estimation method according to an embodiment of the present invention is an estimation method for estimating a parameter related to skin function, including:

an image acquisition step of acquiring a skin image showing unevenness of the skin surface;

an extraction step of extracting, from the skin image acquired in the image acquisition step, a feature value vector based on topological information of the skin image;

an estimation step of estimating the parameter relating to the skin function from the feature value vector extracted in the extraction step, using an estimation model constructed from past measured data in which the feature value vector is associated with the parameter relating to the skin function; and

a presentation step of presenting the parameter relating to the skin function estimated in the estimation step.

In order to solve the above problem, a method of generating an estimation model according to an embodiment of the present invention is a method of generating an estimation model used in the estimation method, including:

an acquisition step of acquiring past measured data in which the feature value vector is associated with the parameter relating to the skin function; and

a construction step of constructing, from the past measured data acquired in the acquisition step, the estimation model that estimates the parameter relating to the skin function based on the feature value vector.

In order to solve the above problem, a program according to an embodiment of the present invention causes an information processing apparatus to execute the estimation method or the estimation model generation method.

In order to solve the above problem, an estimation device according to an embodiment of the present invention is an estimation device for estimating a parameter related to a skin function, including:

an image acquisition unit that acquires a skin image showing unevenness of the skin surface;

a control unit that extracts, from the skin image acquired by the image acquisition unit, a feature value vector based on topological information of the skin image, and estimates the parameter relating to the skin function from the extracted feature value vector using an estimation model constructed from past measured data in which feature value vectors are associated with the parameter relating to the skin function; and

a presentation unit that presents the parameter relating to the skin function estimated by the control unit.

Advantageous Effects of Invention

The estimation method, the estimation model generation method, the program, and the estimation device according to the embodiment of the present invention can estimate parameters related to skin functions with high accuracy.

Brief Description of the Drawings

Fig. 1 is a block diagram showing a schematic configuration of an estimation device according to an embodiment of the present invention.

Fig. 2 is a flowchart showing an example of a first operation performed by the estimation device of fig. 1.

Fig. 3 is a flowchart showing an example of a second operation performed by the estimation device of fig. 1.

Fig. 4 is a flowchart showing an example of a third operation performed by the estimation device of fig. 1.

Fig. 5 is a schematic diagram showing an example of the corrected image generated in step S301 in fig. 4.

Fig. 6 is a schematic diagram showing an example of the method for acquiring topological information in step S302 in fig. 4.

Fig. 7 is a schematic diagram showing an example of the changes in the images and the topological information when the threshold value is changed in stages.

Fig. 8 is a distribution diagram (dot diagram) showing an example of persistence of zero-dimensional feature values.

Fig. 9 is a distribution diagram showing an example of persistence of one-dimensional feature values.

Fig. 10 is a schematic diagram showing an estimation model based on a random forest according to an embodiment.

Fig. 11 is a distribution diagram showing an example of the first estimation result obtained by the estimation device of fig. 1.

Fig. 12 is a graph showing an example of the second estimation result obtained by the estimation device of fig. 1.

Description of the Reference Numerals

1: estimation device

11: control unit

12: communication unit

13: storage unit

14: image acquisition unit

15: data acquisition unit

16: presentation unit

G, R2, R3, R4, R5, R6: regions

t1, t2, t3, t4, t5, t6, tbc, tdc, tbh, tdh: threshold values

Detailed Description

An embodiment of the present invention will be described in detail below with reference to the accompanying drawings.

Fig. 1 is a block diagram showing a schematic configuration of an estimation device 1 according to an embodiment of the present invention. The configuration and function of the estimation device 1 according to an embodiment of the present invention will be mainly described with reference to fig. 1.

As an outline of an embodiment, the estimation device 1 acquires a skin image showing unevenness of the skin surface and extracts, from the acquired image, a feature value vector based on topological information of the skin image. Using an estimation model constructed from past measured data in which feature value vectors are associated with parameters related to skin function, the estimation device 1 estimates the parameters related to skin function from the extracted feature value vector, and presents the estimated parameters. The parameters related to skin function include, for example, TEWL, and may include any index related to the function of biological tissue, such as the skin barrier function. For example, the parameters related to skin function may also include the moisture content of the skin.

The estimation device 1 is an electronic apparatus that estimates parameters related to skin function from a skin image showing the unevenness of a person's skin surface. The estimation device 1 may be a dedicated electronic device, or any general-purpose electronic device such as a smartphone, a personal computer (PC), or a server device. For example, the estimation device 1 may itself photograph a person's skin surface to capture a skin image and estimate the parameters related to skin function from that image. Alternatively, the estimation device 1 may acquire, by any method such as communication, a skin image of a person's skin surface captured by another imaging device, and estimate the parameters related to skin function from the acquired image.

As shown in fig. 1, the estimation device 1 includes: a control unit 11, a communication unit 12, a storage unit 13, an image acquisition unit 14, a data acquisition unit 15, and a presentation unit 16.

The control unit 11 includes one or more processors. In one embodiment, the "processor" is a general-purpose processor or a special-purpose processor dedicated to specific processing, but is not limited thereto. The control unit 11 is communicably connected to each component of the estimation device 1 and controls the operation of the estimation device 1 as a whole.

In one embodiment, for example, the control unit 11 may control the communication unit 12 to transmit the estimation result obtained by the estimation device 1 to any other information processing device. For example, the control unit 11 may control the storage unit 13 to store the estimation result obtained by the estimation device 1 and the acquired skin image. For example, the control unit 11 may control the image acquisition unit 14 to acquire a skin image showing unevenness of the skin surface. For example, the control unit 11 may control the data acquisition unit 15 to acquire past measured data in which the feature value vector is associated with a parameter related to skin function. For example, the control unit 11 may control the presentation unit 16 to present the estimation result obtained by the estimation device 1 to the user.

The communication unit 12 includes a communication module connected to a network including a mobile communication network and the internet. For example, the communication unit 12 may include a communication module conforming to a mobile communication standard such as 4G (fourth generation) or 5G (fifth generation). For example, the communication unit 12 may include a communication module conforming to a wired LAN (Local Area Network) standard.

The storage unit 13 includes one or more memories. In one embodiment, the "memory" is, for example, a semiconductor memory, a magnetic memory, an optical memory, or the like, but is not limited thereto. Each memory included in the storage unit 13 may function as, for example, a main storage device, an auxiliary storage device, or a cache memory. The storage unit 13 stores arbitrary information used for the operation of the estimation device 1. For example, the storage unit 13 may store a system program, an application program, various information acquired by the estimation device 1, the estimation result obtained by the estimation device 1, and the like. The information stored in the storage unit 13 may be updated with information acquired over the network via the communication unit 12, for example.

The image acquisition unit 14 includes, for example, an arbitrary imaging device such as a camera. The image acquisition unit 14 may acquire a skin image showing unevenness of the skin surface by imaging with its own imaging device, or by any other method. For example, the image acquisition unit 14 may acquire, by an arbitrary method such as communication, a skin image of the skin surface captured by another imaging device.

The data acquisition unit 15 includes, for example, an arbitrary interface capable of acquiring past measured data in which a parameter related to a skin function is associated with a feature value vector. For example, the data acquisition unit 15 may include an arbitrary input interface capable of accepting an input operation by a user, and acquire the measured data based on the input by the user. For example, the data acquisition unit 15 may include an arbitrary communication interface, and acquire the measured data from an external device or the like by an arbitrary communication protocol.

The presentation unit 16 includes, for example, an arbitrary output interface for outputting an image, such as a liquid crystal display or an organic electroluminescence (EL) display. The presentation unit 16 presents the estimation result obtained by the estimation device 1 to the user or the like. For example, the presentation unit 16 presents the parameters related to the skin function estimated by the control unit 11 of the estimation device 1.

Fig. 2 is a flowchart showing an example of a first operation performed by the estimation device 1 of fig. 1. Fig. 2 shows a flow of the estimation device 1 for generating an estimation model from past measured data. That is, fig. 2 shows a method for generating an estimation model used in an estimation method described later using the estimation device 1.

In step S101, the control unit 11 of the estimation device 1 acquires, using the data acquisition unit 15, past measured data in which the feature value vector is associated with the parameter related to the skin function.

In step S102, the control unit 11 constructs an estimation model for estimating parameters related to the skin function based on the feature value vector from the past measured data acquired in step S101.

The estimation model may be a machine learning model including a random forest model trained on the past measured data acquired in step S101. The estimation model is not limited to this, and may be any machine learning model, including a neural network, a local regression model, a kernel regression model, or the like.
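As an illustration of this flow, the construction step could be realized with off-the-shelf tooling. The following is a minimal sketch, assuming scikit-learn, with random stand-in arrays in place of real past measured data:

```python
# Minimal sketch of steps S101-S102, assuming scikit-learn. The arrays
# are stand-ins for past measured data in which feature value vectors
# are associated with measured TEWL values.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.random((50, 128))      # feature value vectors (step S101)
y = rng.random(50) * 30.0      # associated measured TEWL values (step S101)

model = RandomForestRegressor(n_estimators=100, random_state=0)
print("CV R^2:", cross_val_score(model, X, y, cv=5).mean())  # rough sanity check
model.fit(X, y)                # the constructed estimation model (step S102)
```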

Fig. 3 is a flowchart showing an example of a second operation performed by the estimation device 1 of fig. 1. Fig. 3 mainly shows a flow of estimating the parameters related to the skin function by the estimation device 1 using the estimation model constructed by the flow of fig. 2. That is, fig. 3 shows an estimation method for estimating a parameter related to a skin function using the estimation device 1.

In step S201, the control unit 11 of the estimation device 1 acquires a skin image showing unevenness of the skin surface using the image acquisition unit 14.

In step S202, the control unit 11 extracts, from the skin image acquired in step S201, a feature value vector based on the topological information of the skin image. Since step S202 includes a detailed flow described later with reference to fig. 4, the block of step S202 is drawn with double lines in fig. 3.

In step S203, the control unit 11 estimates parameters related to the skin function from the feature value vector extracted in step S202 using the estimation model constructed by the flow of fig. 2.

In step S204, the control unit 11 presents the parameters related to the skin function estimated in step S203 using the presentation unit 16.

Fig. 4 is a flowchart showing an example of a third operation performed by the estimation device 1 of fig. 1. Fig. 4 is a diagram showing the flow in step S202 of fig. 3 in more detail. The flow of extracting the feature value vector from the acquired skin image by the control unit 11 of the estimation device 1 will be described in more detail with reference to fig. 4.

In step S301, the control unit 11 of the estimation device 1 generates a corrected image by performing brightness correction processing and binarization processing on the skin image acquired in step S201 of fig. 3. Fig. 5 is a schematic diagram showing an example of the corrected image generated in step S301 of fig. 4.

The control unit 11 generates a corrected image as shown in fig. 5, containing only information in a predetermined frequency band, for example by using a wavelet transform. By generating such a corrected image, the control unit 11 removes, from the skin image acquired in step S201 of fig. 3, unnecessary information that is unrelated to the unevenness of the skin surface and could become noise.
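As an illustration, the brightness correction and binarization could be sketched as a wavelet band-pass followed by Otsu thresholding. This is a minimal sketch assuming the PyWavelets and scikit-image libraries; the wavelet family, decomposition depth, and kept frequency band are illustrative assumptions, and the input is a random stand-in for an acquired skin image:

```python
# Minimal sketch of step S301: keep a mid-frequency wavelet band
# (brightness correction), then binarize with Otsu's threshold.
import numpy as np
import pywt
from skimage.filters import threshold_otsu

rng = np.random.default_rng(0)
skin = rng.random((256, 256))                 # stand-in grayscale skin image

coeffs = pywt.wavedec2(skin, "db2", level=4)
coeffs[0] = np.zeros_like(coeffs[0])          # drop low-frequency shading
coeffs[-1] = tuple(np.zeros_like(c) for c in coeffs[-1])  # drop finest-scale noise
band = pywt.waverec2(coeffs, "db2")

corrected = band > threshold_otsu(band)       # binary corrected image
print(corrected.shape, corrected.mean())
```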

In step S302 of fig. 4, the control unit 11 acquires information on the zero-dimensional feature values and the one-dimensional feature values extracted from the corrected image generated in step S301. This information on the zero-dimensional and one-dimensional feature values constitutes the topological information. Fig. 6 is a schematic diagram showing an example of the method for acquiring the topological information in step S302 of fig. 4. The method by which the control unit 11 extracts the zero-dimensional and one-dimensional feature values from the corrected image generated in step S301 will be described mainly with reference to fig. 6.

The control unit 11 performs density estimation of the white pixels in the corrected image generated in step S301, and generates an image that represents the density of white pixels over the pixel region, like a topographic map. In such an image, the variation in the density of white pixels appears as a peak in pixel regions where the density of white pixels is high, and as a valley in pixel regions where the density of black pixels is high.

Fig. 6 is a schematic diagram showing one-dimensionally the change in density of white pixels along a predetermined line of pixels in such an image. In fig. 6, the vertical axis represents the density of white pixels. The horizontal axis represents the pixel position.

The control unit 11 varies a threshold t on the density of white pixels in a graph showing the change in that density, such as fig. 6. When the straight line corresponding to the threshold t, shown by the broken line in fig. 6, intersects the graph, the control unit 11 determines all pixels to be white in the pixel regions where the density of white pixels exceeds the threshold t, and determines all pixels to be black in the other pixel regions.

Fig. 7 is a schematic diagram showing an example of the changes in the images and the topological information when the threshold t is changed in stages. More specifically, the series of images in the upper part of fig. 7 shows how the connectivity of the white regions changes as the threshold t is changed in stages. The middle part of fig. 7 shows how the zero-dimensional feature values change, and the lower part shows how the one-dimensional feature values change.

For example, when the threshold t is set to t1 in fig. 6, the straight line corresponding to t1 does not intersect the graph, so the control unit 11 determines all pixels in the image to be black. Therefore, as shown in the upper part of fig. 7, the image at the threshold t1 is entirely black.

For example, when the threshold t is set to t2 in fig. 6, the straight line corresponding to t2 intersects the graph in the region R2, where the density of white pixels exceeds t2. The control unit 11 therefore determines all pixels in the region R2 to be white and all pixels outside the region R2 to be black. As shown in the upper part of fig. 7, the image at the threshold t2 is one in which a small white region has begun to appear but black pixels still dominate.

For example, when the threshold t is set to t3 in fig. 6, the straight line corresponding to t3 intersects the graph in the region R3, where the density of white pixels exceeds t3. The control unit 11 therefore determines all pixels in the region R3 to be white and all pixels outside the region R3 to be black. As shown in the upper part of fig. 7, the white area in the image at the threshold t3 is larger than in the image at the threshold t2.

For example, when the threshold t is set to t4 in fig. 6, the straight line corresponding to t4 intersects the graph in the region R4, where the density of white pixels exceeds t4. The control unit 11 therefore determines all pixels in the region R4 to be white and all pixels outside the region R4 to be black. As shown in the upper part of fig. 7, the white area in the image at the threshold t4 is larger than in the image at the threshold t3.

For example, when the threshold t is set to t5 in fig. 6, the straight line corresponding to t5 intersects the graph in the region R5, where the density of white pixels exceeds t5. The control unit 11 therefore determines all pixels in the region R5 to be white and all pixels outside the region R5 to be black. As shown in the upper part of fig. 7, the image at the threshold t5 is one in which a few black regions remain but white pixels dominate.

For example, when the threshold t is set to t6 in fig. 6, the entire graph lies above the straight line corresponding to t6, so the density of white pixels exceeds t6 throughout the region R6. The control unit 11 therefore determines all pixels in the region R6 to be white. As shown in the upper part of fig. 7, the image at the threshold t6 is entirely white.

As described above, the control unit 11 changes the threshold t in stages to acquire a series of images showing how the connectivity of the white regions changes. The control unit 11 extracts topological information, including the zero-dimensional feature values and the one-dimensional feature values, from the acquired series of images.
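As an illustration of this sweep, connected white regions (zero-dimensional features) and enclosed black regions, i.e. holes (one-dimensional features), can be counted at each threshold. This is a minimal sketch assuming NumPy and SciPy, with a random stand-in for the white-pixel density map described above:

```python
# Minimal sketch of the threshold sweep in step S302.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
binary = rng.random((128, 128)) > 0.5                  # stand-in corrected image
density = ndimage.gaussian_filter(binary.astype(float), sigma=3)

def count_features(density, t):
    """Count connected white regions (0-dim) and holes (1-dim) at threshold t."""
    white = density > t
    n_components = ndimage.label(white)[1]             # connected components
    labels, n_black = ndimage.label(~white)            # candidate holes
    # Black regions touching the image border are background, not holes.
    border = np.unique(np.concatenate([labels[0, :], labels[-1, :],
                                       labels[:, 0], labels[:, -1]]))
    return n_components, n_black - np.count_nonzero(border)

for t in np.linspace(density.min(), density.max(), 6):
    components, holes = count_features(density, t)
    print(f"t={t:.3f}  components={components}  holes={holes}")
```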

For example, as shown in the middle part of fig. 7, the control unit 11 extracts each portion of connected white pixels in the acquired series of images as a zero-dimensional feature value. The zero-dimensional feature values thus correspond to connected components in the series of images. For example, in the image at the threshold t1 the number of zero-dimensional feature values is 0, and in the image at the threshold t6 it is 1.

For example, as shown in the lower part of fig. 7, the control unit 11 searches the white pixels in the acquired series of images and extracts each portion where black pixels are enclosed by white pixels as a one-dimensional feature value. The one-dimensional feature values thus correspond to holes in the series of images. For example, in the images at the thresholds t1 and t6, the number of one-dimensional feature values is 0.

The connected components and holes extracted from the series of images shown in the upper part of fig. 7 appear and disappear as the threshold t changes. That is, a given connected component appears at a threshold tbc and disappears at another threshold tdc whose value is smaller than tbc. Similarly, a given hole appears at a threshold tbh and disappears at another threshold tdh whose value is smaller than tbh.

The control unit 11 stores the pair of values (tbc, tdc) for each connected component in the storage unit 13. Similarly, the control unit 11 stores the pair of values (tbh, tdh) for each hole in the storage unit 13.

In step S303 of fig. 4, the control unit 11 generates, from the thresholds tbc and tdc stored in the storage unit 13, a distribution diagram representing the persistence of each zero-dimensional feature value. Similarly, from the thresholds tbh and tdh stored in the storage unit 13, the control unit 11 generates a distribution diagram representing the persistence of each one-dimensional feature value. The control unit 11 may generate these distribution diagrams for the zero-dimensional and one-dimensional feature values from a single skin image acquired in step S201 of fig. 3, or from a plurality of skin images acquired in step S201 of fig. 3.
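In persistent-homology terms, the pairs (tbc, tdc) and (tbh, tdh) are the birth/death pairs of a superlevel-set filtration of the density map, and libraries for cubical persistence can compute them directly. The following is a minimal sketch assuming the gudhi library; since gudhi sweeps sublevel sets, the density map is negated so that a birth value b corresponds to tb = -b and a death value d to td = -d:

```python
# Minimal sketch of the birth/death pairs behind step S303, assuming gudhi.
import numpy as np
import gudhi
from scipy import ndimage

rng = np.random.default_rng(0)
binary = rng.random((128, 128)) > 0.5                 # stand-in corrected image
density = ndimage.gaussian_filter(binary.astype(float), sigma=3)

cc = gudhi.CubicalComplex(top_dimensional_cells=-density)
diagram = cc.persistence()  # list of (dimension, (birth, death)) pairs

finite = [(dim, -b, -d) for dim, (b, d) in diagram if d != float("inf")]
pairs_0d = [(tb, td) for dim, tb, td in finite if dim == 0]  # connected components
pairs_1d = [(tb, td) for dim, tb, td in finite if dim == 1]  # holes
print(len(pairs_0d), "component pairs,", len(pairs_1d), "hole pairs")
```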

Fig. 8 is a distribution diagram showing an example of the persistence of zero-dimensional feature values. In fig. 8, the vertical axis represents the difference between the threshold tbc and the threshold tdc; that is, it measures how persistent each zero-dimensional feature value is with respect to changes in the threshold t. The horizontal axis represents the average of the threshold tbc and the threshold tdc; that is, it indicates around which value of the threshold t each zero-dimensional feature value exists. In the distribution diagram of fig. 8, all points are drawn in the same form for simplicity of illustration, but the points may differ in multiplicity; that is, a given number of zero-dimensional feature values may overlap at each point.

Fig. 9 is a distribution diagram showing an example of the persistence of one-dimensional feature values. In fig. 9, the vertical axis represents the difference between the threshold tbh and the threshold tdh; that is, it measures how persistent each one-dimensional feature value is with respect to changes in the threshold t. The horizontal axis represents the average of the threshold tbh and the threshold tdh; that is, it indicates around which value of the threshold t each one-dimensional feature value exists. In the distribution diagram of fig. 9, all points are drawn in the same form for simplicity of illustration, but the points may differ in multiplicity; that is, a given number of one-dimensional feature values may overlap at each point.

In step S304 of fig. 4, the control unit 11 extracts a feature value vector from the distribution diagrams generated in step S303.

Fig. 10 is a schematic diagram showing an estimation model based on a random forest according to an embodiment. With reference to fig. 10, an example of the method of extracting a feature value vector in step S304 of fig. 4 and of the method of estimating a parameter related to skin function in step S203 of fig. 3 will be described.

In step S304 of fig. 4, the control unit 11 superimposes a grid on each of the distribution diagrams of the zero-dimensional and one-dimensional feature values generated in step S303, thereby setting a plurality of regions G. The control unit 11 counts the number of points included in each of the set regions G. The control unit 11 extracts, as the feature value vector, the vector obtained by arranging the counted numbers of points over the regions G. The multiplicities of the points in the distribution diagrams of the zero-dimensional and one-dimensional feature values may be taken into account in these counts.
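As an illustration of this binning, the per-region counts can be computed as a two-dimensional histogram over the (average, difference) coordinates of figs. 8 and 9. This is a minimal sketch, where the grid size, axis ranges, and the (tb, td) pairs are illustrative assumptions:

```python
# Minimal sketch of step S304: bin each persistence distribution diagram
# on a grid of regions G and arrange the per-region counts into a vector.
import numpy as np

def diagram_features(pairs, bins=8, ranges=((0.0, 1.0), (0.0, 1.0))):
    pairs = np.asarray(pairs, dtype=float).reshape(-1, 2)
    mean = pairs.mean(axis=1)                 # horizontal axis of figs. 8 and 9
    persistence = pairs[:, 0] - pairs[:, 1]   # vertical axis of figs. 8 and 9
    hist, _, _ = np.histogram2d(mean, persistence, bins=bins, range=ranges)
    return hist.ravel()                       # point count per region G

pairs_0d = [(0.8, 0.2), (0.6, 0.5), (0.7, 0.1)]   # illustrative (tb, td) pairs
pairs_1d = [(0.5, 0.4)]
feature_vector = np.concatenate([diagram_features(pairs_0d),
                                 diagram_features(pairs_1d)])
print(feature_vector.shape)  # (2 * bins * bins,) = (128,)
```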

The control unit 11 estimates the parameters related to the skin function from the feature value vector extracted through the flow of fig. 4, using the estimation model constructed through the flow of fig. 2. More specifically, the control unit 11 estimates the parameters related to the skin function using a machine learning model including a random forest model. In this case, the control unit 11 may estimate the parameters related to the skin function based on attributes of the subject in addition to the feature value vector extracted through the flow of fig. 4. The attributes of the subject may include, for example, the age and sex of the subject.

As shown in fig. 10, the control unit 11 randomly selects one or more components of the feature value vector extracted through the flow of fig. 4 and associates them with decision tree 1. The control unit 11 performs the same processing for the remaining decision trees, from decision tree 2 to decision tree N. The control unit 11 estimates a TEWL value for each decision tree, using the feature value vector components associated with that tree as variables, and averages the TEWL values estimated by the plurality of decision trees to obtain the final TEWL estimate.
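As an illustration, the per-tree estimation and averaging can be reproduced with a random forest regressor. This is a minimal sketch assuming scikit-learn with stand-in data; note that scikit-learn subsamples feature components at each split (max_features) rather than associating one subset with each tree, which approximates the scheme described above:

```python
# Minimal sketch of the random-forest estimation in step S203.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X_train = rng.random((50, 128))               # stand-in feature value vectors
y_train = rng.random(50) * 30.0               # stand-in measured TEWL values
model = RandomForestRegressor(n_estimators=100, max_features="sqrt",
                              random_state=0).fit(X_train, y_train)

x = rng.random((1, 128))                      # one subject's feature value vector
per_tree = np.array([tree.predict(x)[0] for tree in model.estimators_])
print(per_tree.mean(), model.predict(x)[0])   # the two averages coincide
```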

Fig. 11 is a distribution diagram showing an example of the first estimation result obtained by the estimation device 1 of fig. 1. In fig. 11, the vertical axis represents the measured value of TEWL and the horizontal axis represents the estimated value of TEWL. Black circles represent data obtained using skin images of adult males, i.e. males aged 20 and over. White circles represent data obtained using skin images of underage males, i.e. males aged 0 to 19. White triangles represent data obtained using skin images of underage females, i.e. females aged 0 to 19.

As shown in fig. 11, the measured and estimated values of TEWL show good agreement in the estimation result obtained by the estimation device 1; that is, the difference between the TEWL value estimated using the estimation device 1 and the measured TEWL value falls within a predetermined error range. The determination coefficient in this case was 0.667. Furthermore, a regression analysis of TEWL, which is considered to reflect the barrier function of the skin, confirmed a strong correlation between the measured and estimated values. The same analysis was performed on the moisture content of the skin, another example of a parameter related to skin function, and a correlation was likewise confirmed between the measured and estimated values.
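As a side note, the determination coefficient relating measured and estimated values can be computed as follows; this is a minimal sketch assuming scikit-learn, with illustrative stand-in arrays rather than the actual values plotted in fig. 11:

```python
# Minimal sketch of computing the determination coefficient (R^2).
import numpy as np
from sklearn.metrics import r2_score

measured = np.array([12.0, 18.5, 25.0, 9.5, 30.2])    # measured TEWL (stand-in)
estimated = np.array([13.1, 17.0, 23.5, 11.0, 28.4])  # estimated TEWL (stand-in)
print(f"R^2 = {r2_score(measured, estimated):.3f}")
```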

Fig. 12 is a graph showing an example of the second estimation result obtained by the estimation device 1 of fig. 1. In fig. 12, the vertical axis represents the type of variable. The horizontal axis represents the importance of the variable.

The estimation device 1 may also present the importance of each variable in the estimation result. For example, when the control unit 11 estimates the parameters related to the skin function based on attributes of the subject in addition to the feature value vector, the estimation device 1 may calculate variable importances with age and sex as variables alongside the feature value vector components. Fig. 12 shows that, in the estimation result obtained by the estimation device 1, age is a more important variable than sex. Although fig. 12 shows the feature value vector components and the subject attributes of age and sex as variables, the variables used for estimating TEWL may include any other variables; for example, they may also include the moisture content of the skin.
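As an illustration of how such variable importances could be obtained from a trained random forest, the following is a minimal sketch assuming scikit-learn, with age and sex appended to a stand-in feature value vector:

```python
# Minimal sketch of computing variable importances as in fig. 12.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = np.column_stack([rng.random((60, 10)),             # feature value vector part
                     rng.integers(0, 80, 60),          # age in years (stand-in)
                     rng.integers(0, 2, 60)])          # sex encoded as 0/1
y = rng.random(60) * 30.0                              # stand-in TEWL values

model = RandomForestRegressor(random_state=0).fit(X, y)
names = [f"f{i}" for i in range(10)] + ["age", "sex"]
for name, importance in sorted(zip(names, model.feature_importances_),
                               key=lambda p: -p[1])[:5]:
    print(f"{name}: {importance:.3f}")
```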

According to the estimation device 1 of the above-described embodiment, the parameters related to the skin function can be estimated with high accuracy. More specifically, the estimation device 1 estimates the parameters related to the skin function using an estimation model constructed from past measured data in which feature value vectors are associated with the parameters related to the skin function. The estimation device 1 can thus estimate the parameters related to the skin function with high accuracy by using the trained estimation model, for example a machine learning model including a random forest model trained on the acquired past measured data.

Since the estimation device 1 can estimate the parameters related to the skin function with high accuracy, it can also estimate the function of biological tissue, including the skin barrier function, with high accuracy. The estimation device 1 can therefore be applied effectively in a wide range of fields such as medicine and cosmetics. For example, the estimation device 1 may contribute to the diagnosis and evaluation of skin health, to skin treatment and the verification of the effects of skin care, and to the prediction of the onset of skin diseases.

For example, conventional TEWL measurement, skin conductance measurement, and the like require washing the test site before measurement and performing the measurement stably under a constant temperature and humidity environment. Conventional TEWL measurement also requires keeping the test site still for about 10 seconds during the measurement. Consequently, with these conventional techniques it is difficult to measure in environments where temperature and humidity cannot be controlled, and to measure subjects such as newborns and infants for whom it is difficult to keep the test site still. The measuring apparatuses of the related art are therefore inconvenient to use.

In the estimation device 1 according to one embodiment, a method using machine learning estimates the parameters related to the skin function with high accuracy from a skin image showing unevenness of the skin surface, so the stable measurement required by the conventional techniques is unnecessary. That is, a user of the estimation device 1 only needs to acquire a skin image showing the unevenness of the skin surface, and the estimation can be performed without restrictions on the environment or the subject. For example, the estimation device 1 is applicable both where a skin image is acquired directly, at a medical site or a beauty-related shop, and where a skin image of a subject at a remote location is acquired by communication or the like. The estimation can also be performed without necessarily cleaning the test site. In these respects, the estimation device 1 improves the user's convenience in estimating the parameters related to the skin function.

Since the estimation device 1 can present the value of a parameter related to the skin function to the user more easily than the related art, it can be applied, for example, to creating and using guidelines indicating which type of person should use which type of moisturizer, medicine, or the like. That is, unlike with the conventional art, the parameters related to the skin function can be measured frequently using the estimation device 1, which makes the creation and use of such guidelines easy.

By generating a corrected image obtained by performing the brightness correction processing and the binarization processing on the acquired skin image, the estimation device 1 can remove from the skin image unnecessary information that is unrelated to the unevenness of the skin surface and could become noise. The estimation device 1 can thus estimate the parameters related to the skin function with higher accuracy.

By acquiring a series of images in step S302 of fig. 4, the estimation device 1 can accurately separate noise from essential information such as the topological information. For example, when only one image is used, it is difficult to determine which of the connected components and holes contained in the image are essential features and which are noise. By changing the threshold t in stages to acquire a series of images, the estimation device 1 can determine the persistence of each connected component and hole, and can accurately separate the essential information from the noise based on that persistence.

By extracting a feature value vector from the skin image as in step S202 of fig. 3 and estimating the parameters related to the skin function with a machine learning model, the estimation device 1 reduces the number of required samples and suppresses the amount of computation. Furthermore, the estimation device 1 facilitates interpretation, for example of which features of the skin image are associated with the estimated parameter related to the skin function.

By estimating the parameters related to the skin function based on attributes of the subject in addition to the feature value vector in step S203 of fig. 3, the estimation device 1 can estimate the parameters related to the skin function with higher accuracy in accordance with the attributes of the subject.

It will be apparent to those skilled in the art that the present invention may be practiced in specific forms other than the described embodiments without departing from the spirit or essential characteristics thereof. Accordingly, the above description is illustrative only and not restrictive. The scope of the invention is defined by the appended claims rather than by the foregoing description. All modifications are to be regarded as being included within the scope of equivalents thereof.

For example, each step in the estimation method using the estimation device 1 and the functions included in each step can be rearranged in a logically non-contradictory manner, and the order of the steps can be changed or a plurality of steps can be combined into one step or divided.

For example, the present invention may be realized as a program describing processing contents for realizing the functions of the estimation device 1 or a storage medium storing the program. It is intended that all such variations be included within the scope of the present invention.
