Method of generating chemical structure, neural network device, and non-transitory computer-readable recording medium

Document No.: 1398284    Publication date: 2020-03-03

Note: This technology, "Method of generating chemical structure, neural network device, and non-transitory computer-readable recording medium", was designed and created by 金勍德, 权宁千, 金美淑, 庾志镐, and 崔伦硕 on 2019-02-26. Its main content is as follows: The present invention relates to a method of generating a chemical structure, a neural network device, and a non-transitory computer-readable recording medium. A new chemical structure is generated by using a neural network and an expression region, in a descriptor or an image of a reference chemical structure, that expresses a specific property. The new chemical structure may be generated by changing a local structure in the reference chemical structure corresponding to the expression region.

1. A method of generating a chemical structure by using a neural network device, the method comprising:

inputting a descriptor of a chemical structure to a trained neural network, the trained neural network producing property values for a property of the chemical structure, the descriptor of the chemical structure representing a structural characteristic of the chemical structure and the property of the chemical structure being a characteristic possessed by the chemical structure;

determining an expression region in the descriptor for expressing the property, the expression region comprising a bit position in the descriptor; and

generating a new chemical structure by modifying a local structure in the chemical structure, the local structure corresponding to the expression region.

2. The method of claim 1, wherein the determining comprises:

determining an expression region in the descriptor for expressing the property by performing an interpretation process with the trained neural network to determine whether the property value is expressed by the local structure in the chemical structure.

3. The method of claim 2, wherein the determining comprises:

determining an expression region in the descriptor for expressing the property by applying a layer-wise relevance propagation (LRP) technique to the trained neural network,

wherein an activation function applied to a node of the trained neural network is selected as a linear function to apply the LRP technique to the trained neural network, and a Mean Square Error (MSE) is selected for optimization.

4. The method of claim 1, wherein the generating comprises:

obtaining a bit value of a bit position of the expression region in the descriptor; and

generating the new chemical structure by applying a genetic algorithm to the bit values of the bit positions and modifying the local structure corresponding to the expression region.

5. The method of claim 1, wherein the generating comprises:

generating a new first chemical structure by modifying the local structure in the chemical structure, the local structure corresponding to the expression region;

inputting descriptors for the new first chemical structure to the trained neural network to output property values for a specific property of the new first chemical structure; and

generating a new second chemical structure by changing a local structure in the new first chemical structure, the local structure corresponding to the expression region, when the property value for the specific property of the new first chemical structure is less than a preset value, and storing the new first chemical structure when the property value for the specific property of the new first chemical structure is equal to or greater than the preset value.

6. A neural network device configured to generate a chemical structure, the neural network device comprising:

a memory configured to store at least one program; and

a processor configured to control the neural network device and to implement a neural network by executing the at least one program, the processor being configured, when the at least one program is executed, to perform operations comprising:

inputting a descriptor of a chemical structure to a trained neural network, the trained neural network producing property values for a property of the chemical structure, the descriptor of the chemical structure representing a structural characteristic of the chemical structure and the property of the chemical structure being a characteristic possessed by the chemical structure;

determining an expression region in the descriptor for expressing the property, the expression region comprising a bit position in the descriptor; and

generating a new chemical structure by modifying a local structure in the chemical structure, the local structure corresponding to the expression region.

7. The neural network device of claim 6, wherein, when the at least one program is executed, the processor is further configured to determine an expression region in the descriptor for expressing the property by performing an interpretation process with the trained neural network to determine whether the property value is expressed by the local structure in the chemical structure.

8. The neural network device of claim 7, wherein, when the at least one program is executed, the processor is further configured to perform operations comprising:

determining an expression region in the descriptor for expressing the property by applying a layer-wise relevance propagation (LRP) technique to the trained neural network; and

selecting an activation function applied to a node of the trained neural network as a linear function to apply the LRP technique to the trained neural network, and selecting a Mean Square Error (MSE) for optimization.

9. The neural network device of claim 6, wherein when the at least one program is executed, the processor is further configured to obtain bit values for bit positions of the expression region in the descriptor and to generate the new chemical structure by applying a genetic algorithm to the bit values for the bit positions and modifying the local structure corresponding to the expression region.

10. The neural network device of claim 6, wherein, when the at least one program is executed, the processor is configured to perform operations comprising:

generating a new first chemical structure by modifying the local structure in the chemical structure, the local structure corresponding to the expression region;

inputting descriptors for the new first chemical structure to the trained neural network to output property values for a specific property of the new first chemical structure; and

generating a new second chemical structure by changing a local structure in the new first chemical structure, the local structure corresponding to the expression region, when the property value for the specific property of the new first chemical structure is less than a preset value, and storing the new first chemical structure in the memory when the property value for the specific property of the new first chemical structure is equal to or greater than the preset value.

11. A method of generating a chemical structure by using a neural network device, the method comprising:

inputting an image of a chemical structure to a trained neural network, the trained neural network producing property values for a property of the chemical structure, the image of the chemical structure representing a structural characteristic of the chemical structure and the property of the chemical structure being a characteristic possessed by the chemical structure;

determining an expression region in the image for expressing the property, the expression region comprising one or more pixels in the image; and

generating a new chemical structure by modifying a local structure in the chemical structure, the local structure corresponding to the expression region.

12. The method of claim 11, wherein the determining comprises:

determining an expression region in the image for expressing the property by performing an interpretation process with the trained neural network to determine whether the property value is expressed by the local structure in the chemical structure.

13. The method of claim 12, wherein the determining comprises:

determining an expression region in the image for expressing the property by applying a layer-wise relevance propagation (LRP) technique to the trained neural network,

wherein an activation function applied to a node of the trained neural network is selected as a linear function to apply the LRP technique to the trained neural network, and a Mean Square Error (MSE) is selected for optimization.

14. The method of claim 11, wherein the generating comprises:

obtaining pixel values for one or more pixels in the expression region in the image; and

generating the new chemical structure by applying Gaussian noise to pixel values of the one or more pixels and modifying the local structure corresponding to the expression region.

15. The method of claim 11, wherein the expression region comprises a plurality of expression regions expressing the property, and the generating comprises:

obtaining coordinate information corresponding to the plurality of expression regions in the image;

calculating center points of the plurality of expression regions in the image based on the coordinate information and obtaining pixel values of the center points; and

generating the new chemical structure by applying Gaussian noise to the pixel values and modifying the local structure corresponding to the central point.

16. The method of claim 11, wherein the generating comprises:

generating a new first chemical structure by modifying the local structure in the chemical structure, the local structure corresponding to the expression region;

inputting an image for the new first chemical structure to the trained neural network to output a property value for a specific property of the new first chemical structure; and

generating a new second chemical structure by changing a local structure in the new first chemical structure, the local structure corresponding to the expression region, when the property value for the specific property of the new first chemical structure is less than a preset value, and storing the new first chemical structure when the property value for the specific property of the new first chemical structure is equal to or greater than the preset value.

17. A neural network device configured to generate a chemical structure, the neural network device comprising:

a memory configured to store at least one program; and

a processor configured to control the neural network device and to implement a neural network by executing the at least one program, the processor being configured, when the at least one program is executed, to perform operations comprising:

inputting an image of a chemical structure to a trained neural network, the trained neural network producing property values for a property of the chemical structure, the image of the chemical structure representing a structural characteristic of the chemical structure and the property of the chemical structure being a characteristic possessed by the chemical structure;

determining an expression region in the image for expressing the property, the expression region comprising one or more pixels in the image; and

generating a new chemical structure by modifying a local structure in the chemical structure, the local structure corresponding to the expression region.

18. The neural network device of claim 17, wherein, when the at least one program is executed, the processor is further configured to determine an expression region in the image for expressing the property by performing an interpretation process with the trained neural network to determine whether the property value is expressed by the local structure in the chemical structure.

19. The neural network device of claim 18, wherein, when the at least one program is executed, the processor is further configured to perform operations comprising:

determining an expression region in the image for expressing the property by applying a layer-wise relevance propagation (LRP) technique to the trained neural network; and

selecting an activation function applied to a node of the trained neural network as a linear function to apply the LRP technique to the trained neural network, and selecting a Mean Square Error (MSE) for optimization.

20. The neural network device of claim 17, wherein, when the at least one program is executed, the processor is further configured to obtain pixel values of one or more pixels in the expression region in the image and to generate the new chemical structure by applying Gaussian noise to the pixel values of the one or more pixels and modifying the local structure corresponding to the expression region.

21. The neural network device of claim 17, wherein the expression region comprises a plurality of expression regions that express the property, and wherein, when the at least one program is executed, the processor is further configured to perform operations comprising:

obtaining coordinate information corresponding to the plurality of expression regions in the image;

calculating center points of the plurality of expression regions in the image based on the coordinate information and obtaining pixel values of the center points; and

generating the new chemical structure by applying Gaussian noise to the pixel values and modifying the local structure corresponding to the central point.

22. The neural network device of claim 17, wherein, when the at least one program is executed, the processor is further configured to perform operations comprising:

generating a new first chemical structure by modifying the local structure in the chemical structure, the local structure corresponding to the expression region;

inputting an image for the new first chemical structure to the trained neural network to output a property value for a specific property of the new first chemical structure; and

generating a new second chemical structure by changing a local structure in the new first chemical structure, the local structure corresponding to the expression region, when the property value for the specific property of the new first chemical structure is less than a preset value, and storing the new first chemical structure in the memory when the property value for the specific property of the new first chemical structure is equal to or greater than the preset value.

23. A non-transitory computer-readable recording medium containing a program which, when executed by a computer, carries out the method according to any one of claims 1 to 5 and 11 to 16.

Technical Field

The present disclosure relates to methods and apparatus for generating chemical structures using neural networks.

Background

Neural networks refer to computational architectures that model a biological brain. As neural network technology has advanced, various types of electronic systems have analyzed input data and generated optimized information by using neural networks.

In recent years, a great deal of research has been conducted on methods of selecting chemical structures to be used in material development by evaluating the properties of the chemical structures with neural network techniques. In particular, there is a need to develop a method of generating new chemical structures that satisfy various requirements by using neural network technology.

Disclosure of Invention

Embodiments of the present disclosure relate to methods and apparatus for generating chemical structures using neural networks. Further, a computer-readable recording medium is provided, which includes a program that, when executed by a computer, carries out the method. The technical problems to be solved are not limited to those described above, and other technical problems may exist.

Additional aspects will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the embodiments provided.

According to an aspect of an embodiment, there is provided a method of generating a chemical structure by using a neural network device, comprising: inputting a descriptor of a chemical structure to a trained neural network, the trained neural network producing property values for a property of the chemical structure, the descriptor of the chemical structure representing a structural characteristic of the chemical structure and the property of the chemical structure being a characteristic possessed by the chemical structure; determining an expression region in the descriptor for expressing the property, the expression region comprising bit positions in the descriptor; and generating a new chemical structure by modifying a local structure in the chemical structure, the local structure corresponding to the expression region.

Determining the expression region may comprise determining the expression region in the descriptor for expressing the property by performing an interpretation process with the trained neural network to determine whether the property value is expressed by the local structure in the chemical structure.

Determining the expression region may comprise determining the expression region in the descriptor for expressing the property by applying a layer-wise relevance propagation (LRP) technique to the trained neural network, wherein an activation function applied to nodes of the trained neural network may be selected as a linear function to apply the LRP technique to the trained neural network, and a Mean Square Error (MSE) may be selected for optimization.

Generating the new chemical structure may include: obtaining a bit value of the bit position of the expression region in the descriptor; and generating the new chemical structure by applying a genetic algorithm to the bit value of the bit position and modifying the local structure corresponding to the expression region.

Generating the new chemical structure may include: generating a new first chemical structure by modifying the local structure in the chemical structure, the local structure corresponding to the expression region; inputting descriptors for the new first chemical structure to the trained neural network to output property values for a specific property of the new first chemical structure; and generating a new second chemical structure by changing a local structure in the new first chemical structure, the local structure corresponding to the expression region, when the property value for the specific property of the new first chemical structure is less than a preset value, and storing the new first chemical structure when the property value for the specific property of the new first chemical structure is equal to or greater than the preset value.

According to an aspect of an embodiment, there is provided a neural network device configured to generate a chemical structure, comprising: a memory configured to store at least one program; and a processor configured to drive a neural network by executing the at least one program, wherein the processor is configured to perform operations comprising: inputting a descriptor of a chemical structure to a trained neural network, the trained neural network producing property values for a property of the chemical structure, the descriptor of the chemical structure representing a structural characteristic of the chemical structure and the property of the chemical structure being a characteristic possessed by the chemical structure; determining an expression region in the descriptor for expressing the property, the expression region comprising a bit position in the descriptor; and generating a new chemical structure by modifying a local structure in the chemical structure, the local structure corresponding to the expression region.

According to an aspect of an embodiment, there is provided a method of generating a chemical structure by using a neural network device, comprising: inputting an image of a chemical structure to a trained neural network, the trained neural network producing property values for a property of the chemical structure, the image of the chemical structure representing a structural characteristic of the chemical structure and the property of the chemical structure being a characteristic possessed by the chemical structure; determining an expression region in the image for expressing the property, the expression region comprising one or more pixels in the image; and generating a new chemical structure by modifying a local structure in the chemical structure, the local structure corresponding to the expression region.

Determining the expression region may comprise determining the expression region in the image for expressing the property by performing an interpretation process with the trained neural network to determine whether the property value is expressed by the local structure in the chemical structure.

Determining the expression region may comprise determining the expression region in the image for expressing the property by applying a layer-wise relevance propagation (LRP) technique to the trained neural network, wherein an activation function applied to nodes of the trained neural network may be selected as a linear function to apply the LRP technique to the trained neural network, and a Mean Square Error (MSE) may be selected for optimization.

Generating the new chemical structure may include: obtaining pixel values of the one or more pixels in the expression region in the image; and generating the new chemical structure by applying Gaussian noise to the pixel values of the one or more pixels and modifying the local structure corresponding to the expression region.

Generating the new chemical structure may include: when a plurality of expression regions expressing the property exist in the image, obtaining coordinate information in the image corresponding to the plurality of expression regions; calculating center points of the plurality of expression regions in the image based on the coordinate information and obtaining pixel values of the center points; and generating the new chemical structure by applying Gaussian noise to the pixel values and modifying the local structure corresponding to the central point.

Generating the new chemical structure may include: generating a new first chemical structure by modifying the local structure in the chemical structure, the local structure corresponding to the expression region; inputting an image for the new first chemical structure to the trained neural network to output a property value for a specific property of the new first chemical structure; and generating a new second chemical structure by changing a local structure in the new first chemical structure, the local structure corresponding to the expression region, when the property value for the specific property of the new first chemical structure is less than a preset value, and storing the new first chemical structure when the property value for the specific property of the new first chemical structure is equal to or greater than the preset value.

According to an aspect of an embodiment, there is provided a neural network device configured to generate a chemical structure, comprising: a memory configured to store at least one program; and a processor configured to drive a neural network by executing the at least one program, wherein the processor is configured to perform operations comprising: inputting an image of a chemical structure to a trained neural network, the trained neural network producing property values for a property of the chemical structure, the image of the chemical structure representing a structural characteristic of the chemical structure and the property of the chemical structure being a characteristic possessed by the chemical structure; determining an expression region in the image for expressing the property, the expression region comprising one or more pixels in the image; and generating a new chemical structure by modifying a local structure in the chemical structure, the local structure corresponding to the expression region.

According to an aspect of an embodiment, there is provided a non-transitory computer-readable recording medium including a program which, when executed by a computer, carries out any of the methods.

Drawings

These and/or other aspects will become apparent and more readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:

fig. 1 is a block diagram illustrating a hardware configuration of a neural network device according to an embodiment;

fig. 2 is a diagram illustrating calculations carried out by a deep neural network (DNN) according to an embodiment;

fig. 3 is a diagram illustrating calculations carried out by a recurrent neural network (RNN) according to an embodiment;

FIG. 4 is a conceptual diagram illustrating a neural network system for generating chemical structures, according to an embodiment;

FIG. 5 is a diagram illustrating a method of representing a chemical structure, according to an embodiment;

FIG. 6 is a diagram illustrating a method of interpreting a neural network, according to an embodiment;

FIG. 7 is a diagram illustrating an example of changing an expression region of a descriptor to generate a new chemical structure, according to an embodiment;

FIG. 8 is a diagram illustrating an example of changing a local structure by changing a bit value of a descriptor, according to an embodiment;

fig. 9 is a diagram illustrating an example of changing a local structure by changing a pixel value of an image according to an embodiment;

fig. 10 is a diagram illustrating an example of changing a pixel value when there are a plurality of expression regions on an image according to an embodiment;

fig. 11 is a flowchart of a method of generating a new chemical structure by changing descriptors for chemical structures in a neural network device, according to an embodiment; and

fig. 12 is a flowchart of a method of generating a new chemical structure by changing an image for the chemical structure in a neural network device according to an embodiment.

Detailed Description

Reference will now be made in detail to the embodiments, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout. In this regard, the present embodiments may have different forms and should not be construed as limited to the descriptions set forth herein. Accordingly, the embodiments are described below, by referring to the drawings, merely to illustrate aspects. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items. An expression such as "at least one of", when preceding or following a list of elements, modifies the entire list of elements rather than individual elements of the list, such that the expression "at least one of a, b, and c" or an expression similar thereto includes: only a, only b, only c, only a and b, only b and c, only a and c, and all of a, b, and c.

The terms "according to some embodiments" or "according to an embodiment" used throughout the specification do not necessarily denote the same embodiment.

Some embodiments of the disclosure may be described in terms of functional blocks and various processing operations. Some or all of these functional blocks may be implemented using various numbers of hardware and/or software components that perform specified functions. For example, the functional blocks of the present disclosure may be implemented using one or more microprocessors or circuits executing instructions to perform a given function. Further, the functional blocks of the present disclosure may be implemented in a variety of programming or scripting languages. The functional blocks may be implemented with algorithms that are executed by one or more processors. The present disclosure may also employ conventional techniques for electronic construction, signal processing, and/or data processing. The terms "mechanism," "element," "unit," and "configuration" may be used in a broad sense and are not limited to mechanical and physical configurations.

Furthermore, the connecting lines or connecting means between the components shown in the figures merely illustrate functional connections and/or physical or electrical connections. In an actual device, the connections between the components may be provided by various functional, physical, or circuit connections, which may be replaced or added.

Meanwhile, in relation to the terminology used herein, a descriptor, which is data used in a neural network system, refers to an indicator value describing a structural feature of a chemical structure and can be obtained by performing a relatively simple calculation on a given chemical structure. According to an embodiment, the descriptor may include a molecular structure fingerprint (e.g., a Morgan fingerprint or an Extended Connectivity Fingerprint (ECFP)) indicating whether a specific local structure is included. Further, the descriptor may be a quantitative structure-property relationship (QSPR) model configured with values that can be immediately calculated from a given chemical structure, such as the molecular weight or the number of local structures (e.g., rings) included in the molecular structure.
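
As a concrete illustration of such a fingerprint-type descriptor, the sketch below computes an ECFP-style Morgan fingerprint as a bit vector. The use of the open-source RDKit library and the example SMILES string are assumptions made only for this illustration; the disclosure does not prescribe a particular toolkit.

```python
# Minimal sketch (assumption: RDKit is installed; the SMILES string is an arbitrary example).
from rdkit import Chem
from rdkit.Chem import AllChem

mol = Chem.MolFromSmiles("c1ccc2[nH]ccc2c1")  # an example reference chemical structure (indole)
fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)

descriptor = list(fp)           # a plurality of bit values (0/1), as described above
on_bits = list(fp.GetOnBits())  # bit positions indicating that specific local structures are present
print(len(descriptor), on_bits[:5])
```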

In addition, properties refer to characteristics possessed by chemical structures and may be real numerical values measured through experiments or calculated through simulations. For example, when the chemical structure is used as a display material, the property of the chemical structure may be a transmission wavelength, an emission wavelength, or the like for light. When the substance is used as a battery material, the property of the chemical structure may be a voltage. Unlike descriptors, the calculation of a property may require a complex simulation, which entails additional computation beyond that needed to obtain a descriptor.

Further, the structure refers to the atomic-level structure of the chemical structure. In order to derive properties by performing first-principles calculations, the structure needs to be expressed at the atomic level. Thus, the atomic-level structure needs to be derived to generate a new chemical structure. The structure may be a structural formula based on atomic bonding relationships or a character string in a simple (one-dimensional) format. The string format expressing the structure may be a simplified molecular-input line-entry system (SMILES) code, a SMILES arbitrary target specification (SMARTS) code, an International Chemical Identifier (InChI) code, or the like.

In addition, a factor refers to an element that defines a relationship among a descriptor, a property, and a structure. The factors may be determined by machine learning based on the descriptor-property-structural formula data stored in a database. Thus, how a descriptor, a property, and a structural formula are related to one another can be determined through the factors.

Hereinafter, embodiments of the present disclosure will be described in detail with reference to the accompanying drawings.

Fig. 1 is a block diagram illustrating a hardware configuration of a neural network device 100 according to an embodiment.

The neural network device 100 may be implemented using various types of devices such as a Personal Computer (PC), a server, a mobile device, and an embedded device. Examples of the neural network device 100 may include, but are not limited to, smartphones, tablet devices, Augmented Reality (AR) devices, Internet of Things (IoT) devices, autonomous vehicles, robots, medical devices, etc., that use neural networks to perform voice recognition, image classification, etc. Further, the neural network device 100 may be a dedicated Hardware (HW) accelerator disposed on, connected to, or installed in the above-described devices. The neural network device 100 may be a hardware accelerator such as a Neural Processing Unit (NPU), a Tensor Processing Unit (TPU), or a neural engine, which is a dedicated module for driving a neural network, but is not limited thereto.

Referring to fig. 1, the neural network device 100 includes a processor 110 and a memory 120. Fig. 1 illustrates only the components of a neural network device 100 that are relevant to embodiments of the present disclosure. Accordingly, it will be apparent to those skilled in the art that the neural network device 100 may further include any other general components in addition to those shown in fig. 1.

The processor 110 controls overall functions for driving the neural network device 100. For example, the processor 110 controls the overall operation of the neural network device 100 by executing programs stored in the memory 120 of the neural network device 100. The processor 110 may be implemented as a Central Processing Unit (CPU), a Graphic Processing Unit (GPU), an Application Processor (AP), etc. provided in the neural network device 100, but is not limited thereto.

The memory 120 is a hardware component that stores various data processed in the neural network device 100. For example, the memory 120 may store data that has been processed and data that is to be processed by the neural network device 100. The memory 120 may also store applications, drivers, etc. to be executed by the processor 110 of the neural network device 100. The memory 120 may include Random Access Memory (RAM) such as Dynamic Random Access Memory (DRAM) and Static Random Access Memory (SRAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), CD-ROM, Blu-ray or other optical disk storage, a Hard Disk Drive (HDD), a Solid State Drive (SSD), or flash memory.

The memory 120 may store descriptors for chemical structures and property values numerically representing properties of the chemical structures, which are matched with each other, as one group or pair. The neural network device 100 may read the descriptor and the property value corresponding thereto from the memory 120 or write the descriptor and the property value corresponding thereto into the memory 120. In an embodiment, the descriptor may include a plurality of bit values, and the property value may be a value for a transmission wavelength, an emission wavelength, a voltage, or the like.

Although not shown in fig. 1, the memory 120 may store the image for the chemical structure and the property value numerically representing the property of the chemical structure in association with each other as one group or pair. In an embodiment, the image may include n × m pixels (where n and m are natural numbers). Hereinafter, the description of the descriptor is equally applicable to the case where the descriptor is replaced with the image.

The memory 120 may store structural feature values representing chemical structures and the descriptor and property values matching the structural feature values as a group or pair. The structural feature value may be a SMILES code or a SMARTS code in a string format expressing the chemical structure.

The processor 110 may execute instructions to implement an Artificial Neural Network (ANN), such as a Deep Neural Network (DNN) and a Recurrent Neural Network (RNN).

The processor 110 may allow the DNN to learn by using a descriptor and a property value corresponding to the descriptor, and in the process may determine a factor defining a relationship between the descriptor and the property value. The processor 110 may then output the property value corresponding to the new descriptor as output data by: the trained DNN is driven by using new descriptors, which have not been used in the learning process of the DNN, as input data.

The processor 110 may allow the RNN to learn by using descriptors and structural feature values, and in the process may determine factors defining the relationship between the descriptors and the structural feature values. The processor 110 may then output the structural feature value corresponding to the new descriptor as output data by: the trained RNN is driven by using new descriptors, which are not used in the RNN learning process, as input data.

The neural network device 100 may further include a user interface. The user interface refers to a device or software for inputting data to control the neural network device 100. Examples of user interfaces may include, but are not limited to, keypads, dome switches, touch pads (e.g., capacitive overlay, resistive overlay, infrared beam, surface acoustic wave, integral strain gauge, and piezoelectric), jog dials, and jog switches, as well as Graphical User Interfaces (GUIs) that may be displayed to receive user inputs.

Fig. 2 is a diagram illustrating a calculation performed by DNN according to an embodiment.

Referring to fig. 2, the DNN 20 may have a structure including an input layer, hidden layers, and an output layer, may perform calculations based on received input data (e.g., I1 and I2), and may generate output data (e.g., O1 and O2) based on the results of the calculations.

For example, as shown in fig. 2, the DNN 20 may include an input layer (layer 1), two hidden layers (layer 2 and layer 3), and an output layer (layer 4). Since the DNN 20 may include many layers for processing valid information, the DNN 20 may process more complex data than a neural network including a single layer. Meanwhile, although the DNN 20 shown in fig. 2 includes 4 layers, the DNN 20 is only an example and may include more or fewer layers and more or fewer channels than those shown therein. That is, the DNN 20 may have various layer structures different from that shown in fig. 2.

The layers included in the DNN 20 may each have a plurality of channels. The channels may respectively correspond to a plurality of artificial nodes, referred to as neurons, Processing Elements (PEs), cells, or similar terms. For example, as shown in fig. 2, layer 1 may include two channels (nodes), and layers 2 and 3 may each include three channels. However, these are merely examples, and the layers included in the DNN 20 may each have various numbers of channels (nodes) and interconnections with other nodes.

The channels included in each of the layers of the DNN 20 may be interconnected to process data. For example, a channel may perform a calculation on data received from channels of one layer and output the calculation result to channels of another layer.

The input and output of each channel may be referred to as an input activation and an output activation, respectively. That is, an activation may be not only the output of one channel but also a parameter corresponding to an input of the channels included in the next layer. Meanwhile, the channels may each determine their own activation based on the activations and weights received from the channels included in the previous layer. A weight is a parameter used to calculate the output activation of each channel and may be a value assigned to the relationship between channels.

The channels may each be processed by a computational unit or processing element that receives an input and produces an output activation, and the input and output of each channel may be mapped. For example, when $\sigma$ is an activation function, $w_{jk}^{i}$ is a weight from the k-th channel included in the (i-1)-th layer to the j-th channel included in the i-th layer, $b_{j}^{i}$ is a bias of the j-th channel included in the i-th layer, and $a_{j}^{i}$ is an activation of the j-th channel of the i-th layer, the activation $a_{j}^{i}$ can be calculated using Equation 1 below.

[Equation 1]

$a_{j}^{i} = \sigma\left(\sum_{k}\left(w_{jk}^{i} \times a_{k}^{i-1}\right) + b_{j}^{i}\right)$

As shown in fig. 2, the activation of the first channel CH1 of the second layer (layer 2) may be expressed as $a_{1}^{2}$. In addition, according to Equation 1, $a_{1}^{2}$ may have the value $a_{1}^{2} = \sigma\left(w_{1,1}^{2} \times a_{1}^{1} + w_{1,2}^{2} \times a_{2}^{1} + b_{1}^{2}\right)$. In Equation 1, $\sigma$ denotes an activation function such as ReLU, sigmoid, or tanh (hyperbolic tangent). As a result, the activation of a particular channel in a particular layer may be obtained by applying the activation function to the weighted sum of the activations received from the previous layer plus the bias.

However, Equation 1 above is merely an example for describing the activations and weights used to process data in the DNN 20, and the embodiment is not limited thereto.
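
A minimal NumPy sketch of the calculation in Equation 1 is given below; the layer sizes and numerical values are arbitrary and chosen only to mirror the 2-channel input layer and 3-channel hidden layer of fig. 2.

```python
import numpy as np

def layer_forward(a_prev, W, b, sigma=np.tanh):
    """Equation 1: a^i_j = sigma(sum_k w^i_jk * a^(i-1)_k + b^i_j) for every channel j of layer i."""
    return sigma(W @ a_prev + b)

a1 = np.array([0.5, -1.0])        # activations of the input layer (I1, I2)
W2 = np.random.randn(3, 2) * 0.1  # weights w^2_jk between layer 1 and layer 2
b2 = np.zeros(3)                  # biases b^2_j of layer 2
a2 = layer_forward(a1, W2, b2)    # activations of layer 2; a2[0] corresponds to a^2_1 of channel CH1
print(a2)
```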

In an embodiment, the neural network device 100 may allow the DNN 20 to learn by using descriptors (or images) and property values stored in memory. The DNN 20 may determine factors defining a relationship between descriptors (or images) and property values in a learning process using the descriptors and the property values.

That is, among the layers 1 to 4 constituting the DNN 20, the descriptor (or image) may correspond to values of a plurality of channels (nodes) of the input layer (layer 1), the property value may correspond to values of a plurality of channels (nodes) of the output layer (layer 4), and the factor may correspond to values of a plurality of channels (nodes) of at least one hidden layer (layers 2 and/or 3).

The trained DNN 20 may then be driven by receiving a new descriptor (or new image) as input data, and may thus output a property value corresponding to the received new descriptor (or new image) as output data.

Fig. 3 is a diagram illustrating calculations performed by the RNN according to an embodiment.

Hereinafter, for convenience of description, the description given above with reference to fig. 2 will not be repeated.

The RNN 30 is a neural network that analyzes data changing over time, such as time-series data, and is constructed by connecting the network at a reference time point t to the network at a next time point t+1. That is, the RNN 30 is a neural network in which the temporal aspect is considered, and it can efficiently learn a pattern from sequentially input data or data input in a certain sequence because the model is modified to allow a recursive input to a hidden layer of the neural network.

Referring to fig. 3, a node s constituting a hidden layer of the RNN 30 is shown. The node s may perform calculations based on input data x and generate output data o. The RNN 30 applies the same task repeatedly over all elements of a sequence, and the final output result of the node s is affected by the results of the previous calculations.

The RNN 31 is the RNN 30 with its loop unrolled (expanded). The term "unrolled" with respect to the RNN 30 refers to expressing the RNN 30 for the entire sequence. In the RNN 31, $x_t$ is the input value at time step t, and $s_t$ is the hidden state at time step t. The term $s_t$ can be expressed by Equation 2 below. In Equation 2, a tanh or ReLU function may be used as the function f. The state $s_{t-1}$ used for calculating the first hidden state may generally be initialized to 0. In addition, in the RNN 31, $o_t$ is the output value at time step t.

[Equation 2]

$s_t = f\left(U x_t + W s_{t-1}\right)$

Here, $s_t$ is the memory part of the network and stores information about events at previous time steps. The output value $o_t$ depends only on the memory at the current time step t.

Meanwhile, unlike the existing neural network structure in which the parameters are different from each other, the RNN 31 shares the parameters U, V, and W for all time steps. That is, since each step of the RNN 31 performs almost the same calculation except for the input value, the number of parameters to be learned can be reduced.
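
The recurrence of Equation 2 and the sharing of the parameters U, W, and V across time steps can be sketched as follows; the dimensions and the output mapping $o_t = V s_t$ are illustrative assumptions consistent with the description above, not values taken from this disclosure.

```python
import numpy as np

def rnn_step(x_t, s_prev, U, W, V, f=np.tanh):
    """One time step: s_t = f(U x_t + W s_(t-1)); the output o_t depends only on the memory s_t."""
    s_t = f(U @ x_t + W @ s_prev)
    o_t = V @ s_t
    return s_t, o_t

hidden_dim, in_dim, out_dim = 8, 4, 4
U = np.random.randn(hidden_dim, in_dim) * 0.1
W = np.random.randn(hidden_dim, hidden_dim) * 0.1  # U, W, and V are shared by all time steps
V = np.random.randn(out_dim, hidden_dim) * 0.1

s = np.zeros(hidden_dim)              # s_(t-1) for the first step is initialized to 0
for x in np.random.randn(5, in_dim):  # an input sequence of length 5
    s, o = rnn_step(x, s, U, W, V)
```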

In an embodiment, the neural network device 100 may allow the RNN 31 to learn by using descriptors (or images) and property values stored in the memory. Alternatively, the neural network device 100 may allow the RNN 31 to learn by using factors and property values determined in the learning process of the DNN 20.

For example, when W of the RNN 31 is a factor determined during the learning of the DNN 20 and the structural feature value represented by the SMILES code is "ABCDEFG", $o_{t-1}$ and $x_t$ may be "ABC", and $o_t$ and $x_{t+1}$ may be "BCD". The SMILES codes of the respective time steps may then be aggregated to output one SMILES code "ABCDEFG", i.e., a structural feature value, as output data.
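
One simple way to read the aggregation in this example is sketched below; this is only an illustration of merging overlapping per-time-step strings, not a decoding scheme prescribed by this disclosure.

```python
def aggregate_steps(step_outputs):
    """Merge per-time-step outputs that overlap with their predecessor in all but the last character."""
    merged = step_outputs[0]
    for chunk in step_outputs[1:]:
        merged += chunk[-1]  # each later time step contributes one new character
    return merged

print(aggregate_steps(["ABC", "BCD", "CDE", "DEF", "EFG"]))  # -> "ABCDEFG"
```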

The trained RNN 31 may then be driven by receiving the new descriptor (or new image) as input data, and may thus output as output data the structural feature values corresponding to the received new descriptor (or new image). Alternatively, the trained RNN 31 may be driven by receiving as input data factors for a new descriptor (or new image), and may therefore output as output data structural feature values corresponding to the new descriptor (or new image).

Fig. 4 is a conceptual diagram illustrating a neural network system for generating a chemical structure according to an embodiment.

Referring to fig. 4, a neural network system configured to generate chemical structures by using DNN 410 and RNN 420 is illustrated.

A descriptor, as data used in the neural network system, may be represented by an ECFP, which is an indicator value representing a structural characteristic of a chemical structure. A property refers to a characteristic possessed by a chemical structure and may be a real numerical value indicating, for example, a transmission wavelength or an emission wavelength with respect to light. A structure refers to the atomic-level structure of a chemical structure and may be represented by a SMILES code. For example, a structural formula may be expressed according to the SMILES code as shown in Equation 3 below.

[Equation 3]

OC1=C(C=C2C=CNC2=C1)C1=C(C=CC=C1)C1=CC2=C(NC=C2)C=C1

A factor is an element that defines a relationship between a descriptor, a property, and a structure. The factor may be at least one hidden layer. When the factor includes a plurality of hidden layers, a factor defining a relationship between the descriptor and the property, a factor defining a relationship between the descriptor and the structure, and the like may be determined for each hidden layer.

DNN 410 may be driven by receiving a descriptor as input data and outputting a property value corresponding to the received descriptor as output data. In a learning process using descriptors and property values, DNN 410 may determine factors that define the relationship between descriptors and property values. The RNN 420 may be driven by receiving as input data factors or descriptors determined during the learning process of the DNN 410, and outputting as output data structural feature values.

Fig. 5 is a diagram illustrating a method of representing a chemical structure 510 according to an embodiment.

Referring to fig. 5, a chemical structure 510 represents the shape of a molecule formed by a combination of atoms. Chemical structure 510 may be represented by the position of atoms, the distance between atoms, the strength of atomic bonds, and the like.

In one embodiment, chemical structure 510 may be represented by descriptor 520 comprising a plurality of bit values (1 or 0). From descriptor 520, it may be determined whether chemical structure 510 includes a particular local structure.

In another embodiment, the chemical structure 510 may be represented as an image 530 having a certain size. The image 530 may include three channels (red, green, and blue (RGB)) of n × m pixels (where n and m are natural numbers). In the image, 8 bits, i.e., values from 0 (black) to 255 (white), may be assigned to each pixel of the image 530. For example, bright red may be synthesized with an R channel value of 246, a G channel value of 20, and a B channel value of 50, and when all the channel values are 255, white is synthesized.

Hereinafter, for convenience of description, a method in which the chemical structure 510 is displayed as the image 530 by using one channel will be described.

The atoms constituting the chemical structure 510 may be displayed on the image 530 in colors different from each other. Chemical structure 510 may include carbon (C), nitrogen (N), and oxygen (O), and on image 530, carbon (C) may be displayed in black, nitrogen (N) may be displayed in blue, and oxygen (O) may be displayed in red.

Referring to fig. 5, on the image 530 including 6 × 6 pixels, the pixel at which carbon (C) of the chemical structure 510 is located may have a value of "0", the pixel at which nitrogen (N) is located may have a value of "50", and the pixel at which oxygen (O) is located may have a value of "186". The value of the pixel where no atom exists may be "255".
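
A toy sketch of the single-channel 6 x 6 encoding described above is given below; the atom positions are arbitrary example coordinates, not the layout of fig. 5.

```python
import numpy as np

ATOM_PIXEL_VALUE = {"C": 0, "N": 50, "O": 186}  # pixel values used in the description above
EMPTY = 255                                     # value of a pixel where no atom exists

image = np.full((6, 6), EMPTY, dtype=np.uint8)
# Hypothetical atom positions chosen only for illustration.
for (row, col), atom in {(1, 2): "C", (2, 3): "N", (4, 4): "O"}.items():
    image[row, col] = ATOM_PIXEL_VALUE[atom]
print(image)
```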

The type of color that a certain atom is displayed on the image 530, the number of pixels constituting the image 530, and the like are not limited to the above examples.

The descriptor 520 or the image 530 for the chemical structure 510 may be used as input data for a neural network, and the specific property value for the chemical structure 510 may be output as output data for the neural network.

Fig. 6 is a diagram illustrating a method of interpreting a neural network according to an embodiment.

The neural network device 100 can obtain descriptors or images for a reference chemical structure to output specific property values for the reference chemical structure. In an embodiment, the descriptor may include a plurality of bit values, and the image may include n × m pixels (where n and m are natural numbers).

The neural network device 100 may input descriptors or images for the reference chemical structure to a trained neural network as input data and drive the neural network through an inference process 610 to obtain a specific property value for the reference chemical structure as output data of the neural network.

In this case, the neural network device 100 may carry out the interpretation process 620 to determine whether a particular property value is expressed by a local structure in the reference chemical structure.

Referring to fig. 6, in an embodiment, the neural network device 100 may interpret the trained neural network by using a layer-wise relevance propagation (LRP) technique. The LRP technique is a method of propagating relevance in the direction opposite to that of the trained neural network (i.e., in the direction from the output layer to the input layer). In the LRP technique, when relevance propagates between layers, the node having the largest relevance with respect to an upper layer among the plurality of nodes of a lower layer obtains the largest relevance from the corresponding node of the upper layer.

The method of calculating the relevance in the LRP technique can be expressed by Equation 4 below. In Equation 4, $a_i$ and $a_j$ denote the output value determined at a specific node of the i-th layer and the output value determined at a specific node of the j-th layer, respectively. $w_{ij}^{+}$ is a weight value connecting the specific node of the i-th layer with the specific node of the j-th layer. $R_i$ and $R_j$ denote the relevance of the specific node of the i-th layer and the relevance of the specific node of the j-th layer, respectively.

[Equation 4]

$R_i = \sum_{j} \frac{a_i w_{ij}^{+}}{\sum_{i'} a_{i'} w_{i'j}^{+}} R_j$

In an embodiment, to apply the LRP technique, the neural network device 100 may select the activation function applied to the nodes of the trained neural network as a linear function and may select a Mean Square Error (MSE) for optimization, by using a regression analysis method. In particular, in the regression analysis method, the activation function of the output nodes may be selected as a linear function so that the neural network is trained for regression, since the final output value may take any of several numerical values. To implement the regression analysis method, the loss function may be selected as the MSE in the neural network learning process.
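
A minimal sketch of propagating relevance from an upper layer j back to a lower layer i with the positive-weight rule of Equation 4 is given below. It handles a single linear layer with arbitrary example sizes; a full implementation would repeat this propagation layer by layer from the output down to the input bits or pixels.

```python
import numpy as np

def lrp_backward(a_i, W, R_j, eps=1e-9):
    """Equation 4: R_i = sum_j (a_i * w+_ij) / (sum_i' a_i' * w+_i'j) * R_j."""
    W_pos = np.clip(W, 0.0, None)  # keep only the positive weights w+_ij
    z = a_i[:, None] * W_pos       # contributions a_i * w+_ij, shape (n_i, n_j)
    z_sum = z.sum(axis=0) + eps    # normalization term per upper-layer node j
    return (z / z_sum * R_j[None, :]).sum(axis=1)

a_i = np.random.rand(5)            # activations of the lower layer i
W = np.random.randn(5, 3)          # weights between layer i and layer j
R_j = np.array([0.2, 0.5, 0.3])    # relevance already assigned to the nodes of layer j
R_i = lrp_backward(a_i, W, R_j)
expression_position = int(np.argmax(R_i))  # position with the greatest relevance (cf. expression region)
```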

However, the techniques that may be used in interpretation process 620 to determine whether a particular property value is expressed by any local structure in the reference chemical structure are not limited to the examples described above.

When the input data of the neural network is a descriptor for the reference chemical structure, the plurality of nodes of the input layer may respectively correspond to bit values constituting the descriptor. The neural network device 100 may obtain the node of the input layer (i.e., the bit position of the descriptor) having the greatest correlation with the expression of the specific property value of the reference chemical structure through the interpretation process 620. Since the bit positions of the descriptors correspond to specific local structures in the reference chemical structure, the neural network device 100 can determine the specific local structure having the greatest correlation with the expression of the specific property value of the reference chemical structure via the bit positions of the descriptors obtained by the interpretation process 620.

When the input data of the neural network is an image for the reference chemical structure, a plurality of nodes of the input layer may respectively correspond to pixel values constituting the image. The neural network device 100 may obtain the node of the input layer (i.e., the pixel coordinates of the image) having the greatest correlation with the expression of the specific property value of the reference chemical structure through the interpretation process 620. Since the pixel coordinates of the image correspond to a particular local structure in the reference chemical structure, the neural network device 100 can determine the particular local structure having the greatest correlation with the expression of the particular property value of the reference chemical structure via obtaining the pixel coordinates of the image through the interpretation process 620.

Hereinafter, the bit position of the descriptor and the pixel coordinates of the image having the greatest correlation with the expression of the specific property value of the reference chemical structure will be referred to as an expression region.

Fig. 7 is a diagram illustrating an example of changing an expression region of a descriptor to generate a new chemical structure according to an embodiment.

Referring to fig. 7, descriptor 712 of reference chemical structure 710 may be "11100011011010110". The neural network device 100 may sequentially input bit values constituting the descriptor 712 to nodes of an input layer of the neural network (e.g., DNN), respectively, and output a property value (i.e., "emission wavelength: 320 nm") for the reference chemical structure 710.

The neural network device 100 may obtain the node of the input layer (i.e., the expression region 713 of the descriptor 712) having the greatest correlation with the expression of the wavelength value of the reference chemical structure 710. Expression region 713 of descriptor 712 may correspond to specific location 711 in reference chemical structure 710. In fig. 7, the expression region 713 corresponds to a bit value. However, the expression region 713 may correspond to a plurality of consecutive bit values, and a plurality of expression regions may be in the descriptor 712.

Neural network device 100 may change the bit values of expression region 713 to improve the properties of reference chemical structure 710. When the bit value of the expression region 713 is changed, the structure of the specific position 711 may be changed. As a method of changing the bit value of the expression region 713, a genetic algorithm may be used, and the details thereof will be described later with reference to fig. 8. The neural network device 100 may change the bit values of the expression region 713 and/or the bit values around the expression region 713.

The neural network device 100 may change the bit values of the expression region 713 and output a new descriptor 722. Referring to fig. 7, when the bit value "1" of the expression region 713 in the new descriptor 722 is changed to "110100", the neural network device 100 may apply a new local structure 721 corresponding to the bit value "110100" to the specific location 711 and generate a new chemical structure 720, the new local structure 721 being applied to the new chemical structure 720.

Regarding the method of generating the new chemical structure 720, the neural network device 100 may input a new descriptor 722 as input data of the neural network (e.g., RNN) and output a structural feature value as output data, and may generate the new chemical structure 720 based on the output structural feature value.

The neural network device 100 may input the descriptor 722 of the new chemical structure 720 into the neural network and output a property value (i.e., "emission wavelength: 325 nm") corresponding to the descriptor 722 input into the neural network. That is, neural network device 100 may improve properties by altering the local structure of reference chemical structure 710 and creating a new chemical structure 720.

The neural network device 100 may repeatedly generate a chemical structure through the above-described process until a chemical structure having a property value close to a preset value (e.g., "emission wavelength: 350 nm") is generated.

Specifically, the neural network device 100 may compare the property value (e.g., "emission wavelength: 325 nm") for the new chemical structure 720 with a preset value (e.g., "emission wavelength: 350 nm"), and generate the new chemical structure by changing the bit value of the expression region 723 of the descriptor 722 when the property value for the new chemical structure 720 is less than the preset value.

When the property value for the new chemical structure generated through the above-described process is equal to or greater than the preset value, the neural network device 100 may store the generated new chemical structure in the memory.
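
The generate-evaluate-repeat loop described above can be sketched as follows; `predict_property` and `mutate_expression_region` are hypothetical helpers standing in for the trained neural network and the bit-change step, respectively:

```python
def optimize_structure(descriptor, expression_region, predict_property,
                       mutate_expression_region, preset_value=350.0, max_iters=100):
    """Repeatedly change the expression-region bits until the predicted property
    value (e.g., emission wavelength in nm) reaches the preset value."""
    candidate = descriptor
    for _ in range(max_iters):
        value = predict_property(candidate)       # e.g., 320 nm for the reference structure
        if value >= preset_value:                 # preset value reached: stop and store
            return candidate, value
        candidate = mutate_expression_region(candidate, expression_region)
    return candidate, predict_property(candidate)
```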

Fig. 8 is a diagram illustrating an example of changing a local structure by changing a bit value of a descriptor according to an embodiment.

In an embodiment, the neural network device 100 may apply a genetic algorithm to the bit values constituting the descriptor of the reference chemical structure and perform selection, crossover, or mutation operations on the bit values.

The neural network device 100 can change the descriptor of the reference chemical structure by applying a genetic algorithm to the bit values constituting the descriptor. When the descriptor of the reference chemical structure is changed or modified, a local structure in the reference chemical structure may be mutated, removed, or replaced, or a local structure may be added to the reference chemical structure.
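
A minimal sketch of such bit-level genetic operations, assuming a descriptor is a plain Python list of 0/1 bits and the expression region is a list of bit positions:

```python
import random

def mutate(descriptor, expression_region):
    """Flip one bit chosen from the expression region."""
    child = descriptor[:]
    position = random.choice(expression_region)
    child[position] ^= 1
    return child

def crossover(parent_a, parent_b):
    """Single-point crossover between two descriptors of equal length."""
    point = random.randrange(1, len(parent_a))
    return parent_a[:point] + parent_b[point:]

def select(population, fitness, k=2):
    """Tournament selection: keep the candidate with the best fitness value."""
    contenders = random.sample(population, k)
    return max(contenders, key=fitness)
```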

Referring to fig. 8, the neural network device 100 may mutate a local structure in the reference chemical structure by applying a genetic algorithm to the bit values constituting the descriptor of the reference chemical structure. For example, the neural network device 100 can change the carbon (C) at the first location 810 in the reference chemical structure to nitrogen (N). Alternatively, the neural network device 100 may change the adjacent atoms 811 and 812 bonded to the atom at the first location 810 to other atoms.

In addition, the neural network device 100 may add a local structure to the reference chemical structure by applying a genetic algorithm to the bit values constituting the descriptor of the reference chemical structure. For example, the neural network device 100 can add a local structure 823 connected to the atom at the second location 820 in the reference chemical structure. Alternatively, the neural network device 100 may add local structures connected to the adjacent atoms 821 and 822 bonded to the atom at the second location 820. Alternatively, the neural network device 100 may add a local structure 824 in the form of a fused ring connected to both the atom at the second location 820 and the adjacent atom 821 bonded to that atom.

In addition, the neural network device 100 may remove a local structure in the reference chemical structure by applying a genetic algorithm to the bit values constituting the descriptor of the reference chemical structure. For example, the neural network device 100 may remove the local structure 831 attached to the atom at the third location 830 in the reference chemical structure. Alternatively, the neural network device 100 may alter the ring structure by removing the atom at the third location 830.

In addition, the neural network device 100 may replace a local structure in the reference chemical structure by applying a genetic algorithm to the bit values constituting the descriptor of the reference chemical structure. For example, the neural network device 100 may change the ring structure at the fourth location 840 in the reference chemical structure to a new local structure 841 or 842.

However, an example of changing the local structure by changing the bit value of the descriptor is not limited to the above description.

Fig. 9 is a diagram illustrating an example of changing a local structure by changing a pixel value of an image according to an embodiment.

Referring to fig. 9, an image 912 of a reference chemical structure 910 may include 6 × 6 pixels. The atoms that make up the reference chemical structure 910 may each be displayed in a different color on the image 912. The reference chemical structure 910 may include carbon (C), nitrogen (N), and oxygen (O); on the image 912, carbon (C) may be displayed in black, nitrogen (N) in blue, and oxygen (O) in red. On the image 912, the value of the pixel where carbon (C) is located may be "0", the value of the pixel where nitrogen (N) is located may be "50", and the value of the pixel where oxygen (O) is located may be "186".
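
Purely as an illustration (the atom positions below are hypothetical; only the grid size and the pixel values 0/50/186 follow the description of fig. 9), such an image could be represented as a NumPy array:

```python
import numpy as np

# Hypothetical 6x6 single-channel encoding: C -> 0, N -> 50, O -> 186,
# with 255 used here for empty background pixels (an assumption).
ATOM_PIXEL = {"C": 0, "N": 50, "O": 186}

image = np.full((6, 6), 255, dtype=np.uint8)                    # background
for (row, col), atom in {(2, 1): "C", (2, 2): "N", (3, 3): "O"}.items():
    image[row, col] = ATOM_PIXEL[atom]                          # illustrative atom positions
```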

The neural network device 100 may sequentially input pixel values constituting the image 912 to nodes of an input layer of a neural network (e.g., DNN), respectively, and thus may output a property value (i.e., "emission wavelength: 320 nm") for the reference chemical structure 910.

The neural network device 100 may obtain the node of the input layer (i.e., the expression region 913 of the image 912) having the greatest correlation with the expression of the wavelength value of the reference chemical structure 910. The expression region 913 of the image 912 may correspond to a particular location 911 in the reference chemical structure 910. In fig. 9, the expression region 913 corresponds to one pixel value. However, the expression region 913 may correspond to a plurality of adjacent pixel values, and a plurality of expression regions may be provided in the image 912.

The neural network device 100 may change the pixel values of the expression region 913 and/or the pixel values around the expression region 913 to improve the properties of the reference chemical structure 910. When the pixel values of the expression region 913 and/or the pixel values around the expression region 913 are changed, the structure of the specific location 911 may be changed. In an embodiment, the pixel values of the expression region 913 and/or the pixel values around the expression region 913 may be changed by using Gaussian noise, i.e., noise whose values follow a normal (Gaussian) distribution.
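
A minimal sketch of perturbing the expression region and its neighborhood with Gaussian noise (the noise scale and neighborhood radius are assumptions):

```python
import numpy as np

def perturb_expression_region(image, row, col, sigma=20.0, radius=1, rng=None):
    """Add Gaussian noise to the pixel at (row, col) and its neighbors,
    then clip the result back to the valid 0-255 pixel range."""
    rng = rng or np.random.default_rng()
    out = image.astype(np.float64).copy()
    r0, r1 = max(0, row - radius), min(image.shape[0], row + radius + 1)
    c0, c1 = max(0, col - radius), min(image.shape[1], col + radius + 1)
    out[r0:r1, c0:c1] += rng.normal(0.0, sigma, size=(r1 - r0, c1 - c0))
    return np.clip(out, 0, 255).astype(np.uint8)
```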

The neural network device 100 may change the pixel values of the expression region 913 and/or the pixel values around the expression region 913, and thus output a new image 922. Referring to fig. 9, when the pixel values of the expression region 913 and/or the pixel values around the expression region 913 are changed in the new image 922, the neural network device 100 may apply a new local structure 921 corresponding to the changed pixel values to the specific location 911, thereby generating a new chemical structure 920.

To generate the new chemical structure 920, the neural network device 100 may input the new image 922 as input data of a neural network (e.g., an RNN), output a structural feature value as output data, and generate the new chemical structure 920 based on the output structural feature value.

The neural network device 100 may input an image 922 of the new chemical structure 920 into the neural network and output a property value (i.e., "emission wavelength: 325 nm") corresponding to the image 922 input into the neural network. That is, the neural network device 100 may improve properties by changing the local structure of the reference chemical structure 910 and creating a new chemical structure 920.

The neural network device 100 may repeatedly generate a chemical structure through the above-described process until a chemical structure having a property value close to a preset value (e.g., "emission wavelength: 350 nm") is generated.

Specifically, the neural network device 100 may compare the property value (e.g., "emission wavelength: 325 nm") for the new chemical structure 920 with a preset value (e.g., "emission wavelength: 350 nm"), and generate the new chemical structure by changing the pixel values of the expression region 913 and/or the pixel values around the expression region 913 in the image 922 when the property value for the new chemical structure 920 is smaller than the preset value.

When the property value for the new chemical structure generated through the above-described process is equal to or greater than the preset value, the neural network device 100 may store the generated new chemical structure in the memory.

Fig. 10 is a diagram illustrating an example of changing a pixel value when there are a plurality of expression regions on an image according to an embodiment.

Referring to fig. 10, an image 1012 of a reference chemical structure 1010 may include 6 × 6 pixels. The atoms that make up the reference chemical structure 1010 may each be displayed in a different color on the image 1012. For example, on the image 1012, the value of the pixel where carbon (C) is located may be "0", the value of the pixel where nitrogen (N) is located may be "50", and the value of the pixel where oxygen (O) is located may be "186".

The neural network device 100 may sequentially input the pixel values constituting the image 1012 to nodes of an input layer of a neural network (e.g., a DNN) and output a property value (i.e., "emission wavelength: 320 nm") for the reference chemical structure 1010.

In an embodiment, there may be multiple nodes of the input layer that have the greatest correlation, or a high correlation relative to other nodes, with the expression of the wavelength value of the reference chemical structure 1010. That is, there may be a plurality of expression regions, i.e., a first expression region 1013a and a second expression region 1013b, on the image 1012. First expression region 1013a and second expression region 1013b in image 1012 can correspond to first location 1011a and second location 1011b, respectively, in reference chemical structure 1010. As shown in fig. 10, a second location 1011b corresponding to a second expression region 1013b can be outside of the reference chemical structure 1010.

When there are a plurality of expression regions, i.e., the first expression region 1013a and the second expression region 1013b, on the image 1012, the neural network device 100 may obtain coordinate information corresponding to each of the expression regions on the image 1012. For example, with the lower-left corner of the image 1012 as the origin (0, 0), the coordinate information of the first expression region 1013a may be (3, 3) and the coordinate information of the second expression region 1013b may be (5, 3).

The neural network device 100 may output the coordinate information (4, 3) of the center point 1014 based on the coordinate information corresponding to the plurality of expression regions, i.e., the first expression region 1013a and the second expression region 1013b. The neural network device 100 can change the pixel values of the center point 1014 and/or the pixel values around the center point 1014 to improve the properties of the reference chemical structure 1010. When the pixel values of the center point 1014 and/or the pixel values around the center point 1014 are changed, the structure of a specific location 1015 on the reference chemical structure 1010 corresponding to the center point 1014 may be changed. In an embodiment, the pixel values of the center point 1014 and/or the pixel values around the center point 1014 may be changed by using Gaussian noise.
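
The center point is simply the (rounded) mean of the expression-region coordinates; a minimal sketch reproducing the example above:

```python
def center_point(coords):
    """Return the rounded centroid of (x, y) expression-region coordinates,
    e.g., [(3, 3), (5, 3)] -> (4, 3)."""
    xs, ys = zip(*coords)
    return round(sum(xs) / len(xs)), round(sum(ys) / len(ys))

assert center_point([(3, 3), (5, 3)]) == (4, 3)
```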

The neural network device 100 may change the pixel values of the center point 1014 and/or the pixel values around the center point 1014 and output a new image 1022. Referring to fig. 10, when the pixel values of the center point 1014 and/or the pixel values around the center point 1014 are changed in the new image 1022, the neural network device 100 may apply a new local structure 1021 corresponding to the changed pixel values to the specific location 1015, thereby generating a new chemical structure 1020.

The neural network device 100 may input an image 1022 of the new chemical structure 1020 into the neural network and output a property value (i.e., "emission wavelength: 325 nm") corresponding to the image 1022 input into the neural network. That is, the neural network device 100 may improve properties by altering the local structure of the reference chemical structure 1010 and creating a new chemical structure 1020.

Fig. 11 is a flowchart of a method of generating a new chemical structure by changing a descriptor for the chemical structure in a neural network device, according to an embodiment.

The method of generating a chemical structure in a neural network device relates to the embodiments described above with reference to the drawings, and thus, although omitted in the following description, the description given above with reference to the drawings is also applicable to the method shown in fig. 11.

Referring to fig. 11, in operation 1110, a neural network device may obtain descriptors for a reference chemical structure.

A descriptor is an indicator value representing the structural characteristics of a chemical structure, and can be obtained by performing relatively simple operations on a given chemical structure. In an embodiment, a descriptor may be represented by an extended-connectivity fingerprint (ECFP) and may include a plurality of bit values. However, the expression of the descriptor is not limited thereto.
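
For example, an ECFP-style bit vector can be computed with RDKit; the SMILES string, radius, and bit length below are arbitrary illustrative choices, and RDKit itself is only one possible tool, not one prescribed by this document:

```python
from rdkit import Chem
from rdkit.Chem import AllChem

# Illustrative ECFP (Morgan) fingerprint; the molecule and parameters are arbitrary.
mol = Chem.MolFromSmiles("c1ccc2ccccc2c1")                      # naphthalene, as an example
fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=1024)  # radius 2, 1024 bits
bits = list(fp)                                                 # list of 0/1 bit values
```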

Hereinafter, the descriptor for the reference chemical structure will be referred to as a reference descriptor.

In operation 1120, the neural network device may input a reference descriptor into the trained neural network and output a property value for a particular property of the reference chemical structure.

A property refers to a characteristic possessed by a chemical structure and may be a real numerical value, such as a transmission wavelength or an emission wavelength with respect to light. Unlike the computation of descriptors, the computation of properties may require complex simulations and can be time consuming.

The memory of the neural network device may store, as one matched group, a descriptor for a specific chemical structure and a property value numerically representing a property of that chemical structure.

In an embodiment, the neural network device may allow a neural network (e.g., DNN) to learn by using descriptors and property values stored in memory. In a learning process using the descriptor and the property value, a factor defining a relationship between the descriptor and the property value may be determined in the neural network.

The neural network device may output a property value corresponding to the reference descriptor as output data of the neural network by: inputting the reference descriptor as input data for a trained neural network, and driving the neural network.
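
A minimal PyTorch sketch of such a property-prediction network trained on the stored (descriptor, property value) pairs; the layer sizes and optimizer are assumptions, while the linear activations and MSE loss follow the setup described below for applying the LRP technique:

```python
import torch
from torch import nn

# Descriptor bits in, one property value (e.g., emission wavelength) out.
model = nn.Sequential(nn.Linear(1024, 256), nn.Linear(256, 1))  # linear activations
loss_fn = nn.MSELoss()                                          # MSE for optimization
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

def train_step(descriptors, property_values):
    """One gradient step on a batch of stored (descriptor, property value) pairs."""
    optimizer.zero_grad()
    pred = model(descriptors)                 # descriptors: float tensor [batch, 1024]
    loss = loss_fn(pred, property_values)     # property_values: float tensor [batch, 1]
    loss.backward()
    optimizer.step()
    return loss.item()
```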

In operation 1130, the neural network device may determine an expression region in the reference descriptor that expresses a particular property.

The neural network device may perform an interpretation process to determine whether a particular property value is expressed by any local structure in the reference chemical structure.

In an embodiment, the neural network device may interpret the trained neural network by using the LRP technique. The LRP technique is a method of propagating correlation (relevance) through a trained neural network in the reverse direction, i.e., from the output layer toward the input layer. In the LRP technique, when the correlation is propagated between layers, the node of the lower layer having the largest correlation with a node of the upper layer receives the largest share of correlation from that upper-layer node.

For application of LRP techniques, the neural network device may select the activation function applied to the nodes of the trained neural network as a linear function, and may select the MSE for optimization.

A plurality of nodes of an input layer of the neural network may respectively correspond to the bit values constituting the descriptor. Through the interpretation process, the neural network device may obtain the node of the input layer having the greatest correlation with the expression of the specific property value of the reference chemical structure, that is, a bit position (or expression region) of the reference descriptor. Since the expression region of the reference descriptor corresponds to a specific local structure in the reference chemical structure, the neural network device may determine the specific local structure having the greatest correlation with the expression of the specific property value by obtaining the expression region of the reference descriptor through the interpretation process.
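
A minimal NumPy sketch of one ε-stabilized LRP step through a single linear layer; the ε-stabilizer is an added assumption, as the document only specifies the linear activation and MSE:

```python
import numpy as np

def lrp_linear(x, W, b, relevance_out, eps=1e-6):
    """Redistribute the relevance of the outputs of a linear layer z = x @ W + b
    back to its inputs, in proportion to each input's contribution."""
    z = x @ W + b                                     # pre-activations, shape (k,)
    stabilizer = eps * np.where(z >= 0, 1.0, -1.0)
    s = relevance_out / (z + stabilizer)              # relevance per output unit
    return x * (W @ s)                                # relevance per input node

# The input position with the largest relevance is taken as the expression region:
# expression_region = int(np.argmax(input_relevance))
```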

In operation 1140, the neural network device may generate a new chemical structure by altering a local structure in the reference chemical structure corresponding to the expression region.

The neural network device may receive a target property value as an input. In an embodiment, the neural network device may include a user interface, which is a tool for inputting data for controlling the neural network device. For example, the user interface may be a key pad, a touch pad, etc., but is not limited thereto.

The target property value is a numerical value of a specific property of the chemical structure to be finally generated in the neural network device. In embodiments, the target property value may be a refractive index value, an elastic modulus, a melting point, a transmission wavelength, and/or an emission wavelength. For example, the neural network device may receive "transmission wavelength: 350 nm" as the target property value. Alternatively, the target property value may be set as an increasing (+) direction or a decreasing (-) direction, rather than as a specific numerical value.

The neural network device may generate a new chemical structure having a property value close to the target property value by changing a local structure in the reference chemical structure.

In an embodiment, the neural network device may output a new descriptor by changing a bit value of the expression region of the reference descriptor. When the bit value of the expression region of the reference descriptor is changed, the local structure in the reference chemical structure may be changed. The bit value of the expression region may be changed by using, for example, a genetic algorithm, but the method is not limited thereto.

The neural network device may output a structural feature value corresponding to the new descriptor as output data of the neural network by: inputting a new descriptor in which a bit value of an expression region of the reference descriptor is changed as input data of a trained neural network (e.g., RNN), and driving the neural network. The neural network device may generate a new chemical structure based on the outputted structural feature values. Alternatively, the neural network device may use the factors for the new descriptors output in the learning process of the DNN as input data to a trained neural network (e.g., RNN).
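
The document does not fix the architecture of this structure-generating network; one common realization, shown here purely as an assumption, is an RNN decoder that conditions on the (changed) descriptor and emits structure tokens such as SMILES characters one step at a time:

```python
import torch
from torch import nn

class StructureDecoder(nn.Module):
    """Illustrative GRU decoder: conditions on a descriptor vector and predicts
    one structure token per step. All sizes are assumptions."""
    def __init__(self, descriptor_dim=1024, vocab_size=40, hidden=256):
        super().__init__()
        self.init_hidden = nn.Linear(descriptor_dim, hidden)
        self.embed = nn.Embedding(vocab_size, hidden)
        self.gru = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, descriptor, token_ids):
        h0 = torch.tanh(self.init_hidden(descriptor)).unsqueeze(0)  # [1, batch, hidden]
        emb = self.embed(token_ids)                                 # [batch, steps, hidden]
        out, _ = self.gru(emb, h0)
        return self.out(out)                                        # token logits per step
```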

The neural network device may repeatedly generate chemical structures through the above-described process until a chemical structure having a property value close to a target property value (e.g., "emission wavelength: 350 nm") is generated.

In particular, the neural network device may compare the property value for the new chemical structure with a target property value, and when the property value for the new chemical structure is less than the target property value, regenerate the new chemical structure by changing the bit value of the expression region of the reference descriptor.

When the property value of the new chemical structure generated through the above-described process is equal to or greater than the target property value, the neural network device may store the generated new chemical structure in the memory.

Fig. 12 is a flowchart of a method of generating a new chemical structure by changing an image for the chemical structure in a neural network device according to an embodiment.

Hereinafter, the same descriptions as those given with reference to fig. 11 are omitted.

Referring to fig. 12, in operation 1210, the neural network device may obtain an image for a reference chemical structure.

In an embodiment, the image for the reference chemical structure may comprise n × m pixels (where n and m are natural numbers). For example, 8 bits, i.e., values from 0 (black) to 255 (white), may be assigned to each pixel of the image.

Hereinafter, the image for the reference chemical structure will be referred to as a reference image.

In operation 1220, the neural network device may input a reference image into the trained neural network and output a property value for a particular property of the reference chemical structure.

The memory of the neural network device may store, as one matched group, an image for a specific chemical structure and a property value numerically representing a property of that chemical structure.

In an embodiment, the neural network device may allow a neural network (e.g., DNN) to learn by using images and property values stored in memory. In a learning process using the image and the property value, a factor defining a relationship between the image and the property value may be determined in the neural network.

The neural network device may output the property value corresponding to the reference image as output data of the neural network by: inputting the reference image as input data to a trained neural network, and driving the neural network.

In operation 1230, the neural network device may determine an expression region in the image that expresses a particular property.

A plurality of nodes of an input layer of the neural network may respectively correspond to the pixel values constituting the image. Through the interpretation process, the neural network device may obtain the node of the input layer having the greatest correlation with the expression of the specific property value of the reference chemical structure, that is, a pixel coordinate (or expression region) of the reference image. Since the expression region of the reference image corresponds to a specific local structure in the reference chemical structure, the neural network device can determine the specific local structure having the greatest correlation with the expression of the specific property value by obtaining the expression region of the reference image through the interpretation process.

In operation 1240, the neural network device may generate a new chemical structure by altering a local structure in the reference chemical structure corresponding to the expression region.

In an embodiment, the neural network device may generate a new image by changing pixel values of an expression region of a reference image and/or pixel values around the expression region. When the pixel values of the expression region of the reference image and/or the pixel values around the expression region are changed, the local structure in the reference chemical structure may be changed. In an embodiment, the pixel value of the expression region of the reference image and/or the pixel values around the expression region may be changed by using gaussian noise, but the method of changing the pixel values is not limited thereto.

The neural network device may output a structural feature value corresponding to a new image as output data of the neural network by: inputting a new image in which pixel values of an expression region of the reference image and/or pixel values around the expression region are changed as input data of a trained neural network (e.g., RNN), and driving the neural network. The neural network device may generate a new chemical structure based on the outputted structural feature values. Alternatively, the neural network device may use factors for the new image output in the learning process of the DNN as input data to the trained neural network (e.g., RNN).

The neural network device may repeatedly generate chemical structures through the above-described process until a chemical structure having a property value close to a target property value (e.g., "emission wavelength: 350 nm") is generated.

In particular, the neural network device may compare the property value for the new chemical structure with a target property value, and when the property value for the new chemical structure is less than the target property value, regenerate the new chemical structure by changing the pixel values of the expression region of the reference image and/or the pixel values around the expression region.

When the property value of the new chemical structure generated through the above-described process is equal to or greater than the target property value, the neural network device may store the generated new chemical structure in the memory.

According to the above embodiments, the trained neural network can be interpreted to identify the local structures that express a property of the chemical structure. In addition, by changing the identified local structures, new chemical structures with improved properties can be generated.

Further, the foregoing embodiments may be embodied in the form of a recording medium storing instructions, such as program modules, executable by a computer. Computer-readable media can be any recording media that can be accessed by the computer and can include volatile and nonvolatile media as well as removable and non-removable media. Additionally, the computer-readable media may include computer storage media and communication media. Computer storage media include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. Communication media include computer-readable instructions, data structures, program modules, or other data in a modulated data signal or other transport mechanism, and may include any transmission media.

In addition, throughout the specification, the term "unit" may be a hardware component such as a processor or a circuit and/or a software component executed by a hardware component such as a processor.

The above description of the present disclosure is provided for the purpose of illustration, and it will be understood by those skilled in the art that various changes and modifications may be made without changing the technical concept and essential features of the present disclosure. It is therefore to be understood that the foregoing illustrative embodiments are illustrative in all respects and not restrictive of the disclosure. For example, components described as a single type may be implemented in a distributed manner. Also, components described as distributed may be implemented in a combined manner.

It is to be understood that the embodiments described herein are to be considered in a descriptive sense only and not for purposes of limitation. Descriptions of features or aspects in various embodiments should typically be considered as available for other similar features or aspects in other embodiments.

Although one or more embodiments have been described with reference to the accompanying drawings, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope as defined by the following claims.
