Information processing device, information processing system, method, and program

Document No.: 914505   Publication date: 2021-02-26

Note: This technology, "Information processing device, information processing system, method, and program", was created by 后藤安章 on 2018-07-23. Its main content is as follows: An inspection data receiving unit (120) receives, from a portable device, inspection data collected by the portable device during an inspection operation for a diagnostic object. A learning unit (130) inputs inspection data collected in the past to a neural network and causes the neural network to learn to diagnose whether or not the diagnostic object is damaged. A diagnosis unit (140) inputs the inspection data received by the inspection data receiving unit to the neural network and diagnoses whether or not the diagnostic object is damaged based on the output of the neural network. An instruction unit (150) determines the content of an instruction for the inspection work based on the diagnosis result obtained by the diagnosis unit (140), and transmits an instruction corresponding to the determined result to the portable device.

1. An information processing apparatus having:

an inspection data receiving unit that receives, from a portable device, inspection data collected by the portable device in an inspection operation for a diagnostic object;

a learning unit that inputs inspection data collected in the past to a neural network and causes the neural network to learn to diagnose whether or not the diagnostic object is damaged;

a diagnosis unit that inputs the inspection data received by the inspection data receiving unit to the neural network and diagnoses whether or not the diagnostic object is damaged based on the output of the neural network; and

an instruction unit that transmits, to the portable device, an instruction corresponding to a diagnosis result obtained by the diagnosis unit diagnosing the diagnostic object.

2. The information processing apparatus according to claim 1,

the inspection data includes image data obtained by imaging the diagnostic object by the portable device and sound data of a hitting sound generated by a hammering inspection performed by an operator,

and the diagnosis unit diagnoses whether or not the diagnostic object is damaged based on the image data and the sound data of the hitting sound received by the inspection data receiving unit.

3. The information processing apparatus according to claim 2,

the instruction unit supplies information indicating a position where an operator taps the diagnostic object to the portable device, and causes a display device of the portable device to display the position where the operator taps the diagnostic object.

4. The information processing apparatus according to claim 2 or 3,

the instruction unit determines whether or not the intensity of the tap of the operator on the diagnostic object is appropriate based on the sound data of the tap sound acquired from the portable device, and if it is determined that the intensity of the tap of the operator on the diagnostic object is not appropriate, supplies an instruction to change the tap intensity to the portable device, and notifies the operator of the instruction via the portable device.

5. The information processing apparatus according to any one of claims 1 to 4,

further comprising a treatment information storage unit that stores information representing the content of a treatment for the diagnostic object in association with a diagnosis result,

the instruction unit transmits, to the portable device, an instruction to perform the treatment that is associated, in the treatment information storage unit, with a diagnosis result matching the diagnosis result of the diagnostic object obtained by the diagnosis unit.

6. The information processing apparatus according to any one of claims 1 to 5,

the instruction unit supplies information indicating a route to the diagnostic object to the portable device, and causes a display device of the portable device to display the route to the diagnostic object.

7. The information processing apparatus according to claim 6,

the instruction unit transmits an instruction indicating an inspection operation to the portable device when it is determined that the operator has reached the vicinity of the diagnostic object based on the information indicating the current position of the operator received from the portable device.

8. An information processing system having a portable device and an information processing apparatus capable of communicating with each other,

the portable device has a unit that transmits, to the information processing apparatus, inspection data collected in an inspection operation for a diagnostic object,

the information processing apparatus includes:

an inspection data receiving unit that receives the inspection data from the portable device;

a diagnosis unit that inputs the inspection data to a neural network and diagnoses whether or not the diagnostic object is damaged based on an output of the neural network; and

an instruction unit that transmits an instruction corresponding to a diagnosis result of the diagnosis unit to the portable device.

9. A method performed by a computer capable of communicating with a portable device, the method comprising the steps of:

receiving, from the portable device, inspection data collected by the portable device in an inspection operation for a diagnostic object;

inputting the inspection data to a neural network, and diagnosing whether or not the diagnostic object is damaged based on the output of the neural network; and

transmitting, to the portable device, an instruction corresponding to the diagnosis result obtained in the diagnosing step.

10. A program that causes a computer to perform the operations of:

receiving, from a portable device, inspection data collected by the portable device in inspection work for a diagnostic object,

inputting the inspection data to a neural network, and diagnosing whether or not the diagnostic object is damaged based on the output of the neural network,

transmitting an instruction corresponding to the diagnosis result to the portable device.

Technical Field

The invention relates to an information processing apparatus, an information processing system, a method and a program.

Background

In the case of plant equipment, maintenance and inspection of the equipment are indispensable for maintaining safety and quality. Therefore, for example, in a chemical plant, in order to inspect pipes, tanks, and the like, a plurality of workers perform inspection while making rounds in the plant.

Nondestructive inspections performed by operators on a daily basis include visual inspections and hammering inspections. For example, an operator visually checks a pipe, or determines whether damage has occurred in the pipe based on the hitting sound generated when the pipe is struck. Since visual inspection and hammering inspection are manual work, the operator needs experience and skill to accurately judge whether an abnormality has occurred. Therefore, a worker with little experience performs the same operations as a skilled worker in order to accumulate experience. In addition, when no skilled worker is present, a less experienced worker may have to perform the work while reading a manual or receiving telephone instructions from the control management room.

Patent document 1 describes an information processing system in which an indicated value of a measuring instrument read by an operator, and an adjustment amount for adjusting an operation parameter of a production apparatus, are transmitted from a wearable computer carried by the operator to a host computer in the factory. The host computer transmits the collected indicated values of the measuring instruments and the operators' adjustment amounts of the operation parameters to wearable computers carried by other operators.

Patent document 1: Japanese Laid-Open Patent Publication No. 2001-22505

Disclosure of Invention

In patent document 1, only the indicated values of the measuring instruments and the adjustment amounts of the operating parameters are shared among the operators; ultimately, each operator must determine whether or not there is an abnormality based on the indicated values of the instruments assigned to that operator. However, it is difficult for a worker with little experience to make an appropriate judgment.

The present invention has been made in view of the above circumstances, and an object thereof is to diagnose a diagnostic object based on inspection data and present an appropriate inspection procedure to the operator.

In order to achieve the above object, in the information processing apparatus according to the present invention, the inspection data receiving unit receives, from the portable device, inspection data collected by the portable device during inspection work on the diagnostic object. The learning unit inputs inspection data collected in the past to the neural network and causes the neural network to learn to diagnose whether or not the diagnostic object is damaged. The diagnosis unit inputs the inspection data received by the inspection data receiving unit to the neural network and diagnoses whether or not the diagnostic object is damaged based on the output of the neural network. The instruction unit transmits, to the portable device, an instruction corresponding to the diagnosis result obtained by the diagnosis unit.

Advantageous Effects of Invention

The information processing apparatus according to the present invention inputs inspection data, collected by a portable device during an inspection operation on a diagnostic object, to a neural network, and diagnoses whether or not the diagnostic object is damaged based on the output of the neural network. An instruction corresponding to the diagnosis result is transmitted to the portable device. With this configuration, the diagnostic object can be diagnosed based on the inspection data, and an appropriate inspection procedure can be presented to the operator.

Drawings

Fig. 1 is a block diagram showing a hardware configuration of an information processing system according to an embodiment.

Fig. 2 is a diagram showing an example of the configuration of the portable device according to the embodiment.

Fig. 3A is a functional block diagram of an information processing apparatus according to an embodiment.

Fig. 3B is a diagram showing an example of data of the treatment table according to the embodiment.

Fig. 4A is a diagram showing an example of an image displayed on a display panel of a portable device according to an embodiment.

Fig. 4B is a diagram showing another example of an image displayed on the display panel of the portable device according to the embodiment.

Fig. 5 is a diagram showing an example of an image showing an instruction for hammering inspection according to the embodiment.

Fig. 6 is a diagram showing another example of an image showing an instruction for hammering inspection according to the embodiment.

Fig. 7 is a diagram showing an example of an image showing an instruction related to a maintenance job according to the embodiment.

Fig. 8 is a diagram showing an example of an image of a factory floor map displayed on a display of an information processing device according to an embodiment.

Fig. 9 is a flowchart showing the inspection work process according to the embodiment.

Fig. 10 is a diagram showing an example of data of the treatment table according to Modification 1.

Detailed Description

Hereinafter, the information processing system 1 according to the embodiment of the present invention will be described in detail with reference to the drawings.

(Embodiment)

As shown in fig. 1, an information processing system 1 according to embodiment 1 of the present invention includes an information processing apparatus 100 and a plurality of portable devices 200. The information processing system 1 is a system for managing maintenance and repair operations in, for example, a chemical plant, a steel mill, or the like. The information processing apparatus 100 diagnoses whether or not damage has occurred in the equipment in the plant based on the inspection data of the equipment in the plant collected by the portable device 200.

The information processing apparatus 100 is installed in a control management room of the factory. The portable devices 200 are carried by the workers 50 who perform the inspection work while making rounds of the factory area. As shown in fig. 2, the portable device 200 is a wearable computer worn by the worker 50. Specifically, the portable device 200 is integrally attached to the helmet 10 worn by the operator 50. As shown in fig. 1, the information processing apparatus 100 performs wireless communication with the portable device 200. For example, the information processing apparatus 100 transmits an instruction for the operator 50 to the portable device 200. The portable device 200 transmits the collected inspection data to the information processing apparatus 100. The inspection data is data indicating the state of the diagnostic object. In the embodiment, the inspection data includes sound data of the hitting sound of the hammering inspection performed by the operator 50, image data obtained by imaging the diagnostic object, and measured values of the gas concentration.

As a hardware configuration, the information processing apparatus 100 has: a memory 101 that stores various data; an input device 102 that detects an input operation by a user; an output device 103 that outputs an image to a display device; a wireless communication circuit 104 that performs wireless communication with another apparatus; and a processor 105 that controls the entire information processing apparatus 100. The memory 101, the input device 102, the output device 103, and the wireless communication circuit 104 are connected to the processor 105 via the bus 109, and communicate with the processor 105.

The memory 101 includes volatile memory and nonvolatile memory, and the memory 101 stores programs and various data. In addition, the memory 101 is used as a working memory of the processor 105. In addition, the memory 101 stores a program 1000 for realizing the inspection job processing in the information processing apparatus 100.

The input device 102 includes a keyboard, a mouse, a touch panel, and the like, detects an input operation by a user in the control management room, and outputs a signal indicating the detected input operation by the user to the processor 105.

The output device 103 includes a display, a touch panel, and the like, and displays an image based on a signal supplied from the processor 105. The output device 103 displays a map of a factory floor of a plant on a display in response to, for example, an operation by a user in the control management room. The output device 103 displays image data captured by the portable device 200 on a display.

The wireless communication circuit 104 has an antenna 104a, and the wireless communication circuit 104 includes a network interface circuit for performing wireless communication with other devices. The wireless communication circuit 104 converts data supplied from the processor 105 into an electric signal, and outputs the electric signal by carrying the electric signal on a radio wave. The wireless communication circuit 104 receives radio waves output from another device, restores electric signals carried by the radio waves into data, and outputs the data to the processor 105.

The processor 105 includes a CPU (Central Processing Unit), and executes various programs stored in the memory 101 to realize the various functions of the information processing apparatus 100. The processor 105 also has a dedicated processor for AI (Artificial Intelligence).

As shown in fig. 2, the portable device 200 is an AR (Augmented Reality) head-mounted display integrally attached to the helmet 10 worn by the worker 50. As shown in fig. 1, the portable device 200 has, as a hardware configuration: a memory 201 that stores various data; an output device 202 that presents information supplied from the information processing apparatus 100 to the operator 50; a collection device 203 that collects data relating to the inspection; a wireless communication circuit 204 that performs wireless communication with other apparatuses; and a processor 205 that controls the portable device 200 as a whole. The memory 201, the output device 202, the collection device 203, and the wireless communication circuit 204 are connected to the processor 205 via the bus 209, and communicate with the processor 205.

The memory 201 includes volatile memory and nonvolatile memory, and stores programs and various data. The memory 201 is used as a working memory for the processor 205. In addition, the memory 201 stores a program 2000 for implementing the inspection data collection process in the portable device 200. The memory 201 is housed in the main body portion 20 shown in fig. 2.

The output device 202 has a display panel 202a and a speaker 202b. The display panel 202a displays images received from the information processing apparatus 100 under the control of the processor 205. The image displayed on the display panel 202a is, for example, an image indicating a work instruction for the worker 50. As shown in fig. 2, in the embodiment, the display panel 202a is disposed in front of one eye of the worker 50 and is sized to cover that eye.

As shown in fig. 1, the speaker 202b outputs the sound received from the information processing apparatus 100 in accordance with the control of the processor 205. The sound output from the speaker 202b is, for example, a sound indicating a work instruction to the worker 50. The speaker 202b is housed in the main body 20 shown in fig. 2.

As shown in fig. 1, the collection device 203 includes a camera 203a, a microphone 203b, a gas detection sensor 203c, and a GPS (Global Positioning System) receiver 203d, and collects various data related to the inspection under the control of the processor 205.

As shown in fig. 2, the camera 203a is disposed on a side surface of the face of the operator 50 so that the lens faces in the same direction as the line of sight of the operator 50. The camera 203a photographs a diagnostic object existing in the direction faced by the operator 50. The camera 203a continuously performs shooting while the power is turned on, and outputs the shot image data to the processor 205.

As shown in fig. 2, the microphone 203b is located on the side of the face of the operator 50, attached at a position sandwiched between the helmet 10 and the camera 203a. The microphone 203b collects sounds made by the operator 50, hitting sounds generated by the hammering inspection, and the like. When sound input is detected, the microphone 203b outputs the collected sound to the processor 205.

The gas detection sensor 203c sucks in ambient air, measures the concentration of a predetermined gas in the air, and outputs the measured value to the processor 205. As shown in fig. 2, the gas detection sensor 203c is connected to the main body 20 via a cable 203e, through which it outputs the measured value to the processor 205. The cable allows the operator 50 to move the gas detection sensor 203c around and sample the surrounding gas.

The GPS receiver 203d shown in fig. 1 determines the current position of the operator 50 from satellite radio waves received from GPS satellites, and outputs position data indicating the determined position to the processor 205. The position data output by the GPS receiver 203d is represented by 3-dimensional coordinates. The GPS receiver 203d outputs information indicating the current position of the operator 50 to the processor 205 at predetermined intervals, for example, every minute. The GPS receiver 203d is housed in the main body unit 20 shown in fig. 2.

The wireless communication circuit 204 shown in fig. 1 has an antenna 204a, and the wireless communication circuit 204 performs wireless communication with wireless communication circuits of other devices. The wireless communication circuit 204 converts data supplied from the processor 205 into an electric signal, and outputs the electric signal by carrying the electric signal on a radio wave. The wireless communication circuit 204 receives a radio wave output from another device, restores an electric signal carried by the radio wave to data, and outputs the data to the processor 205. The wireless communication circuit 204 is housed in the main body portion 20 shown in fig. 2.

The processor 205 shown in fig. 1 includes a CPU, and the processor 205 executes various programs stored in the memory 201 to realize various functions of the portable device 200.

Further, the processor 205 executes the program 2000 to perform the inspection data collection process. The processor 205 transmits image data output from the camera 203a to the information processing apparatus 100 via the wireless communication circuit 204 at predetermined intervals. When the microphone 203b outputs sound data, the processor 205 transmits the sound data to the information processing apparatus 100 via the wireless communication circuit 204. When the gas detection sensor 203c outputs a measured value of the gas concentration, the processor 205 likewise transmits the measured value to the information processing apparatus 100 via the wireless communication circuit 204. In addition to the inspection data, the processor 205 transmits information indicating the current position of the worker 50, output from the GPS receiver 203d, to the information processing apparatus 100 via the wireless communication circuit 204 at predetermined intervals. The processor 205 is housed in the main body portion 20 shown in fig. 2.
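
The collection-and-transmission behavior of program 2000 described above can be sketched as follows. This is a minimal sketch, not the actual program: the `InspectionDataCollector` and `FakeRadio` classes and their method names are hypothetical stand-ins for the sensors 203a to 203d and the wireless communication circuit 204.

```python
class InspectionDataCollector:
    """Sketch of the portable device's collection process (program 2000)."""

    def __init__(self, radio):
        self.radio = radio

    def on_image(self, image_data):
        # Image data from the camera is transmitted at predetermined intervals.
        self.radio.send({"type": "image", "payload": image_data})

    def on_sound(self, sound_data):
        # Sound data is transmitted whenever the microphone detects input.
        self.radio.send({"type": "sound", "payload": sound_data})

    def on_gas(self, concentration):
        # Gas concentration measurements are forwarded as collected.
        self.radio.send({"type": "gas", "payload": concentration})

    def on_position(self, coords):
        # The current position (3-dimensional coordinates) is sent periodically.
        self.radio.send({"type": "position", "payload": coords})


class FakeRadio:
    """Stand-in for the wireless communication circuit 204."""
    def __init__(self):
        self.sent = []

    def send(self, msg):
        self.sent.append(msg)


radio = FakeRadio()
collector = InspectionDataCollector(radio)
collector.on_image(b"jpeg-bytes")
collector.on_gas(0.02)
collector.on_position((10.0, 5.0, 1.5))
print([m["type"] for m in radio.sent])  # → ['image', 'gas', 'position']
```

In an actual device the periodic transmissions would be driven by timers; here each event is simply invoked by hand to show the message flow.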

As shown in fig. 3A, the information processing apparatus 100 functionally has: a storage unit 110 that stores various data related to inspection and diagnosis; an inspection data receiving unit 120 that receives inspection data from the portable device 200; a learning unit 130 that performs deep learning; a diagnosis unit 140 that diagnoses whether or not the diagnostic object is damaged based on the inspection data received by the inspection data receiving unit 120 and the learning result of the learning unit 130; and an instruction unit 150 that determines the content of the instruction for the operator 50 and outputs the determined instruction to the portable device 200. The inspection data receiving unit 120 is an example of the inspection data receiving means of the present invention. The learning unit 130 is an example of the learning means of the present invention. The diagnosis unit 140 is an example of the diagnosis means of the present invention. The instruction unit 150 is an example of the instruction means of the present invention.

In the embodiment, the information processing apparatus 100 diagnoses whether or not damage has occurred in the diagnostic object using a trained neural network. The diagnosis result output by the information processing apparatus 100 is either that no damage has occurred in the diagnostic object or that damage has occurred in it.

The storage unit 110 stores various data related to inspection and diagnosis. Specifically, the storage unit 110 stores: an inspection manual 111 related to the inspection work; history data 112 relating to the inspections; various reference data 113 for inspection; a learning model 114 defining a neural network; learning data 115; and a treatment table 116 that stores data indicating the content of the treatment to be performed on the diagnostic object. The function of the storage unit 110 is realized by the memory 101. The treatment table 116 is an example of the treatment information storage unit of the present invention.

The inspection manual 111 includes information on the location of the inspection object and the inspection method. The inspection manual 111 includes map data of the factory area of the plant and position data indicating the positions, within the factory area, of the pipes, tanks, and the like that are diagnostic objects. The position data of a diagnostic object is represented by 3-dimensional coordinates.

The inspection manual 111 includes information indicating the positions at which the diagnostic object is struck during the hammering inspection. This information is also represented by 3-dimensional coordinates. For example, when the diagnostic object is a pipe extending through the factory area, the hammering inspection of a single pipe may require tapping at a plurality of sites. In such a case, the inspection manual 111 includes not only the positions at which the pipe is tapped but also information indicating the order in which they are tapped.
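
The hammering-inspection entries of the inspection manual 111 might be organized as below. This is a sketch only: the table name, the object identifier, and the coordinate values are illustrative assumptions, not taken from the source.

```python
# Hypothetical manual entries: each diagnostic object maps to the ordered
# list of 3-dimensional tapping positions prescribed for it.
INSPECTION_MANUAL = {
    "pipe-A": [(12.0, 3.0, 1.2), (14.5, 3.0, 1.2), (17.0, 3.0, 1.2)],
}

def tapping_sequence(target):
    """Return the tapping positions for a diagnostic object, in the
    prescribed order, as the inspection manual 111 records them."""
    return INSPECTION_MANUAL[target]

print(len(tapping_sequence("pipe-A")))  # → 3
```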

Specifically, the history data 112 includes: inspection history data containing the inspection data received from the portable device 200 and the date and time when the inspection was performed; information identifying the diagnostic object; and diagnosis result data. As described above, the inspection data includes image data obtained by imaging the diagnostic object, sound data of the hitting sound of the hammering inspection performed by the operator, and measured values of the gas concentration. The information identifying the diagnostic object is, for example, information indicating its position. The diagnosis result data includes data indicating the diagnosis result produced by the diagnosis unit 140 based on the inspection data, for example, whether or not the diagnostic object is damaged.

The reference data 113 includes a reference value indicating the appropriate magnitude of the hitting sound of the hammering inspection. When the magnitude of the hitting sound received from the portable device 200 is smaller than the reference value, the instruction unit 150 instructs the operator 50, via the portable device 200, to hit more strongly. The reference data 113 also stores a threshold value of the gas concentration, which serves as the reference for determining whether or not a gas leak has occurred.
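
The threshold comparisons described for the reference data 113 can be sketched as two simple checks. The numeric reference values here are placeholders; the source does not specify them.

```python
# Hypothetical reference values standing in for the reference data 113.
HIT_SOUND_REFERENCE = 0.5   # minimum appropriate hitting-sound magnitude
GAS_LEAK_THRESHOLD = 0.1    # gas concentration at or above which a leak is suspected

def check_hit_sound(magnitude):
    """Return an instruction string when the tap is too weak, else None,
    mirroring the instruction unit 150's check against the reference value."""
    if magnitude < HIT_SOUND_REFERENCE:
        return "hit more strongly"
    return None

def gas_leak_suspected(concentration):
    """Compare a measured gas concentration against the stored threshold."""
    return concentration >= GAS_LEAK_THRESHOLD

print(check_hit_sound(0.3))      # → hit more strongly
print(gas_leak_suspected(0.05))  # → False
```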

The learning model 114 contains information defining the shape and scale of the neural network. Specifically, the learning model 114 includes a mathematical expression representing the learning model, the total number of intermediate layers, the number of neurons (nodes) in each layer, a weighting coefficient for each neuron, and the like. In the embodiment, the storage unit 110 stores a learning model suitable for image recognition and a learning model suitable for sound recognition. This is because the diagnosis unit 140 inputs image data obtained by imaging the diagnostic object and sound data of the hitting sound to the trained neural networks, and diagnoses the diagnostic object based on their outputs. The weighting coefficient of each neuron of the learning models stored in the storage unit 110 is updated by the learning of the learning unit 130.

The learning data 115 includes inspection data collected in the past, classified according to the presence or absence of damage to the diagnostic object, for use in the learning of the learning unit 130 described later.

For example, assume that the diagnostic object is a pipe. In this case, the learning data 115 includes data combining image data of a damaged pipe, sound data of the hitting sound when the damaged pipe is struck, and a value indicating that damage exists. The learning data 115 also includes data combining image data of an undamaged pipe, sound data of the hitting sound when the undamaged pipe is struck, and a value indicating that no damage exists. The image data and the sound data included in the learning data 115 may be data collected by the portable device 200 during past inspections, or data collected by other devices, for example, a separate camera, microphone, or the like. The learning data 115 is grouped according to attributes of the diagnostic object, such as its material, its size, and the gas flowing inside it. The classification of each piece of inspection data in the learning data 115 as relating to a damaged or an undamaged object is based, for example, on a diagnosis made by a skilled operator from the image and sound data collected in past inspections.
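
A record of the learning data 115 as described above could be modeled as follows; all field names and values are illustrative assumptions.

```python
from dataclasses import dataclass

# Sketch of one learning data record: inspection data (image and
# hitting-sound data) paired with a damage label and the grouping
# attributes of the diagnostic object.
@dataclass
class LearningRecord:
    image_data: bytes
    hit_sound_data: bytes
    damaged: bool       # label judged by a skilled operator
    material: str       # grouping attributes of the diagnostic object
    size_mm: float
    internal_gas: str

rec = LearningRecord(b"jpeg-bytes", b"wav-bytes", True, "steel", 150.0, "methane")
print(rec.damaged)  # → True
```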

The treatment table 116 defines, for each diagnostic object, the treatment to be taken by the operator 50 when it is diagnosed that damage or the like has occurred in the diagnostic object. In the example shown in fig. 3B, closing the valve is defined as the treatment when gas leakage is diagnosed, and making a report is defined as the treatment when pipe damage is diagnosed. This is because, in the latter case, replacement of the pipe is required; after the report is made to the control management room, repair work, replacement work, and the like are performed on the pipe.
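
The treatment table of fig. 3B amounts to a lookup from diagnosis result to treatment content, which can be sketched as below using the two example entries from the text; the key and value strings are paraphrases, not literal table contents.

```python
# Sketch of the treatment table: diagnosis result → treatment content.
TREATMENT_TABLE = {
    "gas leakage": "close the valve",
    "pipe damage": "report to the control management room",
}

def instruction_for(diagnosis):
    """Look up the treatment matching a diagnosis result, as the
    instruction unit 150 does before transmitting to the portable device."""
    return TREATMENT_TABLE.get(diagnosis)

print(instruction_for("gas leakage"))  # → close the valve
```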

The inspection data receiving unit 120 receives inspection data related to the diagnostic object from the portable device 200. The functions of the inspection data receiving unit 120 are realized by the wireless communication circuit 104 and the processor 105. The inspection data receiving unit 120 outputs the inspection data received from the portable device 200 to the diagnosis unit 140 in association with the date and time of reception and the position data of the portable device 200. The inspection data receiving unit 120 also stores the received inspection data in the storage unit 110 in association with the date and time of reception and information identifying the portable device 200 that transmitted it.

The learning unit 130 inputs the learning data 115 into the neural network having the structure defined by the learning model 114, and determines the learning model used for diagnosis by the diagnosis unit 140 by adjusting the weighting coefficients of the neurons in the intermediate layer so that the output of the neural network approaches the true value obtained in advance, for example by the back propagation method. In the embodiment, since the presence or absence of damage is discriminated, the learning unit 130 arranges two neurons in the output layer. As the learning data, a plurality of sets of data are prepared, each combining past inspection data with a number indicating the presence or absence of damage to the diagnostic object. The learning unit 130 supplies the inspection data of each set to the neural network, and adjusts the weighting coefficients of the neurons of the intermediate layer and the output layer by the back propagation method so that the output layer neuron indicated by the corresponding number fires. Through this learning, the weighting coefficients of the learning model 114 are updated with the adjusted values. In this way, the neural network is made to learn the relationship between the input and the output. The function of the learning unit 130 is realized by the processor 105.
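
The learning procedure described above, two output neurons trained so that the neuron matching the label fires, can be illustrated with a toy single-layer softmax network trained by gradient descent. This is a didactic sketch, not the learning model 114: the single numeric feature and the training data are invented for illustration.

```python
import math

def softmax(z):
    # Numerically stable softmax over the two output neurons.
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

# Hypothetical past inspection data: one feature per sample (e.g. a
# summary of the hitting sound), paired with a label (0 = no damage,
# 1 = damage) judged in past inspections.
data = [(0.1, 0), (0.2, 0), (0.3, 0), (0.8, 1), (0.9, 1), (1.0, 1)]

w = [0.0, 0.0]  # one weight per output neuron
b = [0.0, 0.0]  # one bias per output neuron
lr = 0.5
for _ in range(500):
    for x, label in data:
        p = softmax([w[k] * x + b[k] for k in range(2)])
        for k in range(2):
            # Gradient of the cross-entropy loss for output neuron k.
            grad = p[k] - (1.0 if k == label else 0.0)
            w[k] -= lr * grad * x
            b[k] -= lr * grad

def diagnose(x):
    """The neuron with the larger output 'fires' and gives the diagnosis."""
    p = softmax([w[k] * x + b[k] for k in range(2)])
    return "damage" if p[1] > p[0] else "no damage"

print(diagnose(0.15), diagnose(0.95))  # → no damage damage
```

The real system uses deep networks for image and sound recognition; the point here is only the two-output-neuron arrangement and the gradient-based weight adjustment the paragraph describes.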

An example of the learning data 115 supplied to the neural network is as follows. For example, the learning data 115 includes: (a) a combination of image data of a damaged pipe and a value indicating that damage exists; (b) a combination of image data of an undamaged pipe and a value indicating that no damage exists; (c) a combination of sound data of the impact sound of a damaged pipe and a value indicating that damage exists; and (d) a combination of sound data of the impact sound of an undamaged pipe and a value indicating that no damage exists.

For the neural network using a learning model suited to image recognition, the weighting coefficients of the neurons in the intermediate layer are adjusted with the image data of (a) as input and the value indicating that damage exists as output, and again with the image data of (b) as input and the value indicating that no damage exists as output.

For the neural network using a learning model suited to sound recognition, the weighting coefficients of the neurons in the intermediate layer are adjusted with the sound data of (c) as input and the value indicating that damage exists as output, and again with the sound data of (d) as input and the value indicating that no damage exists as output. The above learning is an example, and the learning data 115 may include only some of the data in (a) to (d). Deep learning is also possible in this case.

Accordingly, when new inspection data is input to the neural network that has completed learning, the neural network outputs a value indicating whether or not the diagnosis target is damaged. That is, the neural network that has completed learning can determine the presence or absence of damage to the diagnosis target. In this way, the neural network that has completed learning can diagnose whether or not the pipe is damaged based on the image data and the sound data collected by the portable device 200. The learning by the learning unit 130 needs to be completed before the inspection data is supplied from the portable device 200.

The diagnosis unit 140 inputs the inspection data received from the portable device 200 to the neural network using the learning model for which the learning unit 130 has completed learning, and diagnoses whether or not the diagnosis target is damaged based on the output of the neural network. The function of the diagnosis unit 140 is realized by the processor 105. For example, the diagnosis unit 140 inputs a captured image of a pipe in the plant, taken by the portable device 200, to the neural network using the learned learning model, and diagnoses whether or not the diagnosis target is damaged based on the output value of the neural network. The diagnosis unit 140 outputs the result of the diagnosis to the instruction unit 150.

Similarly, the diagnosis unit 140 inputs the sound data of the impact sound collected by the portable device 200 to the neural network using the learned learning model, and diagnoses whether or not the diagnosis target is damaged based on the output value of the neural network. The diagnosis unit 140 outputs the result of the diagnosis to the instruction unit 150.

The diagnosis unit 140 determines that gas leakage has occurred when the measured value of the gas concentration measured by the portable device 200 exceeds the threshold defined by the reference data 113. The diagnosis unit 140 outputs the result of the diagnosis to the instruction unit 150.
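The gas-concentration comparison just described reduces to a single threshold test. The sketch below is illustrative only; the threshold value and the function name are assumptions, not values taken from the reference data 113.

```python
GAS_THRESHOLD_PPM = 50.0  # hypothetical threshold from the reference data

def diagnose_gas_leak(measured_ppm: float) -> bool:
    """Return True (gas leak) when the measured value exceeds the threshold."""
    return measured_ppm > GAS_THRESHOLD_PPM

print(diagnose_gas_leak(72.5))  # measurement above the threshold
print(diagnose_gas_leak(12.0))  # measurement below the threshold
```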

The instruction unit 150 determines the content of the instruction for the inspection work, and transmits an instruction for the inspection work corresponding to the determined result to the mobile device 200. Here, the instructions for the inspection work determined by the instruction unit 150 include (i) a work instruction that does not depend on the diagnosis result of the diagnosis unit 140 and (ii) a work instruction based on the diagnosis result of the diagnosis unit 140.

As an example of (i), the instruction unit 150 instructs the operator 50 to move to the position of the diagnosis target. In this case, the instruction unit 150 acquires the position data indicating the position of the operator 50 obtained by the GPS receiver 203d of the portable device 200, and acquires the position data of the diagnosis target from the inspection manual 111. The instruction unit 150 then transmits, to the portable device 200, position data indicating the current position of the operator 50 on the map of the plant premises, the position data of the diagnosis target, and an instruction to move to the position of the diagnosis target.

Accordingly, the processor 205 of the portable device 200 displays an image of the map of the plant premises and an image representing the position 30 of the diagnosis target, as shown in fig. 4A, on the display panel 202a. In the illustrated example, the current position of the operator 50 is represented by a black human-shaped image, and the position 30 of the diagnosis target is represented by a black star-shaped image. The path from the operator 50 to the position 30 of the diagnosis target is indicated by an arrow. The instruction unit 150 may also transmit, to the portable device 200, a signal indicating that the instruction is to be output as audio. In this case, the processor 205 of the portable device 200 outputs, from the speaker 202b, a sound instructing the operator to move to the specified position 30.

Further, while the operator 50 is moving, the instruction unit 150 may control the portable device 200 to display an image of an arrow indicating the route on the display panel 202a as shown in fig. 4B. In this case, the processor 205 of the portable device 200 displays the image of the arrow on the display panel 202a, so that the arrow image is displayed superimposed on the view of the actual path through the plant premises.

When the instruction unit 150 determines, based on the position data received from the portable device 200 and the map data of the plant premises included in the inspection manual 111, that the operator 50 has reached the position 30 of the diagnosis target, the instruction unit 150 transmits an instruction for the inspection work to the portable device 200. Specifically, in order to indicate the part to be tapped to the operator 50, the instruction unit 150 transmits to the portable device 200 the information, included in the inspection manual 111, on the position to be tapped during hammering inspection, together with an instruction to tap that position.

Accordingly, as shown in fig. 5, the processor 205 of the portable device 200 displays on the display panel 202a an image indicating the position to be tapped by the operator 50 and the instruction "tap". In the illustrated example, the diagnosis target is the pipe 1002 of the pipes 1001 and 1002, and the hatched portion indicates the position to be tapped. Thus, on the display panel 202a, the indication "tap" and the image indicating the position to be tapped are displayed superimposed on the actual pipes 1001 and 1002. The instruction unit 150 may also transmit, to the portable device 200, a signal indicating that the instruction is to be output as audio. In this case, the processor 205 of the portable device 200 outputs, from the speaker 202b, a sound instructing the operator to tap the diagnosis target.

Upon receiving the sound data of the impact sound from the portable device 200, the instruction unit 150 determines whether or not the sound is of sufficient magnitude, and if the impact sound is not of sufficient magnitude, transmits an instruction to tap more strongly to the portable device 200. Accordingly, the processor 205 of the portable device 200 displays the instruction to tap more strongly on the display panel 202a as shown in fig. 6. In the illustrated example, "tap slightly harder", indicating the content of the instruction, is displayed. The instruction unit 150 may also transmit, to the portable device 200, a signal indicating that the instruction is to be output as audio. In this case, the processor 205 of the portable device 200 outputs, from the speaker 202b, a sound instructing the operator to tap more strongly.

The instruction unit 150 also transmits an instruction to detect the gas concentration to the portable device 200. Accordingly, the processor 205 of the portable device 200 displays, on the display panel 202a, an instruction to detect the gas concentration using the gas detection sensor 203c. The instruction unit 150 may also transmit, to the portable device 200, a signal indicating that the instruction is to be output as audio. In this case, the processor 205 of the portable device 200 outputs, from the speaker 202b, a sound instructing the operator to detect the gas concentration using the gas detection sensor 203c.

In addition, when the result of diagnosis by the diagnosis unit 140 indicates that damage exists, the instruction unit 150 issues (ii) a work instruction based on the diagnosis result as follows. For example, when the diagnosis unit 140 determines that a gas leak has occurred, the instruction unit 150 transmits to the portable device 200 an instruction for the treatment to be performed by the operator 50, based on the diagnosis result and the treatment table 116. Since the diagnosis result indicates a gas leak, the treatment to be taken is, as shown in fig. 3B, the operation of rotating the valve clockwise. The instruction unit 150 transmits information indicating the position of the target valve and information indicating the direction in which the valve is to be rotated to the portable device 200.

Accordingly, as shown in fig. 7, the processor 205 of the portable device 200 displays on the display panel 202a, for the valve to be operated, the instruction "rotate and close" and an image of an arrow indicating the direction in which the valve is to be rotated. Thus, the display panel 202a displays the instruction "rotate and close" and the arrow image indicating the rotation direction superimposed on the actual valve. The instruction unit 150 may also transmit, to the portable device 200, a signal indicating that the instruction is to be output as audio. In this case, the processor 205 of the portable device 200 outputs, from the speaker 202b, a sound representing the instruction to close the valve.

The instruction unit 150 displays screens such as those shown in figs. 4A and 8 on the display of the output device 103 so that a supervisor in the control management room can confirm the position of the operator 50 within the plant premises. In fig. 4A, the positions of all the operators carrying portable devices 200 are displayed on the map of the plant premises. In fig. 8, the moving operator 50, indicated by a broken line, and an arrow indicating the movement path are displayed. The supervisor can therefore confirm the movement of the operator 50 who has received the movement instruction. The function of the instruction unit 150 is realized by the wireless communication circuit 104 and the processor 105.

The inspection work process in which the information processing device 100 having the above-described configuration instructs the operator 50 in cooperation with the portable device 200 will be described with reference to fig. 9.

As shown in fig. 4A, assume that the operator 50 is within the plant premises wearing the helmet 10 to which the portable device 200 is attached. The operator 50 carries a hammer for hammering inspection. Here, the information processing device 100 instructs the operator 50 to inspect the diagnosis target. The learning unit 130 has completed the learning of the neural network, and the learned learning model is stored in the storage unit 110.

As described above, the camera 203a, the microphone 203b, and the gas detection sensor 203c of the portable device 200 acquire inspection data at predetermined timings. The processor 205 of the portable device 200 transmits the inspection data to the information processing apparatus 100. The GPS receiver 203d also acquires position data at a predetermined timing, and the processor 205 transmits the position data to the information processing apparatus 100.

The instruction unit 150 transmits to the portable device 200 an instruction for the operator 50 to move to the position 30 of the diagnosis target (step S11). Specifically, the instruction unit 150 transmits to the portable device 200 a signal including map data of the plant, coordinate values indicating the current position of the operator 50, coordinate values indicating the position 30 of the diagnosis target, and a movement instruction. In response, the portable device 200 displays an image of the map of the plant premises and the instruction to move to the position, as shown in fig. 4A, on the display panel 202a. The processor 205 of the portable device 200 also outputs, from the speaker 202b, a sound instructing the operator to move to the position 30. The operator 50 moves to the designated position 30 in accordance with the instruction.

The instruction unit 150 determines whether or not the operator 50 has reached the specified position 30 based on the position data received from the portable device 200 and the map data of the plant premises included in the inspection manual 111 (step S12). If it is determined that the operator 50 has reached the designated position 30 (step S12; Yes), the instruction unit 150 determines whether or not image data obtained by imaging the diagnosis target has been received from the portable device 200 (step S13). As described above, the camera 203a of the portable device 200 continuously performs shooting, and the portable device 200 transmits image data to the information processing apparatus 100 at predetermined intervals. If the instruction unit 150 determines that the image data has been received (step S13; Yes), it stores the received image data in the storage unit 110 for use in the diagnosis process. On the other hand, if the instruction unit 150 determines that image data has not been received from the portable device 200 (step S13; No), it waits until the image data is received.
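One plausible form of the arrival determination in step S12 is a distance check between the GPS position and the target coordinates. The coordinate values, units, and tolerance below are illustrative assumptions, not values from the inspection manual 111.

```python
import math

ARRIVAL_TOLERANCE_M = 5.0  # hypothetical arrival tolerance in metres

def has_reached(operator_xy, target_xy, tolerance=ARRIVAL_TOLERANCE_M):
    """True when the operator's position lies within the tolerance of the target position."""
    dx = operator_xy[0] - target_xy[0]
    dy = operator_xy[1] - target_xy[1]
    return math.hypot(dx, dy) <= tolerance

print(has_reached((100.0, 202.0), (100.0, 200.0)))  # 2 m from the target
print(has_reached((50.0, 50.0), (100.0, 200.0)))    # far from the target
```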

The instruction unit 150 transmits to the portable device 200 an instruction for the operator 50 to perform hammering inspection (step S14). As the instruction for hammering inspection, a signal including the information, contained in the inspection manual 111, indicating the portion of the inspection target to be struck is transmitted to the portable device 200. Based on the signal received from the instruction unit 150, the portable device 200 displays an image indicating the position to be tapped on the display panel 202a as shown in fig. 5. The portable device 200 also outputs, from the speaker 202b, a sound instructing the operator to tap the diagnosis target. The operator 50 performs hammering inspection as instructed, and the portable device 200 transmits the resulting sound data to the information processing apparatus 100.

The instruction unit 150 determines whether or not sound data of the impact sound has been received from the portable device 200 (step S15). Here, the instruction unit 150 determines whether or not the magnitude of the received impact sound is equal to or larger than the reference value, included in the reference data 113, indicating an appropriate loudness for hammering inspection.

When the magnitude of the impact sound received from the portable device 200 is smaller than the reference value, the instruction unit 150 determines that sound data of an impact sound of appropriate magnitude has not been received (step S15; No). In this case, the instruction unit 150 again issues an instruction concerning hammering inspection (step S14). Specifically, the instruction unit 150 transmits an instruction to tap the diagnosis target more strongly to the portable device 200. In response, the portable device 200 displays information instructing the operator to tap more strongly on the display panel 202a as shown in fig. 6, and outputs a corresponding sound from the speaker 202b. The operator 50 taps the diagnosis target more strongly in accordance with the instruction.
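The loudness check and re-instruction decision in steps S14 and S15 might look like the following sketch. The reference value and the use of peak amplitude as the loudness measure are assumptions for illustration.

```python
REFERENCE_AMPLITUDE = 0.3  # hypothetical reference value from the reference data

def instruction_for_impact(samples):
    """Return the next instruction based on the peak amplitude of the impact sound."""
    peak = max(abs(s) for s in samples)
    if peak < REFERENCE_AMPLITUDE:
        return "tap more strongly"  # corresponds to step S15; No -> repeat step S14
    return "accepted"               # corresponds to step S15; Yes

print(instruction_for_impact([0.05, -0.1, 0.08]))  # impact sound too quiet
print(instruction_for_impact([0.2, -0.6, 0.4]))    # impact sound loud enough
```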

If the instruction unit 150 determines that sound data of an impact sound of sufficient magnitude has been received from the portable device 200 (step S15; Yes), it stores the sound data in the storage unit 110. Next, the instruction unit 150 transmits an instruction to measure the gas concentration to the portable device 200 (step S16). In response, the portable device 200 displays information instructing measurement of the gas concentration on the display panel 202a and outputs a corresponding sound from the speaker 202b. The operator 50 measures the gas concentration using the gas detection sensor 203c in accordance with the instruction, and the portable device 200 transmits the data of the measured value to the information processing apparatus 100.

The instructing unit 150 determines whether or not data of the measured value of the gas concentration has been received from the portable device 200 (step S17). If the instruction unit 150 determines that the data of the measured value of the gas concentration is received from the portable device 200 after waiting for a predetermined time (step S17; Yes), the data of the measured value is stored in the storage unit 110. On the other hand, if the instruction unit 150 determines that the data of the measured value of the gas concentration has not been received from the mobile device 200 after waiting for the predetermined time (step S17; No), the process of step S16 is executed again.

Next, the diagnosis unit 140 diagnoses whether or not the diagnosis target is damaged based on the inspection data received by the inspection data receiving unit 120 (step S18).

Specifically, the diagnosis unit 140 performs the diagnosis process as follows. The diagnosis unit 140 inputs the image data received by the inspection data receiving unit 120 to the neural network using the learning model for image analysis learned by the learning unit 130, and diagnoses whether or not the diagnosis target is damaged. Next, the diagnosis unit 140 inputs the sound data received by the inspection data receiving unit 120 to the neural network using the learning model for sound analysis learned by the learning unit 130, and diagnoses whether or not the diagnosis target is damaged. The diagnosis unit 140 further determines whether or not a gas leak has occurred based on whether the measured value of the gas concentration received by the inspection data receiving unit 120 exceeds the threshold of the reference data 113. When it determines that a gas leak has occurred, the diagnosis unit 140 diagnoses that the diagnosis target is damaged. The diagnosis unit 140 records the diagnosis results in the history data 112 of the storage unit 110.

The instruction unit 150 presents the diagnosis result and the work instruction to the operator 50 (step S19). Here, when at least one of the diagnosis result based on the image data, the diagnosis result based on the sound data, and the diagnosis result based on the measured value of the gas concentration indicates that the diagnosis target is damaged, the instruction unit 150 notifies the operator 50 that the diagnosis target is damaged. Together with the diagnosis result, the instruction unit 150 transmits to the portable device 200 information indicating the treatment method based on the treatment table 116.
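The rule that the diagnosis target is reported as damaged when at least one of the three modality-level results indicates damage reduces to a logical OR. The function and parameter names below are illustrative.

```python
def overall_diagnosis(image_damaged: bool, sound_damaged: bool, gas_damaged: bool) -> bool:
    """Damaged if any one of the three modality-level diagnoses indicates damage."""
    return any((image_damaged, sound_damaged, gas_damaged))

print(overall_diagnosis(False, False, True))   # gas leak alone -> damaged
print(overall_diagnosis(False, False, False))  # all three clear -> undamaged
```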

For example, suppose the diagnosis target is a pipe and the diagnosis result based on the measured value of the gas concentration indicates that the diagnosis target is damaged. In this case, the instruction unit 150 transmits an instruction to close the valve of the pipe to be diagnosed to the portable device 200. Accordingly, the portable device 200 displays the diagnosis result and the treatment method received from the information processing apparatus 100 on the display panel 202a.

If the diagnosis unit 140 diagnoses in step S18 that the diagnosis target is not damaged, the instruction unit 150 need not present a work instruction to the operator 50 in step S19.

The instruction unit 150 then executes step S11 again, and transmits to the portable device 200 an instruction to move to the position of the next diagnosis target. The above is the inspection work process performed by the information processing apparatus 100.

As described above, in the information processing system 1 according to the embodiment, an appropriate inspection flow is presented to the operator 50, the diagnosis target is diagnosed based on the inspection data, and the work to be performed is indicated to the operator 50. Therefore, even an inexperienced operator 50 can carry out the inspection with an appropriate flow. Further, since the information processing device 100 performs the diagnosis based on the inspection data and presents the treatment method to the operator 50 based on the diagnosis result, even an inexperienced operator 50 can perform appropriate treatment.

Further, by accumulating experience of hammering inspection performed in accordance with the instructions of the information processing device 100, even an inexperienced operator can learn the striking position, the striking strength, and the like. Furthermore, the supervisor in the control management room no longer needs to explain the work method to an inexperienced operator 50 by telephone, as was done conventionally.

(modification 1)

In the embodiment, the diagnosis unit 140 diagnoses the presence or absence of damage to the diagnosis target, but the diagnosis is not limited to this. For example, levels indicating the degree of damage may be defined in advance, and in addition to diagnosing the presence or absence of damage, the diagnosis unit 140 may, when damage exists, diagnose the level of the damage and/or the cause of the damage.

In this case, the learning unit 130 arranges the number of neurons in the output layer of the neural network to match the number of items to be discriminated. For example, when discriminating among no damage, damage of level 1, damage of level 2, …, and damage of level n, (n+1) output-layer neurons are arranged. Likewise, when discriminating among no damage, damage due to cause 1, damage due to cause 2, …, and damage due to cause m, (m+1) output-layer neurons are arranged. As the learning data, a plurality of sets of data are prepared, each combining inspection data with the number of the output-layer neuron indicating the absence of damage or the degree and/or cause of the damage to the inspection target. The learning unit 130 supplies the inspection data of each set of learning data to the neural network, and adjusts the weighting coefficients of the neurons in the intermediate layer and the output layer by the back-propagation method or the like so that the output-layer neuron of the corresponding number fires. That is, the learning unit 130 causes the neural network to learn the learning data. The diagnosis unit 140 inputs the inspection data received from the portable device 200 to the neural network that has completed learning, and diagnoses the degree and/or cause of the damage to the diagnosis target based on the output of the neural network. Even when damage exists, if its degree is small, continuous monitoring may be necessary, but urgent maintenance work is not required.
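With (n+1) output-layer neurons arranged as described, reading off the diagnosed level amounts to taking the index of the most strongly firing neuron. The output vector below is hypothetical.

```python
# Hypothetical output-layer activations for n = 3 damage levels:
# index 0 = no damage, index k = damage of level k.
outputs = [0.05, 0.10, 0.70, 0.15]

level = outputs.index(max(outputs))  # index of the most strongly firing neuron
if level == 0:
    print("no damage")
else:
    print(f"damage of level {level}")
```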

The instruction unit 150 may instruct the portable device 200 to present a treatment according to the degree of damage, for example based on the treatment table 116a shown in fig. 10. In the example shown in fig. 10, the larger the value of the level, the greater the damage.

(modification 2)

In the embodiment, an example was described in which the information processing apparatus 100 has the treatment table 116 storing data of the treatment contents in advance, but the present invention is not limited to this. The learning unit 130 may learn the treatment contents according to the diagnosis result, and the diagnosis unit 140 may both diagnose whether or not the diagnosis target is damaged and determine the treatment contents. The method by which the diagnosis unit 140 diagnoses whether or not the diagnosis target is damaged is as described above.

For example, suppose treatment 1, treatment 2, …, and treatment p are to be discriminated. In this case, as the learning data, a plurality of sets of data are prepared, each combining data of a diagnosis result with the number of the output-layer neuron indicating the treatment content. The learning unit 130 supplies the diagnosis-result data of each set of learning data to the neural network, and adjusts the weighting coefficients of the neurons by the back-propagation method or the like. As the learning data, for example, data relating diagnosis results to the treatment contents determined by skilled operators in past inspection work can be used. The diagnosis unit 140 diagnoses whether or not the diagnosis target is damaged, inputs the diagnosis result to the neural network that has completed learning, and determines the treatment content based on the output of the neural network.

In the embodiment, the information processing device 100 instructs the operator 50 to perform hammering inspection, close the valve, and so on, but the instructions to the operator 50 are not limited to these. For example, the information processing device 100 may present the position of a measuring instrument to the operator 50 and instruct the operator to move to that position. The information processing apparatus 100 may then have the operator 50 photograph the value indicated by the measuring instrument using the camera 203a, and instruct the operator 50 to perform an appropriate operation corresponding to the value of the measuring instrument.

In the embodiment, the information processing apparatus 100 transmits a movement instruction to the portable device 200, and the operator 50 performs hammering inspection and imaging of the diagnosis target at the destination, but the invention is not limited to this. For example, while the operator 50 is patrolling the plant, the information processing device 100 may, upon determining from the position data received from the portable device 200 that the operator 50 is near the diagnosis target, transmit to the portable device 200 an instruction to perform hammering inspection and imaging of the diagnosis target.

In addition, when the inspection history recorded in the history data 112 shows that the diagnosis target was inspected within a certain past period, the information processing apparatus 100 need not instruct the operator 50 to inspect that diagnosis target.

When the operator 50 arrives at the position of the diagnosis target, the information processing apparatus 100 may transmit the inspection history recorded in the history data 112 and the past diagnosis results to the portable device 200. Accordingly, the portable device 200 displays the inspection history and the past diagnosis results received from the information processing apparatus 100 on the display panel 202a.

In the above embodiment, an example was described in which the portable device 200 includes the camera 203a, the microphone 203b, the gas detection sensor 203c, and the GPS receiver 203d, but the portable device 200 need not include all of these components. For example, some operators may carry portable devices 200 without the camera 203a and perform only hammering inspection, while the remaining operators carry portable devices 200 without the microphone 203b and perform only imaging of the diagnosis target. The portable device 200 may also have other sensors, such as a temperature sensor, a humidity sensor, or a pressure sensor.

In the embodiment, an example was described in which the neural network is realized by computation performed by the processor 105 of the information processing apparatus 100, but the neural network may also be realized other than by software. For example, the neural network may be implemented in hardware using a neuro-chip in which neurons are formed on an LSI chip. Alternatively, a neural network using an analog circuit realized by SQUIDs (Superconducting Quantum Interference Devices), an optical neural network using light, or the like can be used.

As a recording medium for recording the program, a computer-readable recording medium including a magnetic disk, an optical disk, an opto-magnetic disk, a flash memory, a semiconductor memory, and a magnetic tape can be used.

The present invention is capable of various embodiments and modifications without departing from the spirit and scope in its broadest form. The above embodiments are illustrative of the present invention, and do not limit the scope of the present invention. That is, the scope of the present invention is shown not by the embodiments but by the claims. Further, various modifications made within the scope of the claims and within the meaning of the equivalent invention are considered to be within the scope of the present invention.

Description of the reference numerals

1 information processing system, 10 helmet, 20 main body, 30 position, 50 operator, 100 information processing device, 101, 201 memory, 102 input device, 103, 202 output device, 104, 204 wireless communication circuit, 104a, 204a antenna, 105, 205 processor, 109, 209 bus, 110 storage unit, 111 inspection manual, 112 history data, 113 reference data, 114 learning model, 115 learning data, 116 treatment table, 120 inspection data receiving unit, 130 learning unit, 140 diagnosis unit, 150 instruction unit, 200 portable device, 202 output device, 202a display panel, 202b speaker, 203 collection device, 203a camera, 203b microphone, 203c gas detection sensor, 203d GPS receiver, 203e cable, 1000, 2000 program, 1001, 1002 pipeline.
