Intelligent garbage classification method combining vision and touch

Document No.: 1401633    Publication date: 2020-03-06

Note: This technology, "Intelligent garbage classification method combining vision and touch", was designed and created by 田卫新, 张力, 温征 and 刘本梦 on 2019-11-29. Its main content is as follows: the invention discloses an intelligent garbage classification method combining vision and touch. A sorting head grabs garbage on a garbage classification panel, and a tactile sensor is arranged on the inner side of the sorting head's grabbing claw. The sorting head is moved above the garbage and an image of the garbage is collected; when the sorting head grabs the garbage, the tactile sensor obtains its tactile data. The controller of the sorting head uses a neural network model as a classifier, identifies and classifies the grabbed garbage according to its image and tactile data, and then places the grabbed garbage into the corresponding garbage classification area according to its classification. Compared with existing methods that identify and classify garbage from images alone, this combination of vision and touch greatly reduces the mis-sorting rate, achieves high classification accuracy, makes wet garbage in particular easy to recognize, and sorts efficiently, saving time and labor.

1. The intelligent garbage classification method combining vision and touch is characterized in that a sorting head is used to grab garbage on a garbage classification panel, a tactile sensor is arranged on the inner side of the grabbing claw of the sorting head, the sorting head is moved above the garbage to collect an image of the garbage, the tactile sensor obtains tactile data of the garbage when the sorting head grabs it, a controller of the sorting head uses a neural network model as a classifier, the grabbed garbage is identified and classified according to its image and tactile data, and the sorting head then puts the grabbed garbage into the corresponding garbage classification area according to its classification.

2. The intelligent garbage classification method combining vision and touch according to claim 1, characterized in that the gathered and spread-flat garbage is sorted row by row in sequence using the sorting head, the sorting spacing is set according to the grabbing range of the grabbing claw of the sorting head, and the sorted garbage is classified and put into the corresponding classification area; the intelligent garbage classification method comprises the following steps:

step 1: moving by one sorting spacing in the Y-axis direction;

step 2: moving by one sorting spacing in the X-axis direction;

step 3: collecting image data of the garbage beneath the sorting head;

step 4: the sorting head grabs the garbage and obtains its tactile data;

step 5: the classifier classifies the grabbed garbage according to its image and tactile data;

step 6: according to the classification in step 5, placing the grabbed garbage into the corresponding classification area and returning to the current sorting point;

step 7: judging whether sorting in the X-axis direction is finished;

step 7.1: if sorting in the X-axis direction is not finished, executing step 2;

step 7.2: if sorting in the X-axis direction is finished, executing step 8;

step 8: judging whether sorting in the Y-axis direction is finished;

step 8.1: if sorting in the Y-axis direction is not finished, executing step 1;

step 8.2: if sorting in the Y-axis direction is finished, the process ends.

3. The intelligent garbage classification method combining vision and touch according to claim 1, characterized in that the gathered and spread-flat garbage is sorted row by row in sequence using the sorting head, the sorting spacing is set according to the grabbing range of the grabbing claw of the sorting head, and the sorted garbage is classified and put into the corresponding classification area; the intelligent garbage classification method comprises the following steps:

step 1: moving by one sorting spacing in the X-axis direction;

step 2: moving by one sorting spacing in the Y-axis direction;

step 3: collecting image data of the garbage beneath the sorting head;

step 4: the sorting head grabs the garbage and obtains its tactile data;

step 5: the classifier classifies the grabbed garbage according to its image and tactile data;

step 6: according to the classification in step 5, placing the grabbed garbage into the corresponding classification area and returning to the current sorting point;

step 7: judging whether sorting in the Y-axis direction is finished;

step 7.1: if sorting in the Y-axis direction is not finished, executing step 2;

step 7.2: if sorting in the Y-axis direction is finished, executing step 8;

step 8: judging whether sorting in the X-axis direction is finished;

step 8.1: if sorting in the X-axis direction is not finished, executing step 1;

step 8.2: if sorting in the X-axis direction is finished, the process ends.

4. The intelligent garbage classification method combining vision and touch according to claim 2, characterized in that the intelligent garbage classification method sorts the garbage thrown in by a citizen in a single disposal and puts the items into the corresponding classification areas one by one, an image of the citizen's head is collected before sorting to confirm the citizen's identity, and the quality of the disposed garbage is evaluated after the sorting and classification of that single disposal is completed; the specific steps are as follows:

step 1: moving by one sorting spacing in the Y-axis direction;

step 2: moving by one sorting spacing in the X-axis direction;

step 3: collecting image data of the garbage beneath the sorting head;

step 4: the sorting head grabs the garbage and obtains its tactile data;

step 5: the classifier classifies the grabbed garbage according to its image and tactile data;

step 6: according to the classification in step 5, placing the grabbed garbage into the corresponding classification area and returning to the current sorting point;

step 7: judging whether sorting in the X-axis direction is finished;

step 7.1: if sorting in the X-axis direction is not finished, executing step 2;

step 7.2: if sorting in the X-axis direction is finished, executing step 8;

step 8: judging whether sorting in the Y-axis direction is finished;

step 8.1: if sorting in the Y-axis direction is not finished, executing step 1;

step 8.2: if sorting in the Y-axis direction is finished, executing step 9;

step 9: recording the number of grabs made by the sorting head during the sorting process as the quantity of garbage in the citizen's single disposal, recording the total sorting time for that single disposal, dividing the total sorting time by the quantity of garbage to obtain the average processing time per item, and evaluating the quality of the citizen's garbage disposal according to this average per-item processing time.

5. The intelligent garbage classification method combining vision and touch according to any one of claims 1-4, characterized in that the classifier is trained on collected image data and tactile data of garbage; the training method of the classifier comprises the following steps:

step 1: collecting images and tactile data of recyclable garbage and harmful garbage, and dividing them into a training set and a test set;

step 2: training the classifier with the training set from step 1;

step 3: testing the trained classifier with the test set from step 1 as its input; if the test is passed, performing step 4, otherwise returning to step 1 for further training;

step 4: collecting images and tactile data of wet garbage and dry garbage, and dividing them into a training set and a test set;

step 5: training the classifier again with the training set from step 4;

step 6: testing the trained classifier with the test set from step 4 as its input; if the test is passed, the training is finished, otherwise returning to step 4 for further training.

6. The intelligent garbage classification method combining vision and touch according to any one of claims 1-4, wherein the neural network model is a convolutional neural network (CNN) model.

Technical Field

The invention belongs to the field of garbage classification, and particularly relates to an intelligent garbage classification method combining vision and touch.

Background

With the rapid development of the economy and the improvement of living standards, the output of domestic garbage has also increased sharply. Domestic garbage comes in many types: some of it can be recycled, while some has a serious impact on the environment and causes heavy pollution if discarded carelessly. Garbage classification and recycling has been promoted in China for many years, mainly by providing classified garbage cans, but the results have not been ideal and public awareness of garbage classification remains weak.

In the last two years, China has begun to establish strict garbage classification systems in some cities. Taking Shanghai as an example, residents are required to sort garbage into four categories: recyclables, harmful garbage, wet garbage and dry garbage. Since the required classification is complicated and residents cannot always perform it correctly, a device that inspects and re-classifies the pre-sorted garbage thrown in by users is urgently needed.

Disclosure of Invention

The technical problem addressed by the invention is that existing methods which automatically identify and classify garbage from images alone have a high error rate: under the current garbage classification standard, a single visual sensor has difficulty distinguishing the material of the garbage, which leads to wrong classification.

To solve these problems, the invention provides an intelligent garbage classification method combining vision and touch: images of the garbage are collected by a camera and tactile data by a tactile sensor; both are used as input to a neural network model, which identifies and classifies the garbage; the garbage is put into the corresponding classification area; and the quantity of garbage in a citizen's single disposal and the time needed to finish classifying it are recorded in order to evaluate how well the garbage was classified before the citizen threw it in.

The technical scheme of the invention is an intelligent garbage classification method combining vision and touch: a sorting head is used to grab garbage on a garbage classification panel, and a tactile sensor is arranged on the inner side of the grabbing claw of the sorting head; the sorting head is moved above the garbage to collect an image of the garbage, and the tactile sensor obtains tactile data of the garbage when the sorting head grabs it; a controller of the sorting head uses a neural network model as a classifier and identifies and classifies the grabbed garbage according to its image and tactile data; the sorting head then puts the grabbed garbage into the corresponding garbage classification area according to its classification.
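
For illustration only, the per-item cycle described above can be summarized in the following sketch; the object names and methods (`capture_image`, `grab`, `read_tactile`, `predict`, `move_to`, `release`) are hypothetical placeholders for the controller's real interfaces and are not defined in the patent.

```python
# Minimal sketch of one sort cycle, assuming a hypothetical controller interface.
CATEGORIES = ["recyclable", "harmful", "wet", "dry"]

def sort_one_item(head, classifier, bins):
    """Grab one piece of garbage, classify it by image plus touch, and bin it."""
    image = head.capture_image()                 # camera view from above the item
    head.grab()                                  # close the claw; the tactile sensor sits on its inner side
    tactile = head.read_tactile()                # tactile data acquired during the grab
    label = classifier.predict(image, tactile)   # neural-network classifier, one of CATEGORIES
    head.move_to(bins[label])                    # move above the matching classification area
    head.release()                               # drop the item into it
    return label
```

Later sketches in this description reuse `sort_one_item` as the inner step of the row-by-row traversal.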

Furthermore, the intelligent garbage classification method uses the sorting head to sort the gathered and spread-flat garbage row by row in sequence, the sorting spacing is set according to the grabbing range of the grabbing claw of the sorting head, and the sorted garbage is classified and put into the corresponding classification area. The intelligent garbage classification method comprises the following steps:

step 1: moving by one sorting spacing in the Y-axis direction;

step 2: moving by one sorting spacing in the X-axis direction;

step 3: collecting image data of the garbage beneath the sorting head;

step 4: the sorting head grabs the garbage and obtains its tactile data;

step 5: the classifier classifies the grabbed garbage according to its image and tactile data;

step 6: according to the classification in step 5, placing the grabbed garbage into the corresponding classification area and returning to the current sorting point;

step 7: judging whether sorting in the X-axis direction is finished;

step 7.1: if sorting in the X-axis direction is not finished, executing step 2;

step 7.2: if sorting in the X-axis direction is finished, executing step 8;

step 8: judging whether sorting in the Y-axis direction is finished;

step 8.1: if sorting in the Y-axis direction is not finished, executing step 1;

step 8.2: if sorting in the Y-axis direction is finished, the process ends.
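
The traversal in steps 1-8 above reduces to two nested loops over sorting points spaced one sorting spacing apart. The sketch below is illustrative only; `panel_x`, `panel_y`, `pitch`, `move_to_point`, and the reuse of `sort_one_item` from the earlier sketch are assumptions, not values or interfaces given in the text.

```python
def sort_panel(head, classifier, bins, panel_x, panel_y, pitch):
    """Row-by-row traversal: outer loop steps along the Y axis, inner loop along the X axis."""
    y = 0.0
    while y <= panel_y:                               # step 8: Y-axis pass finished?
        x = 0.0
        while x <= panel_x:                           # step 7: current X-axis row finished?
            head.move_to_point(x, y)                  # steps 1-2: move to the next sorting point
            sort_one_item(head, classifier, bins)     # steps 3-6: image, grab, classify, bin
            head.move_to_point(x, y)                  # step 6: return to the current sorting point
            x += pitch                                # step 7.1: next sorting point in the row
        y += pitch                                    # step 8.1: next row
```

The next embodiment is the same traversal with the roles of the X and Y axes exchanged.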

Furthermore, the gathered and spread-flat garbage is sorted row by row in sequence using the sorting head, the sorting spacing is set according to the grabbing range of the grabbing claw of the sorting head, and the sorted garbage is classified and put into the corresponding classification area. The intelligent garbage classification method comprises the following steps:

step 1: moving by one sorting spacing in the X-axis direction;

step 2: moving by one sorting spacing in the Y-axis direction;

step 3: collecting image data of the garbage beneath the sorting head;

step 4: the sorting head grabs the garbage and obtains its tactile data;

step 5: the classifier classifies the grabbed garbage according to its image and tactile data;

step 6: according to the classification in step 5, placing the grabbed garbage into the corresponding classification area and returning to the current sorting point;

step 7: judging whether sorting in the Y-axis direction is finished;

step 7.1: if sorting in the Y-axis direction is not finished, executing step 2;

step 7.2: if sorting in the Y-axis direction is finished, executing step 8;

step 8: judging whether sorting in the X-axis direction is finished;

step 8.1: if sorting in the X-axis direction is not finished, executing step 1;

step 8.2: if sorting in the X-axis direction is finished, the process ends.

Further, the intelligent garbage classification method sorts the garbage thrown in by a citizen in a single disposal and puts the items into the corresponding classification areas one by one, an image of the citizen's head is collected before sorting to confirm the citizen's identity, and the quality of the disposed garbage is evaluated after the sorting and classification of that single disposal is completed. The specific steps are as follows:

step 1: moving by one sorting spacing in the Y-axis direction;

step 2: moving by one sorting spacing in the X-axis direction;

step 3: collecting image data of the garbage beneath the sorting head;

step 4: the sorting head grabs the garbage and obtains its tactile data;

step 5: the classifier classifies the grabbed garbage according to its image and tactile data;

step 6: according to the classification in step 5, placing the grabbed garbage into the corresponding classification area and returning to the current sorting point;

step 7: judging whether sorting in the X-axis direction is finished;

step 7.1: if sorting in the X-axis direction is not finished, executing step 2;

step 7.2: if sorting in the X-axis direction is finished, executing step 8;

step 8: judging whether sorting in the Y-axis direction is finished;

step 8.1: if sorting in the Y-axis direction is not finished, executing step 1;

step 8.2: if sorting in the Y-axis direction is finished, executing step 9;

step 9: recording the number of grabs made by the sorting head during the sorting process as the quantity of garbage in the citizen's single disposal, recording the total sorting time for that single disposal, dividing the total sorting time by the quantity of garbage to obtain the average processing time per item, and evaluating the quality of the citizen's garbage disposal according to this average per-item processing time.
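
Step 9 reduces to a simple ratio, sketched below. The grading thresholds and the direction of the mapping (shorter average time taken as better pre-sorting) are invented for illustration; the patent only states that quality is evaluated from the average per-item processing time, without giving concrete values.

```python
def average_processing_time(total_sorting_seconds: float, grab_count: int) -> float:
    """Step 9: total sorting time of one disposal divided by the number of items (grabs)."""
    return total_sorting_seconds / grab_count if grab_count else 0.0

def grade_disposal(avg_seconds_per_item: float) -> str:
    """Hypothetical mapping from average per-item time to a quality grade (thresholds assumed)."""
    if avg_seconds_per_item <= 8.0:
        return "well pre-sorted"
    if avg_seconds_per_item <= 15.0:
        return "average"
    return "needs improvement"
```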

Furthermore, image data and tactile data of garbage are collected to train the classifier. The training method of the classifier comprises the following steps:

step 1: collecting images and tactile data of recyclable garbage and harmful garbage, and dividing them into a training set and a test set;

step 2: training the classifier with the training set from step 1;

step 3: testing the trained classifier with the test set from step 1 as its input; if the test is passed, performing step 4, otherwise returning to step 1 for further training;

step 4: collecting images and tactile data of wet garbage and dry garbage, and dividing them into a training set and a test set;

step 5: training the classifier again with the training set from step 4;

step 6: testing the trained classifier with the test set from step 4 as its input; if the test is passed, the training is finished, otherwise returning to step 4 for further training.
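
As a sketch under stated assumptions, the two-stage training above can be written as the loop below. `collect_samples`, `train_epochs`, and `accuracy` are hypothetical callables supplied by the caller, and the 0.95 "qualified" threshold is an assumption; the patent does not define what counts as passing the test.

```python
def staged_training(model, collect_samples, train_epochs, accuracy, threshold=0.95):
    """Two-stage training: recyclable vs. harmful garbage first, then wet vs. dry garbage.

    Each stage repeats data collection and training until the classifier passes the test
    (steps 1-3 and 4-6 above). All callables and the threshold are illustrative assumptions.
    """
    for stage in (["recyclable", "harmful"], ["wet", "dry"]):
        while True:
            train_set, test_set = collect_samples(stage)   # images + tactile data, split in two
            train_epochs(model, train_set)                 # steps 2 / 5: (re)train the classifier
            if accuracy(model, test_set) >= threshold:     # steps 3 / 6: test with the test set
                break                                      # qualified: next stage, or finish
            # not qualified: loop back to collect more data and train further (steps 1 / 4)
```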

Preferably, the neural network model is a convolutional neural network (CNN) model.

Preferably, the classification of the garbage in the intelligent garbage classification method comprises recyclables, harmful garbage, wet garbage and dry garbage.

Compared with the prior art, the invention has the beneficial effects that:

1) the invention combines vision and touch and uses a neural network model to identify and classify the garbage and place it by category, which facilitates further disposal of the garbage, makes it easier to recycle recyclable materials and to treat harmful garbage separately, and reduces environmental pollution;

2) because the neural network model identifies and classifies the garbage from both its image and its tactile data, the combination of vision and touch greatly reduces the mis-sorting rate compared with existing methods that identify and classify garbage from images alone; classification accuracy is high, wet garbage in particular is easy to recognize, and sorting is efficient, saving time and labor;

3) the method evaluates the quality of the garbage thrown in by each citizen: the device's average per-item processing time is calculated to assess how well the garbage was classified before it was thrown in, which makes it easy to learn how citizens pre-sort their garbage, so that management departments can reward citizens who classify their garbage well or decide whether further public education on garbage classification is needed.

Drawings

The invention is further illustrated by the following figures and examples.

Fig. 1 is a schematic flow chart of an intelligent garbage classification method according to an embodiment.

Fig. 2 is a schematic structural diagram of the intelligent garbage classification device.

Fig. 3 is a schematic structural view of a garbage inlet control mechanism.

Fig. 4 is a schematic structural diagram of the sorting mechanism.

Fig. 5 is a schematic structural diagram of the conveying mechanism.

Fig. 6 is a schematic structural diagram of a neural network model according to an embodiment.

Description of reference numerals: the garbage classification box body 1, the control cabinet 2, the sorting mechanism 3, a first X-axis slide rail 301, a first X-axis slide table 302, a first Y-axis slide rail 303, a first Y-axis slide table 304, a Z-axis slide rail 305, a Z-axis slide table 306, a manipulator 307, a vision sensor 308, the conveying mechanism 4, a second X-axis slide rail 401, a second X-axis slide table 402, an X-axis baffle 403, a second Y-axis slide rail 404, a second Y-axis slide table 405, a Y-axis baffle 406, the garbage classification panel 5, the garbage input port 6, a first garbage cover 601, a second garbage cover 602, a hydraulic rod 603 and a camera 7.

Detailed Description

The intelligent garbage classification method combining vision and touch uses a sorting head to grab garbage on a garbage classification panel, and a tactile sensor is arranged on the inner side of the grabbing claw of the sorting head. The sorting head is moved above the garbage to collect an image of it; when the sorting head grabs the garbage, the tactile sensor obtains its tactile data. The controller of the sorting head uses a neural network model as a classifier, identifies and classifies the grabbed garbage according to its image and tactile data, and then the sorting head puts the grabbed garbage into the corresponding garbage classification area according to its class.

In the embodiment, the garbage thrown in by a citizen in a single disposal is gathered and spread flat and then sorted; the sorting spacing is set according to the grabbing range of the gripper claw of the sorting head; the garbage is sorted item by item and placed into the corresponding classification area; an image of the citizen's head is collected before sorting begins to confirm the citizen's identity; and after the garbage from the single disposal has been sorted and classified, the quality of the citizen's disposal is evaluated. As shown in Fig. 1, the method comprises the following steps:

step 1: moving by one sorting spacing in the Y-axis direction;

step 2: moving by one sorting spacing in the X-axis direction;

step 3: collecting image data of the garbage beneath the sorting head;

step 4: the sorting head grabs the garbage and obtains its tactile data;

step 5: the classifier classifies the grabbed garbage according to its image and tactile data;

step 6: according to the classification in step 5, placing the grabbed garbage into the corresponding classification area and returning to the current sorting point;

step 7: judging whether sorting in the X-axis direction is finished;

step 7.1: if sorting in the X-axis direction is not finished, executing step 2;

step 7.2: if sorting in the X-axis direction is finished, executing step 8;

step 8: judging whether sorting in the Y-axis direction is finished;

step 8.1: if sorting in the Y-axis direction is not finished, executing step 1;

step 8.2: if sorting in the Y-axis direction is finished, executing step 9;

step 9: recording the number of grabs made by the sorting head during the sorting process as the quantity of garbage in the citizen's single disposal, recording the total sorting time for that single disposal, dividing the total sorting time by the quantity of garbage to obtain the average processing time per item, and evaluating the quality of the citizen's garbage disposal according to this average per-item processing time.

As shown in Figs. 2-5, the intelligent garbage classification device used by the intelligent garbage classification method combining vision and touch comprises a garbage classification box 1 and a control cabinet 2. A garbage input port 6 is arranged at the top of the garbage classification box 1, and a sorting mechanism 3 and a conveying mechanism 4 are arranged inside it. A garbage classification panel 5 is arranged inside the garbage classification box 1 below the garbage input port 6, and several garbage recovery ports are arranged in the front of the box near its bottom. A recyclable garbage hopper, a harmful garbage hopper, a wet garbage hopper and a dry garbage hopper are arranged below the garbage classification panel 5, and each hopper has a movable door at the position of the corresponding garbage recovery port, through which sanitation workers can take out the classified garbage. The control cabinet 2 is taller than the garbage classification box 1, and a camera 7 is mounted near the top of the side wall of the control cabinet 2 next to the garbage input port 6. A controller is arranged inside the control cabinet 2, and the control end of the sorting mechanism 3 is electrically connected with the controller.

As shown in Fig. 3, the garbage input port 6 is provided with an input control mechanism comprising a first garbage cover 601, a second garbage cover 602 and a hydraulic rod 603. One side of the first garbage cover 601 is hinged to the top of the garbage classification box 1; the base of the hydraulic rod 603 is fixedly connected to the top of the garbage classification box 1, and the head of its telescopic rod is fixedly connected to the second garbage cover 602, so that the second garbage cover 602 opens and closes the garbage input port 6 as the rod extends and retracts. A solenoid valve is arranged in the hydraulic oil pipe of the hydraulic rod 603 and controls its extension and retraction; the control end of the solenoid valve is electrically connected with the controller.

During the garbage classification process, the garbage input port 6 is closed before the garbage on the garbage classification panel 5 is sorted, and is opened again after the sorting is finished.

As shown in Fig. 4, the sorting mechanism 3 comprises a first X-axis slide rail 301, a first Y-axis slide rail 303 and a Z-axis slide rail 305. The two ends of the first X-axis slide rail 301 are fixedly connected to the side walls of the garbage classification box 1; the first X-axis sliding table 302 is in sliding fit with the first X-axis slide rail 301, and the end of the first Y-axis slide rail 303 is fixedly connected to the first X-axis sliding table 302. The first Y-axis sliding table 304 is in sliding fit with the first Y-axis slide rail 303, and the end of the Z-axis slide rail 305 is fixedly connected to the first Y-axis sliding table 304. The garbage sorting head is in sliding fit with the Z-axis slide rail 305 and comprises the Z-axis sliding table 306 and a manipulator 307 connected to it; a tactile sensor is arranged on the inner side of the gripper claw of the manipulator 307, and a digital servo mounted on the manipulator 307 controls the grabbing and releasing of the claw. A lead screw is arranged on the first X-axis slide rail 301, the first X-axis sliding table 302 has a through hole matched with the lead screw, a raceway filled with balls is formed in the wall of the through hole, and the lead screw is fixedly connected to the shaft of a servo motor, so that when the servo motor rotates, the first X-axis sliding table 302 moves linearly toward one end of the slide rail or the other. The first Y-axis slide rail 303 and first Y-axis sliding table 304, and likewise the Z-axis slide rail 305 and Z-axis sliding table 306, have the same structure as the first X-axis slide rail 301 and first X-axis sliding table 302. In one embodiment, the first X-axis slide rail 301 and the first X-axis sliding table 302 form a linear module. The control ends of the servo motors of the sorting mechanism 3 and of the digital servo on the manipulator 307 are electrically connected with the controller.

A vision sensor 308 is mounted at one end of the first Y-axis sliding table 304; its output is electrically connected with the controller, and it is used to collect images of the garbage on the garbage classification panel 5. In an embodiment, the vision sensor 308 is a camera.

When the controller controls the sorting mechanism 3 to sort the garbage on the garbage classification panel 5, the servo motors move the X-axis, Y-axis and Z-axis sliding tables so that the manipulator 307 approaches the garbage on the garbage classification panel 5, and the digital servo on the manipulator 307 drives the gripper claw to grab the garbage. After the garbage has been grabbed, the servo motors move the X-axis, Y-axis and Z-axis sliding tables again to bring the manipulator 307 above the garbage hopper corresponding to the garbage's class, and the digital servo drives the gripper claw to release the garbage into the hopper.
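
The grab-and-place sequence just described can be sketched at the axis level as follows; the axis interface (`move_axes`, `set_gripper`) and the travel heights are assumptions, not part of the patent text.

```python
def pick_and_place(move_axes, set_gripper, item_xy, hopper_xy, z_up=0.0, z_down=-0.2):
    """Axis-level sketch: servo-driven X/Y/Z sliding tables position the manipulator 307,
    and the digital servo on the claw grabs or releases the garbage."""
    x, y = item_xy
    move_axes(x, y, z_up)          # position above the item
    move_axes(x, y, z_down)        # lower the Z-axis sliding table toward the item
    set_gripper(closed=True)       # digital servo closes the claw (tactile data read here)
    move_axes(x, y, z_up)          # lift the item clear of the panel
    hx, hy = hopper_xy
    move_axes(hx, hy, z_up)        # travel above the hopper matching the item's class
    set_gripper(closed=False)      # digital servo opens the claw and releases the garbage
```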

As shown in Fig. 5, the conveying mechanism 4 is used to gather the garbage on the garbage classification panel 5. It comprises a second X-axis slide rail 401 and a second Y-axis slide rail 404, both fixedly connected to the garbage classification panel 5; the second X-axis sliding table 402 is in sliding fit with the second X-axis slide rail 401 and carries the X-axis baffle 403, and the second Y-axis sliding table 405 is in sliding fit with the second Y-axis slide rail 404 and carries the Y-axis baffle 406. The second X-axis slide rail 401 and sliding table 402, and the second Y-axis slide rail 404 and sliding table 405, have the same structure as the first X-axis slide rail 301 and sliding table 302. The control ends of the servo motors of the conveying mechanism 4 are electrically connected with the controller. To gather the garbage, a servo motor first moves the second X-axis sliding table 402 and the X-axis baffle 403 fixed to it so that the garbage on the garbage classification panel 5 is pushed toward the second Y-axis slide rail 404; a servo motor then moves the second Y-axis sliding table 405 and the Y-axis baffle 406 fixed to it so that the garbage gathered along the second Y-axis slide rail 404 is pushed together.
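
For illustration, the gathering sweep of the conveying mechanism 4 amounts to moving the two baffles in turn; the travel distances, the motor wrappers, and the retraction of the baffles afterwards are assumptions.

```python
def gather_garbage(move_x_baffle, move_y_baffle, x_travel, y_travel):
    """Gathering sketch: the X-axis baffle 403 sweeps the garbage toward the second
    Y-axis slide rail 404, then the Y-axis baffle 406 sweeps along that rail so the
    gathered row of garbage is pushed together into a compact pile."""
    move_x_baffle(x_travel)   # sweep across the panel (servo-driven second X-axis sliding table)
    move_x_baffle(0.0)        # retract (assumed) so the sorting head can work unobstructed
    move_y_baffle(y_travel)   # push the gathered row together along the rail
    move_y_baffle(0.0)        # retract (assumed)
```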

In the embodiment, the neural network model used as the classifier is a convolutional neural network (CNN). As shown in Fig. 6, the CNN model comprises convolutional layers, fully connected layers and a classifier output: four convolutional layers are used, each followed by a pooling layer, ReLU is used as the activation function, and a softmax layer is connected at the end. The input image size is 160 × 160 pixels. In the embodiment, the convolution kernels of all convolutional layers are 4 × 4 and the learning rate is set to 0.01.
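
A minimal PyTorch sketch of this architecture is given below. Only the four 4 × 4 convolutional layers each followed by pooling, the ReLU activation, the final softmax, the 160 × 160 input and the 0.01 learning rate come from the text; the channel counts, padding, 2 × 2 pooling, tactile-vector size, fusion by concatenation before the fully connected layer, and the choice of SGD are assumptions.

```python
import torch
import torch.nn as nn

class GarbageCNN(nn.Module):
    """Sketch of the classifier: 4 conv layers (4x4 kernels), each followed by ReLU and
    2x2 max pooling, then a fully connected layer and softmax over the four garbage classes.
    Channel counts, padding, the tactile feature size, and the fusion scheme are assumptions."""

    def __init__(self, tactile_dim: int = 16, num_classes: int = 4):
        super().__init__()
        channels = [3, 16, 32, 64, 128]          # assumed channel progression
        blocks = []
        for c_in, c_out in zip(channels[:-1], channels[1:]):
            blocks += [nn.Conv2d(c_in, c_out, kernel_size=4, padding=1),  # 4x4 kernels (from the text)
                       nn.ReLU(),                                         # ReLU activation (from the text)
                       nn.MaxPool2d(2)]                                   # pooling after each conv layer
        self.features = nn.Sequential(*blocks)
        # 160x160 input: each block shrinks the side by 1 (4x4 conv, padding 1) then halves it,
        # giving 160 -> 79 -> 39 -> 19 -> 9, i.e. a 128 x 9 x 9 feature map.
        self.fc = nn.Linear(128 * 9 * 9 + tactile_dim, num_classes)
        self.softmax = nn.Softmax(dim=1)         # final softmax layer (from the text)

    def forward(self, image: torch.Tensor, tactile: torch.Tensor) -> torch.Tensor:
        x = self.features(image).flatten(1)
        x = torch.cat([x, tactile], dim=1)       # assumed fusion of image and tactile features
        return self.softmax(self.fc(x))

model = GarbageCNN()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)   # learning rate 0.01 (from the text)
```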
