Pest and disease monitoring method based on leaf spot lesion area

Document No.: 1756785    Publication date: 2019-11-29

Reading note: This technology, "Pest and disease monitoring method based on leaf spot lesion area", was created by Yan Hua, Wei Yancong, Liu Long, Gong Huaze, and Chen Qi on 2019-08-30. Its main content is as follows: The invention discloses a pest and disease monitoring method based on leaf spot lesion area, comprising the steps of acquiring multiple crop images with an unmanned aerial vehicle (UAV), each image being a three-dimensional tensor whose dimensions are image height, image width, and number of spectral bands, with a corresponding header file containing longitude and latitude coordinates; and processing the acquired crop images in real time with a deep learning model in a deep learning module carried by the UAV, identifying and localizing crop leaf lesions, quantitatively computing the ratio of lesion area to leaf area, and determining the distribution of pest and disease severity grades. The invention integrates UAV remote sensing with computer-vision target recognition and, based on deep learning, performs high-precision real-time monitoring of the spatial dynamic distribution of pests and diseases.

1. A pest and disease monitoring method based on leaf spot lesion area, characterized by comprising the steps of:

acquiring multiple crop images with an unmanned aerial vehicle (UAV), each image being a three-dimensional tensor comprising an image height, an image width, and a number of spectral bands, wherein a header file corresponding to each image contains a longitude coordinate and a latitude coordinate;

processing the acquired crop images in real time with a deep learning model in a deep learning module carried by the UAV, identifying and localizing crop leaf lesions, quantitatively computing the ratio of crop leaf lesion area to crop leaf area, and determining the distribution of pest and disease severity grades, comprising:

pre-processing and annotating each acquired crop image;

constructing a data set from the annotated crop images, and dividing the data set into a training set and a test set;

training a LinkNet convolutional neural network for image semantic segmentation on the training set to obtain a deep learning model for lesion monitoring;

inputting the test set into the deep learning model, and extracting the semantic features of each crop image in the test set with the encoder of the LinkNet convolutional neural network;

segmenting the semantic features, with the decoder of the LinkNet convolutional neural network, into a grayscale heat map of crop leaf lesion contours and leaf contours, the grayscale heat map being the segmentation result, wherein the crop leaf lesion contours and the leaf contours have different gray levels in the grayscale heat map, the grayscale heat map has the same size as the acquired crop image, and the value of each pixel in the grayscale heat map represents the semantic class predicted by the deep learning model for the object at that position: 0 represents background, 1 represents lesion, and 2 represents leaf; counting the total number of pixels of semantic class 1 and of semantic class 2 separately yields the lesion-to-leaf area ratio of the crop image;

quantitatively computing the ratio of crop leaf lesion area to crop leaf area from the grayscale heat map, and interpolating the ratios onto a map grid according to the longitude and latitude coordinates to form a spatial distribution map of pest and disease severity grades for the target area.

2. The pest and disease monitoring method based on leaf spot lesion area according to claim 1, wherein pre-processing and annotating each acquired crop image comprises:

pre-processing each crop image by deleting images that contain no crops or in which the crops are difficult to identify visually, to obtain pre-processed images;

annotating the pre-processed images by tracing the contours of the crop leaves and crop leaf lesions in the pre-processed images.

3. The pest and disease monitoring method based on leaf spot lesion area according to claim 2, wherein the pre-processing further comprises image cropping, in which each crop image is cropped so that, after cropping, the height of each crop image is less than or equal to a first preset pixel value and the width of each crop image is less than or equal to a second preset pixel value.

4. The pest and disease monitoring method based on leaf spot lesion area according to claim 1, wherein training the LinkNet convolutional neural network for image semantic segmentation on the training set to obtain the deep learning model for lesion monitoring comprises:

at each iteration, randomly selecting image data for multiple samples from the training set to form a batch, updating all parameters of the deep learning model, computing the Focal Loss loss function, and optimizing the training with gradient descent and backpropagation; when the difference between the loss value computed at the current iteration and the loss value of the previous iteration does not exceed a predetermined threshold, training ends and the current parameter configuration is saved.

5. The pest and disease monitoring method based on leaf spot lesion area according to claim 1, wherein the bottom layers of the encoder and decoder of the LinkNet convolutional neural network include parallel dilated convolution units.

6. The pest and disease monitoring method based on leaf spot lesion area according to claim 1, wherein the crop is rice.

7. The pest and disease monitoring method based on leaf spot lesion area according to claim 1, wherein the number of spectral bands is 3.

Technical field

The present invention relates to the field of artificial intelligence, and in particular to a pest and disease monitoring method based on leaf spot lesion area.

Background art

Crop pests and diseases have become a major factor restricting agricultural production. China is a country where pests and diseases occur frequently; the affected areas are wide and the damage is severe, causing heavy direct economic losses to agricultural production. Therefore, using advanced pest and disease monitoring technology to detect pests and diseases early, monitor their occurrence and development, and take scientific and effective control measures at key developmental stages helps ensure the quality and safety of agricultural products and achieve sustainable agricultural development.

Traditional pest and disease monitoring relies on fixed-point field monitoring or random surveys, judging the likelihood of pest and disease occurrence by direct visual observation of disease symptoms and by catching pests. Such conventional methods suffer from large observation errors, a lack of quantitative standards, long time consumption, and low efficiency.

Taking rice as an example, when a plant is infected with rice blast, lesions appear at multiple positions on the rice plant, including leaf lesions, leaf sheath lesions, stem node lesions, and panicle neck lesions. Among these, leaf lesions have distinctive features and their distribution is easy to identify, so the ratio of leaf lesion area to leaf area is an important indicator of the severity of the rice disease.

Summary of the invention

In view of this, the present invention provides a pest and disease monitoring method based on leaf spot lesion area, comprising the steps of:

acquiring multiple crop images with an unmanned aerial vehicle (UAV), each image being a three-dimensional tensor comprising an image height, an image width, and a number of spectral bands, wherein a header file corresponding to each image contains a longitude coordinate and a latitude coordinate;

processing the acquired crop images in real time with a deep learning model in a deep learning module carried by the UAV, identifying and localizing crop leaf lesions, quantitatively computing the ratio of crop leaf lesion area to crop leaf area, and determining the distribution of pest and disease severity grades, comprising:

pre-processing and annotating each acquired crop image;

constructing a data set from the annotated crop images, and dividing the data set into a training set and a test set;

training a LinkNet convolutional neural network for image semantic segmentation on the training set to obtain a deep learning model for lesion monitoring;

inputting the test set into the deep learning model, and extracting the semantic features of each crop image in the test set with the encoder of the LinkNet convolutional neural network;

segmenting the semantic features, with the decoder of the LinkNet convolutional neural network, into a grayscale heat map of crop leaf lesion contours and leaf contours, the grayscale heat map being the segmentation result, wherein the crop leaf lesion contours and the leaf contours have different gray levels in the grayscale heat map, the grayscale heat map has the same size as the acquired crop image, and the value of each pixel in the grayscale heat map represents the semantic class predicted by the deep learning model for the object at that position: 0 represents background, 1 represents lesion, and 2 represents leaf; counting the total number of pixels of semantic class 1 and of semantic class 2 separately yields the lesion-to-leaf area ratio of the crop image (an illustrative sketch of these computations follows the list of steps below);

quantitatively computing the ratio of crop leaf lesion area to crop leaf area from the grayscale heat map, and interpolating the ratios onto a map grid according to the longitude and latitude coordinates to form a spatial distribution map of pest and disease severity grades for the target area.
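The two computational steps above (counting class pixels to obtain the lesion-to-leaf ratio, and interpolating the per-image ratios onto a map grid) can be sketched as follows. This is an illustrative sketch only, assuming a NumPy class map with the 0/1/2 coding described above and SciPy's griddata for the interpolation; the patent does not name a specific interpolation algorithm, and the function names are hypothetical.

```python
import numpy as np
from scipy.interpolate import griddata

def lesion_leaf_ratio(class_map: np.ndarray) -> float:
    """Lesion/leaf area ratio from a per-pixel class map
    (0 = background, 1 = lesion, 2 = leaf)."""
    lesion_pixels = int(np.sum(class_map == 1))
    leaf_pixels = int(np.sum(class_map == 2))
    return lesion_pixels / leaf_pixels if leaf_pixels else 0.0

def grade_distribution_map(lons, lats, ratios, grid_res=100):
    """Interpolate per-image lesion/leaf ratios onto a regular lon/lat grid."""
    lons, lats, ratios = map(np.asarray, (lons, lats, ratios))
    lon_grid, lat_grid = np.meshgrid(
        np.linspace(lons.min(), lons.max(), grid_res),
        np.linspace(lats.min(), lats.max(), grid_res),
    )
    # Linear interpolation between the sampled image locations; grid cells
    # outside the convex hull of the samples remain NaN.
    grid = griddata((lons, lats), ratios, (lon_grid, lat_grid), method="linear")
    return lon_grid, lat_grid, grid
```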

Preferably, pre-processing and annotating each acquired crop image comprises:

pre-processing each crop image by deleting images that contain no crops or in which the crops are difficult to identify visually, to obtain pre-processed images;

annotating the pre-processed images by tracing the contours of the crop leaves and crop leaf lesions in the pre-processed images.

Preferably, the pre-processing further comprises image cropping, in which each crop image is cropped so that, after cropping, the height of each crop image is less than or equal to a first preset pixel value and the width of each crop image is less than or equal to a second preset pixel value.
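A minimal sketch of this cropping step, assuming the preset pixel values are simply upper bounds on tile height and width (the concrete values are not specified in the text, and the function name is hypothetical):

```python
import numpy as np

def crop_to_tiles(image: np.ndarray, max_h: int, max_w: int):
    """Split an H x W x C image into tiles no larger than max_h x max_w."""
    tiles = []
    height, width = image.shape[:2]
    for top in range(0, height, max_h):
        for left in range(0, width, max_w):
            tiles.append(image[top:top + max_h, left:left + max_w])
    return tiles
```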

Preferably, training the LinkNet convolutional neural network for image semantic segmentation on the training set to obtain the deep learning model for lesion monitoring comprises:

at each iteration, randomly selecting image data for multiple samples from the training set to form a batch, updating all parameters of the deep learning model, computing the Focal Loss loss function, and optimizing the training with gradient descent and backpropagation; when the difference between the loss value computed at the current iteration and the loss value of the previous iteration does not exceed a predetermined threshold, training ends and the current parameter configuration is saved.
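The training procedure described above can be sketched in PyTorch as follows. This is an assumed illustration, not the disclosed implementation: the model and data loader are placeholders, the focal loss is the standard multi-class form, and the stopping rule compares consecutive loss values against a threshold as stated above.

```python
import torch
import torch.nn.functional as F

def focal_loss(logits, targets, gamma=2.0):
    """Multi-class focal loss: down-weights well-classified pixels."""
    ce = F.cross_entropy(logits, targets, reduction="none")
    pt = torch.exp(-ce)  # estimated probability of the true class
    return ((1.0 - pt) ** gamma * ce).mean()

def train(model, loader, epochs=100, lr=1e-3, threshold=1e-4):
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    prev_loss = None
    for _ in range(epochs):
        for images, masks in loader:  # one randomly sampled batch per step
            optimizer.zero_grad()
            loss = focal_loss(model(images), masks)
            loss.backward()           # backpropagation
            optimizer.step()          # gradient-descent parameter update
            # Stop when the loss change between consecutive steps is small.
            if prev_loss is not None and abs(loss.item() - prev_loss) <= threshold:
                torch.save(model.state_dict(), "linknet_lesion.pt")
                return
            prev_loss = loss.item()
```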

Preferably, the bottom layers of the encoder and decoder of the LinkNet convolutional neural network include parallel dilated convolution units.
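One plausible realization of such a parallel dilated-convolution unit is sketched below; the branch structure and dilation rates are assumptions for illustration, not taken from the patent.

```python
import torch
import torch.nn as nn

class ParallelDilatedBlock(nn.Module):
    """Parallel dilated (atrous) 3x3 convolutions whose outputs are summed,
    enlarging the receptive field without reducing spatial resolution."""
    def __init__(self, channels, dilations=(1, 2, 4)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Conv2d(channels, channels, kernel_size=3, padding=d, dilation=d)
            for d in dilations
        ])

    def forward(self, x):
        return torch.stack([branch(x) for branch in self.branches]).sum(dim=0)
```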

Preferably, the crop is rice.

Preferably, the number of spectral bands is 3.

Compared with the prior art, the pest and disease monitoring method based on leaf spot lesion area provided by the present invention achieves at least the following beneficial effects:

1. The present invention uses an integration of UAV remote sensing and computer-vision target recognition and, based on deep learning, performs high-precision real-time monitoring of the spatial dynamic distribution of pests and diseases. Rice field photos are captured in real time by the optical sensor carried on the UAV and processed by the deep learning model on the UAV's onboard processor, which accurately identifies and localizes leaf lesions and quantitatively computes their ratio to the leaf area, providing immediate and reliable guidance for pesticide application and other pest and disease control work;

2. The loss function of the present invention uses Focal Loss, which improves the stability of the training stage and solves the problem that, when positive and negative samples are imbalanced, the training result is swamped by the negative samples (the standard Focal Loss formulation is recalled after this list);

3. The present invention places parallel groups of dilated convolutions at the bottom of the LinkNet encoder-decoder, which enlarges the receptive field of the backbone network's output feature maps during upsampling and adds contextual information, so that the contour edges of crop leaves and crop leaf lesions are finer and smoother.
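As background for beneficial effect 2, the standard Focal Loss formulation (Lin et al., 2017) that this benefit refers to is, in the usual notation,

\mathrm{FL}(p_t) = -\alpha_t (1 - p_t)^{\gamma} \log(p_t),

where p_t is the model's estimated probability of the true class, \gamma down-weights easy examples, and \alpha_t rebalances class frequencies; the patent text does not state which parameter values are used.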

Of course, a product implementing the present invention does not necessarily need to achieve all of the above technical effects at the same time.

Other features and advantages of the present invention will become apparent from the following detailed description of exemplary embodiments of the present invention with reference to the accompanying drawings.

Brief description of the drawings

The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the present invention and, together with the description, serve to explain the principles of the present invention.

Fig. 1 is a flowchart of the pest and disease monitoring method based on leaf spot lesion area in Embodiment 1 of the present invention;

Fig. 2 is an annotated crop image in Embodiment 2;

Fig. 3 is the grayscale heat map obtained in Embodiment 2;

Fig. 4 is the monitoring result of the pest and disease severity grade distribution from a field campaign conducted in Beijing in Embodiment 3 of the present invention.

Detailed description of the embodiments

Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that, unless otherwise specified, the relative arrangement of components and steps, the numerical expressions, and the numerical values set forth in these embodiments do not limit the scope of the present invention.

The following description of at least one exemplary embodiment is merely illustrative and is in no way intended to limit the present invention or its application or use.

Techniques, methods, and devices known to persons of ordinary skill in the relevant art may not be discussed in detail, but, where appropriate, such techniques, methods, and devices should be considered part of the specification.

In all of the examples shown and discussed herein, any specific value should be interpreted as merely illustrative and not as a limitation. Therefore, other examples of the exemplary embodiments may have different values.

It should also be noted that similar reference numerals and letters denote similar items in the following drawings; therefore, once an item is defined in one drawing, it need not be discussed further in subsequent drawings.
