Grinding wheel dressing method and device based on machine vision

Document No.: 1454776  Publication date: 2020-02-21  Views: 23  Language: Chinese

Reading note: this technique, "Grinding wheel dressing method and device based on machine vision" (基于机器视觉的砂轮修整方法及装置), was created by Wang Shuai, Zhao Yanjun, Li Bin, Li Shihua, and Liu Tianli on 2019-10-22. Summary: the invention relates to a grinding wheel dressing method and device based on machine vision. The dressing method collects a plurality of original contour images while the grinding wheel rotates, superposes all original contour images, finds the pixel edge points representing the outermost side of the grinding wheel, and determines, by analyzing these points, the dressing amount that maximizes the average exposure height of the grinding wheel working layer and minimizes the dispersion of the exposure height. Reducing the exposure-height dispersion increases the number of abrasive grains that participate in grinding simultaneously per unit time, and raising the average exposure height preserves the chip-holding space of the working layer, so that the dressed grinding wheel has a better grinding effect, can be used for precision grinding, and has more stable grinding performance.

1. A grinding wheel dressing method based on machine vision is characterized by comprising the following steps:

1) collecting at least two original contour images while the grinding wheel rotates; graying each original contour image and superposing them, and extracting the pixel edge points representing the outermost side of the grinding wheel to obtain the outermost contour line;

2) segmenting the outermost contour line according to the curvature change of the pixel edge point to obtain N basic line segments, wherein the basic line segments are straight line segments and/or circular arc segments;

3) if the basic line segments include straight-line segments, performing straight-line fitting on the pixel edge points in each straight-line segment, and calculating the average exposure height and variance between the pixel edge points and a parallel line offset from the corresponding fitted straight line by a distance a in the dressing direction, wherein a is the linear feed amount along the dressing direction; if the average exposure height is maximum and the variance is minimum after feeding f times, the dressing amount is fa;

if the basic line segments include arc segments, performing circle fitting on the pixel edge points in each arc segment, and calculating the average exposure height and variance between the pixel edge points and the point obtained by translating the center of the fitted arc by a distance b in the dressing direction, wherein b is the circular feed amount along the dressing direction; if the average exposure height is maximum and the variance is minimum after feeding f times, the dressing amount is fb;

wherein the dressing direction is the radial direction of the grinding wheel;

4) when the pixel edge points are divided only into straight-line segments, taking the maximum of the dressing amounts of all fitted straight lines as the final dressing amount for dressing the grinding wheel working layer; when the pixel edge points are divided only into arc segments, taking the maximum of the dressing amounts of all fitted arcs as the final dressing amount for dressing the grinding wheel working layer; and when the pixel edge points are divided into both straight-line segments and arc segments, taking the maximum of the dressing amounts of all fitted straight lines and all fitted arcs as the final dressing amount for dressing the grinding wheel working layer.

2. The machine vision-based grinding wheel dressing method according to claim 1, wherein the specific process of superposing the original profile images in step 1) and extracting the pixel edge points representing the outermost side of the grinding wheel comprises:

1) graying each original contour image to obtain the gray value of each pixel, and superposing the gray images by selecting, position by position, the pixel values that represent the outermost side of the grinding wheel on its circumference;

2) establishing a coordinate system with the pixel at the lower-left corner of the gray image as the coordinate origin and the positive X direction pointing horizontally to the right, and extracting the pixel edge points of each gray image using the Canny operator integrated in Matlab.

3. The machine vision-based grinding wheel dressing method according to claim 1 or 2, wherein a parallel line of any fitted straight line is represented as $Ax + By + C = 0$, and the point set of the pixel edge points remaining after the "dressing" corresponding to each parallel line is denoted $E(x_n, y_n)$, where n indexes the pixel edge points; the average exposure height between the pixel edge points and the corresponding parallel line is:

$$\bar{d}_{L1} = \frac{1}{N}\sum_{n=1}^{N} d_{L1n}$$

wherein $d_{L1n}$ is the distance between pixel edge point n and the parallel line:

$$d_{L1n} = \frac{|Ax_n + By_n + C|}{\sqrt{A^2 + B^2}}$$

the variance between the pixel edge point and the parallel line is:

$$\sigma_{L1}^2 = \frac{1}{N}\sum_{n=1}^{N}\left(d_{L1n} - \bar{d}_{L1}\right)^2$$

4. The machine vision-based grinding wheel dressing method according to claim 1 or 2, wherein the point obtained by translating the center of any fitted arc along the dressing direction is recorded as $U_{R1}(x_{R1}, y_{R1})$, the radius of the fitted arc is $r_1$, and the point set of the pixel edge points remaining after the "dressing" corresponding to each translation is denoted $F(x_m, y_m)$, where m indexes the pixel edge points; the average exposure height between the pixel edge points and the translated point is:

$$\bar{d}_{R1} = \frac{1}{M}\sum_{m=1}^{M} d_{R1m}$$

wherein $d_{R1m}$ is the distance between pixel edge point m and the arc of radius $r_1$ about the translated center:

$$d_{R1m} = \sqrt{(x_m - x_{R1})^2 + (y_m - y_{R1})^2} - r_1$$

if $d_{R1m} < 0$, then $d_{R1m} = 0$;

The variance between the pixel edge point and the translated point is:

$$\sigma_{R1}^2 = \frac{1}{M}\sum_{m=1}^{M}\left(d_{R1m} - \bar{d}_{R1}\right)^2$$

5. A machine vision-based grinding wheel dressing apparatus comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the machine vision-based grinding wheel dressing method of any one of claims 1 to 4.

Technical Field

The invention relates to a grinding wheel dressing method and a grinding wheel dressing device based on machine vision.

Background

Grinding wheels in common use include electroplated, vitrified (ceramic), resin-bond, and metal-bond grinding wheels. An electroplated grinding wheel generally refers to one whose working layer is produced by co-depositing a single layer of abrasive particles with a matrix metal on a steel substrate by electroplating. Compared with superhard-abrasive grinding wheels with resin, vitrified, or sintered-metal bonds, the electroplated grinding wheel has the advantages of a strong hold of the bond on the abrasive particles, a high exposure height of the abrasive particles, and a large chip-holding space. In addition, with the development of numerical control technology, the steel substrate can be made into various complex curved surfaces, i.e. the outer-circle profile of the electroplated grinding wheel can take various complex shapes, so electroplated grinding wheels are widely used in form grinding. However, because the consistency of abrasive-particle concentration and exposure height is difficult to control, the runout and the cross-sectional profile accuracy of the working layer are poor, and the electroplated grinding wheel is generally considered suitable only for rough grinding of complex part profiles.

To dress a grinding wheel, the profile of its working layer must be known. In the prior art this profile information is acquired by machine vision: a backlight source is directed along the tangential direction of the grinding wheel to be dressed, and the profile is captured by a corresponding CCD camera. The dressing condition of the grinding wheel can then be checked against the acquired profile information, and the wheel can be dressed by a dressing amount derived from it. However, for precision dressing with tiny profile changes, the prior art processes the collected profile data with low accuracy and cannot meet the requirements of precision dressing.

Disclosure of Invention

The invention aims to provide a grinding wheel dressing method based on machine vision, to solve the problem of poor grinding wheel dressing precision in the prior art; the invention also provides a grinding wheel dressing device based on machine vision, to solve the poor dressing precision of existing devices.

In order to achieve the above object, the present invention provides a grinding wheel dressing method based on machine vision, comprising the steps of:

1) collecting at least two original contour images while the grinding wheel rotates; graying each original contour image and superposing them, and extracting the pixel edge points representing the outermost side of the grinding wheel to obtain the outermost contour line;

2) segmenting the outermost contour line according to the curvature change of the pixel edge point to obtain N basic line segments, wherein the basic line segments are straight line segments and/or circular arc segments;

3) if the basic line segments include straight-line segments, performing straight-line fitting on the pixel edge points in each straight-line segment, and calculating the average exposure height and variance between the pixel edge points and a parallel line offset from the corresponding fitted straight line by a distance a in the dressing direction, wherein a is the linear feed amount along the dressing direction; if the average exposure height is maximum and the variance is minimum after feeding f times, the dressing amount is fa;

if the basic line segments include arc segments, performing circle fitting on the pixel edge points in each arc segment, and calculating the average exposure height and variance between the pixel edge points and the point obtained by translating the center of the fitted arc by a distance b in the dressing direction, wherein b is the circular feed amount along the dressing direction; if the average exposure height is maximum and the variance is minimum after feeding f times, the dressing amount is fb;

wherein the dressing direction is the radial direction of the grinding wheel;

4) when the pixel edge points are divided only into straight-line segments, taking the maximum of the dressing amounts of all fitted straight lines as the final dressing amount for dressing the grinding wheel working layer; when the pixel edge points are divided only into arc segments, taking the maximum of the dressing amounts of all fitted arcs as the final dressing amount for dressing the grinding wheel working layer; and when the pixel edge points are divided into both straight-line segments and arc segments, taking the maximum of the dressing amounts of all fitted straight lines and all fitted arcs as the final dressing amount for dressing the grinding wheel working layer.

The method has the advantages that contour information at a plurality of positions is acquired, all contour lines are superposed, the pixel edge points representing the outermost side of the grinding wheel are found, and the corresponding dressing amount is found by analyzing the pixel edge points, so that the average exposure height of the grinding wheel working layer is maximized and the dispersion (i.e., variance) of the exposure height is minimized. Reducing the exposure-height dispersion increases the number of abrasive grains participating in grinding simultaneously per unit time, and raising the average exposure height preserves the chip-holding space of the grinding wheel working layer, so that the dressed grinding wheel has a better grinding effect, can be used for precision grinding, and has more stable grinding performance.

Further, a mode of firstly superposing and then extracting pixel edge points is adopted to reduce the calculation amount and shorten the calculation time, and the specific process of superposing each original contour image in the step 1) and extracting the pixel edge points representing the outermost side of the grinding wheel comprises the following steps:

1) graying each original contour image to obtain the gray value of each pixel, and superposing the gray images by selecting, position by position, the pixel values that represent the outermost side of the grinding wheel on its circumference;

2) establishing a coordinate system with the pixel at the lower-left corner of the gray image as the coordinate origin and the positive X direction pointing horizontally to the right, and extracting the pixel edge points of each gray image using the Canny operator integrated in Matlab.

Further, a parallel line of any fitted straight line is represented as $Ax + By + C = 0$, and the point set of the pixel edge points remaining after the "dressing" corresponding to each parallel line is denoted $E(x_n, y_n)$, where n indexes the pixel edge points; the average exposure height between the pixel edge points and the parallel line is:

$$\bar{d}_{L1} = \frac{1}{N}\sum_{n=1}^{N} d_{L1n}$$

wherein $d_{L1n}$ is the distance between pixel edge point n and the parallel line:

$$d_{L1n} = \frac{|Ax_n + By_n + C|}{\sqrt{A^2 + B^2}}$$

the variance between the pixel edge point and the parallel line is:

$$\sigma_{L1}^2 = \frac{1}{N}\sum_{n=1}^{N}\left(d_{L1n} - \bar{d}_{L1}\right)^2$$

Further, the point obtained by translating the center of any fitted arc along the dressing direction is recorded as $U_{R1}(x_{R1}, y_{R1})$, and the radius of the fitted arc is $r_1$. The point set of the pixel edge points remaining after the "dressing" corresponding to each translation is denoted $F(x_m, y_m)$, where m indexes the pixel edge points; the average exposure height between the pixel edge points and the translated point is:

$$\bar{d}_{R1} = \frac{1}{M}\sum_{m=1}^{M} d_{R1m}$$

wherein $d_{R1m}$ is the distance between pixel edge point m and the arc of radius $r_1$ about the translated center:

$$d_{R1m} = \sqrt{(x_m - x_{R1})^2 + (y_m - y_{R1})^2} - r_1$$

if $d_{R1m} < 0$, then $d_{R1m} = 0$;

The variance between the pixel edge point and the translated point is:

$$\sigma_{R1}^2 = \frac{1}{M}\sum_{m=1}^{M}\left(d_{R1m} - \bar{d}_{R1}\right)^2$$

The invention provides a machine vision-based grinding wheel dressing device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the machine vision-based grinding wheel dressing method described above and its further refinements.

The device has the same advantages: contour information at a plurality of positions is acquired, all contour lines are superposed, the pixel edge points representing the outermost side of the grinding wheel are found, and the corresponding dressing amount is found by analyzing the pixel edge points, so that the average exposure height of the grinding wheel working layer is maximized and the dispersion (i.e., variance) of the exposure height is minimized. Reducing the exposure-height dispersion increases the number of abrasive grains participating in grinding simultaneously per unit time, and raising the average exposure height preserves the chip-holding space of the grinding wheel working layer, so that the dressed grinding wheel has a better grinding effect, can be used for precision grinding, and has more stable grinding performance.

Drawings

FIG. 1 is a schematic structural diagram of the setup used in the machine vision-based grinding wheel dressing method of the present invention;

FIG. 2 is an original profile image of a radial cross-section of a working layer of the grinding wheel of the present invention taken with a high resolution CCD camera;

FIG. 3 is a schematic diagram of the image processing by which the dressing amount of the grinding wheel is determined from the exposure height in the present invention;

FIG. 4 is a binary image of a radial cross section of a grinding wheel working layer of the present invention after edge extraction using a Matlab integrated Canny operator;

FIG. 5 is a profile of a radial cross section of a grinding wheel working layer of the present invention after converting the coordinate points of pixel edge points to actual dimensions;

FIG. 6 is a schematic diagram of the arc segment trimming of the present invention;

in the figure, 1 is a grinding wheel to be dressed, 2 is a rotating shaft, 3 is a workbench, 4 is a main shaft, 5 is a dressing roller, 6 is a computer, 11 is a grinding wheel base body to be dressed, 12 is an edge point, 13 is a fit line, 14 is a parallel line, 15 is a post-dressing point, and 16 is a pre-dressing and post-dressing common point.

Detailed Description

The present invention will be described in further detail with reference to the accompanying drawings.

Embodiment of the method:

The invention provides a grinding wheel dressing method based on machine vision, which requires collecting a group of original contour images over at least one full rotation of the grinding wheel. Fig. 1 is a schematic diagram of the principle of acquiring the original contour images of the grinding wheel profile; the setup in fig. 1 comprises a high-resolution CCD camera, a lens, a back light source, a main shaft 4, a rotating shaft 2, a workbench 3, a computer 6 with image processing and analysis software, a motion controller, and a servo system. The grinding wheel 1 to be dressed is mounted on the rotating shaft 2, and the dressing roller 5 is mounted on the main shaft 4. The back light source irradiates the surface of the working layer of the grinding wheel 1 to be dressed perpendicularly, and the high-resolution CCD camera and the controller are both connected to the computer 6, which is equipped with an image acquisition card. The rotating shaft 2 drives the grinding wheel 1 to be dressed to rotate slowly at a constant speed while the high-resolution CCD camera continuously shoots radial section images of the grinding wheel working layer, i.e. the original contour images, at a fixed frequency; the images are stored on the hard disk of the computer 6 through the image acquisition card. Each image is then processed by an image processing program written in Matlab to obtain the optimal dressing amount, which is transmitted to the motion controller; the motion controller drives the servo system to control the movement of the dressing roller 5 according to the dressing path generated from the theoretical profile of the radial section of the grinding wheel and the optimal dressing amount, finally completing the interpolation dressing of the grinding wheel 1 to be dressed in the XY plane. The grinding wheel to be dressed is hereinafter referred to simply as the grinding wheel.

The specific treatment method comprises the following steps:

1) The original contour images contain the information of the grinding wheel contour line; fig. 2 shows an original contour image of the radial section of the grinding wheel working layer shot by the high-resolution CCD camera. Each original contour image is grayed and superposed, and the pixel edge points representing the outermost side of the grinding wheel are extracted.

The original contour images are superposed and the pixel edge points representing the outermost side of the grinding wheel are extracted as follows:

① The obtained original contour images Image1, Image2, Image3, …, ImageN are subjected to graying processing.

② The gray images of all original contour images, namely Image1, Image2, Image3, …, ImageN, are superposed. Since the CCD camera has fixed pixels and a fixed position relative to the grinding wheel, the maximum gray value of the pixels at the same position in all images (i.e., the pixels of the outermost side of the grinding wheel on the same circumference) is selected one by one to form the final superposed pixel matrix.

③ A coordinate system is established with the pixel at the lower-left corner of the superposed image as the origin and the positive X direction pointing horizontally to the right, and the pixel edge points of the superposed image are extracted with the Canny operator integrated in Matlab.
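Steps ① to ③ (grayscale conversion, pixelwise-maximum superposition, edge extraction) can be sketched in a few lines. This is a simplified stand-in for the patent's Matlab pipeline, not its implementation: the Canny operator is replaced by a plain threshold plus column scan, and the function names and the threshold value 128 are assumptions.

```python
import numpy as np

def superpose(gray_images):
    """Pixelwise maximum over all grayscale frames: a pixel stays bright if
    the wheel silhouette covered it in ANY frame, which yields the outermost
    envelope of the rotating wheel (step 2 above)."""
    return np.maximum.reduce(gray_images)

def outermost_edge_points(superposed, threshold=128):
    """For each column x, return the topmost pixel at or above `threshold`,
    expressed in the document's coordinate system (origin at the lower-left
    pixel, X positive to the right). A simplified substitute for the Canny
    edge extraction of step 3."""
    h, w = superposed.shape
    points = []
    for x in range(w):
        rows = np.nonzero(superposed[:, x] >= threshold)[0]
        if rows.size:
            points.append((x, h - 1 - rows[0]))  # convert row index to lower-left origin
    return points
```

With real frames, `superpose` would receive the grayed images Image1 to ImageN as equally sized `uint8` arrays.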

The method first grays and superposes the original contour images, selecting the pixel points representing the outermost side of the grinding wheel, and then extracts the pixel edge points of the superposed image. As shown in fig. 3, Section1, Section2, …, SectionN represent the original contour images, and the superposed image is the grayed and superposed result; the pixel edge points are extracted from the superposed image, and finally the coordinate points of the pixel edge points are converted into a contour map with actual dimensions. The edge-extracted image is a binary image obtained after extracting edges with the Canny operator, for example the radial section of the grinding wheel working layer shown in fig. 4, obtained with the Canny operator integrated in Matlab; the contour map is obtained by converting the coordinate points of the pixel edge points into actual dimensions, for example the contour map of the radial section shown in fig. 5, where the coordinate unit is mm.

2) The contour is segmented according to the curvature change of the pixel edge points: feature points are found, and the complex figure is decomposed into basic line segments, namely straight lines and arcs. The superposed image after segmentation can be expressed as {straight line 1, straight line 2, …, straight line n, arc 1, arc 2, …, arc n}.
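A minimal sketch of the curvature-based segmentation, under stated assumptions: the contour points are ordered along the profile, and `angle_tol` is a hypothetical threshold (the source gives no numeric criterion). Consecutive chord directions are differenced twice, so the contour splits where the turning angle changes abruptly, i.e. at a feature point between two basic line segments.

```python
import numpy as np

def segment_by_curvature(points, angle_tol=0.05):
    """Split an ordered contour into basic segments wherever the discrete
    turning angle between consecutive chords changes abruptly (straight runs
    have ~zero turning angle, arcs a roughly constant one)."""
    pts = np.asarray(points, dtype=float)
    chords = np.diff(pts, axis=0)                     # chord vectors
    angles = np.arctan2(chords[:, 1], chords[:, 0])   # chord directions
    turning = np.diff(angles)                         # turning angle per interior point
    # a jump in turning angle marks a feature point between two segments
    breaks = np.nonzero(np.abs(np.diff(turning)) > angle_tol)[0] + 2
    segments, start = [], 0
    for b in breaks:
        segments.append(pts[start:b + 1])
        start = b
    segments.append(pts[start:])
    return segments
```

A straight run yields a single segment, while a sharp direction change splits the contour at the feature point.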

Each basic line segment is fitted by the least-squares method: straight-line fitting gives lines L1, L2, …, Ln, and circle fitting gives arcs R1, R2, …, Rn. Besides the least-squares method, existing straight-line fitting and circle fitting methods may also be used.
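The two least-squares fits can be sketched as follows. The algebraic (Kåsa) formulation is used for the circle; this is one common "existing circle fitting method" and an assumption here, since the source does not name a specific variant.

```python
import numpy as np

def fit_line(points):
    """Least-squares straight line through the edge points, returned as
    (A, B, C) with A*x + B*y + C = 0 (the form used in the text).
    Near-vertical segments would need a total-least-squares form instead."""
    x, y = points[:, 0], points[:, 1]
    k, c = np.polyfit(x, y, 1)   # y = k*x + c
    return k, -1.0, c            # k*x - y + c = 0

def fit_circle(points):
    """Algebraic (Kasa) least-squares circle fit; returns (cx, cy, r).
    Derived from (x-cx)^2 + (y-cy)^2 = r^2 rewritten as a linear system."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x**2 + y**2
    cx, cy, d = np.linalg.lstsq(A, b, rcond=None)[0]
    return cx, cy, np.sqrt(d + cx**2 + cy**2)
```

Each segment from the curvature-based split would be passed to the fit matching its type.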

3) Calculating the average exposure height of the abrasive grains of the grinding wheel working layer and the dispersion of the exposure height.

① Calculating the average exposure height and the exposure-height dispersion of the abrasive grains at all straight-line portions of the grinding wheel working layer; the process for the straight line at the 1st position is as follows:

The equation of line L1 is expressed as $Ax + By + C = 0$. If the linear feed amount in the X direction during dressing is a, then for the parallel line at a distance a from L1, the point set $E(x_n, y_n)$ of the pixel edge points remaining after this "virtual dressing" is calculated. If the average exposure height is maximum and the exposure-height dispersion is minimum after feeding f times, the dressing amount is $v_{L1} = fa$ (the product of f and a); the arc segments are handled in a similar manner (as shown in fig. 6).

The point set of the pixel edge points remaining after the "dressing" corresponding to each parallel line is denoted $E(x_n, y_n)$. Assume a common reference line L1′ parallel to L1, at a distance from L1 greater than fa, with equation $Ax + By + C' = 0$; the distance between any pixel edge point in the point set and L1′ is then:

$$d_{L1n} = \frac{|Ax_n + By_n + C'|}{\sqrt{A^2 + B^2}}$$

the average exposure height of the dressed abrasive grains is then:

$$\bar{d}_{L1} = \frac{1}{N}\sum_{n=1}^{N} d_{L1n}$$

the exposure-height dispersion of the dressed abrasive grains is:

$$\sigma_{L1}^2 = \frac{1}{N}\sum_{n=1}^{N}\left(d_{L1n} - \bar{d}_{L1}\right)^2$$

Similarly, the dressing amounts $v_{L2}, v_{L3}, \ldots, v_{Ln}$ of the remaining straight lines L2, L3, …, Ln, each determined by the average exposure height and the exposure-height dispersion of the abrasive grains, are calculated; the final dressing amount of the grinding wheel working layer determined by the average exposure height and exposure-height dispersion at all straight-line portions is

$$V_L = \max\{v_{L1}, v_{L2}, \ldots, v_{Ln}\}$$
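The straight-line "virtual dressing" loop above can be sketched as follows, with heights measured relative to the fitted line and the dressing plane stepping down by the feed a from the highest grain. This is a sketch under assumptions: the source states the target (average exposure height maximum, dispersion minimum) but no numeric selection rule, so the function only reports the statistics per feed count f and leaves the choice of f to the caller.

```python
import numpy as np

def virtual_dress_line(heights, a, max_feeds):
    """Simulate dressing a straight segment in steps of the linear feed `a`.
    `heights` holds the exposure heights of the edge points above the fitted
    line.  After f feeds the dressing plane sits f*a below the highest grain;
    grains protruding past the plane are cut back to it.  Returns a list of
    (f, mean exposure height, variance) tuples."""
    heights = np.asarray(heights, dtype=float)
    top = heights.max()
    results = []
    for f in range(1, max_feeds + 1):
        plane = top - f * a                    # dressing plane after f feeds
        dressed = np.minimum(heights, plane)   # protruding grains cut back
        results.append((f, dressed.mean(), dressed.var()))
    return results
```

The dressing amount for the segment would then be f*a for the chosen feed count f.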

② Calculating the average exposure height and the exposure-height dispersion of the abrasive grains at all arc portions of the grinding wheel working layer; the process for the 1st arc is as follows:

The center of the arc R1 fitted to the extracted outermost pixel edge points is recorded as $U_{R1}(x_{R1}, y_{R1})$, with radius $r_1$. If the circular feed amount in the X direction during dressing is b, the average exposure height and the exposure-height dispersion (i.e., variance) between the point set $F(x_m, y_m)$ of the remaining pixel edge points and the point obtained by shifting the center of arc R1 by a distance b in the X direction are calculated. If the average exposure height is maximum and the exposure-height dispersion is minimum after feeding f times, the dressing amount is $v_{R1} = fb$ (the product of f and b); the principle is shown in fig. 6. The calculation proceeds as follows: the set of outermost points of the whole outer-circle cross section of the working layer of the grinding wheel base body 11 to be dressed, such as the edge points 12 in fig. 6, is collected, and the edge points 12 are fitted to obtain the fitted line 13. A parallel line 14, parallel to the fitted line 13 (or of the same radius, for an arc), is then obtained; the offset of the curvature centers along the feed direction is the feed amount b of the imaginary circle. If the grinding wheel were dressed along this parallel line, some edge points would be cut; each such point is replaced by the intersection of the line connecting it to the center of curvature (the dotted line in fig. 6) with the parallel line 14 (if the fitted line is straight, by the foot of the perpendicular from the edge point to the parallel line), shown as the "post-dressing points 15" in the figure. Other edge points lie closer to the center of curvature, are not cut in a given "virtual dressing", and keep their positions (coordinates) unchanged, shown as the "pre- and post-dressing common points 16" in fig. 6.
All edge data points after dressing at feed amount b can be obtained by the above process; they consist of two parts, the "post-dressing points 15" and the "pre- and post-dressing common points 16", from which the average exposure height and the variance are calculated. The point set of the pixel edge points remaining after a given "dressing" is denoted $F(x_m, y_m)$; the exposure height of each abrasive grain is:

$$d_{R1m} = \sqrt{(x_m - x_{R1})^2 + (y_m - y_{R1})^2} - r_1, \qquad d_{R1m} < 0 \Rightarrow d_{R1m} = 0$$

where $(x_{R1}, y_{R1})$ is the translated center of arc R1.

the average exposure height of the abrasive grains at the arc R1 portion is:

$$\bar{d}_{R1} = \frac{1}{M}\sum_{m=1}^{M} d_{R1m}$$

the exposure-height dispersion of the abrasive grains at the arc R1 portion is:

$$\sigma_{R1}^2 = \frac{1}{M}\sum_{m=1}^{M}\left(d_{R1m} - \bar{d}_{R1}\right)^2$$

Similarly, the dressing amounts $v_{R2}, v_{R3}, \ldots, v_{Rn}$ of the remaining arcs R2, R3, …, Rn, each determined by the average exposure height and the exposure-height dispersion of the abrasive grains, are calculated; the final dressing amount of all arc portions of the grinding wheel working layer, determined by the average exposure height and exposure-height dispersion of the abrasive grains, is

$$V_R = \max\{v_{R1}, v_{R2}, \ldots, v_{Rn}\}$$
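For the arc portions, the per-point exposure height is the distance to the translated center minus the radius, with negative values clipped to zero per the d_R1m < 0 rule; a minimal sketch:

```python
import numpy as np

def arc_exposure_stats(points, center, r):
    """Mean exposure height and variance of edge points relative to a fitted
    arc of radius `r` about the (translated) `center`; negative heights are
    clipped to zero as in the text (d_R1m < 0  ->  d_R1m = 0)."""
    pts = np.asarray(points, dtype=float)
    d = np.hypot(pts[:, 0] - center[0], pts[:, 1] - center[1]) - r
    d = np.maximum(d, 0.0)
    return d.mean(), d.var()
```

Calling this once per candidate translation of the center (one per feed count f) reproduces the statistics from which the dressing amount f*b is chosen.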

In this embodiment of the method, the contour line is divided into two kinds of basic line segments, straight-line segments and arc segments; the final dressing amount $V_L$ for the straight-line segments and the final dressing amount $V_R$ for the arc segments are obtained respectively, and the larger of the two is taken as the dressing amount determined by the average exposure height and the exposure-height dispersion: $V_1 = \max\{V_L, V_R\}$.

In practical use, the contour line may also be divided only into straight-line segments or only into arc segments. If the basic line segments are all straight-line segments, the final dressing amount $V_L$ can be used directly, i.e. $V_1 = V_L$, to dress the grinding wheel working layer; if they are all arc segments, the final dressing amount $V_R$ can be used, i.e. $V_1 = V_R$, to dress the grinding wheel working layer.

The invention provides a specific grinding wheel dressing process, taking an electroplated CBN grinding wheel whose working-layer profile is a Gothic arc as an example; in this process the contour line is divided into arc segments only. The nominal dimensions of the wheel and the profile accuracy of the working layer are shown in Table 1. The Gothic arc is formed by joining two arcs of equal radius; the left and right contact angles refer to the angles, measured from the vertical, of the lines connecting the left and right tangent points to the ball center when a virtual steel ball of radius 3 mm is tangent to the two arcs of the Gothic arc.

TABLE 1

Figure BDA0002243208790000092

First, the electroplated CBN grinding wheel is mounted on the rotating shaft 2 and fastened.

Then, the position between the rotating shaft 2 and the high-resolution CCD camera is adjusted to enable the radial section of the grinding wheel working layer to be clearly imaged on the computer 6, the position of the CCD camera at the moment is fixed, and the technical parameters of the CCD camera are shown in the table 2.

The rotating shaft 2 is arranged on the workbench 3, a hand wheel and a locking device are arranged on the workbench 3, and the hand wheel can control the workbench 3 to move along the X direction and the Y direction; in addition, the high resolution CCD camera can be manually controlled to move in the Z direction. By adjusting the mutual position between the workbench 3 and the CCD camera, the radial section of the working layer of the electroplating grinding wheel can be clearly imaged on the computer 6.

TABLE 2

Model: Port View 300
Pixels: 1280 (horizontal) × 1024 (vertical)
Field of view [mm]: 5.69 (horizontal) × 4.55 (vertical)

Then a command is input into the computer 6 and transmitted to the motion controller; the motion controller drives the servo system so that the rotating shaft 2 drives the electroplated grinding wheel to rotate slowly and uniformly for at least one full turn, while the CCD camera continuously shoots original profile images of the radial section of the grinding wheel working layer at a fixed frequency. The images are stored in sequence on the hard disk of the computer 6 through the image acquisition card and named Image1, Image2, …, ImageN.

In the computer 6, image processing software based on Matlab performs the following operations on each original contour image to obtain the optimal dressing amount V_best:

① The acquired original contour images Image1, Image2, Image3, ……, ImageN are converted to grayscale.

② All grayscale images of the original contour images, i.e. Image1, Image2, Image3, ……, ImageN, are superposed: for each pixel position, the maximum gray value over all images is selected, forming the final superposed pixel matrix.
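The per-pixel maximum in step ② can be sketched in a few lines. This is a minimal illustration assuming the grayscale frames have already been loaded as equally sized 2-D arrays (the function name and demo data are purely illustrative):

```python
import numpy as np

def superpose(gray_images):
    """Per-pixel maximum over a stack of equally sized grayscale frames."""
    stack = np.stack(gray_images, axis=0)   # shape: (N, H, W)
    return stack.max(axis=0)                # shape: (H, W)

# Tiny demonstration with three 2x2 "frames".
frames = [np.array([[10, 200], [30, 40]], dtype=np.uint8),
          np.array([[50, 60], [70, 80]], dtype=np.uint8),
          np.array([[20, 90], [255, 10]], dtype=np.uint8)]
result = superpose(frames)
print(result)   # element-wise maximum of the three frames
```

Taking the maximum keeps, at every position, the brightest observation over the whole revolution, so the union of the rotating wheel's silhouettes (and thus its outermost envelope) is retained.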

③ A coordinate system is established with the pixel at the lower-left corner of the superposed image as the origin and the positive X direction pointing horizontally to the right, and the pixel edge points of the superposed image are extracted with the Canny operator integrated in Matlab.
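Given a binary edge image (from Canny or any edge detector), the points representing the outermost side of the wheel can be picked out column by column. This is a hedged sketch, not the patent's exact procedure: it assumes the wheel's outermost side corresponds to the largest Y in each image column, with the origin at the lower-left pixel as in step ③.

```python
import numpy as np

def outermost_contour(edge_img):
    """Return (x, y) pairs of the topmost edge pixel in each column,
    with the origin at the lower-left corner and Y pointing up."""
    h, w = edge_img.shape
    points = []
    for x in range(w):
        rows = np.flatnonzero(edge_img[:, x])   # row indices of edge pixels
        if rows.size:
            # Array row 0 is the top of the image, so convert to Y-up:
            y = (h - 1) - rows.min()
            points.append((x, y))
    return points

# Small synthetic edge image (1 = edge pixel) for demonstration.
edges = np.zeros((5, 4), dtype=np.uint8)
edges[3, 0] = edges[1, 1] = edges[2, 1] = edges[4, 3] = 1
print(outermost_contour(edges))
```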

④ The superposed image is segmented according to the curvature change of the edge points. With the highest data point as the feature point, the edge points on its left and right sides are divided into two groups, and each group is fitted with a least-squares circle to obtain the centers of the left and right fitted circles. The exposure (edge) height of each data point is obtained by subtracting the radius of the fitted circle from the distance between the data point and the corresponding center; distances smaller than the radius are treated as an exposure height of 0.

⑤ From the exposure heights, the optimal dressing amount V_best is determined; in this example, V_best is calculated to be 0.025 mm. In this step the basic line segments are circular arc segments only.
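The least-squares circle fit and edge-height rule of step ④ can be sketched as follows. The algebraic (Kåsa) formulation shown here is a common substitute for Matlab's fitting tools, not necessarily the patent's exact method; the synthetic data are illustrative.

```python
import numpy as np

def fit_circle(xs, ys):
    """Algebraic least-squares circle (Kasa method):
    solve x^2 + y^2 = 2a*x + 2b*y + c for (a, b, c);
    center = (a, b), radius = sqrt(c + a^2 + b^2)."""
    A = np.column_stack([2 * xs, 2 * ys, np.ones_like(xs)])
    rhs = xs ** 2 + ys ** 2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return a, b, np.sqrt(c + a ** 2 + b ** 2)

def edge_heights(xs, ys, cx, cy, r):
    """Exposure height = distance to fitted center minus fitted radius,
    with points inside the circle counted as height 0 (per step ④)."""
    d = np.hypot(xs - cx, ys - cy)
    return np.maximum(d - r, 0.0)

# Synthetic check: 50 points on a semicircle of radius 2 centered at (1, -1),
# one of them pushed 0.3 outward to mimic a protruding abrasive grain.
t = np.linspace(0, np.pi, 50)
xs, ys = 1 + 2 * np.cos(t), -1 + 2 * np.sin(t)
xs[10] = 1 + 2.3 * np.cos(t[10])
ys[10] = -1 + 2.3 * np.sin(t[10])
cx, cy, r = fit_circle(xs, ys)
h = edge_heights(xs, ys, cx, cy, r)
print(round(r, 2), round(h.max(), 2))
```

The fitted radius stays close to 2 despite the outlier, and the maximum exposure height recovers the protrusion of roughly 0.3.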

Finally, a dressing path is generated from the theoretical profile of the radial section of the electroplated grinding wheel working layer and imported into the CAM software on the computer 6, which generates a numerical control program from the path. According to this program, the motion controller drives the servo system to control the motion of the dressing roller 5: the roller is fed 0.025 mm along the X direction, and interpolation dressing of the electroplated grinding wheel is completed in the XY plane. The basic technical parameters of the dressing roller 5 are listed in Table 3.
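Discretizing one arc segment of the dressing path can be sketched as below: the theoretical arc is sampled and offset toward the wheel axis by the dressing amount. Only V_best = 0.025 mm comes from the example; the arc center, radius, angular range, and step count are illustrative assumptions (a real CAM system would generate the interpolation itself).

```python
import math

V_BEST = 0.025          # dressing amount [mm], from the example above
CX, CY, R = 0.0, 0.0, 5.0                          # assumed arc geometry
START, END, STEPS = math.radians(30), math.radians(150), 12

# Sample the theoretical arc and pull each point radially inward by V_BEST,
# giving the target contour after dressing.
path = []
for i in range(STEPS + 1):
    a = START + (END - START) * i / STEPS
    rr = R - V_BEST
    path.append((CX + rr * math.cos(a), CY + rr * math.sin(a)))

print(len(path), path[0])
```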

TABLE 3

Specification [ mm ]] 3F1 125*31.75*6*2
Abrasive grain Diamond
Abrasive grain size 170/200
Binding agents Sintered metal binder
Dressing rotational speed [ rpm] 6500

The profile accuracy of the grinding wheel working layer was also verified: a graphite sample wafer 2 mm thick was ground with the dressed electroplated CBN grinding wheel and measured with a HOMEL T8000RC200-400 profilometer, which reflects the profile accuracy of the working layer. The accuracy of the dressed working layer is shown in Table 4; as can be seen from Table 4, it lies within the tolerance range.

TABLE 4 (measured profile accuracy of the dressed working layer; reproduced as images in the original document)

The embodiment of the device is as follows:

The invention also provides a grinding wheel dressing device based on machine vision, comprising a memory, a processor, and a computer program stored in the memory and runnable on the processor; when the processor executes the program, the steps of the above dressing method are carried out. The dressing device may, of course, be incorporated into the computer 6 shown in Fig. 1.

The grinding wheel dressing method of the invention can be applied to electroplated grinding wheels with complex profiles, overcoming the prior-industry limitation that for such wheels the dressing amount cannot be accurately controlled and interpolation dressing of the working layer is difficult. With image processing software written in Matlab, an accurate method for calculating the dressing amount based on the runout of the grinding wheel working layer is provided, which guarantees the profile accuracy of the working layer and can greatly improve dressing efficiency.
