Method for interpreting fine control quantity of mechanical arm and computer-readable storage medium

Document No.: 1913662 | Publication date: 2021-12-03

Note: This invention, "Method for interpreting fine adjustment control quantity of a mechanical arm and computer-readable storage medium", was created by Li Lichun, Sun Jun, Miao Yi, Cheng Xiao, Feng Xiaomeng, Li Guiliang, Yu Chunhong, Liu Xiaohui and Liu Zuocheng on 2020-05-27. Its main content is as follows: The invention discloses a method for interpreting a fine adjustment control quantity of a mechanical arm and a computer-readable storage medium. The method comprises: pre-storing a target scene image; acquiring a real-time scene image; extracting reference scale marks from the target scene image and the real-time scene image respectively, and correcting the real-time scene image according to the scale relation between the two images to obtain a corrected image whose imaging size is the same as that of the target scene image; determining the amount by which the mechanical arm is to be moved in the optical-axis direction according to the scale change relation between the real-time scene image and the corrected image; and extracting positioning reference points from the target scene image and the corrected image respectively, and calculating the amount by which the mechanical arm is to be moved in the direction perpendicular to the optical axis according to the position deviation between the two positioning reference points. The method can measure accurate control quantities without requiring the camera to have accurate or valid position and attitude calibration parameters, and has strong adaptability.

1. A method for interpreting a fine adjustment control quantity of a mechanical arm on which a camera is mounted, the method comprising:

pre-storing a target scene image, wherein the target scene image is an image of the scene plane perpendicular to the optical-axis direction, captured by the camera when the mechanical arm has moved to a target position;

acquiring a real-time scene image, wherein the real-time scene image is an image of the scene plane perpendicular to the optical-axis direction, captured by the camera with the mechanical arm in its current position;

extracting reference scale marks from the target scene image and the real-time scene image respectively, and correcting the real-time scene image according to the scale relation between the two images to obtain a corrected image, wherein the imaging size of the corrected image is the same as that of the target scene image;

determining the amount by which the mechanical arm is to be moved in the optical-axis direction according to the scale change relation between the real-time scene image and the corrected image; and

extracting positioning reference points from the target scene image and the corrected image respectively, and calculating the amount by which the mechanical arm is to be moved in the direction perpendicular to the optical axis according to the position deviation between the two positioning reference points.

2. The method for interpreting a fine adjustment control quantity of a mechanical arm according to claim 1, wherein extracting reference scale marks from the target scene image and the real-time scene image respectively, and correcting the real-time scene image according to the scale relation between the two images, comprises:

extracting a first feature point and a second feature point on the target scene image, and recording the position coordinates of the first feature point and the second feature point on the target scene image;

extracting the homonymous matching points of the first feature point and the second feature point on the real-time scene image, and recording the position coordinates of each homonymous matching point in the real-time scene image;

calculating the feature scales of the target scene image and the real-time scene image respectively;

calculating a scaling factor for correcting the real-time scene image; and

calculating the point correspondence between the corrected image and the real-time scene image according to the optical center coordinates of the real-time scene image and the scaling factor, and generating the corrected image corresponding to the real-time scene image by resampling based on bilinear interpolation.

3. The method for interpreting a fine adjustment control quantity of a mechanical arm according to claim 2, wherein calculating the feature scales of the target scene image and the real-time scene image respectively comprises:

calculating the feature scale d_0 of the target scene image according to a first formula, wherein the first formula is d_0 = sqrt((x_2^0 - x_1^0)^2 + (y_2^0 - y_1^0)^2), (x_1^0, y_1^0) are the position coordinates of the first feature point on the target scene image, and (x_2^0, y_2^0) are the position coordinates of the second feature point on the target scene image; and

calculating the feature scale d_i of the real-time scene image according to a second formula, wherein the second formula is d_i = sqrt((x_2^i - x_1^i)^2 + (y_2^i - y_1^i)^2), (x_1^i, y_1^i) are the position coordinates of the homonymous matching point of the first feature point on the real-time scene image, and (x_2^i, y_2^i) are the position coordinates of the homonymous matching point of the second feature point on the real-time scene image.

4. The method for interpreting a fine adjustment control quantity of a mechanical arm according to claim 3, wherein calculating a scaling factor for correcting the real-time scene image comprises:

calculating the scaling factor k_i according to a third formula, wherein the third formula is k_i = d_0 / d_i.

5. The method for interpreting a fine adjustment control quantity of a mechanical arm according to claim 4, wherein calculating the point correspondence between the corrected image and the real-time scene image according to the optical center coordinates of the real-time scene image and the scaling factor, and generating the corrected image corresponding to the real-time scene image by resampling based on bilinear interpolation, comprises:

obtaining the optical center coordinates P_C(x_C, y_C) of the real-time scene image;

determining the correspondence between a point p_i'(x_i', y_i') on the corrected image and an image point p_i(x_i, y_i) on the real-time scene image, wherein the correspondence is x_i = x_C + (x_i' - x_C)/k_i and y_i = y_C + (y_i' - y_C)/k_i; and

for each pixel point on the corrected image, determining the position of its homonymous point on the real-time scene image according to the correspondence, calculating the pixel gray value at that position, and assigning that gray value to the pixel point on the corrected image, thereby obtaining the corrected image.

6. The method for interpreting a fine adjustment control quantity of a mechanical arm according to claim 3, wherein determining the amount by which the mechanical arm is to be moved in the optical-axis direction according to the scale change relation between the real-time scene image and the corrected image comprises:

determining the actual physical dimension D_0 between the first feature point and the second feature point; and

calculating the amount ΔZ by which the mechanical arm is to be moved in the optical-axis direction according to a fourth formula, wherein the fourth formula is ΔZ = f·D_0·(1/d_i - 1/d_0), and f is the focal length of the camera of the mechanical arm.

7. The method for interpreting a fine adjustment control quantity of a mechanical arm according to claim 6, wherein calculating the amount by which the mechanical arm is to be moved in the direction perpendicular to the optical axis according to the position deviation between the two positioning reference points comprises:

extracting the homonymous matching point of the first feature point on the corrected image, and recording the position coordinates (x_1^i', y_1^i') of that homonymous matching point in the corrected image;

calculating the spatial resolution r_0 of the target scene image according to a fifth formula, wherein the fifth formula is r_0 = D_0 / d_0; and

calculating, according to the coordinates (x_1^0, y_1^0) of the first feature point on the target scene image, the position coordinates (x_1^i', y_1^i') of the homonymous matching point of the first feature point in the corrected image, and the spatial resolution r_0, the amount ΔX by which the mechanical arm is to be moved in a first direction and the amount ΔY by which it is to be moved in a second direction on the plane perpendicular to the optical axis.

8. The method for interpreting a fine adjustment control quantity of a mechanical arm according to claim 7, wherein calculating the amount ΔX by which the mechanical arm is to be moved in the first direction on the plane perpendicular to the optical axis comprises:

calculating ΔX according to a sixth formula, wherein the sixth formula is ΔX = r_0·(x_1^i' - x_1^0).

9. The method for interpreting a fine adjustment control quantity of a mechanical arm according to claim 7, wherein calculating the amount ΔY by which the mechanical arm is to be moved in the second direction on the plane perpendicular to the optical axis comprises:

calculating ΔY according to a seventh formula, wherein the seventh formula is ΔY = r_0·(y_1^i' - y_1^0).

10. A computer-readable storage medium storing a program which, when executed, performs the method for interpreting a fine adjustment control quantity of a mechanical arm according to any one of claims 1 to 9.

Technical Field

The invention relates to the technical field of measurement and control, in particular to a method for interpreting fine adjustment control quantity of a mechanical arm and a computer readable storage medium.

Background

The key step in robot arm motion control is measuring the relative position between the arm tip (the end effector or gripper) and the object of operation (a grasp target or a lofting position). One class of existing methods installs a high-performance camera at the end of the mechanical arm and embeds a complex hand-eye system in the arm controller; the controller processes the imaging information of the hand-eye system in real time and uses the image information of a cooperative target in the field of view for resolving. Another typical solution installs a visual monitoring measurement or active monitoring measurement system in the working scene, outside the mechanical arm itself; measurement data are processed in real time by devices and a processing system independent of the arm, and the resolved relative position between the arm and the operation object is transmitted to the arm control system.

The inventors have found that relative position measurement based on a robot-arm hand-eye system has several drawbacks: it demands strong professional expertise, it requires specialist staff to calibrate the camera parameters and the three-dimensional structure of the cooperative target in advance, and it loses its working capability once working conditions change.

Relative position measurement based on monitoring-image calculation is often limited to settings with a single, fixed working environment; the target and the mechanical arm must be imaged simultaneously by binocular or multiple cameras, the working range is small, and the precision is difficult to guarantee.

In relative position measurement based on active-sensor monitoring, an active ranging device such as a laser rangefinder or an infrared active rangefinder is mounted at the end of the mechanical arm, and the relative position between the arm and the operation target is calculated by processing the detection data of that device.

The information disclosed in this background section is only for enhancement of understanding of the general background of the invention and should not be taken as an acknowledgement or any form of suggestion that this information forms the prior art already known to a person skilled in the art.

Disclosure of Invention

The invention aims to provide a method for interpreting a fine adjustment control quantity of a mechanical arm and a computer-readable storage medium, which can measure an accurate control quantity without requiring the camera to have accurate or valid position and attitude calibration parameters, and which have strong adaptability.

In order to achieve the above object, the present invention provides a method for interpreting a fine adjustment control quantity of a mechanical arm on which a camera is mounted, the method comprising: pre-storing a target scene image, wherein the target scene image is an image of the scene plane perpendicular to the optical-axis direction, captured by the camera when the mechanical arm has moved to a target position; acquiring a real-time scene image, wherein the real-time scene image is an image of the scene plane perpendicular to the optical-axis direction, captured by the camera with the mechanical arm in its current position; extracting reference scale marks from the target scene image and the real-time scene image respectively, and correcting the real-time scene image according to the scale relation between the two images to obtain a corrected image whose imaging size is the same as that of the target scene image; determining the amount by which the mechanical arm is to be moved in the optical-axis direction according to the scale change relation between the real-time scene image and the corrected image; and extracting positioning reference points from the target scene image and the corrected image respectively, and calculating the amount by which the mechanical arm is to be moved in the direction perpendicular to the optical axis according to the position deviation between the two positioning reference points.

In an embodiment of the present invention, extracting reference scale marks from the target scene image and the real-time scene image respectively, and correcting the real-time scene image according to the scale relation between the two images, includes: extracting a first feature point and a second feature point on the target scene image, and recording their position coordinates on the target scene image; extracting the homonymous matching points of the first feature point and the second feature point on the real-time scene image, and recording the position coordinates of each homonymous matching point in the real-time scene image; calculating the feature scales of the target scene image and the real-time scene image respectively; calculating a scaling factor for correcting the real-time scene image; and calculating the point correspondence between the corrected image and the real-time scene image according to the optical center coordinates of the real-time scene image and the scaling factor, and generating the corrected image corresponding to the real-time scene image by resampling based on bilinear interpolation.

In an embodiment of the present invention, calculating the feature scales of the target scene image and the real-time scene image respectively includes: calculating the feature scale d_0 of the target scene image according to a first formula, d_0 = sqrt((x_2^0 - x_1^0)^2 + (y_2^0 - y_1^0)^2), where (x_1^0, y_1^0) are the position coordinates of the first feature point on the target scene image and (x_2^0, y_2^0) are the position coordinates of the second feature point on the target scene image; and calculating the feature scale d_i of the real-time scene image according to a second formula, d_i = sqrt((x_2^i - x_1^i)^2 + (y_2^i - y_1^i)^2), where (x_1^i, y_1^i) are the position coordinates of the homonymous matching point of the first feature point on the real-time scene image and (x_2^i, y_2^i) are the position coordinates of the homonymous matching point of the second feature point on the real-time scene image.

In an embodiment of the present invention, calculating a scaling factor for correcting the real-time scene image includes: calculating the scaling factor k_i according to a third formula, k_i = d_0 / d_i.

In an embodiment of the present invention, calculating the point correspondence between the corrected image and the real-time scene image according to the optical center coordinates of the real-time scene image and the scaling factor, and generating the corrected image corresponding to the real-time scene image by resampling based on bilinear interpolation, includes: obtaining the optical center coordinates P_C(x_C, y_C) of the real-time scene image; determining the correspondence between a point p_i'(x_i', y_i') on the corrected image and an image point p_i(x_i, y_i) on the real-time scene image, wherein the correspondence is x_i = x_C + (x_i' - x_C)/k_i and y_i = y_C + (y_i' - y_C)/k_i; and, for each pixel point on the corrected image, determining the position of its homonymous point on the real-time scene image according to the correspondence, calculating the pixel gray value at that position, and assigning that gray value to the pixel point on the corrected image, thereby obtaining the corrected image.
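As a minimal sketch of this point correspondence (the function and variable names here are illustrative, not from the patent), a corrected-image pixel is mapped back into the real-time image by undoing the scaling about the optical center:

```python
def corrected_to_realtime(x_p, y_p, xc, yc, ki):
    """Map a pixel (x_p, y_p) on the corrected image to its homonymous
    position on the real-time scene image, given the optical center
    (xc, yc) and the scaling factor ki = d0 / di."""
    x = xc + (x_p - xc) / ki
    y = yc + (y_p - yc) / ki
    return x, y
```

When k_i > 1 (the arm is currently farther from the target plane than at the target position), the corrected image magnifies the real-time image about the optical center, so mapped points are pulled toward the center; the optical center itself is a fixed point of the mapping.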

In an embodiment of the present invention, determining the amount by which the mechanical arm is to be moved in the optical-axis direction according to the scale change relation between the real-time scene image and the corrected image includes: determining the actual physical dimension D_0 between the first feature point and the second feature point; and calculating the amount ΔZ by which the mechanical arm is to be moved in the optical-axis direction according to a fourth formula, ΔZ = f·D_0·(1/d_i - 1/d_0), where f is the focal length of the camera of the mechanical arm.
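The fourth formula can be motivated from the pinhole imaging model; the following is a sketch of the reasoning, under the assumption that the two scale marks lie in a plane perpendicular to the optical axis:

```latex
% Pinhole model: a physical length D_0 at depth Z along the optical
% axis images to a length d = f D_0 / Z on the image plane.
% At the current state i and at the target state 0:
%   Z_i = \frac{f D_0}{d_i}, \qquad Z_0 = \frac{f D_0}{d_0}.
% The amount to be moved along the optical axis is the depth difference:
\Delta Z = Z_i - Z_0 = f D_0 \left( \frac{1}{d_i} - \frac{1}{d_0} \right)
```

With this sign convention, d_i < d_0 (the scale marks image smaller than at the target) gives ΔZ > 0, i.e. the arm still has to move toward the target plane.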

In an embodiment of the present invention, calculating the amount by which the mechanical arm is to be moved in the direction perpendicular to the optical axis according to the position deviation between the two positioning reference points includes: extracting the homonymous matching point of the first feature point on the corrected image, and recording its position coordinates (x_1^i', y_1^i') in the corrected image; calculating the spatial resolution r_0 of the target scene image according to a fifth formula, r_0 = D_0 / d_0; and calculating, from the coordinates (x_1^0, y_1^0) of the first feature point on the target scene image, the coordinates (x_1^i', y_1^i') of its homonymous matching point in the corrected image, and the spatial resolution r_0, the amount ΔX by which the mechanical arm is to be moved in a first direction and the amount ΔY in a second direction on the plane perpendicular to the optical axis.

In an embodiment of the present invention, calculating the amount ΔX by which the mechanical arm is to be moved in the first direction on the plane perpendicular to the optical axis includes: calculating ΔX according to a sixth formula, ΔX = r_0·(x_1^i' - x_1^0).

In an embodiment of the present invention, calculating the amount ΔY by which the mechanical arm is to be moved in the second direction on the plane perpendicular to the optical axis includes: calculating ΔY according to a seventh formula, ΔY = r_0·(y_1^i' - y_1^0).

The invention further provides a computer-readable storage medium storing a program for executing the method for interpreting a fine adjustment control quantity of a mechanical arm according to any of the above embodiments.

Compared with the prior art, the method for interpreting the fine adjustment control quantity of the mechanical arm interprets, during the fine adjustment operation, the comparison between the current-state image and the target-position image captured at a frontal viewing angle by the camera carried on the arm, using a target reference scale and reference mark points, and obtains the accurate distance of the arm's operating mechanism relative to the final operating position through image correction, offset calculation and other processing. Compared with methods based on a hand-eye system or on resolving monitoring images external to the mechanical arm, the method of the invention does not require the camera to have accurate or valid position and attitude calibration parameters; it can compute the motion control quantity when no accurate camera calibration parameters are available or when the camera parameters have become invalid, which enhances the adaptability of measuring the relative position of the arm and the operation target during control. Compared with stereoscopic-vision relative position measurement, which requires a common binocular field of view, the method can obtain a larger common field of view and has stronger working adaptability. In addition, compared with active-sensor measurement methods, the method relies on passive measuring equipment, which is small in size, weight and energy consumption and highly adaptable.

Drawings

FIG. 1 is a block diagram of the steps of a method for interpreting the fine control of a robot arm according to an embodiment of the present invention;

FIG. 2 is a schematic view of a camera configuration for imaging interpretation of the position of a robotic arm during motion in accordance with an embodiment of the present invention;

FIG. 3 is a schematic diagram of an image of a target scene according to an embodiment of the invention;

FIG. 4 is a schematic diagram of a real-time scene image according to an embodiment of the invention;

FIG. 5 is a schematic diagram of a corrected image according to an embodiment of the invention.

Detailed Description

The following detailed description of the present invention is provided in conjunction with the accompanying drawings, but it should be understood that the scope of the present invention is not limited to the specific embodiments.

Throughout the specification and claims, unless explicitly stated otherwise, the word "comprise", or variations such as "comprises" or "comprising", will be understood to imply the inclusion of a stated element or component but not the exclusion of any other element or component.

In order to overcome the problems in the prior art, the invention provides a method for interpreting the fine adjustment control quantity of a mechanical arm. On the premise that the imaging area contains a target reference scale and reference mark points arranged in a direction perpendicular to the optical axis (for example, two feature points, or feature marks such as a circle or a rectangle), the method compares and interprets the current-state image against the target-position image captured at a frontal viewing angle by the camera carried on the mechanical arm during the fine adjustment operation, obtains the accurate distance of the end of the arm's operating mechanism relative to the final operating position (an operation target or a lofting target position) through image correction, offset calculation and other processing, and provides guidance information for the motion control of the mechanical arm.

Fig. 1 is a block diagram of the steps of a method for interpreting the fine control quantity of a robot arm according to an embodiment of the present invention. The method includes steps S1 to S5.

The target scene image is prestored in step S1. The target scene image is an image which is shot by the camera and is perpendicular to the optical axis direction when the mechanical arm moves to the target position state.

A real-time scene image is acquired in step S2. The real-time scene image is an image which is shot by the camera and is perpendicular to the optical axis direction in the current position state of the mechanical arm.

The real-time scene image is corrected in step S3 so that the resulting corrected image has the same imaging size as the target scene image. Specifically, reference scale marks are extracted from the target scene image and the real-time scene image respectively, and the real-time scene image is corrected according to the scale relation between the two images to obtain a corrected image whose imaging size is the same as that of the target scene image. The reference scale mark can be two feature points, or one feature point plus one reference scale, such as the center and radius of a marker circle.

In step S4, the amount by which the mechanical arm is to be moved in the optical-axis direction is determined according to the scale change relation between the real-time scene image and the corrected image.

The amount by which the mechanical arm is to be moved in the direction perpendicular to the optical axis is calculated in step S5. Positioning reference points are extracted from the target scene image and the corrected image respectively, and the amount by which the arm is to be moved in the direction perpendicular to the optical axis is calculated according to the position deviation between the two positioning reference points.

Specifically, in the present embodiment, a typical arrangement of the camera used for imaging interpretation of the mechanical arm's position during motion is shown in Fig. 2. The camera is mounted at the end of the arm with its optical axis perpendicular to the target area plane, so that the camera can image the target area during operation. A coordinate system O-XYZ is established for the motion control work, with the plane O-XY parallel to the marking area plane. In this method, recognizable targets in the field of view are selected as interpretation auxiliary marks; the marks can be two feature points, such as the points P_1^0 and P_2^0 in Fig. 2. In other embodiments, for a scene without two feature-point markers in the imaging range, the method may be implemented based on one feature-point marker and one reference scale; for example, with a circle marker, the steps of this embodiment can be followed by using the circle radius as the reference scale and the circle center as the feature point marker.

The specific steps of the present embodiment are as follows.

First, the target scene image and the real-time scene image are acquired. An image A, captured in advance by the camera carried on the mechanical arm with the arm at the target position, is shown in Fig. 3. In the target scene image A, the feature points P_1^0(x_1^0, y_1^0) and P_2^0(x_2^0, y_2^0) are selected; taking the positions of these two points as centers, the image patches near them are selected as matching templates T_1 and T_2 respectively. In motion state i of the mechanical arm, the camera carried on the arm images the feature-point field of view to obtain the real-time scene image A_i, shown in Fig. 4.

Then the real-time scene image is corrected to obtain a corrected image; the corrected image A_i' is shown in Fig. 5. According to the image coordinates (x_1^0, y_1^0) and (x_2^0, y_2^0) of the feature points P_1^0 and P_2^0 on the target scene image A, initial values of the corresponding feature point positions on the real-time scene image A_i are determined; matching and locating in A_i with the two template images T_1 and T_2 yields the homonymous matching points p_1^i and p_2^i, with coordinates (x_1^i, y_1^i) and (x_2^i, y_2^i) respectively. The feature scales d_i and d_0 on the real-time scene image A_i and the target scene image A are calculated as d_0 = sqrt((x_2^0 - x_1^0)^2 + (y_2^0 - y_1^0)^2) and d_i = sqrt((x_2^i - x_1^i)^2 + (y_2^i - y_1^i)^2). The scaling factor for real-time correction is k_i = d_0 / d_i. The point correspondence between the corrected image A_i' and the real-time scene image A_i is then calculated, and the corrected image A_i' corresponding to A_i is generated by resampling based on bilinear interpolation. Specifically, the optical center coordinates P_C(x_C, y_C) of the real-time scene image A_i are obtained, and the correspondence between a point p_i'(x_i', y_i') on the corrected image A_i' and an image point p_i(x_i, y_i) on A_i is determined as x_i = x_C + (x_i' - x_C)/k_i and y_i = y_C + (y_i' - y_C)/k_i. According to this correspondence, for each pixel point on the corrected image A_i', the position of its homonymous point on A_i is determined, the pixel gray value there is calculated by bilinear interpolation, and that value is assigned, pixel by pixel, to the corrected image A_i', thereby obtaining the corrected image A_i'.
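The correction step above can be sketched in Python with NumPy. This is a simplified illustration under assumed conventions (grayscale image stored as a 2-D array indexed [row = y, column = x]; pixels mapping outside the source image are left at zero); the names are illustrative, not from the patent:

```python
import numpy as np

def correct_image(live, xc, yc, ki):
    """Resample the real-time image `live` (2-D grayscale array) into a
    corrected image of the same size, scaled by ki = d0/di about the
    optical center (xc, yc), using bilinear interpolation."""
    h, w = live.shape
    out = np.zeros_like(live, dtype=float)
    for yp in range(h):
        for xp in range(w):
            # point correspondence: corrected pixel -> real-time coordinates
            x = xc + (xp - xc) / ki
            y = yc + (yp - yc) / ki
            x0, y0 = int(np.floor(x)), int(np.floor(y))
            if 0 <= x0 < w - 1 and 0 <= y0 < h - 1:
                fx, fy = x - x0, y - y0
                # bilinear interpolation of the four neighboring pixels
                out[yp, xp] = ((1 - fx) * (1 - fy) * live[y0, x0]
                               + fx * (1 - fy) * live[y0, x0 + 1]
                               + (1 - fx) * fy * live[y0 + 1, x0]
                               + fx * fy * live[y0 + 1, x0 + 1])
    return out
```

A production implementation would typically use a library resampler (e.g. an affine warp) rather than an explicit double loop; the loop form is kept here to mirror the per-pixel description in the text.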

Next, the amount ΔZ to be moved in the Z-axis direction (the optical-axis direction) is calculated from the above correspondence as ΔZ = f·D_0·(1/d_i - 1/d_0), where f is the focal length of the camera and D_0 is the actual physical dimension of the reference scale; in this embodiment, D_0 is the actual distance between the two feature points P_1^0 and P_2^0.

Finally, the amounts to be moved in the X and Y directions (perpendicular to the optical axis) are calculated. According to the image coordinates (x_1^0, y_1^0) of the feature point P_1^0 on the target scene image A, an initial value of the corresponding feature point position on the corrected image A_i' is determined; matching and locating in A_i' with the template image T_1 yields the homonymous matching point p_1^i', with coordinates (x_1^i', y_1^i'). The amount ΔX to be moved in the X direction is calculated as ΔX = r_0·(x_1^i' - x_1^0), where r_0 = D_0 / d_0 is the spatial resolution of the standard (target) image; the amount ΔY to be moved in the Y direction is calculated as ΔY = r_0·(y_1^i' - y_1^0).
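Putting the embodiment's three control quantities together, the calculation can be sketched in Python (the function name and argument layout are illustrative; the ΔZ expression is the pinhole-model form used above, so treat its sign convention as an assumption of this sketch):

```python
import math

def fine_control_quantities(p1_t, p2_t, p1_i, p2_i, p1_corr, f, D0):
    """Compute (dX, dY, dZ) from:
    p1_t, p2_t -- feature points P1, P2 on the target scene image (pixels)
    p1_i, p2_i -- their homonymous matches on the real-time image (pixels)
    p1_corr    -- homonymous match of P1 on the corrected image (pixels)
    f          -- camera focal length in pixels
    D0         -- actual physical distance between P1 and P2
    """
    d0 = math.dist(p1_t, p2_t)           # feature scale, target image
    di = math.dist(p1_i, p2_i)           # feature scale, real-time image
    dZ = f * D0 * (1.0 / di - 1.0 / d0)  # motion along the optical axis
    r0 = D0 / d0                         # spatial resolution of target image
    dX = r0 * (p1_corr[0] - p1_t[0])     # motion along X, perpendicular to axis
    dY = r0 * (p1_corr[1] - p1_t[1])     # motion along Y, perpendicular to axis
    return dX, dY, dZ
```

For example, with P1 and P2 imaged 200 px apart on the target image but only 100 px apart in the current state, f = 1000 px and D0 = 0.2 m, the arm still has 1.0 m to travel along the optical axis; the in-plane offsets scale the pixel deviation of P1 by r_0 = 0.001 m/px.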

Based on the same inventive concept, the present embodiment also provides a computer-readable storage medium for executing the method for interpreting the fine control amount of the mechanical arm according to the above embodiment.

In summary, the method for interpreting the fine adjustment control quantity of the mechanical arm in this embodiment interprets, during the fine adjustment operation, the comparison between the current-state image and the target-position image captured at a frontal viewing angle by the camera carried on the arm, based on the target reference scale and the reference mark points, and obtains the accurate distance between the end of the arm's operating mechanism and the final operating position (the operation target or the lofting target position) through image correction, offset calculation and other processing. Compared with methods based on a hand-eye system or on resolving monitoring images external to the mechanical arm, the method of this embodiment does not require the camera to have accurate or valid position and attitude calibration parameters, and can compute the motion control quantity when no accurate camera calibration parameters are available or when the camera parameters have failed, enhancing the adaptability of measuring the relative position of the arm and the operation target during control. Compared with stereoscopic-vision relative position measurement, which requires a common binocular field of view, it can obtain a larger common field of view and has stronger working adaptability. In addition, compared with active-sensor measurement methods, it relies on passive measuring equipment, which is small in size, weight and energy consumption and highly adaptable.

As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.

The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.

These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.

These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.

The foregoing descriptions of specific exemplary embodiments of the present invention have been presented for purposes of illustration and description. It is not intended to limit the invention to the precise form disclosed, and obviously many modifications and variations are possible in light of the above teaching. The exemplary embodiments were chosen and described in order to explain certain principles of the invention and its practical application to enable one skilled in the art to make and use various exemplary embodiments of the invention and various alternatives and modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the claims and their equivalents.
