Three-dimensional measuring device, control device, and robot system

Publication No.: 1685411; Publication date: 2020-01-03

Description: This invention, "Three-dimensional measuring device, control device, and robot system", was created by Takeshi Shimizu and Makiko Hino on 2019-06-24. Abstract: The present invention provides a three-dimensional measuring apparatus, a control apparatus, and a robot system. The three-dimensional measuring apparatus performs three-dimensional measurement of an object using laser light and includes: a laser irradiation unit that is disposed on a movable portion of a robot and irradiates a region including the object with the laser light; an irradiation control unit that controls driving of the laser irradiation unit; an imaging unit that images the object irradiated with the laser light to acquire image data; and a point cloud data generation unit that generates three-dimensional point cloud data of a region including the object based on the image data. The laser irradiation unit includes a laser light source and a diffusion unit that diffuses the laser light emitted from the laser light source.

1. A three-dimensional measuring device that performs three-dimensional measurement of an object using laser light, the three-dimensional measuring device comprising:

a laser irradiation unit that is disposed on a movable portion of a robot and irradiates a region including the object with the laser light;

an irradiation control unit that controls driving of the laser irradiation unit;

an imaging unit that images the object irradiated with the laser light to acquire image data; and

a point cloud data generation unit that generates three-dimensional point cloud data of a region including the object based on the image data,

wherein the laser irradiation unit includes a laser light source and a diffusion unit that diffuses the laser light emitted from the laser light source.

2. The three-dimensional measuring device of claim 1,

wherein the diffusion unit includes a mirror that scans the laser light by swinging.

3. The three-dimensional measuring device of claim 2,

wherein the irradiation control unit starts emission of the laser light after the mirror starts swinging.

4. The three-dimensional measuring device of claim 2,

wherein the irradiation control unit stops emission of the laser light before the mirror stops swinging.

5. The three-dimensional measuring device of claim 2,

wherein the three-dimensional measuring device further comprises a speed detection unit that detects a moving speed of the movable portion, and

the irradiation control unit starts emission of the laser light after detecting, based on a signal from the speed detection unit, that the moving speed of the movable portion is equal to or lower than a first speed.

6. The three-dimensional measuring device of claim 5,

wherein the irradiation control unit starts emission of the laser light after detecting, based on a signal from the speed detection unit, that the movable portion is in a stopped state.

7. The three-dimensional measuring device of claim 5,

wherein the irradiation control unit starts the swinging of the mirror after detecting, based on a signal from the speed detection unit, that the moving speed of the movable portion is equal to or lower than a second speed that is faster than the first speed.

8. The three-dimensional measuring device of claim 1,

wherein the diffusion unit includes a lens that diffuses the laser light.

9. The three-dimensional measuring device of claim 1,

wherein the diffusion unit includes a diffractive optical element that diffuses the laser light.

10. A control device for controlling a laser irradiation unit that is disposed on a robot arm and irradiates a region including an object with laser light diffused by a diffusion unit, the control device comprising:

an irradiation control unit that controls driving of the laser irradiation unit; and

a speed signal receiving unit that receives a signal indicating a moving speed of the robot arm,

wherein the irradiation control unit starts emission of the laser light after the speed signal receiving unit receives a signal indicating that the moving speed is equal to or lower than a first speed.

11. The control device according to claim 10,

wherein the diffusion unit includes a mirror that scans the laser light by swinging, and

the irradiation control unit emits the laser light in a state where the mirror is swinging.

12. The control device according to claim 11,

wherein the irradiation control unit starts the swinging of the mirror after the speed signal receiving unit receives a signal indicating that the moving speed is equal to or lower than a second speed that is faster than the first speed.

13. A robot system comprising:

a robot including a robot arm;

a laser irradiation unit that is disposed on the robot arm and irradiates a region including an object with laser light diffused by a diffusion unit;

a robot control device that controls driving of the robot arm;

an irradiation control unit that controls driving of the laser irradiation unit; and

a speed signal receiving unit that receives a signal indicating a moving speed of the robot arm,

wherein the irradiation control unit starts emission of the laser light after the speed signal receiving unit receives a signal indicating that the moving speed is equal to or lower than a first speed.

14. The robotic system of claim 13,

wherein the diffusion unit includes a mirror that scans the laser light by swinging, and

the irradiation control unit emits the laser light in a state where the mirror is swinging.

15. The robotic system of claim 14,

wherein the irradiation control unit starts the swinging of the mirror after the speed signal receiving unit receives a signal indicating that the moving speed is equal to or lower than a second speed that is faster than the first speed.

Technical Field

The invention relates to a three-dimensional measuring device, a control device, and a robot system.

Background

Patent document 1 describes an articulated robot equipped with a three-dimensional shape measuring device. The three-dimensional shape measuring device includes a measuring laser irradiator that projects a pattern onto an object by scanning measuring laser light over the object, and a light receiver that acquires an image of the object onto which the pattern is projected, and it performs three-dimensional measurement of the object based on the image acquired by the light receiver.

Patent document 1: Japanese Patent Laid-Open Publication No. 2004-333369

Disclosure of Invention

In the configuration of patent document 1, since the three-dimensional shape measuring device is provided on the arm of the articulated robot, the emission direction of the measurement laser light changes with the orientation of the arm. Further, when the measurement laser light is continuously irradiated toward one point without being scanned or diffused, the energy per unit area becomes large; therefore, if a person is present on the optical path of the measurement laser light, the laser light may adversely affect that person.

A three-dimensional measuring apparatus according to the present invention is a three-dimensional measuring apparatus that performs three-dimensional measurement of an object using laser light, the three-dimensional measuring apparatus including: a laser irradiation unit that is disposed on a movable portion of a robot and irradiates a region including the object with the laser light; an irradiation control unit that controls driving of the laser irradiation unit; an imaging unit that images the object irradiated with the laser light to acquire image data; and a point cloud data generation unit that generates three-dimensional point cloud data of a region including the object based on the image data, wherein the laser irradiation unit includes a laser light source and a diffusion unit that diffuses the laser light emitted from the laser light source.

The control device according to the present invention is a control device for controlling a laser irradiation unit that is disposed on a robot arm and irradiates a region including an object with laser light diffused by a diffusion unit, the control device including: an irradiation control unit that controls driving of the laser irradiation unit; and a speed signal receiving unit that receives a signal indicating a moving speed of the robot arm, wherein the irradiation control unit starts emission of the laser light after the speed signal receiving unit receives a signal indicating that the moving speed is equal to or lower than a first speed.

A robot system according to the present invention includes: a robot including a robot arm; a laser irradiation unit that is disposed on the robot arm and irradiates a region including an object with laser light diffused by a diffusion unit; a robot control device that controls driving of the robot arm; an irradiation control unit that controls driving of the laser irradiation unit; and a speed signal receiving unit that receives a signal indicating a moving speed of the robot arm, wherein the irradiation control unit starts emission of the laser light after the speed signal receiving unit receives a signal indicating that the moving speed is equal to or lower than a first speed.

Drawings

Fig. 1 is a diagram showing an overall configuration of a robot system according to a first embodiment of the present invention.

Fig. 2 is a perspective view showing the robot.

Fig. 3 is a diagram showing the overall structure of the three-dimensional measurement apparatus.

Fig. 4 is a plan view showing an optical scanning unit included in the three-dimensional measuring apparatus shown in fig. 3.

Fig. 5 is a graph showing the moving speed of the robot arm.

Fig. 6 is a plan view showing a projection pattern projected by the laser irradiation unit.

Fig. 7 is a diagram showing an overall configuration of a laser irradiation unit included in a robot system according to a second embodiment of the present invention.

Fig. 8 is a plan view showing a diffractive optical element included in the laser irradiation unit shown in fig. 7.

Description of the reference numerals

1 … robot system; 2 … robot; 21 … base; 22 … robot arm; 221 to 226 … first to sixth arms; 24 … end effector; 251 to 256 … first to sixth driving devices; 3 … human body sensor; 31 … camera; 4 … three-dimensional measuring device; 40 … speed detection unit; 41 … laser irradiation unit; 42 … laser light source; 43 … diffractive optical element; 431 to 438 … diffraction gratings; 44 … optical system; 441 … collimator lens; 442 … rod lens; 445 … projection lens; 45 … optical scanning unit; 451 … movable portion; 452 … support portion; 453 … beam portion; 454 … mirror; 455 … permanent magnet; 456 … coil; 457 … piezoresistive portion; 47 … imaging unit; 471 … camera; 472 … imaging element; 473 … condenser lens; 48 … control device; 481 … human body sensing information receiving unit; 482 … irradiation control unit; 483 … optical scanning control unit; 484 … imaging control unit; 485 … point cloud data generation unit; 486 … speed signal receiving unit; 49 … actuator; 5 … robot control device; 6 … host computer; 61 … calculation unit; A … central axis; C … controller; E … encoder; J … rotation axis; L … laser light; O1 to O6 … first to sixth axes; P … projection pattern; P1 … current position; P2 … gripping position; Q1 … acceleration region; Q2 … deceleration region; S … detection area; V1 … first speed; V2 … second speed; Vm … maximum speed; W … object.

Detailed Description

Hereinafter, the three-dimensional measuring apparatus, the control apparatus, and the robot system according to the present invention will be described in detail based on the embodiments shown in the drawings.

First embodiment

Fig. 1 is a diagram showing an overall configuration of a robot system according to a first embodiment of the present invention. Fig. 2 is a perspective view showing the robot. Fig. 3 is a diagram showing the overall structure of the three-dimensional measurement apparatus. Fig. 4 is a plan view showing an optical scanning unit included in the three-dimensional measuring apparatus shown in fig. 3. Fig. 5 is a graph showing the moving speed of the robot arm. Fig. 6 is a plan view showing a projection pattern projected by the laser emitting section.

The robot system 1 shown in fig. 1 is a human-coexisting robot system, that is, a system premised on people working in its surroundings. Therefore, the robot system 1 is configured to detect the presence of a person within its detection range and to respond accordingly.

The robot system 1 includes: a robot 2; a human body sensor 3 that detects a person within a detection area S (detection range) set around the robot 2; a three-dimensional measuring device 4 that three-dimensionally measures an object W using laser light L; a robot control device 5 that controls driving of the robot 2 based on the measurement results of the three-dimensional measuring device 4; and a host computer 6 that can communicate with the robot control device 5. When the human body sensor 3 detects a person within the detection area S, the robot system 1 ensures the safety of that person, for example, by slowing the moving speed of the robot 2 or reducing the output of the laser light L. Hereinafter, such a robot system 1 will be described in detail.

Robot

The robot 2 is, for example, a robot that performs operations such as feeding, removing, transporting, and assembling workpieces for precision equipment and its constituent components. However, the use of the robot 2 is not particularly limited.

The robot 2 is a six-axis robot, and as shown in fig. 2, has a base 21 fixed to a floor or a ceiling, and a robot arm 22 (movable portion) movable relative to the base 21. Further, the robot arm 22 has a first arm 221 coupled to the base 21 so as to be rotatable about a first axis O1, a second arm 222 coupled to the first arm 221 so as to be rotatable about a second axis O2, a third arm 223 coupled to the second arm 222 so as to be rotatable about a third axis O3, a fourth arm 224 coupled to the third arm 223 so as to be rotatable about a fourth axis O4, a fifth arm 225 coupled to the fourth arm 224 so as to be rotatable about a fifth axis O5, and a sixth arm 226 coupled to the fifth arm 225 so as to be rotatable about a sixth axis O6. Further, a hand connecting portion is provided on the sixth arm 226, and the end effector 24 corresponding to the work performed by the robot 2 is mounted on the hand connecting portion.

The robot 2 further includes a first drive 251 for rotating the first arm 221 with respect to the base 21, a second drive 252 for rotating the second arm 222 with respect to the first arm 221, a third drive 253 for rotating the third arm 223 with respect to the second arm 222, a fourth drive 254 for rotating the fourth arm 224 with respect to the third arm 223, a fifth drive 255 for rotating the fifth arm 225 with respect to the fourth arm 224, and a sixth drive 256 for rotating the sixth arm 226 with respect to the fifth arm 225.

The first to sixth driving devices 251 to 256 each include, for example, a motor M as a driving source, a controller C for controlling the driving of the motor M, and an encoder E for detecting the amount of rotation of the corresponding arm. The first to sixth driving devices 251 to 256 are independently controlled by the robot control device 5.

The various parts of the robot 2 and the robot controller 5 as described above may communicate by wire or wirelessly, and the communication may also be performed via a network such as the internet. The same applies to the communication between the robot controller 5 and the host computer 6 and the communication between the control device 48 and the host computer 6, which will be described later.

Note that the robot 2 is not limited to the configuration of the present embodiment; for example, the number of arms may be one to five, or seven or more. In addition, the robot 2 may be of another type, such as a SCARA robot or a dual-arm robot.

Human body sensing sensor

The human body sensor 3 detects a person within the detection area S set around the robot 2, and transmits the detection result to the three-dimensional measuring device 4 and the robot control device 5. Hereinafter, the information output from the human body sensor 3 when no person is present in the detection area S is referred to as an "unmanned sensing signal", and the information output from the human body sensor 3 when a person is present in the detection area S is referred to as a "human sensing signal".

The structure of the human body sensor 3 is not particularly limited as long as it can achieve this purpose. As shown in fig. 1, the human body sensor 3 of the present embodiment includes a camera 31 that is provided above the robot 2 and can image the entire detection area S, and it detects a person within the detection area S based on image data captured by the camera 31.

However, the placement of the camera 31 is not limited to above the robot 2; the camera may be provided, for example, on a wall, on a movable or fixed stand, or on the floor. Two or more cameras 31 may also be provided. The camera 31 may even be provided on the robot 2 itself; in this case, a camera 471 described later may double as the camera 31. In addition, a weight sensor, a laser sensor, an infrared sensor, a capacitance sensor, or the like may be used as the human body sensor 3.

Note that the object detected by the human body sensor 3 is not limited to a human being, and may be any object that can enter the detection area S, such as an animal, a robot other than the robot 2, or a moving body such as an AGV (automated guided vehicle). When such an object is present in the detection area S, it may be treated in the same manner as a person or handled separately.

Robot control device

The robot control device 5 receives a position command of the robot 2 from the host computer 6, and independently controls the driving of the first drive device 251 to the sixth drive device 256 so that the arms 221 to 226 are positioned in accordance with the received position command.

Further, the robot control device 5 has, as drive modes of the robot 2: a first drive mode (high-speed drive mode) in which the driving of each of the first to sixth driving devices 251 to 256 is controlled so that the maximum moving speed of the robot arm 22 is equal to or lower than a first speed; and a second drive mode (low-speed drive mode) in which the driving of each of the first to sixth driving devices 251 to 256 is controlled so that the maximum moving speed of the robot arm 22 is equal to or lower than a second speed slower than the first speed. These drive-mode speed limits are separate from the first speed V1 and the second speed V2 used later for laser control.

When no person is present in the detection area S, that is, when the human body sensor 3 outputs the unmanned sensing signal, the robot control device 5 drives the robot 2 in the first drive mode. This improves the working efficiency of the robot 2. On the other hand, when a person is present in the detection area S, that is, when the human body sensor 3 outputs the human sensing signal, the robot control device 5 drives the robot 2 in the second drive mode. This ensures the safety of the person within the detection area S.
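As an illustration of this mode selection (the patent specifies no implementation), the following Python sketch switches the arm's speed cap according to the human body sensor's signal; the numeric limits and names are assumptions, not values from the source.

```python
# Minimal sketch of the drive-mode selection described above.
# The speed caps are illustrative; the patent gives no values.

HIGH_SPEED_CAP_MM_S = 1000.0  # first drive mode (no person detected), assumed
LOW_SPEED_CAP_MM_S = 250.0    # second drive mode (person detected), assumed

def max_arm_speed(human_sensing_signal: bool) -> float:
    """Return the maximum allowed moving speed of the robot arm 22."""
    if human_sensing_signal:
        return LOW_SPEED_CAP_MM_S   # second (low-speed) drive mode
    return HIGH_SPEED_CAP_MM_S      # first (high-speed) drive mode
```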

Such a robot control device 5 is constituted by a computer, for example, and has a processor (CPU) for processing information, a memory communicably connected to the processor, and an external interface. Various programs executable by the processor are stored in the memory, and the processor can read and execute the various programs stored in the memory.

Three-dimensional measuring device

The three-dimensional measurement device 4 detects the posture, position, and the like of the object W by a phase shift method. As shown in fig. 3, the three-dimensional measurement device 4 includes: a laser irradiation unit 41 that irradiates a region including the object W with laser light L; an imaging unit 47 that images the object W irradiated with the laser light L to acquire image data; a speed detection unit 40 that detects the moving speed of the robot arm 22; and a control device 48 that controls the driving of the laser irradiation unit 41 and the imaging unit 47, and generates three-dimensional point cloud data of the object W from the image data acquired by the imaging unit 47.

Among these components, the laser irradiation unit 41 and the imaging unit 47 are fixed to the fifth arm 225 of the robot 2. Further, the laser irradiation unit 41 is configured to irradiate the laser light L toward the distal end side (end effector 24 side) of the fifth arm 225, and the imaging unit 47 is configured to image a region including the irradiation range of the laser light L toward the distal end side (end effector 24 side) of the fifth arm 225.

Even when the arms other than the fifth arm 225, namely the arms 221 to 224 and 226, move, the positional relationship in which the end effector 24 is located on the distal end side of the fifth arm 225 is maintained. Therefore, by fixing the laser irradiation unit 41 and the imaging unit 47 to the fifth arm 225, the three-dimensional measuring device 4 can always emit the laser light L toward the distal end side of the end effector 24 and image the distal end side of the end effector 24. Whatever posture the end effector 24 takes to grip the object W, that is, whatever posture it faces the object W in, the laser light L can be irradiated toward the object W and the object W can be imaged in that posture. The three-dimensional measurement of the object W can therefore be performed more reliably. However, the arrangement of the laser irradiation unit 41 and the imaging unit 47 is not particularly limited; they may be fixed, for example, to any of the first to fourth arms 221 to 224 or the sixth arm 226.

The laser irradiation unit 41 has the function of projecting a predetermined projection pattern P onto the object W by irradiating the object W with the laser light L. Such a laser irradiation unit 41 includes a laser light source 42 that emits the laser light L, an optical system 44 including a plurality of lenses through which the laser light L passes, and an optical scanning unit 45 that scans the laser light L that has passed through the optical system 44 toward the object W.

The laser light source 42 is not particularly limited; for example, a semiconductor laser such as a vertical cavity surface emitting laser (VCSEL) or a vertical external-cavity surface-emitting laser (VECSEL) can be used. The wavelength of the laser light L is also not particularly limited, and may be in the visible region (400 to 700 nm), the invisible region (400 nm or less, or 1400 nm to 1 mm), or the near-infrared region (700 to 1400 nm). However, the wavelength of the laser light L is preferably in the visible region (400 to 700 nm). In the visible region, even if the laser light L enters the eyes of a person coexisting with the robot 2, the person immediately perceives the glare and reflexively blinks in defense. Setting the wavelength of the laser light L in the visible region therefore makes the robot system 1 safer.

The optical system 44 is a diffusion optical system and constitutes a diffusion unit. The optical system 44 includes: a collimator lens 441 that collimates the laser light L emitted from the laser light source 42; and a rod lens 442 (a lens that diffuses light in one dimension) that spreads the collimated laser light L into a line extending in a direction parallel to a rotation axis J described later (the depth direction of the page in fig. 3).

The optical scanning unit 45 has the function of scanning the laser light L that has been formed into a line by the rod lens 442. This enables two-dimensional (planar) irradiation of the laser light L. The optical scanning unit 45 is not particularly limited; for example, MEMS (Micro Electro Mechanical Systems) devices, galvanometer mirrors, polygon mirrors, and the like can be used. The optical scanning unit 45 thus also functions as a diffusion unit that diffuses the linear laser light L into a plane.

The optical scanning unit 45 of the present embodiment is formed as a MEMS device. As shown in fig. 4, the optical scanning unit 45 includes: a movable portion 451; a support portion 452 that supports the movable portion 451; a beam portion 453 that connects the movable portion 451 and the support portion 452 and allows the movable portion 451 to rotate about a rotation axis J relative to the support portion 452; a mirror 454 that is disposed on the front surface of the movable portion 451 (the near side of the page in fig. 4) and reflects the laser light L; a permanent magnet 455 disposed on the back surface of the movable portion 451 (the far side of the page in fig. 4); and a coil 456 disposed opposite the permanent magnet 455. The movable portion 451, the support portion 452, and the beam portion 453 are integrally formed from, for example, a silicon substrate.

The optical scanning unit 45 is disposed such that the rotation axis J substantially coincides with the extending direction of the linear laser light L. When a drive signal (alternating voltage) is applied to the coil 456, the movable portion 451 rotates around the rotation axis J, thereby scanning the linear laser light L.

The optical scanning unit 45 further includes a piezoresistive portion 457 provided on the support portion 452. The resistance value of the piezoresistive portion 457 changes according to the stress generated in the support portion 452 as the movable portion 451 rotates about the rotation axis J. Therefore, the rotation angle of the movable portion 451 can be detected based on the change in the resistance value of the piezoresistive portion 457. The piezoresistive portion 457 can be formed by doping (diffusing or implanting) the silicon substrate with an impurity such as phosphorus or boron.
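To make the piezoresistive readout concrete, here is a minimal Python sketch that converts a measured resistance into a rotation angle and confirms swinging from an actual resistance change; the rest resistance, sensitivity, and threshold are hypothetical calibration values, not figures from the source.

```python
# Hypothetical linear calibration: angle proportional to resistance change.
R_REST_OHM = 1000.0    # resistance with the movable portion 451 at rest (assumed)
DEG_PER_OHM = 0.12     # sensitivity of the piezoresistive portion 457 (assumed)

def mirror_angle_deg(resistance_ohm: float) -> float:
    """Estimate the rotation angle of the movable portion 451 about axis J."""
    return (resistance_ohm - R_REST_OHM) * DEG_PER_OHM

def mirror_is_swinging(resistance_samples: list[float], min_swing_ohm: float = 0.5) -> bool:
    """Confirm rotation from a real resistance change, not from the drive signal."""
    return max(resistance_samples) - min(resistance_samples) > min_swing_ohm
```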

The laser irradiation unit 41 has been described above. In such a laser irradiation unit 41, the laser light L is two-dimensionally diffused by the optical system 44 (diffusion unit) and the optical scanning unit 45 (diffusion unit). This enables the three-dimensional measurement of the object W described later. Further, because the laser light L is diffused before irradiation, its intensity, that is, the energy per unit time and unit area in each region that the laser light L can irradiate, decreases as the distance from the laser irradiation unit 41 (in other words, the optical path length of the laser light L) increases. This configuration more effectively suppresses the adverse effects that high-intensity laser light L could have on a human eye. The robot system 1 is therefore safer for people coexisting with the robot 2.
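The safety argument can be made quantitative with a simple model: if the diffused laser power is spread uniformly over a fixed solid angle, the illuminated area grows with the square of the distance, so the irradiance falls off accordingly. The sketch below assumes uniform spreading and illustrative numbers; it is not a claim about the actual optics.

```python
def irradiance_w_per_m2(power_w: float, solid_angle_sr: float, distance_m: float) -> float:
    """Irradiance at a given distance for power spread uniformly over a solid angle."""
    illuminated_area_m2 = solid_angle_sr * distance_m ** 2  # area grows as d^2
    return power_w / illuminated_area_m2

# Example: 1 mW spread over 0.1 sr drops rapidly with distance.
for d in (0.1, 0.5, 2.0):
    print(f"{d:>4} m: {irradiance_w_per_m2(1e-3, 0.1, d):.2e} W/m^2")
```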

The configuration of the laser irradiation unit 41 is not particularly limited as long as the predetermined projection pattern P can be projected onto the object W. For example, in the present embodiment the laser light L is spread into a line by the optical system 44, but the invention is not limited to this; the laser light L may instead be spread into a line using a MEMS mirror or a galvanometer mirror. That is, the laser light L can be scanned two-dimensionally using two optical scanning units (each similar to the optical scanning unit 45) whose scanning directions differ from each other. Alternatively, the laser light L may be scanned two-dimensionally using a gimbal-type MEMS mirror with two axes of freedom. These configurations also constitute the diffusion unit of the present invention.

The imaging unit 47 images the state in which the projection pattern P is projected on at least one object W; that is, it images at least one object W together with the projection pattern P. As shown in fig. 3, the imaging unit 47 is constituted by, for example, a camera 471 that includes an imaging element 472, such as a CMOS or CCD image sensor, and a condenser lens 473.

The speed detection unit 40 detects the moving speed of the robot arm 22 based on the outputs of the encoders E included in the first to sixth driving devices 251 to 256. Here, the "moving speed of the robot arm 22" may be the moving speed at any position on the robot arm 22, but is preferably the moving speed at the position where the three-dimensional measuring device 4 is disposed. In the present embodiment the three-dimensional measuring device 4 is disposed on the fifth arm 225, so for convenience of explanation the moving speed of the robot arm 22 is hereinafter treated as synonymous with the moving speed of the fifth arm 225.

The configuration of the speed detection unit 40 is not particularly limited; for example, the moving speed of the robot arm 22 may be detected based on a position command transmitted from the host computer 6 to the robot control device 5. Alternatively, the speed detection unit 40 may have an inertial sensor disposed on the fifth arm 225 and detect the moving speed of the robot arm 22 based on the inertia applied to the inertial sensor. The inertial sensor is not particularly limited; for example, a composite sensor can be used that combines a three-axis acceleration sensor, which detects acceleration along three mutually orthogonal axes (the X-axis, Y-axis, and Z-axis), with a three-axis angular velocity sensor, which detects angular velocity about each of those axes.
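As a sketch of how encoder outputs can yield an arm speed, the following Python snippet differentiates the arm-tip position obtained through forward kinematics; a toy two-link planar model stands in for the real six-axis kinematics, and all lengths and angles are illustrative assumptions.

```python
import numpy as np

L1_M, L2_M = 0.4, 0.3  # toy link lengths in metres (assumed)

def toy_forward_kinematics(q: np.ndarray) -> np.ndarray:
    """Planar two-link stand-in for the position of the fifth arm 225."""
    x = L1_M * np.cos(q[0]) + L2_M * np.cos(q[0] + q[1])
    y = L1_M * np.sin(q[0]) + L2_M * np.sin(q[0] + q[1])
    return np.array([x, y])

def arm_speed_m_s(q_prev, q_curr, dt_s: float) -> float:
    """Approximate the arm's moving speed by a finite difference of positions."""
    p0 = toy_forward_kinematics(np.asarray(q_prev, dtype=float))
    p1 = toy_forward_kinematics(np.asarray(q_curr, dtype=float))
    return float(np.linalg.norm(p1 - p0) / dt_s)

# Two encoder samples taken 1 ms apart:
print(arm_speed_m_s([0.00, 0.00], [0.001, 0.002], dt_s=0.001))  # m/s
```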

As shown in fig. 3, the control device 48 includes: a human body sensing information receiving unit 481 that receives information from the human body sensor 3; an irradiation control unit 482 that controls driving of the laser light source 42; an optical scanning control unit 483 that controls driving of the optical scanning unit 45; an imaging control unit 484 that controls driving of the imaging unit 47; a point cloud data generation unit 485 that generates three-dimensional point cloud data of a region including the object W based on the image data acquired by the imaging unit 47; and a speed signal receiving unit 486 that receives the signal from the speed detection unit 40.

Such a control device 48 is constituted by a computer, for example, and has a processor (CPU) for processing information, a memory communicably connected to the processor, and an external interface. Various programs executable by the processor are stored in the memory, and the processor can read and execute them.

The optical scanning control unit 483 controls the driving of the optical scanning unit 45 by applying a drive signal to the coil 456. Further, the optical scanning control unit 483 detects the rotation angle of the movable portion 451 based on the change in the resistance value of the piezoresistive portion 457. The optical scanning control unit 483 drives the movable portion 451 non-resonantly; that is, it applies to the coil 456 a drive signal whose frequency is sufficiently far from the resonance frequency of the vibration system formed by the movable portion 451 and the beam portion 453. Compared with resonant driving, this allows the waveform, amplitude, frequency, and the like of the swing of the movable portion 451 to be controlled freely. However, the optical scanning control unit 483 may also drive the movable portion 451 resonantly.

Further, the optical scanning control unit 483 starts driving the optical scanning unit 45, that is, the swinging of the movable portion 451, after the speed signal receiving unit 486 detects from the signal of the speed detection unit 40 that the moving speed of the robot arm 22 is equal to or lower than the second speed V2. More specifically, as shown in fig. 5, when the robot arm 22 moves from the current position P1 to the gripping position P2 to grip the object W, at least an acceleration region Q1 in which the robot arm 22 accelerates and a deceleration region Q2 in which it decelerates occur. In the deceleration region Q2, the optical scanning control unit 483 starts driving the optical scanning unit 45 after the moving speed of the fifth arm 225 falls to the second speed V2 or less. The second speed V2 is greater than 0 (zero) and less than the maximum speed Vm of the robot arm 22; that is, 0 < V2 < Vm.

This shortens the driving time of the optical scanning unit 45 compared with, for example, driving it continuously, saving power in the robot system 1. It also shortens the time from the stop of the robot arm 22 to the start of the three-dimensional measurement of the object W compared with starting the drive of the optical scanning unit 45 only after the robot arm 22 has stopped, improving the working efficiency of the robot 2. The second speed V2 is not particularly limited, and may be, for example, 10 mm/s or more and 100 mm/s or less. Note that the "stop" of the robot arm 22 means that the driving of each of the first to sixth driving devices 251 to 256 has stopped; the vibration (residual vibration) remaining after the stop is not considered.
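A minimal sketch of this trigger, assuming the threshold value and the interface names (the source gives only the 10 mm/s to 100 mm/s range for V2):

```python
V2_MM_S = 50.0  # second speed V2, chosen inside the 10-100 mm/s range above (assumed)

class ScanStartTrigger:
    """Start the optical scanning unit 45 once the arm slows to V2 or less."""

    def __init__(self):
        self.mirror_driven = False

    def on_speed_sample(self, arm_speed_mm_s: float) -> None:
        if not self.mirror_driven and arm_speed_mm_s <= V2_MM_S:
            self.mirror_driven = True  # i.e. apply the drive signal to coil 456
```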

As another example, the optical scanning control unit 483 may start driving the optical scanning unit 45 based on time. Specifically, the optical scanning control unit 483 may start driving the optical scanning unit 45 a predetermined time before the robot arm 22 reaches the gripping position P2. For example, the time at which the gripping position P2 will be reached can be calculated from the position command transmitted from the host computer 6, so the optical scanning control unit 483 starts driving the optical scanning unit 45 immediately before that time. This shortens the driving time of the optical scanning unit 45 compared with driving it continuously, saving power in the robot system 1. It also shortens the time from the stop of the robot arm 22 to the start of the three-dimensional measurement of the object W compared with starting the drive only after the robot arm 22 has finished moving, improving the working efficiency of the robot 2. The timing for starting the drive of the optical scanning unit 45 is not particularly limited, but is, for example, preferably one second before the robot arm 22 reaches the gripping position P2, and more preferably 0.5 seconds before.

The irradiation control unit 482 controls the driving of the laser light source 42 by applying a drive signal to the laser light source 42. The irradiation control unit 482 emits the laser light L from the laser light source 42 in synchronization with the rotation of the movable portion 451 detected by the optical scanning control unit 483, and thereby forms on the object W a striped projection pattern P expressed by light and dark luminance values, as shown for example in fig. 6.
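One way to realize such a striped pattern, shown here only as an illustrative sketch, is to modulate the laser output sinusoidally against the detected mirror angle so that brightness varies across the scan; the stripe count and amplitude are arbitrary example parameters.

```python
import math

STRIPES_PER_FIELD = 8    # bright/dark periods across the scan field (assumed)
MAX_ANGLE_DEG = 45.0     # mirror swing amplitude (assumed)

def laser_power_fraction(mirror_angle_deg: float, phase_shift_rad: float = 0.0) -> float:
    """Relative laser output (0..1) for the current mirror angle and pattern phase."""
    x = mirror_angle_deg / MAX_ANGLE_DEG  # normalized scan position in [-1, 1]
    return 0.5 * (1.0 + math.cos(2.0 * math.pi * STRIPES_PER_FIELD * x + phase_shift_rad))
```

Passing phase_shift_rad values of 0, π/2, π, and 3π/2 would yield the four shifted patterns used by the phase shift method described later.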

The irradiation control unit 482 has, as output modes of the laser light L, a first output mode in which the laser light L is emitted at a first output and a second output mode in which the laser light L is emitted at a second output lower than the first output, and can select between these modes. The irradiation control unit 482 may further have a third output mode in which the laser light L is emitted at a third output lower than the second output, a fourth output mode in which the laser light L is emitted at a fourth output lower than the third output, and so on; the number of output modes is not particularly limited.

The first output is not particularly limited; for example, the intensity immediately after emission from the laser light source 42, that is, immediately after the emission port, is preferably about Class 2, Class 2M, or Class 3R as defined in Japanese Industrial Standard JIS C6802. This makes it possible to project a sufficiently bright projection pattern P onto the object W, so that the imaging unit 47 can acquire image data with sufficient brightness and contrast in a short exposure time. The second output is likewise not particularly limited; for example, the intensity immediately after emission from the laser light source 42 is preferably Class 1 or lower as defined in JIS C6802. The intensity of the laser light L emitted in the second output mode is then at a level that is sufficiently safe for a person within the detection area S.

Here, the irradiation control unit 482 emits the laser light L from the laser light source 42 only after the movable portion 451 starts rotating. Preferably, the irradiation control unit 482 emits the laser light L from the laser light source 42 after the amplitude of the movable portion 451 (its rotation angle about the rotation axis J) reaches a predetermined magnitude or more. If the laser light L were emitted while the movable portion 451 is not rotating and its posture is fixed, the laser light L would be continuously emitted toward the same position. If a person's eye were then on the optical path of the laser light L, the laser light L would keep entering that eye, which could be affected depending on the intensity of the laser light L and other factors. In contrast, if the movable portion 451 starts rotating before the laser light L is emitted, the laser light L is scanned and is not continuously irradiated onto the same position. The above problem is therefore unlikely to occur, and the robot system 1 is safer. The "predetermined magnitude" is not particularly limited, but is, for example, preferably 30° or more, and more preferably 45° or more, which makes the above effect more pronounced.

Conversely, the irradiation control unit 482 stops the emission of the laser light L from the laser light source 42 before the movable portion 451 stops rotating. Preferably, the irradiation control unit 482 emits the laser light L from the laser light source 42 only while the amplitude of the movable portion 451 (its rotation angle about the rotation axis J) remains at or above a predetermined magnitude. The laser light L is thus always being scanned by the optical scanning unit 45 while it is emitted, so the laser light L never enters a person's eyes continuously, and the robot system 1 is safer.

The irradiation control unit 482 may determine whether the movable portion 451 is rotating based on whether the optical scanning control unit 483 is applying a drive signal to the coil 456, but it is more preferable to base the determination on the change in the resistance value of the piezoresistive portion 457. For example, due to a failure or disconnection in the optical scanning unit 45, the movable portion 451 might not start rotating even though a drive signal is applied to the coil 456. In contrast, the resistance value of the piezoresistive portion 457 does not change unless the movable portion 451 actually rotates, so a change in the resistance value reliably confirms that the movable portion 451 has started rotating. Further, since the resistance value of the piezoresistive portion 457 varies with the amplitude (rotation angle) of the movable portion 451, the amplitude of the movable portion 451 can also be detected easily.

Further, the irradiation control unit 482 starts emission of the laser light L after detecting, based on the signal from the speed detection unit 40, that the moving speed of the robot arm 22 is equal to or lower than the first speed V1, which is slower than the second speed V2. As described above, the driving of the optical scanning unit 45 starts once the moving speed of the robot arm 22 falls to the second speed V2 or less; by emitting the laser light L only after the speed has further fallen to the first speed V1, the laser light L can be emitted in a state where the movable portion 451 is reliably rotating. The robot system 1 is therefore safer. The first speed V1 is not particularly limited, but is, for example, preferably 0.75 times the second speed V2 or less, and more preferably 0.5 times or less. This secures a sufficient interval between the time at which the arm decelerates to the second speed V2 and the time at which it decelerates to the first speed V1, so that the movable portion 451 can more reliably be in a rotating state by the time the speed reaches the first speed V1.
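Putting the conditions above together, laser emission is gated on two facts: the arm has slowed to V1 or less, and the mirror is confirmed to be swinging with sufficient amplitude. The following sketch encodes that interlock; the numeric thresholds follow the preferences stated above (V1 at half of V2, amplitude of 45° or more) but are otherwise assumptions.

```python
V2_MM_S = 50.0                 # second speed (assumed, see above)
V1_MM_S = 0.5 * V2_MM_S        # first speed, 0.5x V2 per the stated preference
MIN_AMPLITUDE_DEG = 45.0       # preferred minimum mirror amplitude

def laser_may_emit(arm_speed_mm_s: float, mirror_amplitude_deg: float) -> bool:
    """Emission is allowed only with a slow arm AND a confirmed mirror swing."""
    return (arm_speed_mm_s <= V1_MM_S
            and mirror_amplitude_deg >= MIN_AMPLITUDE_DEG)
```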

In particular, it is preferable that the irradiation control unit 482 starts emission of the laser light L after detecting, based on the signal from the speed detection unit 40, that the robot arm 22 is in a stopped state. This effectively prevents the laser light L from being emitted in an unintended direction.

The imaging control unit 484 controls the driving of the imaging unit 47 (camera 471). Here, the projection pattern P is projected four times, each time shifted in phase by π/2, and the imaging control unit 484 causes the imaging unit 47 to image the object W each time the pattern is projected. However, the number of projections of the projection pattern P is not particularly limited as long as the phase can be calculated from the imaging results. Projection and imaging may also be repeated with patterns of larger or smaller pitch, followed by phase unwrapping (phase connection). Using more pitches improves the measurement range and resolution, but increases the number of captures and therefore the time required to acquire image data, reducing the working efficiency of the robot 2. The number of projections of the projection pattern P may therefore be set appropriately, balancing the accuracy and range of the three-dimensional measurement against the working efficiency of the robot 2.
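For reference, the textbook four-step phase-shift recovery (not necessarily the exact computation used in this device) extracts the wrapped phase per pixel from the four π/2-shifted captures as atan2(I4 − I2, I1 − I3):

```python
import numpy as np

def wrapped_phase(i1, i2, i3, i4) -> np.ndarray:
    """Per-pixel wrapped phase in (-pi, pi] from four pi/2-shifted fringe images.

    With I_k = A + B*cos(phi + (k-1)*pi/2): I4 - I2 = 2B*sin(phi) and
    I1 - I3 = 2B*cos(phi), so phi = atan2(I4 - I2, I1 - I3).
    """
    i1, i2, i3, i4 = (np.asarray(i, dtype=float) for i in (i1, i2, i3, i4))
    return np.arctan2(i4 - i2, i1 - i3)
```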

The point cloud data generation unit 485 generates three-dimensional point cloud data of a region including the object W from the plurality of image data acquired by the imaging unit 47 by using a phase shift method. Then, the three-dimensional point cloud data generated by the point cloud data generation unit 485 is transmitted to the host computer 6.

The host computer 6 includes a calculation unit 61, and the calculation unit 61 calculates three-dimensional information including the posture, position (spatial coordinates), and the like of the object W based on the three-dimensional point cloud data received from the point cloud data generation unit 485. For example, the posture and position of the object W can be calculated by storing information relating to the shape of the object W in the calculation unit 61 in advance and matching the three-dimensional point cloud data with the shape of the object W. However, the present invention is not limited to this, and the shape of the object W may be acquired from the three-dimensional point cloud data.
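As a deliberately simplified stand-in for the model matching described above, the following sketch estimates a pose from the point cloud alone, taking the centroid as position and the principal axes as an orientation estimate; a production system would use proper model registration against the stored shape of the object W.

```python
import numpy as np

def estimate_pose(points: np.ndarray):
    """points: (N, 3) array of the object's 3D point cloud.

    Returns (centroid, axes), where the columns of `axes` are the
    principal directions of the cloud, strongest first.
    """
    centroid = points.mean(axis=0)
    centered = points - centroid
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centroid, vt.T
```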

The host computer 6 generates a position command of the robot 2 based on the calculated three-dimensional information of the object W, and transmits the generated position command to the robot controller 5. The robot controller 5 independently drives the first to sixth driving devices 251 to 256, respectively, based on the position command received from the host computer 6, and moves the first to sixth arms 221 to 226 to the instructed positions.

In the present embodiment the host computer 6 includes the calculation unit 61, but the invention is not limited to this; the calculation unit 61 may instead be provided in the three-dimensional measuring device 4, in the robot control device 5, or in another device.

The configuration of the robot system 1 has been described above. Next, a control method of the robot system 1 will be described. First, consider the case where no person is present in the detection area S, that is, where the human body sensing information receiving unit 481 receives the unmanned sensing signal from the human body sensor 3. In this case there is no person in the detection area S, so the high-intensity laser light L is very unlikely to enter anyone's eyes. Therefore, when performing three-dimensional measurement of the object W, the irradiation control unit 482 emits the laser light L at the first output. Likewise, with no person in the detection area S, the robot 2 is unlikely to collide with a person, so the robot control device 5 drives the first to sixth arms 221 to 226 in the first drive mode.

Next, consider the case where a person is present in the detection area S, that is, where the human body sensing information receiving unit 481 receives the human sensing signal from the human body sensor 3. In this case a person is present in the detection area S, so the laser light L could enter the person's eyes. Therefore, when performing three-dimensional measurement of the object W, the irradiation control unit 482 emits the laser light L in the second output mode, whose intensity is lower than that of the first output mode. In addition, since the robot 2 could collide with the person, the robot control device 5 drives the first to sixth arms 221 to 226 in the second drive mode, whose maximum moving speed is slower than that of the first drive mode.

In this way, when no person is present in the detection area S, higher-intensity laser light L is used than when a person is present. This projects a brighter projection pattern P onto the object W, enabling more accurate three-dimensional measurement of the object W. It also allows the exposure time of the camera 471 to be shortened (in other words, the shutter speed to be increased) in accordance with the brightness of the projection pattern P, shortening the time required for the three-dimensional measurement of the object W and improving the working efficiency of the robot 2. Conversely, when a person is present in the detection area S, lower-intensity laser light L is used than when no person is present. The output of the laser light L is thereby reduced to a level that is safe even for the eyes of a person who enters the detection area S, ensuring that person's safety.

The exposure time of the camera 471 in this case is not particularly limited, and may be, for example, the same exposure time as when no person is present in the detection area S or a longer exposure time than when no person is present in the detection area S. When a person is present in the detection region S, the projected pattern P is darker than when no person is present in the detection region S. Therefore, in the former case, the accuracy of three-dimensional measurement of the object W is lowered as compared with the case where no person is present in the detection region S, but the time required for three-dimensional measurement of the object W is substantially equal. On the other hand, in the latter case, since the exposure amount can be set to substantially the same amount as in the case where no person is present in the detection region S, the accuracy of three-dimensional measurement of the object W is substantially the same, but the time required for three-dimensional measurement of the object W becomes long. Therefore, in consideration of the balance between the accuracy of the three-dimensional measurement of the object W and the operation efficiency of the robot 2, the exposure time of the camera 471 may be appropriately set.

The control method of the robot system 1 is not limited to the above. For example, the irradiation control unit 482 may stop the emission of the laser light L altogether when a person is present in the detection area S. When the irradiation control unit 482 has the third output mode, the fourth output mode, or the like described above, it may switch the output mode of the laser light L to the third or fourth output mode when a person enters the detection area S. Further, when a person is present in the detection area S, the robot control device 5 may stop the driving of the first to sixth arms 221 to 226. These measures make the robot system 1 even safer for a person within the detection area S.

In the above configuration the output of the laser light L is changed according to whether a person is present in the detection area S, but the drive mode of the optical scanning unit 45 may be changed instead of, or together with, the output. For example, the optical scanning control unit 483 may have, as drive modes of the optical scanning unit 45, a first rotation mode in which the rotation angle of the movable portion 451 about the rotation axis J is a first angle θ1, and a second rotation mode in which the rotation angle is a second angle θ2 larger than the first angle θ1, and may select between them. The first and second rotation modes can be switched easily, for example, by changing the intensity (amplitude) of the drive signal applied to the coil 456.

When no person is present in the detection area S, the optical scanning control unit 483 drives the optical scanning unit 45 in the first rotation mode. The scanning range of the laser light L is then narrower than in the second rotation mode, and the laser light L can be irradiated onto the object W efficiently, so the three-dimensional measurement of the object W can be performed with higher accuracy. Conversely, when a person is present in the detection area S, the optical scanning control unit 483 drives the optical scanning unit 45 in the second rotation mode. The scanning range of the laser light L is then wider than in the first rotation mode, so even if a person's eye lies within the scanning range of the laser light L, the energy of the laser light L entering that eye is reduced. The robot system 1 is therefore safer for people coexisting with the robot 2.

In addition, the optical scanning control unit 483 may have, as drive modes of the optical scanning unit 45, a first frequency mode in which the frequency of the drive signal is a first frequency F1, and a second frequency mode in which the frequency of the drive signal is a second frequency F2 higher than the first frequency F1, and may select between them. When no person is present in the detection area S, the optical scanning control unit 483 drives the optical scanning unit 45 in the first frequency mode. The scanning speed of the laser light L is then slower than in the second frequency mode, and the object W can be irradiated with the laser light L efficiently, so the three-dimensional measurement of the object W can be performed with higher accuracy. Conversely, when a person is detected in the detection area S, the optical scanning control unit 483 drives the optical scanning unit 45 in the second frequency mode. The scanning speed of the laser light L is then faster than in the first frequency mode, so even if a person's eye lies within the scanning range of the laser light L, the energy of the laser light L entering that eye is reduced. The robot system 1 is therefore safer for people coexisting with the robot 2.
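The benefit of both modes can be seen with a rough dwell-time estimate: the time the scanned beam spends on a pupil-sized spot per sweep shrinks as the scan width (rotation mode) or scan frequency (frequency mode) grows, and the per-sweep energy into an eye shrinks with it. The numbers below are assumptions for illustration only.

```python
PUPIL_MM = 7.0  # nominal fully dilated pupil diameter

def dwell_time_s(scan_width_mm: float, scan_frequency_hz: float) -> float:
    """Approximate dwell time on a pupil per sweep, uniform-speed scan model."""
    sweep_speed_mm_s = 2.0 * scan_width_mm * scan_frequency_hz  # out and back per cycle
    return PUPIL_MM / sweep_speed_mm_s

# Doubling the scan width (second rotation mode) or the frequency
# (second frequency mode) halves the dwell time, and with it the dose.
print(dwell_time_s(200.0, 100.0), dwell_time_s(400.0, 100.0), dwell_time_s(200.0, 200.0))
```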

As described above, the robot system 1 includes: a robot 2 including a robot arm 22; a laser irradiation unit 41 that is disposed on the robot arm 22 and irradiates a region including the object with laser light L diffused by the optical system 44 and the optical scanning unit 45 serving as diffusion units; a robot control device 5 that controls driving of the robot arm 22; an irradiation control unit 482 that controls driving of the laser irradiation unit 41; and a speed signal receiving unit 486 that receives a signal indicating the moving speed of the robot arm 22. After the speed signal receiving unit 486 receives a signal indicating that the moving speed of the robot arm 22 is equal to or lower than the first speed V1, the irradiation control unit 482 starts emission of the laser light L. Because the laser light L is diffused by the optical system 44 and the optical scanning unit 45, its intensity (the irradiation energy per unit time in each region that the laser light L can reach) decreases as the distance from the laser irradiation unit 41 increases, in other words, as the optical path length of the laser light L grows. High-intensity laser light L is therefore effectively prevented from entering a person's eyes, and the robot system 1 is safe for people coexisting with the robot 2. Moreover, compared with emitting the laser light L continuously, the robot system 1 can be driven with less power; and compared with emitting the laser light L only after the robot arm 22 has stopped, the time from the stop of the robot arm 22 to the start of the three-dimensional measurement of the object W is shortened.

As described above, the three-dimensional measuring device 4, which performs three-dimensional measurement of the object W using the laser light L, includes: a laser irradiation unit 41 that is disposed on the robot arm 22 (movable portion) of the robot 2 and irradiates a region including the object W with the laser light L; an irradiation control unit 482 that controls driving of the laser irradiation unit 41; an imaging unit 47 that images the object W irradiated with the laser light L to acquire image data; and a point cloud data generation unit 485 that generates three-dimensional point cloud data of a region including the object W based on the image data. The laser irradiation unit 41 includes the laser light source 42, together with the optical system 44 and the optical scanning unit 45 serving as diffusion units that diffuse the laser light L emitted from the laser light source 42. By diffusing the laser light L in this way, its intensity decreases as the distance from the laser irradiation unit 41 increases, in other words, as the optical path length of the laser light L grows. High-intensity laser light L is therefore effectively prevented from entering a person's eyes, and the robot system 1 is safe for people coexisting with the robot 2.

Note that the "diffusion" means that, for example, the irradiation range of the laser light L is expanded in the emission direction of the laser light L with the optical axis of the laser light L kept constant, or the irradiation range of the laser light L is expanded in the emission direction of the laser light L by changing the optical axis of the laser light L.

In addition, as described above, the diffusion unit includes the mirror 454 that scans the laser light L by swinging; this simplifies the structure of the diffusion unit. As also described above, the diffusion unit includes the rod lens 442 as a lens that diffuses the laser light L, which likewise simplifies the structure of the diffusion unit.

Further, as described above, the irradiation control unit 482 starts emission of the laser light L after the mirror 454 starts to swing. This prevents the laser light L from being continuously irradiated onto the same position, making the robot system 1 safer. Likewise, as described above, the irradiation control unit 482 stops emission of the laser light L before the mirror 454 stops swinging, which again prevents the laser light L from dwelling on a single position and makes the robot system 1 safer.
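This ordering (swing begins before emission, emission stops before the swing stops) can be expressed as a simple interlock. The sketch below is illustrative; the class names, stub devices, and settling time are assumptions, not part of the embodiment.

```python
import time

class MirrorStub:
    def start(self): print("mirror: swing started")
    def stop(self): print("mirror: swing stopped")

class LaserStub:
    def on(self): print("laser: emission on")
    def off(self): print("laser: emission off")

class LaserInterlock:
    """Minimal sketch: emission is allowed only while the mirror swings."""

    def __init__(self, mirror, laser, settle_s=0.05):
        self.mirror = mirror
        self.laser = laser
        self.settle_s = settle_s   # hypothetical settling time

    def begin_measurement(self):
        self.mirror.start()        # start the swing first ...
        time.sleep(self.settle_s)  # ... let it reach steady amplitude
        self.laser.on()            # ... only then start emission

    def end_measurement(self):
        self.laser.off()           # stop emission first ...
        time.sleep(self.settle_s)
        self.mirror.stop()         # ... then stop the swing

interlock = LaserInterlock(MirrorStub(), LaserStub())
interlock.begin_measurement()
interlock.end_measurement()
```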

Further, as described above, the three-dimensional measuring apparatus 4 includes the speed detection unit 40 that detects the moving speed of the robot arm 22, and the irradiation control unit 482 starts emission of the laser light L after detecting, from the signal of the speed detection unit 40, that the moving speed of the robot arm 22 is equal to or lower than the first speed V1. This enables power-saving driving of the robot system 1 compared with the case where the laser light L is emitted at all times. In addition, compared with the case where the laser light L is emitted only after the robot arm 22 has stopped, the time from when the robot arm 22 stops to when the three-dimensional measurement of the object W starts can be shortened.

In addition, as described above, the irradiation control unit 482 may start emission of the laser light L after detecting, from the signal of the speed detection unit 40, that the robot arm 22 is in a stopped state. In that case, the emission direction of the laser light L is kept constant and the irradiated range is narrowed, so the robot system 1 is safer.

As described above, the irradiation control unit 482 starts the swinging of the mirror 454 after detecting, from the signal of the speed detection unit 40, that the moving speed of the robot arm 22 is equal to or lower than the second speed V2, which is faster than the first speed V1. This enables power-saving driving of the robot system 1 compared with the case where the mirror 454 swings at all times. In addition, compared with the case where the mirror 454 is swung only after the robot arm 22 has stopped, the time from when the robot arm 22 stops to when the three-dimensional measurement of the object W starts can be shortened.
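The two thresholds amount to a simple gating rule: the mirror swing is enabled early, at speeds up to V2, so that emission can begin as soon as the speed falls to V1. A minimal sketch follows; the numerical threshold values are illustrative, since the embodiment does not specify V1 or V2 numerically.

```python
def gate_by_speed(speed_mm_s, v1=10.0, v2=50.0):
    """Map the arm speed to (mirror_swinging, laser_emitting).

    v2 > v1: the mirror starts swinging early (speed <= v2) so that
    emission can begin as soon as speed <= v1, shortening the wait
    compared to starting everything only after the arm stops.
    """
    assert v2 > v1
    mirror_swinging = speed_mm_s <= v2
    laser_emitting = speed_mm_s <= v1   # implies the mirror is swinging
    return mirror_swinging, laser_emitting

# Example: a decelerating arm passes V2 first, then V1.
for v in (120.0, 50.0, 25.0, 10.0, 0.0):
    print(v, gate_by_speed(v))
```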

As described above, the control device 48 for controlling the laser irradiation unit 41, which is disposed on the robot arm 22 and irradiates a region including the object W with the laser light L diffused by the diffusion unit, includes: an irradiation control unit 482 that controls driving of the laser irradiation unit 41; and a speed signal receiving unit 486 that receives a signal indicating the moving speed of the robot arm 22. After the speed signal receiving unit 486 receives a signal indicating that the moving speed of the robot arm 22 is equal to or lower than the first speed V1, the irradiation control unit 482 starts emission of the laser light L. By diffusing the laser light L with the diffusion unit, the intensity of the laser light L (the irradiation energy per unit time in each region that the laser light L can reach) decreases with distance from the laser irradiation unit 41, in other words, as the optical path length of the laser light L becomes longer. Therefore, high-intensity laser light L can be more effectively prevented from entering a person's eyes, so that the robot system 1 is safe for people coexisting with the robot 2. In addition, power-saving driving of the robot system 1 becomes possible compared with the case where the laser light L is emitted at all times. Furthermore, compared with the case where the laser light L is emitted only after the robot arm 22 has stopped, the time from when the robot arm 22 stops to when the three-dimensional measurement of the object W starts can be shortened.

In addition, as described above, the diffusion unit includes the mirror 454 that scans the laser light L by swinging, and the irradiation control unit 482 emits the laser light L only while the mirror 454 is swinging. This prevents the laser light L from being continuously irradiated onto the same position, making the robot system 1 safer.

As described above, the mirror 454 starts to swing when the speed signal receiving unit 486 receives a signal indicating that the moving speed of the robot arm 22 is equal to or lower than the second speed V2, which is faster than the first speed V1. This enables power-saving driving of the robot system 1 compared with the case where the mirror 454 swings at all times. In addition, compared with the case where the mirror 454 is swung only after the robot arm 22 has stopped, the time from when the robot arm 22 stops to when the three-dimensional measurement of the object W starts can be shortened.

Second embodiment

Fig. 7 is a diagram showing an overall configuration of a laser irradiation unit included in a robot system according to a second embodiment of the present invention. Fig. 8 is a plan view showing a diffractive optical element included in the laser irradiation unit shown in fig. 7.

In the following description, differences between the robot system according to the second embodiment and the above-described embodiment will be mainly described, and descriptions of the same matters will be omitted. The robot system 1 of the second embodiment is basically the same as the robot system 1 of the first embodiment described above, except for the configuration of the laser irradiation unit. In figs. 7 and 8, the same components as those of the above embodiment are denoted by the same reference numerals.

As shown in fig. 7, the laser irradiation unit 41 of the present embodiment includes a laser light source 42 that emits the laser light L, an optical system 44 including a projection lens 445, a diffractive optical element 43 located between the laser light source 42 and the projection lens 445, and an actuator 49 that rotates the diffractive optical element 43 about a central axis A. As shown in fig. 8, the diffractive optical element 43 has eight diffraction gratings 431 to 438 with different patterns arranged around the central axis A. By rotating the diffractive optical element 43 about the central axis A, a predetermined one of the diffraction gratings 431 to 438 can be positioned on the optical path of the laser light L. Although not shown, the diffraction gratings 431 to 434 produce striped projection patterns whose phases on the projection surface are shifted from one another by π/2. The diffraction gratings 435 to 438 likewise produce striped projection patterns phase-shifted from one another by π/2, and the pitch of their projection patterns on the projection surface is twice that of the projection patterns of the diffraction gratings 431 to 434.

In the laser irradiation unit 41 having such a configuration, the diffraction gratings 431 to 438 are sequentially positioned on the optical path of the laser light L, whereby eight projection patterns P are sequentially projected onto the object W.
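The embodiment describes projecting the eight patterns but does not spell out how the captured images are decoded. One conventional reading, consistent with the π/2 phase offsets and the doubled pitch described above, is four-step phase shifting with two-frequency unwrapping; the sketch below is such an interpretation, not a disclosure of the patent. It assumes NumPy image arrays, illustrative variable names, and that the coarse (double-pitch) phase does not wrap within the field of view.

```python
import numpy as np

def wrapped_phase(i0, i1, i2, i3):
    """Four-step phase shift: images captured at offsets 0, pi/2, pi, 3pi/2.

    With I_n = A + B*cos(phi + n*pi/2):
      i3 - i1 = 2B*sin(phi),  i0 - i2 = 2B*cos(phi).
    """
    return np.arctan2(i3 - i1, i0 - i2)

def unwrap_with_coarse(phi_fine, phi_coarse):
    """Pick the fringe order of the fine phase from the coarse phase.

    The coarse pattern has twice the pitch, so its absolute phase is
    half the fine one: Phi_fine = 2 * Phi_coarse. Assuming phi_coarse
    is already absolute (it does not wrap in the field of view), the
    integer fringe order k recovers the absolute fine phase.
    """
    k = np.round((2.0 * phi_coarse - phi_fine) / (2.0 * np.pi))
    return phi_fine + 2.0 * np.pi * k

# imgs[0..3]: fine-pitch patterns (gratings 431-434),
# imgs[4..7]: double-pitch patterns (gratings 435-438).
imgs = [np.random.rand(480, 640) for _ in range(8)]  # placeholder data
phi_fine = wrapped_phase(*imgs[0:4])
phi_coarse = wrapped_phase(*imgs[4:8])
phi = unwrap_with_coarse(phi_fine, phi_coarse)
```

The absolute phase map obtained this way would then feed the point cloud data generation unit 485, which the first embodiment describes as producing three-dimensional point cloud data from the image data.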

In the second embodiment, the laser irradiation unit 41 thus has the diffractive optical element 43 as a diffusion unit that diffuses the laser light L. By diffusing the laser light L in this way, the intensity of the laser light L decreases with distance from the laser irradiation unit 41, and therefore the three-dimensional measuring apparatus 4 is safer.

The three-dimensional measuring device, the control device, and the robot system according to the present invention have been described above based on the illustrated embodiments, but the present invention is not limited thereto, and the configurations of the respective portions may be replaced with any configurations having the same function. In addition, other arbitrary structures may be added to the present invention.
